Cyberbullying (CB) is an intentional, aggressive act carried out by an individual or a group against a victim through electronic media. Its consequences are rising alarmingly, harming victims both physically and psychologically. This motivates the use of automated detection tools, yet research on such tools remains limited owing to poor datasets and the elimination of important features during CB detection. In this paper, an integrated model is proposed that combines a feature extraction engine and a classification engine operating on raw text datasets drawn from social media. The feature extraction engine takes psychological features, user comments, and context into consideration for CB detection. The classification engine, an artificial neural network (ANN), classifies the input and is coupled with an evaluation system that either rewards or penalizes the classified output. This evaluation is carried out using deep reinforcement learning (DRL), which improves classification performance. Simulations validate the efficacy of the ANN-DRL model against metrics including accuracy, precision, recall, and F-measure, and show that ANN-DRL achieves higher classification performance than conventional machine learning classifiers.
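The abstract's reward-or-penalize evaluation loop can be illustrated with a minimal sketch. The paper's actual ANN and DRL architectures are not given, so this toy linear classifier, its features (aggression and repetition scores), and the update rule are all illustrative assumptions: a reward signal of +1/-1 is computed per prediction, and a penalty nudges the decision boundary toward the true label.

```python
import random

def evaluate(prediction: int, label: int) -> int:
    """Reward a correct CB classification (+1), penalize a wrong one (-1)."""
    return 1 if prediction == label else -1

def train(samples, epochs=50, lr=0.1, seed=0):
    """Toy linear classifier updated from the reward signal, mimicking
    the reward/penalty evaluation loop (not the paper's ANN-DRL)."""
    rng = random.Random(seed)  # reserved for stochastic variants
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in samples:
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            pred = 1 if score > 0 else 0
            if evaluate(pred, y) < 0:  # penalty: move boundary toward label
                direction = 1 if y == 1 else -1
                w = [wi + lr * direction * xi for wi, xi in zip(w, x)]
                b += lr * direction
    return w, b

# hypothetical features: [aggression score, repetition score]
data = [([0.9, 0.8], 1), ([0.8, 0.7], 1), ([0.1, 0.2], 0), ([0.2, 0.1], 0)]
w, b = train(data)
```

After a few epochs the boundary separates the toy aggressive and benign samples; a real system would replace the linear model with the ANN and the fixed update with a learned DRL policy.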
With continuing advances in telecommunications, attaining higher data-transfer rates is essential for high-performance communication. Exploiting fuzzy systems in wireless telecommunication, especially in fifth-generation (5G) mobile networks, is a vital paradigm in telecommunication markets. This paper presents a comprehensive survey, beginning with a review of the fundamentals of fuzzy systems applied to 5G telecommunication. The literature was collected from various repositories, including reference materials, the Internet, and books; articles were selected on an empirical or evidence-based basis from peer-reviewed journals, conference proceedings, dissertations, and theses. Most existing soft computing models are streamlined to particular 5G networking applications. The survey therefore, first, helps readers identify research gaps and innovative models across a wide variety of 5G applications; second, it describes the scenarios in which fuzzy systems are developed on the 5G platform; third, it discusses the applicability of fuzzy logic systems to various 5G telecommunication applications; and finally, it draws conclusions from the studies in which fuzzy systems have been utilized to improve 5G telecommunication systems.
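To make the fuzzy-systems idea concrete for a 5G setting, here is a minimal fuzzy-inference sketch for a handover decision. The rules, membership ranges, and the RSRP/load inputs are hypothetical examples, not drawn from any surveyed paper; the inference is a simple min-firing, weighted-average defuzzification.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def handover_score(rsrp_dbm, load_pct):
    """Toy two-rule fuzzy controller:
      IF signal weak AND load high THEN handover (output 1.0)
      IF signal strong             THEN stay     (output 0.0)
    Returns a crisp score in [0, 1]; 0.5 means no rule fired."""
    weak = tri(rsrp_dbm, -120, -110, -95)       # hypothetical ranges
    strong = tri(rsrp_dbm, -100, -80, -60)
    high_load = tri(load_pct, 50, 100, 150)
    fire_handover = min(weak, high_load)
    fire_stay = strong
    total = fire_handover + fire_stay
    if total == 0:
        return 0.5
    return fire_handover / total  # weighted-average defuzzification
```

A weak signal under heavy cell load yields a score near 1 (hand over); a strong, lightly loaded link yields 0 (stay).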
With new telecommunications engineering applications, the cognitive radio (CR) network-based Internet of Things (IoT) resolves bandwidth and spectrum problems. However, CR-IoT routing methods sometimes present issues in route finding, spectrum-resource diversity, and mobility. This study presents an upgradable cross-layer routing protocol based on CR-IoT to improve routing efficiency and optimize data transmission in a reconfigurable network. In this context, a distributed controller is developed and designed around multiple activities, including load balancing, neighbourhood sensing, and machine-learned path construction. The proposed approach accounts for network traffic and load as well as other network metrics, including energy efficiency, network capacity, and interference, averaging 2 bps/Hz/W. Trials against conventional models demonstrate the residual energy, resource scalability, and robustness of the reconfigurable CR-IoT. A reconfigurable wireless network (RWN) is essentially adaptive network firmware developed to satisfy the demands of modern applications, changing network topologies, and changing network conditions.
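The cross-layer idea of scoring routes by combined metrics can be sketched as follows. The weighting scheme, the specific weights, and the example topology are assumptions for illustration; the paper's machine-learned path construction would replace the exhaustive search.

```python
def link_cost(energy_eff, capacity_mbps, interference_db,
              w_e=0.4, w_c=0.4, w_i=0.2):
    """Hypothetical weighted cross-layer link cost (lower is better):
    energy efficiency (bps/Hz/W) and capacity reduce cost,
    interference increases it."""
    return w_i * interference_db - w_e * energy_eff - w_c * capacity_mbps

def best_path(graph, src, dst):
    """Exhaustive search over simple paths, minimizing summed link cost.
    graph: node -> list of (neighbor, cost) pairs."""
    best = (float("inf"), None)
    def dfs(node, visited, cost, path):
        nonlocal best
        if node == dst:
            if cost < best[0]:
                best = (cost, path[:])
            return
        for nxt, c in graph.get(node, []):
            if nxt not in visited:
                visited.add(nxt)
                dfs(nxt, visited, cost + c, path + [nxt])
                visited.remove(nxt)
    dfs(src, {src}, 0.0, [src])
    return best

# toy 4-node topology with per-link (energy eff., capacity, interference)
graph = {
    "A": [("B", link_cost(2, 10, 5)), ("C", link_cost(1, 5, 20))],
    "B": [("D", link_cost(2, 10, 5))],
    "C": [("D", link_cost(3, 12, 2))],
}
```

Here the energy-efficient, low-interference links through B beat the route through C despite C's higher-capacity final hop.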
In recent times, data analysis using machine learning has accelerated optimized solutions in clinical healthcare systems. Machine learning methods offer efficient predictive ability in diagnostic systems, complementing clinicians. Most systems operate on features extracted from patients, and most predicted cases are accurate. Recently, however, the prevalence of COVID-19 has driven the global healthcare industry to find a new drug that suppresses the pandemic outbreak. In this paper, we design a deep neural network (DNN) model that accurately identifies protein-ligand interactions with a given drug. The DNN senses the response of protein-ligand interactions for a specific drug and identifies which drug produces the interactions that combat the virus effectively. Using the limited genome sequences of Indian patients submitted to the GISAID database, we find that the DNN system is effective in identifying protein-ligand interactions for a specific drug.
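A DNN that scores protein-ligand interactions can be outlined as a feed-forward pass over encoded sequence/drug features. The paper does not specify its architecture, so the layer sizes, weights, and interpretation of the output as a binding probability are purely illustrative assumptions.

```python
import math

def mlp_predict(features, W1, b1, W2, b2):
    """One-hidden-layer feed-forward pass: ReLU hidden units, sigmoid
    output interpreted (hypothetically) as a binding probability."""
    hidden = [max(0.0, sum(w * x for w, x in zip(row, features)) + bi)
              for row, bi in zip(W1, b1)]
    z = sum(w * h for w, h in zip(W2, hidden)) + b2
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical trained weights for a 2-feature, 2-hidden-unit toy network
W1, b1 = [[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0]
W2, b2 = [1.0, 1.0], -1.0
score = mlp_predict([0.5, 0.5], W1, b1, W2, b2)
```

In a real pipeline the features would come from encoded protein and ligand representations, and the weights from training on known interaction data rather than being hand-set.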
Cloud storage offers a potential replacement for physical disk drives as a prominent outsourcing service. A threat from an untrusted server affects the security and integrity of the data. However, data-integrity assurance and communication/computation cost are directly proportional: strengthening one raises the other. It is therefore necessary to develop a model that trades off data integrity against cost metrics in the cloud environment. In this paper, we develop an integrity-verification mechanism that combines a cryptographic solution with algebraic signatures. The model uses the elliptic curve digital signature algorithm (ECDSA) to verify outsourced data, and it further resists malicious attacks, including forgery, replacement, and replay attacks; symmetric encryption guarantees the privacy of the data. Simulations test the efficacy of the algorithm in maintaining data integrity at reduced cost, and the model is evaluated against existing methods in terms of communication, computation, and overhead costs. The simulation results show that the proposed method reduces computational cost by 0.25% and communication cost by 0.21% relative to other public auditing schemes.
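The tag-then-verify workflow for outsourced blocks can be sketched with the standard library. Note the hedge: the paper uses ECDSA with algebraic signatures, which needs an elliptic-curve library; HMAC-SHA256 stands in here purely to show the integrity-check flow, not the paper's scheme.

```python
import hashlib
import hmac

def make_tag(key: bytes, block: bytes) -> bytes:
    """Integrity tag over an outsourced data block. HMAC-SHA256 is a
    stand-in for the paper's ECDSA-based signature."""
    return hmac.new(key, block, hashlib.sha256).digest()

def verify_block(key: bytes, block: bytes, tag: bytes) -> bool:
    """Recompute and compare tags; constant-time comparison avoids
    leaking information through timing."""
    return hmac.compare_digest(make_tag(key, block), tag)
```

The owner tags each block before upload; at audit time, any tampering by the untrusted server makes verification fail, since the server cannot forge a valid tag without the key.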
In recent times, utility and privacy have been trade-off factors: improving one tends to sacrifice the other. A dataset therefore cannot be published without privacy protection, and it is crucial to maintain an equilibrium between the utility and privacy of data. In this paper, a novel technique for trading off utility against privacy is developed, in which the former is handled by a metaheuristic algorithm and the latter by a cryptographic model. Utility is achieved through clustering, while the privacy model encrypts and decrypts the data: the input datasets are first clustered, and privacy is then enforced on the clustered data. Simulations on manufacturing datasets against various existing models show that the proposed model achieves improved clustering accuracy and data privacy, demonstrating a trade-off between privacy preservation and utility clustering in smart-manufacturing datasets.
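The cluster-then-protect pipeline can be sketched in two steps. Both pieces are illustrative stand-ins: a toy 1-D k-means replaces the paper's metaheuristic clustering, and the XOR cipher below is only a placeholder for its cryptographic model and is not secure for real use.

```python
from itertools import cycle

def kmeans_1d(values, k=2, iters=10):
    """Toy 1-D k-means: the 'utility' step that groups records
    before the protected release."""
    centers = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for v in values:
            i = min(range(len(centers)), key=lambda j: abs(v - centers[j]))
            clusters[i].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Illustrative symmetric step standing in for the paper's
    cryptographic model; NOT secure, for demonstration only."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))
```

Records are first clustered (utility preserved in the released structure), then the sensitive values are encrypted before publication; applying the same XOR key twice recovers the plaintext.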
Considering task dependencies, balancing Internet of Health Things (IoHT) scheduling is important for reducing the makespan. In this paper, we develop a smart approach to optimal task scheduling, Hybrid Moth Flame Optimization (HMFO), for cloud computing integrated into the IoHT environment over e-healthcare systems. HMFO guarantees uniform resource assignment and enhanced quality of service (QoS). The model is trained on the Google cluster dataset so that it learns from instances of how jobs are scheduled in the cloud, and the trained HMFO model is then used to schedule jobs in real time. Simulations in a CloudSim environment test the scheduling efficacy of the model in a hybrid cloud setting, assessing performance in terms of resource usage, response time, and energy utilization. The hybrid HMFO approach offers a higher response rate with reduced cost and run time than other methods.
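The makespan objective that HMFO optimizes can be made concrete with a simple greedy baseline. This longest-processing-time-first heuristic is not HMFO itself, only an illustration of the quantity (maximum VM load) that the metaheuristic search would minimize; the task times and VM count are hypothetical.

```python
def lpt_schedule(task_times, n_vms):
    """Longest-processing-time-first greedy: assign each task (longest
    first) to the currently least-loaded VM. Returns (makespan,
    per-VM task lists); a baseline for the makespan objective."""
    loads = [0.0] * n_vms
    assignment = [[] for _ in range(n_vms)]
    for t in sorted(task_times, reverse=True):
        i = min(range(n_vms), key=lambda j: loads[j])
        loads[i] += t
        assignment[i].append(t)
    return max(loads), assignment
```

For tasks of length 4, 3, 3, and 2 on two VMs, the greedy assignment balances both machines at load 6, the optimal makespan for this instance; HMFO would search over such assignments on the full Google-cluster workload.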