This study proposes a novel framework to improve intrusion detection system (IDS) performance on data collected from Internet of Things (IoT) environments. The framework relies on deep learning and metaheuristic (MH) optimization algorithms to perform feature extraction and selection. A simple yet effective convolutional neural network (CNN) serves as the core feature extractor, learning more relevant representations of the input data in a lower-dimensional space. A new feature selection mechanism is proposed based on a recently developed MH method, the Reptile Search Algorithm (RSA), which is inspired by the hunting behavior of crocodiles. The RSA boosts IDS performance by selecting only the most important features (an optimal subset) from those extracted by the CNN model. Several datasets, including KDDCup-99, NSL-KDD, CICIDS-2017, and BoT-IoT, were used to assess performance. The proposed framework achieved competitive classification metrics compared to other well-known optimization methods applied to feature selection problems.
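The wrapper-style selection described above can be sketched in miniature. This is an illustrative stand-in, not the actual RSA update rules: a random search over binary feature masks, scored by a fitness that trades classification accuracy against subset size (a common formulation in MH feature selection). The `evaluate_accuracy` callback is a hypothetical hook where a classifier would be trained on the CNN-extracted features restricted to the mask.

```python
import random

def fitness(mask, evaluate_accuracy, alpha=0.99):
    # Common MH feature-selection fitness: weight accuracy against subset size.
    k = sum(mask)
    if k == 0:
        return 0.0  # an empty subset is useless
    acc = evaluate_accuracy(mask)
    return alpha * acc + (1 - alpha) * (1 - k / len(mask))

def select_features(n_features, evaluate_accuracy, iters=200, seed=0):
    # Stand-in for the RSA search loop: plain random search over binary masks.
    rng = random.Random(seed)
    best_mask, best_fit = None, -1.0
    for _ in range(iters):
        mask = [rng.random() < 0.5 for _ in range(n_features)]
        f = fitness(mask, evaluate_accuracy)
        if f > best_fit:
            best_mask, best_fit = mask, f
    return best_mask, best_fit
```

A real MH method such as RSA replaces the random draw with guided position updates, but the mask encoding and the accuracy-versus-size fitness are the same.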
The development of smart network infrastructure for the Internet of Things (IoT) faces the immense threat of sophisticated Distributed Denial-of-Service (DDoS) attacks. Existing network security solutions for enterprise networks are prohibitively expensive and unscalable for IoT. Integrating the recently developed Software-Defined Networking (SDN) paradigm removes a significant amount of computational overhead from IoT network devices and enables additional security measures. At this early stage of SDN-enabled IoT network infrastructure, sampling-based security approaches yield low accuracy and poor DDoS attack detection. In this paper, we propose an Adaptive Machine Learning based SDN-enabled Distributed Denial-of-Service attack Detection and Mitigation (AMLSDM) framework. The proposed AMLSDM framework develops an SDN-enabled security mechanism for IoT devices, supported by an adaptive machine learning classification model, to detect and mitigate DDoS attacks. The framework applies machine learning algorithms in an adaptive multilayered feed-forwarding scheme that detects DDoS attacks by examining static features of the inspected network traffic. In the first layer, Support Vector Machine (SVM), Naive Bayes (NB), Random Forest (RF), k-Nearest Neighbor (kNN), and Logistic Regression (LR) classifiers build a model for detecting DDoS attacks from environment-specific training and testing datasets. The output of the first layer passes to an Ensemble Voting (EV) algorithm, which aggregates the predictions of the first-layer classifiers. In the third layer, the adaptive framework examines real-time live network traffic to detect DDoS attacks.
The proposed framework uses a remote SDN controller to mitigate detected DDoS attacks over OpenFlow (OF) switches and reconfigures network resources for legitimate hosts. Experimental results show that the proposed framework outperforms existing state-of-the-art solutions, achieving higher DDoS detection accuracy and a lower false-alarm rate.
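The second-layer Ensemble Voting step can be illustrated with a simple majority vote over the per-classifier predictions. This is a sketch of the general EV idea only; the paper's exact accumulation scheme may weight the classifiers differently.

```python
from collections import Counter

def ensemble_vote(predictions):
    # predictions: one label list per base classifier (e.g. SVM, NB, RF, kNN, LR),
    # all aligned over the same traffic samples.
    voted = []
    for labels in zip(*predictions):
        # Majority label for this sample (Counter breaks ties by first seen).
        voted.append(Counter(labels).most_common(1)[0][0])
    return voted
```

With five base classifiers, a sample is flagged as a DDoS attack when at least three of them say so.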
The convergence of the Internet, sensor networks, and Radio Frequency Identification (RFID) systems has ushered in the concept of the Internet of Things (IoT), which connects everyday things and makes them smart through sensing, reasoning, and cooperating with other things. RFID technology additionally enables tracking an object and assigning it a unique ID. IoT has the potential for a wide range of applications in healthcare, the environment, transportation, smart cities, and more. The middleware is a basic component of the IoT architecture: it handles heterogeneity among IoT devices and provides a common framework for communication. More recently, interest has focused on developing publish/subscribe middleware systems for the IoT to allow asynchronous communication between IoT devices. The scope of our paper is to study routing protocols for publish/subscribe schemes, including content-based and context-based routing. We propose an Energy-Efficient Content-Based Routing (EECBR) protocol for the IoT that minimizes energy consumption. The proposed algorithm uses a virtual topology that is constructed in a centralized manner and then routes events from publishers to the interested subscribers in a distributed manner. EECBR has been simulated using OMNeT++. The simulation results show that EECBR performs significantly better in terms of energy variance than the other schemes.
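Content-based routing, as used in publish/subscribe middleware such as EECBR, delivers an event to exactly those subscribers whose registered filters match the event's content. A minimal sketch of the matching step follows; the attribute-equality filters and subscriber IDs are illustrative, and EECBR additionally routes matched events over an energy-aware virtual topology, which is not modeled here.

```python
def match(event, subscription):
    # A subscription here is a dict of attribute -> required value;
    # the event matches when every constrained attribute agrees.
    return all(event.get(k) == v for k, v in subscription.items())

def route(event, subscriptions):
    # Return the IDs of subscribers whose content filter matches the event.
    return [sid for sid, sub in subscriptions.items() if match(event, sub)]
```

Real systems generalize the filter language (ranges, predicates) but keep this publish-side matching structure.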
An electrocardiogram (ECG) is an essential medical tool for diagnosing various heart-related conditions in patients. Analyzing long-term ECG records in a short time is very challenging for cardiologists, so an automated, computer-based diagnostic tool is required to identify crucial episodes. Myocardial infarction (MI) and conduction disorders (CDs), sometimes known as heart blocks, are conditions that occur when a coronary artery becomes fully or suddenly blocked or when blood flow in these arteries slows dramatically. Several researchers have therefore applied deep learning methods to MI and CD detection. However, deep learning algorithms face one or more of the following challenges: (i) they struggle with real-life data; (ii) inference after the training phase also requires high processing power; (iii) they are computationally expensive, requiring large amounts of memory and computational resources, and are not easy to transfer to other problems; (iv) they are hard to interpret and not completely understood (black boxes); and (v) most of the literature is based on the MIT-BIH or PTB databases, which do not cover most of the crucial arrhythmias. This paper proposes a new approach combining deep learning and classical machine learning to detect MI and CDs using the large PTB-XL ECG dataset. First, the challenging characteristics of these heart signals are addressed: the signal data come from different datasets and are filtered. The MI and CD signals are then fed to the deep learning model to extract deep features. In addition, a new custom activation function is proposed that converges faster than the standard activation functions. Finally, the extracted features are fed to an external classifier, such as a support vector machine (SVM), for detection.
The experimental findings demonstrate the efficiency of the proposed method, which achieves an overall accuracy of 99.20% when using the CNN to extract features and an SVM classifier for detection.
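The two-stage design above — deep features handed to an external classical classifier — can be sketched with a toy stand-in. Here a nearest-centroid classifier plays the role of the SVM, and the "deep features" are plain vectors; in the actual pipeline they would come from the CNN's penultimate layer, and the classifier would be an SVM.

```python
import math

def nearest_centroid_fit(features, labels):
    # Stand-in for training the external classifier: compute one
    # centroid per class in the deep-feature space.
    sums, counts = {}, {}
    for f, y in zip(features, labels):
        s = sums.setdefault(y, [0.0] * len(f))
        for i, v in enumerate(f):
            s[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def nearest_centroid_predict(centroids, feature):
    # Classify a feature vector by its closest class centroid.
    return min(centroids, key=lambda y: math.dist(centroids[y], feature))
```

The point of the pattern is decoupling: the feature extractor is trained once, while the lightweight external classifier can be retrained or swapped cheaply.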
The cloud computing paradigm is evolving rapidly to address the challenges of new emerging paradigms, such as the Internet of Things (IoT) and fog computing. As a result, cloud services usage is increasing dramatically with the recent growth of IoT-based applications. To fulfill application requirements while efficiently harnessing cloud computing power, intelligent scheduling approaches are required to optimize the scheduling of IoT application tasks on computing resources. In this paper, the chimp optimization algorithm (ChOA) is combined with the marine predators algorithm (MPA) and a disruption operator to determine the optimal task schedule for IoT applications. The developed algorithm, called CHMPAD, aims to avoid entrapment in local optima and to improve the exploitation capability of the basic ChOA, addressing its main drawbacks. Experiments are conducted using synthetic and real workloads collected from the Parallel Workloads Archive to demonstrate the applicability and efficiency of the presented CHMPAD method. The simulation findings reveal that CHMPAD achieves average makespan improvements of 1.12–43.20% (synthetic workloads), 1.00–43.43% (NASA iPSC workloads), and 2.75–42.53% (HPC2N workloads) over peer scheduling algorithms. Further, our evaluation results suggest that our proposal can improve the throughput performance of fog computing.
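The objective CHMPAD optimizes — makespan, the finish time of the busiest machine — can be made concrete with a simple baseline. The longest-processing-time greedy heuristic below is only a reference point, not the CHMPAD algorithm; the task lengths and VM count are illustrative.

```python
def greedy_schedule(task_lengths, n_vms):
    # Longest-processing-time heuristic: take tasks in decreasing length
    # and assign each to the VM that currently finishes earliest.
    finish = [0.0] * n_vms
    assignment = []
    for t in sorted(task_lengths, reverse=True):
        i = min(range(n_vms), key=lambda v: finish[v])
        finish[i] += t
        assignment.append((t, i))
    # Makespan = completion time of the last VM to finish.
    return assignment, max(finish)
```

Metaheuristic schedulers such as CHMPAD search over full task-to-VM assignments and are compared against such baselines by the percentage reduction in this makespan value.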
Recently, the 6G-enabled Internet of Medical Things (IoMT) has played a key role in the development of functional health systems due to the massive amounts of data generated daily by hospitals. The automatic detection and prediction of future risks such as pneumonia and retinal diseases therefore remain under research and study, as traditional approaches have not yielded accurate diagnoses. In this paper, a robust 6G-enabled IoMT framework is proposed for medical image classification with an ensemble learning (EL)-based model. EL is achieved using MobileNet and DenseNet architectures as the feature extraction backbone. In addition, the developed framework uses a modified honey badger algorithm (HBA) based on Levy flight (LFHBA) as a feature selection method that removes irrelevant features from those extracted by the EL model. The chest X-ray (CXR) and optical coherence tomography (OCT) datasets were employed to evaluate the framework. The proposed technique achieved an accuracy of 87.10% on the CXR dataset and 94.32% on the OCT dataset, proving more accurate and efficient than other well-known and popular algorithms.
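Levy flight, the ingredient added to the honey badger algorithm here, produces occasional long jumps that help a metaheuristic escape local optima. A common way to draw Levy-distributed step sizes is Mantegna's algorithm, sketched below; this is a generic ingredient, and the paper's exact LFHBA update rule is not reproduced.

```python
import math
import random

def levy_step(beta=1.5, rng=random):
    # Mantegna's algorithm: a ratio of Gaussians whose variance is tuned
    # so the result follows a heavy-tailed Levy-stable distribution.
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.gauss(0, sigma)
    v = rng.gauss(0, 1)
    return u / abs(v) ** (1 / beta)
```

Most steps are small (local refinement), but the heavy tail occasionally yields a large jump — exactly the exploration behavior Levy-flight variants add to algorithms like HBA.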
The integration of the Internet of Things (IoT) with machine learning across disciplines has benefited from recent technological advancements. In medical IoT, the fusion of these two fields can be extremely beneficial: it allows the creation of a responsive, interconnected environment and offers a variety of services to medical professionals and patients. When disease forecasts are made early, doctors can make early decisions to save a patient's life. IoT sensors capture data from patients, and machine learning techniques analyze the data and predict the presence of a chronic disease, i.e., diabetes. The goal of this research is to build a smart patient health monitoring system, based on machine learning, that detects the presence of a chronic disease in a patient early and accurately. A diabetes dataset was used for the implementation. To detect the presence of the disease, six machine learning techniques were used: Support Vector Machine (SVM), Logistic Regression, Artificial Neural Network (ANN), Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), and Long Short-Term Memory (LSTM). The performance of the proposed model is evaluated using four metrics: accuracy, precision, recall, and F1-score. The RNN outperformed the remaining algorithms in terms of accuracy (81%), precision (75%), and F1-score (65%), while the ANN achieved a higher recall (56%) than the SVM, Logistic Regression, CNN, RNN, and LSTM. With this proposed patient health monitoring system, doctors will be able to diagnose the presence of the disease earlier.
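The four evaluation metrics reported above are all derived from the confusion-matrix counts. A minimal sketch for the binary case (label 1 = disease present):

```python
def binary_metrics(y_true, y_pred):
    # Confusion-matrix counts for binary labels.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    acc = (tp + tn) / len(y_true)
    prec = tp / (tp + fp) if tp + fp else 0.0       # of predicted positives, how many were right
    rec = tp / (tp + fn) if tp + fn else 0.0        # of actual positives, how many were found
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return {"accuracy": acc, "precision": prec, "recall": rec, "f1": f1}
```

Note that a model can lead on accuracy and precision yet trail on recall, which is exactly the RNN-versus-ANN pattern reported in the abstract.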