Image processing plays a vital role in many areas, such as healthcare, the military, science, and business, owing to its wide variety of advantages and applications. Detecting liver disease in computed tomography (CT) images is one of the difficult tasks in the medical field. Previous approaches to liver disease classification rely on two kinds of methods, hand-crafted features and classifiers, but their classification results are not optimal. In this article, we propose a novel method for liver disease classification that combines a deep belief network (DBN) with the grasshopper optimization algorithm (GOA). First, image quality is enhanced by preprocessing techniques, and then texture, color, and shape features are extracted. The extracted features are reduced with a dimensionality reduction method, principal component analysis (PCA). The DBN parameters are then optimized using GOA to recognize liver disease. Experiments are performed on real-time and open-source CT image datasets comprising normal, cyst, hepatoma, cavernous hemangioma, fatty liver, metastasis, cirrhosis, and tumor samples. In simulation, the proposed method yields 98% accuracy, 95.82% sensitivity, 97.52% specificity, 98.53% precision, and a 96.8% F1 score, outperforming other existing techniques. KEYWORDS: deep belief network (DBN), grasshopper optimization algorithm (GOA), liver disease classification, principal component analysis (PCA)
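The dimensionality-reduction step described above can be sketched as follows. This is a minimal, generic PCA over a feature matrix (rows = CT images, columns = extracted texture/color/shape features); the matrix shape and component count are illustrative assumptions, not values from the paper.

```python
import numpy as np

def pca_reduce(features, n_components):
    """Project feature vectors onto the top principal components."""
    # Center the data so the covariance is computed about the mean
    centered = features - features.mean(axis=0)
    # Eigendecomposition of the feature covariance matrix
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # eigh returns eigenvalues in ascending order; keep the largest ones
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return centered @ top

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 32))   # 100 images, 32 extracted features (assumed)
X_reduced = pca_reduce(X, 8)     # reduce to 8 principal components
print(X_reduced.shape)           # (100, 8)
```

The reduced matrix would then feed the DBN classifier, whose weights the paper tunes with GOA.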
Energy efficiency and delay are two important optimization issues in mobile ad hoc networks (MANETs), where nodes move randomly in any direction with limited battery life, resulting in frequent changes of network topology. In this article, a hybrid fruit fly optimization algorithm and whale optimization algorithm (FOA-WOA) is proposed for energy-efficient, delay-aware cluster head (CH) selection. The major objective of the proposed method is to address energy efficiency and delay jointly by developing a clustering mechanism. The performance of the hybrid FOA-WOA is evaluated in terms of packet delivery ratio (PDR), delay, energy consumption, and throughput, and is compared against two existing algorithms, ant colony optimization (ACO) and the genetic algorithm (GA). The experimental results show that the proposed method performs 11.6% better than ACO and 1.8% better than GA in packet delivery ratio, 57.6% better than ACO and 27.3% better than GA in delay, and 15.3% better than ACO and 36.4% better than GA in energy consumption.
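An energy- and delay-aware CH selection of the kind described can be illustrated with a simple scoring function. The weights and node attributes below are assumptions for illustration, not the paper's actual FOA-WOA fitness function, which would search over candidate CHs rather than score them directly.

```python
def ch_fitness(residual_energy, delay, w_energy=0.6, w_delay=0.4):
    """Higher residual energy and lower delay give a better (higher) score.
    Weights are illustrative assumptions."""
    return w_energy * residual_energy - w_delay * delay

def select_cluster_head(nodes):
    """Pick the best node; nodes = [(node_id, residual_energy, delay), ...]."""
    return max(nodes, key=lambda n: ch_fitness(n[1], n[2]))[0]

# Hypothetical nodes: (id, normalized residual energy, normalized delay)
nodes = [(1, 0.9, 0.2), (2, 0.7, 0.05), (3, 0.95, 0.5)]
print(select_cluster_head(nodes))  # node 1 balances energy and delay best
```

In the hybrid scheme, FOA-WOA would iteratively explore CH candidate sets and use a fitness of this general shape to converge on the selection.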
Brain medical image classification is an essential procedure in Computer-Aided Diagnosis (CAD) systems. Conventional methods depend exclusively on local or global features. Several fusion methods have also been developed; most are problem-specific but have proven highly favorable for medical images. However, they do not extract intensity-specific images. Recent deep learning methods offer an efficient means of designing an end-to-end model that produces the final classification from brain medical images, at the cost of normalization. To solve these classification problems, this paper presents a Histogram and Time-frequency Differential Deep (HTF-DD) method for medical image classification using brain Magnetic Resonance Images (MRI). The proposed method involves the following steps. First, a deep Convolutional Neural Network (CNN) is trained as a pooled feature map in a supervised manner, and the results it obtains are standardized, intensified, pre-processed features for extraction. Second, a set of time-frequency features is extracted from the time and frequency signals of the medical images to obtain time-frequency maps. Finally, an efficient model based on Differential Deep Learning is designed to obtain the different classes. The proposed model is evaluated on National Biomedical Imaging Archive (NBIA) images and validated in terms of computational time, computational overhead, and classification accuracy for varied brain MRI.
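The time-frequency map extraction in the second step can be sketched with a simple windowed FFT (a basic short-time Fourier transform). The window length, hop size, and test signal are illustrative assumptions; the paper's exact transform is not specified in the abstract.

```python
import numpy as np

def stft_magnitude(signal, win=64, hop=32):
    """Return a (frames x frequency-bins) magnitude map of a 1-D signal."""
    frames = []
    for start in range(0, len(signal) - win + 1, hop):
        # Hann window reduces spectral leakage at frame edges
        segment = signal[start:start + win] * np.hanning(win)
        frames.append(np.abs(np.fft.rfft(segment)))
    return np.array(frames)

t = np.linspace(0, 1, 512, endpoint=False)
sig = np.sin(2 * np.pi * 50 * t)   # 50 Hz test tone standing in for image-row data
tf_map = stft_magnitude(sig)
print(tf_map.shape)                # (15, 33): 15 frames x 33 frequency bins
```

Each row of the map localizes frequency content in time; stacking such maps over an image yields the kind of time-frequency features the differential deep model would then classify.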
Simultaneous wireless information and power transfer (SWIPT) has opened new opportunities for dealing with the energy shortage problem in wireless networks. Green transmission based on SWIPT for 5G cellular networks with mobile cloud access is examined here. Considered a potential future solution for extending battery life, SWIPT improves energy efficiency (EE). Wireless power transfer can supply sufficient resources to energy-constrained networks, with consequences for 5G and the Internet of Things (IoT); SWIPT supports energy efficiency and cooperative communication, enhancing the capacity, data rate, and quality of service of future networks. Beyond these criteria, it is also our responsibility to protect the environment by lowering the power usage of wireless networks, so green communication is a critical requirement. In this article, we examine a variety of strategies for power optimization in the upcoming 5G network. The main focus is the use of relays and microcells to enhance the network's energy efficiency, and the many relaying scenarios for next-generation networks are discussed.
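The SWIPT trade-off described above can be illustrated with a textbook power-splitting receiver: a fraction rho of the received power is harvested, and the remainder is used to decode information at the Shannon rate. All parameter values below are illustrative assumptions, not figures from the article.

```python
import math

def swipt_power_splitting(p_rx, rho, noise, bandwidth, eta=0.5):
    """Return (harvested power in W, achievable rate in bit/s).

    p_rx      -- received signal power (W)
    rho       -- power-splitting ratio sent to the energy harvester (0..1)
    noise     -- receiver noise power (W)
    bandwidth -- channel bandwidth (Hz)
    eta       -- energy-conversion efficiency of the harvester (assumed)
    """
    harvested = eta * rho * p_rx            # energy-harvesting branch
    snr = (1 - rho) * p_rx / noise          # information-decoding branch
    rate = bandwidth * math.log2(1 + snr)   # Shannon capacity of that branch
    return harvested, rate

# Illustrative link: 1 mW received, 30% split to harvesting, 1 MHz channel
h, r = swipt_power_splitting(p_rx=1e-3, rho=0.3, noise=1e-9, bandwidth=1e6)
print(h, r)
```

Raising rho harvests more energy but lowers the decoding SNR and hence the rate, which is exactly the EE trade-off that relay and microcell placement strategies try to optimize.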