Major depressive disorder (MDD) is a mood disorder not usually associated with vision problems, yet recent research has found reduced levels of the inhibitory neurotransmitter GABA in the occipital cortex of MDD patients. Aim. The aim of this research is to evaluate mental workload through a single-channel electroencephalogram (EEG) approach during a visual-motor activity and to compare the parameters between depressive disorder patients and a control group. Method. Two trials of a visual-motor task similar to mirror drawing were performed in this study to compare the visual information processing of patients with depression to that of a control group. The current study examines the accuracy of monitoring cognitive load with single-channel portable EEG equipment. Results. The alteration of frontal brain activity in reaction to fluctuations in cognitive load generated by various visual-motor tasks was examined. Using a computerized visual-motor activity analogous to mirror-image drawing, we found that the complexity of the path to be drawn was more important than the actual time required to accomplish the job in determining perceived difficulty in depressive disorder patients. The overall perceived difficulty of the exercise is positively correlated with EEG activity measured from the motor cortex region at the start of each trial. Average task-completion ratings were observed for depression patients and the control group; no statistically significant association was reported between the rating scale and time spent on each trial (p = 1.43) for the control group, while among depression patients the normalized perceived-difficulty rating had correlations of 0.512, 0.623, and 0.821 with the length of the pathway, the number of inclinations in the pathway, and the time spent to complete each trial, respectively (p < 0.0001).
The findings imply that alterations in relative cognitive load during a visual-motor activity considerably modify the frontal EEG spectrum. Conclusion. Patients with depression perceived the optical illusion in the arrays as weaker, resulting in a slightly bigger disparity than in individuals not diagnosed with depression. This discovery sheds light on the prospect of adopting user-friendly mobile EEG technology to assess mental workload in everyday life.
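The correlation analysis reported above can be reproduced with a plain Pearson correlation coefficient. The sketch below is illustrative only: the per-trial numbers are hypothetical placeholders, not the study's data.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-trial measurements (placeholders, not the study's data):
path_length  = [12.0, 18.5, 25.1, 30.4, 41.2]   # length of the pathway, cm
inclinations = [2, 4, 5, 7, 9]                   # number of inclinations in the path
difficulty   = [1.2, 2.0, 2.4, 3.1, 4.0]         # normalized perceived-difficulty rating

print(round(pearson(path_length, difficulty), 3))
print(round(pearson(inclinations, difficulty), 3))
```

In practice a library routine such as `scipy.stats.pearsonr` would also return the p-value alongside the coefficient.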
Over the last decade, the healthcare sector has accelerated its digitization and adoption of electronic health records (EHRs). As information technology progresses, the notion of intelligent health is also gaining popularity. By combining technologies such as the Internet of Things (IoT) and artificial intelligence (AI), intelligent healthcare modifies and enhances traditional medical systems in terms of efficiency, service, and personalization. On the other hand, intelligent healthcare systems are highly vulnerable to data breaches and other malicious attacks. Recently, blockchain technology has emerged as a potentially transformative option for enhancing data management, access control, and integrity inside healthcare systems. Integrating these advanced approaches in agriculture is likewise critical for managing food supply chains, drug supply chains, quality maintenance, and intelligent prediction. This study reviews the literature, formulates a research topic, and analyzes the applicability of blockchain to the agriculture/food industry and healthcare, with a particular emphasis on AI and IoT. This article summarizes research on the newest blockchain solutions paired with AI technologies for strengthening and inventing new technological standards for healthcare ecosystems and the food industry.
Nowadays, the demand for low-cost, compact, interference-rejecting antennas with ultrawideband capability has increased. Metamaterial-inspired loaded structures can provide exceptional solutions for short-range wireless communication with low power consumption while transmitting and receiving signals. It is a difficult task to construct ideal metamaterial-inspired antennas with a variety of features such as extremely large bandwidth, notching out undesirable bands, and the desired operating frequency. Metamaterial-inspired structures such as the split-ring resonator (SRR), the complementary split-ring resonator (CSRR), and the triangle-shaped complementary split-ring resonator (TCSRR) are the most commonly used structures to achieve optimized characteristics in ultrawideband antennas. In this paper, an extensive literature survey is carried out to develop a conception of metamaterial-inspired patch antennas. This review elucidates variants of metamaterial-inspired structures/resonators utilized to serve sundry applications such as WiMAX, WLAN, satellite communication, and radar. Various researchers have used different methodologies to design, simulate, and analyze metamaterial-inspired structure-loaded antennas. The results of different metamaterial-inspired antennas, such as bandwidth, gain, return loss, and resonant frequency, are also presented in this paper. This manuscript also gives a brief introduction to metamaterials, their types, and their application in microstrip patch antennas over the last decade, and throws light on the various studies conducted in the field of metamaterial-inspired antennas in the past. It has been seen that with the inclusion of metamaterial in a conventional antenna, various characteristics such as impedance bandwidth, reflection coefficient, gain, and directivity are improved. Frequency rejection of narrow bands that exist within the ultrawideband frequency range can also be achieved by embedding metamaterial-inspired structures such as the SRR and CSRR.
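The resonant-frequency behavior the survey tabulates can be illustrated with the standard transmission-line model of a plain rectangular microstrip patch (not any specific antenna from the reviewed papers). The sketch below computes patch dimensions for an assumed 2.4 GHz design on FR-4 (εr = 4.4, h = 1.6 mm); all parameter values are illustrative assumptions.

```python
import math

C = 3e8  # speed of light, m/s

def design_patch(f0, eps_r, h):
    """Rectangular patch W and L for resonance at f0, via the
    standard transmission-line model with fringing-field correction."""
    w = C / (2 * f0) * math.sqrt(2 / (eps_r + 1))              # patch width
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 * (1 + 12 * h / w) ** -0.5
    # effective length extension due to fringing fields (Hammerstad's formula)
    dl = 0.412 * h * ((eps_eff + 0.3) * (w / h + 0.264)) / \
         ((eps_eff - 0.258) * (w / h + 0.8))
    l = C / (2 * f0 * math.sqrt(eps_eff)) - 2 * dl             # patch length
    return w, l

w, l = design_patch(2.4e9, eps_r=4.4, h=1.6e-3)
print(f"W = {w * 1e3:.1f} mm, L = {l * 1e3:.1f} mm")
```

Loading such a patch with SRR/CSRR cells then perturbs this baseline resonance to widen bandwidth or notch out bands, which is what the surveyed designs exploit.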
Cancer is one of the top causes of death globally. Recently, microarray gene expression data has been used to aid in the effective and early detection of cancer. The use of DNA microarray technology to uncover information from the expression levels of thousands of genes holds enormous promise: the technique can determine the levels of thousands of genes simultaneously in a single experiment. The analysis of gene expression is critical in many disciplines of biological study for obtaining the necessary information. This study analyses the research focused on optimizing gene selection for cancer detection using artificial intelligence. One of the most challenging issues is figuring out how to extract meaningful information from massive databases. Deep learning architectures have performed efficiently in numerous sectors and are used to diagnose many chronic diseases and to assist physicians in making medical decisions. In this study, we have evaluated the results of different optimizers on an RNA sequence dataset. The deep learning algorithm proposed in the study classifies five different forms of cancer: kidney renal clear cell carcinoma (KIRC), breast invasive carcinoma (BRCA), lung adenocarcinoma (LUAD), prostate adenocarcinoma (PRAD), and colon adenocarcinoma (COAD). The performance of different optimizers, including stochastic gradient descent (SGD), root mean squared propagation (RMSProp), adaptive gradient (AdaGrad), and adaptive moment estimation (Adam), was compared. The experimental results gathered on the dataset affirm the effectiveness of AdaGrad and Adam. The performance analysis was also carried out using different learning rates and decay rates. This study discusses current advancements in deep learning-based gene expression data analysis using optimized feature selection methods.
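The optimizer update rules compared in the abstract can be written out directly. Below is a minimal, self-contained sketch (not the study's model or dataset) that applies the SGD, AdaGrad, and Adam update rules to a toy one-parameter quadratic loss; the learning rate and other hyperparameters are illustrative assumptions.

```python
import numpy as np

def grad(w):            # gradient of the toy loss f(w) = (w - 3)^2
    return 2.0 * (w - 3.0)

def run(optimizer, steps=200, lr=0.1):
    """Minimize the toy loss with one of the three update rules."""
    w, g2, m, v = 0.0, 0.0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        if optimizer == "sgd":                       # plain gradient step
            w -= lr * g
        elif optimizer == "adagrad":                 # per-step shrinking rate
            g2 += g * g
            w -= lr * g / (np.sqrt(g2) + 1e-8)
        elif optimizer == "adam":                    # bias-corrected moments
            b1, b2 = 0.9, 0.999
            m = b1 * m + (1 - b1) * g
            v = b2 * v + (1 - b2) * g * g
            m_hat = m / (1 - b1 ** t)
            v_hat = v / (1 - b2 ** t)
            w -= lr * m_hat / (np.sqrt(v_hat) + 1e-8)
    return w

for name in ("sgd", "adagrad", "adam"):
    print(name, round(run(name), 4))
```

Even on this toy problem the qualitative difference is visible: AdaGrad's accumulated squared gradients shrink its effective step size over time, while Adam's bias-corrected moment estimates keep it moving at roughly the nominal rate.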
<abstract> <p>One of the most effective approaches for identifying breast cancer is histology, the meticulous inspection of tissues under a microscope. Whether the cells in a tissue sample are cancerous (malignant) or non-cancerous (benign) is typically determined by the type of tissue examined by the technician performing the test. The goal of this study was to automate the classification of invasive ductal carcinoma (IDC) within breast cancer histology samples using a transfer learning technique. To improve our outcomes, we combined Gradient-weighted Class Activation Mapping (Grad-CAM) and an image coloring mechanism with a discriminative fine-tuning methodology employing a one-cycle policy using fastai techniques. There have been many research studies on deep transfer learning that use the same mechanism, but this report uses a transfer learning mechanism based on the lightweight SqueezeNet architecture, a variant of the convolutional neural network (CNN). This strategy demonstrates that fine-tuning SqueezeNet makes it possible to achieve satisfactory results when transferring generic features from natural images to medical images.</p> </abstract>
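The discriminative fine-tuning and one-cycle policy mentioned above boil down to two simple schedules: layer groups closer to the input get smaller learning rates, and the learning rate within a training run warms up and then anneals. The sketch below is a generic illustration of both ideas; the default values and cosine interpolation are assumptions, not fastai's exact internals.

```python
import math

def cos_interp(start, end, t):
    """Cosine interpolation from `start` (t=0) to `end` (t=1)."""
    return end + (start - end) * (1 + math.cos(math.pi * t)) / 2

def one_cycle_lr(step, total, lr_max=1e-3, pct_start=0.25, div=25.0, final_div=1e4):
    """Learning rate at `step` under a one-cycle schedule:
    warm up from lr_max/div to lr_max, then anneal to lr_max/final_div."""
    warm = int(total * pct_start)
    if step < warm:
        return cos_interp(lr_max / div, lr_max, step / max(warm, 1))
    return cos_interp(lr_max, lr_max / final_div, (step - warm) / max(total - warm, 1))

def discriminative_lrs(lr, n_groups=3, factor=2.6):
    """Smaller learning rates for earlier layer groups, as in
    discriminative fine-tuning; the last group trains at `lr`."""
    return [lr / factor ** (n_groups - 1 - i) for i in range(n_groups)]

print([round(one_cycle_lr(s, 100), 6) for s in (0, 25, 100)])
print(discriminative_lrs(1e-3))
```

During fine-tuning, the earliest groups (generic edge/texture features transferred from natural images) barely move, while the newly added head adapts quickly to the histology data.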
Kidney failure occurs whenever the kidney stops operating properly and is unable to cleanse or filter the bloodstream as it should. Chronic kidney disease (CKD) is a potentially fatal consequence. If this condition is diagnosed early, its progression can be delayed. Various factors increase the likelihood of developing kidney failure; to detect this potentially fatal condition early, these risk factors must be checked regularly before the individual's health deteriorates. Early detection also lowers the cost of therapy. In this work, chronic kidney (renal) disease is recognized using fuzzy and adaptive neuro-fuzzy inference systems. The fundamental purpose of this initiative is to enhance the precision of the medical diagnostics used to detect the illness. Nephron functioning, glucose levels, systolic and diastolic blood pressure, maturity level, weight and height, and smoking are all input factors considered in developing the fuzzy and adaptive neuro-fuzzy inference systems. Based on these input factors, the output variable describes a specific patient's stage of chronic renal disease: stage 1, stage 2, stage 3, stage 4, or stage 5. The outcome shows the present stage of a patient's kidney disease. As a result, these methods can assist specialists in determining the stage of chronic renal disease. MATLAB software is used to create the fuzzy and neuro-fuzzy inference systems.
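The staging logic can be illustrated with a single-input fuzzy sketch. This is a drastically simplified stand-in for the paper's multi-input fuzzy/ANFIS models: one hypothetical input (an eGFR-like nephron-function score) with triangular membership functions whose breakpoints loosely follow the conventional CKD stage boundaries; the membership shapes are assumptions.

```python
def tri(x, a, b, c):
    """Triangular membership function: rises over [a, b], falls over [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical membership functions over an eGFR-like score
# (mL/min/1.73 m^2), loosely following the standard CKD stage cut-offs.
STAGES = {
    "stage 5": (-1, 0, 15),
    "stage 4": (10, 22, 35),
    "stage 3": (25, 45, 65),
    "stage 2": (55, 75, 95),
    "stage 1": (85, 105, 1000),
}

def ckd_stage(egfr):
    """Return the stage whose membership degree is highest for this input."""
    scores = {s: tri(egfr, *abc) for s, abc in STAGES.items()}
    return max(scores, key=scores.get)

print(ckd_stage(8), ckd_stage(50), ckd_stage(100))
```

A full Mamdani or ANFIS system would combine several such fuzzified inputs (blood pressure, glucose, age, smoking) through a rule base before defuzzifying to a stage, which is what the paper builds in MATLAB.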
Modern healthcare is a data-intensive domain representing an amalgamation of long-term electronic medical records, real-time patient-monitoring data, and, more recently, sensor data from wearable computing. Blockchain can address a multitude of challenges in healthcare, including care coordination, data security, and interoperability concerns, as the technology advances. Technical challenges such as processing speed and massive data duplication will be resolved as the technology improves. This data needs to be accessed seamlessly by a multitude of players, from general physicians to hospitals, and from medical service providers to insurance companies. Thus, healthcare-related data needs to be verified, securely stored, and shared while maintaining patient privacy and control over what portion of the data is shared, with whom it is shared, and how it is consumed. Blockchain has emerged as a technology stack of choice for distributed authentication, secure storage, and automated analysis of stored data in diverse domains, including healthcare. Its distributed nature is a natural fit for the healthcare ecosystem, with multiple participating entities and patients in different geographic locations. In this paper, we review the application of blockchain technology to the healthcare domain, analyzing and classifying work done in the field. Open challenges are identified, and future directions for research are presented.
Multiagent systems have become a hot issue intersecting multiple domains as artificial intelligence technology has advanced in the industrial sector. Formation control of multiagent systems has lately undergone a great deal of academic research and has found applications in a wide range of fields, including drag reduction, monitoring, telecommunications relay, and searching. Based on a collaborative control strategy for multiagent systems, a matrix-theoretic method using algebraic graph theory for automatic control principles is proposed to study the application of the PID control method in human-machine cluster multiagent systems. The method first determines the stability set of the low-order controller parameters for a multidelay single-input single-output system with complex coefficients; through matrix theory, the multiagent system is decomposed into multiple subsystems and the problem is transformed into a subsystem stability analysis problem, thereby reducing the complexity of the system. It has been shown that the power system adopted by the UAV hardware platform can achieve a UAV flight time of up to 24.7 min with an additional payload of up to 1.5 kg. Based on simulation analysis of the PID control algorithm, the PID parameters of the UAV are tuned, which improves parameter-tuning efficiency and enhances the UAV's response speed to error and the stability of its motion. The experimental findings suggest that the UAV hardware platform in the UAV control system has high dynamic performance. Good PID settings allow the UAV to respond to control orders quickly and correctly; moreover, the data received by the sensors decrease the complexity of operating several UAVs simultaneously and minimize the operators' burden.
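The PID loop at the heart of the tuning discussion above can be sketched in a few lines. The plant below is a hypothetical first-order system standing in for one UAV control axis (not the paper's UAV dynamics), and the gains are illustrative, not the tuned values from the study.

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt                 # accumulate for the I term
        deriv = (err - self.prev_err) / self.dt        # finite-difference D term
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Hypothetical first-order plant x' = -x + u (forward-Euler simulation),
# standing in for one attitude axis; gains are illustrative.
pid, x, dt = PID(kp=2.0, ki=1.0, kd=0.1, dt=0.01), 0.0, 0.01
for _ in range(2000):
    u = pid.step(1.0, x)       # drive the state toward the setpoint 1.0
    x += (-x + u) * dt
print(round(x, 3))
```

The integral term removes steady-state error while the derivative term damps the response, which is exactly the trade-off the parameter-tuning simulation in the paper explores.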