We present a non-invasive method that detects unilateral hand-motor brain activity in one individual and subsequently stimulates the somatosensory area of another, thereby establishing a remote link between corresponding brain hemispheres in humans. Healthy participants were paired as a sender and a receiver. The sender performed a motor imagery task with either the right or the left hand, and the associated changes in the electroencephalogram (EEG) mu rhythm (8–10 Hz) originating from either hemisphere were programmed to move a computer cursor toward a target that appeared on either the left or the right side of the screen. When the cursor reached its target, the outcome was transmitted to another computer over the internet and actuated focused ultrasound (FUS) devices that selectively and non-invasively stimulated either the right or the left hand somatosensory area of the receiver. Small FUS transducers allowed stimulatory ultrasonic waves to be administered independently to each somatosensory area. The stimulation elicited a unilateral tactile sensation in the receiver's hand, thus establishing a hemispheric brain-to-brain interface (BBI). Although task accuracy varied, six pairs of volunteers performed the BBI task with high accuracy, transferring approximately eight commands per minute. Linking hemispheric brain activity across individuals suggests the possibility of expanding the information bandwidth of BBIs.
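The mu-rhythm decoding step can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, band-power estimator, and decision rule are assumptions. It exploits the fact that motor imagery suppresses mu power over the contralateral motor cortex (event-related desynchronization), so comparing mu-band power at left- and right-hemisphere electrodes (e.g. C3 vs. C4) indicates which hand was imagined.

```python
import numpy as np

FS = 250           # EEG sampling rate in Hz (assumed)
MU_BAND = (8, 10)  # mu-rhythm band used in the study

def mu_band_power(signal, fs=FS, band=MU_BAND):
    """Mean spectral power of `signal` within the mu band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].mean()

def cursor_command(c3_signal, c4_signal):
    """Motor imagery desynchronizes the contralateral mu rhythm, so the
    hemisphere with LOWER mu power reveals the imagined hand: lower power
    at C3 (left hemisphere) implies right-hand imagery, and vice versa."""
    if mu_band_power(c3_signal) < mu_band_power(c4_signal):
        return "right"
    return "left"
```

In the actual system the resulting command moved the cursor toward the target and, on success, triggered the FUS stimulation on the receiver's side.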
Side effects occur when analgesic doses are too high or too low relative to the amount required to manage the pain induced during surgery, so it is important to assess a patient's pain level accurately during the operation. We proposed a pain classifier based on a deep belief network (DBN) using photoplethysmography (PPG). Our DBN learned a complex nonlinear relationship between extracted PPG features and pain status graded on the numeric rating scale (NRS). A bagging ensemble model was used to improve classification performance. The DBN classifier gave better results than multilayer perceptron neural network (MLPNN) and support vector machine (SVM) models, and classification performance improved further when the selective bagging model was applied rather than any single-model classifier. A DBN-based pain classifier with selective bagging can be helpful in developing a pain classification system.
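The bagging idea behind the ensemble can be sketched generically: train each base classifier on a bootstrap resample of the data and combine predictions by majority vote. The sketch below uses one-dimensional threshold stumps as stand-in base learners purely for illustration; the paper's actual base models are DBNs, and the "selective" variant is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

def train_stump(x, labels):
    """Fit a 1-D threshold classifier by brute-force search over thresholds."""
    best_thr, best_acc = x[0], -1.0
    for thr in np.unique(x):
        acc = np.mean((x >= thr).astype(int) == labels)
        if acc > best_acc:
            best_thr, best_acc = thr, acc
    return best_thr

def bagging_fit(x, labels, n_models=15):
    """Train each base classifier on a bootstrap resample (with replacement)."""
    thresholds = []
    for _ in range(n_models):
        idx = rng.integers(0, len(x), len(x))   # bootstrap sample indices
        thresholds.append(train_stump(x[idx], labels[idx]))
    return thresholds

def bagging_predict(thresholds, x_new):
    """Majority vote across the base classifiers."""
    votes = sum(int(x_new >= thr) for thr in thresholds)
    return int(votes > len(thresholds) / 2)
```

Averaging over bootstrap-trained models reduces the variance of any single classifier, which is the performance gain the abstract reports.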
Objectives: The phase characteristics of the representative frequency components of the electroencephalogram (EEG) can be a means of understanding the brain functions underlying human sensation and perception. In this study, we show experimentally that the visual evoked potential (VEP) is composed of the dominant multi-band component signals of the EEG.
Methods: We analyzed the characteristics of the VEP based on the theory that brain evoked potentials can be decomposed into phase-synchronized signals. To decompose the EEG into its frequency component signals, we extracted signals in the time-frequency domain with high resolution using empirical mode decomposition, applied the Hilbert transform (HT), and synthesized the results into frequency-band signals representing the VEP components. The VEP could be decomposed into phase-synchronized δ, θ, α, and β frequency signals. We investigated visual brain function by analyzing the amplitude and latency of the decomposed signals phase-synchronized with the VEP, as well as the phase-locking value (PLV) between brain regions.
Results: In response to visual stimulation, PLV values were higher in the posterior lobe region than in the anterior lobe, and in the occipital region the theta-band PLV was especially high.
Conclusions: VEP signals decomposed into constituent frequency components through phase analysis can be used to analyze the relationship between activated signals and brain function related to visual stimuli.
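The PLV computation described in the Methods can be sketched as follows; the FFT-based analytic-signal construction is the standard equivalent of the Hilbert transform, and the signal lengths and bands are illustrative assumptions rather than the authors' parameters.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (equivalent to applying the Hilbert
    transform): zero the negative frequencies, double the positives."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * h)

def phase_locking_value(x, y):
    """PLV between two narrow-band signals: the magnitude of the mean
    phase-difference vector. 1 = perfectly phase-locked, near 0 = random."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return np.abs(np.mean(np.exp(1j * dphi)))
```

Computing this value per frequency band and per electrode pair yields the regional comparisons reported in the Results (posterior vs. anterior, theta band in the occipital region).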
This paper presents a brain-machine interface (BMI)-based motor rehabilitation system that detects the correct onset time of motor intention using the Microsoft Kinect™. Detecting motor intention without time delay is a key challenge in real-time BMI-based rehabilitation because of muscle spasticity in patients. To circumvent the time-delay problem, we added a Kinect to the BMI rehabilitation system to calibrate the onset time of the brain signal and assess the behavioral pattern of motor activity. We demonstrate that onset-time calibration of the brain response by the Kinect-BMI rehabilitation system can minimize time delay and estimate motor intention before the user's actual movement. In addition, visual interaction through the Kinect may increase engagement with a given task and could positively encourage motor recovery in patients.
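The calibration step relies on timestamping the actual movement onset from the Kinect skeletal stream and comparing it with the EEG-derived intention time. A minimal sketch, assuming a 30 Hz skeletal stream and a simple joint-speed threshold (both illustrative choices, not the paper's method):

```python
import numpy as np

FS = 30  # Kinect skeletal stream frame rate in Hz (assumed)

def movement_onset(joint_positions, fs=FS, vel_threshold=0.05):
    """Time (s) of the first frame whose hand-joint speed exceeds
    `vel_threshold` (m/s); None if no movement is detected.
    `joint_positions` is an (n_frames, 3) array of x, y, z coordinates."""
    speed = np.linalg.norm(np.diff(joint_positions, axis=0), axis=1) * fs
    idx = int(np.argmax(speed > vel_threshold))
    if not speed[idx] > vel_threshold:
        return None
    return (idx + 1) / fs

def intention_lead_time(eeg_onset, kinect_onset):
    """Positive values mean the EEG motor intention preceded the movement,
    which is the lead the system exploits to minimize time delay."""
    return kinect_onset - eeg_onset
```

Repeating this measurement over trials gives a per-patient calibration of how far the brain response leads the movement, compensating for spasticity-induced delays.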
Emotion affects many parts of human life, such as learning ability, behavior, and judgment, and is therefore important for understanding human nature. Emotion cannot be observed directly; it can only be inferred from facial expressions or gestures. It is particularly difficult to classify, not only because individuals experience emotion differently but also because visually induced emotion is not sustained over the whole testing period. To address this, we acquired bio-signals and extracted features from them that offer objective information about the emotion stimulus. The emotion pattern classifier was composed of an unsupervised learning algorithm with hidden nodes and feature vectors. A restricted Boltzmann machine (RBM), based on probability estimation, was used for the unsupervised learning and maps the emotion features into a transformed space. Emotion was then characterized by nonlinear classifiers with hidden nodes in a multilayer neural network, namely a deep belief network (DBN). The accuracy of the DBN (about 94%) was better than that of a back-propagation neural network (about 40%), showing that the DBN performs well as an emotion pattern classifier.
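The unsupervised RBM step can be sketched as a Bernoulli RBM trained with one-step contrastive divergence (CD-1), the standard procedure for pre-training DBN layers. The layer sizes, learning rate, and training loop below are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

class RBM:
    """Minimal Bernoulli restricted Boltzmann machine trained with CD-1."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.01, (n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def hidden_probs(self, v):
        return self._sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return self._sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0):
        """One contrastive-divergence (CD-1) update on batch v0;
        returns the mean squared reconstruction error."""
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sample hidden
        v1 = self.visible_probs(h0)                       # reconstruction
        ph1 = self.hidden_probs(v1)
        self.W += self.lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (ph0 - ph1).mean(axis=0)
        return float(np.mean((v0 - v1) ** 2))
```

Stacking such layers and fine-tuning with labels yields the DBN classifier; the hidden activations play the role of the transformed emotion-feature space described in the abstract.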