Background: Non-invasive brain–computer interfaces (BCIs) have been developed to realize natural bi-directional interaction between users and external robotic systems. However, communication between users and BCI systems still relies on artificially matching commands to device actions, which is a critical issue. Recently, BCIs have adopted intuitive decoding, which is key to solving problems such as the small number of available classes and the manual matching of BCI commands to device controls. Unfortunately, advances in this area have been slow owing to the lack of large, uniform datasets. This study provides a large intuitive dataset for 11 different upper-extremity movement tasks obtained during multiple recording sessions. The dataset includes 60-channel electroencephalography, 7-channel electromyography, and 4-channel electro-oculography from 25 healthy participants collected over 3-day sessions, for a total of 82,500 trials across all participants.
Findings: We validated our dataset via neurophysiological analysis. We observed clear sensorimotor de-/activation and spatial distributions related to real movement and motor imagery, respectively. Furthermore, we demonstrated the consistency of the dataset by evaluating the classification performance of each session using a baseline machine learning method.
Conclusions: The dataset includes data from multiple recording sessions, various classes within a single upper extremity, and multimodal signals. This work can be used to (i) compare the brain activities associated with real movement and imagination, (ii) improve decoding performance, and (iii) analyze differences among recording sessions. Hence, this study, as a Data Note, has focused on collecting the data required for further advances in BCI technology.
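The abstract mentions checking cross-session consistency with a baseline machine learning method without naming it. A common BCI baseline is linear discriminant analysis (LDA) on band-power features; the sketch below is a minimal two-class LDA in NumPy on synthetic features, purely for illustration — the dataset's actual pipeline is not specified here.

```python
import numpy as np

def fit_lda(X, y):
    """Fit a two-class linear discriminant with a shared covariance."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class covariance, with a small ridge for stability.
    S = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    S += 1e-6 * np.eye(S.shape[0])
    w = np.linalg.solve(S, m1 - m0)   # projection direction
    b = -0.5 * w @ (m0 + m1)          # threshold at the class midpoint
    return w, b

def predict_lda(X, w, b):
    return (X @ w + b > 0).astype(int)

# Synthetic "band-power" features for two imagined-movement classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 4)),
               rng.normal(3.0, 1.0, (50, 4))])
y = np.repeat([0, 1], 50)

w, b = fit_lda(X, y)
acc = (predict_lda(X, w, b) == y).mean()
```

In a per-session consistency check, a model like this would be fit and evaluated on each session's trials separately and the accuracies compared.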
Non-invasive brain-computer interfaces (BCIs) have been developed to recognize human mental states with high accuracy and to decode various types of mental conditions. In particular, accurately decoding a pilot's mental state is a critical issue, as more than 70% of aviation accidents are caused by human factors such as fatigue or drowsiness. In this study, we report the classification of not only two mental states (i.e., alert and drowsy) but also five drowsiness levels from electroencephalogram (EEG) signals. To the best of our knowledge, this approach is the first to classify drowsiness levels in detail using only EEG signals. We acquired EEG data from ten pilots in a simulated night-flight environment. For accurate detection, we proposed a deep spatio-temporal convolutional bidirectional long short-term memory network (DSTCLN) model. We evaluated the classification performance using Karolinska Sleepiness Scale (KSS) values for the two mental states and the five drowsiness levels. The grand-averaged classification accuracies were 0.87 (±0.01) and 0.69 (±0.02), respectively. Hence, we demonstrated the feasibility of classifying five drowsiness levels with high accuracy using deep learning.
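The KSS is a 9-point self-report scale, so deriving both a binary alert/drowsy label and five drowsiness levels requires binning the scores. The abstract does not state the exact mapping used; the helper below is a hypothetical binning, with both the bin edges and the alert/drowsy threshold being assumptions.

```python
def kss_to_labels(kss):
    """Map a Karolinska Sleepiness Scale score (1-9) to a coarse
    drowsiness level (0-4) and a binary alert/drowsy state.

    Both the 5-level binning and the >= 6 drowsy threshold are
    illustrative assumptions, not the paper's stated mapping.
    """
    if not 1 <= kss <= 9:
        raise ValueError("KSS scores range from 1 to 9")
    level = min((kss - 1) // 2, 4)              # hypothetical 5-level binning
    state = "drowsy" if kss >= 6 else "alert"   # hypothetical threshold
    return level, state
```

For example, `kss_to_labels(1)` yields level 0 / "alert", while `kss_to_labels(9)` yields level 4 / "drowsy" under these assumed bins.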
The snapping shrimp sound is known to be a major biological noise source in ocean soundscapes in the coastal shallow waters of low and mid-latitudes where sunlight reaches. Several studies have been conducted to understand the activity of snapping shrimp through comparison with surrounding environmental factors. In this paper, we report an analysis of the sound produced by snapping shrimp inhabiting an area where sunlight rarely reaches. The acoustic measurements were taken in May 2015 using two 16-channel vertical line arrays (VLAs) moored at a depth of about 100 m, located ∼100 km southwest of Jeju Island, South Korea, as part of the Shallow-water Acoustic Variability Experiment (SAVEX-15). During the experiment, the underwater soundscape was dominated by broadband impulsive snapping shrimp noise, which is notable considering that snapping shrimp are commonly observed at very shallow depths of tens of meters or less, where sunlight can easily reach. To extract snapping events from the ambient noise data, an envelope-correlation algorithm combined with amplitude-threshold detection was applied, and the sea-surface-bounced path was then filtered out using the kurtosis of the waveform to avoid double-counting in snap-rate estimates. The analysis of the ambient noise data recorded over 5 consecutive days indicated that the snap rate fluctuated with a strong one-quarter-diurnal variation between 200 and 1,200 snaps per minute, which is distinguished from the periodicity of the snap rate reported in the euphotic zone. The temporal variation in the snap rate is compared with several environmental factors such as water temperature, tidal level, and current speed. It is found that the snap rate has a significant correlation with the current speed, suggesting that snapping shrimp living in areas with little sunlight might change their snapping behavior in response to changes in current speed.
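The core of the snap-rate estimate is detecting impulsive events that exceed an amplitude threshold. The sketch below implements only that threshold-detection step on a synthetic record; the envelope correlation across the array and the kurtosis-based rejection of surface-bounced arrivals described in the abstract are omitted, and all signal parameters are invented for illustration.

```python
import numpy as np

def detect_snaps(x, fs, threshold, min_sep_s=0.01):
    """Amplitude-threshold snap detector (simplified sketch).

    Returns sample indices of detected snap events; crossings closer
    than min_sep_s seconds are merged into a single event.
    """
    env = np.abs(x)                       # crude envelope
    above = np.flatnonzero(env > threshold)
    if above.size == 0:
        return np.array([], dtype=int)
    min_sep = int(min_sep_s * fs)
    keep = np.concatenate(([True], np.diff(above) > min_sep))
    return above[keep]

# Synthetic 1-s record: Gaussian background noise plus three "snaps".
fs = 10_000
rng = np.random.default_rng(1)
x = 0.05 * rng.standard_normal(fs)
for t in (0.2, 0.5, 0.8):
    x[int(t * fs)] += 1.0
snaps = detect_snaps(x, fs, threshold=0.5)
```

Counting detections per minute on successive records would yield the snaps-per-minute time series whose variation the study compares against water temperature, tide, and current speed.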
Non-invasive brain-computer interfaces (BCIs) have been developed to recognize and classify human mental states with high performance. Specifically, accurately classifying pilots' mental states is a critical issue because their cognitive states, which are induced by mental fatigue, workload, and distraction, can be a fundamental factor in catastrophic accidents. In this study, we present an EEG-based classification of four mental states (fatigue, workload, distraction, and the normal state) in both offline and pseudo-online analyses. To the best of our knowledge, this study is the first attempt to classify pilots' mental states using only electroencephalogram (EEG) signals during continuous decoding. We recorded EEG signals from seven pilots under various simulated flight conditions. We proposed a multiple-feature-block-based convolutional neural network (MFB-CNN) with temporal-spatial EEG filters to recognize a pilot's current mental state. We validated the proposed method in two analyses across all subjects. In the offline analysis, we confirmed a classification accuracy of 0.75 (±0.04), and in the pseudo-online analysis, we obtained detection accuracies of 0.72 (±0.20), 0.72 (±0.27), and 0.61 (±0.18) for fatigue, workload, and distraction, respectively. Hence, we demonstrate the feasibility of classifying various types of mental states for implementation in real-world environments.
INDEX TERMS: Brain-computer interface (BCI), electroencephalogram (EEG), mental states, deep convolutional neural network.
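Pseudo-online analysis typically means feeding a classifier overlapping windows sliced from the continuous recording, as if the signal were streaming. The sketch below shows only that windowing step in NumPy; the channel count, sampling rate, and window/step lengths are illustrative assumptions, not values stated in the abstract.

```python
import numpy as np

def sliding_windows(eeg, fs, win_s=1.0, step_s=0.5):
    """Segment continuous multichannel EEG (channels x samples) into
    overlapping windows for pseudo-online decoding.

    Window and step lengths here are assumed values for illustration.
    """
    win, step = int(win_s * fs), int(step_s * fs)
    n_samples = eeg.shape[1]
    starts = range(0, n_samples - win + 1, step)
    return np.stack([eeg[:, s:s + win] for s in starts])

# 10 s of simulated 8-channel EEG at an assumed 250 Hz sampling rate.
fs = 250
eeg = np.random.default_rng(2).standard_normal((8, 10 * fs))
windows = sliding_windows(eeg, fs)
```

Each window would then be passed through the trained classifier in turn, producing the continuous stream of mental-state decisions that the pseudo-online accuracies summarize.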