Objective. The N2pc event-related potential (ERP) appears on the side of the scalp contralateral to the visual hemifield in which an object of interest is located. We explored the feasibility of using it, via single-trial classification, to extract information on the spatial location of targets in aerial images shown with a rapid serial visual presentation (RSVP) protocol. Methods. Images were shown to 11 participants at a presentation rate of 5 Hz while electroencephalographic signals were recorded. With the resulting ERPs, we trained linear classifiers for single-trial detection of target presence and location. We analysed the classifiers' decisions and their raw output scores on independent test sets, as well as the averages and voltage distributions of the ERPs. Results. The N2pc is elicited during RSVP of complex images and can be recognised in single trials (the median area under the receiver operating characteristic curve was 0.76 for left-vs-right classification). Moreover, the peak amplitude of this ERP correlates with the horizontal position of the target within an image. The N2pc varies significantly with handedness, and these differences can be used to discriminate participants by their preferred hand. Conclusion and Significance. The N2pc is elicited during RSVP of real complex images and carries analogue information from which the horizontal position of targets can be roughly inferred. Furthermore, differences in the N2pc due to handedness should be taken into account when creating collaborative brain-computer interfaces.
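The evaluation metric mentioned above, the area under the receiver operating characteristic curve (AUC), can be computed directly from a classifier's raw output scores. A minimal sketch follows, using a rank-based formulation (the probability that a randomly chosen positive trial outscores a randomly chosen negative one); the labels, scores, and coding of "right" as the positive class are illustrative assumptions, not the paper's data.

```python
# Sketch: rank-based AUC for single-trial left-vs-right classification.
# Convention assumed here: label 1 = target in the right hemifield,
# label 0 = left; scores are the linear classifier's raw outputs,
# with higher values meaning "more right-like".

def roc_auc(labels, scores):
    """Probability that a random positive trial outscores a random negative one.
    Ties count as half a win, matching the standard AUC definition."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example with made-up scores (not the paper's data):
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.7, 0.3, 0.4, 0.2, 0.1]
print(roc_auc(labels, scores))  # 8 of 9 positive/negative pairs correctly ordered
```

An AUC of 0.5 corresponds to chance-level discrimination, so the reported median of 0.76 indicates a clearly above-chance left-vs-right signal in single trials.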
The N2pc is a lateralised Event-Related Potential (ERP) that signals a shift of attention towards the location of a potential object of interest. We propose a single-trial target-localisation collaborative Brain-Computer Interface (cBCI) that exploits this ERP to automatically approximate the horizontal position of targets in aerial images. Images were presented using the rapid serial visual presentation technique at rates of 5, 6 and 10 Hz. We created three different cBCIs and tested a participant selection method in which groups are formed according to the similarity of participants' performance. The N2pc elicited in our experiments contains information about the position of the target along the horizontal axis. Moreover, combining information from multiple participants provides absolute median improvements in the area under the receiver operating characteristic curve of up to 21% (for groups of size 3) with respect to single-user BCIs. These improvements are larger when groups are formed by participants with similar individual performance, and much of this effect can be explained by simple theoretical models. Our results suggest that BCIs for automated triaging can be improved by integrating two classification systems: one devoted to target detection and another to detecting the attentional shifts associated with lateral targets.
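One common way to combine information from multiple participants in a cBCI is to fuse the raw classifier scores of the group members into a single group-level score, for example by a (possibly weighted) average. The sketch below illustrates this general idea only; the function name, the weighting scheme, and the numbers are illustrative assumptions, not the paper's exact fusion method.

```python
# Sketch: fusing single-user classifier outputs into a group (cBCI) decision
# by a weighted mean of the members' raw scores for the same trial.

def group_score(member_scores, weights=None):
    """Weighted mean of per-member classifier scores for one trial.
    With no weights, this is a plain average (equal trust in every member)."""
    if weights is None:
        weights = [1.0] * len(member_scores)
    total = sum(w * s for w, s in zip(weights, member_scores))
    return total / sum(weights)

# Three members' raw scores for one trial; the third member is (hypothetically)
# weighted more because their individual performance was higher.
print(group_score([0.2, 0.6, 0.8], weights=[1.0, 1.0, 2.0]))  # → 0.6
```

Averaging tends to suppress uncorrelated single-user noise, which is one simple theoretical account of why groups of similar-performing participants outperform single users.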
We explored the possibility of controlling a spacecraft simulator using an analogue Brain-Computer Interface (BCI) for 2-D pointer control. This is a difficult task, for which no previous attempt has been reported in the literature. Our system relies on an active display that produces event-related potentials (ERPs) in the user's brain; these are analysed in real time to produce control vectors for the user interface. In tests, users of the simulator were asked to pass as close as possible to the Sun. Performance was very promising: on average, users satisfied the simulation success criterion in 67.5% of runs. Furthermore, to study the potential of a collaborative approach to spacecraft navigation, we developed BCIs in which the system is controlled via the integration of the ERPs of two users. Performance analysis indicates that collaborative BCIs produce trajectories that are statistically significantly superior to those obtained by single users.
Hand movement is controlled by a large number of muscles acting on multiple joints in the hand and forearm. In a forearm amputee, control of a hand prosthesis traditionally depends on electromyography from the remaining forearm muscles. Technical improvements have made it possible to safely and routinely implant electrodes inside the muscles and record high-quality signals from individual muscles. In this study, we present a database of intramuscular EMG signals recorded with fine-wire electrodes, alongside recordings of hand forces in an isometric setup, supplemented with spike-sorted metadata. Six forearm muscles were recorded from twelve able-bodied subjects, and nine forearm muscles from two subjects. The fully automated recording protocol, based on command cues, comprised a variety of hand movements, including some requiring slowly increasing or decreasing force. The recorded data can be used to develop and test algorithms for control of a prosthetic hand. The signals were assessed both quantitatively and qualitatively.