In recent years, the number of portable, low-cost electroencephalographic (EEG) systems available to researchers has increased. However, to date, validation of low-cost EEG systems has focused on continuous recording of EEG data and/or on replicating large-system EEG setups that rely on event markers to enable examination of event-related brain potentials (ERPs). Here, we demonstrate that it is possible to conduct ERP research without relying on event markers, using a portable MUSE EEG system and a single computer. Specifically, we report the results of two experiments using data collected with the MUSE EEG system: one using the well-known visual oddball paradigm and the other using a standard reward-learning task. Our results demonstrate that we could observe and quantify the N200 and P300 ERP components in the visual oddball task and the reward positivity (the mirror opposite of the feedback-related negativity) in the reward-learning task. Single-sample t-tests of component existence (all p's < 0.05), Bayesian credible intervals, and 95% confidence intervals all statistically verified the existence of the N200, P300, and reward positivity in all analyses. Alongside this paper we provide an open-source website with all the instructions, methods, and software needed to replicate our findings and to give researchers an easy way to use the MUSE EEG system for ERP research. Importantly, our work highlights that, with a single computer and a portable EEG system such as the MUSE, one can conduct ERP research with ease, thus greatly extending the possible use of the ERP methodology to a variety of novel contexts.
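The single-sample t-test of component existence mentioned above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' analysis code: the function name `one_sample_t`, the participant count of ten, and the amplitude values are all hypothetical, standing in for per-participant difference-wave amplitudes (e.g., oddball minus standard at the P300 window).

```python
# Minimal sketch of a one-sample t-test against zero, as one might use to
# verify that a mean ERP component amplitude differs from zero.
import math

def one_sample_t(xs, mu=0.0):
    """Return (t, mean, half-width of the 95% CI) for a one-sample t-test."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                           # standard error of the mean
    t = (mean - mu) / se
    # Two-tailed 5% critical value for df = n - 1 = 9, from standard t tables.
    t_crit = 2.262
    return t, mean, t_crit * se

# Hypothetical per-participant P300 difference-wave amplitudes (microvolts).
amps = [2.1, 3.4, 1.8, 2.9, 0.7, 3.1, 2.5, 1.2, 2.8, 1.9]
t, mean, hw = one_sample_t(amps)
print(f"t(9) = {t:.2f}, mean = {mean:.2f} uV, "
      f"95% CI = [{mean - hw:.2f}, {mean + hw:.2f}]")
```

If the 95% confidence interval excludes zero (equivalently, |t| exceeds the critical value), the component's existence is statistically supported, which is the logic the abstract describes.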
It is far more difficult to detect a small tactile stimulus on a finger that is moving than on one that is static. This suppression of tactile information during motion, known as tactile gating, has been examined in some detail during single-joint movements. However, the existence and time course of this gating have yet to be examined during visually guided multi-joint reaches, where sensory feedback may be paramount. The current study demonstrated that neurologically intact humans are unable to detect a small vibratory stimulus on one of their index fingers during a bimanual reach toward visual targets. By parametrically varying the delay between visual target onset and the vibration, we showed that this gating was apparent even before participants started moving. A follow-up experiment using electromyography indicated that gating likely occurred even before any muscle activity had taken place. This unique demonstration of tactile gating during a task reliant on visual feedback supports the notion that the phenomenon is due to a central command rather than a masking of sensory signals by afferent processing during movement.
Our sensory systems are bombarded by myriad events at every moment of our lives. Thus, it is crucial for sensory systems to select and process the sensory events deemed important for a given task and, indeed, those that affect survival. Tactile gating is a well-known phenomenon, defined as a reduced ability to detect and discriminate tactile events before and during movement; moreover, different locations on the effector exhibit different magnitudes of sensitivity change. The authors examined the time course of tactile gating during a reach-to-grasp movement to characterize its behavior. Tactile stimulators were attached to the right and left mid-forearms and to the right index finger and fifth digit. When participants reached to grasp and lift targets, tactile acuity decreased at the right forearm before movement onset (Colino, Buckingham, Cheng, van Donkelaar, & Binsted, 2014). However, tactile sensitivity at the right index finger decreased by nearly 20%, contrary to expectations. This result suggests that there may be an additional source acting to reduce the inhibition associated with tactile gating. Additionally, sensitivity improved as the end of the movement approached. Collectively, the present results indicate that predictive and postdictive mechanisms strongly influence tactile gating.
A multitude of events bombard our sensory systems at every moment of our lives. Thus, it is important for the sensory cortex to gate unimportant events. Tactile suppression is a well-known phenomenon defined as a reduced ability to detect tactile events on the skin before and during movement. Previous experiments found that detection rates decrease just prior to and during finger abduction, and that they decrease according to proximity to the moving effector. This study examined how tactile detection changes during a reach to grasp. Fourteen human participants used their right hand to reach and grasp a cylinder. Tactors were attached to the index finger, the fifth digit, and the forearm of both the right and left arms and vibrated at various epochs relative to a "go" tone. Results showed that detection rates at the right forearm decreased before movement onset, whereas detection rates at the right index finger and fifth digit, and at the left index finger, fifth digit, and forearm, did not. These results indicate that the task affects gating dynamics in a temporally and contextually dependent manner and imply that feed-forward motor planning processes can modify sensory signals.
A multitude of events bombard our sensory systems at every moment of our lives. Thus, it is important for the sensory and motor cortices to gate unimportant events. Tactile suppression is a well-known phenomenon defined as a reduced ability to detect tactile events on the skin before and during movement. Previous experiments (Buckingham et al. in Exp Brain Res 201(3):411-419, 2010; Colino et al. in Physiol Rep 2(3):e00267, 2014) found that detection rates decrease just prior to and during finger abduction and decrease according to the proximity of the moving effector. However, what effect does vision have on tactile gating? There is ample evidence (see Serino and Haggard in Neurosci Biobehav Rev 34:224-236, 2010) of increased tactile acuity when participants can see their limbs. The present study examined how tactile detection changes as a function of visual condition (vision/no vision). Ten human participants used their right hand to reach and grasp a cylinder. Tactors were attached to the index finger and the forearm of both the right and left arms and vibrated at various epochs relative to a "go" tone. Results replicate previous findings from our laboratory (Colino et al. in Physiol Rep 2(3):e00267, 2014). In addition, tactile acuity decreased when participants did not have vision. These results indicate that vision affects somatosensation via inputs from parietal areas (Konen and Haggard in Cereb Cortex 24(2):501-507, 2014), but does so in a reach-to-grasp context.