Spatial navigation is a complex cognitive process that draws on multiple senses, which are integrated and processed by a wide network of brain areas. Previous studies have revealed the retrosplenial complex (RSC) to be modulated in a task-related manner during navigation. However, these studies restricted participants' movement to stationary setups, which might have impacted heading computations due to the absence of vestibular and proprioceptive inputs. Here, we present evidence of human RSC theta oscillations (4–8 Hz) in an active spatial navigation task in which participants physically ambulated from one location to several other points while the position of a landmark and the starting location were updated. The results revealed theta power in the RSC to be pronounced during heading changes but not during translational movements, indicating that physical rotations induce human RSC theta activity. This finding provides potential evidence of head-direction computation in the RSC of healthy humans during active spatial navigation.
Brain-computer interfaces (BCIs) allow users to communicate directly with external devices via their brain signals. Recently, BCIs, and wearable computers in particular, have received growing attention from government and industry as an alternative means of interacting with technology. Wearable computers can combine highly immersive virtual/augmented/mixed-reality experiences for entertainment, health monitoring, utilitarian purposes, and, most importantly at present, research. With wearable computers, researchers can design, simulate, and finely control experiments to examine human brain dynamics outside the laboratory. Yet despite the power of BCIs, uptake is slow. This form of interaction is unnatural to humans and often requires external stimuli. Further, the response feedback produced by the computer part of the system is far slower than our brains. Hence, we undertook a review of the current state of the art in BCI research and distilled the findings into a stimulus-free paradigm, called direct-sense BCI, that operates directly and seamlessly from our thinking. This novel paradigm could, in the short term, substantially improve the quality of a user's experience with BCIs and, over the long term, lead to much more widespread uptake of BCI technology.
The availability of accurate and reliable dry sensors for electroencephalography (EEG) is vital to enable large-scale deployment of brain–machine interfaces (BMIs). However, dry sensors invariably show poorer performance than the gold-standard Ag/AgCl wet sensors. The loss of performance with dry sensors is even more evident when monitoring the signal from hairy and curved areas of the scalp, requiring the use of bulky and uncomfortable acicular sensors. This work demonstrates three-dimensional micropatterned sensors based on subnanometer-thick epitaxial graphene for detecting the EEG signal from the challenging occipital region of the scalp. The occipital region, corresponding to the visual cortex of the brain, is key to the implementation of BMIs based on the common steady-state visually evoked potential (SSVEP) paradigm. The patterned epitaxial graphene sensors show efficient on-skin contact with low impedance and can achieve signal-to-noise ratios comparable to those of wet sensors. Using these sensors, we have also demonstrated hands-free communication with a quadruped robot through brain activity.
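The sensor comparison above hinges on the signal-to-noise ratio at the SSVEP stimulation frequency. A common (though not necessarily the authors') way to estimate this is to take power at the stimulation frequency relative to the mean power of neighboring frequency bins. The sketch below is a minimal illustration with synthetic data; the function name, bin counts, and signal parameters are assumptions for the example, not details from the paper.

```python
import numpy as np

def ssvep_snr(eeg, fs, stim_freq, n_neighbors=4):
    """Estimate SSVEP SNR as power at the stimulation frequency divided
    by the mean power of n_neighbors bins on each side (illustrative
    metric; real sensor-comparison pipelines may differ)."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    target = int(np.argmin(np.abs(freqs - stim_freq)))
    # Collect neighboring bins on both sides, excluding the target bin
    lo = max(target - n_neighbors, 0)
    hi = min(target + n_neighbors + 1, len(spectrum))
    neighbors = np.concatenate([spectrum[lo:target], spectrum[target + 1:hi]])
    return spectrum[target] / neighbors.mean()

# Synthetic 12 Hz SSVEP-like oscillation buried in noise
fs = 250
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = 2.0 * np.sin(2 * np.pi * 12 * t) + rng.normal(0, 1.0, t.size)
snr_at_stim = ssvep_snr(eeg, fs, 12.0)
```

A dry and a wet sensor recording the same stimulation could then be compared simply by computing this ratio for each channel.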
Modern work environments involve extensive interaction with technology and greater cognitive complexity of tasks, resulting in human operators experiencing increased mental workload. Air traffic control operators routinely work in such complex environments, and we designed tracking and collision prediction tasks to emulate their elementary tasks. We characterized the physiological response to workload variations in these tasks to untangle the impact of the workload variations operators experience. Electroencephalogram (EEG), eye activity, and heart rate variability (HRV) data were recorded from 24 participants performing tracking and collision prediction tasks at three levels of difficulty. Our findings indicate that variations in task load in both tasks are sensitively reflected in the EEG, eye activity, and HRV data. Multiple regression results also show that operators' performance in both tasks can be predicted from the corresponding EEG, eye activity, and HRV data. The results further demonstrate that the brain dynamics during each of these tasks can be estimated from the corresponding eye activity, HRV, and performance data. Moreover, the markedly distinct neurometrics of workload variations in the tracking and collision prediction tasks indicate that neurometrics can provide insights into the type of mental workload. These findings are applicable to the design of future mental-workload-adaptive systems that integrate neurometrics in deciding not just "when" but also "what" to adapt.
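The performance-prediction claim above rests on ordinary multiple regression: a linear model mapping several physiological features to a performance score. The sketch below is a minimal illustration of that idea on synthetic data; the feature set (one EEG, one eye-activity, and one HRV feature), the coefficients, and the sample size are assumptions for the example, not the study's actual design matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 24  # matches the number of participants in the study, purely for flavor

# Hypothetical features: columns could stand for EEG band power,
# blink rate, and an HRV index (all made up for this sketch)
X = rng.normal(size=(n, 3))
true_w = np.array([0.8, -0.5, 0.3])
y = X @ true_w + 2.0 + rng.normal(0, 0.1, n)  # synthetic performance score

# Fit intercept + weights with ordinary least squares
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

With real data, `r2` (and cross-validated variants of it) is the kind of statistic that would back a claim that performance is predictable from physiological measures.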