To identify the neural substrate of rapid eye movements (REMs) during REM sleep in humans, we conducted simultaneous functional magnetic resonance imaging (fMRI) and polysomnographic recording during REM sleep. Event-related fMRI analysis time-locked to the occurrence of REMs revealed that the pontine tegmentum, ventroposterior thalamus, primary visual cortex, putamen and limbic areas (the anterior cingulate, parahippocampal gyrus and amygdala) were activated in association with REMs. A control experiment during which subjects made self-paced saccades in total darkness showed no activation in the visual cortex. The REM-related activation of the primary visual cortex without visual input from the retina provides neural evidence for the existence of human ponto-geniculo-occipital (PGO) waves and a link between REMs and dreaming. Furthermore, the time-course analysis of blood oxygenation level-dependent responses indicated that the activation of the pontine tegmentum, ventroposterior thalamus and primary visual cortex started before the occurrence of REMs. On the other hand, the activation of the putamen and limbic areas accompanied REMs. The activation of the parahippocampal gyrus and amygdala simultaneously with REMs suggests that REMs and/or their generating mechanism are not merely an epiphenomenon of PGO waves, but may be linked to the triggering of activity in these areas.
During a dyadic social interaction, two individuals can share visual attention through gaze, directed to each other (mutual gaze) or to a third person or an object (joint attention). Shared attention is fundamental to dyadic face-to-face interaction, but how attention is shared, retained, and neurally represented in a pair-specific manner has not been well studied. Here, we conducted a two-day hyperscanning functional magnetic resonance imaging study in which pairs of participants performed a real-time mutual gaze task followed by a joint attention task on the first day, and mutual gaze tasks several days later. The joint attention task enhanced eye-blink synchronization, which is believed to be a behavioral index of shared attention. When the same participant pairs underwent mutual gaze without joint attention on the second day, enhanced eye-blink synchronization persisted, and this was positively correlated with inter-individual neural synchronization within the right inferior frontal gyrus. Neural synchronization was also positively correlated with enhanced eye-blink synchronization during the previous joint attention task session. Consistent with the Hebbian association hypothesis, the right inferior frontal gyrus was activated by both initiating and responding to joint attention. These results indicate that shared attention is represented and retained by pair-specific neural synchronization that cannot be reduced to the individual level.
Spontaneous coherent brain activity during rest, measured by functional magnetic resonance imaging (fMRI), has been shown to self-organize into a "small-world" network through which the human brain can sustain high communication efficiency across global brain regions at low energy cost. However, the state-dependent dynamics of the network, especially their dependence on the conscious state, remain poorly understood. In this study, we conducted simultaneous electroencephalographic recording with resting-state fMRI to explore whether functional network organization reflects differences in conscious state between wakefulness and stage 1 sleep. We then evaluated whole-brain functional network properties at fine spatial resolution (3781 regions of interest) using graph theoretical analysis. We found that the efficiency of the functional network, evaluated by path length, decreased not only at the global level but also in several specific regions, depending on the conscious state. Furthermore, almost two-thirds of the nodes that showed a significant decrease in nodal efficiency during stage 1 sleep belonged to the default-mode network. These results suggest that brain functional network organization is dynamically optimized for a higher level of information integration in the fully conscious awake state, and that the default-mode network plays a pivotal role in information integration for maintaining conscious awareness.
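The global and nodal efficiency measures described above have standard graph-theoretic definitions (average inverse shortest-path length). The following is a minimal illustrative sketch using networkx on a toy small-world graph; the graph size, wiring parameters, and the use of a synthetic graph in place of thresholded fMRI correlation matrices are all assumptions for illustration, not the authors' actual pipeline.

```python
import networkx as nx

# Toy functional network. In the study, nodes would be 3781 ROIs and edges
# would be derived from resting-state fMRI correlations; this small
# small-world graph is only a placeholder for illustration.
G = nx.connected_watts_strogatz_graph(n=30, k=4, p=0.1, seed=0)

# Global efficiency: mean inverse shortest-path length over all node pairs.
# Shorter paths -> higher efficiency (the quantity reported to drop in sleep).
global_eff = nx.global_efficiency(G)

def nodal_efficiency(G, node):
    """Mean inverse shortest-path length from one node to all others."""
    lengths = nx.single_source_shortest_path_length(G, node)
    inv = [1.0 / d for n, d in lengths.items() if n != node]
    return sum(inv) / (G.number_of_nodes() - 1)

effs = {n: nodal_efficiency(G, n) for n in G.nodes}
print(f"global efficiency: {global_eff:.3f}")
print(f"mean nodal efficiency: {sum(effs.values()) / len(effs):.3f}")
```

By construction, global efficiency equals the mean of the nodal efficiencies, so a state-dependent drop at specific nodes (as reported for default-mode regions in stage 1 sleep) directly lowers the global measure.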
Using a technique for measuring brain activity simultaneously from two people, known as hyperscanning, we can calculate inter-brain neural effects that appear only in interactions between individuals. Hyperscanning studies using functional magnetic resonance imaging (fMRI) are advantageous in that they can precisely determine the region(s) involved in inter-brain effects; however, with fMRI it is almost impossible to record inter-brain effects in daily life. By contrast, hyperscanning electroencephalography (EEG) studies have high temporal resolution and can capture moment-to-moment interactions. In addition, EEG instrumentation is portable and easy to wear, offering the opportunity to record inter-brain effects during daily-life interactions. The disadvantage of this approach is that it is difficult to localize the epicenter of the inter-brain effect. Functional near-infrared spectroscopy (fNIRS) has better temporal resolution and portability than fMRI, but has limited spatial resolution and a limited ability to record deep brain structures. Future studies should employ hyperscanning EEG-fMRI, because this approach combines the high temporal resolution of EEG with the high spatial resolution of fMRI. Hyperscanning EEG-fMRI would allow us to use inter-brain effects as neuromarkers of the properties of social interactions in daily life. We also wish to emphasize the need to develop a mathematical model explaining how two brains can exhibit synchronized activity.
Persons with autism spectrum disorder (ASD) are known to have difficulty with eye contact (EC), which can also make face-to-face communication difficult for their partners. To elucidate the neural substrates of live inter-subject interaction between ASD patients and normal subjects, we conducted hyperscanning functional MRI with 21 ASD subjects paired with typically developed (normal) subjects, and with 19 pairs of normal subjects as a control. Baseline EC was maintained while subjects performed a real-time joint-attention task. The task-related effects were modeled out, and inter-individual correlation analysis was performed on the residual time-course data. ASD–Normal pairs were less accurate at detecting gaze direction than Normal–Normal pairs. Performance was impaired both in ASD subjects and in their normal partners. Activation of the left occipital pole (OP) by gaze processing was reduced in ASD subjects, suggesting that deterioration of eye-cue detection in ASD is related to impairment of early visual processing of gaze. Their normal partners, by contrast, showed greater activity in the bilateral occipital cortex and the right prefrontal area, indicating a compensatory workload. The inter-brain coherence in the right inferior frontal gyrus (IFG) observed in Normal–Normal pairs during EC (Saito et al., 2010) was diminished in ASD–Normal pairs. Intra-brain functional connectivity between the right IFG and the right superior temporal sulcus (STS) in normal subjects paired with ASD subjects was reduced compared with that in Normal–Normal pairs. This functional connectivity was positively correlated with the normal partners' performance on eye-cue detection. Considering the integrative role of the right STS in gaze processing, inter-subject synchronization during EC may be a prerequisite for eye-cue detection by the normal partner.
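The inter-individual correlation analysis described above, performed on residual time-course data after task effects are modeled out, can be sketched roughly as follows. This is a toy NumPy example with simulated signals; the signal lengths, noise levels, use of Pearson correlation rather than the study's coherence measure, and the simple pseudo-pair null are all assumptions for illustration, not the published method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated residual BOLD time courses from one ROI (e.g., right IFG) in
# each member of a pair, after task-related effects are modeled out.
# All signals here are synthetic; lengths and noise levels are assumptions.
n_vols = 200
shared = rng.standard_normal(n_vols)           # hypothetical shared component
subj_a = shared + 0.8 * rng.standard_normal(n_vols)
subj_b = shared + 0.8 * rng.standard_normal(n_vols)

def pairwise_sync(x, y):
    """Pearson correlation between two residual time courses."""
    return float(np.corrcoef(x, y)[0, 1])

r_real = pairwise_sync(subj_a, subj_b)

# A simple null: correlate against a time course from an unrelated subject.
# In practice, permuting pair assignments yields a null distribution against
# which real-pair synchronization is tested.
unrelated = rng.standard_normal(n_vols)
r_null = pairwise_sync(subj_a, unrelated)

print(f"real-pair r = {r_real:.2f}, pseudo-pair r = {r_null:.2f}")
```

In this toy setup the real pair shows markedly higher correlation than the pseudo-pair, which is the pattern reported for Normal–Normal versus ASD–Normal pairs in the right IFG.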