Absence seizures are the purest form of generalized epilepsy. They are characterized in the electroencephalogram by widespread, bilaterally synchronous spike-wave discharges (SWDs), which reflect highly synchronized oscillations in thalamocortical networks. To reveal the network mechanisms responsible for the initiation and generalization of the discharges, we studied the interrelationships between multisite cortical and thalamic field potentials recorded during spontaneous SWDs in the freely moving WAG/Rij rat, a genetic model of absence epilepsy. Nonlinear association analysis revealed a consistent cortical "focus" within the peri-oral region of the somatosensory cortex. The SWDs recorded at other cortical sites consistently lagged this focal site, with time delays that increased with electrode distance (corresponding to a mean propagation velocity of 1.4 m/sec). Intrathalamic relationships were more complex and could not account for the observed cortical propagation pattern. Cortical and thalamic sites interacted bidirectionally, and the direction of this coupling could vary within a single seizure. During the first 500 msec, however, the cortical focus consistently led the thalamus. These findings argue against the existence of a common subcortical pacemaker for the generation of the generalized spike-wave discharges characteristic of absence seizures in the rat. Instead, the results suggest that a cortical focus is the dominant factor in initiating the paroxysmal oscillation within the corticothalamic loops, and that large-scale synchronization is mediated by an extremely fast intracortical spread of seizure activity. Analogous mechanisms may underlie the pathophysiology of human absence epilepsy.
In our natural world, a face is usually encountered not as an isolated object but as an integrated part of a whole body. The face and the body both normally contribute to conveying the emotional state of the individual. Here we show that observers judging a facial expression are strongly influenced by emotional body language. Photographs of fearful and angry faces and bodies were used to create face-body compound images with either matched or mismatched emotional expressions. When face and body convey conflicting emotional information, judgment of the facial expression is hampered and becomes biased toward the emotion expressed by the body. Electrical brain activity was recorded from the scalp while subjects attended to the face and judged its emotional expression. An enhancement of the occipital P1 component as early as 115 ms after presentation onset points to the existence of a rapid neural mechanism sensitive to the degree of agreement between simultaneously presented facial and bodily emotional expressions, even when the latter are unattended.

emotion communication | event-related potentials | visual perception

The face and the body both normally contribute to conveying the emotional state of the individual. Darwin (1) was the first to describe in detail the specific facial and bodily expressions associated with emotions in animals and humans, and he regarded these expressions as part of emotion-specific adaptive actions. From this vantage point, face and body are part of an integrated whole. Indeed, in our natural world, faces are usually encountered not as isolated objects but as integrated parts of whole bodies. Rapid detection of inconsistencies between them is beneficial when rapid adaptive action is required from the observer.
To date, there has been no systematic investigation of how facial expressions and emotional body language interact in human observers, and the underlying neural mechanisms are unknown. Here, we investigate the influence of emotional body language on the perception of facial expression. We collected behavioral data and simultaneously measured electrical event-related potentials (ERPs) from the scalp to explore the time course of neuronal processing with a resolution on the order of milliseconds. We predicted that recognition of facial expressions is influenced by concurrently presented emotional body language, and that affective information from the face and the body starts to interact rapidly. In view of the important adaptive function of perceiving emotional states, and of previous findings of rapid recognition of emotional signals, we deemed it unlikely that integrated perception of face-body images results from relatively late and slow semantic processes. We hypothesize that the integration of affective information from a facial expression and the accompanying emotional body language is a mandatory, automatic process occurring early in the processing stream, one that does not require selective attention, thorough visual analysis of individual features, or conscious, deliberate evaluation…