Attentional mechanisms have been studied mostly in specific sensory domains, such as the auditory, visuospatial, or tactile modalities. In contrast, attention to internal interoceptive visceral targets has only recently begun to be studied, despite its potential importance in emotion, empathy, and self-awareness. Here, we studied the effects of shifting attention to the heart using a cue-target detection paradigm during continuous EEG recordings. Subjects were instructed to count either a series of visual stimuli (visual condition) or their own heartbeats (heart condition). Visual checkerboard stimuli were used as attentional probes throughout the task. Consistent with previous findings, attention modulated the amplitude of the heartbeat-evoked potentials. Directing attention to the heart significantly reduced the visual P1/N1 amplitude evoked by the attentional probe. ERPs locked to the attention-directing cue revealed a novel frontal positivity around 300 ms post-cue. Finally, spectral power in the alpha band over parieto-occipital regions was higher while attending to the heart than during the visual task, and correlated with subjects' performance in the interoceptive task. These results are consistent with a shared, resource-based attentional mechanism whereby allocating attention to bodily signals can affect early responses to visual stimuli.
Focusing one's attention on a specific area of the visual field, guided by external stimuli, produces systematic neural signatures. One of the most robust is a change in the topographical distribution of oscillatory alpha-band activity across parieto-occipital cortices: alpha activity decreases over scalp sites contralateral to the attended side of the visual field and/or increases over ipsilateral sites. This evidence comes mainly from experiments in which an explicit cue informs subjects where to focus their attention, thus facilitating detection of an upcoming target stimulus. However, recent theoretical models of attention have highlighted a stochastic or non-deterministic component of visuospatial attentional allocation. In an attempt to reveal this component, here we analyzed alpha activity in a signal detection paradigm without informative cues, that is, in the absence of preceding information about the location (and time) of appearance of target stimuli. We reasoned that the unpredictability of this situation could help unveil this component. Interestingly, although total alpha power did not differ between Seen and Unseen conditions, we found a significant lateralization of alpha activity over parieto-occipital electrodes, which predicted behavioral performance. This effect was smaller in magnitude than in paradigms in which attention is externally guided (cued). However, we believe that further characterization of this spontaneous component of attention is of great importance for the study of visuospatial attentional dynamics. These results support the presence of a spontaneous component of visuospatial attentional allocation and advance pre-stimulus alpha-band lateralization as one of its neural signatures.
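Alpha lateralization of the kind described above is commonly quantified with a normalized index contrasting alpha power over electrodes ipsilateral versus contralateral to the attended hemifield. The sketch below is a generic illustration of that standard computation, not the authors' exact analysis pipeline; the function name and example values are hypothetical.

```python
def alpha_lateralization_index(ipsi_power: float, contra_power: float) -> float:
    """Normalized lateralization index: (ipsi - contra) / (ipsi + contra).

    Positive values indicate relatively higher alpha power over
    parieto-occipital electrodes ipsilateral to the attended hemifield,
    the pattern typically linked to better contralateral target detection.
    """
    return (ipsi_power - contra_power) / (ipsi_power + contra_power)

# Toy example with arbitrary pre-stimulus alpha power values:
print(alpha_lateralization_index(3.0, 1.0))  # → 0.5
```

Computed per trial, such an index can be related to detection outcomes (e.g., Seen vs. Unseen) to test whether spontaneous lateralization predicts behavioral performance, as the abstract reports.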
Background: Autonomous Sensory Meridian Response (ASMR) describes the experience of a pleasant tingling sensation along the back of the head, accompanied by a feeling of well-being and relaxation, in response to specific audio-visual stimuli, such as whispers, soft sounds, and personal attention. Previous work has assessed individual variations in personality traits associated with ASMR, but no research to date has explored differences in emotion regulation associated with ASMR. This omission is notable given that ASMR, a sensory-emotional experience, has been proposed to lie on a sound-sensitivity spectrum at the opposite end from misophonia, a phenomenon associated with difficulties regulating emotions. The present work aimed to assess group differences in emotion regulation strategies between ASMR self-reporters and non-ASMR controls. Methods: We used the validated Spanish version of the Emotion Regulation Questionnaire to assess individual differences in the use of cognitive reappraisal and expressive suppression. Results: Participants who experience ASMR had higher scores on the cognitive reappraisal subscale of the questionnaire than the non-ASMR group. Conclusions: Individuals who experience ASMR reported greater use of cognitive reappraisal of emotionally arousing situations, suggesting greater effectiveness in regulating emotions. Our findings further elucidate individual differences related to this experience, supporting the view that ASMR is a genuine psychophysiological phenomenon associated with other psychological constructs, with notable consequences for affective/emotional dimensions and general well-being.
Face-to-face communication offers several sources of contextual information that enable language comprehension. This information is used, for instance, to perceive the mood of interlocutors, clarifying ambiguous messages. However, these contextual cues are absent in text-based communication. Emoticons have been proposed as cues that convey emotional intent in this channel of communication. Most studies have suggested that their role is to contribute to a more accurate perception of emotions. Nevertheless, it is not clear whether their influence on disambiguation is independent of their emotional valence and of its interaction with the valence of the text message. In the present study, we designed an emotional congruence paradigm in which participants read a set of messages composed of a positive or negative emotional situation sentence followed by a positive or negative emoticon. Participants were instructed to indicate whether the sender was in a good or bad mood. To analyze the disambiguation process and to examine whether the role of emoticons in disambiguation differs according to their valence, we measured the rate of perceived-mood responses and the reaction times (RTs) for each condition. Our results showed that the perceived mood in ambiguous messages tended to be more negative regardless of emoticon valence. Nonetheless, this tendency was not the same for positive and negative emoticons. Specifically, negative mood perception was higher for incongruent positive emoticons. On the other hand, RTs for positive emoticons were faster than for negative ones. Responses to incongruent messages were slower than to congruent ones. However, the incongruent condition showed different RTs depending on the emoticons' valence: responses to negative emoticons were the slowest. Results are discussed in light of previous observations about the potential role of emoticons in mood perception and cognitive processing.
We conclude that the role of emoticons in disambiguation and mood perception arises from the interaction of emoticon valence with the message as a whole.
Visual sensory processing of external events is attenuated when attention is oriented internally toward self-generated thoughts, and the degree of attenuation differs with the thought's modality (visual or auditory). The present study aims to assess whether such modulations also occur in the auditory modality. To investigate auditory sensory modulations, we compared a passive listening condition with two conditions in which attention was internally oriented as part of a task: a visual imagery condition and an inner speech condition. The EEG signal was recorded from 20 participants while they were exposed to auditory probes during these three conditions. ERP results showed no differences in the auditory N1 response across the three conditions, reflecting preserved evoked electrophysiological reactivity in the auditory modality. Nonetheless, time-frequency analyses showed that gamma and theta power in frontal regions was higher for passive listening than for the internal attention conditions. Specifically, the reduced amplitude in the early gamma and theta bands during both inward attention conditions may reflect reduced conscious attention to the ongoing auditory stimulation. Finally, a distinct pattern of beta-band activity was observed only during visual imagery, which may reflect cross-modal integration between the visual and auditory modalities and distinguish this form of mental imagery from inner speech. Taken together, these results show that attentional suppression mechanisms during mental imagery differ between the auditory and visual modalities. Our results on oscillatory activity also confirm the important role of gamma oscillations in auditory processing and the differential neural dynamics underlying visual and auditory/verbal imagery.
Selective attention depends on goal-directed and stimulus-driven modulatory factors, each relayed by different brain rhythms. Under certain circumstances, stress-related states can change the balance between goal-directed and stimulus-driven factors. However, the neuronal mechanisms underlying these changes remain unclear. In this study, we explored how psychosocial stress modulates brain rhythms during an attentional task and a task-free period. We recorded the EEG and ECG activity of 42 healthy participants subjected either to the Trier Social Stress Test (TSST), a controlled procedure to induce stress, or to a comparable control protocol (same physical and cognitive effort but without the stress component), flanked by an attentional task, a 90 s task-free period, and a state-anxiety questionnaire. We observed that psychosocial stress induced an increase in heart rate (HR), self-reported anxiety, and alpha power synchronization. Psychosocial stress also evoked a relative beta power increase during correct trials of the attentional task, which correlated positively with anxiety and the heart rate increase, and inversely with attentional accuracy. These results suggest that psychosocial stress affects performance by redirecting attentional resources toward internal threat-related thoughts. The increased beta-band activity may reflect an increment of endogenous top-down modulation serving as a compensatory mechanism to redirect attentional resources toward the ongoing task. The data obtained here may contribute to designing new approaches for the clinical management of the human stress response and could help minimize the damaging effects of persistent stressful experiences.