2021
DOI: 10.1038/s41398-020-01133-5
Emotional visual mismatch negativity: a joint investigation of social and non-social dimensions in adults with autism

Abstract: Unusual behaviors and brain activity to socio-emotional stimuli have been reported in Autism Spectrum Disorder (ASD). Atypical reactivity to change and intolerance of uncertainty are also present, but little is known on their possible impact on facial expression processing in autism. The visual mismatch negativity (vMMN) is an electrophysiological response automatically elicited by changing events such as deviant emotional faces presented among regular neutral faces. While vMMN has been found altered in ASD in…
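The vMMN described in the abstract is conventionally quantified as a deviant-minus-standard difference wave, with its negative peak extracted in a posterior latency window. The sketch below illustrates that computation; the latency window, sampling, and array shapes are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def vmmn_peak(deviant, standard, times, tmin=0.15, tmax=0.35):
    """Deviant-minus-standard difference wave and its most negative
    deflection within a latency window (window bounds here are
    illustrative, not from the paper).

    deviant, standard : (n_times,) trial-averaged amplitudes (µV)
    times             : (n_times,) latencies in seconds
    Returns (diff_wave, peak_amplitude, peak_latency).
    """
    diff = np.asarray(deviant) - np.asarray(standard)
    mask = (times >= tmin) & (times <= tmax)
    # index of the most negative point inside the window
    idx = np.flatnonzero(mask)[np.argmin(diff[mask])]
    return diff, diff[idx], times[idx]
```

With a synthetic standard at baseline and a deviant carrying a negative deflection around 250 ms, the function recovers that peak's amplitude and latency.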

Cited by 14 publications (17 citation statements)
References 61 publications (97 reference statements)
“…More generally, the experiments by File et al (2017) suggested that the pattern of vMMN response varies according to the type of stimulus and level of deviance. Source reconstruction revealed that the vMMN to HSF was associated with activity in the extrastriate cortex, which is highly consistent with previous findings on vMMN to face (Kimura et al, 2012;Kovarski et al, 2021) or to other visual stimuli (e.g., Kimura et al, 2010;Urakawa et al, 2010;Susac et al, 2014). This suggests that MMN is modality specific (vMMN being elicited in visual areas while auditory MMN is elicited in auditory cortex- Näätänen et al, 2007) and relatively low-level (Susac et al, 2014).…”
Section: The Predictive Role of LSF Supported by vMMN (supporting)
confidence: 89%
“…Sources of this mismatch were found in a wide range of brain regions, from the right fusiform to the prefrontal and anterior cingulate regions, including the insula. Generators in temporal and limbic lobes were also described in other studies (Kimura et al, 2012; Li et al, 2012; Kovarski et al, 2021), as well as frontal activation (Kimura et al, 2010, 2012). The fusiform activity is in line with the preferential processing of faces (Kimura et al, 2012; Stefanics et al, 2012), especially in the right hemisphere, consistent with previous results on face vMMN (Kimura et al, 2012)…”
[Figure 4: Contrast between HSF and LSF mismatch responses.]
Section: The Predictive Role of LSF Supported by vMMN (supporting)
confidence: 88%
“…We conducted this analysis using the Fieldtrip toolbox (Oostenveld et al, 2011) in MATLAB. We computed grand averages of differential waveforms across two regions of interest (ROI) corresponding to the left (P7, PO7, O1) and right (P8, PO8, O2) posterior occipito-temporal electrodes (the electrodes were selected based on previous studies, e.g., Hu et al, 2020; Kovarski et al, 2021). For each time point (within 0–700 ms) at left or right electrodes, clusters were formed from two or more neighboring time points whenever the t values (obtained by two-tailed t-test) exceeded the cluster threshold (0.025).…”
Section: Methods (mentioning)
confidence: 99%
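The cluster-forming step quoted above can be sketched outside MATLAB/Fieldtrip. This is not the citing authors' code: it is a minimal Python sketch of the described rule (clusters of two or more neighboring time points whose t statistic exceeds the cluster-forming threshold), and reading the quoted 0.025 as the two-tailed alpha for that threshold is an assumption:

```python
import numpy as np
from scipy import stats

def form_clusters(data, cluster_alpha=0.025, min_len=2):
    """Find clusters of neighboring time points where the one-sample
    two-tailed t statistic (difference wave vs. zero across subjects)
    exceeds the cluster-forming threshold.

    data : (n_subjects, n_times) array of differential amplitudes
    Returns a list of (start, stop) index pairs (stop exclusive).
    Treating cluster_alpha as the two-tailed alpha is an assumption.
    """
    n_subj, n_times = data.shape
    t_vals, _ = stats.ttest_1samp(data, 0.0, axis=0)
    # two-tailed critical t value for the cluster-forming alpha
    t_thresh = stats.t.ppf(1.0 - cluster_alpha / 2.0, df=n_subj - 1)
    above = np.abs(t_vals) > t_thresh
    clusters, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_len:  # keep runs of >= 2 points
                clusters.append((start, i))
            start = None
    if start is not None and n_times - start >= min_len:
        clusters.append((start, n_times))
    return clusters
```

In the full Fieldtrip procedure these clusters would then be evaluated against a permutation distribution of cluster statistics; the sketch covers only the cluster-formation rule the quote describes.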