People with autism show superior performance relative to controls on the Embedded Figures Test (EFT). However, studies examining the relationship between autistic-like traits and EFT performance in neurotypical individuals have yielded inconsistent findings. To examine this inconsistency, a meta-analysis was conducted of studies that (a) compared high and low Autism-Spectrum Quotient (AQ) groups, and (b) treated AQ as a continuous variable. Outcomes are consistent with superior visual search forming part of the broader autism phenotype, but in the existing literature this is evident only when comparing extreme groups. Reanalysis of data from previous studies suggests these findings are unlikely to be driven by a small number of high scorers. Monte Carlo simulations are used to illustrate the effect of methodological differences on results.
Highlights
We report longitudinal ERP data from 80 infants in a face-discrimination task.
P1, N290, Nc are all sensitive to faces in five-month-olds.
P1, N290, Nc show equal face-categorization in infants tested longitudinally.
N290 shows less variation in face-categorization trajectories than P1 or Nc.
Visual ERPs increase in amplitude over infancy, but this is not face-specific.
Individuals with autism spectrum disorder (ASD) show atypical processing of facial expressions. Research with autistic toddlers suggests that abnormalities in the processing of spatial frequencies (SFs) contribute to such differences. The current event-related-potential (ERP) study investigated differences between 10-month-old infants at high and low likelihood for ASD in SF processing and in discrimination of fearful and neutral faces filtered to contain specific SFs. Results indicate no group differences in general processing of higher-SF (HSF, detailed) and lower-SF (LSF, global) information. However, unlike low-likelihood infants, high-likelihood infants did not discriminate between facial expressions when either the LSF or the HSF information was available. Combined with previous findings in toddlers, the current results indicate a developmental delay in efficient processing of facial expressions in ASD.
Processing faces and understanding facial expressions are crucial skills for social communication. In adults, basic face processing and facial emotion processing rely on specific interacting brain networks. In infancy, however, little is known about when and how these networks develop. The current study uses functional near-infrared spectroscopy (fNIRS) to measure differences in 5-month-olds’ brain activity in response to fearful and happy facial expressions. Our results show that the right occipital region responds to faces, indicating that the face processing network is already activated at 5 months. Yet sensitivity to facial emotions appears to still be immature at this age: exploratory analyses suggest that if the facial emotion processing network were active, this would be mainly visible in the temporal cortex. Together, these results indicate that at 5 months, occipital areas already show sensitivity to faces, while the facial emotion processing network seems not yet fully developed.