2011
DOI: 10.1007/s10803-011-1400-0
Audiovisual Speech Perception and Eye Gaze Behavior of Adults with Asperger Syndrome

Abstract: Audiovisual speech perception was studied in adults with Asperger syndrome (AS), by utilizing the McGurk effect, in which conflicting visual articulation alters the perception of heard speech. The AS group perceived the audiovisual stimuli differently from age-, sex- and IQ-matched controls. When a voice saying /p/ was presented with a face articulating /k/, the controls predominantly heard /k/. Instead, the AS group heard /k/ and /t/ with almost equal frequency, but with large differences between individua…

Cited by 45 publications (42 citation statements); references 41 publications.
“…In a separate pilot study (N = 14) we obtained similar results when we asked typical readers to report what they heard (instead of perceived). The rate of visual-based responses also exceeded the rate of fusion responses possibly because the visually presented /aka/ was clearly recognizable and thus did not support the /ata/ interpretation (see Tiippana, 2014, for a similar argument and, e.g., Andersen, Tiippana, Laarni, Kojo, & Sams, 2009; Saalasti et al, 2012, for similar patterns).…”
Section: Discussion (mentioning)
confidence: 99%
“…The ability to match visual articulatory gestures with auditory speech information is one of the key skills necessary for successful audiovisual speech perception. This study may, therefore, serve as a baseline for evaluating the processing of facial articulatory movements in various populations for whom atypical audiovisual speech perception has been reported, such as autism (Foxe et al, 2013; Guiraud et al, 2012; Saalasti et al, 2012; Stevenson et al, 2014; Taylor, Isaac, & Milne, 2010), dyslexia (Bastien-Toniazzo, Stroumza, & Cavé, 2010), specific language impairment (Boliek, Keintz, Norrix, & Obrzut, 2010; Hayes, Tiippana, Nicol, Sams, & Kraus, 2003; Kaganovich, Schumaker, Macias, & Anderson, in press; Leybaert et al, 2014; Meronen, Tiippana, Westerholm, & Ahonen, 2013; Norrix, Plante, & Vance, 2006; Norrix, Plante, Vance, & Boliek, 2007), and phonological disorders (Dodd, McIntosh, Erdener, & Burnham, 2008). …”
Section: Discussion (mentioning)
confidence: 99%
“…Reported differences in gaze to faces in children with ASD appear to vary depending on the age of the child (Dawson et al, 2005; Chawarska and Shic, 2009; Senju and Johnson, 2009). Moreover, recent work by Foxe et al (2013) suggests that multisensory integration deficits present in children with ASD may resolve in adulthood (although subtle differences may persist; Saalasti et al, 2012). …”
Section: Introduction (mentioning)
confidence: 99%