2007
DOI: 10.1162/jocn.2007.19.12.1964

Neural Correlates of Multisensory Integration of Ecologically Valid Audiovisual Events

Abstract: A question that has emerged over recent years is whether audiovisual (AV) speech perception is a special case of multisensory perception. Electrophysiological (ERP) studies have found that auditory neural activity (the N1 component of the ERP) induced by speech is suppressed and speeded up when a speech sound is accompanied by concordant lip movements. In Experiment 1, we show that this AV interaction is not speech-specific. Ecologically valid nonspeech AV events (actions performed by an actor such as han…

Cited by 299 publications (353 citation statements) | References 45 publications
“…However, the extent of the N1 decrease in the right hemisphere varied with the noise level and the behavioral gain, which suggests that the effect was not simply due to unisensory visual activity (since the visual stimuli were the same whatever the noise level). Recently, however, a study reproduced this N1 amplitude decrease using the additive model (Stekelenburg and Vroomen, 2007). Hence, overall, despite methodological problems, the decrease of the auditory N1 component associated with faster processing of audiovisual speech seems to be a robust result.…”
Section: Speech Perception (mentioning)
confidence: 98%
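For readers unfamiliar with the additive-model logic referred to in the excerpt above, the sketch below illustrates the comparison it rests on: the ERP evoked by the audiovisual event is contrasted with the sum of the unisensory auditory-only and visual-only ERPs, and any residual in the N1 window is taken as evidence of multisensory interaction. The waveforms, sampling rate, and N1 window used here are illustrative assumptions, not data or parameters from Stekelenburg and Vroomen (2007) or the citing studies.

```python
# Minimal sketch of the "additive model" test for audiovisual (AV) interaction,
# assuming three same-length grand-average ERP waveforms:
# erp_av (audiovisual), erp_a (auditory-only), erp_v (visual-only).
import numpy as np

fs = 500.0                         # sampling rate in Hz (assumed)
t = np.arange(-0.1, 0.5, 1 / fs)   # epoch from -100 ms to +500 ms

# Placeholder waveforms; in practice these come from averaged EEG epochs.
rng = np.random.default_rng(0)
erp_a = rng.normal(0.0, 0.5, t.size)
erp_v = rng.normal(0.0, 0.5, t.size)
erp_av = rng.normal(0.0, 0.5, t.size)

# Additive model: any AV interaction is the residual AV - (A + V).
interaction = erp_av - (erp_a + erp_v)

# Quantify the interaction in an assumed N1 window (80-120 ms post-stimulus).
n1_mask = (t >= 0.08) & (t <= 0.12)
n1_effect = interaction[n1_mask].mean()
print(f"Mean AV - (A + V) amplitude in the N1 window: {n1_effect:.3f} microvolts")

# A negative-going N1 that is smaller (less negative) for AV than for A + V
# appears here as a positive residual, i.e. the reported N1 suppression.
```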
“…Klucharev et al. (2003) and Stekelenburg and Vroomen (2007) argued that early interactions in speech perception reflect non-specific integration related to basic computations such as location in space and time because, around these latencies, ERPs are not sensitive to the incongruence of auditory and visual speech cues (Klucharev et al., 2003). At least three arguments undermine this claim: first, it is based on a logical flaw known as the converse error.…”
Section: Function of the Early Interaction Effects (mentioning)
confidence: 99%
“…de Gelder and Vroomen, 2000) already during early perceptual processing stages (e.g. de Gelder et al., 1999; Gerdes et al., 2013; Pourtois et al., 2000, 2002; Stekelenburg and Vroomen, 2007) probably involving specialized structures (e.g. de Gelder and Van den Stock, 2011), while incongruent audiovisual input can even lead to perceptual illusions (cf.…”
Section: Introduction (mentioning)
confidence: 99%