1999
DOI: 10.1162/089892999563544
Auditory-Visual Integration during Multimodal Object Recognition in Humans: A Behavioral and Electrophysiological Study

Abstract: The aim of this study was (1) to provide behavioral evidence for multimodal feature integration in an object recognition task in humans and (2) to characterize the processing stages and the neural structures where multisensory interactions take place. Event-related potentials (ERPs) were recorded from 30 scalp electrodes while subjects performed a forced-choice reaction-time categorization task: At each trial, the subjects had to indicate which of two objects was presented by pressing one of two keys. The two …

Cited by 960 publications (913 citation statements); references 76 publications (105 reference statements).
“…Previous ERP studies employing simple detection tasks, such as those used in the present study, also demonstrated significant amplitude deflections in centro-temporal channels over auditory cortices and in posterior-occipital channels over visual cortices between 100 and 200 ms (cf. Giard & Peronnet, 1999; Teder-Salejarvi et al., 2005; Teder-Salejarvi, McDonald, Di Russo, & Hillyard, 2002). While the behavioural results in the semantic classification condition indicate dominance of vision over audition, previous ERP findings suggest that auditory animacy judgments begin within 100 ms after stimulus onset (Murray, Camen, Gonzalez Andino, Bovet, & Clarke, 2006), and such amplitude modulations are also within the timeframe attributed to animacy judgments of visual objects (Thorpe, Fize, & Marlot, 1996).…”
Section: Multisensory Integration (mentioning)
confidence: 99%
“…The notion that multisensory integration is restricted to higher-order areas has recently been challenged by human and animal studies that have revealed that crossmodal interactions can occur in unisensory areas at very low levels of cortical processing (Buchel et al., 1998; Calvert et al., 1999, 2001; Macaluso et al., 2000; Schroeder et al., 2001; Amedi et al., 2002; Ghazanfar et al., 2005; Kriegstein et al., 2005; Miller and D'Esposito, 2005; Watkins et al., 2006; Martuzzi et al., 2007; Kayser et al., 2007, 2008; Romei et al., 2007, 2008; Wang et al., 2008) and, more importantly, at very short latencies (Giard and Peronnet, 1999; Foxe et al., 2000; Molholm et al., 2002; Murray et al., 2005; Senkowski et al., 2007; Sperdin et al., 2009). Such fast timing of multisensory interactions rules out an origin in the multisensory areas mediated through backward projections, and instead favors direct heteromodal connections.…”
Section: Heteromodal Connections: Connections Between Different Senso… (mentioning)
confidence: 99%
“…Considering that only the peripheral visual field representation of V1 receives significant projections from the auditory cortex, such congruency in the spatial features may serve to facilitate gaze orienting, and consequently the relocation of foveal vision to peripheral locations in the visual field. However, it should also be mentioned that facilitative effects have been observed in humans when stimuli were centrally presented in both auditory and visual modalities (Giard and Peronnet, 1999; Molholm et al., 2002; Martuzzi et al., 2007). Ethologically, a role in alertness for dangerous stimuli is highly probable, an interpretation that can also be attributed to the visuo-somatosensory projections: the specific link between the FST visual complex and the representation of the face in the somatosensory cortex could contribute to phenomena of avoidance of a "dangerous" stimulus that may hit the body (Graziano, 2003, 2004).…”
Section: Specificity Of Heteromodal Connections: Ethological Role (mentioning)
confidence: 99%
“…Compared with pure acoustic stimuli, the presence of a congruent visual stimulus enhances accuracy and shortens reaction times (Giard & Peronnet, 1999; Van Wassenhove, Grant, & Poeppel, 2005), and this effect is maximal when acoustic stimuli are weak, noisy or degraded. The performance enhancement induced by visual cues in speech-in-noise occurs largely because vision and audition offer complementary information about the stimulus; vision conveys the place of articulation, while audition primarily conveys voicing and manner (Summerfield, 1987), providing concurrent cues that are ultimately merged in a single representation.…”
Section: Introduction (mentioning)
confidence: 99%