2006
DOI: 10.1007/s00221-006-0634-0
Multisensory integration of speech signals: the relationship between space and time

Abstract: Integrating audiovisual cues for simple events is affected when sources are separated in space and time. By contrast, audiovisual perception of speech appears resilient when either spatial or temporal disparities exist. We investigated whether speech perception is sensitive to the combination of spatial and temporal inconsistencies. Participants heard the bisyllable /aba/ while seeing a face produce the incongruent bisyllable /ava/. We tested the level of visual influence over auditory perception when the soun…

Cited by 59 publications (53 citation statements)
References 44 publications
“…Thus far, we have assumed that the effect of the "unity assumption" on performance reported in the present study was attributable to top-down factors (i.e., cognitive factors that affect the decision about whether or not two signals go together, or refer to the same event; Radeau & Bertelson, 1977). However, it is important to note that stimulus-driven (or structural) factors, such as any fine-timescale temporal correspondence or correlation between the stimuli occurring in the two streams (see Armel & Ramachandran, 2003; Bermant & Welch, 1976; Jones & Jarick, 2006; Radeau & Bertelson, 1977, 1987; Welch, 1999) can also facilitate multisensory integration in a purely bottom-up manner. We attempted to minimize any such bottom-up differences in the integration of the auditory and visual speech stimuli in the present study by carefully matching the timing of the visual and auditory events used to make the matched and mismatched videos.…”
Section: Top-down and Bottom-up Factors Contributing To Multisensory (mentioning)
confidence: 61%
“…It should also be pointed out that the presentation of "informationally rich" stimuli such as the sight and sound of a kettle (i.e., events that have a greater internal temporal coherence and temporally varying structure) may promote more enhanced multisensory integration than stimuli of "low" informational content such as briefly presented lights and tones, where the only time-varying information consists of the onset and offset transitions (Jones & Jarick, 2006). This form of multisensory integration, driven by the coherence or correlation between two sensory signals, can be thought of as a bottom-up form of integration (Armel & Ramachandran, 2003; Bermant & Welch, 1976; Radeau & Bertelson, 1987; Welch, 1999).…”
mentioning
confidence: 99%
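One way to make the "coherence or correlation between two sensory signals" described above concrete is to compute the lagged correlation between an auditory amplitude envelope and a time-varying visual articulation signal such as lip aperture. The sketch below is illustrative only and is not taken from the cited studies; the signals, sampling rate, and lag window are hypothetical placeholders.

```python
# Minimal sketch (synthetic signals, not data from the cited studies): quantify
# bottom-up audiovisual temporal coherence as the peak normalized
# cross-correlation between an auditory envelope and a lip-aperture signal,
# both sampled at the same rate.
import numpy as np

def peak_crosscorr(audio_env: np.ndarray, lip_aperture: np.ndarray,
                   fs: float, max_lag_s: float = 0.3):
    """Return (peak correlation, lag in seconds) within +/- max_lag_s."""
    a = (audio_env - audio_env.mean()) / audio_env.std()
    v = (lip_aperture - lip_aperture.mean()) / lip_aperture.std()
    n = len(a)
    max_lag = int(max_lag_s * fs)
    lags = list(range(-max_lag, max_lag + 1))
    corrs = []
    for lag in lags:
        if lag >= 0:
            c = np.dot(a[lag:], v[:n - lag]) / (n - lag)
        else:
            c = np.dot(a[:n + lag], v[-lag:]) / (n + lag)
        corrs.append(c)
    best = int(np.argmax(corrs))
    return corrs[best], lags[best] / fs

# Synthetic demo: an audio envelope that lags the lip signal by 100 ms.
fs = 100.0                                   # 100 Hz sampling (assumed)
t = np.arange(0, 2.0, 1 / fs)
lip = np.sin(2 * np.pi * 4 * t)              # ~4 Hz syllabic rhythm
audio = np.roll(lip, int(0.1 * fs)) + 0.2 * np.random.randn(len(t))
r, lag = peak_crosscorr(audio, lip, fs)
print(f"peak r = {r:.2f} at lag = {lag * 1000:.0f} ms")
```

A signal pair with a high, sharply peaked cross-correlation would count as "informationally rich" in the sense of the passage above, whereas brief lights and tones correlate only at their onsets and offsets.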
“…This has important consequences, because the rise time of a sound is easily confounded with the distance of the sound (distant sounds have shallow rise times; Blauert, 1997), and rise time may also explain why determining temporal order for audio-visual speech can be notoriously difficult. In fact, the delays at which auditory and visual speech streams are perceived as synchronous are extremely wide (Conrey & Pisoni, 2006; Dixon & Spitz, 1980; Jones & Jarick, 2006; Stekelenburg & Vroomen, 2007; van Wassenhove, Grant, & Poeppel, 2007; Vatakis & Spence, 2006). For example, in van Wassenhove et al., observers in an SJ task judged whether congruent audio-visual speech stimuli and incongruent McGurk-like speech stimuli (McGurk & MacDonald, 1976) were synchronous.…”
Section: When Is Simultaneous? (mentioning)
confidence: 99%
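The simultaneity-judgment (SJ) task mentioned in that passage is typically analyzed by fitting a window function to the proportion of "synchronous" responses across audiovisual asynchronies. The sketch below uses invented SOAs and response proportions purely to show the shape of such an analysis; it does not reproduce results from van Wassenhove et al. or any other cited study.

```python
# Minimal sketch (synthetic data): estimate the width and center of an
# audiovisual synchrony window by fitting a Gaussian-shaped function to
# proportion-"synchronous" responses across stimulus onset asynchronies.
import numpy as np
from scipy.optimize import curve_fit

def sj_window(soa_ms, amplitude, center, width):
    """Probability of a 'synchronous' response at a given SOA (ms)."""
    return amplitude * np.exp(-((soa_ms - center) ** 2) / (2 * width ** 2))

# Hypothetical SOAs (negative = auditory lead) and response proportions.
soas = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400], dtype=float)
p_sync = np.array([0.10, 0.30, 0.65, 0.90, 0.95, 0.92, 0.80, 0.55, 0.25])

params, _ = curve_fit(sj_window, soas, p_sync, p0=[1.0, 50.0, 150.0])
amp, center, width = params
print(f"window center ~ {center:.0f} ms (visual-lead bias), "
      f"SD ~ {abs(width):.0f} ms")
```

A wide fitted window (on the order of a few hundred milliseconds) is what the quoted statement means by the delays at which speech streams are still perceived as synchronous being "extremely wide".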
“…(Von Hornbostel, The Unity of the Senses, 1927/1950). For many years now, the majority of cognitive neuroscience research on the topic of multisensory perception has tended to focus on trying to understand, and increasingly to model (Alais & Burr, 2004; Ernst & Bülthoff, 2004; Roach, Heron, & McGraw, 2006), the spatial and temporal factors modulating multisensory integration (e.g., see Calvert, Spence, & Stein, 2004; Spence & Driver, 2004). Broadly speaking, it appears that multisensory integration is more likely to occur the closer that the stimuli in different modalities are presented in time (e.g., Jones & Jarick, 2006; Shore, Barnes, & Spence, 2006; van Wassenhove, Grant, & Poeppel, 2007). Spatial coincidence has also been shown to facilitate multisensory integration under some (Frens, Van Opstal, & Van der Willigen, 1995; Slutsky & Recanzone, 2001), but by no means all, conditions (see, e.g., Bertelson, Vroomen, Wiegeraad, & de Gelder, 1994; Innes-Brown & Crewther, 2009; Jones & Jarick, 2006; Jones & Munhall, 1997; Recanzone, 2003; Vroomen & Keetels, 2006).…”
mentioning
confidence: 99%
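The modeling work referenced in that statement (Alais & Burr, 2004; Ernst & Bülthoff, 2004) is built on reliability-weighted cue combination, in which each modality's estimate is weighted by its inverse variance. The sketch below illustrates that general idea with invented numbers; it is not an implementation from any of the cited papers.

```python
# Minimal sketch of maximum-likelihood cue combination: the less noisy cue
# dominates the combined estimate, and the combined variance is lower than
# either single-cue variance. Values are illustrative assumptions only.
def mle_combine(est_a: float, var_a: float, est_v: float, var_v: float):
    """Reliability-weighted average of an auditory and a visual estimate."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_v)
    w_v = 1 - w_a
    combined = w_a * est_a + w_v * est_v
    combined_var = 1 / (1 / var_a + 1 / var_v)   # always <= min(var_a, var_v)
    return combined, combined_var

# Example: a reliable visual location estimate pulls a noisy auditory one
# toward it, a ventriloquism-like outcome under this scheme.
loc, var = mle_combine(est_a=10.0, var_a=16.0, est_v=0.0, var_v=4.0)
print(f"combined location = {loc:.1f} deg, variance = {var:.1f}")
```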