2009
DOI: 10.1007/s00221-009-1831-4
Catch the moment: multisensory enhancement of rapid visual events by sound

Abstract: Repetition blindness (RB) is a visual deficit, wherein observers fail to perceive the second occurrence of a repeated item in a rapid serial visual presentation stream. Chen and Yeh (Psychon Bull Rev 15:404-408, 2008) recently observed a reduction of the RB effect when the repeated items were accompanied by two sounds. The current study further manipulated the pitch of the two sounds (same versus different) in order to examine whether this cross-modal facilitation effect is caused by the multisensory enhancement…

Cited by 26 publications (25 citation statements) · References 55 publications
“…Bresciani et al (2008) investigated the interaction of visual, auditory, and tactile sensory information during the presentation of sequences of events, and found that while vision, audition, and touch information were automatically integrated, their respective contributions to the integrated percept were different. In a bimodal setting, Chen and Yeh (2009) showed that the perception of visual events was enhanced by accompaniment with auditory stimulation, thus supporting the facilitation effect of sound on vision. Similarly, Odgaard et al (2004) showed that concurrent visual stimulation could enhance the loudness of auditory white noise, and Arabzadeh et al (2008) confirmed that a visual stimulation could improve the discrimination of somatosensory stimulation, thus providing support for the facilitation effect of vision on both auditory and somatosensory modalities.…”
Section: Introduction (mentioning)
confidence: 60%
“…This is called multimodal perception (Arabzadeh et al 2008; Driver and Noesselt 2008). Multimodal facilitation has been reported in several articles (Arabzadeh et al 2008; Bresciani et al 2008; Chen and Yeh 2009), where the authors emphasized that the processing of stimuli belonging to different sensory modalities can be facilitated by the simultaneous processing of a unimodal stimulus (Driver and Spence 1998; Meredith and Stein 1986; Teder-Salejarvi et al 2005). Bresciani et al (2008) investigated the interaction of visual, auditory, and tactile sensory information during the presentation of sequences of events, and found that while vision, audition, and touch information were automatically integrated, their respective contributions to the integrated percept were different.…”
Section: Introduction (mentioning)
confidence: 90%
“…Considering the three hypotheses outlined earlier (see the Introduction): preparedness-enhancement (Nickerson, 1973), signal-enhancement (Stein et al, 1996), and object-enhancement (Chen & Yeh, 2009), the experiments reported here are better accounted for by the last hypothesis. The former two hypotheses are inconsistent with the fact that the crossmodal facilitatory effect elicited by the presentation of the sound was not observed across all ISIs, and critically, no crossmodal facilitation was observed at the 0 ms ISI.…”
Section: How Does the Presentation of the Sound Enhance Visual Perception (mentioning)
confidence: 89%
“…These and other results have provided evidence for the notion that multisensory integration enhances signal clarity and/or reduces stimulus ambiguity (see e.g. Chen and Yeh, 2009; Olivers and Van der Burg, 2008; Vroomen and De Gelder, 2000). One drawback of the majority of studies to date, however, is that they examine interactions among single events at a time (i.e.…
Section: Introduction (mentioning)
confidence: 91%