2004
DOI: 10.1016/j.tics.2004.02.002

Merging the senses into a robust percept

Cited by 1,566 publications (1,308 citation statements)
References 53 publications
“…Following the same line of reasoning as for the disambiguating capacity of spatiotemporal context, sensory information is likely integrated over multiple modalities to resolve ambiguities that arise within a unimodal processing stream. The rapidly growing body of research on crossmodal interactions with unambiguous, yet noisy, stimuli supports this view, suggesting that disambiguation of unreliable sensory information is a primary purpose of crossmodal interactions [57]. While perceptual choices for ambiguous stimuli in different sensory modalities can occur independently [2], there have been many reports of crossmodal contextual influences of unambiguous sensory information on the perception of otherwise ambiguous stimuli.…”
Section: Crossmodal Context
Citation type: mentioning
confidence: 99%
“…ventriloquism (Alais and Burr, 2004; Lewald and Guski, 2003; Slutsky and Recanzone, 2001), well accounted for by Bayesian models of multisensory integration (Alais and Burr, 2004; Burr and Alais, 2006; Ernst and Bülthoff, 2004; Witten and Knudsen, 2005). More generally, vision tends to be most reliable in encoding spatial cues, whereas audition provides the most reliable temporal cues.…”
Section: Phase Of Neural Oscillations: Encoding Time (Or Space?)
Citation type: mentioning
confidence: 99%
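For two conflicting cues with roughly Gaussian noise, the Bayesian account cited in the excerpt above reduces to reliability-weighted averaging, with each modality weighted by its inverse variance. The Python sketch below illustrates that scheme; the function name and the example numbers are illustrative assumptions, not values taken from any of the cited studies.

```python
def combine_cues(est_v, var_v, est_a, var_a):
    """Maximum-likelihood fusion of two Gaussian cue estimates.

    Each cue is weighted by its reliability (inverse variance), so the more
    reliable modality dominates the fused percept, and the fused variance is
    never larger than either single-cue variance.
    """
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_a)
    w_a = 1.0 - w_v
    fused_estimate = w_v * est_v + w_a * est_a
    fused_variance = (var_v * var_a) / (var_v + var_a)
    return fused_estimate, fused_variance


# Ventriloquism-style example: vision locates the source precisely (low variance),
# audition only coarsely, so the fused location is captured by the visual estimate.
print(combine_cues(est_v=0.0, var_v=1.0, est_a=5.0, var_a=9.0))  # (0.5, 0.9)
```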
“…The temporal compression observed under intentional (or causal) binding could then be explained as an instance of the more general phenomenon of multisensory integration, which involves the merging of cues from different modalities into a single percept (Ernst & Bülthoff, 2004). This occurs only within a small temporal window around simultaneity, often called the temporal window of integration (Shams et al., 2002; Bresciani et al., 2005).…”
Section: Visuomotor Temporal Recalibration and Intentional Binding
Citation type: mentioning
confidence: 99%
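As a toy illustration of the temporal window described in the excerpt above, the sketch below simply gates integration on the stimulus onset asynchrony; the 100 ms half-width is an arbitrary placeholder, not a value reported by the cited papers.

```python
def within_integration_window(soa_ms: float, half_width_ms: float = 100.0) -> bool:
    """Return True if two crossmodal signals fall inside the assumed temporal
    window of integration and would be merged into a single percept.

    soa_ms: signed stimulus onset asynchrony between the two modalities.
    half_width_ms: assumed half-width of the window around simultaneity.
    """
    return abs(soa_ms) <= half_width_ms


print(within_integration_window(40.0))   # True: close enough to be fused
print(within_integration_window(250.0))  # False: registered as separate events
```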
“…This means that, at least theoretically, the IE procedure may capture the distribution underlying recalibration of TOJs. Also, it has been shown that multisensory integration usually involves the loss of access to the unisensory estimates (Ernst & Bülthoff, 2004). It seems plausible that this would already lead to a compressive bias around simultaneity in the registration of SOAs.…”
Section: Models Of Time Perception and The Window Of Integration
Citation type: mentioning
confidence: 99%
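The step from losing access to the unisensory estimates to a compressive bias in registered SOAs can be made concrete with a toy model: within the integration window, each event time is pulled part-way toward the fused common time, shrinking the registered asynchrony. Everything below (function name, fusion weight, window width) is a hypothetical illustration, not a model from the cited work.

```python
def registered_soa(true_soa_ms: float, fusion_weight: float = 0.5,
                   half_width_ms: float = 100.0) -> float:
    """Toy model of compressive SOA registration under partial fusion.

    Inside the integration window each event time is pulled toward the midpoint
    of the two events by `fusion_weight`, which shrinks the registered
    asynchrony toward simultaneity; outside the window the SOA stays veridical.
    """
    if abs(true_soa_ms) > half_width_ms:
        return true_soa_ms
    return (1.0 - fusion_weight) * true_soa_ms


for soa in (0.0, 40.0, 80.0, 250.0):
    print(soa, "->", registered_soa(soa))  # 0.0, 20.0, 40.0, 250.0
```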