2012
DOI: 10.1016/j.brainres.2012.05.015
Crossmodal interactions and multisensory integration in the perception of audio-visual motion — A free-field study

Cited by 25 publications (23 citation statements)
References: 62 publications
“…The experimental protocol (including the recording of hearing thresholds, the participants' task, and the pointing procedure) was adapted from our previous localization study (Schmiedchen et al., 2012). Participants were tested in complete darkness.…”
Section: Methods (mentioning)
confidence: 99%
“…The auditory and visual stimuli were digitally generated using RPvdsEx (real-time processor visual design studio, Tucker Davis Technologies [TDT]) and delivered to two multichannel signal processors (RX8, System3, TDT). A more detailed description of the experimental setup and the calibration of the loudspeakers is given in Schmiedchen, Freigang, Nitsche, and Rübsamen (2012).…”
Section: Apparatus and Stimuli (mentioning)
confidence: 99%
“…This is mostly achieved by using a pointing task, where participants are instructed to indicate the perceived position of a sound source (Shankweiler 1961; Seeber et al. 2010; Kerber and Seeber 2012; Kühnle et al. 2012; Schmiedchen et al. 2012, 2013; Freigang et al. 2014a) or by alignment of the participant's head and gaze to the direction of the sound source (Lewald et al. 2000). Another paradigm used to measure the localisation accuracy is the absolute identification task (Abel et al. 2000), where participants have to indicate the sound direction by choosing one of several indicated sound source positions in an n-alternative forced-choice task.…”
Section: Measuring Auditory Space Processing (mentioning)
confidence: 99%
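To make the two scoring schemes in the excerpt above concrete, here is a minimal sketch, not taken from any of the cited studies; the speaker azimuths and responses are invented for illustration. The pointing task yields a continuous signed error per target, whereas the absolute identification task yields a proportion of correctly chosen loudspeaker positions.

```python
import statistics

# Hypothetical free-field setup: loudspeaker azimuths in degrees (invented for illustration).
speaker_azimuths = [-45, -15, 0, 15, 45]

# Pointing task: participants indicate the perceived azimuth on a continuous scale.
# Accuracy is usually summarized per target as the mean signed error (bias) and its
# trial-to-trial variability (precision).
pointing_responses = {0: [2.5, -1.0, 4.0], 15: [18.0, 12.5, 16.0]}  # target -> responses (deg)
for target, responses in pointing_responses.items():
    errors = [r - target for r in responses]
    print(f"target {target:+d} deg: bias {statistics.mean(errors):+.1f} deg, "
          f"sd {statistics.stdev(errors):.1f} deg")

# Absolute identification (n-alternative forced choice): participants choose one of the
# loudspeaker positions; performance is the proportion of correct identifications.
nafc_trials = [(-15, -15), (0, 15), (45, 45), (15, 15)]  # (presented, chosen) azimuth pairs
proportion_correct = sum(p == c for p, c in nafc_trials) / len(nafc_trials)
print(f"n-AFC proportion correct: {proportion_correct:.2f}")
```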
“…While there is an emerging consensus that the underlying neural correlates likely involve multiple stages of the sensory decision-making pathways, it remains a challenge to uncover the dynamic processes that implement the multisensory benefit for an upcoming decision in the human brain (Bizley et al., 2016; Kayser and Shams, 2015; Rohe and Noppeney, 2014; Rohe and Noppeney, 2016). For example, many studies have shown that judgements about visual motion can be influenced by simultaneous sounds (Alais and Burr, 2004; Beer and Röder, 2004; Lewis and Noppeney, 2010; Schmiedchen et al., 2012) or vestibular information (Fetsch et al., 2010; Gu et al., 2008), even when the multisensory stimulus is not directly task relevant (Gleiss and Kayser, 2014b; Kim et al., 2012; Sekuler et al., 1997). In particular, congruent multisensory evidence enhances visual motion discrimination performance over incongruent multisensory information (Meyer and Wuerger, 2001; Meyer et al., 2005; Soto-Faraco et al., 2003; Soto-Faraco et al., 2002).…”
Section: Introduction (mentioning)
confidence: 99%
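The multisensory benefit described in this excerpt is commonly modelled as reliability-weighted (maximum-likelihood) averaging of the unimodal estimates, the account associated with the cited Alais and Burr (2004) study. A minimal sketch with invented estimates and variances, showing why a congruent second cue sharpens the combined judgement:

```python
# Reliability-weighted (maximum-likelihood) cue combination: each cue is weighted by its
# inverse variance, and the fused estimate has lower variance than either cue alone.
# All numbers are invented for illustration.

def combine(est_a: float, var_a: float, est_v: float, var_v: float) -> tuple[float, float]:
    w_a = (1 / var_a) / (1 / var_a + 1 / var_v)  # auditory weight
    w_v = 1 - w_a                                # visual weight
    fused_est = w_a * est_a + w_v * est_v        # combined motion/position estimate
    fused_var = 1 / (1 / var_a + 1 / var_v)      # combined variance (never exceeds the smaller one)
    return fused_est, fused_var

est, var = combine(est_a=10.0, var_a=16.0, est_v=6.0, var_v=4.0)
print(f"fused estimate: {est:.1f} deg, variance: {var:.1f} (vs. 16.0 auditory, 4.0 visual)")
```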