2013
DOI: 10.1523/jneurosci.3020-12.2013

Sensory and Striatal Areas Integrate Auditory and Visual Signals into Behavioral Benefits during Motion Discrimination

Abstract: For effective interactions with our dynamic environment, it is critical for the brain to integrate motion information from the visual and auditory senses. Combining fMRI and psychophysics, this study investigated how the human brain integrates auditory and visual motion into benefits in motion discrimination. Subjects discriminated the motion direction of audiovisual stimuli that contained directional motion signal in the auditory, visual, audiovisual, or no modality at two levels of signal reliability. Theref…

Cited by 35 publications (36 citation statements); references 51 publications.
“…For typical participants (group size n = 7) auditory direction could be decoded from the pattern of right LOC responses but not hMT+. However, another study that examined a similar question did successfully classify the direction of auditory motion from hMT+ activity (Dormal, Rezk, Yakobov, Lepore, & Collignon, 2016), adding to several other studies that documented audiovisual interactions in this region (von Saldern & Noppeney, 2013). The involvement of right LOC in coding for auditory-motion direction was also indicated by Alink et al. (2012) via a whole-brain searchlight-based pattern classification.…”
Section: Sensitivity to Regularity in Multimodal Contexts (mentioning)
confidence: 67%
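To make the pattern-classification approach referenced in this statement concrete, here is a minimal sketch of ROI-based decoding of motion direction with a cross-validated linear classifier. The data are simulated and all variable names (n_trials, n_voxels, signal strength) are illustrative assumptions; this is not the analysis pipeline of any of the cited studies.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 100                 # 40 leftward + 40 rightward trials (simulated)
labels = np.repeat([0, 1], n_trials // 2)    # 0 = leftward, 1 = rightward

# Simulated single-trial voxel patterns: Gaussian noise plus a weak
# direction-dependent signal carried by a fixed spatial pattern.
signal_pattern = rng.normal(size=n_voxels)
patterns = rng.normal(size=(n_trials, n_voxels)) + 0.3 * np.outer(labels * 2 - 1, signal_pattern)

# 5-fold cross-validated linear classification (real analyses typically use
# leave-one-run-out cross-validation on GLM beta estimates per trial or block).
clf = LinearSVC(max_iter=10_000)
accuracy = cross_val_score(clf, patterns, labels, cv=5).mean()
print(f"Cross-validated decoding accuracy: {accuracy:.2f} (chance = 0.50)")
```

Above-chance cross-validated accuracy in an ROI (or searchlight sphere) is the criterion these studies use to conclude that the region carries direction information.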
“…There are various approaches that employ the additive model, particularly in fMRI analysis, that are not associated with these confounds (e.g., Werner & Noppeney, 2010; von Saldern & Noppeney, 2013).…”
Section: Discussion (mentioning)
confidence: 99%
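The additive model mentioned here tests whether the audiovisual response equals the sum of the unisensory responses, AV = A + V. A minimal sketch of that contrast on subject-wise GLM beta estimates follows; the beta values are simulated placeholders and the paired test is only one of several ways such a contrast is evaluated, not the specific procedure of the cited papers.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_subjects = 20

# Simulated condition-wise GLM beta estimates (one value per subject per condition).
beta_a  = rng.normal(1.0, 0.3, n_subjects)   # auditory-only
beta_v  = rng.normal(1.2, 0.3, n_subjects)   # visual-only
beta_av = rng.normal(2.5, 0.3, n_subjects)   # audiovisual

# Super-/sub-additivity contrast: AV - (A + V); zero means the additive model holds.
contrast = beta_av - (beta_a + beta_v)
t_stat, p_val = stats.ttest_1samp(contrast, 0.0)
print(f"Mean AV - (A + V) = {contrast.mean():.2f}, "
      f"t({n_subjects - 1}) = {t_stat:.2f}, p = {p_val:.3f}")
```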
“…Multisensory enhancement, which is the most reliable index of multisensory integration (and will be discussed here most extensively), may reflect computations that yield response magnitudes that are equal to, less than, or greater than the sum of the responses to the individual component stimuli 7. In behaviour, performance enhancements are often quantified by evaluating differences in the accuracy and speed of detection, localization and/or identification of stimuli 3,4,1026. In short, multisensory integration refers to a broad class of computations involving multiple sensory modalities in which information is integrated to produce an enhanced (or degraded) response.…”
Section: Defining Multisensory Integration (mentioning)
confidence: 99%
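One common way to quantify the behavioural enhancement described in this statement is to compare multisensory performance against the best unisensory performance, e.g. enhancement = 100 × (AV − max(A, V)) / max(A, V). The sketch below uses made-up accuracy values purely for illustration; it is one conventional index, not the specific measure used in the paper above.

```python
def multisensory_enhancement(av: float, a: float, v: float) -> float:
    """Percent gain of the multisensory response over the best unisensory
    response; positive values indicate enhancement, negative degradation."""
    best_unisensory = max(a, v)
    return 100.0 * (av - best_unisensory) / best_unisensory

# Example with hypothetical proportion-correct values for direction discrimination.
acc_a, acc_v, acc_av = 0.68, 0.74, 0.85
print(f"Multisensory enhancement: {multisensory_enhancement(acc_av, acc_a, acc_v):.1f}%")
```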