2012
DOI: 10.3389/fpsyg.2012.00159

Processing of audiovisual associations in the human brain: dependency on expectations and rule complexity

Abstract: In order to respond to environmental changes appropriately, the human brain must not only be able to detect environmental changes but also to form expectations of forthcoming events. The events in the external environment often have a number of multisensory features such as pitch and form. For integrated percepts of objects and events, crossmodal processing, and crossmodally induced expectations of forthcoming events are needed. The aim of the present study was to determine whether the expectations created by …

Cited by 13 publications (23 citation statements)
References 18 publications
“…Indeed, in cued reaction time tasks, the largest decrease in reaction times is typically found when there is a constant delay between the cue and the target, and this advantage is reduced as the delay becomes more variable (Niemi & Näätänen, 1981). Moreover, there is a crucial temporal window during which audiovisual stimuli are integrated (Lindström, Paavilainen, Kujala, & Tervaniemi, 2012; Van Atteveldt, Formisano, Blomert, & Goebel, 2007; Van Wassenhove et al., 2007; Zampini, Shore, & Spence, 2003), and although the width of the window varies, the point of maximal integration is consistently when visual stimuli precede auditory stimuli (Thorne & Debener, 2008; Van Wassenhove et al., 2007). Similarly, electrophysiological recordings show an enhancement of the neural response to auditory tones when they are preceded by a somatosensory or visual stimulus (Lakatos, Chen, O'Connell, Mills, & Schroeder, 2007; Kayser & Logothetis, 2009; Lakatos et al., 2009; Thorne, De Vos, Viola, & Debener, 2011; Wallace, Wilkinson, & Stein, 1996), with the largest AV effect found at an audiovisual SOA of ~65 ms.…”
Section: Discussion (mentioning)
confidence: 99%
“…Visual material can establish predictions for a sound (Bendixen et al., 2012; Lindström et al., 2012; Clark, 2013). Particularly, a mismatch between a predictive note-like symbol and the pitch of the corresponding sound elicits brain responses that signal the violation of a prediction.…”
Section: Experiment 1: Separate Predictions (mentioning)
confidence: 99%
“…The IR presumably reflects the prediction error at sensory levels of processing. At cognitive levels, where the sound is categorized with respect to task affordances (e.g., whether or which button has to be pressed), the N2b is elicited (Widmann et al., 2004; Lindström et al., 2012). The fronto-centrally distributed ERP component is observable at approximately 200 ms after sound onset.…”
Section: Experiment 1: Separate Predictions (mentioning)
confidence: 99%
“…The neural mechanisms of audiovisual integration can be investigated with the N2 response, a negative deflection of the event-related potential (ERP) during the early phases of auditory change detection (Näätänen, Simpson, & Loveless, 1982; Novak, Ritter, Vaughan, & Wiznitzer, 1990). The N2 response is a feasible tool for investigating how visual information affects auditory processing at the cortical level (Lindström, Paavilainen, Kujala, & Tervaniemi, 2012). The N2 response to attended auditory stimulus changes consists of two components: mismatch negativity (MMN) and N2b (Näätänen, 1992; Näätänen, Gaillard, & Mäntysalo, 1978; Näätänen et al., 1982; Novak et al., 1990).…”
(mentioning)
confidence: 99%