2008
DOI: 10.1007/s00221-008-1664-6

The dog’s meow: asymmetrical interaction in cross-modal object recognition

Abstract: Little is known about cross-modal interaction in complex object recognition. The factors influencing this interaction were investigated using simultaneous presentation of pictures and vocalizations of animals. In separate blocks, the task was to identify either the visual or the auditory stimulus, ignoring the other modality. The pictures and the sounds were congruent (same animal), incongruent (different animals) or neutral (animal with meaningless stimulus). Performance in congruent trials was better than in in…
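To make the factorial structure of the paradigm concrete, the following is a minimal Python sketch, not taken from the paper: it lays out the attended-modality × congruency design described in the abstract and computes a simple congruency effect per attended modality. All reaction times and the exact effect measure are hypothetical placeholders for illustration only.

```python
# Minimal sketch (hypothetical data): the 2 x 3 design from the abstract --
# attended modality (visual/auditory) crossed with congruency
# (congruent/incongruent/neutral) -- and a per-modality congruency effect.

from statistics import mean

# Hypothetical reaction times (ms), keyed by (attended_modality, congruency).
rts = {
    ("visual",   "congruent"):   [512, 498, 530],
    ("visual",   "incongruent"): [525, 540, 518],
    ("visual",   "neutral"):     [520, 515, 528],
    ("auditory", "congruent"):   [601, 590, 612],
    ("auditory", "incongruent"): [680, 655, 702],
    ("auditory", "neutral"):     [640, 628, 655],
}

def congruency_effect(modality: str) -> float:
    """Incongruent minus congruent mean RT for one attended modality."""
    return mean(rts[(modality, "incongruent")]) - mean(rts[(modality, "congruent")])

# An asymmetric cross-modal interaction would show a larger congruency effect
# when attending to one modality than to the other.
for modality in ("visual", "auditory"):
    print(f"{modality}: congruency effect = {congruency_effect(modality):.1f} ms")
```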

Cited by 82 publications (96 citation statements)
References: 45 publications
“…In particular, since associative sensory features of a learned stimulus are found to reactivate sensory areas at retrieval (Harris et al., 2001), even when only one sensory modality is cued (Nyberg et al., 2000; von Kriegstein and Giraud, 2006), these reactivations can reliably probe binding of modality-specific distributed brain regions while retrieving relevant information, either hippocampally (Staresina et al., 2009; Takashima et al., 2009) or neocortically mediated (as reported here). Furthermore, the assimilation of multisensory perceived stimuli into one coherent whole (Amedi et al., 2005; Driver and Noesselt, 2008) can be more thoroughly investigated when considering long-term consequences of these assimilative mechanisms, and the mediating effect of (semantic) congruency (Kim et al., 2008; Yuval-Greenberg and Deouell, 2009). Finally, since training can modify congruency judgments (Ernst, 2007), sometimes even modulated by other modalities (Fredembach et al., 2009), these findings can be very helpful when designing educational programs where multisensory learning is an integral part of the curriculum (Lasry and Aulls, 2007) [e.g., in medical education (Patel et al., 2009)].…”
Section: Discussion (mentioning)
confidence: 99%
“…We chose this setup because multisensory stimuli that fit with prior knowledge can be regarded as schema-congruent, and remembering multisensory information is easier if it represents a feature combination congruent with prior experience (Kim et al., 2008; Yuval-Greenberg and Deouell, 2009). This semantic congruency can be regarded as information that can readily be assimilated into preexisting mental schemata.…”
Section: Introduction (mentioning)
confidence: 99%
“…Jaekl & Harris, 2009; Yuval-Greenberg & Deouell, 2009). The additional increase in response latencies to complex sounds in the older group may relate to the auditory processing deficits observed in normal aging.…”
Section: Semantic Classification Task (mentioning)
confidence: 99%
“…However, this cross-modal RT facilitation effect was larger in the auditory modality, suggesting that older participants were faster to respond to complex sounds when they were accompanied by visual stimuli than vice versa. A more pronounced influence of concurrent visual stimulation on auditory perception has been demonstrated in previous behavioural studies (cf. Chen & Spence, 2010; Yuval-Greenberg & Deouell, 2009). Faster RTs to tones following concurrent AV presentation may occur because the visual modality provides more reliable and unambiguous information for object recognition.…”
Section: Visual Dominance (mentioning)
confidence: 99%
“…In natural situations, the information about a given object is consistent across different sensory modalities (Yuval-Greenberg & Deouell, 2009). Cross-modal recognition is the brain's ability to identify an individual or an object through the interaction of the senses.…”
Section: Cross-Modal Recognition (unclassified)