Preprint, 2021. DOI: 10.1101/2021.11.25.470008

Resolving the time course of visual and auditory object categorization

Abstract: Humans can effortlessly categorize objects, whether they are conveyed through visual images or spoken words. To resolve the neural correlates of object categorization, studies have so far focused primarily on the visual modality. It therefore remains unclear how the brain extracts categorical information from auditory signals. In the current study we used EEG (N=47) and time-resolved multivariate pattern analysis to investigate (1) the time course with which object category information emerges in the auditory […]
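The abstract refers to time-resolved multivariate pattern analysis (MVPA). As a rough illustration of how such a decoding time course is typically obtained, the sketch below fits a classifier independently at every time point of simulated EEG epochs. The library choice (MNE-Python with scikit-learn), the data shapes, and the binary category labels are assumptions for illustration only, not the authors' actual pipeline.

```python
# Minimal sketch of time-resolved MVPA ("decoding") on EEG epochs.
# Generic illustration on simulated data; trial counts, channel counts,
# and the injected effect are made up and not taken from the study.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from mne.decoding import SlidingEstimator, cross_val_multiscore

rng = np.random.default_rng(0)

# Simulated epochs: 200 trials x 64 EEG channels x 120 time points.
n_trials, n_channels, n_times = 200, 64, 120
X = rng.standard_normal((n_trials, n_channels, n_times))
y = rng.integers(0, 2, n_trials)  # two hypothetical object categories

# Inject a weak category difference from time point 40 onward,
# so decoding accuracy rises above chance later in the epoch.
X[y == 1, :, 40:] += 0.15

# One linear classifier is trained and evaluated at each time point.
clf = make_pipeline(StandardScaler(), LinearSVC())
time_decoder = SlidingEstimator(clf, scoring="accuracy", n_jobs=1)

# 5-fold cross-validated accuracy per time point = the decoding time course.
scores = cross_val_multiscore(time_decoder, X, y, cv=5, n_jobs=1)
accuracy_over_time = scores.mean(axis=0)  # shape: (n_times,)
print(accuracy_over_time.round(2))
```

The resulting accuracy-over-time curve is what such studies inspect to determine when category information first becomes decodable from the neural signal.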

Cited by 1 publication (1 citation statement). References 30 publications (36 reference statements).
“…Going further, the influence of context on the perception of a stimulus is known not only in the auditory modality but also in the visual modality (Brandman & Peelen, 2017; Oliva & Torralba, 2007), where recent work has shown that a social context can significantly influence the perception of an individual visual stimulus and its neural processing (Abassi et al., 2020; Bellot et al., 2021). Although the hierarchy of information processing shows analogies between the visual and auditory modalities (Iamshchinina et al., 2022), a significant gap remains in the auditory literature regarding the impact of a social context on the perception of verbal sentences, especially when the auditory environment is challenging. The social context encompasses not only semantic cues but also intricate mechanisms arising from the interplay between two or more speakers, elements that are generally overlooked in studies of higher-order top-down cues; this is the second gap we aimed to fill in the current study.…”
Section: Introduction
confidence: 99%