2017
DOI: 10.1101/142562
Preprint
Modality-independent coding of scene categories in prefrontal cortex

Abstract: Natural environments convey information through multiple sensory modalities, all of which contribute to people's percepts. Although it has been shown that neural representations of visual content can be decoded from the visual cortex, it remains unclear where and how humans represent perceptual information at a conceptual level, not limited to a specific sensory modality. To address this question, we investigated how categories of scene images and sounds are represented in several brain regions. We found that …

Cited by 8 publications (16 citation statements)
References 53 publications
“…3B,D). Sensory-specific areas have long been implicated in cross-sensory processing (Jung et al, 2018; Kim and Zatorre, 2011; Kriegstein et al, 2005; Smith and Goodale, 2015; Vetter et al, 2014); however, in this case, the regions were activated in the absence of any visual stimulation or association with a visual cue, ruling out a multisensory integrative or paired associative process. Auditory objects and scenes have resisted easy analogical transfer from their visual counterparts, both as fundamental concepts (Griffiths and Warren, 2004) and as variables in experiments, due to differences in the spatiotemporal structure of an image vs. a sound.…”
Section: Hierarchy Within and Beyond Superior Temporal Regions
confidence: 75%
“…As we identified the neural representations of temperature and sound level in scene images, here we further explore these representations to better understand how scene content is represented in PFC. Previous research has demonstrated that the prefrontal cortex represents scene categories at an abstract level, which generalizes between visual and auditory input (Jung et al, 2018). Thus, it is possible that the prefrontal cortex represents global scene properties, temperature and sound-level information, at such an abstract level as well.…”
Section: Results
confidence: 99%
“…Research has demonstrated that various components of scenes are processed across several brain regions of the scene-selective network in the visual cortex, such as the parahippocampal place area (PPA, Epstein & Kanwisher, 1998), the retrosplenial cortex (RSC, Maguire, 2001), and the occipital place area (OPA, Dilks et al, 2013). However, recent studies have shown that visual scene processing extends beyond the visual cortex into associative cortex, including the parietal (Silson et al, 2016; Silson et al, 2019) and prefrontal (Jung et al, 2018) cortices. Does this indicate that the scene-processing network can be extended to the prefrontal cortex (PFC)?…”
Section: Introduction
confidence: 99%