2018
DOI: 10.1016/j.neuroscience.2018.04.030
Principles of Temporal Processing Across the Cortical Hierarchy

Abstract: The world is richly structured on multiple spatiotemporal scales. In order to represent spatial structure, many machine-learning models repeat a set of basic operations at each layer of a hierarchical architecture. These iterated spatial operations - including pooling, normalization and pattern completion - enable these systems to recognize and predict spatial structure while remaining robust to changes in the spatial scale, contrast and noisiness of the input signal. Because our brains also process temporal informatio…
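The iterated operations the abstract names (pooling and normalization repeated at each layer) can be sketched as a minimal numpy toy. The function names, three-layer depth, and window size are illustrative assumptions, not taken from the paper; the sketch only shows how repeating the same two operations per layer yields robustness to input contrast:

```python
import numpy as np

def pool(x, k=2):
    # Max-pool non-overlapping windows of length k (truncate any remainder).
    n = len(x) - len(x) % k
    return x[:n].reshape(-1, k).max(axis=1)

def normalize(x, eps=1e-8):
    # Divisive normalization: subtract the mean, rescale to unit norm.
    x = x - x.mean()
    return x / (np.linalg.norm(x) + eps)

def hierarchy(x, n_layers=3):
    # Repeat the same basic operations at every layer of the hierarchy.
    for _ in range(n_layers):
        x = normalize(pool(x))
    return x

signal = np.sin(np.linspace(0, 4 * np.pi, 64))
out_a = hierarchy(signal)
out_b = hierarchy(3.0 * signal)  # same input at triple the contrast
# Because each layer normalizes, the two outputs are identical:
# the representation is invariant to input gain.
```

Max-pooling commutes with a positive gain change and normalization then removes it, so the first layer already discards contrast; the remaining layers coarsen spatial scale.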

Cited by 81 publications (101 citation statements)
References 123 publications
“…In HAT, if prior context can be successfully compressed with new input, then the context is preserved, but if prior context and new input are incompatible (prediction error), then the context is overwritten. This surprise-driven gating mechanism is consistent with evidence for pattern violations being signaled independently at multiple levels of cortical processing (Bekinschtein et al., 2009; Himberger et al., 2018; Wacongne et al., 2011).…”
Section: Discussion (supporting)
confidence: 85%
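The surprise-driven gating described in this citation statement (preserve context while new input is compatible, overwrite it on prediction error) can be illustrated with a small sketch. The threshold, mixing rate, and scalar context are illustrative assumptions, not the actual HAT mechanism:

```python
def surprise_gated_context(inputs, threshold=1.0, alpha=0.5):
    """Toy surprise-driven gating over a scalar sequence.

    While new input is predictable from the running context, the input is
    compressed into the context; when prediction error exceeds `threshold`
    (a pattern violation), the context is overwritten. `threshold` and
    `alpha` are illustrative parameters, not values from HAT.
    """
    context = inputs[0]
    resets = []
    for t, x in enumerate(inputs[1:], start=1):
        error = abs(x - context)      # prediction error of the running context
        if error > threshold:         # incompatible: overwrite the context
            context = x
            resets.append(t)
        else:                         # compatible: fold input into context
            context = (1 - alpha) * context + alpha * x
    return context, resets

seq = [0.0, 0.1, 0.2, 5.0, 5.1]       # abrupt jump at t = 3
ctx, resets = surprise_gated_context(seq)
# The context is overwritten exactly once, at the pattern violation (t = 3).
```

Running this on `seq` flags only the step where the input jumps, mirroring the idea that surprise, not time alone, triggers the gate.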
“…These findings situate the core language network within the context of a cortical hierarchy of integration timescales (Himberger et al., 2018). The common functional profile shared by temporal and inferior frontal language regions occupies a particular stage within this broader hierarchy, which is located, as expected, downstream from auditory regions and upstream from the episodic network.…”
Section: The Core Language Network As a Unified Whole Occupies A Uniq… (supporting)
confidence: 69%
“…A second key question addressed by our study is the role of intrinsic processing in establishing the persistency of neuronal representations along an object-processing pathway. In both primates and rodents, intrinsic temporal scales have been found to increase along various cortical hierarchies (Chaudhuri et al., 2015; Himberger et al., 2018; Murray et al., 2014; Runyan et al., 2017). However, it is unknown whether a similar increase takes place along the ventral stream and, if so, how such an increase may combine with the growing stability of stimulus-driven responses we have discussed in the previous section.…”
Section: Discussion (mentioning)
confidence: 99%
“…Recently it has been shown that models of the ventral stream based on deep convolutional neural networks can be improved in their predictive power for perception and neural activity by including adaptation mechanisms (Vinken et al., 2019) and recurrent processing (Kar et al., 2019; Kietzmann et al., 2019; Tang et al., 2018). Moreover, a progressive increase of the importance of intrinsic processing along the ventral stream may be expected, given that intrinsic temporal scales increase along various cortical hierarchies in primates and rodents (Chaudhuri et al., 2015; Himberger et al., 2018; Murray et al., 2014; Runyan et al., 2017). However, it is not known how the interaction between invariant encoding of object information and intrinsic processing unfolds along the ventral stream, and whether or not the net result is an ordered progression of temporal scales of neural processing.…”
Section: Introduction (mentioning)
confidence: 99%
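The two mechanisms this citation statement names, adaptation and recurrent processing, can be combined in a small numpy sketch of a single layer driven by a sustained stimulus. Every kernel, constant, and function name here is an illustrative assumption; this is not the architecture of any of the cited models:

```python
import numpy as np

def conv1d(x, w):
    # 'same'-padded 1-D convolution with a small symmetric kernel.
    return np.convolve(x, w, mode="same")

def recurrent_adaptive_step(x, h_prev, a_prev, w_ff, w_rec):
    """One time step of a toy unit with feedforward drive, lateral
    recurrence, and a slow adaptation trace (all values illustrative)."""
    drive = conv1d(x, w_ff) + conv1d(h_prev, w_rec)
    a = 0.9 * a_prev + 0.1 * np.maximum(drive, 0.0)  # slow adaptation trace
    h = np.maximum(drive - 0.5 * a, 0.0)             # adaptation subtracts recent activity
    return h, a

w_ff = np.array([0.2, 0.6, 0.2])     # feedforward kernel (hypothetical)
w_rec = np.array([0.05, 0.1, 0.05])  # weak recurrent kernel (hypothetical)
x = np.ones(16)                      # sustained, unchanging stimulus
h = np.zeros(16)
a = np.zeros(16)
responses = []
for _ in range(20):
    h, a = recurrent_adaptive_step(x, h, a, w_ff, w_rec)
    responses.append(h.mean())
# The adaptation trace builds up, so the response to a constant input
# decays toward a lower steady state instead of staying fixed.
```

The slow trace `a` gives the unit an intrinsic timescale: its response depends on its recent history, which is the kind of intrinsic processing whose importance the quoted passage suggests may grow along the ventral stream.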