2022
DOI: 10.1101/2022.01.28.478267
Preprint
Learning the Vector Coding of Egocentric Boundary Cells from Visual Data

Abstract: The use of spatial maps to navigate through the world requires a complex, ongoing transformation of egocentric views of the environment into position within the allocentric map. Recent research has discovered neurons in retrosplenial cortex and other structures that could mediate the transformation from egocentric views to allocentric views. These egocentric boundary cells (EBCs) respond to the egocentric direction and distance of barriers relative to an animal's point of view. This egocentric coding based on th…

Cited by 3 publications (3 citation statements)
References 54 publications
“…In addition, the fact that these egocentric boundary cells can respond at distances up to 50 cm from the boundary wall suggests that they are guided by visual sensory input (Alexander et al., 2020). Computational models show that applying synaptic modification with sparse coding constraints (Lian & Burkitt, 2021; Olshausen & Field, 1996, 2004) to form an efficient representation of visual input across the entire foraging trajectory yields a population in which many simulated neurons have responses similar to egocentric boundary cells, responding similarly in circular and rectangular environments (Lian et al., 2022).…”
Section: Review Of Models Of Spatial Coordinate Representations
confidence: 99%
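The excerpt above describes learning an efficient (sparse) representation of visual input via synaptic modification. A minimal sketch of that idea, not the cited models' actual implementation, is the classic sparse coding objective min ||X − ΦA||² + λ|A|₁ with alternating sparse inference and a residual-driven weight update; the function names, toy dimensions, and ISTA inference used here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def ista_codes(X, Phi, lam, n_iters=100):
    """Infer sparse coefficients A minimising ||X - Phi A||^2 + lam * |A|_1
    via ISTA: a gradient step on reconstruction error, then soft threshold."""
    A = np.zeros((Phi.shape[1], X.shape[1]))
    L = np.linalg.norm(Phi, 2) ** 2 + 1e-8        # step size from Lipschitz bound
    for _ in range(n_iters):
        Z = A - Phi.T @ (Phi @ A - X) / L         # descend reconstruction error
        A = np.sign(Z) * np.maximum(np.abs(Z) - lam / L, 0.0)  # sparsify
    return A

def learn_step(X, Phi, lam=0.1, lr=0.05):
    """'Synaptic modification' step: move each basis vector (column of Phi)
    along the reconstruction residual, then renormalise to unit length."""
    A = ista_codes(X, Phi, lam)
    Phi = Phi + lr * (X - Phi @ A) @ A.T          # Hebbian-like residual update
    return Phi / np.maximum(np.linalg.norm(Phi, axis=0), 1e-8), A

# toy run: learn 16 unit-norm basis vectors from random 8-d "views"
X = rng.standard_normal((8, 32))
Phi = rng.standard_normal((8, 16))
Phi /= np.linalg.norm(Phi, axis=0)
for _ in range(20):
    Phi, A = learn_step(X, Phi)
```

In the cited work the input would be visual frames gathered along the foraging trajectory rather than random vectors; the sparsity penalty is what drives individual units toward selective, EBC-like responses.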
“…The sparse coding model used in this paper is implemented using the locally competitive algorithm (Rozell et al., 2008), which is also used in other work (Zhu and Rozell, 2013; Lian and Burkitt, 2021, 2022; Lian et al., 2023). Though some components of the model are not biologically plausible, the essence of this paper is to demonstrate that the principle of sparse coding can account for the development of the spatiotemporal properties of V1 cells.…”
Section: Discussion
confidence: 99%
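The locally competitive algorithm named above infers sparse codes through neural-style dynamics: each unit is driven by its match to the input and laterally inhibited by other active units with overlapping features. A minimal sketch under those assumptions (the function names, step size, and toy dimensions are illustrative, not from the cited paper):

```python
import numpy as np

def soft_threshold(u, lam):
    """LCA activation: subthreshold potentials give exactly zero activity."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca_sparse_code(x, Phi, lam=0.05, n_steps=500, eta=0.1):
    """Infer sparse coefficients a with Phi @ a ≈ x via LCA dynamics
    (Rozell et al., 2008): du/dt = drive - u - lateral inhibition."""
    n_units = Phi.shape[1]
    u = np.zeros(n_units)                    # membrane potentials
    drive = Phi.T @ x                        # feedforward input per unit
    G = Phi.T @ Phi - np.eye(n_units)        # lateral inhibition weights
    for _ in range(n_steps):
        a = soft_threshold(u, lam)           # thresholded (sparse) activities
        u += eta * (drive - u - G @ a)       # Euler step on the dynamics
    return soft_threshold(u, lam)

# toy demo: overcomplete dictionary, recover a 2-sparse mixture
rng = np.random.default_rng(0)
Phi = rng.standard_normal((8, 16))
Phi /= np.linalg.norm(Phi, axis=0)           # unit-norm dictionary columns
a_true = np.zeros(16); a_true[[2, 9]] = [1.5, -1.0]
x = Phi @ a_true
a_hat = lca_sparse_code(x, Phi)
print(np.count_nonzero(np.abs(a_hat) > 1e-3))  # only a few units stay active
```

The lateral-inhibition term `G @ a` is what makes the algorithm "locally competitive": units coding overlapping features suppress one another, so only a sparse subset remains active at convergence.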
“…An appealing feature of sparse coding is its learning ability, namely that the model describes how the connection weights can be learnt from the statistics of the input visual data, in a way that is similar to synaptic plasticity. Apart from its success in learning receptive fields of cells in the visual cortex, sparse coding has also been successfully applied to explain experimental results in other brain areas, such as auditory cortex (Smith and Lewicki, 2006) and the hippocampal formation (Lian and Burkitt, 2021, 2022; Lian et al., 2023). For a review see Beyeler et al. (2019).…”
Section: Introduction
confidence: 99%