2023
DOI: 10.1073/pnas.2305853120
Common population codes produce extremely nonlinear neural manifolds

Anandita De,
Rishidev Chaudhuri

Abstract: Populations of neurons represent sensory, motor, and cognitive variables via patterns of activity distributed across the population. The size of the population used to encode a variable is typically much greater than the dimension of the variable itself, and thus, the corresponding neural population activity occupies lower-dimensional subsets of the full set of possible activity states. Given population activity data with such lower-dimensional structure, a fundamental question asks how close the low-dimension…

Cited by 5 publications (5 citation statements)
References 93 publications (171 reference statements)
“…What happens if these assumptions are violated? Violations of the first and second assumptions—independence and linearity of patterns—are known to impact data analysis (23) and are discussed at length in statistics textbooks (24). Namely, when the latent underlying patterns are not independent of each other, PCA will output components which fail to capture any of the correlated factors (Fig.…”
Section: Results
confidence: 99%
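The failure mode described in this quote is easy to reproduce. Below is a minimal sketch (ours, not code from the cited papers; all names and parameters are illustrative) in which two correlated latent factors drive a population through distinct loading patterns, and the leading principal component aligns with a mixture of the two factors rather than with either one alone.

```python
# Minimal sketch: PCA mixes correlated latent factors instead of
# recovering them individually. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_neurons = 1000, 50

# Two latent factors with correlation 0.8.
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
latents = rng.multivariate_normal(np.zeros(2), cov, size=n_samples)

# Each factor drives the population through its own random loading pattern.
loadings = rng.standard_normal((2, n_neurons))
data = latents @ loadings + 0.1 * rng.standard_normal((n_samples, n_neurons))

# PCA via SVD of the centered data matrix.
data_centered = data - data.mean(axis=0)
_, _, vt = np.linalg.svd(data_centered, full_matrices=False)
pc1 = vt[0]  # unit-norm leading component

# PC1 has sizable overlap with BOTH loading patterns (a mixture),
# rather than aligning with either factor's pattern alone.
for i, w in enumerate(loadings):
    cos = abs(pc1 @ w) / np.linalg.norm(w)
    print(f"|cos(PC1, loading {i})| = {cos:.2f}")
```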
“…It is also unknown how phantom oscillations impact other machine learning methods that use PCA for preprocessing. Many studies explore how low-level statistics influence dimensionality reduction (21–23, 32, 50–58). These studies mirror our own by showing that patterns in high-dimensional data analysis may not always reflect true patterns in the data.…”
Section: Discussion
confidence: 99%
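For readers unfamiliar with the effect mentioned in this quote, the sketch below (our illustration, not code from the cited study) shows how "phantom oscillations" can arise: PCA applied to smooth but otherwise unstructured timeseries returns sinusoid-like components ordered by frequency, even though no oscillation was put into the data.

```python
# Phantom-oscillation demo: PCA on smoothed white noise yields
# Fourier-like components. Kernel width and sizes are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_time = 200, 300

# Smooth random trajectories: white noise convolved with a Gaussian kernel.
t = np.arange(n_time)
kernel = np.exp(-0.5 * ((t - n_time // 2) / 15.0) ** 2)
noise = rng.standard_normal((n_trials, n_time))
smooth = np.array([np.convolve(row, kernel, mode="same") for row in noise])

# PCA over time points: right singular vectors are time courses.
smooth -= smooth.mean(axis=0)
_, _, vt = np.linalg.svd(smooth, full_matrices=False)

# The leading components look sinusoidal: zero crossings grow with PC index.
for i in range(3):
    crossings = np.sum(np.diff(np.sign(vt[i])) != 0)
    print(f"PC{i + 1}: {crossings} zero crossings (sinusoid-like)")
```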
“…Clearly, MP neurons’ width, indexed by an average of the FWHM across all peaks, was significantly narrower than SP neurons’ width (RNN: 1.29 versus 2.14, p < 0.001; PoS: 0.82 versus 1.05, p < 0.001; ADn: 0.79 versus 1.28, p < 0.001). Recent studies have suggested that the width of a neuron’s tuning curve plays a critical role in shaping the dimensionality of neural representation spaces (De and Chaudhuri, 2023; Kim et al., 2020; Kriegeskorte and Wei, 2021; Langdon et al., 2023), which in turn influences the functionality of neural networks. Thus, here we delved into the impact of tuning-curve width on HD and AHV encoding.…”
Section: Results
confidence: 99%
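The width index quoted above, full width at half maximum (FWHM), can be computed directly from a sampled tuning curve. The helper below is a hypothetical implementation (the function name and the von Mises example are our assumptions, not the paper's code); it assumes a unimodal, baseline-subtracted peak.

```python
# Hypothetical FWHM helper for a single tuning-curve peak.
import numpy as np

def fwhm(angles, rates):
    """FWHM of a unimodal tuning curve sampled at `angles` (radians)."""
    rates = rates - rates.min()          # subtract baseline firing rate
    half = rates.max() / 2.0
    above = np.where(rates >= half)[0]   # samples at or above half maximum
    return angles[above[-1]] - angles[above[0]]

# Example: a von Mises head-direction tuning curve peaked at 0 rad.
angles = np.linspace(-np.pi, np.pi, 361)
kappa = 4.0
rates = np.exp(kappa * np.cos(angles)) / np.exp(kappa)
print(f"FWHM = {fwhm(angles, rates):.2f} rad")
```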
“…The key factor for expanding the dimensionality of neural representational spaces is the width of tuning curves. Research has repeatedly shown that, for one-dimensional features, narrower tuning curves not only enable individual neurons to encode a greater volume of information (Zhang and Sejnowski, 1999) but also contribute to increasing the linear dimensions of neural populations (De and Chaudhuri, 2023; Kriegeskorte and Wei, 2021) and producing more reliable signals for low-latency communication (Lenninger et al., 2023). Note that this characteristic has previously been discussed primarily in the context of SP neurons, which exhibit great variance in their tuning-curve widths.…”
Section: Discussion
confidence: 99%
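The claim that narrower tuning raises linear dimensionality is straightforward to check numerically. The sketch below is our construction with illustrative parameters (see De and Chaudhuri, 2023 for the formal treatment): it tiles a circular variable with von Mises tuning curves and counts how many principal components are needed to capture 95% of the population variance as the curves narrow.

```python
# Narrower tuning curves -> higher linear dimensionality of the
# population response to a circular variable. Parameters illustrative.
import numpy as np

n_neurons, n_stimuli = 100, 500
theta = np.linspace(0, 2 * np.pi, n_stimuli, endpoint=False)
prefs = np.linspace(0, 2 * np.pi, n_neurons, endpoint=False)

def linear_dim(kappa, var_frac=0.95):
    """Number of PCs needed to explain `var_frac` of population variance."""
    # von Mises tuning: narrow for large kappa, broad for small kappa.
    responses = np.exp(kappa * np.cos(theta[:, None] - prefs[None, :]))
    responses -= responses.mean(axis=0)
    s = np.linalg.svd(responses, compute_uv=False)
    var = np.cumsum(s**2) / np.sum(s**2)
    return int(np.searchsorted(var, var_frac) + 1)

for kappa in (1.0, 4.0, 16.0):
    print(f"kappa={kappa:5.1f}: {linear_dim(kappa)} PCs for 95% variance")
```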
“…culminates in high-level representations of visual objects which allow for object recognition (DiCarlo et al., 2012). Additionally, common neural population codes have been shown to have highly nonlinear relationships, suggesting nonlinear transformation is a common feature of sensory processing (De & Chaudhuri, 2023). The results of the modeling analysis provide additional evidence for this conclusion, revealing that the best predictive power across all global properties is achieved after several transformations of the time-frequency input, which ultimately highlights the critical role of information abstraction that underlies the processing hierarchy in these deep models (Kell & McDermott, 2019; Yamins & DiCarlo, 2016).…”
Section: Discussion
confidence: 99%