2010
DOI: 10.1016/j.neunet.2010.05.008
Information-theoretic methods for studying population codes

Abstract: Population coding is the quantitative study of which algorithms or representations the brain uses to combine and evaluate the messages carried by different neurons. Here, we review an information-theoretic approach to population coding. We first discuss how to compute the information carried by simultaneously recorded neural populations, and in particular how to reduce the limited-sampling bias that affects the calculation of information from a limited amount of experimental data. We then discuss …
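The first step the abstract describes, computing the information carried by recorded responses, reduces to estimating the mutual information between stimulus and response from a joint probability table. The sketch below (my own illustration, not code from the reviewed paper; the function name is mine) shows the standard plug-in estimator that the bias-correction methods build on:

```python
import numpy as np

def mutual_information_plugin(counts):
    """Plug-in (maximum-likelihood) estimate of I(S;R) in bits from a
    joint count table: rows index stimuli, columns index response bins."""
    joint = counts / counts.sum()                 # empirical P(s, r)
    ps = joint.sum(axis=1, keepdims=True)         # marginal P(s)
    pr = joint.sum(axis=0, keepdims=True)         # marginal P(r)
    nz = joint > 0                                # 0 * log 0 = 0 convention
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])))

# Two stimuli, two response bins; the response identifies the stimulus exactly:
counts = np.array([[50, 0],
                   [0, 50]])
print(mutual_information_plugin(counts))  # prints 1.0 (one full bit)
```

Because the estimator plugs empirical frequencies straight into the information formula, it inherits the limited-sampling bias that the review is concerned with.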

Cited by 51 publications (78 citation statements)
References 66 publications
“…However, there is a long-standing debate over the role and significance of concerted activity in neural information processing (Averbeck and Lee 2004). A number of studies have revealed that concerted neuronal activity encodes extra stimulus information that cannot be extracted from the responses of single neurons and plays important roles in animal behavior (Dan et al 1998; Ince et al 2010; Ishikane et al 2005), whereas other work suggested that neuronal correlations convey little information and can be largely neglected (Meytlis et al 2012; Nirenberg et al 2001; Oizumi et al 2010). In this study, we observed the changes in information representation throughout the process of adaptation.…”
Section: Discussion
confidence: 99%
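A minimal illustration of the first side of this debate, correlations carrying information that no single neuron does, is the textbook XOR construction (my own toy example, not one drawn from the works cited above): each neuron alone is statistically independent of the stimulus, yet the pair encodes it perfectly.

```python
import numpy as np
from collections import Counter

def mi_bits(pairs):
    """Plug-in mutual information (in bits) between the two variables
    in a list of (x, y) sample pairs."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return float(sum((c / n) * np.log2((c / n) / ((px[x] / n) * (py[y] / n)))
                     for (x, y), c in pxy.items()))

rng = np.random.default_rng(0)
s = rng.integers(0, 2, 10_000)   # binary stimulus
r1 = rng.integers(0, 2, 10_000)  # neuron 1: a fair coin, independent of s
r2 = r1 ^ s                      # neuron 2: completes the XOR with neuron 1

print(mi_bits(list(zip(s, r1))))           # close to 0: either neuron alone is blind to s
print(mi_bits(list(zip(s, zip(r1, r2)))))  # close to 1 bit: the pair encodes s exactly
```

The example is deliberately extreme; the cited studies concern graded versions of the same question, namely how much information is lost when correlations are ignored.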
“…The reason for this failure is that laboratory data sample the space of possible activity patterns rather sparsely, and this sparsity undermines our confidence in our knowledge of the underlying distribution, knowledge that is critical for determining the probabilities in Equation (1). This difficulty is referred to in the literature as the small-sample bias, and several ad hoc counter-measures have been proposed, although these have been limited to a small handful of neurons [3][4][5].…”
Section: Methods for Estimating Information Content in Single Spike Trains
confidence: 99%
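The small-sample bias this excerpt describes is easy to reproduce numerically, and the first-order Miller-Madow correction, one of the classical counter-measures, partially removes it. The demo below is my own sketch, not code from the cited works: sparse sampling makes the plug-in entropy fall well short of the true 4 bits, and the correction pushes the estimate back up.

```python
import numpy as np

def entropy_plugin(counts):
    """Maximum-likelihood (plug-in) entropy estimate in bits."""
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def entropy_miller_madow(counts):
    """Plug-in estimate plus the first-order Miller-Madow bias correction
    (K - 1) / (2 N ln 2), with K observed patterns and N samples."""
    k = int(np.count_nonzero(counts))
    n = int(counts.sum())
    return entropy_plugin(counts) + (k - 1) / (2 * n * np.log(2))

rng = np.random.default_rng(1)
true_h = 4.0  # uniform over 16 activity patterns = log2(16) bits
for n in (20, 200, 2000):
    counts = np.bincount(rng.integers(0, 16, n), minlength=16)
    print(n, round(entropy_plugin(counts), 3), round(entropy_miller_madow(counts), 3))
```

With 16 possible patterns and only 20 samples, many patterns are never observed at all, which is exactly the sparse-sampling regime the excerpt complains about; the problem worsens exponentially as more neurons are recorded jointly.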
“…Accordingly, the information loss decompositions can be connected to hierarchical decompositions of the mutual information [33,34]. Furthermore, the information loss associated with preserving only certain marginal distributions can be formulated in terms of maximum entropy [24], which makes loss lattices suitable for extending previous work that studied neural population coding within the maximum-entropy framework [15]. We will use the notation L(S; α) to refer to the cumulative terms of the information loss decomposition, in comparison to the cumulative terms of information gain I(S; α).…”
Section: Decompositions of Mutual Information Loss
confidence: 99%
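One way to read the duality this excerpt invokes (my own sketch using the excerpt's L(S; α) notation; the exact definitions are those of the cited decomposition papers): if p̃_α denotes the maximum-entropy distribution preserving the marginals indexed by α, the cumulative gain term is the information computed under p̃_α, and gain and loss are complementary:

```latex
L(S;\alpha) \;=\; I(S;R) \;-\; I_{\tilde p_{\alpha}}(S;R),
\qquad\text{so that}\qquad
I(S;\alpha) \;+\; L(S;\alpha) \;=\; I(S;R).
```

Under this reading, the loss lattice simply re-indexes the gain lattice by what each constraint set discards rather than by what it retains.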
“…The bivariate redundancy measure already used in [24] corresponds to I(S; i.j). The remaining incremental terms can be obtained from the information loss lattice using Equation (15). Note that we could have proceeded in a similar way, starting from a definition of the cumulative terms in the gain lattice, such as I_min, and then determining the terms of the loss lattice.…”
Section: Dual Decompositions of Information Gain and Information Loss
confidence: 99%