2014
DOI: 10.3389/fninf.2014.00001
Local active information storage as a tool to understand distributed neural information processing

Abstract: Every act of information processing can in principle be decomposed into the component operations of information storage, transfer, and modification. Yet, while this is easily done for today's digital computers, the application of these concepts to neural information processing was hampered by the lack of proper mathematical definitions of these operations on information. Recently, definitions were given for the dynamics of these information processing operations on a local scale in space and time in a distribu…

Cited by 141 publications (214 citation statements)
References 64 publications (89 reference statements)
“…Since at a local or pointwise level [24][25][26][27][28] (i.e., the terms inside the expectation), information is equal to change in surprisal, I_ccs seeks to measure shared information as the change in surprisal that is common to the input variables (hence CCS, Common Change in Surprisal). For two inputs, I_ccs is defined as:…”
Section: The EID Using I_ccs
confidence: 99%
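The pointwise view quoted above — information as a change in surprisal, with the expected (average) quantity recovered by summing the local terms — can be illustrated with a minimal sketch. The joint distribution and all numbers below are illustrative only, not taken from the cited papers:

```python
import math

# Hedged sketch: at the local/pointwise level, the information a single
# outcome y provides about x is the change in surprisal of x,
#   i(x; y) = -log2 p(x) - (-log2 p(x|y)) = log2( p(x, y) / (p(x) p(y)) ),
# which may be negative (misinformation). Averaging the local terms over
# p(x, y) recovers the ordinary (expected) mutual information.

p_xy = {  # illustrative joint distribution over two binary variables
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}
p_x = {x: sum(v for (xx, _), v in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(v for (_, yy), v in p_xy.items() if yy == y) for y in (0, 1)}

def local_mi(x, y):
    """Change in surprisal of x on observing y, in bits (can be negative)."""
    return math.log2(p_xy[(x, y)] / (p_x[x] * p_y[y]))

# Expectation of the local terms equals the standard mutual information.
mi = sum(v * local_mi(x, y) for (x, y), v in p_xy.items())
```

Note that individual local terms here are negative for the "off-diagonal" outcomes while the expectation stays positive, which is exactly the distinction the pointwise framing makes visible.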
“…Indeed, another notable information-theoretic quantity that can be computed in univariate applications, denoted as sE, received a lot of attention for its ability to quantify regularity [8] and information storage [7,9].…”
Section: Estimate of RMSCE and RMSSE
confidence: 99%
“…The higher the CE, the higher the complexity of the series is. CE and sE can be seen as portions of the decomposition of the overall amount of information carried by a series given that their sum is the Shannon entropy (ShE) [7][8][9][10]. While the CE is more widely utilized as a measure of complexity of a series [1][2][3], sE is traditionally exploited to assess regularity and predictability of a process [8] or information stored in it [7,9].…”
Section: Introduction
confidence: 99%
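The decomposition cited above — conditional entropy (CE, complexity) and self-entropy (sE, regularity/information storage) summing to the Shannon entropy (ShE) — can be checked numerically for a simple stationary process. The sketch below truncates the past to one lag (a Markov-1 assumption made only for this example); the transition matrix is illustrative:

```python
import math

# Hedged sketch of ShE = CE + sE for a stationary two-state Markov chain:
#   H(X_n) = H(X_n | X_{n-1}) + I(X_n; X_{n-1}),
# where the conditional entropy H(X_n | X_{n-1}) is the "new" information
# produced per sample (complexity), and the mutual information with the
# past is the stored/predictable part (regularity).

T = [[0.9, 0.1],   # T[i][j] = P(X_n = j | X_{n-1} = i), illustrative values
     [0.2, 0.8]]

# Stationary distribution pi solves pi = pi T; closed form for 2 states.
pi = [T[1][0] / (T[0][1] + T[1][0]), T[0][1] / (T[0][1] + T[1][0])]

def h(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

she = h(pi)                                   # ShE: entropy of the present sample
ce = sum(pi[i] * h(T[i]) for i in range(2))   # CE: conditional entropy (complexity)
se = she - ce                                 # sE: self-entropy (regularity/storage)
```

With these (assumed) transition probabilities the chain is strongly persistent, so a sizeable fraction of ShE ends up in the sE term, matching the intuition that sE quantifies regularity and information stored in the process.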