Published: 2020
DOI: 10.3390/e22111231

Information Processing in the Brain as Optimal Entropy Transport: A Theoretical Approach

Abstract: We consider brain activity from an information theoretic perspective. We analyze the information processing in the brain, considering the optimality of Shannon entropy transport using the Monge–Kantorovich framework. It is proposed that some of these processes satisfy an optimal transport of informational entropy condition. This optimality condition allows us to derive an equation of the Monge–Ampère type for the information flow that accounts for the branching structure of neurons via the linearization of thi…
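For orientation, the Monge–Kantorovich optimal transport problem and the Monge–Ampère equation it yields for quadratic cost take the following textbook form. This is the general setting only, not necessarily the precise formulation the authors linearize, since the abstract is truncated at that point:

```latex
% Monge problem: push a source density f (measure \mu) to a target
% density g (measure \nu) at minimal transport cost
\min_{T \,:\, T_{\#}\mu = \nu} \int c\big(x, T(x)\big)\, d\mu(x)

% For quadratic cost c(x,y) = |x - y|^2, the optimal map is the gradient
% of a convex potential, T = \nabla\varphi, and mass conservation becomes
% the Monge--Amp\`ere equation:
\det\!\big(D^{2}\varphi(x)\big) = \frac{f(x)}{g\big(\nabla\varphi(x)\big)}
```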

Cited by 1 publication (1 citation statement)
References 39 publications
“…The transfer of information in the form of small electric currents is carried out by sequences of action potentials called spike trains. In light of Shannon theory, some studies suggest that temporal codes are a more energetically efficient way to carry the information contained in spike trains than rate codes [15,16,17]. Neuromorphic architectures based on SNNs benefit from their computation organization, achieving high energy efficiency by co-locating computing (neuron) and memory (synapse) elements, and from their information representation, consuming less power through event-driven spikes that encode information [18].…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
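The temporal-versus-rate efficiency claim can be made concrete with a toy calculation. The sketch below is a back-of-the-envelope illustration under simplified assumptions (a single readout window of discrete time bins, noiseless decoding); it is not the analysis of refs. [15,16,17], and the function names are invented for illustration:

```python
import math

# Toy model: a readout window of n_bins discrete time bins.
# Temporal code: a single spike's bin index carries the message,
# so one spike can distinguish n_bins messages.
# Rate code: only the spike count 0..max_count is decoded,
# so distinguishing max_count + 1 messages can cost up to max_count spikes.

def temporal_bits_per_spike(n_bins: int) -> float:
    """Bits conveyed per spike when spike timing among n_bins encodes the message."""
    return math.log2(n_bins)  # one spike distinguishes n_bins messages

def rate_bits_per_spike(max_count: int) -> float:
    """Worst-case bits per spike when only the spike count is decoded."""
    # log2(max_count + 1) bits, paid for with up to max_count spikes
    return math.log2(max_count + 1) / max_count

if __name__ == "__main__":
    print(f"temporal, 32 bins : {temporal_bits_per_spike(32):.2f} bits/spike")
    print(f"rate, counts 0..32: {rate_bits_per_spike(32):.2f} bits/spike")
```

In this toy setting the temporal code yields about 5 bits per spike while the rate code yields a fraction of a bit per spike, which is the qualitative direction of the energy-efficiency argument summarized in the citation statement.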