Entropy and the Brain: An Overview
2020 | DOI: 10.3390/e22090917

Abstract: Entropy is a powerful tool for quantifying brain function and its information processing capacity. This is evident in its broad range of applications, from functional interactivity between brain regions to quantification of the state of consciousness. A number of previous reviews have summarized the use of entropic measures in neuroscience. However, these studies either focused on the overall use of nonlinear analytical methodologies for quantifying brain activity or their content…

Cited by 94 publications (86 citation statements)
References 229 publications (318 reference statements)
“…Particularly, it is known that brain signal variability is indicative of brain functioning. This variability arises from the interplay between single neurons and their neuronal circuits, which allows the brain to self-organize in order to maximize its information capacity [31]. In turn, these findings explain the capacity of entropy to quantify the brain’s information processing [32, 33, 34], given the direct correspondence between variance and the amount of information.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
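The “direct correspondence between variance and the amount of information” invoked in this excerpt can be made concrete with a standard identity (added here as an illustration, not a statement from the cited papers): for a Gaussian signal with variance \sigma^2, the differential entropy is

H(X) = \tfrac{1}{2}\ln\!\left(2\pi e\,\sigma^{2}\right),

which grows monotonically with \sigma^2, so greater signal variability corresponds directly to higher entropy, i.e., more information.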
“…In turn, these findings explain the capacity of entropy to quantify the brain’s information processing [32, 33, 34], given the direct correspondence between variance and the amount of information. This approach has yielded promising results in the assessment of altered states of consciousness, brain aging, and quantification of the information processing of brain networks [31]. The complexity of cerebral signals can be evaluated using different entropy metrics.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
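The excerpt does not single out a particular metric, so the following is only a minimal NumPy sketch of sample entropy, one of the metrics commonly used to quantify signal complexity. The function name, the embedding dimension m=2, and the tolerance r = 0.2·std(x) are conventional choices assumed for the sketch, not values taken from the cited papers.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy of a 1-D signal.

    m : template length (embedding dimension)
    r : tolerance; defaults to 0.2 * std(x), a common heuristic
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    n = len(x)

    def count_matches(length):
        # All overlapping templates of the given length
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance between template i and all later templates
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = count_matches(m)      # matching pairs of length m
    a = count_matches(m + 1)  # matching pairs of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Usage on synthetic signals: a sine is highly regular, white noise is not,
# so the noise should yield the larger sample entropy.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 1000)
print(sample_entropy(np.sin(2 * np.pi * t)))        # low: regular signal
print(sample_entropy(rng.standard_normal(1000)))    # higher: irregular signal
```

Other metrics mentioned across this literature (approximate entropy, permutation entropy, multiscale variants) follow the same pattern: quantify how often short templates or patterns repeat within the signal.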
“…Although the methods described above for detecting driver states have made remarkable progress, there is still room for improvement in the accuracy and robustness of detection systems based on EEG signals. Entropy-based methods, which measure the nonlinear dynamics of how much new information appears in a time series, have been applied in many scientific fields, such as computer-assisted diagnosis of disease [20], understanding of cognitive ability [21], neuroimaging research [22], and quantification of brain function [23]. Inspired by this, we consider the EEG signal, especially in an actual driving environment, to be complex, unstable, and non-linear.…” (see the sketch below)
Section: Introduction (citation type: mentioning; confidence: 99%)
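As a companion illustration of an entropy measure that tracks how much new information a time series produces, below is a hedged sketch of permutation entropy over ordinal patterns. The function name and the defaults (order 3, delay 1, normalization by log(order!)) are assumptions for the sketch, not details taken from the cited driver-state work.

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Permutation entropy via ordinal (rank) patterns of short windows."""
    x = np.asarray(x, dtype=float)
    n_patterns = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n_patterns):
        window = x[i:i + order * delay:delay]   # length-`order` window
        pattern = tuple(np.argsort(window))     # its ordinal pattern
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = np.array(list(counts.values()), dtype=float) / n_patterns
    h = -np.sum(probs * np.log(probs))          # Shannon entropy of the patterns
    return h / log(factorial(order)) if normalize else h

# A regular signal reuses a few ordinal patterns; white noise uses nearly all of them.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 2000)
print(permutation_entropy(np.sin(2 * np.pi * t)))      # low value (regular)
print(permutation_entropy(rng.standard_normal(2000)))  # close to 1 (irregular)
```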
“…In subsequent work, we explore this question in more detail, but a plausible connection can be established with experimental and theoretical fMRI results in which different measures of cost have been proposed. The brain as an informational system is the subject of active research (see, for instance, [6, 7, 8]). For example, “behavior analysis has adopted these tools [information theory] as a novel means of measuring the interrelations between behavior, stimuli, and contingent outcomes” [1].…”
Section: Introduction (citation type: mentioning; confidence: 99%)