2022
DOI: 10.1142/9789811259401_0009
Entropy Analysis of Univariate Biomedical Signals: Review and Comparison of Methods

Abstract: Nonlinear techniques have found an increasing interest in the dynamical analysis of various kinds of systems. Among these techniques, entropy-based metrics have emerged as practical alternatives to classical techniques due to their wide applicability in different scenarios, especially to short and noisy processes. Issued from information theory, entropy approaches are of great interest to evaluate the degree of irregularity and complexity of physical, physiological, social, and econometric systems. Based on Sha…

Cited by 14 publications (26 citation statements); references 180 publications.
“…Using such notation, the static entropy quantifies the ‘static’ information contained in the current state of the process X, without considering its temporal dynamics, and can be defined as [45]: $H_X = \mathbb{E}[-\log p(X_n)]$, where $\mathbb{E}[\cdot]$ is the expectation operator and $p(\cdot)$ the probability density, while $H(\cdot)$ denotes the entropy. The dynamic entropy (DE) instead represents the “joint” entropy of the present and past variables composing the process; it therefore provides the amount of information carried by the current sample of the series together with its past samples, thus giving ‘dynamic’ information on the entire process, and can be defined as [46]: $H_{X_n, X_n^m} = \mathbb{E}[-\log p(X_n, X_n^m)]$, where $H(\cdot,\cdot)$ denotes the joint entropy of two random variables and $X_n^m = [X_{n-1}, \ldots, X_{n-m}]$ collects the past states. Then, the conditional entropy (CE) quantifies the average uncertainty that remains about the present state of the process when its past states are known (i.e., the new information contained in the current sample that cannot be inferred from the past history), and is defined as [45]: $H_{X_n \mid X_n^m} = H_{X_n, X_n^m} - H_{X_n^m}$, where $H(\cdot \mid \cdot)$ denotes the conditional entropy operator.…”
Section: Methods
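The three measures above can be illustrated with a minimal plug-in (histogram-based) sketch, not drawn from the paper: the static entropy $H(X_n)$, the joint entropy $H(X_n, X_{n-1})$ with past depth $m=1$, and the conditional entropy obtained as their difference with $H(X_{n-1})$. The bin count and the AR(1) test signal are illustrative assumptions.

```python
import numpy as np

def plugin_entropy(samples, bins=8):
    """Discrete plug-in entropy (nats) of 1-D or 2-D samples via histograms."""
    counts, _ = np.histogramdd(samples, bins=bins)
    p = counts.ravel() / counts.sum()
    p = p[p > 0]  # drop empty bins so log is defined
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
x = np.zeros(2000)
for n in range(1, len(x)):  # AR(1) test process: x_n = 0.8 x_{n-1} + noise
    x[n] = 0.8 * x[n - 1] + rng.standard_normal()

H_static = plugin_entropy(x[:, None])                        # H(X_n)
H_joint = plugin_entropy(np.column_stack([x[1:], x[:-1]]))   # H(X_n, X_{n-1})
H_past = plugin_entropy(x[:-1, None])                        # H(X_{n-1})
H_cond = H_joint - H_past                                    # H(X_n | X_{n-1})
print(H_static, H_cond)
```

For a predictable process the conditional entropy falls below the static entropy, since part of the information in the current sample is inferable from the past.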
“…AR model identification has been performed via the ordinary least squares method [47] to obtain estimates of the regression parameters and of the prediction error variance, thus estimating the variance and covariance matrices of the process. Then, denoting as $\hat{\sigma}_X^2$ the variance of the process, as $\hat{\Sigma}_{X_n X_n^m}$ the covariance matrix of the present and past states of X, and as $\hat{\sigma}_U^2$ the prediction error variance, the above-defined entropy measures can be computed as [46]: $H_X = \frac{1}{2}\log(2\pi e\,\hat{\sigma}_X^2)$, $H_{X_n, X_n^m} = \frac{1}{2}\log\!\left[(2\pi e)^{m+1}\det\hat{\Sigma}_{X_n X_n^m}\right]$, $H_{X_n \mid X_n^m} = \frac{1}{2}\log(2\pi e\,\hat{\sigma}_U^2)$.…”
Section: Information Domain Analysis
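A minimal sketch of this linear-Gaussian pipeline, assuming an AR(m) model identified by ordinary least squares on an illustrative test signal (model order and coefficients are assumptions, not from the paper): the estimated variances yield the entropy measures in closed form.

```python
import numpy as np

def ar_ols(x, m):
    """Fit AR(m) by ordinary least squares; return coefficients and
    prediction-error variance."""
    Y = x[m:]
    # Z[i, k-1] = x[n-k] for n = m+i, i.e. the m past states of each sample
    Z = np.column_stack([x[m - k:len(x) - k] for k in range(1, m + 1)])
    a, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    resid = Y - Z @ a
    return a, resid.var()

rng = np.random.default_rng(1)
x = np.zeros(5000)
for n in range(1, len(x)):  # AR(1) test process
    x[n] = 0.8 * x[n - 1] + rng.standard_normal()

m = 2                                   # assumed model order
a, s2_U = ar_ols(x, m)                  # prediction-error variance
s2_X = x.var()                          # process variance
# Covariance matrix of present and past states [X_n, X_{n-1}, ..., X_{n-m}]
emb = np.column_stack([x[m - k:len(x) - k] for k in range(0, m + 1)])
Sigma = np.cov(emb, rowvar=False)

H_X = 0.5 * np.log(2 * np.pi * np.e * s2_X)                        # static
H_joint = 0.5 * np.log((2 * np.pi * np.e) ** (m + 1) * np.linalg.det(Sigma))
H_cond = 0.5 * np.log(2 * np.pi * np.e * s2_U)                     # conditional
print(H_X, H_cond)
```

Under the Gaussian assumption no density estimation is needed: each entropy is a closed-form function of a variance or covariance determinant, which is what makes this estimator robust on short series.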
“…Information-theoretic analysis was performed to assess the static and dynamic information content of the analyzed series. Specifically, the entropy of a process $X$, representing the quantity of information held in its current state $X_n$, is defined under the hypothesis of stationarity as [21]: $H_X = \mathbb{E}[-\log p(X_n)]$.…”
Section: Preprocessing and Data Analysis
“…1 (c). The information storage (S) is the quantity of information held in the current state of the system that is attributable to its past states, measuring the regularity and predictability of the time series; it is defined as [21]: $S_X = H_{X_n} - H_{X_n \mid X_n^m}$, where $X_n$ is the present state and $X_n^m = [X_{n-1}, \ldots, X_{n-m}]$ the vector of past states of the analyzed process.…”
Section: Preprocessing and Data Analysis
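The information storage can be sketched under the same linear-Gaussian assumption, where it reduces to $S_X = \frac{1}{2}\log(\hat{\sigma}_X^2/\hat{\sigma}_U^2)$: a predictable series yields $S > 0$, while white noise yields $S \approx 0$. The AR(1) and white-noise test signals and the model order are illustrative assumptions.

```python
import numpy as np

def info_storage(x, m=2):
    """Linear-Gaussian information storage S = H(X_n) - H(X_n | X_n^m),
    estimated via AR(m) ordinary least squares identification."""
    Y = x[m:]
    Z = np.column_stack([x[m - k:len(x) - k] for k in range(1, m + 1)])
    a, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    s2_U = (Y - Z @ a).var()            # prediction-error variance
    return 0.5 * np.log(x.var() / s2_U)

rng = np.random.default_rng(2)
wn = rng.standard_normal(5000)          # unpredictable series: S ~ 0
ar = np.zeros(5000)
for n in range(1, len(ar)):             # strongly autocorrelated series
    ar[n] = 0.9 * ar[n - 1] + rng.standard_normal()
print(info_storage(wn), info_storage(ar))
```

The contrast between the two signals is the point of the measure: storage is large exactly when the past states reduce the uncertainty about the present one.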