2023
DOI: 10.3390/a16050255

Neural Network Entropy (NNetEn): Entropy-Based EEG Signal and Chaotic Time Series Classification, Python Package for NNetEn Calculation

Abstract: Entropy measures are effective features for time series classification problems. Traditional entropy measures, such as Shannon entropy, rely on a probability distribution function. However, for the effective separation of time series, new entropy estimation methods are required to characterize the chaotic dynamics of the system. Our concept of Neural Network Entropy (NNetEn) is based on the classification of special datasets in relation to the entropy of the time series recorded in the reservoir of the neural network…
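The title also advertises a Python package for NNetEn calculation. Below is a minimal usage sketch, not the authors' reference code: the import path, the NNetEn_entropy class, and the database, mu, epoch, method, and metric arguments are assumptions based on the package description in the paper and should be checked against the released README on GitHub.

import numpy as np
from NNetEn import NNetEn_entropy  # assumed import path and class name

# Chaotic test signal: logistic map in the chaotic regime (r = 4).
x, series = 0.1, []
for _ in range(1000):
    x = 4.0 * x * (1.0 - x)
    series.append(x)
series = np.array(series)

# 'D1' is assumed to select the built-in reference dataset; mu is assumed to be
# the fraction of that dataset used during training.
nnet_en = NNetEn_entropy(database='D1', mu=1)

# epoch, method, and metric are assumed knobs for training length, the
# reservoir-filling scheme, and the accuracy metric reported as entropy.
value = nnet_en.calculation(series, epoch=5, method=3, metric='Acc', log=False)
print('NNetEn =', value)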

Cited by 6 publications (9 citation statements)
References 46 publications (78 reference statements)
“…NNetEn is an artificial neural network LogNNet model developed by Velichko and Heidari (2021) to quantify the degree of complexity in a dynamical system. An implementation of the algorithms in Python is presented in Velichko et al. (2023), and the package for NNetEn calculation used in this study is publicly available on GitHub. The model is based on classification accuracy and computes entropy directly, without invoking the concept of a probability distribution.…”
Section: Data Acquisition and Methods of Analysis (mentioning)
confidence: 99%
“…The model calculates entropy directly without considering or approximating probability distributions. NNetEn has proven to be a valuable tool for studying the dynamical complexity of a system (Oludehinwa et al., 2022; Velichko, Belyaev, et al., 2022; Velichko, Wagner, et al., 2022; Velichko et al., 2023).…”
Section: Introduction (mentioning)
confidence: 99%
“…Among the best known are Shannon entropy [18], sample entropy (SampEn) [19], permutation entropy (PermEn) [20], fuzzy entropy (FuzzyEn) [58], cosine similarity entropy [59], phase entropy [60], and singular value decomposition entropy (SvdEn) [61]. The search for new chaos estimation algorithms includes, for example, Neural Network Entropy (NNetEn) [62,63,64], which is based on the classification of special datasets in relation to the entropy of the time series recorded in the reservoir of the neural network. NNetEn does not take probability distribution functions into account.…”
Section: Introduction (mentioning)
confidence: 99%
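For contrast with the distribution-free NNetEn approach described in this statement, here is a compact sketch of one of the distribution-based measures it lists, permutation entropy (PermEn). The implementation follows the standard Bandt–Pompe construction; the function name and defaults are ours, not taken from any of the cited packages.

import numpy as np
from math import factorial

def permutation_entropy(series, order=3, delay=1):
    # Count ordinal patterns of length "order" sampled with the given delay.
    x = np.asarray(series, dtype=float)
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        pattern = tuple(np.argsort(x[i:i + (order - 1) * delay + 1:delay]))
        counts[pattern] = counts.get(pattern, 0) + 1
    # Shannon entropy of the pattern distribution, normalized to [0, 1].
    p = np.array(list(counts.values()), dtype=float) / n
    return -np.sum(p * np.log(p)) / np.log(factorial(order))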
“…Classification of time series based on entropy analysis and machine learning (ML) is a trending task in the study of nonlinear signals, for example, EEG classification in diagnosing Alzheimer's disease [1,2] and Parkinson's disease [3–6]. There are many types of entropies, which in turn have several tunable parameters, for example, sample entropy (SampEn) [7], cosine similarity entropy (CoSiEn) [8], singular value decomposition entropy (SVDEn) [9], fuzzy entropy (FuzzyEn) [10], permutation entropy (PermEn) [11], etc.…”
Section: Introduction (mentioning)
confidence: 99%
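This statement describes the general workflow, entropy values computed per signal and fed to an ML classifier, without showing it. A hedged, self-contained sketch on synthetic data follows; the histogram-entropy feature and the logistic-map surrogate "epochs" are stand-ins chosen here, whereas real studies would use EEG epochs and the entropies cited above (SampEn, PermEn, NNetEn, etc.).

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def logistic_series(r, n=500):
    # Logistic-map segment: r = 3.5 is periodic, r = 4.0 is chaotic.
    x = rng.uniform(0.1, 0.9)
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = x
    return out

def histogram_entropy(sig, bins=32):
    # Shannon entropy of the amplitude histogram (a simple stand-in feature).
    counts, _ = np.histogram(sig, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def features(sig):
    return [histogram_entropy(sig), np.std(np.diff(sig))]

# Two classes of surrogate epochs: quasi-periodic vs. chaotic, plus noise.
X = [features(logistic_series(3.5) + 0.01 * rng.standard_normal(500)) for _ in range(50)]
X += [features(logistic_series(4.0) + 0.01 * rng.standard_normal(500)) for _ in range(50)]
y = [0] * 50 + [1] * 50

print(cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean())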
“…In this context, it is a problem of paramount importance to assess the effectiveness of the different entropies when used as features in ML classification. Recently, Velichko et al. proposed the use of the LogNNet neural network [12,13] for neural network entropy (NNetEn) calculation [1]. LogNNet is a feedforward neural network that uses filters based on the logistic function and a reservoir inspired by recurrent neural networks, thus enabling the transformation of a signal into a high-dimensional space.…”
Section: Introduction (mentioning)
confidence: 99%
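The architecture summarized here, logistic-function filters plus a reservoir that lifts the signal into a high-dimensional space, is what NNetEn exploits: the analyzed time series fills the reservoir weights, and the classification accuracy obtained on a reference dataset is reported as the entropy value. The following is a conceptual sketch of that idea only, not the authors' implementation; sklearn's digits dataset stands in for the MNIST-based reference data, and the filling and normalization choices are ours.

import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def accuracy_as_entropy(series, n_rows=25, seed=0):
    X, y = load_digits(return_X_y=True)       # 1797 samples, 64 features
    n_features = X.shape[1]
    # Fill the reservoir row-wise with the series, cycling it if it is short
    # (one of several possible filling schemes).
    w = np.resize(np.asarray(series, dtype=float), n_rows * n_features)
    w = (w - w.mean()) / (w.std() + 1e-12)     # keep the projection well-scaled
    W = w.reshape(n_rows, n_features)
    H = X @ W.T                                # reservoir projection of the inputs
    Xtr, Xte, ytr, yte = train_test_split(H, y, test_size=0.3, random_state=seed)
    clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
    return clf.score(Xte, yte)                 # accuracy plays the role of entropy

# A chaotic series carries more usable structure into the reservoir than a
# constant one, so it should score markedly higher.
x, chaotic = 0.1, []
for _ in range(2000):
    x = 4.0 * x * (1.0 - x)
    chaotic.append(x)
print(accuracy_as_entropy(chaotic), accuracy_as_entropy([0.5] * 2000))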