2021 7th International Conference on Signal Processing and Intelligent Systems (ICSPIS)
DOI: 10.1109/icspis54653.2021.9729371
A Joint-Entropy Approach To Time-series Classification

Cited by 3 publications (5 citation statements); References 29 publications.
“…We realize that the number of trainable parameters in deep learning models (number of filters, kernels, and depth of the network) are chosen based on a trial-and-error process. However, as suggested by Pourafzal and Fereidunian [17] and Safarihamid et al [18], regardless of observation length of chaotic time series, they can be classified using a few features of the complex system [17,18]. This gives us the intuition that there is a feature space in which the given chaotic time series can be sparsified.…”
Section: Motivation
confidence: 97%
“…We extract predictability with the Hurst exponent [17] with 20 lags, self-organization with Disequilibrium [17] with K = 10, and emergence using distribution entropy with 10 frequency bins [18]. In addition, complexity is the product of emergence and self-organization.…”
Section: Complex Systems Approach
confidence: 99%
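As a rough illustration of the feature set described in the statement above, the following is a minimal Python (NumPy-only) sketch. The concrete estimators (a lag-scaling Hurst estimate, a histogram-based disequilibrium, and a normalized distribution entropy) and all function names are assumptions for illustration, not the cited papers' exact implementations.

```python
import numpy as np


def hurst_exponent(x, max_lag=20):
    # Slope of log(std of lag-tau differences) vs. log(tau) over lags 2..max_lag;
    # for self-affine series this slope approximates the Hurst exponent H.
    lags = np.arange(2, max_lag + 1)
    tau = [np.std(x[lag:] - x[:-lag]) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(tau), 1)
    return float(slope)


def emergence(x, bins=10):
    # Shannon entropy of the value distribution over `bins` frequency bins,
    # normalized to [0, 1] by the maximum entropy log2(bins).
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)) / np.log2(bins))


def self_organization(x, k=10):
    # Disequilibrium: squared distance between the K-bin distribution and the
    # uniform distribution (0 for uniform data, larger for more ordered data).
    counts, _ = np.histogram(x, bins=k)
    p = counts / counts.sum()
    return float(np.sum((p - 1.0 / k) ** 2))


def complex_system_features(x):
    e = emergence(x, bins=10)
    s = self_organization(x, k=10)
    return {
        "predictability": hurst_exponent(x, max_lag=20),
        "emergence": e,
        "self_organization": s,
        "complexity": e * s,  # complexity taken as the product of E and S
    }


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    series = np.cumsum(rng.standard_normal(2000))  # toy random-walk series
    print(complex_system_features(series))
```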
“…With a discussion on the pros and cons of sample entropy and permutation entropy, Rostaghi [3] explored the application of discrete entropy in a time series. Safari [4] introduced different entropy indicators in association with emergence and self-organization for time series classification. Based on these associations, the authors proposed an alternative joint entropy in which each feature was represented using a specific entropy indicator.…”
Section: Introduction
confidence: 99%
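For context on the joint-entropy idea mentioned in the statement above, the sketch below computes a plain joint Shannon entropy of two discretized feature series via a 2D histogram. This is a generic illustration under assumed parameters (10 equal-width bins, base-2 logarithm); the cited paper's construction, in which each feature is represented by a specific entropy indicator, is not reproduced here.

```python
import numpy as np


def joint_entropy(x, y, bins=10):
    # Joint Shannon entropy (in bits) of two series after discretizing each
    # into `bins` equal-width bins via a 2D histogram.
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    a = rng.standard_normal(1000)
    b = 0.5 * a + 0.5 * rng.standard_normal(1000)  # partially dependent series
    print(joint_entropy(a, b, bins=10))
```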