2020
DOI: 10.3390/e22060604

Entropy Application for Forecasting

Abstract: The information theory developed by Shannon [...]

Cited by 4 publications (1 citation statement)
References 12 publications (17 reference statements)
“…A higher entropy suggests more complexity and variability, while lower entropy indicates a more predictable and less complex dataset [29]. The formula of entropy without specific category data involves calculating the entropy of a probability distribution, where each probability represents the likelihood of an event occurring [30]. The generic formula for entropy (H) in this context is:…”
Section: Time Interval Analysis
confidence: 99%
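The quoted excerpt breaks off before the formula itself. For reference, the generic Shannon entropy of a discrete probability distribution, which is what the statement describes, is conventionally written in the standard textbook form below; this is not a reproduction of the citing paper's exact notation:

H(X) = -\sum_{i=1}^{n} p_i \log p_i

where p_i is the probability of the i-th event, the sum runs over all n possible events, and the logarithm base (commonly 2 or e) sets the unit of entropy.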