2023
DOI: 10.3390/math11020293
On the Use of Variability Measures to Analyze Source Coding Data Based on the Shannon Entropy

Abstract: Source coding maps elements from an information source to a sequence of alphabetic symbols. Then, the source symbols can be recovered exactly from the binary units. In this paper, we derive an approach that includes information variation in the source coding. The approach is more realistic than its standard version. We employ the Shannon entropy for coding the sequences of a source. Our approach is also helpful for short sequences when the central limit theorem does not apply. We rely on a quantifier of the in…
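The abstract describes coding source sequences via the Shannon entropy, which lower-bounds the expected codeword length in bits per symbol. As a minimal sketch (the four-symbol source distribution below is illustrative, not taken from the paper):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H(X) = -sum_i p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical dyadic source; an optimal code meets the entropy bound exactly.
source = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(source))  # 1.75 bits per symbol
```

For a dyadic distribution such as this one, a Huffman code with lengths 1, 2, 3, 3 achieves exactly 1.75 expected bits per symbol.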

Cited by 1 publication (1 citation statement)
References 39 publications (59 reference statements)
“…• Information gain (InfoGain): This criterion employs the gain of each explanatory variable using the Shannon entropy [33,34] to select the most significant variables with respect to the response variable [35].…”
Section: Performance Measures
Confidence: 99%
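The citing work uses information gain, i.e. the reduction in Shannon entropy of the response variable after conditioning on an explanatory variable, InfoGain(Y; X) = H(Y) − H(Y|X). A minimal sketch with toy, illustrative data (not from the cited paper):

```python
import math
from collections import Counter

def entropy(labels):
    """Empirical Shannon entropy (bits) of a list of labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(xs, ys):
    """InfoGain(Y; X) = H(Y) - H(Y | X), averaging over the values of X."""
    n = len(ys)
    h_cond = 0.0
    for v in set(xs):
        subset = [y for x, y in zip(xs, ys) if x == v]
        h_cond += len(subset) / n * entropy(subset)
    return entropy(ys) - h_cond

# Toy example: X perfectly predicts Y, so the gain equals H(Y) = 1 bit.
xs = ["a", "a", "b", "b"]
ys = [0, 0, 1, 1]
print(info_gain(xs, ys))  # 1.0
```

Variables are then ranked by this gain, and those most informative about the response are selected.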