2022
DOI: 10.3390/math10203847

Using Probabilistic Models for Data Compression

Abstract: Our research objective is to improve Huffman coding efficiency by adjusting the data using a Poisson distribution, which also avoids undefined entropies. The scientific value added by our paper consists in minimizing the average length of the code words, which is greater in the absence of the Poisson adjustment. Huffman coding is an error-free compression method designed to remove coding redundancy by yielding the smallest number of code symbols per source symbol, which in p…
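The abstract's claim can be illustrated with a minimal sketch: build a Huffman code over an alphabet whose symbol probabilities follow a (truncated, renormalized) Poisson distribution, then compare the average code length against the source entropy. The truncation to 10 symbols and the rate lam=3.0 are illustrative assumptions, not parameters from the paper.

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Return Huffman code lengths (in bits) for a list of symbol probabilities."""
    counter = itertools.count()  # tie-breaker so heap tuples never compare lists
    heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        # Merge the two least probable subtrees; every symbol inside
        # a merged subtree gains one bit of code length.
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, next(counter), s1 + s2))
    return lengths

# Truncated Poisson(lam = 3.0) alphabet of 10 symbols, renormalized to sum to 1
lam, n = 3.0, 10
pmf = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(n)]
total = sum(pmf)
probs = [p / total for p in pmf]

lengths = huffman_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, lengths))
entropy = -sum(p * math.log2(p) for p in probs)
print(f"entropy = {entropy:.4f} bits, average code length = {avg_len:.4f} bits")
```

The Shannon bound guarantees entropy ≤ average length < entropy + 1 for any Huffman code; the paper's point is that fitting the data to a Poisson model tightens the gap between the two.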

Cited by 4 publications (6 citation statements) · References 25 publications
“…Our work may be of particular interest to specialists in the field of cryptography in solving digital security problems [40]. Coding methods and their application in information security systems, cryptocompression systems, and data compression are relevant topics to be considered [10,41]. For example, the application of arithmetic coding methods in cryptographic information protection systems is considered in [42].…”
Section: Discussion (confidence: 99%)
“…We are interested in identifying how pure each species is, and we assume that the source includes information variation when collecting the data. Because we do not know the true probability distribution of the source, we need to employ the plugin estimators defined in (9) and (10). We intuitively know that we have 100% knowledge of the Iris species because we have only a single species selected.…”
Section: Empirical Applications (confidence: 99%)
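The plugin estimators referenced in the quote are defined in the citing paper's equations (9) and (10), which are not reproduced here; the sketch below assumes the standard plug-in (maximum-likelihood) Shannon entropy estimator, which replaces the unknown source probabilities with empirical frequencies. The single-species case mentioned in the quote then gives entropy zero, matching the "100% knowledge" intuition.

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) Shannon entropy estimate in bits:
    substitute empirical class frequencies for the unknown probabilities."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A sample with only one Iris species: no uncertainty, entropy 0
pure = ["setosa"] * 50
print(plugin_entropy(pure))

# An equal three-species mix: entropy reaches log2(3) ≈ 1.585 bits
mixed = ["setosa"] * 50 + ["versicolor"] * 50 + ["virginica"] * 50
print(plugin_entropy(mixed))
```

Note the plug-in estimator is biased downward for small samples; the cited paper's estimators may include a bias correction not shown here.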
“…Developments may include solving the analogue of Equations (15), (18), and (20) for other remarkable families of entropies (Rényi, Sharma-Taneja-Mittal, Naudts, etc.). New examples are needed, in addition to the PDF solutions of normal type ([71][72][73][74]).…”
Section: Discussion (confidence: 99%)
“…Recent research results in statistics prove the increased interest in using different entropy measures. Many authors have dealt with this matter, among them Koukoumis and Karagrigoriou [1], Iatan et al. [2], Li et al. [3], Miśkiewicz [4], Toma et al.…”
Section: Introduction (confidence: 99%)