2020
DOI: 10.1016/j.chaos.2020.110360
New entropy bounds via uniformly convex functions

Cited by 13 publications (4 citation statements)
References 6 publications
“…The most widely used entropy is the Shannon entropy. There are many studies on the characterization and application of the Shannon entropy (see, e.g., [1, 2]). We study a way of measuring the “disorder” of the divisors of a natural number.…”
Section: Discussion
Confidence: 99%
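The "disorder of divisors" idea above can be made concrete with a small sketch. One natural construction (an assumption here, not necessarily the exact definition used by the citing paper) weights each divisor d of n by p = d / σ(n), where σ(n) is the sum of divisors, and takes the Shannon entropy of that distribution:

```python
from math import log

def divisors(n):
    """All positive divisors of n, in increasing order."""
    return [d for d in range(1, n + 1) if n % d == 0]

def divisor_entropy(n):
    """Shannon entropy (natural log) of p_i = d_i / sigma(n), where the
    d_i are the divisors of n and sigma(n) is their sum. The weighting
    is an illustrative assumption, not a claim about the cited paper."""
    ds = divisors(n)
    sigma = sum(ds)
    return -sum((d / sigma) * log(d / sigma) for d in ds)

print(divisor_entropy(12))  # entropy over the divisors {1, 2, 3, 4, 6, 12}
```

For n = 1 the distribution is a single point mass and the entropy is 0; numbers with many divisors of comparable size come closer to the maximum log τ(n), where τ(n) is the number of divisors.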
“…In [1], Sayyari gave an extension of Jensen’s discrete inequality to the class of uniformly convex functions, obtaining lower and upper bounds for Jensen’s inequality. He applied these results in information theory and obtained new, strong bounds for the Shannon entropy of a probability distribution.…”
Section: Introduction and Preliminaries
Confidence: 99%
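Sayyari's sharpened bounds via uniform convexity are not reproduced here, but the classical Jensen-type consequence that the statement builds on is easy to check numerically: since log is concave, H(p) = Σ pᵢ log(1/pᵢ) ≤ log(Σ pᵢ · 1/pᵢ) = log n for any distribution p on n outcomes:

```python
from math import log

def shannon_entropy(p):
    """H(p) = -sum p_i log p_i (natural log), with 0 * log 0 := 0."""
    return -sum(pi * log(pi) for pi in p if pi > 0)

# Jensen's inequality for the concave function log:
#   H(p) = sum_i p_i * log(1 / p_i) <= log(sum_i p_i * (1 / p_i)) = log n,
# with equality exactly for the uniform distribution.
p = [0.5, 0.25, 0.15, 0.10]
assert shannon_entropy(p) <= log(len(p)) + 1e-12   # Jensen upper bound
assert abs(shannon_entropy([0.25] * 4) - log(4)) < 1e-12  # equality case
```

Uniform convexity refines this by adding a modulus term to both sides of Jensen's inequality, which is what yields the strictly stronger entropy bounds the citing paper refers to.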
“…We start this section by discussing two inequalities: the first is proved using the basic properties of uniformly convex functions, as in [9], [8, Theorem 1] and [2]; the other, concerning uniform convexity, is derived from results in [7]. In [8], it is proved that continuously differentiable Φ-uniformly convex functions f satisfy the inequality…”
Section: Improved Jensen Functional Through Φ-Uniform Convexity
Confidence: 99%
“…The motivation of this study was to extend the concept of logical entropy presented in [13] to information sources. Since estimating the entropy of an information source can be difficult [20], we defined the logical metric permutation entropy of a map and applied it to an information source.…”
Section: Introduction and Basic Notions
Confidence: 99%
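The logical metric permutation entropy defined by the citing paper is a logical-entropy-based variant; for orientation, the classical (Bandt–Pompe) permutation entropy it builds on can be sketched as follows — slide a window over the series, record each window's ordinal pattern, and take the Shannon entropy of the pattern frequencies:

```python
from collections import Counter
from math import log

def permutation_entropy(series, order=3):
    """Classical (Bandt-Pompe) permutation entropy of a time series.
    Each length-`order` window is mapped to its ordinal pattern (the
    ranking of its values); the result is the Shannon entropy of the
    empirical pattern distribution. Note: this is the standard
    construction, not the paper's logical metric variant."""
    patterns = Counter(
        tuple(sorted(range(order), key=lambda i: series[t + i]))
        for t in range(len(series) - order + 1)
    )
    total = sum(patterns.values())
    return -sum((c / total) * log(c / total) for c in patterns.values())

x = [4, 7, 9, 10, 6, 11, 3]
print(permutation_entropy(x, order=3))
```

A strictly monotone series produces a single ordinal pattern and hence zero permutation entropy, while irregular series spread mass over many patterns, up to the maximum log(order!).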