2016
DOI: 10.1080/00031305.2015.1089788
Average Entropy: A New Uncertainty Measure with Application to Image Segmentation

Cited by 23 publications (10 citation statements) · References 9 publications
“…On the other hand, some authors used consistent measures of information, such as the Kullback-Leibler divergence, as a measure of discrimination between two lifetime models [9, 10, 11], while others tried to modify or change the primary definition of entropy. Among these, we mention the cumulative residual entropy [12], the average entropy [13], and in particular the sup-entropy [14]. The sup-entropy has recently been used as a quantifier of the efficiency of a censored sample for several survival distributions, such as the exponential [15, 16], Pareto [17], and Weibull [18, 19] distributions.…”
Section: Introduction
confidence: 99%
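The excerpt above mentions the Kullback-Leibler divergence as a way to discriminate between two lifetime models. As a minimal illustrative sketch (not code from any of the cited works, and with rate parameters chosen purely for demonstration), the following compares two exponential lifetime distributions using both the closed-form KL expression and a direct numerical integration:

```python
import numpy as np

def kl_exponential(lam_p, lam_q):
    """Closed-form KL(Exp(lam_p) || Exp(lam_q)) = log(lam_p/lam_q) + lam_q/lam_p - 1."""
    return np.log(lam_p / lam_q) + lam_q / lam_p - 1.0

def kl_numerical(lam_p, lam_q, upper=50.0, n=200_000):
    """Same divergence computed by trapezoidal integration of p(x) * log(p(x)/q(x))."""
    x = np.linspace(1e-9, upper, n)
    p = lam_p * np.exp(-lam_p * x)  # density of Exp(lam_p)
    q = lam_q * np.exp(-lam_q * x)  # density of Exp(lam_q)
    f = p * np.log(p / q)
    # manual trapezoid rule, version-safe across NumPy releases
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))
```

The divergence is zero only when the two rates coincide, which is what makes it usable as a discrimination measure between competing lifetime models.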
“…Consequently, Shannon entropy became important in quantifying randomness present in several fields such as financial analysis (Sharpe, 1985), data compression (Salomon, 2008), statistics (Kullback, 1959), and information theory and data transmission (Cover & Thomas, 1991). Recently, modified versions of this measure have been utilized in lifetime studies (Kittaneh & Akbar, 2016) and image processing (Kittaneh, Khan, Akbar, & Bayoud, 2016). Jaynes (1957) was the first to introduce the principle of maximum entropy and the induced maximum entropy distribution.…”
Section: The Maximum Entropy Distribution
confidence: 99%
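As a small, self-contained illustration of the Shannon entropy discussed in the excerpt above (again not taken from the cited papers), the snippet below computes the entropy of a discrete distribution in bits. With no constraint beyond normalization, the uniform distribution attains the maximum, which is the intuition behind Jaynes' maximum entropy principle:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution p; zero-probability bins are skipped."""
    p = np.asarray(p, dtype=float)
    if not np.isclose(p.sum(), 1.0):
        raise ValueError("probabilities must sum to 1")
    p = p[p > 0]  # 0 * log 0 is taken as 0 by convention
    return float(-np.sum(p * np.log2(p)))
```

For example, a uniform distribution over four outcomes yields log2(4) = 2 bits, while any skewed distribution on the same support yields strictly less.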
“…An alignment method for inverse synthetic aperture radar image formation, a hyperspectral image segmentation using Rényi entropy, and a fuzzy image texture analysis and classification approach are, respectively, the topics of papers [80], [64], and [58]. Importantly, a new uncertainty measure for image segmentation called average entropy is defined by the authors of paper [43].…”
Section: A Review on Entropy and Its Applications
confidence: 99%