Normalized Measures of Entropy (1986)
DOI: 10.1080/03081078608934927

Cited by 51 publications (25 citation statements) · References 2 publications
“…Shannon entropy may be normalized on the interval [0, 1] by dividing by the maximum possible entropy (Kumar, Kumar, & Kapur, 1986):…”
Section: Diversity Measures For Empirical Data
confidence: 99%
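The excerpt describes dividing Shannon entropy by its maximum possible value, which for n outcomes is log n, attained by the uniform distribution. A minimal sketch in Python, assuming natural logarithms and a discrete distribution; the function name and NumPy implementation are illustrative, not from the paper:

```python
import numpy as np

def normalized_shannon_entropy(p):
    """Shannon entropy divided by its maximum log(n), so the result lies in [0, 1].

    A minimal sketch of the normalization described in the excerpt;
    illustrative only, not the paper's own code.
    """
    p = np.asarray(p, dtype=float)
    n = p.size                      # number of possible outcomes
    if n < 2:
        return 0.0                  # a single outcome carries no uncertainty
    nz = p[p > 0]                   # convention: 0 * log 0 = 0
    h = -np.sum(nz * np.log(nz))    # Shannon entropy in nats
    return h / np.log(n)            # H_max = log n, the uniform-distribution entropy

```

For a uniform distribution the measure is 1 (e.g. `normalized_shannon_entropy([0.25] * 4)` returns 1.0), while a degenerate distribution gives 0, which is what makes normalized values comparable across distributions with different numbers of categories.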
“…It is the weighted linear average of normalized expected utility and entropy. The definition of risk is based on Yang and Qiu [12], Golan et al. [15], Kumar et al. [16] and Soofi [17]. It rests on the premise that the decision maker prefers less uncertainty and a larger expected utility.…”
Section: Normalized Expected Utility-Entropy Measure Of Risk
confidence: 99%
“…This leads to standardized measures which can be compared with one another [16]. An analogous measure, 1 − NH(a), called the information index, serves to measure the reduction in uncertainty [17].…”
Section: Normalized Expected Utility-Entropy Measure Of Risk
confidence: 99%
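The excerpt specifies only that the risk measure is a weighted linear average of normalized expected utility and entropy, rising with uncertainty and falling with expected utility. The sketch below, a convex combination with trade-off weight `lam` and min-max rescaling of the utilities, is an assumption made for illustration; it is not the published definition from Yang and Qiu [12].

```python
import numpy as np

def eu_entropy_risk(probs, utilities, lam=0.5):
    """Hypothetical normalized expected utility-entropy risk measure.

    Risk grows with normalized entropy NH and shrinks with normalized
    expected utility NEU, matching the excerpt's premise that the decision
    maker prefers less uncertainty and a larger expected utility. The
    specific convex combination used here is an illustrative assumption.
    """
    p = np.asarray(probs, dtype=float)
    u = np.asarray(utilities, dtype=float)
    nz = p[p > 0]
    nh = -np.sum(nz * np.log(nz)) / np.log(p.size)        # normalized entropy in [0, 1]
    # Rescale expected utility to [0, 1]; assumes the utilities are not all equal.
    neu = (np.dot(p, u) - u.min()) / (u.max() - u.min())
    return lam * nh + (1.0 - lam) * (1.0 - neu)
```

The information index from the excerpt, 1 − NH(a), is then simply one minus the normalized-entropy term: the fraction of maximum uncertainty that the distribution removes.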
“…Entropy has traditionally been used as a measure of disorder and complexity in random signals [38,39] and has found application in fields as diverse as medicine, engineering and geophysics, to name but a few. Entropy has been used, for instance, to quantify nonlinearities and complexity in electroencephalogram (EEG) signals [40] and in event-related potentials in neuroelectrical signals [41,42], to identify structural damage [43], and to characterize complexity in random processes [44–46].…”
Section: Wavelet-Tsallis q-Entropy
confidence: 99%
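Since this excerpt's section concerns wavelet-Tsallis q-entropy, a short sketch of the Tsallis entropy itself may help: S_q = (1 − Σ p_i^q) / (q − 1), which recovers Shannon entropy in the limit q → 1. Feeding it the relative wavelet energy per scale, as the second helper assumes, is one common way such signal-complexity measures are built; it is an illustration, not the cited papers' exact procedure.

```python
import numpy as np

def tsallis_entropy(p, q=2.0):
    """Tsallis q-entropy S_q = (1 - sum(p_i**q)) / (q - 1).

    Recovers Shannon entropy as q -> 1. Illustrative sketch; not the
    exact wavelet-Tsallis formulation of the cited work.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                             # convention: 0 * log 0 = 0
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))        # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def wavelet_energy_distribution(coeffs_per_scale):
    """Relative wavelet energy per scale: a hypothetical helper giving a
    probability-like input for a wavelet-Tsallis complexity measure."""
    energies = np.array([np.sum(np.square(c)) for c in coeffs_per_scale])
    return energies / energies.sum()
```

Given detail coefficients from any wavelet decomposition, `tsallis_entropy(wavelet_energy_distribution(coeffs), q)` then yields a single complexity score for the signal, low when energy concentrates at one scale and high when it spreads across scales.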