2007
DOI: 10.1007/s00362-006-0326-7

Entropy properties of record statistics

Keywords: Record values, Shannon information, Hazard rate function, Fisher information, Mutual information

Cited by 60 publications (31 citation statements); references 18 publications. Citing publications span 2009–2024.
“…Baratpour et al. [2] derived the entropy of a continuous probability distribution in terms of upper record values and obtained several bounds for this entropy using the hazard rate function.…”
Section: Introduction (mentioning)
confidence: 99%
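A hedged sketch for context (a standard derivation from the record-value density under the usual continuity assumptions; it is not text quoted from the cited paper): for the n-th upper record value $R_n$ from a continuous cdf $F$ with density $f$, survival function $\bar{F} = 1 - F$, cumulative hazard $\Lambda = -\log \bar{F}$, and hazard rate $h = f/\bar{F}$,

$$ f_{R_n}(x) = \frac{[\Lambda(x)]^{n-1}}{\Gamma(n)}\, f(x). $$

Because $\Lambda(R_n) \sim \mathrm{Gamma}(n,1)$, so that $E[\log \Lambda(R_n)] = \psi(n)$ and $E[\Lambda(R_n)] = n$, the Shannon entropy can be written in terms of the hazard rate as

$$ H(R_n) = -E[\log f_{R_n}(R_n)] = \log \Gamma(n) - (n-1)\,\psi(n) + n - E[\log h(R_n)]. $$

Bounds of the kind the citing papers mention then follow by bounding $E[\log h(R_n)]$, for example when the hazard rate is monotone. As a check, $n = 1$ with an exponential parent of rate $\lambda$ gives $1 - \log \lambda$, the entropy of that exponential distribution.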
“…Since the system's lifetime is given by max{U_{1,1}, ..., U_{m,m}}, it is appropriate to know each U_{i,i} in order to obtain the lifetime of the entire system. Information measures of record values have been investigated by several authors, including Zahedi and Shakil (2006), Baratpour et al. (2007), and Madadi and Tata (2011). Recently, Jafari Jozani and Ahmadi (2014) studied uncertainty and information properties of RSS.…”
Section: Shannon Entropy of RRSS (mentioning)
confidence: 99%
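To make the max{U_{1,1}, ..., U_{m,m}} remark concrete, here is a minimal numerical sketch under an illustrative assumption: a standard exponential parent, in which case U_{i,i} (the i-th upper record of the i-th independent sequence, counting the first observation as record 1) is Gamma(i, 1). The density of the maximum of independent variables is f_T(x) = sum_i f_i(x) * prod_{j != i} F_j(x). The helper names max_pdf and shannon_entropy_of_max are ours, not from the cited papers.

import numpy as np
from scipy.stats import gamma
from scipy.integrate import quad

def max_pdf(x, m):
    # Density of T = max{U_{1,1}, ..., U_{m,m}} for independent U_{i,i} ~ Gamma(i, 1)
    # (the i-th upper record of an Exp(1) sequence): f_T = sum_i f_i * prod_{j != i} F_j.
    pdfs = np.array([gamma.pdf(x, a=i) for i in range(1, m + 1)])
    cdfs = np.array([gamma.cdf(x, a=i) for i in range(1, m + 1)])
    return sum(pdfs[i] * np.prod(np.delete(cdfs, i)) for i in range(m))

def neg_plogp(x, m):
    # Entropy integrand, using the 0 * log(0) = 0 convention.
    p = max_pdf(x, m)
    return -p * np.log(p) if p > 0 else 0.0

def shannon_entropy_of_max(m):
    # Numerically integrate -f_T log f_T over (0, infinity).
    value, _ = quad(neg_plogp, 0, np.inf, args=(m,), limit=200)
    return value

if __name__ == "__main__":
    for m in (2, 3, 4):
        print(f"m = {m}: H(max) ≈ {shannon_entropy_of_max(m):.4f}")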
“…Many authors have worked on entropy estimation for different lifetime distributions. Baratpour et al. [4] developed the entropy of upper record values and provided several upper and lower bounds for this entropy using the hazard rate function. Cramer and Bagh [5] discussed the entropy of the Weibull distribution under progressive censoring.…”
Section: Introduction (mentioning)
confidence: 99%
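As a sanity check on the hazard-rate representation sketched above, the snippet below compares the closed form with direct numerical integration for an exponential parent, for which the hazard rate is constant and the n-th upper record is Gamma(n, scale = 1/rate). The choice of parent and the helper names entropy_closed_form and entropy_numerical are illustrative assumptions, not part of the cited work.

import numpy as np
from scipy.stats import gamma
from scipy.special import gammaln, digamma
from scipy.integrate import quad

def entropy_closed_form(n, rate):
    # H(R_n) = log Gamma(n) - (n - 1) psi(n) + n - E[log h(R_n)];
    # for an Exp(rate) parent the hazard rate is constant, so E[log h] = log(rate).
    return gammaln(n) - (n - 1) * digamma(n) + n - np.log(rate)

def entropy_numerical(n, rate):
    # Direct integration of -f log f for R_n ~ Gamma(n, scale = 1/rate).
    dist = gamma(a=n, scale=1.0 / rate)
    def neg_plogp(x):
        p = dist.pdf(x)
        return -p * np.log(p) if p > 0 else 0.0
    value, _ = quad(neg_plogp, 0, np.inf, limit=200)
    return value

if __name__ == "__main__":
    # The two columns should agree; e.g. n = 1, rate = 2 gives 1 - log 2 ≈ 0.3069.
    for n in (1, 2, 5):
        print(n, round(entropy_closed_form(n, rate=2.0), 6),
              round(entropy_numerical(n, rate=2.0), 6))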