2005
DOI: 10.1017/s0021900200000681

Minimum dynamic discrimination information models

Abstract: In this paper, we introduce the minimum dynamic discrimination information (MDDI) approach to probability modeling. The MDDI model relative to a given distribution G is that which has least Kullback-Leibler information discrepancy relative to G, among all distributions satisfying some information constraints given in terms of residual moment inequalities, residual moment growth inequalities, or hazard rate growth inequalities. Our results lead to MDDI characterizations of many well-known lifetime models and to…
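The minimization the abstract describes — least Kullback-Leibler discrepancy relative to G over a constrained family — can be sketched numerically. In the toy example below (the reference distribution, candidate family, constraint, and all parameter values are illustrative choices, not taken from the paper), G is a standard exponential, the candidates are exponential with rate λ, and the constraint is a simple moment inequality E[X] ≥ 1.5, i.e. λ ≤ 2/3:

```python
import math

# Toy MDDI-style selection (illustrative only, not the paper's construction):
# reference G = Exp(1); candidates F_lam = Exp(lam); constraint E[X] >= 1.5,
# i.e. lam <= 2/3.  KL(Exp(lam) : Exp(1)) has a simple closed form.

def kl_exp_vs_std_exp(lam: float) -> float:
    """K(F_lam : G) for F_lam = Exp(lam), G = Exp(1)."""
    return math.log(lam) + 1.0 / lam - 1.0

# Grid over the constrained rates lam in [0.1, 2/3].
grid = [0.1 + i * (2.0 / 3.0 - 0.1) / 100 for i in range(101)]
best = min(grid, key=kl_exp_vs_std_exp)

# The minimizer lands on the constraint boundary lam = 2/3: the unconstrained
# minimum (lam = 1, where the KL discrepancy is 0) is infeasible, so the
# selected model is the feasible distribution closest to G.
print(f"minimizing rate: {best:.4f}, KL: {kl_exp_vs_std_exp(best):.4f}")
```

The minimizer sits on the constraint boundary, which is the usual situation when an inequality constraint excludes the reference distribution itself.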

Cited by 9 publications (12 citation statements); references 12 publications.
“…The multivariate dynamic information measures developed in [6] set the stage for extending the univariate MDE and MDDI procedures proposed in [1,3] to higher dimensions. This article took the first step in this direction and provided results for developing multivariate models based on partial information formulated in terms of inequality constraints for the hazard gradient.…”
Section: Discussion
confidence: 99%
“…Asadi et al. [1,3] considered the single-distribution family {F} for a univariate F, subject to inequality or differential-inequality constraints on the hazard rate, or to inequality constraints on the mean residual lifetime. They characterized numerous univariate distributions as MDDI and MDE models.…”
Section: The minimum mutual information (MMI) model
confidence: 99%
“…Asadi et al. [15] and [16] proposed the maximum dynamic entropy (MDE) and minimum dynamic discrimination information (MDDI) procedures, respectively, for developing lifetime models. These procedures extend the maximum entropy and minimum discrimination information principles of inference to cases where the available information is given in terms of hazard rate growth inequalities, residual moment inequalities, or residual moment growth inequalities.…”
Section: Other recent developments
confidence: 99%
“…For more information about other applications of Shannon entropy, see Asadi et al., Cover and Thomas, and Ebrahimi et al., among others. Dynamic and bivariate versions can be found in Asadi et al., Chamany and Baratpour, Jomhoori and Yousefzadeh, Navarro et al., and the references therein. Another useful measure of the distance between two density functions f and g is the Kullback-Leibler (KL) distance, defined by $K(f:g)=\int_0^\infty f(x)\log\frac{f(x)}{g(x)}\,dx=-H(f)+H(f,g)$, where $H(f,g)=-E_f[\log g(X)]$ is known as “Fraser information” (Kent) and is also known as the “inaccuracy measure” (Kerridge).…”
Section: Introduction
confidence: 99%
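The KL decomposition quoted above, K(f:g) = −H(f) + H(f,g), can be checked numerically. The sketch below (the density choices and integration settings are mine, not from the cited paper) evaluates the integral, the Shannon entropy H(f), and the cross term H(f,g) for two exponential densities, and confirms the identity against the exponential closed form:

```python
import math

def trapezoid(fn, a, b, n=100_000):
    """Composite trapezoid rule for the integral of fn over [a, b]."""
    h = (b - a) / n
    s = 0.5 * (fn(a) + fn(b))
    s += sum(fn(a + i * h) for i in range(1, n))
    return s * h

lam_f, lam_g = 2.0, 1.0  # f = Exp(2), g = Exp(1); illustrative choices
f = lambda x: lam_f * math.exp(-lam_f * x)
g = lambda x: lam_g * math.exp(-lam_g * x)

upper = 60.0  # the exponential tails beyond this point are negligible

# K(f:g) = integral of f(x) log(f(x)/g(x)) over [0, upper]
kl = trapezoid(lambda x: f(x) * math.log(f(x) / g(x)), 0.0, upper)
# Shannon entropy H(f) and Fraser information / inaccuracy H(f, g)
h_f = -trapezoid(lambda x: f(x) * math.log(f(x)), 0.0, upper)
h_fg = -trapezoid(lambda x: f(x) * math.log(g(x)), 0.0, upper)

# Closed form for KL(Exp(lam_f) : Exp(lam_g))
closed_form = math.log(lam_f / lam_g) + lam_g / lam_f - 1.0

print(f"KL = {kl:.6f}, -H(f) + H(f,g) = {-h_f + h_fg:.6f}, "
      f"closed form = {closed_form:.6f}")
```

All three quantities agree to within the quadrature error, illustrating that the decomposition quoted in the citation statement is just the identity log(f/g) = log f − log g integrated against f.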