1993
DOI: 10.1029/93wr00923
Minimum relative entropy: Forward probabilistic modeling

Abstract: The pioneering work of Jaynes in Bayesian/maximum entropy methods has been successfully explored in many disciplines. The principle of maximum entropy (PME) is a powerful and versatile tool for inferring a probability distribution from constraints that do not completely characterize that distribution. Minimum relative entropy (MRE) is a method which has all the important attributes of the maximum entropy approach, with the advantage that prior information may be easily included. In this paper we use MRE to deter…
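The MRE/PME principle described in the abstract can be sketched numerically. The following is a minimal illustration (my own, not code from the paper; the die support, uniform prior, and target mean are assumed for the example): minimizing relative entropy to a prior q subject to a mean constraint E_p[x] = m yields an exponential-family solution p_i ∝ q_i exp(λ x_i), where λ is tuned so the constraint holds.

```python
import numpy as np

# Assumed example: six-sided die, uniform prior, constrained mean of 4.5
# (a classic Jaynes-style setup; not taken from the paper itself).
x = np.arange(1, 7)            # support
q = np.full(6, 1.0 / 6.0)      # prior distribution
m = 4.5                        # moment constraint: E_p[x] = m

def mre_posterior(lam):
    """Exponentially tilted prior: p_i proportional to q_i * exp(lam * x_i)."""
    w = q * np.exp(lam * x)
    p = w / w.sum()
    return p, p @ x

# E_p[x] increases monotonically in lam (its derivative is the variance),
# so a simple bisection recovers the Lagrange multiplier.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    _, mu = mre_posterior(mid)
    if mu < m:
        lo = mid
    else:
        hi = mid

p, mu = mre_posterior(0.5 * (lo + hi))
print(np.round(p, 4), round(mu, 4))
```

With a uniform prior this reduces to ordinary maximum entropy; a non-uniform q is where MRE's ability to incorporate prior information enters.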


Cited by 103 publications (60 citation statements). References 29 publications.
“…Instead of choosing arbitrary prior distributions, we adopted a systematic approach based on the principle of Minimum Relative Entropy (MRE) [Hou and Rubin, 2005;Rubin, 2003;Woodbury and Rubin, 2000;Woodbury and Ulrych, 1993], which states that of all the probabilities that satisfy the given constraints, such as average or higher-order moments, choose the one that has the highest entropy with respect to a known prior. Since entropy represents the amount of uncertainty associated with a probability distribution, the principle of MRE favors distribution that is the most uncommitted or the least subjective with respect to the constraints.…”
Section: Appendix A: Parameter Estimation Methods
confidence: 99%
“…With the same constraints in Equations (25)- (27), the minimum relative entropy copula can be derived by minimizing the relative entropy in Equation (40), which can be expressed as [33]:…”
Section: Minimum Relative Entropy Copula
confidence: 99%
“…POMCE states that one should choose the density with the minimum cross entropy that is as close to the prior as possible and satisfies the specified constraints. Thus, the unknown probability can be inferred by incorporating the prior information subject to certain constraints [52,53], which has been applied in many areas of hydrology and water resources for statistical modeling [6,10,[40][41][42].…”
Section: Relative Entropy
confidence: 99%
“…Obviously, w c (j), w s (j) and w o (j) should be as close as possible. According to the principle of minimum relative entropy [24] …”
Section: Determination Of Combination Weight By Entropy Of Information
confidence: 99%
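The weight-combination use of minimum relative entropy quoted above has a well-known closed form, which the following sketch illustrates (the weight values are my own example, not from the cited paper): minimizing Σ_j w_c(j) ln(w_c(j)/w_s(j)) + Σ_j w_c(j) ln(w_c(j)/w_o(j)) subject to Σ_j w_c(j) = 1 gives the geometric-mean solution w_c(j) ∝ √(w_s(j)·w_o(j)).

```python
import math

def combine_weights(w_s, w_o):
    """Combined weights closest (in total relative entropy) to both
    the subjective weights w_s and the objective weights w_o."""
    raw = [math.sqrt(a * b) for a, b in zip(w_s, w_o)]
    total = sum(raw)
    return [r / total for r in raw]

# Assumed example vectors for three criteria.
w_s = [0.5, 0.3, 0.2]   # subjective weights
w_o = [0.2, 0.3, 0.5]   # objective weights
result = combine_weights(w_s, w_o)
print(result)
```

The normalized geometric mean keeps w_c(j) "as close as possible" to both source weightings simultaneously, which is exactly the criterion the citing authors invoke.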