1994
DOI: 10.1613/jair.61

Random Worlds and Maximum Entropy

Abstract: Given a knowledge base KB containing first-order and statistical facts, we consider a principled method, called the random-worlds method, for computing a degree of belief that some formula φ holds given KB. If we are reasoning about a world or system consisting of N individuals, then we can consider all possible worlds, or first-order models, with domain {1,...,N} that satisfy KB, and compute the fraction of them in which φ is true. We define the degree of belief to be the asymptotic value of this fraction…
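To make the abstract's computation concrete, here is a minimal sketch for a fixed domain size N, assuming a vocabulary with a single unary predicate P; the toy `kb` and `phi` below are hypothetical stand-ins for the formulas in the paper, not its actual examples.

```python
from fractions import Fraction
from itertools import product

def degree_of_belief(n, kb, phi):
    """Random-worlds degree of belief for the fixed domain {0, ..., n-1}:
    the fraction of worlds satisfying `kb` in which `phi` also holds.
    Worlds are the 2^n possible extensions of one unary predicate P."""
    worlds = [frozenset(i for i, bit in enumerate(bits) if bit)
              for bits in product((0, 1), repeat=n)]
    models = [w for w in worlds if kb(w, n)]
    if not models:
        return None  # KB has no model of this domain size
    return Fraction(sum(phi(w, n) for w in models), len(models))

# Hypothetical toy KB: "exactly 80% of individuals satisfy P";
# query phi: "individual 0 satisfies P".
kb = lambda w, n: 5 * len(w) == 4 * n
phi = lambda w, n: 0 in w

for n in (5, 10, 15):
    print(n, degree_of_belief(n, kb, phi))  # 4/5 for every such n
```

With approximate statistical constraints the fraction genuinely varies with N, and the paper's central result characterizes its limit via the maximum-entropy point of the constraints, which is what connects random worlds to maximum entropy.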

Cited by 48 publications (37 citation statements, 2000–2019)
References 21 publications
“…This means that P |=_G KB holds if the conditional probability P(φ* | ψ*) equals α for any ground conditional (φ* | ψ*)[α] in G(KB). To perform reasoning we employ the principle of maximum entropy [24,20,43,44]. The entropy H of a probability distribution P is defined as…”
Section: Example 215 (mentioning)
confidence: 99%
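The quotation above is truncated at the definition; the standard Shannon entropy it refers to, for a probability distribution P over a finite set Ω of possible worlds, is

```latex
H(P) = -\sum_{\omega \in \Omega} P(\omega) \log P(\omega)
```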
“…we obtain the most unbiased representation of the knowledge in G(KB), cf. [24,20,43,44] for the formal properties and uniqueness theorems. The approach of maximum entropy yields a model-based inference procedure, and reasoning on KB is now solely performed using the probability distribution P^ME_{G,KB}.…”
Section: Example 215 (mentioning)
confidence: 99%
“…This principle, which can be traced back some 100 years to the works of Boltzmann, Maxwell, Gibbs and Planck, and has been introduced in its modern form by Jaynes (1957) and Tribus (1961), proved its efficiency in various fields of Physics, Statistics, Information Theory, Pattern Recognition, Signal Processing, etc. (Grove et al. 1994, Goldszmidt et al. 1990, Aczel and Forte 1986, Justice 1986, Jaynes 1985, Skyrms 1985, Williams 1980, Jaynes 1979, Tribus 1961, Jaynes 1957).…”
Section: Evidence Versus Probability (mentioning)
confidence: 99%
“…As a credulous alternative, one can select a specific probability distribution from the models of the knowledge base and do reasoning by just using this probability distribution. A reasonable choice for such a model is the one probability distribution with maximum entropy [15,27,20]. This probability distribution satisfies several desirable properties for commonsense reasoning and is uniquely determined among the probability distributions that satisfy a given set of probabilistic conditionals; see [15,27,20] for the theoretical foundations.…”
Section: Introduction (mentioning)
confidence: 99%
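As a concrete illustration of this model-based choice, here is a minimal numerical sketch, assuming a propositional language with two atoms a, b and a single hypothetical conditional (b|a)[0.9]; it uses generic constrained optimization via scipy, not the dedicated algorithms of the cited papers.

```python
import numpy as np
from scipy.optimize import minimize

# Worlds for two atoms a, b: truth-value pairs (a, b).
worlds = [(0, 0), (0, 1), (1, 0), (1, 1)]

# Hypothetical conditional (b|a)[0.9], encoded as the linear
# constraint P(a & b) - 0.9 * P(a) = 0 over the world probabilities.
def conditional_constraint(p):
    p_ab = sum(pi for pi, (a, b) in zip(p, worlds) if a and b)
    p_a = sum(pi for pi, (a, b) in zip(p, worlds) if a)
    return p_ab - 0.9 * p_a

# Negative entropy, so that minimizing it maximizes H(P);
# clipping avoids log(0) at the boundary.
def neg_entropy(p):
    return np.sum(p * np.log(np.clip(p, 1e-12, 1.0)))

res = minimize(
    neg_entropy,
    x0=np.full(4, 0.25),               # start from the uniform distribution
    bounds=[(0.0, 1.0)] * 4,
    constraints=[
        {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},  # normalization
        {"type": "eq", "fun": conditional_constraint},     # (b|a)[0.9]
    ],
    method="SLSQP",
)
print(dict(zip(worlds, np.round(res.x, 4))))  # the ME distribution
```

The maximizer spreads probability as evenly as the constraint allows: the two ¬a-worlds come out equal, and the a-worlds split 9:1, which is exactly the "most unbiased" completion of the knowledge base that the quotation describes.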
“…Moreover, we will present a model-based inductive inference operator that is based on the principle of maximum entropy for each of the two semantics. The idea of application is quite simple and similar to the propositional case: having defined the set of models of a relational probabilistic knowledge base (according to each of the semantics), one chooses the unique probability distribution among these models that has maximal entropy, if possible, and therefore allows us to reason precisely (i.e., with precise probabilities, not based on intervals), but in a most cautious way (see [15,20] for the theoretical foundations). Examples will illustrate in which respects these inference operators differ, but we will show that both inference operators comply with all postulates.…”
Section: Introduction (mentioning)
confidence: 99%