Relational Data Mining 2001
DOI: 10.1007/978-3-662-04599-2_13
Learning Probabilistic Relational Models

Cited by 399 publications (255 citation statements)
References 9 publications
“…We merely use an abstract syntax called First-Order Conditional Influence (FOCI) statements [13] to present the semantics of the directed models. We previously used this syntax to derive learning algorithms [13] and showed how most directed models, such as BLPs [10], RBNs [7], PRMs [3], the probabilistic relational language [4], and logical Bayes nets [2], can be represented using this syntax. The goal of this work is not to convert FOCI statements to MLNs but to show that the knowledge captured by directed models can be represented using MLNs.…”
Section: MLNs and Directed Models (mentioning, confidence: 99%)
See 1 more Smart Citation
“…We merely use an abstract syntax called as First-Order Conditional Influence (FOCI) statements [13] to present the semantics of the directed models. We had earlier used this syntax to derive learning algorithms [13] and showed how most directed models such as BLPs [10], RBNs [7], PRMs [3], probabilistic relational language [4] and logical Bayes nets [2] can be represented using this syntax. The goal of this work is not to convert from FOCI statements to MLNs but to show that the knowledge captured by directed models can be represented using MLNs.…”
Section: Mlns and Directed Modelsmentioning
confidence: 99%
“…Directed models can learn the conditional distributions due to each of the causes separately and combine them using a (possibly stochastic) function, thus making the process of learning easier. This notion of independence of causal influence (ICI) has been extended to directed SRL models in two different ways: while PRMs [3] use aggregators such as max, min, and average to combine the influences due to several parents, other formalisms such as BLPs [10] and RBNs [8] use combination functions such as Noisy-OR, mean, or weighted mean to combine distributions.…”
Section: Introduction (mentioning, confidence: 99%)
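The distinction this excerpt draws is easy to see in code. Below is a minimal Python sketch with illustrative function names of my own (not the cited papers' code): a combination function such as Noisy-OR merges per-parent probability distributions, while a PRM-style aggregator collapses the parent values into a single summary value that feeds an ordinary CPD.

```python
# Illustrative sketch of two ways to combine influences from multiple parents.
# Names and numbers are hypothetical; see BLPs/RBNs vs. PRMs for the real definitions.

def noisy_or(per_parent_probs):
    """Combination function: each parent independently 'fires' the child.
    P(child = 1) = 1 - prod_i (1 - p_i), where p_i = P(child = 1 | parent i alone)."""
    survive = 1.0
    for p in per_parent_probs:
        survive *= (1.0 - p)
    return 1.0 - survive

def average_aggregator(parent_values):
    """Aggregator: collapse a multiset of parent values into one summary value,
    which then acts as the single parent of an ordinary CPD (PRM style)."""
    return sum(parent_values) / len(parent_values)

# A child attribute influenced by three parents:
print(noisy_or([0.3, 0.5, 0.2]))            # 1 - 0.7 * 0.5 * 0.8 = 0.72
print(average_aggregator([3.0, 4.0, 2.0]))  # 3.0
```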
“…In (Getoor, Friedman, Koller, & Pfeffer, 2001) the concepts of relational schema and relational skeleton are introduced. A relational skeleton for a relational schema is a partial specification (instantiation) of the schema: it specifies the objects involved in the schema and the relations between them.…”
Section: Appendix B (mentioning, confidence: 99%)
“…A relational skeleton for a relational schema is a partial specification (instantiation) of the schema: it specifies the objects involved in the schema and the relations between them. In (Getoor et al, 2001) Getoor et al. define a K-PRM as the RBN that specifies the probability distribution over particular instantiations of a given skeleton: each schema admits multiple skeletons, and each skeleton admits multiple completions (i.e., complete instantiations). A K-PRM induces a distribution over the instances that complete the skeleton.…”
Section: Appendix B (mentioning, confidence: 99%)
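To make the schema / skeleton / completion hierarchy concrete, here is a minimal data-structure sketch in Python. The class and attribute names are hypothetical, not taken from the paper; it shows a toy university schema, one skeleton over it, and one completion of that skeleton. A PRM defines a probability distribution over all completions of a given skeleton.

```python
# Hypothetical illustration of schema -> skeleton -> completion.

# Relational schema: classes, their descriptive attributes, and reference slots.
schema = {
    "Student": {"attributes": ["intelligence"], "refs": {}},
    "Course":  {"attributes": ["difficulty"],  "refs": {}},
    "Reg":     {"attributes": ["grade"],
                "refs": {"student": "Student", "course": "Course"}},
}

# Relational skeleton: objects and the relations between them are fixed,
# but every attribute value is left unspecified.
skeleton = {
    "Student": ["alice", "bob"],
    "Course":  ["cs101"],
    "Reg":     [{"student": "alice", "course": "cs101"},
                {"student": "bob",   "course": "cs101"}],
}

# One completion: every attribute of every object receives a value.
completion = {
    ("alice", "intelligence"): "high",
    ("bob",   "intelligence"): "low",
    ("cs101", "difficulty"):   "hard",
    (("alice", "cs101"), "grade"): "A",
    (("bob",   "cs101"), "grade"): "B",
}
```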
“…These decision trees correspond to a particular form of ICL rules. Unlike many representations, such as Probabilistic Relational Models [Getoor et al, 2001], the ICL is not restricted to a fixed number of parameters to learn; it is possible to have representations where each individual has associated parameters. This should allow for richer representations and so a better fit to the data.…”
Section: ICL and Learning (mentioning, confidence: 99%)
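The contrast drawn here, between a parameter set fixed by the schema and parameters attached to individuals, can be sketched as follows. This is a hypothetical Python illustration of the general idea, not the ICL or PRM implementation; all names and probabilities are invented.

```python
# Hypothetical contrast between class-level and per-individual parameters.

# PRM-style: one shared parameter vector per class attribute, so the number
# of parameters is fixed by the schema, regardless of how many objects exist.
shared_params = {("Student", "intelligence"): [0.3, 0.7]}  # P(low), P(high)

def prob_shared(student_id, value_idx):
    # Every student uses the same distribution.
    return shared_params[("Student", "intelligence")][value_idx]

# ICL-style flexibility: an individual may carry its own parameters, so the
# parameter count can grow with the number of individuals in the data.
individual_params = {
    "alice": [0.1, 0.9],
    "bob":   [0.6, 0.4],
}

def prob_individual(student_id, value_idx):
    # Fall back to the shared distribution for unseen individuals.
    params = individual_params.get(
        student_id, shared_params[("Student", "intelligence")])
    return params[value_idx]

print(prob_shared("alice", 1))      # 0.7 -- same for every student
print(prob_individual("alice", 1))  # 0.9 -- alice's own parameter
```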