2009
DOI: 10.1016/j.dss.2009.07.004
Exploring optimization of semantic relationship graph for multi-relational Bayesian classification

Cited by 20 publications (31 citation statements); references 4 publications.
“…A common way to learn recursive dependencies in multi-relational data mining is to duplicate the entity tables involved in a self-relationship as follows (Yin et al. 2004; Chen et al. 2009). For instance, for a self-relationship Friend(U1, U2) with two foreign key pointers to an entity […] This prevents auxiliary functor nodes, such as ranking(S_aux), from having parents.…”
Section: Results: Neither of the Markov Logic Methods LHL nor LSM Disc…
confidence: 99%
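The table-duplication trick described in the quote above can be sketched with SQL table aliasing, which plays the same role: the entity table is joined once per foreign key of the self-relationship, so each role gets its own logical copy. This is a minimal sketch; the schema, column names, and data are illustrative, not taken from the cited papers.

```python
import sqlite3

# Toy schema: User entities with a boolean attribute, and a
# self-relationship Friend(u1, u2) with two FKs into User.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE User (uid TEXT PRIMARY KEY, smokes INTEGER)")
cur.execute("CREATE TABLE Friend (u1 TEXT, u2 TEXT)")
cur.executemany("INSERT INTO User VALUES (?, ?)",
                [("anna", 1), ("bob", 0), ("carl", 1)])
cur.executemany("INSERT INTO Friend VALUES (?, ?)",
                [("anna", "bob"), ("bob", "carl")])

# Aliasing User as A and B duplicates the entity table, one copy per
# role; attribute pairs (A.smokes, B.smokes) across the Friend link
# are the candidate recursive dependencies.
rows = cur.execute("""
    SELECT A.smokes, B.smokes
    FROM Friend F
    JOIN User A ON F.u1 = A.uid
    JOIN User B ON F.u2 = B.uid
""").fetchall()
print(rows)  # one (smokes, smokes) pair per friendship link
```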
“…In statistical-relational learning, Jensen and Neville introduced the term relational autocorrelation in analogy with temporal autocorrelation (Jensen and Neville 2002; Neville and Jensen 2007). In multi-relational data mining, such dependencies are found by considering self-joins, where a table is joined to itself (Chen et al. 2009). We will use both terms, recursive dependency and autocorrelation.…”
confidence: 99%
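A simple way to quantify the relational autocorrelation the quote refers to is to look at how often linked entities agree on an attribute, over the pairs produced by the self-join view. The statistic below (fraction of agreeing linked pairs) is only a rough proxy for the measures in the cited work, and the data and function name are illustrative.

```python
# Illustrative link data and a boolean attribute per entity.
friends = [("anna", "bob"), ("bob", "carl"), ("carl", "anna")]
smokes = {"anna": True, "bob": True, "carl": False}

def autocorrelation(links, attr):
    # Fraction of linked pairs agreeing on the attribute: a simple
    # proxy for the relational-autocorrelation statistic; 1.0 means
    # perfect agreement across links, 0.0 none.
    agree = sum(attr[a] == attr[b] for a, b in links)
    return agree / len(links)

print(autocorrelation(friends, smokes))  # here only anna-bob agree
```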
“…To indicate this dependence, we add the functor node RA(S, P) as a parent of popularity(P) in the decision tree for ranking(S). Like most statistical-relational systems [1, 33], and like the learn-and-join algorithm, we consider associations between attributes of two entities only conditional on the existence of a link between the entities. Therefore there is no branch corresponding to RA(S, P) = F in the decision tree of Figure 7 (right).…”
confidence: 99%
“…To indicate this dependence, we add the functor node RA(S, P) as a parent of popularity(P) in the decision tree for ranking(S). Like most statistical-relational systems (Getoor et al. 2007; Chen et al. 2009), and like the learn-and-join algorithm, we consider associations between attributes of two entities only conditional on the existence of a link between the entities. Therefore there is no branch corresponding to RA(S, P) = F in the decision tree of Fig.…”
Section: Form a Family Join Table That Combines All Functor Nodes in…
confidence: 99%
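The link-conditional convention in the last two quotes — associating attributes of two entities only when a link such as RA(S, P) exists, with no RA = F branch — amounts to never generating attribute pairs for unlinked entities when building the family join table. A minimal sketch, assuming a toy student/professor schema (names, attribute values, and the RA pairs are all hypothetical):

```python
# ranking(S) and popularity(P) values per entity (illustrative).
students = {"s1": "high", "s2": "low"}
profs = {"p1": "high", "p2": "low"}
# RA(S, P) = T pairs; every other (S, P) pair implicitly has RA = F.
ra = {("s1", "p1"), ("s2", "p2")}

# Attribute associations are formed only conditional on the link:
# RA = F pairs are simply never generated, mirroring the absent
# RA(S, P) = F branch in the decision tree.
pairs = [(students[s], profs[p])
         for s in students for p in profs
         if (s, p) in ra]
print(pairs)  # one (ranking, popularity) pair per existing RA link
```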