2017
DOI: 10.1214/16-ba1032

Marginal Pseudo-Likelihood Learning of Discrete Markov Network Structures

Cited by 24 publications (65 citation statements)
References 22 publications
“…The main contribution of this work is extending the scope of marginal pseudo‐likelihood (MPL) to also cover CMNs. This was achieved by combining the concept of marginal pseudo‐likelihood for Markov networks (Pensar et al, ) with the concept of local CSIs in Bayesian network learning (Boutilier et al, ; Friedman & Goldszmidt, ; Pensar et al, ). The resulting objective function has a tractable closed‐form expression and the corresponding estimator was here proven to be consistent in the large sample limit.…”
Section: Discussion
confidence: 99%
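The quoted statement notes that the MPL objective has a tractable closed-form expression: it factorizes over nodes, and each local term is a Dirichlet-multinomial marginal likelihood of a node given its candidate Markov blanket. A minimal sketch of one such local term (the function name, data layout, and the symmetric Dirichlet(0.5) prior are illustrative assumptions, not the paper's exact choices):

```python
from collections import Counter
from math import lgamma

def local_mpl(data, node, blanket, alpha=0.5):
    """Log marginal pseudo-likelihood of `node` given a candidate
    Markov blanket, under a symmetric Dirichlet(alpha) prior.
    Closed form: a product of Dirichlet-multinomial terms, one per
    observed blanket configuration (hyperparameters illustrative)."""
    states = sorted({row[node] for row in data})
    r = len(states)  # number of states of the node
    # counts[l][k] = occurrences of node state k under blanket config l
    counts = {}
    for row in data:
        l = tuple(row[v] for v in blanket)
        counts.setdefault(l, Counter())[row[node]] += 1
    score = 0.0
    for cnt in counts.values():
        n_l = sum(cnt.values())
        # normalizing Gamma-function ratio for this configuration
        score += lgamma(r * alpha) - lgamma(r * alpha + n_l)
        # per-state Gamma-function ratios
        for k in states:
            score += lgamma(alpha + cnt[k]) - lgamma(alpha)
    return score
```

Because each node's score depends only on counts within its blanket configurations, candidate structures can be compared by re-evaluating local terms, which is what makes the estimator tractable at scale.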
“…Proof. (Theorem 1) Pensar et al (2016b) showed that MPL is consistent in identifying the correct graph for traditional MNs. Considering maximal regular structures, the MPL is also consistent in identifying the correct graph of a CMN.…”
Section: Appendix
confidence: 96%