2006
DOI: 10.1016/j.artint.2005.12.004

Decomposition of structural learning about directed acyclic graphs

Cited by 63 publications (79 citation statements). References 7 publications.
“…□ C [11] (pp. 7-8) gives an example, and illustrates two cases of an irrelevant factor and a confounder, respectively. To illustrate the concepts of confounding and irrelevant factor, and Lemma 1, we continue to discuss the relationship based on their original example and give two examples as follows.…”
Section: P D E E P D E E B P D E E P D E E (mentioning; confidence: 99%)
“…in recent decades [1][2][3][4][5][6][7], and the directed acyclic graph (DAG) is used to describe causal connections [4]. Confounding and confounder are two basic concepts in epidemiological causal inference [1,3].…”
Section: Introduction (mentioning; confidence: 99%)
“…To construct DAGs from observed data, the IC algorithm searches for a separator S among all possible variable subsets such that two variables u and v are independent conditionally on S, while the PC algorithm restricts possible separators to vertices adjacent to u and v [56,76]. [87] presented a decomposition approach for recovering structures of DAGs. The decomposition approach starts with an undirected independence graph, which may not be a moral graph and may contain extra edges beyond those of the moral graph.…”
Section: A Decomposition Approach for Learning DAGs (mentioning; confidence: 99%)
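As a rough illustration of the separator search described in this excerpt, the Python sketch below implements a PC-style skeleton search. It is a minimal sketch only, not the decomposition approach of [87] or any published implementation: ci_test(u, v, S, data) is a hypothetical conditional-independence oracle, and, unlike the IC algorithm's search over all variable subsets, candidate separators S are restricted to the current neighbours of u.

```python
from itertools import combinations

def pc_skeleton(nodes, ci_test, data):
    """Minimal PC-style skeleton search (illustrative sketch).

    ci_test(u, v, S, data) -> bool is a hypothetical conditional-independence
    oracle: it returns True when u and v are independent given the set S.
    """
    # Start from the complete undirected graph over the observed variables.
    adj = {v: set(nodes) - {v} for v in nodes}
    depth = 0
    # Grow the candidate-separator size until no pair has enough neighbours.
    while any(len(adj[u] - {v}) >= depth for u in nodes for v in adj[u]):
        for u in nodes:
            for v in list(adj[u]):
                # PC restricts candidate separators to current neighbours of u;
                # the IC algorithm would search over all variable subsets.
                candidates = adj[u] - {v}
                if len(candidates) < depth:
                    continue
                for S in combinations(candidates, depth):
                    if ci_test(u, v, set(S), data):
                        # S separates u and v, so the edge u-v is dropped.
                        adj[u].discard(v)
                        adj[v].discard(u)
                        break
        depth += 1
    return adj
```

In practice the oracle would be replaced by a statistical conditional-independence test on the data, such as a partial-correlation or G-squared test.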
“…This prior knowledge of conditional independence can be used for decomposition in the approach. [86] gives a divide-and-conquer strategy in which structural learning for a large DAG is split recursively into learning problems for subgraphs. The recursive algorithm can be depicted as a binary tree whose top node is the full set of all variables and whose other nodes are proper subsets of the variables at their parent node.…”
Section: A Decomposition Approach for Learning DAGs (mentioning; confidence: 99%)
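The binary-tree recursion described in this excerpt can be pictured with the schematic Python sketch below. It is purely illustrative: find_separation, learn_local_dag, and combine_subgraphs are hypothetical callables standing in for the decomposition test, the local structure learner, and the merging step; nothing here is taken from the cited implementation.

```python
def learn_by_decomposition(variables, data, find_separation, learn_local_dag,
                           combine_subgraphs, max_local_size=20):
    """Divide-and-conquer recursion over variable sets (illustrative sketch).

    The recursion forms a binary tree: its top node is the full variable set
    and every other node is a proper subset of the variables at its parent.
    All three callables are hypothetical placeholders supplied by the caller.
    """
    if len(variables) <= max_local_size:
        # Leaf node: the subproblem is small enough to learn directly.
        return learn_local_dag(variables, data)

    split = find_separation(variables, data)
    if split is None:
        # No usable decomposition was found; learn over the whole set.
        return learn_local_dag(variables, data)

    left_vars, right_vars = split  # proper subsets that overlap on a separator
    left = learn_by_decomposition(left_vars, data, find_separation,
                                  learn_local_dag, combine_subgraphs,
                                  max_local_size)
    right = learn_by_decomposition(right_vars, data, find_separation,
                                   learn_local_dag, combine_subgraphs,
                                   max_local_size)
    # Merge the two locally learned subgraphs into one structure.
    return combine_subgraphs(left, right)
```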