2018
DOI: 10.1515/jci-2017-0016

Invariant Causal Prediction for Nonlinear Models

Abstract: An important problem in many domains is to predict how a system will respond to interventions. This task is inherently linked to estimating the system's underlying causal structure. To this end, Invariant Causal Prediction (ICP) (Peters et al., 2016) has been proposed, which learns a causal model by exploiting the invariance of causal relations across data from different environments. For linear models, the implementation of ICP is relatively straightforward. However, the nonlinear case is more challenging…
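For intuition, here is a minimal sketch of the ICP idea in the linear setting (the case the abstract calls relatively straightforward). It is not the authors' implementation: the residual-based invariance test below, which compares residual means and variances across environments, is a simplified stand-in for the tests developed in Peters et al. (2016), and all function names are illustrative.

```python
# Illustrative sketch only -- not the authors' implementation of ICP.
from itertools import combinations

import numpy as np
from scipy import stats
from sklearn.linear_model import LinearRegression


def invariance_pvalue(X, y, env, S):
    """p-value for 'residuals of a pooled regression of y on X[:, S]
    are identically distributed across environments' (simplified test)."""
    if len(S) == 0:
        resid = y - y.mean()
    else:
        model = LinearRegression().fit(X[:, S], y)
        resid = y - model.predict(X[:, S])
    groups = [resid[env == e] for e in np.unique(env)]
    p_mean = stats.f_oneway(*groups).pvalue   # equal residual means?
    p_var = stats.levene(*groups).pvalue      # equal residual variances?
    return 2 * min(p_mean, p_var)             # Bonferroni over the two tests


def icp_parents(X, y, env, alpha=0.05):
    """Intersection of all predictor subsets that pass the invariance test:
    an ICP-style estimate of the direct causes of y."""
    d = X.shape[1]
    accepted = [set(S) for k in range(d + 1)
                for S in combinations(range(d), k)
                if invariance_pvalue(X, y, env, list(S)) > alpha]
    return set.intersection(*accepted) if accepted else set()
```

For each candidate set S, the null hypothesis is that the relation between y and X_S is the same in every environment; sets containing the true direct causes should pass, so the intersection of accepted sets is, with high probability, a subset of the causal parents. A nonlinear variant would replace the linear fit with a flexible regressor and use a correspondingly more general invariance test, which is where the difficulties mentioned in the abstract arise.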

Cited by 141 publications (124 citation statements)
References 39 publications
“…, even if other variables have different values, or are set to different values). Formalizing this intuition leads to statistical tests for the property of Invariant Causal Prediction (ICP): that the dependence of an effect on its direct causes (e.g., its conditional probability distribution, given the values of its direct causes) is the same in different environments and under different interventions [23]. Information flows from causes to their direct effects over time.…”
Section: Introduction: Scientific Methods and Weight-of-evidence Conse… (mentioning)
confidence: 99%
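For reference, the invariance property this excerpt describes can be stated compactly; the notation below is introduced here for illustration, in the style of Peters et al. (2016).

```latex
% Null hypothesis of invariance for a candidate set S of direct causes of Y,
% across a collection of environments \mathcal{E}:
H_{0,S}(\mathcal{E}):\quad
  P\!\left(Y^{e} \mid X^{e}_{S} = x\right)
  \;=\;
  P\!\left(Y^{f} \mid X^{f}_{S} = x\right)
  \qquad \text{for all } e, f \in \mathcal{E} \text{ and all } x,
```

where Y^e and X_S^e denote the response and the candidate causes observed in environment e; ICP tests this hypothesis for each candidate set S and reports the intersection of the accepted sets.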
“…The study [220] empirically corroborated these predictions, thus establishing an intriguing bridge between the structure of learning problems and certain physical properties (cause-effect direction) of real-world data generating processes. It also led to a range of follow-up work [32], [78], [97], [114], [115], [152], [153], [156], [167], [195], [204], [243], [263], [267], [277], [278], [281], complementing the studies of Bareinboim and Pearl [14], [185], and it inspired a thread of work in the statistics community exploiting invariance for causal discovery and other tasks [105], [106], [114], [187], [191].…”
Section: A. Semisupervised Learning (mentioning)
confidence: 99%
“…Another example of model perturbation is sensitivity analysis in Bayesian modeling (38,39). Many of the model conditions used in causal inference are in fact stability concepts that assume away confounding factors by asserting that different conditional distributions are the same (40,41).…”
Section: D2. Data Perturbation (mentioning)
confidence: 99%