2019
DOI: 10.1080/01621459.2019.1660174
Deep Knockoffs

Abstract: This paper introduces a machine for sampling approximate model-X knockoffs for arbitrary and unspecified data distributions using deep generative models. The main idea is to iteratively refine a knockoff sampling mechanism until a criterion measuring the validity of the produced knockoffs is optimized; this criterion is inspired by the popular maximum mean discrepancy in machine learning and can be thought of as measuring the distance to pairwise exchangeability between original and knockoff features. By build…
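The pairwise-exchangeability criterion described in the abstract can be illustrated concretely: swapping any subset S of columns between X and its knockoffs X̃ should leave the joint distribution of the pair unchanged, and a maximum mean discrepancy (MMD) between swapped and unswapped samples measures the violation. The sketch below is illustrative only, assuming a Gaussian kernel and hypothetical function names; it is not the paper's actual implementation.

```python
import numpy as np

def swap(X, X_knockoff, S):
    """Swap the columns indexed by S between X and its knockoffs.

    Pairwise exchangeability requires (X, X_knockoff)_swap(S) to have
    the same joint distribution as (X, X_knockoff) for every subset S.
    """
    X_sw, Xk_sw = X.copy(), X_knockoff.copy()
    X_sw[:, S], Xk_sw[:, S] = X_knockoff[:, S], X[:, S]
    return X_sw, Xk_sw

def mmd2(A, B, sigma=1.0):
    """Biased (V-statistic) estimate of the squared maximum mean
    discrepancy between samples A and B, using a Gaussian kernel."""
    def k(U, V):
        d2 = ((U[:, None, :] - V[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))
    return k(A, A).mean() + k(B, B).mean() - 2 * k(A, B).mean()
```

In this spirit, a knockoff generator could be trained by minimizing `mmd2` between joint samples `np.hstack([X, Xk])` and their swapped counterparts over random subsets S; the bandwidth `sigma` is a tuning choice.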


Cited by 116 publications (157 citation statements). References 55 publications (152 reference statements).
“…We noticed two concurrent independent works on deep knockoff generators (Romano et al.; Jordon et al.). Both methods generate knockoffs as a function of X and a generated white noise E.…”
Section: Discussion
confidence: 99%
“…They both propose some loss function for the discrepancy between the distributions of (X, X̃)_swap(S) and (X, X̃). Romano et al. directly optimize an objective function for the discrepancy of the swapped distribution, which can be considered a natural generalization of moment-matching methods. Jordon et al. use a GAN to generate knockoffs, where the loss function of the discriminator is the cross-entropy of identifying the swapped vectors; this introduces p additional power networks.…”
Section: Discussion
confidence: 99%
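The "moment matching" that the quote above generalizes can be sketched at second order: valid knockoffs satisfy Cov(X̃) = Cov(X) = Σ and Cov(X, X̃) = Σ − diag(s), so off the diagonal the cross-covariance must match Σ exactly. The function below is a hypothetical illustration of that second-moment discrepancy, not code from either cited paper.

```python
import numpy as np

def second_moment_discrepancy(X, Xk):
    """Second-moment knockoff discrepancy (a simple moment-matching
    criterion): penalize Cov(Xk) != Cov(X) and, off the diagonal,
    Cov(X, Xk) != Cov(X)."""
    n, p = X.shape
    Sigma = np.cov(X, rowvar=False)
    Sigma_k = np.cov(Xk, rowvar=False)
    # Sample cross-covariance between X and its knockoffs.
    cross = (X - X.mean(0)).T @ (Xk - Xk.mean(0)) / (n - 1)
    off = ~np.eye(p, dtype=bool)  # mask selecting off-diagonal entries
    return (np.linalg.norm(Sigma - Sigma_k) +
            np.linalg.norm((cross - Sigma)[off]))
```

A full MMD-type objective, as in the quoted discussion, extends this beyond second moments to the whole joint distribution.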
“…In fact, requiring that the joint distribution of X̃ and X be exchangeable in the sense of Candès et al. (2018) is much stronger than asking for false discovery rate control at a nominal level for a specific choice of importance statistics. However, concrete examples can be found where an incorrect sampling mechanism leads to an inflation of the type-I errors (Romano et al., 2018).…”
Section: Modeling the Distribution of the Explanatory Variables
confidence: 99%
“…The recent work of Romano et al. (2018) proposes an alternative machine that can produce approximate knockoffs in great generality, without making modeling assumptions on F_X (Bottolo & Richardson, 2019; Rosenblatt et al., 2019). The approach of Romano et al. (2018) is based on deep generative models, and it can be powerful because it is driven by the effort to make X̃ as uncorrelated with X as possible, in the spirit of Barber & Candès (2015). However, deep knockoffs are more computationally expensive and are not exactly exchangeable when applied to hidden Markov models.…”
Section: Sampling Knockoffs
confidence: 99%