2019
DOI: 10.1002/sta4.260

Deep latent variable models for generating knockoffs

Abstract: Selective inference is an emerging field in big-data analytics; it aims to conduct variable selection and provide statistical inference at the same time. Among various selective inference frameworks, the model‐X framework offers the most flexible tool, equipping almost any machine learning method with the ability to perform false discovery rate (FDR)-controlled variable selection. This paper provides a practical and flexible approach to generating knockoffs. We propose to fit a latent variable model for generatin…
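The abstract describes model-X knockoffs: synthetic copies of the covariates that are exchangeable with the originals and can be paired with any variable-selection statistic for FDR control. As a concrete point of reference (not the deep latent variable model proposed in this paper), here is a minimal sketch of the classical second-order Gaussian construction with the equicorrelated choice of s; the function name and interface are illustrative assumptions:

```python
import numpy as np

def gaussian_knockoffs(X, Sigma, seed=None):
    """Sample second-order knockoffs for rows X_i ~ N(0, Sigma),
    using the equicorrelated choice s_j = min(2 * lambda_min(Sigma), 1)
    (Sigma is assumed to be a correlation matrix)."""
    rng = np.random.default_rng(seed)
    p = Sigma.shape[0]
    lam_min = np.linalg.eigvalsh(Sigma)[0]         # smallest eigenvalue
    S = min(2.0 * lam_min, 1.0) * np.eye(p)        # diag(s)
    Sigma_inv_S = np.linalg.solve(Sigma, S)        # Sigma^{-1} diag(s)
    mu = X - X @ Sigma_inv_S                       # conditional mean of knockoffs
    V = 2.0 * S - S @ Sigma_inv_S                  # conditional covariance
    L = np.linalg.cholesky(V + 1e-10 * np.eye(p))  # small jitter for stability
    return mu + rng.standard_normal(X.shape) @ L.T
```

The paper's contribution is to replace this Gaussian assumption with a fitted latent variable model, so that knockoffs can be generated for more complex covariate distributions.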

Cited by 9 publications (8 citation statements)
References 12 publications
“…Romano, Sesia, and Candès (2020) developed a Deep Knockoff machine using deep generative models. Liu and Zheng (2019) developed a Model-X generating method using Deep latent variable models. More recently, Bates, Candès, Janson, and Wang (2020) proposed an efficient general metropolized knockoff sampler.…”
Section: Knockoff-based Methods
confidence: 99%
“…where Swap(j) stands for exchanging the j-th column and the (j + p)-th column. X̃_j can be sampled using various algorithms (Bates et al., 2020; Liu & Zheng, 2019; Romano et al., 2020; Spector & Janson, 2020).…”
Section: Notations
confidence: 99%
“…(5) where Swap(j) stands for exchanging the j-th column and the (j + p)-th column of [X, X̃], and A =_d B indicates that A and B are identical in distribution. X̃ can be generated using various algorithms (Barber & Candès, 2015; Romano et al., 2020; Liu & Zheng, 2019; Bates et al., 2021; Spector & Janson, 2022). More knockoff construction details are reviewed in Web Appendix A.1.…”
Section: Algorithm
confidence: 99%
“…Similarly, Bayesian approaches to feature selection [28] have also been generalised to the group setting [35]. Finally, the Knockoff procedure [9,17,39,57,73,80] is a generative procedure that creates fake covariates (knockoffs), obeying certain symmetries under permutations of real and knockoff features. By subsequently carrying out feature selection on the combined real and knockoff data, it is possible to obtain guarantees on the False Discovery Rate of the selected features.…”
Section: Related Work
confidence: 99%
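The last excerpt describes the selection step: after knockoffs are generated, feature statistics W_j are computed for each real/knockoff pair and thresholded to bound the false discovery rate. A minimal sketch of the standard knockoff+ threshold follows; the function names and the example values of W are illustrative assumptions, and in practice W_j would come from, e.g., differences of absolute lasso coefficients:

```python
import numpy as np

def knockoff_threshold(W, q=0.1):
    """Knockoff+ data-dependent threshold: the smallest t > 0 with
    (1 + #{j : W_j <= -t}) / max(1, #{j : W_j >= t}) <= q."""
    W = np.asarray(W, dtype=float)
    for t in np.sort(np.abs(W[W != 0])):
        fdp_hat = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
        if fdp_hat <= q:
            return t
    return np.inf  # no threshold achieves the target FDR level

def knockoff_select(W, q=0.1):
    """Indices of features whose statistic clears the threshold."""
    return np.where(np.asarray(W) >= knockoff_threshold(W, q))[0]
```

The "+1" in the numerator is what gives the procedure its finite-sample FDR guarantee, regardless of which algorithm generated the knockoffs.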