2017
DOI: 10.48550/arxiv.1712.08645
Preprint

Dropout Feature Ranking for Deep Learning Models

Abstract: Deep neural networks (DNNs) achieve state-of-the-art results in a variety of domains. Unfortunately, DNNs are notorious for their non-interpretability, which limits their applicability in hypothesis-driven domains such as biology and healthcare. Moreover, in the resource-constrained setting, it is critical to design tests that rely on fewer, more informative features while achieving high accuracy within a reasonable budget. We aim to close this gap by proposing a new general feature ranking method for deep …

Cited by 13 publications (17 citation statements)
References 37 publications
“…Inspired by the Dropout FR method based on the variational dropout layer [1] where the efficiency of the model is ignored, FSCD is proposed that considers both effectiveness and efficiency in a learnable process as illustrated in Figure 2. In the proposed FSCD, the effectiveness is optimized by the cross entropy based loss function, while the efficiency is optimized by the feature-wise regularization term in Eq.…”
Section: FSCD for Pre-Ranking Model
mentioning
confidence: 99%
“…The feature set for the pre-ranking model is a subset of that for the ranking model, which is selected by the proposed FSCD. All feature sets, consisting of 246 feature fields in their entirety, [1], the DFS method [6], and the proposed FSCD. include different types, e.g., simple features that directly look up embeddings and complex features that require complicated computations for online embeddings.…”
Section: Experiments 3.1 Experiments Configurations
mentioning
confidence: 99%
“…Another method that uses dropout for feature importance is [23], applying a technique called Variational Dropout to learn the optimal dropout rate for each feature. It tries to leave out as many features as possible and at the same time keep accuracy high.…”
Section: A. Feature Metrics
mentioning
confidence: 99%
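The idea the last statement describes, learning a per-feature dropout rate so that uninformative features get dropped while accuracy is preserved, can be sketched with a concrete (Gumbel-sigmoid) relaxation of Bernoulli dropout. The toy data, the frozen linear model, and every hyperparameter below are illustrative assumptions, not the paper's actual architecture or settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (illustrative): 6 features, only the first two carry signal.
n, d = 512, 6
X = rng.normal(size=(n, d))
y = X @ np.array([3.0, -2.0, 0.0, 0.0, 0.0, 0.0]) + 0.1 * rng.normal(size=n)

# A "pretrained" model to explain: ordinary least squares, kept frozen.
w = np.linalg.lstsq(X, y, rcond=None)[0]

# Learn per-feature keep-probability logits `alpha` through a concrete
# (Gumbel-sigmoid) relaxation of Bernoulli dropout, penalizing high keep
# probabilities so only features the model actually needs stay un-dropped.
alpha = np.zeros(d)              # keep probability starts at 0.5
temp, lam, lr = 0.5, 0.05, 0.2   # assumed temperature, penalty, step size

for _ in range(500):
    u = rng.uniform(1e-6, 1 - 1e-6, size=(n, d))
    # Relaxed dropout mask in (0, 1), differentiable w.r.t. alpha.
    z = 1.0 / (1.0 + np.exp(-(alpha + np.log(u / (1 - u))) / temp))
    err = (X * z) @ w - y                           # residual of masked model
    dz = (2.0 / n) * err[:, None] * w[None, :] * X  # dMSE/dz by chain rule
    s = 1.0 / (1.0 + np.exp(-alpha))                # current keep probs
    # Gradient through the relaxation plus the sparsity penalty lam * sum(s).
    dalpha = (dz * z * (1 - z) / temp).sum(axis=0) + lam * s * (1 - s)
    alpha -= lr * dalpha

keep = 1.0 / (1.0 + np.exp(-alpha))
ranking = np.argsort(-keep)      # most important features first
print(sorted(ranking[:2].tolist()))
```

Ranking features by their learned keep probability gives the feature-importance ordering: the penalty pushes the keep probability of the noise features down, while the loss keeps the informative features' probabilities high.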