2019
DOI: 10.1109/tnnls.2018.2880403

Dynamic Feature Acquisition Using Denoising Autoencoders

Abstract: In real-world scenarios, different features have different acquisition costs at test time, which necessitates cost-aware methods to optimize the cost and performance trade-off. This paper introduces a novel and scalable approach for cost-aware feature acquisition at test time. The method incrementally asks for features based on the available context, i.e., the feature values that are already known. The proposed method is based on sensitivity analysis in neural networks and density estimation using denoising autoencoders with bina…
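A minimal sketch of the kind of procedure the abstract describes, assuming a trained predictor network and a denoising autoencoder used to impute the not-yet-acquired features; the finite-difference sensitivity probe, the cost vector, and all names below are illustrative assumptions rather than the paper's exact algorithm.

```python
# Hedged sketch: choose the next feature to acquire by probing how much the
# prediction reacts to each unknown feature, normalized by its acquisition cost.
# `predictor`, `denoiser`, `costs`, and the finite-difference probe are
# illustrative stand-ins, not the paper's exact formulation.
import numpy as np

def acquire_next_feature(x, known_mask, costs, predictor, denoiser, eps=0.1):
    """Return the index of the unknown feature with the best sensitivity/cost score."""
    x_imp = denoiser(np.where(known_mask, x, 0.0))  # impute the unknown entries
    base = predictor(x_imp)
    best_j, best_score = None, -np.inf
    for j in np.flatnonzero(~known_mask):           # candidates: features not yet acquired
        x_probe = x_imp.copy()
        x_probe[j] += eps                           # small perturbation of feature j
        sensitivity = np.abs(predictor(x_probe) - base).sum()
        score = sensitivity / costs[j]              # utility per unit acquisition cost
        if score > best_score:
            best_j, best_score = j, score
    return best_j
```

At test time such a loop would be repeated: acquire the selected feature, update the mask and the input, and stop once the prediction is confident enough or the acquisition budget is exhausted (again, a sketch of the general idea, not the published procedure).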

Cited by 19 publications (30 citation statements)
References 25 publications
“…ii) Opportunistic Learning (OL) [21], a method based on deep Q-learning with variations of model uncertainty as the reward function, iii) a method based on exhaustive measurements of the sensitivity [16], and iv) a method based on approximation of sensitivities using denoising autoencoders (FACT) [17].…”
Section: Methods
confidence: 99%
“…This normalization permits using the value of zero for missing features during the prediction to act as mean imputation. Regarding the number of layers and hidden neurons, we used a similar number of trainable parameters for OL [21] and RL-Based [19], while due to the inherent differences, we had to use different architectures for FACT [17] and Exhaustive [16]. Nonetheless, for each classification task, the compared models reach a similar baseline accuracy (i.e., average accuracy after acquiring all the features).…”
Section: Methods
confidence: 99%
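A small sketch of the normalization trick mentioned in the statement above, under the assumption of per-feature z-scoring on the training set: once each feature has zero mean and unit variance, filling a missing entry with 0 is equivalent to imputing its training mean. The function names and array shapes are hypothetical.

```python
# Illustrative only: z-score features so that a zero placeholder for a
# missing value coincides with mean imputation at prediction time.
import numpy as np

def fit_normalizer(X_train):
    """Per-feature mean and std from training data of shape [n_samples, n_features]."""
    mu = X_train.mean(axis=0)
    sigma = X_train.std(axis=0) + 1e-8   # avoid division by zero for constant features
    return mu, sigma

def normalize_with_missing(x, known_mask, mu, sigma):
    """Standardize known entries; unknown entries become 0, i.e. the feature mean."""
    z = (x - mu) / sigma
    return np.where(known_mask, z, 0.0)
```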
“…Density Estimation via Autoencoders: The work in Kachuee et al. (2018) suggests acquiring the covariate which has the highest sensitivity to the output prediction y. In order to account for different covariate acquisition costs, the sensitivity scores are re-scaled appropriately.…”
Section: Markov Decision Process (MDP) Framework
confidence: 99%
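A hedged sketch of the cost-rescaled sensitivity idea summarized in the statement above, written here with PyTorch autograd as one possible way to measure how strongly each candidate covariate influences the prediction; the model, the cost vector, and the gradient-magnitude criterion are assumptions for illustration, not the cited paper's exact estimator.

```python
# Illustrative sketch: per-covariate sensitivity of the prediction, taken as
# the gradient magnitude and divided by that covariate's acquisition cost.
import torch

def cost_scaled_sensitivity(model, x_imputed, costs):
    """Return a per-feature score |dy/dx_j| / cost_j (higher = acquire sooner)."""
    x = x_imputed.clone().requires_grad_(True)
    y = model(x).sum()          # scalar proxy for the predicted output
    y.backward()
    return x.grad.abs() / costs

# Hypothetical usage:
# next_feature = int(torch.argmax(cost_scaled_sensitivity(model, x_imp, costs)))
```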