MIDIA: exploring denoising autoencoders for missing data imputation (2020)
DOI: 10.1007/s10618-020-00706-8

Cited by 18 publications (5 citation statements); references 29 publications.
“…Nazabal et al. published their work on HI-VAE [60] in 2018 as a variational autoencoder-based imputation framework which can be applied to a broader set of data types under the MCAR assumption and is particularly suitable for datasets with nominal variables. Since then, novel autoencoder-based static imputation methods continue to be introduced [61]–[64]. A few of these static deep learning-based imputation methods are further summarized in Suppl.…”
Section: B Advanced Approaches For Handling Missing Data (Static Data)
confidence: 99%
“…Generally, in an AE, the latent space is determined by the distribution of the dataset. Intuitively, a sampling-based method in the latent space can be used to impute the missing elements [22]–[25]. The main concern is that the distribution of the latent space rarely has a closed form, so practical imputation must rely on statistical approximations such as averaging the latent variables.…”
Section: Related Work
confidence: 99%
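The latent-space averaging idea quoted above can be sketched in a few lines. This is a minimal illustration, not MIDIA itself: a linear PCA encoder/decoder stands in for a trained autoencoder, and all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 features; every 10th row is missing feature 2 (np.nan).
X = rng.normal(size=(100, 4))
X_miss = X.copy()
X_miss[::10, 2] = np.nan

# Stand-in "autoencoder": a linear encoder/decoder from PCA on the
# complete rows, used only to illustrate latent-space averaging.
complete = X_miss[~np.isnan(X_miss).any(axis=1)]
mu = complete.mean(axis=0)
_, _, Vt = np.linalg.svd(complete - mu, full_matrices=False)
W = Vt[:2].T  # weights: 4 features -> 2 latent dims

encode = lambda x: (x - mu) @ W
decode = lambda z: z @ W.T + mu

# Because the latent distribution has no closed form, approximate it
# statistically: decode the *average* latent vector of complete rows
# and use that reconstruction to fill the gaps.
z_bar = encode(complete).mean(axis=0)
filled = decode(z_bar)

X_imp = X_miss.copy()
for row in X_imp:
    mask = np.isnan(row)
    row[mask] = filled[mask]
```

In practice a trained (denoising) autoencoder replaces the PCA pair, and sampling several latent vectors instead of one mean can give a distribution over imputed values.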
“…The MIDA algorithm uses the DAE model directly for missing value imputation, so the improvement in imputation accuracy is limited, and the running time becomes excessive on large datasets with many missing values. Reference [29] explores denoising autoencoders for missing data imputation (MIDIA) and proposes two variants: MIDIA-Sequential, which imputes missing values sequentially, and MIDIA-Batch, which imputes them in a single batch. MIDIA-Sequential trains an independent MIDIA model for each incomplete attribute and imputes the missing values sequentially according to each attribute's missing rate.…”
Section: Related Work
confidence: 99%
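The MIDIA-Sequential scheme described above (one model per incomplete attribute, processed by missing rate) can be sketched as follows. This is an assumption-laden stand-in: a least-squares regressor replaces the per-attribute MIDIA autoencoder, the ascending processing order is one plausible reading of "according to the attribute's missing rate", and the function name is illustrative.

```python
import numpy as np

def sequential_impute(X):
    """Fill NaNs column by column, lowest missing rate first.

    Illustrative only: MIDIA-Sequential trains one denoising
    autoencoder per incomplete attribute; here each attribute gets a
    simple least-squares model over the other attributes instead.
    """
    X = X.copy()
    miss_rate = np.isnan(X).mean(axis=0)
    for j in np.argsort(miss_rate):
        obs = ~np.isnan(X[:, j])
        if obs.all():
            continue  # column j is already complete
        # Predict column j from the other columns (NaNs zero-filled)
        # plus a bias term.
        others = np.delete(np.nan_to_num(X), j, axis=1)
        A = np.c_[others, np.ones(len(X))]
        coef, *_ = np.linalg.lstsq(A[obs], X[obs, j], rcond=None)
        X[~obs, j] = A[~obs] @ coef  # impute, then move to next column
    return X

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
X[:5, 0] = np.nan    # low missing rate: imputed first
X[:20, 2] = np.nan   # higher missing rate: imputed last
X_imp = sequential_impute(X)
```

Imputing the easiest columns first means later models see mostly-filled inputs, which is the intuition behind the sequential ordering; MIDIA-Batch would instead train a single model that imputes all incomplete attributes at once.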