2020
DOI: 10.1021/acs.analchem.9b05460

NormAE: Deep Adversarial Learning Model to Remove Batch Effects in Liquid Chromatography Mass Spectrometry-Based Metabolomics Data

Abstract: Untargeted metabolomics based on liquid chromatography−mass spectrometry is affected by nonlinear batch effects, which cover up biological effects, cause nonreproducibility, and are difficult to calibrate. In this study, we propose a novel deep learning model, called Normalization Autoencoder (NormAE), which is based on nonlinear autoencoders (AEs) and adversarial learning. An additional classifier and ranker are trained to provide adversarial regularization during the training of the AE model, latent r…

Cited by 42 publications (74 citation statements)
References 33 publications
“…Given there were six batches, the between-batch detection proportions meant we required a feature to be initially detected in at least 1, 2, 3, 4, 5, or 6 batches, respectively. For the traditional apLCMS procedure, we set the detection threshold (number of samples) at 30. XCMS IPO_4: CentWave parameters: same as XCMS IPO_3; peak grouping parameters: bw = 22, mzwid = 0.018; Loess parameters: missing = 1, extra = 3, span = 0.2, smooth = "loess", family = "gaussian".…”
Section: Results
confidence: 99%
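The batch-detection filter quoted above can be illustrated with a short sketch. This is a hypothetical, minimal implementation (the function name, the feature IDs, and the threshold of 3 are illustrative, not from the cited study): a feature is retained only if it was initially detected in at least a chosen number of the study's six batches.

```python
# Hypothetical sketch: retain a metabolomic feature only if it was
# initially detected in at least `min_batches` of the study's batches.
def filter_by_batch_detection(detected, min_batches):
    """detected: dict mapping feature ID -> set of batch IDs where it was found."""
    return {f for f, batches in detected.items() if len(batches) >= min_batches}

# Toy detection table for a six-batch study (feature IDs are made up).
detected = {
    "m123.45": {1, 2, 3, 4, 5, 6},  # found in all six batches
    "m200.01": {1, 3},              # found in only two batches
    "m310.22": {2, 4, 5},           # found in three batches
}

# Requiring detection in at least 3 of the 6 batches keeps two features.
kept = filter_by_batch_detection(detected, min_batches=3)
```

Raising `min_batches` toward 6 trades feature coverage for reproducibility across batches, which is the trade-off the detection-proportion sweep in the quoted passage explores.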
“…Recently, nonlinear models, often based on neural autoencoders, have gained popularity (e.g. normAE [40], AD-AE [10], or scGEN [32]). Most models aim to find a batch-effect-free latent space representation of the data via adversarial training.…”
Section: Related Work
confidence: 99%
“…Additionally, desired biological variation (referred to in this paper as "(experimental) design") between different independent experiments needs to be conserved by any algorithm that aims to remove batch effects. Although a range of batch correction algorithms has previously been suggested [46,28,40,8], only a small subset of these remains applicable in this large-scale setting. In particular, most previous algorithms cannot incorporate high-dimensional experimental design information.…”
Section: Introduction
confidence: 99%
“…With the evolution of deep network architectures, there have been more and more breakthroughs in GANs 14 over the past three years. A typical method applied to this field is NormAE 15 , developed by Rong et al in 2020. Its basic idea lies in constructing an adversarial training procedure between a nonlinear AE that removes batch effects and a discriminator that distinguishes the batch of origin based on the latent space.…”
Section: Introduction
confidence: 99%
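The adversarial objective described above can be sketched numerically. This is a minimal, hypothetical illustration (the function name, the weighting `lam`, and all numeric values are assumptions, not NormAE's actual loss or hyperparameters): the autoencoder minimizes reconstruction error while maximizing the batch discriminator's error on the latent codes, so that the latents retain no batch information.

```python
# Hypothetical sketch of a NormAE-style combined objective: the AE's loss
# rewards good reconstruction AND a confused batch discriminator.
def ae_loss(recon_err, disc_err, lam=1.0):
    # Lower is better for the AE: subtracting the discriminator's error
    # means the AE is penalized when batches are easy to identify.
    return recon_err - lam * disc_err

# Same reconstruction quality, different degrees of batch-information leakage:
leaky = ae_loss(recon_err=0.10, disc_err=0.05)  # discriminator finds batches easily
clean = ae_loss(recon_err=0.10, disc_err=0.60)  # discriminator is confused
assert clean < leaky  # the AE prefers the batch-free latent space
```

In the actual adversarial procedure the discriminator is trained in alternation to minimize its own error, which drives the encoder toward latent representations from which batch identity cannot be predicted.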
“…(b) Comparison of classification accuracy with multiple source batches for training and only one target batch for testing. Note that "Recon_T" denotes an ablation experiment that reconstructs all target batches. Afterwards, we select several of the latest and most representative tools, including ComBat 17 , NormAE 15 , BERMUDA 4 , and DESC 5 , for further comparison. Accuracy of cross-batch prediction at the sample level is utilized to assess the effectiveness of each method.…”
confidence: 99%
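The cross-batch evaluation protocol quoted above can be sketched as follows. This is a hypothetical, simplified version (the function name, the toy data, and the rule-based stand-in classifier are all assumptions): a model trained on source batches is scored by sample-level accuracy on one held-out target batch.

```python
# Hypothetical sketch of cross-batch prediction accuracy: score a trained
# classifier only on samples from the held-out target batch.
def cross_batch_accuracy(samples, target_batch, classify):
    """samples: list of (features, label, batch_id); classify: a trained model."""
    held_out = [(x, y) for x, y, b in samples if b == target_batch]
    correct = sum(classify(x) == y for x, y in held_out)
    return correct / len(held_out)

# Toy data: binary labels across three batches (values are made up).
samples = [
    ([0.1], 0, "batch1"), ([0.9], 1, "batch1"),
    ([0.2], 0, "batch2"), ([0.8], 1, "batch2"),
    ([0.3], 0, "batch3"), ([0.7], 1, "batch3"),
]

# A deliberately trivial threshold rule stands in for a trained classifier.
rule = lambda x: int(x[0] > 0.5)
acc = cross_batch_accuracy(samples, "batch3", rule)
```

If batch effects remain in the features, a classifier fit on batches 1 and 2 degrades on batch 3; effective batch correction closes that gap, which is why the compared tools are ranked by this accuracy.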