2017
DOI: 10.1093/nar/gkx449

NOREVA: normalization and evaluation of MS-based metabolomics data

Abstract: Diverse forms of unwanted signal variation in mass spectrometry-based metabolomics data adversely affect the accuracy of metabolic profiling. A variety of normalization methods have been developed to address this problem. However, their performance varies greatly and depends heavily on the nature of the studied data. Moreover, given the complexity of real data, it is not feasible to assess the performance of a method by a single criterion. We therefore developed NOREVA to enable performance evaluation …

Cited by 314 publications (194 citation statements); references 56 publications.
“…We also encourage researchers to assess the potential performance of different normalization methods on their empirical profiles, using simulated data that reflect the structure of empirical data in question (e.g., number of compounds, profile shapes, sample sizes, etc.) as in the present study, or via the open source tool NOREVA …”
Section: Discussion (mentioning)
confidence: 99%
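
The quoted recommendation is to benchmark candidate normalization methods on simulated data whose structure (number of compounds, group sizes, effect sizes) mirrors the empirical profiles. The sketch below is a minimal illustration of generating such data under assumed parameter choices; it is not the simulation procedure of the cited study or of NOREVA.

```python
import numpy as np

rng = np.random.default_rng(0)

n_compounds = 200        # assumed number of measured features
n_per_group = 15         # assumed sample size per biological group

# Baseline log-intensities; metabolite abundances typically span several decades.
baseline = rng.uniform(10, 20, size=n_compounds)

# A subset of compounds carries a true between-group effect.
effect = np.zeros(n_compounds)
effect[:20] = rng.normal(1.0, 0.3, size=20)

group_a = baseline + rng.normal(0, 0.5, size=(n_per_group, n_compounds))
group_b = baseline + effect + rng.normal(0, 0.5, size=(n_per_group, n_compounds))

# Sample-specific unwanted variation (e.g. dilution), which normalization should remove.
dilution = rng.normal(0, 1.0, size=(2 * n_per_group, 1))
data = np.vstack([group_a, group_b]) + dilution

print(data.shape)  # (30, 200): samples x compounds, ready for normalization benchmarking
```

Any candidate normalization method can then be applied to such a matrix and judged against the known group structure and the known unwanted-variation term.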
“…Furthermore, because of this size effect, pre-processing must be carried out before profiles can be compared statistically. A large number of pre-processing techniques are routinely applied to GC-MS data, but each is subject to inherent limitations that can lead to different, and sometimes misrepresentative, results, making biological interpretation problematic. Despite highly variable performance, it remains an open question which pre-processing technique is the most appropriate, and many researchers are unaware of the biases associated with these methods.…”
Section: Introduction (mentioning)
confidence: 99%
“…As a general recommendation, several normalization methods should be tried and critically assessed (see Boxes and ). Heuristics for the performance comparison of normalization strategies were recently suggested, based on statistical criteria such as the maximization of biological intergroup effects, the reduction of intragroup effects, and p-value distributions …”
Section: Data Visualization Preprocessing and Analysis (mentioning)
confidence: 99%
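
The criteria named in this statement (reduction of intragroup variation, preservation of intergroup effects, p-value behaviour) can be computed directly once a normalization has been applied. The following sketch compares raw data against simple median centering on simulated log-scale data; the normalization choice, data layout and all names are assumptions for illustration, not NOREVA's API or the exact heuristics of the cited review.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
groups = np.array([0] * 15 + [1] * 15)              # assumed two-group design

# Simulated log-scale data: sample-wise unwanted variation plus a true effect in 20 features.
data = rng.normal(12, 1, size=(30, 200)) + rng.normal(0, 1, size=(30, 1))
data[groups == 1, :20] += 1.0

def median_center(x):
    """One simple normalization: subtract each sample's median (log-scale data)."""
    return x - np.median(x, axis=1, keepdims=True)

def evaluate(x, labels):
    """Return pooled intragroup variance and the count of features with t-test p < 0.05."""
    a, b = x[labels == 0], x[labels == 1]
    intragroup = (a.var(axis=0).mean() + b.var(axis=0).mean()) / 2
    _, pvals = stats.ttest_ind(a, b, axis=0)
    return intragroup, int((pvals < 0.05).sum())

for name, x in [("raw", data), ("median-centered", median_center(data))]:
    intra, hits = evaluate(x, groups)
    print(f"{name}: intragroup variance = {intra:.3f}, significant features = {hits}")
```

A better normalization should lower the intragroup variance while keeping (or increasing) the number of truly different features recovered as significant.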
“…E.g. many batch-correction algorithms exist which use data-driven, internal standards (IS)-based or quality control samples (QC)-based normalization 5,25–28. Random errors should be statistically quantified by analysis of suitably defined replicates throughout a study, batch or run.…”
Section: QC Measures and MeTaQuaC Software (mentioning)
confidence: 99%
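
Of the normalization families listed in this statement, the quality-control-sample (QC)-based idea is perhaps the simplest to sketch: scale each feature by its median intensity across pooled QC injections. The example below is a minimal, hypothetical illustration with an assumed injection layout; real batch-correction algorithms typically model signal drift over injection order and are considerably more elaborate.

```python
import numpy as np

rng = np.random.default_rng(2)
n_samples, n_features = 24, 50
intensities = rng.lognormal(mean=10, sigma=0.5, size=(n_samples, n_features))

# Hypothetical injection layout: every 6th injection is a pooled QC sample.
is_qc = np.zeros(n_samples, dtype=bool)
is_qc[::6] = True

# Per-feature scaling factor: median intensity observed in the QC injections.
qc_median = np.median(intensities[is_qc], axis=0)

# Normalize so that each feature's QC median becomes 1 across the batch.
normalized = intensities / qc_median

print(np.median(normalized[is_qc], axis=0)[:5])   # ~1.0 for the first five features
```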