2014
DOI: 10.1007/s11306-014-0712-4
Controlling the quality of metabolomics data: new strategies to get the best out of the QC sample

Abstract: The type and use of quality control (QC) samples is a 'hot topic' in metabolomics. QCs are not novel in analytical chemistry; however since the evolution of using QCs to control the quality of data in large scale metabolomics studies (first described in 2011), the need for detailed knowledge of how to use QCs and the effects they can have on data treatment is growing. A controlled experiment has been designed to illustrate the most advantageous uses of QCs in metabolomics experiments. For this, samples were fo…

Cited by 134 publications (113 citation statements)
References 13 publications
“…One common strategy, and the one employed in this research, is to filter variables based on exhibiting presence in at least 75% of samples in at least one of n groups. Missing values due to reprocessing errors can be reduced by methods ranging from simply combining duplicate measurements in the peak-picking process and taking the average of each [6] to applying a target-ion search based on a predefined library, such as the previously described recursive analysis [1]. Missing values that still occur after such steps can be dealt with by changing data-reprocessing parameters or by manual assignment of values from the raw data [7,8], or imputed by zero [9], the median [5], the minimum value [10], ½ the minimum value [11], the arithmetic mean of all samples [11,12] or of the more closely related samples [3], k-nearest neighbors (kNN) [2,13], etc.…”
Section: General
confidence: 99%
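The filtering rule and the simple imputation strategies listed in the statement above can be sketched in a few lines of NumPy. The function names and the toy matrix are illustrative assumptions, not taken from any of the cited works:

```python
import numpy as np

def filter_features(X, groups, threshold=0.75):
    """Keep features detected (non-NaN) in at least `threshold` of the
    samples of at least one group (illustrative helper, not from the paper)."""
    groups = np.asarray(groups)
    keep = np.zeros(X.shape[1], dtype=bool)
    for g in np.unique(groups):
        frac_present = np.mean(~np.isnan(X[groups == g]), axis=0)
        keep |= frac_present >= threshold
    return X[:, keep], keep

def impute(X, method="half_min"):
    """Column-wise imputation of remaining NaNs by one of the simple
    strategies mentioned in the text (zero, median, min, half-min, mean)."""
    X = X.copy()
    for j in range(X.shape[1]):
        col = X[:, j]
        mask = np.isnan(col)
        if not mask.any():
            continue
        obs = col[~mask]
        fill = {"zero": 0.0,
                "median": np.median(obs),
                "min": obs.min(),
                "half_min": obs.min() / 2.0,
                "mean": obs.mean()}[method]
        col[mask] = fill
    return X

# Toy example: 4 samples (2 per group), 3 features.
X = np.array([[1.0, 2.0, np.nan],
              [1.5, 2.5, 3.0],
              [2.0, np.nan, np.nan],
              [2.5, np.nan, 4.0]])
groups = ["A", "A", "B", "B"]
Xf, keep = filter_features(X, groups)  # feature 3 dropped (only 50% present in each group)
Xi = impute(Xf, method="half_min")     # NaNs in feature 2 -> min(2.0, 2.5) / 2 = 1.0
```

Feature 2 survives the filter because it is fully present in group A even though it is absent from group B, which is exactly the "at least one of n groups" criterion; a global presence filter would have discarded it.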
“…The molecular feature extraction tool in Mass Hunter Qualitative Analysis (B.06.00, Agilent) was used to clean data of background noise and to provide a list of all possible features in each sample (as described previously [1]). Features were created using the accuracy of mass measurements to group ions related to the charge-state envelope, isotopic distribution, and/or the presence of adducts and dimers, as well as potential neutral loss of molecules.…”
Section: Data Processing and Treatment
confidence: 99%
“…Recent large-scale metabolome analyses employed an experimental design using QC samples to correct for drift in the raw signal intensity during the analysis [85–87]. The QC samples were prepared by mixing all the sample extracts in one analysis batch or in one metabolome analysis study. The iterative analyses of the QC sample were inserted at the start, at the end, and between every 4–8 actual samples in batch sequences of data acquisition.…”
Section: Overcoming Bottleneck 2: Quality Control Of Quantification Data
confidence: 99%
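The injection pattern described above (QCs at the start, at the end, and between every 4–8 study samples) can be sketched as a small scheduling helper. The lead-in of several conditioning QC injections is a common convention but an assumption here, as are all names; this is not the cited protocol's code:

```python
def qc_sequence(samples, qc_every=5, lead_in=3):
    """Build an injection order with `lead_in` conditioning QC injections
    at the start, one QC after every `qc_every` study samples, and a
    closing QC (illustrative sketch of the pattern described in the text)."""
    order = ["QC"] * lead_in
    for i, s in enumerate(samples, start=1):
        order.append(s)
        if i % qc_every == 0 and i != len(samples):
            order.append("QC")
    order.append("QC")
    return order

seq = qc_sequence([f"S{i}" for i in range(1, 11)], qc_every=5)
# ['QC', 'QC', 'QC', 'S1', ..., 'S5', 'QC', 'S6', ..., 'S10', 'QC']
```

Bracketing the batch with QCs and spacing them evenly is what lets downstream drift-correction algorithms interpolate the signal trend across the whole run rather than extrapolate past its ends.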