2008
DOI: 10.1371/journal.pone.0003724

Randomization in Laboratory Procedure Is Key to Obtaining Reproducible Microarray Results

Abstract: The quality of gene expression microarray data has improved dramatically since the first arrays were introduced in the late 1990s. However, the reproducibility of data generated at multiple laboratory sites remains a matter of concern, especially for scientists who are attempting to combine and analyze data from public repositories. We have carried out a study in which a common set of RNA samples was assayed five times in four different laboratories using Affymetrix GeneChip arrays. We observed dramatic differ…



Cited by 36 publications (28 citation statements)
References 23 publications
“…During sample processing, samples were randomized in 96-well plates in order to avoid any confounding factors owing to well location, time point, or biological replicate batch effects and to minimize error owing to plate edge effects and fluid transfers. The need for randomization during sample preparation as a means to avoid, or reduce, the likelihood of confounding biological factors with processing or procedural effects has been described for microarray-based platforms [10–13]. The tight groupings of replicates and the clear separation between groups of replicates, indicative of low noise in the data, were demonstrated by the published PCA plot [8] and are a testament to the solid experimental design.…”
Section: Intracellular Lifecycle Transcriptome Analysis
Confidence: 99%
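The randomized plate layout described in the excerpt above can be sketched as follows. This is a minimal illustration, not the authors' actual procedure; the group and replicate names, the fixed seed, and the four-group design are all invented for the example:

```python
import random

# Hypothetical sketch: randomly assign samples to 96-well plate positions
# so that biological groups are not confounded with well location.
samples = [f"group{g}_rep{r}" for g in ("A", "B", "C", "D") for r in range(1, 25)]
wells = [f"{row}{col}" for row in "ABCDEFGH" for col in range(1, 13)]

rng = random.Random(42)  # fixed seed so the layout itself is reproducible
layout = dict(zip(wells, rng.sample(samples, len(samples))))

print(layout["A1"])  # which sample landed in well A1
```

Because the assignment is a random permutation rather than a group-by-group fill, edge wells and interior wells receive samples from every group, which is the point of the excerpt's design.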
“…Replicates have to be distributed accurately across different days. Conversely, processing all the replicates on the same day will result in a systematic error; randomization is a key strategy for obtaining reproducible results, as recently discussed in [12].…”
Section: Discussion
Confidence: 99%
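One simple way to spread replicates across processing days, as the excerpt above recommends, is to shuffle within each condition and then deal samples out round-robin, so every day receives some replicates of every condition. This is only an illustrative sketch; the condition names, counts, and seed are invented:

```python
import random

# Hypothetical sketch: distribute replicates of each condition across
# three processing days instead of running a whole condition in one day.
rng = random.Random(7)
days = {"day1": [], "day2": [], "day3": []}

for cond in ("ctrl", "treat"):
    reps = [f"{cond}_{r}" for r in range(1, 7)]
    rng.shuffle(reps)  # randomize run order within the condition
    for i, rep in enumerate(reps):
        list(days.values())[i % 3].append(rep)  # round-robin across days

for day, batch in days.items():
    print(day, batch)
```

The round-robin step guarantees a balanced design (here, two replicates of each condition per day), so a day-specific systematic error cannot masquerade as a condition effect.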
“…Therefore, an experimental design that allows one to evaluate the experimental variability, through technical replicates and biological pooling, and to reduce its effects on the results by suitable statistical techniques is fundamental. In the last few years, some attempts have been reported in the literature [8], [9], [10], [11], [12].…”
Section: Introduction
Confidence: 99%
“…Amniotic fluid samples were processed and analyzed separately from serum samples but under parallel protocols. We performed principal component analysis (PCA) to evaluate potential batch effects (Yang et al., 2008) and corrected for these effects, where necessary, using ComBat (Johnson, Li, & Rabinovic, 2007) and xMSanalyzer.…”
Section: Methods
Confidence: 99%
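The PCA diagnostic mentioned in the excerpt above can be sketched with simulated data: if samples separate by processing batch along the top principal components, a batch effect is likely present. This sketch covers only the diagnostic step, not the ComBat correction, and the batch sizes, offset, and seed are invented:

```python
import numpy as np

# Hypothetical sketch: simulate two processing batches with a shared
# biology but a constant batch offset, then check whether PCA separates
# them along the first principal component.
rng = np.random.default_rng(0)
n_per_batch, n_features = 10, 50

batch1 = rng.normal(0.0, 1.0, size=(n_per_batch, n_features))
batch2 = rng.normal(0.0, 1.0, size=(n_per_batch, n_features)) + 3.0  # batch offset

X = np.vstack([batch1, batch2])
X_centered = X - X.mean(axis=0)

# PCA via singular value decomposition of the centered data matrix
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
scores = U * S  # sample coordinates in principal-component space

# With a strong batch offset, PC1 cleanly separates the two batches.
pc1 = scores[:, 0]
print(pc1[:n_per_batch].mean(), pc1[n_per_batch:].mean())
```

In practice one would plot the first two score columns colored by batch; well-mixed batches suggest no correction is needed, while the clear separation produced here is the signature that motivates a correction step such as ComBat.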