Meta-analysis of (single-cell method) benchmarks reveals the need for extensibility and interoperability
Preprint, 2022. DOI: 10.1101/2022.09.22.508982

Abstract: Computational methods represent the lifeblood of modern molecular biology. Benchmarking is important for all methods, but with a focus here on computational methods, benchmarking is critical to dissect important steps of analysis pipelines, formally assess performance across common situations as well as edge cases, and ultimately guide users on what tools to use. Benchmarking can also be important for community building and advancing methods in a principled way. We conducted a meta-analysis of recent single-ce…

Cited by 4 publications (7 citation statements). References 33 publications.
“…To account for this challenge, one could perform meta-analysis or combine the results from multiple studies. However, in the current single-cell benchmarking field, the input and output of the methods are rarely made available and researchers who want to extend from existing benchmarks would need to reconstruct the benchmark from scratch (Sonrel et al 2023). This situation calls for a collective effort from all researchers for more transparency in result sharing and for the development of novel approaches to create a consensus benchmarking framework that can be extended from different methods, datasets and criteria.…”
Section: Discussion
confidence: 99%
“…In light of the rapid advancement of methodologies and the acknowledged significance of benchmarking within the single-cell research field, this field stands out as a good exemplar to explore the present state of benchmarking practices and help understand gaps that necessitate community attention. Recently, Sonrel and colleagues provided the first paper (Sonrel et al 2023) that quantitatively reviewed a collection of 62 single-cell benchmarking papers. They emphasised the technical aspects of the single-cell benchmark works and highlighted the need for code reproducibility, interoperability and extensibility.…”
Section: Introduction
confidence: 99%
“…To ensure that the analyses can be reproduced, extended, and scaled effectively, we developed FLOP using Nextflow [34,35]. We grouped the methods into three processing modules: (I) filtering, (II) normalisation and DE, and (III) functional analysis.…”
Section: Results
confidence: 99%
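The excerpt above describes FLOP's grouping of methods into three processing modules: (I) filtering, (II) normalisation and differential expression (DE), and (III) functional analysis. As a purely illustrative sketch of how such a staged pipeline composes (all function names, scoring rules, and data below are hypothetical; this is not FLOP's actual Nextflow implementation):

```python
# Toy three-module pipeline mirroring the structure described for FLOP.
# Module boundaries, not methods, are the point: each stage consumes the
# previous stage's output, so alternative methods can be swapped per module.

def filtering(counts):
    """Module I (toy rule): drop genes with zero total counts."""
    return {gene: vals for gene, vals in counts.items() if sum(vals) > 0}

def normalise_and_de(counts):
    """Module II (toy rules): library-size normalisation, then a crude
    DE score comparing the second half of samples against the first."""
    totals = [sum(sample) for sample in zip(*counts.values())]
    normed = {g: [v / t for v, t in zip(vals, totals)]
              for g, vals in counts.items()}
    half = len(totals) // 2
    return {g: sum(v[half:]) - sum(v[:half]) for g, v in normed.items()}

def functional_analysis(de_scores, gene_sets):
    """Module III (toy rule): mean DE score per gene set."""
    return {name: sum(de_scores.get(g, 0.0) for g in genes) / len(genes)
            for name, genes in gene_sets.items()}

# Hypothetical counts: genes x samples (two "control", two "treated").
counts = {"A": [5, 3, 10, 12], "B": [0, 0, 0, 0], "C": [2, 1, 8, 9]}
scores = normalise_and_de(filtering(counts))
result = functional_analysis(scores, {"set1": ["A", "C"]})
```

The modular layout is what makes a benchmark extensible in the sense the excerpt argues for: a new normalisation method only needs to respect Module II's input/output contract, not the whole pipeline.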
“…Given the rapidly evolving landscape of single-cell data analysis, it may take some time before it stabilises enough to support the use of a method like FLOP. For instance, benchmarks evaluating alternative normalisation methods for single-cell data are still under active development [35,51].…”
Section: Discussion
confidence: 99%