2023
DOI: 10.1002/cpz1.658

The Current State of Single‐Cell Proteomics Data Analysis

Abstract: Sound data analysis is essential to retrieve meaningful biological information from single‐cell proteomics experiments. This analysis is carried out by computational methods that are assembled into workflows, and their implementations influence the conclusions that can be drawn from the data. In this work, we explore and compare the computational workflows that have been used over the last four years and identify a profound lack of consensus on how to analyze single‐cell proteomics data. We highlight the need …

Cited by 17 publications (40 citation statements)
References 101 publications (152 reference statements)
“…Using software for standardizing workflows across laboratories facilitates reporting. Examples of such workflows include the scp R/Bioconductor package 48,91, the sceptre Python package 9, the SCoPE2 pipeline 16,92, or the Scripts and Pipelines for Proteomics 93. Packages that allow structured and repeatable data processing, including comparing different algorithms for a given processing step, provide further advantages 48,91.…”
Section: Perspective
confidence: 99%
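To make “structured and repeatable data processing” concrete, below is a minimal, package-agnostic Python sketch in which a workflow is an explicit, ordered list of named steps, and any step can be swapped to compare algorithms. It mirrors the idea behind packages such as scp or sceptre but does not reproduce their actual APIs; all function and variable names are illustrative assumptions.

```python
# Sketch: a workflow as an explicit, ordered list of named steps.
# Swapping one entry compares algorithms for that step; the step
# functions below are hypothetical examples, not a real package API.
import numpy as np

def median_normalize(X):
    # Center each cell (row) on its median log-intensity.
    return X - np.nanmedian(X, axis=1, keepdims=True)

def mean_normalize(X):
    # Center each cell (row) on its mean log-intensity.
    return X - np.nanmean(X, axis=1, keepdims=True)

def run_pipeline(X, steps):
    # Apply each named step in order and report it, so the exact
    # sequence of operations is visible and repeatable.
    for name, fn in steps:
        X = fn(X)
        print(f"ran step: {name}")
    return X

# Simulated log-intensity matrix: 20 cells x 100 proteins.
X = np.random.default_rng(1).normal(loc=10.0, scale=2.0, size=(20, 100))

# Evaluate two algorithms for the same processing step by swapping it.
for normalize in (median_normalize, mean_normalize):
    result = run_pipeline(X.copy(), [(normalize.__name__, normalize)])
```

Representing the workflow as data (a list of name/function pairs) rather than ad hoc script code is what makes it reportable: the same list can be serialized alongside the results.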
“…Software platforms that support exporting the commands and parameters used should be strongly preferred, because audit logs and/or parameter files help track and later reproduce the different processing steps, including the software and the versions used at each step.…”
Section: Perspective
confidence: 99%
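As a concrete illustration of exporting parameters for reproducibility, the sketch below writes a JSON parameter/audit file recording package versions and per-step settings. The file name and dictionary layout are assumptions for illustration, not the export format of any particular platform.

```python
# Sketch: write an audit/parameter file so a processing run can be
# tracked and reproduced later. File name and layout are hypothetical.
import json
import platform

import numpy as np
import sklearn

audit = {
    "python": platform.python_version(),
    "numpy": np.__version__,
    "scikit-learn": sklearn.__version__,
    "steps": [
        {"name": "filter_cells", "algorithm": "min_detected_proteins",
         "params": {"threshold": 700}},
        {"name": "normalize", "algorithm": "median_center", "params": {}},
        {"name": "impute", "algorithm": "knn", "params": {"n_neighbors": 5}},
    ],
}

# One such file per run, stored next to the outputs, documents the
# software versions and parameters used at each step.
with open("run_parameters.json", "w") as fh:
    json.dump(audit, fh, indent=2)
```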
“…Finally, imputation methods can easily be transferred and reassessed across fields. For instance, K-nearest neighbor (KNN) imputation is used in a wide range of applications and fields and has been rapidly adopted by the SCP field, which still lacks dedicated models to handle missing values …”
Section: To Impute or Not To Impute?
confidence: 99%
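For reference, here is a minimal sketch of KNN imputation with scikit-learn's KNNImputer on a cells × proteins log-intensity matrix. The data are simulated, and the ~40% missingness is an assumption meant to mimic the sparsity typical of SCP experiments.

```python
# Sketch: KNN imputation of a simulated single-cell proteomics matrix.
import numpy as np
from sklearn.impute import KNNImputer

rng = np.random.default_rng(0)
X = rng.normal(loc=10.0, scale=2.0, size=(50, 200))  # 50 cells x 200 proteins
X[rng.random(X.shape) < 0.4] = np.nan                # inject ~40% missing values

# Each missing entry is filled from the k most similar cells
# (nearest neighbors computed on the mutually observed proteins).
imputer = KNNImputer(n_neighbors=5, weights="distance")
X_imputed = imputer.fit_transform(X)

print(f"missing before: {int(np.isnan(X).sum())}, "
      f"after: {int(np.isnan(X_imputed).sum())}")
```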
“…Although imputation should ideally be avoided, many SCP analyses include imputation in their workflow. The reasons are mainly practical (Table ).…”
Section: To Impute or Not To Impute?
confidence: 99%
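Since the reasons for imputing are practical rather than statistical, one common safeguard is to retain a boolean missingness mask next to the imputed matrix, so downstream steps can still down-weight or exclude imputed values. The sketch below illustrates this pattern; the helper name impute_with_mask and the filtering threshold are illustrative assumptions, not part of any SCP package.

```python
# Sketch: impute for the practical benefit of complete-data methods,
# but keep a missingness mask so downstream analyses can distinguish
# observed from imputed values. Names and thresholds are illustrative.
import numpy as np
from sklearn.impute import KNNImputer

def impute_with_mask(X, n_neighbors=5):
    """Return (imputed matrix, boolean mask of originally missing entries)."""
    was_missing = np.isnan(X)
    X_imputed = KNNImputer(n_neighbors=n_neighbors).fit_transform(X)
    return X_imputed, was_missing

rng = np.random.default_rng(2)
X = rng.normal(loc=10.0, scale=2.0, size=(30, 80))
X[rng.random(X.shape) < 0.3] = np.nan

X_imputed, was_missing = impute_with_mask(X)

# Example downstream use: keep only proteins observed in >=70% of cells,
# so conclusions do not rest on mostly imputed features.
keep = was_missing.mean(axis=0) <= 0.3
X_filtered = X_imputed[:, keep]
```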
“…258 Apart from improvements in microfluidics-based workflows, a standardized computational pipeline for analyzing downstream SCP data is of high interest. 258,259 Current spectral data processing approaches exhibit limited identification rates, relatively time-consuming library construction, and limited throughput, which compromises the extraction of biologically critical information from SCP data. To this end, deep learning-based models might be employed within computational pipelines to boost identification and to generate predicted libraries that complement experimental libraries with the least possible resources.…”
Section: Future Perspectives and Conclusion
confidence: 99%