2023
DOI: 10.1093/bib/bbad230

A novel statistical method for decontaminating T-cell receptor sequencing data

Abstract: The T-cell receptor (TCR) repertoire is highly diverse among the population and plays an essential role in initiating multiple immune processes. TCR sequencing (TCR-seq) has been developed to profile the T cell repertoire. Similar to other high-throughput experiments, contamination can happen during several steps of TCR-seq, including sample collection, preparation and sequencing. Such contamination creates artifacts in the data, leading to inaccurate or even biased results. Most existing methods assume ‘clean…

Cited by 2 publications (1 citation statement)
References 22 publications
“…26,28,29 BERT has been successfully used in NLP for learning word vectors based on contextual information, in particular, for text classification and next-sentence prediction. [30][31][32][33][34] Unlike other language models that capture context unidirectionally, BERT was designed as a bidirectional model to analyze sentences in forward and backward direction and predict new words conditioned on all other words in sentences. 26,35 Given that next-sentence prediction was conceptually related to the AS extension task, BERT was chosen as a transformer architecture for R-group prediction.…”
Section: Transformer Variant (mentioning)
confidence: 99%
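
The quoted statement describes BERT's core behaviour: predicting a masked word conditioned on both its left and right context. As an illustration only, the minimal sketch below uses the Hugging Face transformers library and the public bert-base-uncased checkpoint; these are assumptions of this note, not details reported by the cited work.

    # Illustrative only: bidirectional masked-word prediction with BERT,
    # assuming the Hugging Face `transformers` library and the public
    # `bert-base-uncased` checkpoint (not the cited paper's own model).
    from transformers import pipeline

    # The fill-mask pipeline predicts the [MASK] token conditioned on all
    # other words in the sentence, i.e. on both left and right context.
    fill_mask = pipeline("fill-mask", model="bert-base-uncased")

    predictions = fill_mask("The receptor repertoire is highly [MASK] among the population.")
    for p in predictions[:3]:
        print(p["token_str"], round(p["score"], 3))

Each candidate token is scored against the full sentence, which is the bidirectional property the citing authors contrast with left-to-right language models.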