2022
DOI: 10.48550/arxiv.2208.06955
Preprint

Continuous Active Learning Using Pretrained Transformers

Cited by 2 publications (2 citation statements)
References 0 publications
“…Text classifiers based on BioBERT (Lee et al., 2020) and PubMedBERT (Gu et al., 2020) were also developed. However, neither outperformed the linear logistic regression model on CLEF datasets, a result consistent with previous findings for TAR models (Yang et al., 2022; Sadri and Cormack, 2022).…”
Section: Methods (classified as supporting)
confidence: 90%
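
For context, the excerpt contrasts transformer classifiers with a linear baseline. The sketch below is an illustration, not the cited authors' code: it shows the kind of TF-IDF plus logistic regression relevance classifier commonly used as such a baseline in TAR experiments. The function names and hyperparameters are assumptions.

```python
# Hypothetical sketch of a linear logistic regression relevance baseline
# of the kind the excerpt refers to. Not the cited papers' exact setup.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def train_relevance_baseline(docs, labels):
    """Fit a TF-IDF + logistic regression relevance classifier.

    docs: list of document texts; labels: 0/1 relevance judgments.
    """
    vectorizer = TfidfVectorizer(sublinear_tf=True, min_df=2)
    X = vectorizer.fit_transform(docs)
    clf = LogisticRegression(C=1.0, max_iter=1000)
    clf.fit(X, labels)
    return vectorizer, clf

def score_documents(vectorizer, clf, docs):
    """Return P(relevant) for each document, used to rank the review queue."""
    X = vectorizer.transform(docs)
    return clf.predict_proba(X)[:, 1]
```

A linear model of this kind trains in seconds on CLEF-sized collections, which is part of why it remains a strong reference point when fine-tuned BioBERT or PubMedBERT classifiers are evaluated against it.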
“…A variation of the AL setting that has shown success in certain domain-specific tasks is that of continuous active learning [27, 47, 64], where documents are iteratively retrieved by actively learning for one specific query, typically aiming for total recall [42]. For the task of technology-assisted review (TAR), Yang et al. [63] propose a TAR cost framework; however, this framework focuses on cost modeling for reviewing one specific query.…”
Section: Related Work (classified as mentioning)
confidence: 99%
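
To make the loop described in the excerpt concrete, here is a minimal sketch of one continuous active learning (CAL) round structure aimed at high recall. The TF-IDF/logistic-regression learner, the `review` oracle, the batch size, and the stopping heuristic are all illustrative assumptions, not the procedure of the cited papers.

```python
# Minimal sketch of a continuous active learning (CAL) loop: train on the
# documents reviewed so far, score the remaining pool, and send the
# top-ranked batch for human review, iterating toward total recall.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def cal_loop(pool_docs, seed_idx, seed_labels, review,
             batch_size=25, max_rounds=50):
    vectorizer = TfidfVectorizer(sublinear_tf=True)
    X = vectorizer.fit_transform(pool_docs)
    # The seed must contain both relevant and non-relevant examples.
    labeled = dict(zip(seed_idx, seed_labels))  # doc index -> 0/1 label
    for _ in range(max_rounds):
        idx = np.fromiter(labeled.keys(), dtype=int)
        y = np.fromiter(labeled.values(), dtype=int)
        clf = LogisticRegression(max_iter=1000).fit(X[idx], y)
        scores = clf.predict_proba(X)[:, 1]
        # Relevance feedback: review the highest-scoring unreviewed documents.
        candidates = [i for i in np.argsort(-scores) if i not in labeled]
        batch = candidates[:batch_size]
        if not batch:
            break  # pool exhausted
        for i in batch:
            labeled[i] = review(i)  # human reviewer supplies a 0/1 label
        if sum(labeled[i] for i in batch) == 0:
            break  # crude stopping heuristic: a batch with no relevant docs
    return labeled
```

A run repeatedly surfaces the highest-scoring unreviewed documents until a batch yields nothing relevant; production TAR systems replace this crude heuristic with more principled stopping rules tied to a recall target, which is exactly the cost-versus-recall trade-off the cited TAR cost framework models.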