2019 15th International Conference on eScience (eScience) 2019
DOI: 10.1109/escience.2019.00021

Active Learning Yields Better Training Data for Scientific Named Entity Recognition

Cited by 11 publications (11 citation statements)
References 36 publications
“…There is no feedback between the data annotation phase and the model training phase. Active learning breaks this boundary by actively selecting the samples that are sent to annotators for labeling. Compared with supervised learning, active learning can greatly reduce the number of training examples needed for training.…”
Section: Constructing the Benchmark Data Set
confidence: 93%
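The citation statement above describes pool-based active learning: the model repeatedly selects the samples it is least certain about and sends only those to annotators. A minimal sketch of that loop, with stand-in `train` and `uncertainty` functions (both hypothetical placeholders, not from the cited paper):

```python
def train(labeled):
    """Stand-in for model training; returns the labeled set as the 'model'."""
    return list(labeled)

def uncertainty(model, sample):
    """Stand-in uncertainty score: distance from the mean of labeled inputs."""
    if not model:
        return 0.0
    mean = sum(x for x, _ in model) / len(model)
    return abs(sample - mean)

def active_learning(pool, oracle, seed, rounds=3, batch=2):
    """Pool-based active learning: each round, retrain, rank the unlabeled
    pool by uncertainty, and query the annotator (oracle) for the top batch."""
    labeled = [(x, oracle(x)) for x in seed]
    unlabeled = [x for x in pool if x not in seed]
    for _ in range(rounds):
        model = train(labeled)
        unlabeled.sort(key=lambda x: uncertainty(model, x), reverse=True)
        queried, unlabeled = unlabeled[:batch], unlabeled[batch:]
        labeled.extend((x, oracle(x)) for x in queried)
    return labeled
```

The annotation budget is `rounds * batch` plus the seed, typically far smaller than labeling the whole pool, which is the saving the statement refers to.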
“…69 Researchers have also pursued active learning with maximum-entropy uncertainty sampling to obtain valuable annotations from experts and improve performance, but this proved time-intensive. 72 Roles for hybrid systems also include establishing dictionaries for stop words and rules to detect systematic names.…”
Section: Named Entity Recognition (NER)
confidence: 99%
“…Recent advances in natural language processing have produced increased interest in active learning to alleviate the requirement for large annotated corpora (Olsson, 2009; Tchoua et al., 2019). Settles and Craven (2008) compare several strategies for active learning in sequence labelling scenarios, concluding that query strategies based on measures of sequence entropy combined with weighted sampling outperform other variants.…”
Section: Related Work
confidence: 99%
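For sequence labelling tasks such as NER, the sequence-entropy measures referenced above extend token-level uncertainty to whole sentences. One common variant is total token entropy: sum the entropy of each token's predicted label distribution, so longer or more ambiguous sentences score higher. A hedged sketch (a simplified stand-in, not Settles and Craven's exact formulation):

```python
import math

def token_entropy(dist):
    """Shannon entropy of one token's predicted label distribution."""
    return -sum(p * math.log(p) for p in dist if p > 0)

def total_sequence_entropy(token_dists):
    """Total token entropy of a sequence: the sum of per-token entropies,
    a simple sequence-level uncertainty measure for query selection."""
    return sum(token_entropy(d) for d in token_dists)
```

In the weighted-sampling variant the statement mentions, these scores would be used as sampling weights rather than taking the top-scoring sequences deterministically.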