Proceedings of the 5th Workshop on BioNLP Open Shared Tasks 2019
DOI: 10.18653/v1/d19-5726
UZH@CRAFT-ST: a Sequence-labeling Approach to Concept Recognition

Abstract: As our submission to the CRAFT shared task 2019, we present two neural approaches to concept recognition. We propose two different systems for joint named entity recognition (NER) and normalization (NEN), both of which model the task as a sequence labeling problem. Our first system is a BiLSTM network with two separate outputs for NER and NEN trained from scratch, whereas the second system is an instance of BioBERT fine-tuned on the concept-recognition task. We exploit two strategies for extending concept cove…
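The first system in the abstract, a shared BiLSTM encoder feeding two separate sequence-labeling output layers (one for NER span tags, one for NEN concept identifiers), can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the class name JointNerNen, all layer sizes, and the tag-set sizes are hypothetical.

```python
# Hedged sketch of a joint NER+NEN sequence labeler: one BiLSTM encoder,
# two independent classification heads. All hyperparameters are illustrative.
import torch
import torch.nn as nn

class JointNerNen(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden=256,
                 n_ner_tags=5, n_nen_tags=1000):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden,
                               batch_first=True, bidirectional=True)
        # Two separate outputs over the shared encoder states:
        self.ner_head = nn.Linear(2 * hidden, n_ner_tags)   # e.g. IOB span tags
        self.nen_head = nn.Linear(2 * hidden, n_nen_tags)   # ontology concept labels

    def forward(self, token_ids):                 # token_ids: (batch, seq_len)
        states, _ = self.encoder(self.embed(token_ids))
        return self.ner_head(states), self.nen_head(states)

# Training would typically sum one cross-entropy loss per head, so both
# tasks are learned jointly from the same encoder.
```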

Cited by 10 publications (22 citation statements) | References 33 publications (35 reference statements)

Citation statements, ordered by relevance:
“…Our goal is to explore the performance, efficiency, and underlying reasons for the surprisingly good performance of our machine learning approach toward concept recognition, using the 2019 CRAFT Shared Task framework from the BioNLP-OST task [51,52]. This framework provides all data and an evaluation pipeline facilitating direct comparison to the best-performing system in that evaluation [4]. Per the setup of the CRAFT Shared Task, 67 full-text documents were provided as training with 30 unseen documents held out as an external evaluation set.…”
Section: Methods (citation type: mentioning)
confidence: 99%
“…Basaldella et al [28] proposed a hybrid system named OntoGene's Entity Recognizer (OGER), which focused first on high recall through a dictionary-based entity recognizer, followed by a high-precision machine learning classifier (see [29] for an updated version of this system). Furthermore, the group who developed this system had the highest-performing method in the 2019 CRAFT Shared Task [4] (UZH@CRAFT-ST), combining an updated version of OGER with two neural approaches, thereby tackling concept recognition as a single task instead of two. As we are tackling the same task through the same framework, we use their results as a baseline for the full concept recognition system.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
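The two-stage design attributed to OGER in the statement above (a high-recall dictionary matcher whose candidates are filtered by a high-precision classifier) can be illustrated with a minimal sketch. All function names, the span-length cap, and the trivial placeholder filter below are assumptions for illustration, not OGER's actual code.

```python
# Sketch of a recall-then-precision concept recognizer: a dictionary
# lookup proposes candidate spans, and a second-stage filter keeps
# only the ones a high-precision classifier accepts.

def dictionary_candidates(tokens, term_dict, max_len=5):
    """High-recall stage: yield every span whose surface form is a known term."""
    for i in range(len(tokens)):
        for j in range(i + 1, min(i + max_len, len(tokens)) + 1):
            surface = " ".join(tokens[i:j]).lower()
            if surface in term_dict:
                yield (i, j, term_dict[surface])   # (start, end, concept ID)

def recognize(tokens, term_dict, keep_candidate):
    """High-precision stage: keep only candidates the classifier accepts."""
    return [c for c in dictionary_candidates(tokens, term_dict)
            if keep_candidate(tokens, c)]

# Usage with a toy dictionary and a placeholder filter that accepts all:
terms = {"cell membrane": "GO:0005886"}
tokens = "the cell membrane protein".split()
print(recognize(tokens, terms, lambda toks, cand: True))
# -> [(1, 3, 'GO:0005886')]
```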