Proceedings of the 5th Workshop on BioNLP Open Shared Tasks 2019
DOI: 10.18653/v1/d19-5708

A Neural Pipeline Approach for the PharmaCoNER Shared Task using Contextual Exhaustive Models

Abstract: We present a neural pipeline approach for the PharmaCoNER shared task on pharmaceutical drugs and chemical entities that performs named entity recognition (NER) and concept indexing (CI), which links recognized entities to concept unique identifiers (CUIs) in a knowledge base. We propose a neural NER model that captures the surrounding semantic information of a given sequence via the forward and backward context of the bidirectional LSTM (Bi-LSTM) output of a target span, using a contextual span representation-based exhaustive approach…
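To make the span-based exhaustive approach concrete, the sketch below is an illustrative assumption, not the authors' released code: it enumerates every span up to a maximum length over Bi-LSTM outputs, represents each span by its in-span states together with the forward context at the span's right boundary and the backward context at its left boundary, and classifies it with a linear layer. All class, parameter, and dimension names are hypothetical.

```python
# Minimal sketch of a contextual, span-based exhaustive NER model.
import torch
import torch.nn as nn

class ExhaustiveSpanNER(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=200,
                 max_span_len=6, num_types=5):  # num_types includes a "no entity" class
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.max_span_len = max_span_len
        # span vector = [mean of in-span states; fwd state at span end;
        #                bwd state at span start]
        self.classifier = nn.Linear(2 * hidden_dim + 2 * hidden_dim, num_types)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        h, _ = self.bilstm(self.embed(token_ids))       # (B, T, 2H)
        H = h.size(-1) // 2
        fwd, bwd = h[..., :H], h[..., H:]               # split directions
        spans, reps = [], []
        T = token_ids.size(1)
        for i in range(T):                              # exhaustive span enumeration
            for j in range(i, min(i + self.max_span_len, T)):
                inside = h[:, i:j + 1].mean(dim=1)      # (B, 2H) span content
                # forward context up to the span end, backward context from its start
                context = torch.cat([fwd[:, j], bwd[:, i]], dim=-1)  # (B, 2H)
                reps.append(torch.cat([inside, context], dim=-1))
                spans.append((i, j))
        logits = self.classifier(torch.stack(reps, dim=1))  # (B, n_spans, C)
        return spans, logits
```

Scoring every candidate span (rather than tagging tokens) lets the model handle nested or overlapping mentions, at the cost of enumerating O(T × max_span_len) spans per sentence.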

Cited by 6 publications (1 citation statement) · References 13 publications
“…The span representation was calculated based on a pre-trained language model and then classified into a corresponding entity type by a linear layer. Following suit, numerous studies have shown that span-based approaches to NER can produce SOTA performance (Zheng et al., 2019; Tan et al., 2020; Xia et al., 2019; Sohrab et al., 2019; Xu et al., 2021; Fu et al., 2021; Li et al., 2021; Yu et al., 2022). Fu et al. (2021) designed SpanNER, which learns the representation of a span from its token representations and a span-length embedding.…”
mentioning
confidence: 99%
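The citation statement describes span classification on top of pre-trained language-model token representations, with SpanNER additionally using a span-length embedding. A hedged sketch of that scheme follows; the class name, pooling choice, and dimensions are assumptions for illustration, not the cited papers' code.

```python
# Sketch of SpanNER-style span classification (after Fu et al., 2021):
# pooled token representations plus a span-length embedding, then a
# linear layer over entity types.
import torch
import torch.nn as nn

class SpanClassifier(nn.Module):
    def __init__(self, token_dim=768, max_len=10, num_types=5):
        super().__init__()
        self.len_embed = nn.Embedding(max_len + 1, 50)   # span-length feature
        self.linear = nn.Linear(token_dim + 50, num_types)

    def forward(self, token_reps, start, end):
        # token_reps: (batch, seq_len, token_dim), e.g. from a pre-trained LM
        span = token_reps[:, start:end + 1].mean(dim=1)  # pool in-span tokens
        length = torch.full((token_reps.size(0),), end - start + 1,
                            dtype=torch.long, device=token_reps.device)
        feats = torch.cat([span, self.len_embed(length)], dim=-1)
        return self.linear(feats)                        # entity-type logits
```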