Proceedings of the Tenth International Workshop on Health Text Mining and Information Analysis (LOUHI 2019), 2019
DOI: 10.18653/v1/D19-6221

Neural Token Representations and Negation and Speculation Scope Detection in Biomedical and General Domain Text

Abstract: Since the introduction of context-aware token representation techniques such as Embeddings from Language Models (ELMo) and Bidirectional Encoder Representations from Transformers (BERT), there have been numerous reports on improved performance on a variety of natural language tasks. Nevertheless, the degree to which the resulting context-aware representations can encode information about morpho-syntactic properties of the tokens in a sentence remains unclear. In this paper, we investigate the application and i…
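To make the notion of "context-aware token representations" concrete, the following is a minimal sketch of extracting one contextual vector per wordpiece with a pretrained BERT model via the Hugging Face transformers API; the model name and example sentence are illustrative choices, not taken from the paper.

```python
# Minimal sketch (assumed setup, not the authors' code): context-aware token
# representations from a pretrained BERT model. Each wordpiece receives a
# vector that depends on the whole sentence, unlike static word embeddings.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased").eval()

sentence = "The scan showed no evidence of metastasis."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per wordpiece: shape (1, seq_len, 768).
print(outputs.last_hidden_state.shape)
```

A scope-detection model built on top of such representations typically adds a per-token classifier that labels each wordpiece as inside or outside the negation or speculation scope.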

Cited by 9 publications (3 citation statements)
References 17 publications
“…BiLSTM approaches for negation detection have been successful, with Fancellu et al (2017) reporting state-of-the-art results for BioScope (Vincze et al, 2008) abstracts. Sergeeva et al (2019) outperformed the latter using pretrained transformer models.…”
Section: Related Work
mentioning, confidence: 98%
“…Khandelwal and Sawant (2020) employ BERT (Devlin et al, 2019) for negation detection, including negation cue detection, scope detection, and event detection. Sergeeva et al (2019) apply ELMo (Peters et al, 2018) and BERT to negation scope detection and achieve new state-of-the-art results on two negation data sets. Instead of pursuing better results, here we aim to probe how much information about negation has been encoded in hidden states in a negation detection task.…”
Section: Negation in Other Areas of NLP
mentioning, confidence: 99%
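The probing setup described in the statement above can be sketched as a linear classifier trained on frozen BERT hidden states; the sentence, the toy scope labels, and the probe below are illustrative assumptions, not the cited authors' experimental setup.

```python
# Hedged sketch of a probing classifier: a linear model fit on *frozen*
# hidden states to test how much negation-scope information they encode.
import torch
from sklearn.linear_model import LogisticRegression
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased").eval()

sentence = "The scan showed no evidence of disease ."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768), frozen

# Toy in-scope labels derived from the wordpieces themselves, so the label
# vector always aligns with the hidden states (1 = inside negation scope).
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
labels = [1 if t in {"no", "evidence", "of", "disease"} else 0 for t in tokens]

# If a simple linear probe separates in-scope from out-of-scope tokens, the
# frozen representations already encode some scope information.
probe = LogisticRegression(max_iter=1000).fit(hidden.numpy(), labels)
print(probe.score(hidden.numpy(), labels))
```

A real probe would of course be trained and evaluated on held-out annotated data such as BioScope rather than on a single sentence.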
“…Fancellu et al (2016) continued this trend, additionally treating each negation instance separately, and successfully used BiLSTMs (bidirectional Long Short-Term Memory recurrent neural networks; Hochreiter and Schmidhuber, 1997). Recently, Sergeeva et al (2019) used pre-trained transformers (Vaswani et al, 2017), namely BERT (Devlin et al, 2019), to further improve performance, albeit on a derivative of the original dataset (Liu et al, 2018). Using BERT in a two-stage sequence-labelling approach on the original ConanDoyle-neg corpus and other relevant negation corpora, Khandelwal and Sawant (2020) successfully improved previous results by a considerable margin.…”
Section: Related Work
mentioning, confidence: 99%
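For readers unfamiliar with the BiLSTM sequence-labelling formulation referenced above, here is a minimal PyTorch sketch in the spirit of Fancellu et al (2016): each token is embedded together with a cue indicator and tagged as inside or outside the scope. All dimensions and the toy input are assumptions, not the published configuration.

```python
# Illustrative BiLSTM scope tagger (assumed hyperparameters). Each token is
# represented by a word embedding concatenated with a cue-indicator
# embedding, and a bidirectional LSTM scores in-scope vs. out-of-scope.
import torch
import torch.nn as nn

class BiLSTMScopeTagger(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=100, hidden_dim=200):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.cue_emb = nn.Embedding(2, emb_dim)  # 1 marks the negation cue
        self.lstm = nn.LSTM(emb_dim * 2, hidden_dim,
                            bidirectional=True, batch_first=True)
        self.out = nn.Linear(hidden_dim * 2, 2)  # in-scope vs. out-of-scope

    def forward(self, word_ids, cue_ids):
        x = torch.cat([self.word_emb(word_ids), self.cue_emb(cue_ids)], dim=-1)
        h, _ = self.lstm(x)   # (batch, seq_len, 2 * hidden_dim)
        return self.out(h)    # per-token tag scores

# Toy forward pass: one 6-token sentence whose third token is the cue.
tagger = BiLSTMScopeTagger()
words = torch.randint(0, 10000, (1, 6))
cues = torch.tensor([[0, 0, 1, 0, 0, 0]])
print(tagger(words, cues).shape)  # torch.Size([1, 6, 2])
```

Treating each negation instance separately, as the statement notes, means a sentence with two cues is presented twice, once per cue indicator.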