Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1355
GlossBERT: BERT for Word Sense Disambiguation with Gloss Knowledge

Abstract: Word Sense Disambiguation (WSD) aims to find the exact sense of an ambiguous word in a particular context. Traditional supervised methods rarely take into consideration lexical resources like WordNet, which are widely utilized in knowledge-based methods. Recent studies have shown the effectiveness of incorporating gloss (sense definition) into neural networks for WSD. However, compared with traditional word expert supervised methods, they have not achieved much improvement. In this paper, we focus on how t…
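To make the notion of a gloss concrete, here is a minimal sketch (not from the paper) that retrieves the WordNet sense definitions of an ambiguous word with NLTK; the example word "bank" and the helper name candidate_glosses are illustrative choices, not anything defined in GlossBERT.

```python
# Minimal illustration of gloss knowledge: list the WordNet sense
# definitions (glosses) that a WSD system must choose among.
# Requires: pip install nltk; then nltk.download('wordnet') once.
from nltk.corpus import wordnet as wn

def candidate_glosses(lemma, pos=None):
    """Return (sense name, gloss) pairs for all WordNet senses of `lemma`."""
    return [(s.name(), s.definition()) for s in wn.synsets(lemma, pos=pos)]

# "bank" has several noun senses; WSD must pick the one matching the context.
for name, gloss in candidate_glosses("bank", pos=wn.NOUN):
    print(f"{name}: {gloss}")
```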

Cited by 165 publications (181 citation statements) · References 17 publications
“…EWISE (Extended WSD Incorporating Sense Embeddings) [20] overcomes a bottleneck of existing supervised WSD systems, namely their weak ability to learn low-frequency senses of words, by learning continuous sense embeddings. GlossBERT [16] also takes gloss knowledge into consideration and constructs context-gloss pairs as a more suitable input to BERT. A robust method for generating sense embeddings with full coverage of all WordNet senses is introduced in [21].…”
Section: Related Work (mentioning)
confidence: 99%
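The context-gloss pairing mentioned in this statement can be sketched roughly as below. This is a simplified illustration rather than GlossBERT's exact construction (which, for example, also highlights the target word with weak-supervision markers); the helper name, example sentence, and assumed gold sense "bank.n.01" are illustrative.

```python
# Sketch of context-gloss pair construction in the spirit of GlossBERT:
# the sentence containing the target word is paired with every candidate
# gloss, and the pair whose gloss matches the intended sense is labeled 1.
from nltk.corpus import wordnet as wn

def build_context_gloss_pairs(context, target_lemma, gold_sense=None):
    pairs = []
    for synset in wn.synsets(target_lemma):
        label = 1 if gold_sense is not None and synset.name() == gold_sense else 0
        # Each example is a (context, gloss) sentence pair plus a binary label.
        pairs.append((context, f"{target_lemma}: {synset.definition()}", label))
    return pairs

examples = build_context_gloss_pairs(
    "He sat on the bank of the river and watched the water.",
    "bank",
    gold_sense="bank.n.01",  # assumed gold sense, for illustration only
)
for context, gloss, label in examples:
    print(label, gloss)
```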
“…In [15], the authors explore different strategies for incorporating contextualized word representations into WSD. The work in [16] fine-tunes the pre-trained BERT model for the WSD task. Many other neural methods that use a neural network encoder to extract features have been proposed [17][18][19][20][21].…”
Section: Introduction (mentioning)
confidence: 99%
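Fine-tuning BERT on context-gloss pairs, as referenced in [16], amounts to binary sentence-pair classification. The sketch below uses the Hugging Face transformers API as an assumed stand-in and does not reproduce GlossBERT's actual training configuration or hyperparameters.

```python
# Hedged sketch: one fine-tuning step of BERT on a single (context, gloss)
# pair treated as binary sentence-pair classification.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

context = "He sat on the bank of the river and watched the water."
gloss = "bank: sloping land beside a body of water"

# Encoded as [CLS] context [SEP] gloss [SEP].
inputs = tokenizer(context, gloss, return_tensors="pt", truncation=True)
labels = torch.tensor([1])  # 1 = this gloss matches the intended sense

outputs = model(**inputs, labels=labels)
outputs.loss.backward()  # an optimizer step would follow in a real training loop
```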
“…Regarding LMMS, we also report results after reducing its coverage to the same level as EViLBERT's by discarding embeddings of concepts not covered by the latter. For completeness, we include the results of the current state of the art in WSD among models trained on SemCor, i.e., GlossBERT [Huang et al., 2019].…”
Section: Comparison Systems (mentioning)
confidence: 99%