Proceedings of the 1st Workshop on Representation Learning for NLP 2016
DOI: 10.18653/v1/w16-1620

Making Sense of Word Embeddings

Abstract: We present a simple yet effective approach for learning word sense embeddings. In contrast to existing techniques, which either directly learn sense representations from corpora or rely on sense inventories from lexical resources, our approach can induce a sense inventory from existing word embeddings via clustering of ego-networks of related words. An integrated WSD mechanism enables labeling of words in context with learned sense vectors, which gives rise to downstream applications. Experiments show that the…
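The pipeline the abstract describes (build the ego-network of a word's nearest neighbours, cluster it, treat each cluster as a sense) can be sketched in a few lines. The Python below is a minimal illustration, not the paper's implementation: `neighbors` and `sim` are placeholders for any embedding backend (e.g. gensim's `KeyedVectors.most_similar` and `.similarity`), and the clustering is a plain Chinese Whispers loop, a common choice for ego-network clustering; the toy similarity table is purely illustrative.

```python
import random
from collections import defaultdict

def ego_network(word, neighbors, sim, topn=10, threshold=0.4):
    """Build the ego-network of `word`: its top-n nearest neighbours
    as nodes (the ego itself is excluded), with edges between pairs
    of neighbours whose similarity exceeds `threshold`."""
    nodes = [w for w, _ in neighbors(word, topn)]
    edges = defaultdict(dict)
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            s = sim(u, v)
            if s >= threshold:
                edges[u][v] = s
                edges[v][u] = s
    return nodes, edges

def chinese_whispers(nodes, edges, iterations=20, seed=0):
    """Cluster the graph: each node repeatedly adopts the label with
    the highest total edge weight among its neighbours. The resulting
    clusters are the induced senses of the ego word."""
    rng = random.Random(seed)
    labels = {n: i for i, n in enumerate(nodes)}
    for _ in range(iterations):
        order = nodes[:]
        rng.shuffle(order)
        for n in order:
            scores = defaultdict(float)
            for m, w in edges[n].items():
                scores[labels[m]] += w
            if scores:
                labels[n] = max(scores, key=scores.get)
    clusters = defaultdict(list)
    for n, lab in labels.items():
        clusters[lab].append(n)
    return list(clusters.values())

# Toy illustration: neighbours of "table" splitting into a furniture
# sense and a data sense. Real usage would plug in an embedding model.
SIM = {frozenset(p): s for p, s in [
    (("desk", "chair"), 0.7), (("desk", "furniture"), 0.6),
    (("chair", "furniture"), 0.6), (("row", "column"), 0.8),
    (("row", "grid"), 0.5), (("column", "grid"), 0.5)]}
toy_neighbors = lambda w, n: [(x, 0.5) for x in
    ("desk", "chair", "furniture", "row", "column", "grid")][:n]
toy_sim = lambda u, v: SIM.get(frozenset((u, v)), 0.0)

nodes, edges = ego_network("table", toy_neighbors, toy_sim)
print(chinese_whispers(nodes, edges))
# -> [['desk', 'chair', 'furniture'], ['row', 'column', 'grid']] (order may vary)
```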


Cited by 95 publications (93 citation statements)
References 27 publications (32 reference statements)
“…contrastive: Il a la rétraction facile mais c'est parce qu'il débute. ('He retracts easily, but that is because he is a beginner.') (Pelevina et al., 2016), which has been shown to perform as well as state-of-the-art unsupervised WSD systems. The method to learn the sense embeddings using SenseGram consists of four steps that we briefly summarise here.…”
Section: Source (mentioning)
confidence: 99%
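The excerpt does not enumerate the four steps, so no attempt is made to reproduce them here; the sketch below only illustrates the pooling idea implied by the abstract: once a cluster of related words has been induced as a sense, a sense vector can be obtained by averaging the members' word vectors, optionally weighted by similarity. Function and parameter names are illustrative assumptions, not SenseGram's API.

```python
import numpy as np

def sense_vector(cluster, vec, weights=None):
    """Pool cluster members' word vectors into one sense vector.
    `vec` maps a word to its embedding; `weights` (optional) can be
    e.g. the members' similarities to the target word."""
    vs = np.stack([vec(w) for w in cluster])           # (n, dim)
    if weights is None:
        return vs.mean(axis=0)                         # unweighted mean
    w = np.asarray(weights, dtype=float)
    return (vs * w[:, None]).sum(axis=0) / w.sum()     # weighted mean

# Toy usage with random 3-d vectors standing in for real embeddings.
rng = np.random.default_rng(0)
toy = {w: rng.normal(size=3) for w in ("desk", "chair", "furniture")}
print(sense_vector(["desk", "chair", "furniture"], toy.__getitem__))
```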
“…These vectors of each instance are then clustered. Multi-prototype extensions of the skip-gram model (Mikolov et al., 2013) that use no predefined sense inventory learn one embedding vector per word sense and are commonly fitted with a disambiguation mechanism (Huang et al., 2012; Tian et al., 2014; Neelakantan et al., 2014; Bartunov et al., 2016; Li and Jurafsky, 2015; Pelevina et al., 2016). Comparisons of AdaGram (Bartunov et al., 2016) to Neelakantan et al. (2014) on three SemEval word sense induction and disambiguation datasets show the advantage of the former.…”
Section: Related Work (mentioning)
confidence: 99%
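The "disambiguation mechanism" such multi-prototype models attach is typically a similarity comparison between a context representation and the candidate sense vectors. A minimal sketch follows; the context-averaging and cosine choices are illustrative assumptions, not any one system's exact mechanism, and the names are hypothetical.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def disambiguate(context_words, sense_vectors, vec):
    """Pick the sense whose vector is most similar to the averaged
    context vector; returns the index of the winning sense."""
    ctx = np.mean([vec(w) for w in context_words], axis=0)
    return int(np.argmax([cosine(ctx, s) for s in sense_vectors]))

# Toy usage: two artificial senses, a context leaning towards sense 0.
senses = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
vecs = {"sit": np.array([0.9, 0.1]), "wooden": np.array([0.8, 0.2])}
print(disambiguate(["sit", "wooden"], senses, vecs.__getitem__))  # 0
```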
“…For this reason, we use AdaGram as a representative of the state-of-the-art word sense embeddings in our experiments. In addition, we compare to SenseGram, an alternative sense-embedding approach by Pelevina et al. (2016). What makes the comparison to the latter method interesting is that this approach is similar to ours, but instead of sparse representations the authors rely on word embeddings, making their approach less interpretable.…”
Section: Related Work (mentioning)
confidence: 99%
“…AdaGram (Bartunov et al., 2016) is a system that learns sense embeddings using a Bayesian extension of the Skip-gram model and provides WSD functionality based on the induced sense inventory. SenseGram (Pelevina et al., 2016) is a system that transforms word embeddings into sense embeddings via graph clustering and uses them for WSD. Other methods to learn sense embeddings have been proposed, but these do not feature open implementations for WSD.…”
Section: Related Work (mentioning)
confidence: 99%