2018
DOI: 10.1613/jair.1.11259

From Word To Sense Embeddings: A Survey on Vector Representations of Meaning

Abstract: Over the past years, distributed semantic representations have proved to be effective and flexible keepers of prior knowledge to be integrated into downstream applications. This survey focuses on the representation of meaning. We start from the theoretical background behind word vector space models and highlight one of their major limitations: the meaning conflation deficiency, which arises from representing a word with all its possible meanings as a single vector. Then, we explain how this deficiency can be a…

Cited by 229 publications (98 citation statements)
References 140 publications
“…Some works may no longer be competitive with the state-of-the-art, but nevertheless remain relevant for the development of sense embeddings. We recommend the recent survey of Camacho-Collados and Pilehvar (2018) for a thorough overview of this topic, and highlight a few of the most relevant methods. Chen et al. (2014) initialize sense embeddings using glosses and adapt the Skip-Gram objective of word2vec to learn and improve sense embeddings jointly with word embeddings.…”
Section: Other Methods With Sense Embeddings
Mentioning, confidence: 99%
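To make the joint learning idea in this excerpt concrete, here is a minimal sketch in the spirit of Chen et al. (2014): sense vectors are initialized as centroids of gloss-word vectors and then updated with a Skip-Gram-style negative-sampling objective. This is not their actual implementation; the toy vocabulary, glosses, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch (NOT Chen et al.'s code): gloss-initialized sense
# embeddings updated jointly with word embeddings, Skip-Gram style.
import numpy as np

rng = np.random.default_rng(0)
DIM = 50

# Toy vocabulary with randomly initialized "pre-trained" word vectors.
vocab = ["bank", "river", "money", "deposit", "water", "flow",
         "institution", "shore"]
word_vecs = {w: rng.normal(scale=0.1, size=DIM) for w in vocab}

# Toy sense inventory: each sense of "bank" comes with a short gloss.
glosses = {
    "bank#finance": ["money", "deposit", "institution"],
    "bank#river":   ["river", "water", "shore"],
}

# Initialize each sense vector as the centroid of its gloss word vectors.
sense_vecs = {s: np.mean([word_vecs[w] for w in g], axis=0)
              for s, g in glosses.items()}

def pick_sense(context):
    """Disambiguate: pick the sense closest to the context centroid."""
    ctx = np.mean([word_vecs[w] for w in context], axis=0)
    return max(sense_vecs,
               key=lambda s: ctx @ sense_vecs[s]
               / (np.linalg.norm(ctx) * np.linalg.norm(sense_vecs[s]) + 1e-9))

def sgd_step(center_vec, context_word, lr=0.05):
    """One positive Skip-Gram update plus one random negative sample."""
    for target, label in [(context_word, 1.0), (rng.choice(vocab), 0.0)]:
        out = word_vecs[target]
        p = 1.0 / (1.0 + np.exp(-center_vec @ out))
        g = lr * (label - p)
        d_center = g * out        # gradient w.r.t. the center (sense) vector
        out += g * center_vec     # update the output (context word) vector
        center_vec += d_center    # update the sense vector in place

# Toy "corpus": disambiguate once, then update the chosen sense vector.
sentence = ["deposit", "money", "bank", "institution"]
sense = pick_sense([w for w in sentence if w != "bank"])
for ctx_word in sentence:
    if ctx_word != "bank":
        sgd_step(sense_vecs[sense], ctx_word)
print("updated sense:", sense)
```

The point of the sketch is that the same negative-sampling update that trains word vectors can be applied to a disambiguated sense vector, so words and senses end up sharing one vector space.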
“…However, by recasting the same word types across different sense-inducing contexts, these representations became insensitive to the different senses of polysemous words. Camacho-Collados and Pilehvar (2018) refer to this issue as the meaning conflation deficiency and explore it more thoroughly in their work.…”
Section: Introduction
Mentioning, confidence: 99%
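A tiny numerical illustration of the meaning conflation deficiency described above, using toy random directions in place of real embeddings: each sense vector sits close to its own topic, while the single conflated word vector is pulled between the two and is only moderately close to either.

```python
# Toy illustration of meaning conflation: one vector for "bank"
# averages its senses instead of representing either one well.
import numpy as np

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

rng = np.random.default_rng(1)
finance = rng.normal(size=50); finance /= np.linalg.norm(finance)
river   = rng.normal(size=50); river   /= np.linalg.norm(river)

bank_finance = finance + 0.05 * rng.normal(size=50)  # sense vector 1
bank_river   = river   + 0.05 * rng.normal(size=50)  # sense vector 2
bank_word    = (bank_finance + bank_river) / 2       # conflated word vector

print("sense-level:", cos(bank_finance, finance), cos(bank_river, river))
print("conflated:  ", cos(bank_word, finance), cos(bank_word, river))
```

Nearest-neighbour evaluations inherit the same bias: the conflated vector's neighbourhood mixes both senses rather than resolving either.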
“…Additionally, various computational methods have been shown to consistently outperform statistical baselines in disambiguating different word senses and meanings from distributional vectors (see Camacho-Collados & Pilehvar, 2018), for example by relying on their respective similarities to multiple different pre-determined word categories, or clusters (Boleda, Padó, & Utt, 2012; Pantel & Lin, 2002). Although the introduction of such pre-determined categories is potentially problematic from a psychological perspective, as actual speakers have to learn these categories from experience in a bottom-up and dynamic way, rather than obtaining them top-down from an outside source, such studies still demonstrate that distributional vectors encode information that can be harvested to disambiguate different senses.…”
Section: Does Each Distributional Vector Represent a Single Fixed Symbol?
Mentioning, confidence: 99%
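As a sketch of the cluster-based disambiguation mentioned in this excerpt, the snippet below groups the context vectors of a word's occurrences so that each cluster acts as an induced sense. Plain k-means on synthetic data stands in here for the cited methods (e.g., clustering by committee); none of this is the cited authors' code.

```python
# Sense induction sketch: cluster the distributional context vectors
# of an ambiguous word's occurrences; each cluster ~ one induced sense.
import numpy as np

rng = np.random.default_rng(2)

# Toy context vectors: half the occurrences near one topic direction,
# half near another (simulating two senses of one word).
topic_a = rng.normal(size=20)
topic_b = rng.normal(size=20)
contexts = np.vstack([topic_a + 0.3 * rng.normal(size=(10, 20)),
                      topic_b + 0.3 * rng.normal(size=(10, 20))])

def kmeans(X, k=2, iters=20):
    cents = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - cents[None]) ** 2).sum(-1), axis=1)
        cents = np.vstack([X[labels == j].mean(0) if np.any(labels == j)
                           else cents[j] for j in range(k)])
    return labels

print(kmeans(contexts))  # occurrences split into two induced senses
```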
“…Another improvement would be to use representation learning instead of manually selecting discriminant features. This could be done with graph embeddings [58], a recent transposition to graphs of the NLP concept of word embeddings [43]. This approach makes it possible to directly learn the most appropriate numeric representation of the character network for the considered classification task.…”
Section: /75
Mentioning, confidence: 99%
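A minimal DeepWalk-style sketch of the graph-embedding idea in this excerpt: random walks over a toy character network are treated as sentences and fed to word2vec (via gensim, assumed available), and the resulting node vectors could then serve as features for a downstream classifier. The network, walk lengths, and dimensions are illustrative assumptions, not those of the cited works.

```python
# DeepWalk-style node embeddings for a toy character network:
# random walks over the graph become "sentences" for word2vec.
import random
from gensim.models import Word2Vec

# Toy character co-occurrence network as an adjacency list.
graph = {
    "alice": ["bob", "carol"],
    "bob":   ["alice", "carol", "dave"],
    "carol": ["alice", "bob"],
    "dave":  ["bob"],
}

def random_walks(graph, num_walks=20, walk_len=8, seed=0):
    """Generate uniform random walks starting from every node."""
    rnd = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for start in graph:
            walk = [start]
            while len(walk) < walk_len:
                walk.append(rnd.choice(graph[walk[-1]]))
            walks.append(walk)
    return walks

# Learn node embeddings from walks exactly as word2vec learns from text.
model = Word2Vec(random_walks(graph), vector_size=32, window=3,
                 min_count=1, sg=1, epochs=10, seed=0)
vec = model.wv["alice"]  # node vector usable as classifier features
print(vec.shape, model.wv.most_similar("alice", topn=2))
```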