2015
DOI: 10.48550/arxiv.1504.06654
Efficient Non-parametric Estimation of Multiple Embeddings per Word in Vector Space

Cited by 16 publications (36 citation statements) · References 0 publications
“…[12]). It would also be interesting to see the performance of this set-based representation obtained with a multi-sense word embedding method, such as [10].…”
Section: Discussion
confidence: 99%
“…Li et al. show that embeddings aware of multiple word senses, which provide a vector for each specific sense, do improve performance on some NLP tasks [14]. To address this issue, some works use local context information and clustering to identify word senses [42][43][44], some resort to external lexical databases for disambiguation [45,15,46,13,47,48], and some combine topic modeling methods with embeddings [49][50][51][52]. We adopt the idea of assigning multiple vectors to each node in the graph to represent different roles, as well as exploiting local graph structure for this purpose.…”
Section: Related Work
confidence: 99%
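The context-clustering approach mentioned in the excerpt above can be sketched minimally as follows. This is a hypothetical illustration, not the paper's implementation: each word occurrence's context is summarized as a vector, and occurrences are assigned to the nearest existing sense cluster, with a new sense spawned when no cluster is similar enough (the non-parametric step). The function names, the similarity `threshold`, and the running-average update rate are all illustrative assumptions.

```python
import math

def cosine(a, b):
    # cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def assign_sense(context_vec, sense_centroids, threshold=0.5):
    """Return the index of the nearest sense cluster for this occurrence,
    creating a new cluster when every existing one is less similar than
    `threshold` (illustrative value)."""
    best, best_sim = None, -1.0
    for i, centroid in enumerate(sense_centroids):
        sim = cosine(context_vec, centroid)
        if sim > best_sim:
            best, best_sim = i, sim
    if best is None or best_sim < threshold:
        # no sufficiently similar sense exists: spawn a new one
        sense_centroids.append(list(context_vec))
        return len(sense_centroids) - 1
    # otherwise nudge the winning centroid toward this context
    c = sense_centroids[best]
    for j, x in enumerate(context_vec):
        c[j] = 0.9 * c[j] + 0.1 * x
    return best

# toy usage: two clearly different contexts yield two senses
senses = []
i1 = assign_sense([1.0, 0.0], senses)  # first context -> sense 0
i2 = assign_sense([0.0, 1.0], senses)  # orthogonal context -> new sense 1
i3 = assign_sense([0.9, 0.1], senses)  # similar to first -> sense 0
```

In a full system the per-sense centroids would be maintained alongside per-sense embedding vectors trained with a skip-gram-style objective; the sketch shows only the sense-assignment step.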
“…Related Work on Multi-Sense Embedding. The idea of multiple aspects is in a way related to the polysemy of words. There have been some studies on inferring multi-sense embeddings of words [2,11,14,26], which aim to infer multiple embedding vectors for each word. However, the two tasks differ significantly in the following perspectives.…”
Section: Supplementary Materials
confidence: 99%