Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d16-1018
Context-Dependent Sense Embedding

Abstract: Word embedding has been widely studied and proven helpful in solving many natural language processing tasks. However, the ambiguity of natural language is a persistent problem in learning high-quality word embeddings. A possible solution is sense embedding, which trains an embedding for each sense of a word instead of each word. Some recent work on sense embedding uses context-clustering methods to determine the senses of words, which is heuristic in nature. Other work creates a probabilistic model and performs word se…

Cited by 16 publications (22 citation statements)
References 19 publications
“…There has also been work on learning multiple embeddings per word (Neelakantan et al., 2015; Vu and Parker, 2016), including a lot of work on sense embeddings, where the senses of a word have their own individual embeddings (Iacobacci et al., 2015; Qiu et al., 2016), as well as on how to apply such sense embeddings in downstream NLP tasks (Pilehvar et al., 2017).…”
Section: Related Work
confidence: 99%
“…Ideally, a model should capture the dependency between sense choices in order to address the ambiguity arising from context words. Qiu et al. (2016) addressed this problem by proposing a pure sense-based model. The model also expands the disambiguation context from a small window (as in previous work) to the whole sentence.…”
Section: # Senses
confidence: 99%
“…Recently, Qiu et al. (2016) proposed an EM algorithm to learn purely sense-level representations, but its computational cost is high when decoding the sense-identity sequence, because it takes exponential time to search all sense combinations within a context window. Our modular design addresses this drawback: the sense selection module decodes a sense sequence in linear time, while the sense representation module still learns representations purely at the sense level.…”
Section: Related Work
confidence: 99%
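The complexity contrast in the statement above can be made concrete with a toy sketch. The words, senses, and additive scores below are invented for illustration, not taken from the cited papers; the point is only the count of evaluations: joint search over all sense combinations costs k^n, while picking each word's sense independently costs n·k.

```python
from itertools import product

# Hypothetical context window: each word has 2 candidate senses with a score.
sense_scores = {
    "bank":     {"bank_river": 0.3, "bank_money": 0.7},
    "deposit":  {"deposit_geo": 0.2, "deposit_cash": 0.8},
    "interest": {"interest_hobby": 0.1, "interest_rate": 0.9},
}

def exhaustive_decode(scores):
    """Score every joint sense combination: O(k^n) evaluations."""
    words = list(scores)
    best, best_score, evaluations = None, float("-inf"), 0
    for combo in product(*(scores[w] for w in words)):
        evaluations += 1
        s = sum(scores[w][sense] for w, sense in zip(words, combo))
        if s > best_score:
            best, best_score = dict(zip(words, combo)), s
    return best, evaluations

def independent_decode(scores):
    """Pick each word's best sense independently: O(n*k) lookups."""
    evaluations = sum(len(senses) for senses in scores.values())
    best = {w: max(senses, key=senses.get) for w, senses in scores.items()}
    return best, evaluations

exh, n_exh = exhaustive_decode(sense_scores)   # 2**3 = 8 combinations scored
ind, n_ind = independent_decode(sense_scores)  # 3 * 2 = 6 score lookups
print(n_exh, n_ind, exh == ind)
```

Under a purely additive score the two decoders agree; once the model scores interactions between neighboring senses (as a pure sense-based model does), only the joint search is exact, which is exactly where the exponential cost bites and why a linear-time selection module is attractive.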
“…The sense selection module decides which sense to use given a textual context, whereas the sense representation module learns meaningful representations based on its statistical characteristics. Unlike prior work, which suffers from either inefficient sense selection (Qiu et al., 2016) or coarse-grained representation learning (Neelakantan et al., 2014; Li and Jurafsky, 2015; Bartunov et al., 2016), the proposed modularized framework performs efficient sense selection and learns representations purely at the sense level simultaneously.…”
Section: Proposed Approach: MUSE
confidence: 99%