Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning 2016
DOI: 10.18653/v1/k16-1025
Joint Learning of the Embedding of Words and Entities for Named Entity Disambiguation

Abstract: Named Entity Disambiguation (NED) refers to the task of resolving multiple named entity mentions in a document to their correct references in a knowledge base (KB) (e.g., Wikipedia). In this paper, we propose a novel embedding method specifically designed for NED. The proposed method jointly maps words and entities into the same continuous vector space. We extend the skip-gram model by using two models: the KB graph model learns the relatedness of entities using the link structure of the KB, whereas the anchor…

Cited by 297 publications (472 citation statements)
References 33 publications
“…There are two key differences between all of this closely related previous work [3,4,14,25] and ours. First, unlike these past approaches, we tackle the readability of word senses and the polysemy problem by learning document representations that leverage semantics inventoried in both text corpora and knowledge resources, through fine-grained elements including words and concepts in a joint learning process.…”
Section: Neural Approaches Empowered by Knowledge Resources For… (contrasting)
confidence: 65%
“…To tackle the above problems, neural approaches have investigated the joint use of corpus-based word distributions and knowledge resources to achieve more accurate text representations [6,13,14,25].…”
Section: Neural Approaches Empowered by Knowledge Resources For… (mentioning)
confidence: 99%
“…They further extended the model by integrating category structure to capture meaningful semantic relationships between entities and categories. Yamada et al. (2016) learned joint embeddings of words and entities. Tsai and Roth (2016) proposed a way to learn multilingual embeddings of words and entities.…”
Section: Entity Embedding (mentioning)
confidence: 99%