2021
DOI: 10.1007/978-3-030-86159-9_37
Graph Representation Learning in Document Wikification

Cited by 5 publications (1 citation statement)
References 38 publications
“…This generally involves defining a similarity measure between a mention and a candidate entity, as well as some criterion that promotes the coherence of entities across multiple mentions (e.g., to ensure that the mention "Tesla" is not linked to the inventor if the rest of the document is about the car manufacturer). While older approaches generally used ad hoc measures of "semantic relatedness" and the like [93], more recent work often uses deep neural models to obtain vector representations of mentions (including their context) and entities [94,95]. Alternatively, some studies emphasize simple and fast-to-compute methods suitable for very large-scale wikification.…”
Section: Computational Linguistics: Extracting Emotions and Smells
Citation type: mentioning
confidence: 99%
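The excerpt above describes the two standard ingredients of a wikification objective: a local similarity between each mention (with its context) and a candidate entity, and a global coherence term that encourages the entities chosen across a document to fit together. The sketch below is a minimal illustration of that scoring idea only, under stated assumptions: the embeddings are random placeholders rather than a trained model, and the names (entity_vecs, score_assignment, the equal weighting of the two terms) are hypothetical, not taken from the cited paper.

```python
# A minimal sketch of the two ingredients the excerpt describes:
# (1) local mention-entity similarity from vector representations, and
# (2) a global coherence term that favours mutually related entities.
# The embeddings are random placeholders, not a trained model, and all
# names here (entity_vecs, score_assignment, ...) are illustrative.
from itertools import product

import numpy as np

rng = np.random.default_rng(0)


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))


# Toy candidate entities with placeholder embeddings.
entity_vecs = {
    "Tesla_(company)": rng.normal(size=64),
    "Nikola_Tesla": rng.normal(size=64),
}


def score_assignment(mention_vecs, assignment):
    """Score one joint assignment: local similarity plus pairwise coherence."""
    local = sum(
        cosine(m_vec, entity_vecs[ent])
        for m_vec, ent in zip(mention_vecs, assignment)
    )
    chosen = [entity_vecs[e] for e in assignment]
    coherence = sum(
        cosine(chosen[i], chosen[j])
        for i in range(len(chosen))
        for j in range(i + 1, len(chosen))
    )
    return local + coherence  # equal weighting of the two terms is arbitrary here


# Two mentions of "Tesla" in one document, each with a placeholder context vector.
mentions = [rng.normal(size=64), rng.normal(size=64)]
candidates = [list(entity_vecs), list(entity_vecs)]

# Brute-force search over joint assignments (tractable only for toy inputs).
best = max(product(*candidates), key=lambda a: score_assignment(mentions, a))
print(best)
```

In practice the brute-force search over joint assignments is replaced by approximate inference (e.g., greedy selection or graph-based methods), since the joint space grows exponentially with the number of mentions in a document.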