Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019) 2019
DOI: 10.18653/v1/w19-4319

Fine-Grained Entity Typing in Hyperbolic Space

Abstract: How can we represent hierarchical information present in large type inventories for entity typing? We study the ability of hyperbolic embeddings to capture hierarchical relations between mentions in context and their target types in a shared vector space. We evaluate on two datasets and investigate two different techniques for creating a large hierarchical entity type inventory: from an expert-generated ontology and by automatically mining type co-occurrences. We find that the hyperbolic model yields improveme…


citations
Cited by 31 publications
(33 citation statements)
references
References 25 publications
0
29
0
Order By: Relevance
“…Xiong et al. [33] encoded both global label co-occurrence statistics and word-level similarities using a graph-enhanced model equipped with an attention-based matching module. Unlike Xiong et al. [33], who operate under the Euclidean assumption, López et al. [34] imposed a hyperbolic geometry to enrich the hierarchical information and applied a self-attentive encoder to obtain the context representation. Furthermore, Lin et al. [19] developed a hybrid model that incorporates a latent type representation in addition to binary relevance, capturing interdependencies between entity types.…”
Section: Attention-based Methods
confidence: 99%
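The hyperbolic geometry referenced in the statement above replaces Euclidean distance with the geodesic distance of the Poincaré ball, under which points near the boundary grow exponentially far apart — the property that lets the ball embed tree-like type hierarchies with low distortion. A minimal sketch of that distance (illustrative only, not code from the cited papers):

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the open unit Poincare ball."""
    sq_dist = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq_dist / max(denom, eps))

# A point near the boundary is much farther from the origin than its
# Euclidean norm suggests: d(0, x) = 2 * artanh(||x||).
origin = np.zeros(2)
leaf = np.array([0.9, 0.0])
print(poincare_distance(origin, leaf))
```

For a point at Euclidean norm 0.9 the hyperbolic distance from the origin is already about 2.94, illustrating the exponential stretching near the boundary.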
“…, c_L} need to be projected into the hyperbolic space. We exploit the re-parameterization technique (Dhingra et al., 2018; López et al., 2019) to implement it, which involves computing a direction vector r and a norm magnitude η. We use c_i as an example to illustrate the procedure:…”
Section: Hyperbolic Document Projector
confidence: 99%
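The re-parameterization described in the quote factors a Euclidean vector into a unit direction r and a magnitude η squashed into (0, 1), so that η · r is always a valid point of the open unit ball. The sketch below is an illustrative assumption of that general idea: the scalar projection `w_norm` stands in for the learned feed-forward layers used in the cited papers.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def reparameterize_to_ball(c, w_norm, eps=1e-9):
    """Map a Euclidean vector c into the open unit (Poincare) ball.

    Direction r: the normalized input vector.
    Magnitude eta: squashed into (0, 1) with a sigmoid, guaranteeing
    that eta * r lies strictly inside the ball. `w_norm` is a
    hypothetical learnable projection standing in for the small
    networks used in Dhingra et al. (2018) and Lopez et al. (2019).
    """
    r = c / (np.linalg.norm(c) + eps)      # direction vector r
    eta = sigmoid(np.dot(w_norm, c))       # norm magnitude eta in (0, 1)
    return eta * r

rng = np.random.default_rng(0)
c = rng.normal(size=4)
w = rng.normal(size=4)
p = reparameterize_to_ball(c, w)
assert np.linalg.norm(p) < 1.0  # the output always stays inside the ball
```

Because the sigmoid never reaches 1, the projection needs no explicit clipping to keep embeddings inside the ball during optimization.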
“…Recent advances in typing NE in English have harnessed the power of contextualized word embeddings (Peters et al., 2018; Conneau et al., 2020) to encode entities and their context. These approaches use the AIDA, BNN, OntoNotes and FIGER ontologies, which come with their own human-annotated data sets (López et al., 2019). By choosing to use the model of , we build upon their strengths to enable GE typing in German.…”
Section: Related Work
confidence: 99%