2020
DOI: 10.1609/aaai.v34i05.6247
Hyperbolic Interaction Model for Hierarchical Multi-Label Classification

Abstract: Different from traditional classification tasks, which assume mutual exclusion of labels, hierarchical multi-label classification (HMLC) assigns multiple labels to each instance, with the labels organized under hierarchical relations. Beyond the labels themselves, since linguistic ontologies are intrinsically hierarchical, the conceptual relations between words can also form hierarchical structures. It is therefore a challenge to learn mappings from word hierarchies to label hierarchies. We propose to model the word…
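The abstract's core idea of placing both text and labels in hyperbolic space and scoring labels by proximity can be illustrated with a minimal sketch. The distance function below is the standard Poincaré-ball geodesic distance; the toy embeddings and label names are hypothetical illustrations, not the paper's learned representations.

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the Poincare ball."""
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq / (denom + eps))

# Toy label embeddings: a parent label typically sits closer to the
# origin than its children, mirroring the hierarchy's depth.
root = np.array([0.10, 0.00])
child = np.array([0.50, 0.30])
text = np.array([0.45, 0.35])  # hypothetical text representation

# Score each label by negative hyperbolic distance to the text embedding;
# the closer label wins.
scores = {name: -poincare_distance(text, emb)
          for name, emb in [("root", root), ("child", child)]}
```

Here the text embedding lands nearer the child label, so the interaction-by-distance view would prefer the more specific label, which is the behavior a hierarchy-aware model wants.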


Cited by 45 publications (26 citation statements)
References 20 publications
“…For modeling label dependencies, Zhou et al (2020) formulate the hierarchy as a directed graph and introduce hierarchy-aware structure encoders. Cao et al (2020) and Chen et al (2020a) exploit the hyperbolic representation for labels by encoding the taxonomic hierarchy.…”
Section: Visualizations Of Concepts Sharing
confidence: 99%
“…We test the proposed model not on a synthetic dataset but on concrete downstream tasks, such as entity typing. Our work resembles López et al (2019) and Chen et al (2019), though they learn embeddings for type labels and text representations in hyperbolic space separately, whereas we do it in an integrated fashion.…”
Section: Model
confidence: 99%
“…However, their prototypical network uses the Einstein midpoint rather than the Karcher mean we use in Section 3.3. In (Chen et al, 2019) the authors embed the labels and data separately, then predict hierarchical class membership using an interaction model. Our model directly links embedding distances to model predictions, and thus learns an embedded space that is more amenable to low-resource, dynamic classification tasks.…”
Section: Related Work
confidence: 99%
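The contrast drawn above between the Einstein midpoint and the Karcher mean can be made concrete with a short sketch of the former: in the Klein model, the Einstein midpoint is a closed-form weighted average in which each point is weighted by its Lorentz factor, so points nearer the boundary pull harder. This is a generic illustration of the formula, not either paper's implementation.

```python
import numpy as np

def einstein_midpoint(points):
    """Einstein midpoint of points in the Klein model of hyperbolic space.

    Each point x is weighted by its Lorentz factor
    gamma(x) = 1 / sqrt(1 - ||x||^2), then the weighted average is
    renormalized by the sum of the factors.
    """
    pts = np.asarray(points, dtype=float)
    gammas = 1.0 / np.sqrt(1.0 - np.sum(pts ** 2, axis=1))
    return (gammas[:, None] * pts).sum(axis=0) / gammas.sum()

# Symmetric points cancel: the midpoint lands at the origin.
mid = einstein_midpoint([[0.5, 0.0], [-0.5, 0.0]])
```

Unlike the Karcher (Fréchet) mean, which is defined as the iterative minimizer of summed squared geodesic distances, this midpoint needs no optimization loop, which is why it is attractive inside a prototypical network's forward pass.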