2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
DOI: 10.1109/cvprw50498.2020.00426
Hierarchical Image Classification using Entailment Cone Embeddings

Cited by 32 publications (21 citation statements)
References 9 publications
“…The CIFAR‐10 dataset [23] contains 10 classes of 32 × 32 RGB images. The 60,000 images are divided into training and test sets of 50,000 and 10,000 images, respectively.…”
Section: Experimental Results and Analysis
confidence: 99%
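For reference, the 50,000/10,000 split described in this excerpt is the standard CIFAR-10 partition that torchvision returns out of the box. A minimal sketch, assuming torchvision is installed; the root path and the bare ToTensor transform are illustrative choices, not from the cited paper:

```python
# Minimal sketch: loading the standard CIFAR-10 train/test split
# (50,000 / 10,000 images) with torchvision. The root path and
# transform are illustrative assumptions.
import torchvision
import torchvision.transforms as T

transform = T.ToTensor()  # 32x32 RGB images -> float tensors in [0, 1]

train_set = torchvision.datasets.CIFAR10(
    root="./data", train=True, download=True, transform=transform)
test_set = torchvision.datasets.CIFAR10(
    root="./data", train=False, download=True, transform=transform)

print(len(train_set), len(test_set))  # 50000 10000
```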
“…In [9], various label relations (e.g., mutual exclusion and subsumption) were encoded in a conditional random field to capture pairwise correlations between classes. More recently, Bertinetto et al. [12] proposed a hierarchical loss that can be viewed as a weighted version of the plain cross-entropy loss, with weights defined according to the depth of the label tree. Our LHT model introduces a novel loss that combines the traditional cross-entropy loss with the proposed confusion loss.…”
Section: Related Work
confidence: 99%
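The depth-weighted hierarchical loss this excerpt attributes to Bertinetto et al. [12] can be sketched as one cross-entropy term per level of the label tree, weighted by a function that decays with depth. The per-level logits, the helper name, and the exponential decay below are assumptions for illustration, not the authors' exact implementation:

```python
# Hedged sketch of a depth-weighted hierarchical cross-entropy in the
# spirit of Bertinetto et al. [12]: one cross-entropy term per level of
# the label tree, weighted by exp(-alpha * depth). Per-level logits and
# the decay form are illustrative assumptions.
import math
import torch
import torch.nn.functional as F

def hierarchical_ce(level_logits, level_targets, alpha=0.5):
    """level_logits[d]: (batch, n_classes_at_depth_d) logits for depth d.
    level_targets[d]: (batch,) class indices at depth d."""
    loss = 0.0
    for depth, (logits, target) in enumerate(zip(level_logits, level_targets)):
        weight = math.exp(-alpha * depth)  # deeper levels weigh less
        loss = loss + weight * F.cross_entropy(logits, target)
    return loss
```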
“…In hierarchical classification, Bertinetto et al. [3] proposed a soft-label-based method that softens the one-hot labels according to the metric defined by the label tree. Dhall et al. [12] employed entailment cones to learn order-preserving embeddings, which can be realized in both Euclidean and hyperbolic geometry and enforce transitivity among class hierarchies. Although label embedding is a potential strategy for capturing correlations across class hierarchies, in this paper we attempt to develop a unified deep classification framework for hierarchical classification.…”
Section: Related Work
confidence: 99%
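An entailment-cone embedding of the kind this excerpt describes penalizes a child embedding that falls outside the cone anchored at its parent. A minimal sketch of the Euclidean variant (in the style of Ganea et al.'s entailment cones, which Dhall et al. [12] build on) follows; the constant K, the epsilon clamping, and the function name are assumptions, and the paper's hyperbolic formulation differs:

```python
# Hedged sketch of a Euclidean entailment-cone energy: E(u, v) is zero
# when child embedding v lies inside the cone with apex at parent u,
# and grows with the angular violation otherwise. Aperture
# psi(u) = arcsin(K / ||u||); Xi(u, v) is the angle of v at apex u.
import torch

def cone_energy(u, v, K=0.1, eps=1e-7):
    """u, v: (batch, dim) parent/child embeddings, assumed ||u|| > K."""
    norm_u = u.norm(dim=-1)
    norm_v = v.norm(dim=-1)
    norm_uv = (u - v).norm(dim=-1)
    # Angle at u between v and the ray from the origin through u.
    cos_xi = (norm_v ** 2 - norm_u ** 2 - norm_uv ** 2) / (
        2 * norm_u * norm_uv + eps)
    xi = torch.acos(cos_xi.clamp(-1 + eps, 1 - eps))
    psi = torch.asin((K / norm_u).clamp(max=1 - eps))  # cone half-aperture
    return torch.clamp(xi - psi, min=0)  # zero iff v lies inside the cone
```

Summing this energy over all parent-child pairs in the label tree (with a margin-based negative term for non-ancestor pairs) is the usual way such a penalty is turned into a training loss.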
“…The success of our proposed framework rests on two key observations from existing long-tailed classification work. First, incorporating the hierarchy into the model helps improve generalization, especially for minority classes, by leveraging shared features among hierarchically related classes [7]. Second, a model trained on the original distribution obtains a better representation (i.e., the convolutional layers), while a model trained on a re-balanced distribution yields a fairer, more unbiased classification across all categories [40,22].…”
Section: Co-occurrence
confidence: 99%
“…Also, given a sample from an unknown class, a trained model may be unable to handle it, e.g., when a hypertension fundus image is given to a DR classification model. Such approaches assume only mutually exclusive, unstructured labels [7].…”
Section: Pre-training With Hierarchical Information
confidence: 99%