2008
DOI: 10.1007/978-3-540-88693-8_35
Constructing Category Hierarchies for Visual Recognition

Abstract: Class hierarchies are commonly used to reduce the complexity of the classification problem. This is crucial when dealing with a large number of categories. In this work, we evaluate class hierarchies currently constructed for visual recognition. We show that top-down as well as bottom-up approaches, which are commonly used to automatically construct hierarchies, incorporate assumptions about the separability of classes. Those assumptions do not hold for visual recognition of a large number of object …

Cited by 107 publications (96 citation statements)
References 16 publications (27 reference statements)
“…With a few notable exceptions, most works in the literature either focus on developing good classifiers for taxonomy data without learning a metric [4,3,15,16,10,7], or focus on developing good metrics without exploiting the taxonomy structure [22,8,12,9,1,18]. For instance, Hierarchical SVM [4]-an example of the first kind-adapts the basic SVM classifier to learn a linear hyperplane for each category by accumulating contributions from each node along the path from the root to a leaf, but does not yield a metric.…”
Section: Related Work
confidence: 99%
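The path-accumulation idea described in the excerpt above can be illustrated with a short sketch. The following Python snippet is not the authors' implementation of Hierarchical SVM [4]; the taxonomy, node names, and weight vectors are hypothetical placeholders. It only shows how a leaf-class score can be formed by summing per-node weight vectors along the root-to-leaf path.

# Minimal sketch (not the cited method's code) of the path-accumulation idea:
# each taxonomy node holds its own weight vector, and the effective classifier
# for a leaf class is the sum of the weights on the root-to-leaf path.
import numpy as np

D = 4  # illustrative feature dimensionality

# Hypothetical taxonomy given as parent pointers (root has parent None).
parents = {"root": None, "animal": "root", "cat": "animal", "dog": "animal"}

# One weight vector per node; random placeholders stand in for trained SVM weights.
rng = np.random.default_rng(0)
w = {node: rng.standard_normal(D) for node in parents}

def leaf_score(x, leaf):
    """Score a feature vector x for a leaf class by accumulating the node
    weight vectors along the path from that leaf up to the root."""
    total = np.zeros(D)
    node = leaf
    while node is not None:
        total += w[node]
        node = parents[node]
    return float(total @ x)

x = rng.standard_normal(D)
print({leaf: leaf_score(x, leaf) for leaf in ("cat", "dog")})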
“…These local metrics are combined together (Q_t) along the paths of the taxonomy for effective object categorization. Access to this taxonomy structure has been shown to benefit both the accuracy as well as the scalability of learning algorithms [15,16,10].…”
Section: Introduction
confidence: 99%
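As a rough illustration of combining local metrics along taxonomy paths, the sketch below sums per-node Mahalanobis-style distances over the root-to-leaf path. The combination rule, the toy taxonomy, and the random matrices are assumptions for illustration only; the cited work's actual formulation may differ.

# Illustrative sketch only: combine local metrics (Q_t) along a taxonomy path
# into one distance by summing per-node quadratic-form distances.
import numpy as np

D = 3
parents = {"root": None, "vehicle": "root", "car": "vehicle"}  # toy taxonomy

rng = np.random.default_rng(1)

def random_psd(d):
    a = rng.standard_normal((d, d))
    return a @ a.T  # positive semi-definite local metric

Q = {node: random_psd(D) for node in parents}  # one local metric per node

def path_distance(x, y, leaf):
    """Distance between x and y using every local metric on the root-to-leaf path."""
    diff = x - y
    dist, node = 0.0, leaf
    while node is not None:
        dist += diff @ Q[node] @ diff
        node = parents[node]
    return float(dist)

x, y = rng.standard_normal(D), rng.standard_normal(D)
print(path_distance(x, y, "car"))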
“…About the first condition, high-level hierarchical feature representations can be obtained through biologically inspired design [20,26,33] or example-driven discovery which includes information transfer [9,13] and hierarchy learning [10,16,18,19,29]. In our real-world problems (Sect.…”
Section: Discussion
confidence: 99%
“…The motivation is to reduce the complexity of visual recognition problems that have a very large number of instances. To build useful taxonomies, the proposed methods exploit either purely the semantic tag labels [19,29], or purely the visual information [10,18], or both as in [16], where the authors propose a way to learn a "semantivisual" hierarchy that is both semantically meaningful and close to the visual content.…”
Section: Introduction
confidence: 99%
“…An LLC feature with 21K dimensions is used in order to recreate the one-versus-all accuracy reported in [14]. The relaxed hierarchy method [14] is a very strong baseline since it outperforms many other “tree-based” methods [24,16]. Therefore, five hierarchies corresponding to relaxation degrees ρ = {0.5, 0.6, 0.7, 0.8, 0.9} (the larger the value, the less relaxed) are trained for comparison.…”
Section: Caltech-256
confidence: 99%