2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2016.246
Deep Decision Network for Multi-class Image Classification

Cited by 66 publications (52 citation statements) · References 9 publications
“…We simply apply normalization to images using the means and standard deviations of the three channels. For ease of comparison with related works that use NIN [15] as the base model, such as HD-CNN [29], DDN [30], and DualNet [31], we build a D-PCN based on NIN. We follow the setting of NIN in [30,31], and D-PCN is trained without data augmentation.…”
Section: Classification Results on CIFAR-100 (mentioning)
confidence: 99%
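A minimal sketch of the per-channel normalization the passage describes, assuming the training images are available as a float array of shape (N, H, W, 3); the function name and data layout are illustrative, not code from any of the cited papers.

```python
import numpy as np

def normalize_per_channel(train_images, test_images):
    # Per-channel normalization as described above: one mean and one
    # standard deviation per colour channel, computed on the training
    # split only and applied to both splits.
    mean = train_images.mean(axis=(0, 1, 2))   # shape (3,)
    std = train_images.std(axis=(0, 1, 2))     # shape (3,)
    return (train_images - mean) / std, (test_images - mean) / std
```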
“…Table 1 shows performance comparisons between several works. The methods compared directly with D-PCN are [15, 29-31], which are all built on NIN and deploy multiple networks. HD-CNN actually uses cropping and 10-view testing [1] as data augmentation, but it is a representative work using multiple sub-networks, which is why it is listed here.…”
Section: Classification Results on CIFAR-100 (mentioning)
confidence: 99%
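The "cropping and 10-view testing" mentioned for HD-CNN is commonly the AlexNet-style scheme [1]: four corner crops and the centre crop, each with its horizontal flip, with the class scores of all ten views averaged. A hedged sketch, assuming `model` maps a batch of images to per-class scores; the function and argument names are illustrative.

```python
import numpy as np

def ten_view_predict(model, image, crop_size):
    # image: (H, W, 3) array; model(batch) -> (len(batch), num_classes) scores.
    h, w, _ = image.shape
    c = crop_size
    offsets = [(0, 0), (0, w - c), (h - c, 0), (h - c, w - c),
               ((h - c) // 2, (w - c) // 2)]       # 4 corners + centre
    views = []
    for top, left in offsets:
        crop = image[top:top + c, left:left + c]
        views.append(crop)
        views.append(crop[:, ::-1])                # horizontal flip
    return model(np.stack(views)).mean(axis=0)     # average over 10 views
```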
“…However, such multi-way splits are cumbersome to determine and do not improve the performance of decision trees [13]. [28] is another work closely related to ours. It uses a deep neural network to perform a hierarchical partition of the data, as in decision trees, while creating clusters of confusing classes.…”
Section: Hybridization-Based Classification Techniques (mentioning)
confidence: 94%
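One way to form the "clusters of confusing classes" referred to above is to run spectral clustering on a symmetrized confusion matrix from a validation set. The sketch below is an assumption about that step (using scikit-learn), not the exact procedure of [28].

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def cluster_confusing_classes(confusion, n_clusters):
    # confusion[i, j]: validation samples of class i predicted as class j.
    # Classes that are often confused with each other receive a high
    # pairwise affinity and are grouped into the same cluster.
    affinity = confusion + confusion.T
    np.fill_diagonal(affinity, 0)          # ignore correct predictions
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="precomputed").fit_predict(affinity)
    return [np.where(labels == k)[0] for k in range(n_clusters)]
```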
“…CIGN may also be compared to neural network-decision tree hybrids. [20] builds a Deep Decision Network: first classifying the data with a root network, and then using spectral clustering on the misclassified samples. New networks for each class cluster are added iteratively and trained while holding the previous networks fixed.…”
Section: Neural Network - Decision Tree Hybrids (mentioning)
confidence: 99%
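A minimal PyTorch-style sketch of the stage-wise scheme the passage describes: the previously trained root network is frozen and a fresh expert network is trained on each cluster of confusing classes. The names `root`, `make_expert`, and `cluster_loaders`, and the choice to feed the frozen root's output features to each expert, are illustrative assumptions rather than the paper's exact architecture.

```python
import torch
import torch.nn as nn

def train_ddn_stage(root, make_expert, cluster_loaders, epochs=10, lr=1e-3):
    # Freeze the previously trained root network; only the new experts
    # receive gradient updates.
    for p in root.parameters():
        p.requires_grad_(False)
    root.eval()

    experts = []
    for loader in cluster_loaders:          # one data loader per class cluster
        expert = make_expert()              # fresh network for this cluster
        opt = torch.optim.SGD(expert.parameters(), lr=lr, momentum=0.9)
        loss_fn = nn.CrossEntropyLoss()
        for _ in range(epochs):
            for x, y in loader:             # y: labels within the cluster
                with torch.no_grad():
                    feats = root(x)         # shared, frozen representation
                loss = loss_fn(expert(feats), y)
                opt.zero_grad()
                loss.backward()
                opt.step()
        experts.append(expert)
    return experts
```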