2014
DOI: 10.1007/s11263-014-0784-7

Dictionary Learning for Fast Classification Based on Soft-thresholding

Abstract: Classifiers based on sparse representations have recently been shown to provide excellent results in many visual recognition and classification tasks. However, the high cost of computing sparse representations at test time is a major obstacle that limits the applicability of these methods in large-scale problems, or in scenarios where computational power is restricted. We consider in this paper a simple yet efficient alternative to sparse coding for feature extraction. We study a classification scheme that app…
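Since the abstract is truncated here, a minimal sketch may help illustrate the kind of feature-extraction scheme it refers to: encoding a test descriptor with a single soft-thresholding step over a learned dictionary, followed by a linear classifier. Every name, shape, and value in this snippet (including the scalar threshold alpha and the random classifier) is an illustrative assumption for this sketch, not code or parameters from the paper.

```python
import numpy as np

def soft_threshold_features(X, D, alpha):
    """Encode rows of X (n_samples x d) with one non-negative
    soft-thresholding step over a dictionary D (d x m):
    z = max(0, x . D - alpha), applied elementwise."""
    return np.maximum(0.0, X @ D - alpha)

# Illustrative usage with random data (all values are assumptions).
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 64))          # 5 test descriptors, 64-dim
D = rng.standard_normal((64, 256))        # dictionary of 256 atoms
D /= np.linalg.norm(D, axis=0)            # unit-norm atoms
alpha = 0.5                               # scalar threshold
Z = soft_threshold_features(X, D, alpha)  # (5, 256) non-negative codes
w = rng.standard_normal(256)              # hypothetical linear classifier
scores = Z @ w                            # one score per test sample
print(Z.shape, scores.shape)
```

The appeal of this kind of encoder is that test-time cost is a single matrix product and an elementwise threshold, in contrast to solving a sparse-coding optimization problem per sample.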

Cited by 33 publications (42 citation statements)
References 25 publications (43 reference statements)
“…Table 4. Performance comparison on the CIFAR-10 dataset (accuracy): Fawzi et al. [7] 53.44%; Coates et al. [4] 79.6%; Coates et al. [5] 81.5%; proposed method 83.03%.…”
Section: Methods
Mentioning confidence: 99%
“…Table 4 shows some results on the CIFAR-10 dataset without data augmentation. Fawzi et al. [7] used a single-layer dictionary learning method for comparison. Coates et al. [4], [5] used unsupervised multi-layer sparse coding with large dictionaries (up to 4k atoms).…”
Section: Methods
Mentioning confidence: 99%
“…Moreover, Sep-DNAOL enables us to produce a MAP solution of the features with task-adapted regularization in a latent way, while the features in DPL are intermediate latent variables obtained with maximum likelihood estimation (MLE), so the inconsistency issue will still appear in their framework. We will also compare NonSep-DNAOL with fast soft-thresholding-based dictionary learning (ST-DL) [54]. Intuitively speaking, apart from their distinct motivations, ST-DL can also be viewed as a special instance of NonSep-DNAOL, but the classification loss functions in the two frameworks are different.…”
Section: Framework Comparison
Mentioning confidence: 99%
“…Among a few fast sparse coding approximations, the simplest choice is arguably the thresholded feature [9], [13], [18], [34], [44]:z…”
Section: Introduction
Mentioning confidence: 99%
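For context, one common form of such a thresholded feature (a generic formulation with dictionary D, input descriptor x, and threshold α; the exact expression following the colon in the quote above is cut off and may differ) is:

$$\tilde{z} = \max\left(0,\ D^\top x - \alpha\right)$$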