2012
DOI: 10.1007/s12559-012-9137-4

Supervised Learning and Codebook Optimization for Bag-of-Words Models

Abstract: In this paper, we present a novel approach for supervised codebook learning and optimization for bag-of-words models. This type of model is frequently used in visual recognition tasks like object class recognition or human action recognition. An entity is represented as a histogram of codewords, which are traditionally clustered with unsupervised methods like k-means or random forests and then classified in a supervised way. We propose a new supervised method for joint codebook creation and class learning, whi…
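As a point of reference for the traditional pipeline the abstract contrasts against, here is a minimal sketch (not the paper's supervised method) of unsupervised codebook construction and histogram encoding, assuming scikit-learn's KMeans and made-up descriptor data; the function names and sizes below are hypothetical illustrations:

```python
# Minimal sketch of the traditional (unsupervised) bag-of-words pipeline
# described in the abstract: local descriptors are clustered with k-means,
# and each entity is encoded as a histogram over the learned codewords.
# Illustrative only; this is not the paper's supervised method.
import numpy as np
from sklearn.cluster import KMeans

def build_codebook(descriptors, n_codewords=256, seed=0):
    """Cluster local descriptors (n_samples x dim) into a codebook."""
    kmeans = KMeans(n_clusters=n_codewords, random_state=seed, n_init=10)
    kmeans.fit(descriptors)
    return kmeans

def bow_histogram(kmeans, descriptors):
    """Encode one entity (its set of local descriptors) as a normalized histogram."""
    assignments = kmeans.predict(descriptors)
    hist = np.bincount(assignments, minlength=kmeans.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

# Usage with random stand-in descriptors (e.g. SIFT-like, 128-D):
rng = np.random.default_rng(0)
train_descriptors = rng.normal(size=(5000, 128))
codebook = build_codebook(train_descriptors, n_codewords=64)
one_entity_descriptors = rng.normal(size=(300, 128))
h = bow_histogram(codebook, one_entity_descriptors)  # feature fed to a separate classifier
```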

Cited by 33 publications (9 citation statements) | References 45 publications

“…The methods of the second category tie the classifier and the codebook learning together and rely on the classifier's decisions to optimize the codebook. In [22], [43], and [26], max-margin formulations are used to learn the codebook, while in [16] a multilayer perceptron (MLP) is used to backpropagate the error to the dictionary. In [47], multiple dictionaries with complementary discriminative information are learned.…”
Section: Related Work
mentioning confidence: 99%
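For intuition about this second category, the following is a hedged sketch, not a reproduction of any cited formulation, of how a classifier's loss can backpropagate into a trainable codebook through a differentiable soft-assignment encoding (loosely in the spirit of the MLP-based approach the citing authors attribute to [16]); the class name, layer sizes, and temperature parameter are assumptions made for illustration:

```python
# Sketch: treat the codebook as a trainable parameter and let the classifier's
# loss backpropagate into it. Soft assignment makes the encoding differentiable.
import torch
import torch.nn as nn

class JointBoWClassifier(nn.Module):
    def __init__(self, descriptor_dim=64, n_codewords=32, n_classes=10, temperature=10.0):
        super().__init__()
        self.codebook = nn.Parameter(torch.randn(n_codewords, descriptor_dim))
        self.temperature = temperature
        self.classifier = nn.Sequential(
            nn.Linear(n_codewords, 128), nn.ReLU(), nn.Linear(128, n_classes)
        )

    def forward(self, descriptors):
        # descriptors: (batch, n_local, descriptor_dim)
        # Squared Euclidean distance from each local descriptor to every codeword.
        dists = ((descriptors.unsqueeze(2) - self.codebook) ** 2).sum(dim=-1)
        soft_assign = torch.softmax(-self.temperature * dists, dim=-1)  # (batch, n_local, n_codewords)
        histogram = soft_assign.mean(dim=1)                             # differentiable BoW encoding
        return self.classifier(histogram)

# One optimization step: the same loss updates both the codebook and the classifier.
model = JointBoWClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(4, 100, 64)            # 4 entities, 100 local descriptors each
y = torch.randint(0, 10, (4,))
loss = nn.functional.cross_entropy(model(x), y)
opt.zero_grad(); loss.backward(); opt.step()
```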
“…The Bag-of-Words (BoW) model was first used in natural language processing to represent a document (or sentence) as a histogram of word frequencies, ignoring the order of the words [10]. It was later applied to image classification by Li [11]. This model can learn characteristic scene labels from the training database without human intervention.…”
Section: Bag-of-Words Model
mentioning confidence: 99%
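A tiny illustration of the textual Bag-of-Words representation mentioned above, assuming a toy vocabulary and document; bow_vector is a hypothetical helper, not from any cited work:

```python
# A document becomes a histogram of word counts; word order is discarded.
from collections import Counter

def bow_vector(document, vocabulary):
    """Map a document to a count vector over a fixed vocabulary."""
    counts = Counter(document.lower().split())
    return [counts.get(word, 0) for word in vocabulary]

vocab = ["codebook", "learning", "supervised", "histogram"]
doc = "supervised codebook learning produces a histogram and supervised learning helps"
print(bow_vector(doc, vocab))  # [1, 2, 2, 1] -- the order of the words does not matter
```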
“…We recently introduced a fully supervised approach for jointly learning the dictionary and the prediction model, which makes the method more discriminative [6].…”
Section: Keypoints and (Semi-)Structured Models
mentioning confidence: 99%