2009 IEEE Conference on Computer Vision and Pattern Recognition
DOI: 10.1109/cvprw.2009.5206627

Multi-class active learning for image classification

Abstract: One of the principal bottlenecks in applying learning techniques to classification problems is the large amount of labeled training data required. Especially for images and video, providing training data is very expensive in terms of human time and effort. In this paper we propose an active learning approach to tackle the problem. Instead of passively accepting random training examples, the active learning algorithm iteratively selects unlabeled examples for the user to label, so that human effort is focused o…
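The truncated abstract describes the standard pool-based active learning protocol: train, score the unlabeled pool, query the user, retrain. As a rough illustration only, here is a minimal Python sketch of such a loop; the function names are hypothetical and the least-confident uncertainty score is a generic placeholder, not the selection measure proposed in the paper.

```python
import numpy as np
from sklearn.svm import SVC

def active_learning_loop(X_pool, y_pool_oracle, X_init, y_init,
                         n_rounds=10, batch_size=5):
    """Generic pool-based active learning sketch (illustrative only).

    Each round: fit a probabilistic multi-class classifier, score the
    remaining pool by uncertainty, and ask the oracle (here, the held-back
    labels y_pool_oracle) to label the most uncertain examples.
    """
    X_lab, y_lab = X_init.copy(), y_init.copy()
    pool_idx = np.arange(len(X_pool))

    for _ in range(n_rounds):
        clf = SVC(kernel="rbf", probability=True).fit(X_lab, y_lab)
        probs = clf.predict_proba(X_pool[pool_idx])      # (n_pool, n_classes)
        uncertainty = 1.0 - probs.max(axis=1)            # least-confident score
        query = pool_idx[np.argsort(-uncertainty)[:batch_size]]

        # Simulate user labeling with the held-back oracle labels.
        X_lab = np.vstack([X_lab, X_pool[query]])
        y_lab = np.concatenate([y_lab, y_pool_oracle[query]])
        pool_idx = np.setdiff1d(pool_idx, query)

    return clf, X_lab, y_lab
```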

Cited by 112 publications (113 citation statements); References 6 publications
“…For multi-class problems, a concept referred to as "classifiers in contention" (the classifiers most likely to be affected by choosing an example for active learning) is introduced in [15]. This concept can also be employed for forming an interference measure: if two examples are likely to affect two different classifiers, they likely carry different information and are not redundant.…”
Section: Classifiers in Contention (mentioning)
Confidence: 99%
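The "classifiers in contention" idea lends itself to a compact sketch. The Python below is hypothetical and not taken from [15] or the citing paper: the contended set of an example is approximated by the one-vs-one classifiers among its top_k most probable classes, and the interference between two examples is the overlap of their contended sets; both the top-k heuristic and the names are assumptions.

```python
import numpy as np

def classifiers_in_contention(class_probs, top_k=2):
    """Approximate the binary one-vs-one classifiers most likely to be
    affected by labeling this example: all class pairs drawn from its
    top_k most probable classes (an illustrative heuristic)."""
    top = np.argsort(-class_probs)[:top_k]
    return {tuple(sorted((a, b))) for i, a in enumerate(top) for b in top[i + 1:]}

def interference(probs_i, probs_j, top_k=2):
    """Overlap-based redundancy: 0 means the two examples contend over
    disjoint classifier sets (they carry different information); higher
    values mean they would mostly update the same binary classifiers."""
    ci = classifiers_in_contention(probs_i, top_k)
    cj = classifiers_in_contention(probs_j, top_k)
    return len(ci & cj) / max(len(ci | cj), 1)
```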
“…We employ the 2-step approach proposed by Joshi et al. [15]. In the first step, binary probability values of class membership are estimated for all the binary subproblems.…”
Section: Probability Estimates (mentioning)
Confidence: 99%
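A hedged sketch of such a two-step estimate, assuming one-vs-one SVMs with Platt-scaled outputs in step 1 and a simple averaging-style coupling in step 2; the coupling shown is a stand-in for the exact pairwise-coupling method, which the excerpt does not specify.

```python
import numpy as np
from itertools import combinations
from sklearn.svm import SVC

def two_step_multiclass_probs(X_train, y_train, X_query):
    """Step 1: fit one binary probabilistic classifier per class pair.
    Step 2: combine the pairwise estimates into per-class probabilities
    with a simple normalized averaging scheme (illustrative coupling)."""
    classes = np.unique(y_train)
    n_classes = len(classes)
    pairwise = {}  # (i, j) -> P(class i | x, class in {i, j}) per query point

    # Step 1: binary probability estimates for every binary subproblem.
    for i, j in combinations(range(n_classes), 2):
        mask = np.isin(y_train, [classes[i], classes[j]])
        clf = SVC(kernel="rbf", probability=True).fit(X_train[mask], y_train[mask])
        col_i = list(clf.classes_).index(classes[i])
        pairwise[(i, j)] = clf.predict_proba(X_query)[:, col_i]

    # Step 2: couple the pairwise estimates into multi-class probabilities.
    probs = np.zeros((len(X_query), n_classes))
    for (i, j), p_ij in pairwise.items():
        probs[:, i] += p_ij
        probs[:, j] += 1.0 - p_ij
    return probs / probs.sum(axis=1, keepdims=True)
```

The resulting per-class probabilities can then feed any uncertainty measure (for example, entropy over the coupled distribution) when ranking unlabeled examples.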
“…P(label = ML) and P(label = CL) are approximated as the proportion of pairs in same and different clusters. Similar in spirit to active learning for training classifiers [36], we select the pair with the highest entropy under the distribution P(label | d_ij) and present it to the user to solicit label.…”
Section: Active Selection of Pair (mentioning)
Confidence: 99%
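The selection rule in this excerpt amounts to picking the candidate pair whose must-link / cannot-link posterior is closest to 0.5. A minimal sketch with hypothetical names (p_ml holds P(label = ML | d_ij) for each candidate pair, and P(CL) = 1 - P(ML)):

```python
import numpy as np

def select_most_uncertain_pair(p_ml):
    """Return the index of the candidate pair with the highest binary
    entropy over its must-link / cannot-link label, plus all entropies."""
    p = np.clip(np.asarray(p_ml, dtype=float), 1e-12, 1 - 1e-12)
    entropy = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    return int(np.argmax(entropy)), entropy

# Example: the pair with P(ML) = 0.55 is queried first, since it is the
# most ambiguous under the current estimate.
idx, H = select_most_uncertain_pair([0.95, 0.55, 0.10])
```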