Proceedings of the Fifth Annual Workshop on Computational Learning Theory 1992
DOI: 10.1145/130385.130417

Query by committee

Abstract: We analyze the "query by committee" algorithm, a method for filtering informative queries from a random stream of inputs. We show that if the two-member committee algorithm achieves information gain with positive lower bound, then the prediction error decreases exponentially with the number of queries. We show that, in particular, this exponential decrease holds for query learning of perceptrons.
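The filtering rule the abstract describes can be sketched as follows. This is an illustrative approximation, not the paper's exact procedure: the paper draws committee members at random from the version space (e.g. by Gibbs sampling), whereas here each member is a simple online perceptron; the function and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sign(z):
    """Return +/-1, with sign(0) = +1."""
    return 1.0 if z >= 0 else -1.0

def qbc_filter(stream, labels, n_keep=50):
    """Two-member query-by-committee filter (illustrative sketch).

    A point from the input stream is queried (its label requested
    and used for training) only when the two committee members
    disagree on its predicted label.
    """
    dim = stream.shape[1]
    w1 = rng.normal(size=dim)          # committee member 1
    w2 = rng.normal(size=dim)          # committee member 2
    queried = []
    for x, y in zip(stream, labels):
        # Query only on disagreement: such points carry high
        # information gain about the target concept.
        if sign(w1 @ x) != sign(w2 @ x):
            queried.append((x, y))
            if sign(w1 @ x) != y:      # perceptron update, member 1
                w1 = w1 + y * x
            if sign(w2 @ x) != y:      # perceptron update, member 2
                w2 = w2 + y * x
            if len(queried) >= n_keep:
                break
    return queried, (w1, w2)
```

On a stream labeled by a hidden teacher perceptron, only the disagreement points are kept, so most of the random stream is filtered out without ever requesting its labels.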

Cited by 1,249 publications (796 citation statements). References 6 publications.
“…The supervised machine learning methods used in the experiments cover several different machine learning paradigms, and include Additive Regression (Friedman, 2002), Decision Table (Kohavi, 1995), Nearest Neighbor with a weighted condition (Aha et al, 1991), K* (Cleary et al, 1995), Locally Weighted Learning with Naive Bayes and Linear regression classifiers (Frank et al, 2002;Atkeson et al, 1997), Random Committee (Seung et al, 1992), and Random Trees (Aldous, 1993).…”
Section: Machine Learning Algorithms
confidence: 99%
“…Locally Weighted Learning is a memory-based learning algorithm, used with naive Bayes or linear regression (Frank et al, 2002;Atkeson et al, 1997), and makes the prediction through the weighted connection within the data using regression or classification. The Random Committee (Seung et al, 1992) is a meta-learning algorithm that applies different randomly selected number of seeds. Random Tree (Aldous, 1993) is a tree-based method that builds a decision tree using a random selection of attributes in every given dataset.…”
Section: Machine Learning Algorithms
confidence: 99%
“…Calculation process can vary up to selected active learning algorithm. Many algorithms are defined in literature such as query by committee [12], uncertainty sampling [13], margin sampling [14], entropy [14] etc. The most effective labeled data decided by using active learning algorithm is referred as informative data.…”
Section: Active Sample Selection In Ensemble Learning
confidence: 99%
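The quote above names several selection criteria alongside query by committee. The three pool-based criteria it cites (uncertainty sampling, margin sampling, entropy) are commonly implemented as scores over a model's predicted class probabilities; a minimal sketch of those scoring rules (function names are my own, not from the cited works):

```python
import numpy as np

def uncertainty_score(probs):
    """Least-confident sampling: 1 minus the max class probability."""
    return 1.0 - float(np.max(probs))

def margin_score(probs):
    """Margin sampling: gap between the two most probable classes
    (points with a SMALL gap are the informative ones to query)."""
    top2 = np.sort(probs)[-2:]
    return float(top2[1] - top2[0])

def entropy_score(probs):
    """Entropy sampling: Shannon entropy of the predicted distribution."""
    p = np.clip(probs, 1e-12, 1.0)   # guard log(0)
    return float(-np.sum(p * np.log(p)))
```

For a two-class prediction of [0.5, 0.5] the uncertainty and entropy scores are maximal and the margin is zero, so all three rules agree that such a point is maximally informative.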
“…A committee-based active-learning approach, called query-by-committee (QBC), was first proposed by Seung et al [7]. It was applied to selective-sampling problems by Freund [8], where the learner examined many unlabeled examples and only selected those samples that were more informative for learning than the others.…”
Section: Introduction
confidence: 99%
“…Early QBC studies by Seung et al [7] took into consideration their theoretical aspects within the context of binary-classification problems. They defined a version space as a set of concepts that labeled all the training examples correctly, and they developed an algorithm to effectively restrict the version space as the number of examples increased.…”
Section: Introduction
confidence: 99%
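The version-space idea in the quote above can be made concrete with a toy concept class. The sketch below (thresholds, data, and names are all hypothetical, chosen only to illustrate the definition) uses 1-D threshold concepts, where h_t labels x positive iff x >= t; the version space is the subset of thresholds consistent with every labeled example, and each new example can only shrink it.

```python
def version_space(thresholds, examples):
    """Return the thresholds consistent with every labeled example.

    A threshold t is consistent if, for each (x, y), the prediction
    (x >= t) matches the label (y == 1)."""
    return [t for t in thresholds
            if all((x >= t) == (y == 1) for x, y in examples)]

thresholds = [t / 10 for t in range(11)]   # candidate concepts 0.0 .. 1.0
examples = [(0.2, -1), (0.8, 1)]           # labeled data so far
vs = version_space(thresholds, examples)
# vs -> [0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
```

Adding another labeled example, say (0.5, 1), removes all thresholds above 0.5, showing the restriction of the version space as examples accumulate.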