IEEE International Conference on Robotics and Automation, 2004. Proceedings. ICRA '04, 2004
DOI: 10.1109/robot.2004.1307137
Enabling learning from large datasets: applying active learning to mobile robotics

Abstract: Autonomous navigation in outdoor, off-road environments requires solving complex classification problems. Obstacle detection, road following, and terrain classification are examples of tasks which have been successfully approached using supervised machine learning techniques for classification. Large amounts of training data are usually necessary in order to achieve satisfactory generalization. In such cases, manually labeling data becomes an expensive and tedious process. This paper describes a method f…

Cited by 15 publications (12 citation statements) | References 12 publications
“…Novelty detection in and of itself is a rich field of research [8], and has seen many applications to mobile robotics [9], [10]. [11], [12] specifically propose density estimation in the context of active learning for robotic perception systems. Specifically, a large dataset consisting of a robot's recorded perceptual history is analyzed to identify the most unlikely single percepts (given the entire history).…”
Section: Active Learning Through Novelty Reduction
confidence: 99%
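The approach this statement attributes to [11], [12] can be sketched directly: estimate a density over the robot's entire perceptual history, then rank individual percepts by how unlikely they are under that density. The snippet below is a minimal illustration only; the feature representation and the use of SciPy's Gaussian KDE are assumptions, not details taken from the cited papers.

```python
# Minimal sketch: flag the most unlikely percepts given the whole history.
import numpy as np
from scipy.stats import gaussian_kde

def least_likely_percepts(history, k=10):
    """history: (n_percepts, n_features) array of the robot's recorded percepts."""
    kde = gaussian_kde(history.T)           # non-parametric density over the history
    log_density = kde.logpdf(history.T)     # how likely each percept is under that density
    return np.argsort(log_density)[:k]      # indices of the k most unlikely percepts
```

Those low-density percepts are then the natural candidates to present to a human for labeling.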
“…While [11] uses non-parametric density estimation, parametric approaches are generally much faster at test time, allowing for easier application in an online setting (to allow for query filtering as well as a pool approach). Therefore, parametric density estimation is preferable in this context.…”
Section: Active Learning Through Novelty Reduction
confidence: 99%
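The trade-off the statement points to is that a parametric model is fit once and then scored in time independent of the history size, which makes online query filtering practical. A hedged sketch of that parametric alternative follows; the use of scikit-learn's Gaussian mixture and the component count are illustrative assumptions.

```python
# Sketch: fit a parametric density once, then score incoming percepts online.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_parametric_density(history, n_components=16):
    """history: (n_percepts, n_features). Returns a fitted mixture model."""
    return GaussianMixture(n_components=n_components, covariance_type="full").fit(history)

def query_filter(gmm, percept, log_density_threshold):
    # Ask for a label only when the percept is sufficiently unlikely (novel).
    return gmm.score_samples(percept[None, :])[0] < log_density_threshold
```

Scoring a single percept here touches only the mixture parameters, whereas a kernel density estimate would have to compare it against every stored example.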
“…The method balanced exploration and exploitation and used probabilistic active learning algorithm to predict the policies that would produce higher expected gains. Dima et al [199] described an active learning algorithm based on kernel density estimation to identify exemplar images in a dataset.…”
Section: Other Miscellaneous Approaches
confidence: 99%
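Read as an active-learning loop, the idea described here is: score the unlabeled pool with a kernel density estimate over image descriptors, send the most unusual images to a human labeler, and retrain. The sketch below only illustrates that loop; the descriptor array, the logistic-regression classifier, and the `ask_human` oracle are placeholders, not components of the cited method.

```python
# Sketch of one round of KDE-driven active learning over image descriptors.
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.linear_model import LogisticRegression

def active_learning_round(descriptors, labeled_idx, labels, budget, ask_human):
    pool = np.setdiff1d(np.arange(len(descriptors)), labeled_idx)
    kde = gaussian_kde(descriptors[pool].T)
    query = pool[np.argsort(kde.logpdf(descriptors[pool].T))[:budget]]  # rarest images
    new_labels = [ask_human(i) for i in query]                          # manual labeling step
    labeled_idx = np.concatenate([labeled_idx, query])
    labels = np.concatenate([labels, new_labels])
    clf = LogisticRegression(max_iter=1000).fit(descriptors[labeled_idx], labels)
    return clf, labeled_idx, labels
```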
“…To address this important problem, we have developed the Unlabeled Data Filtering (UDF) algorithm [24]. The intuition behind UDF is simple: we are interested in reducing the size of an unlabeled data set by discarding redundant examples, while keeping most of the rare patterns.…”
Section: A. The Initialization Problem
confidence: 99%
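One way to make that intuition concrete is density-weighted subsampling: redundant examples sit in high-density regions and can be thinned aggressively, while low-density (rare) examples are retained with high probability. The sketch below is not the published UDF algorithm, only an illustration of the stated intuition, and the weighting scheme and keep fraction are arbitrary assumptions.

```python
# Sketch: shrink an unlabeled set while biasing survival toward rare patterns.
import numpy as np
from scipy.stats import gaussian_kde

def filter_unlabeled(data, keep_fraction=0.2, rng=None):
    """data: (n, d). Returns indices of a reduced set that favors rare examples."""
    rng = rng or np.random.default_rng(0)
    log_p = gaussian_kde(data.T).logpdf(data.T)
    weights = np.exp(-(log_p - log_p.min()))   # lower density -> higher keep probability
    weights /= weights.sum()
    n_keep = int(keep_fraction * len(data))
    return rng.choice(len(data), size=n_keep, replace=False, p=weights)
```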
“…In [24] we argued that aggregating interest measures over all the patches within an image is undesirable because high interest patches can be overwhelmed by large amounts of uninteresting patches. The other extreme of scoring an image only based on its most interesting patch is also ineffective, because the most unusual patterns in an image will often be outliers corresponding to various artifacts.…”
Section: B. The Data Block Constraint
confidence: 99%
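A middle ground between the two rejected extremes is to aggregate only over the most interesting patches, for example by averaging the top-k patch scores: interesting patches are not drowned out by the bulk of ordinary ones, and a single outlier patch cannot dominate the image score. The choice of k below is an assumption for illustration, not the cited paper's rule.

```python
# Sketch: score an image by the mean of its top-k patch interest values.
import numpy as np

def image_interest(patch_scores, k=5):
    """patch_scores: 1-D array of per-patch interest values for one image."""
    top_k = np.sort(patch_scores)[-k:]    # ignore the mass of uninteresting patches
    return top_k.mean()                   # robust to a single outlier/artifact patch
```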