Proceedings of the 2020 SIAM International Conference on Data Mining
DOI: 10.1137/1.9781611976236.32

A Graph-Based Approach for Active Learning in Regression

Abstract: Active learning aims to reduce labeling efforts by selectively asking humans to annotate the most important data points from an unlabeled pool and is an example of human-machine interaction. Though active learning has been extensively researched for classification and ranking problems, it is relatively understudied for regression problems. Most existing active learning for regression methods use the regression function learned at each active learning iteration to select the next informative point to query. Thi…
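The pool-based setting the abstract describes can be sketched as a simple loop: fit a regressor on the labeled set, score every point in the unlabeled pool, query the highest-scoring point, and repeat. The sketch below is illustrative only, not the paper's graph-based criterion: the ridge model and the bootstrap-ensemble variance score are assumptions chosen for demonstration.

```python
# Minimal pool-based active-learning loop for regression (illustrative sketch,
# not the paper's algorithm). Uncertainty is estimated as the disagreement of
# a small bootstrap ensemble; the most uncertain pool point is queried next.
import numpy as np

rng = np.random.default_rng(0)

def fit_ridge(X, y, lam=1e-2):
    # Closed-form ridge regression: w = (X^T X + lam*I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def ensemble_variance(X_lab, y_lab, X_pool, n_models=10):
    # Bootstrap the labeled set, refit, and measure predictive spread.
    preds = []
    for _ in range(n_models):
        idx = rng.integers(0, len(X_lab), len(X_lab))
        w = fit_ridge(X_lab[idx], y_lab[idx])
        preds.append(X_pool @ w)
    return np.var(preds, axis=0)

# Toy 1-D problem: y = 3x + noise, lifted to [x, 1] features.
X = np.c_[np.linspace(-1, 1, 100), np.ones(100)]
y = 3 * X[:, 0] + 0.1 * rng.standard_normal(100)

labeled = list(rng.choice(100, size=5, replace=False))
pool = [i for i in range(100) if i not in labeled]

for _ in range(10):                      # 10 active-learning iterations
    var = ensemble_variance(X[labeled], y[labeled], X[pool])
    q = pool.pop(int(np.argmax(var)))    # query the most uncertain point
    labeled.append(q)                    # "oracle" reveals its label

w = fit_ridge(X[labeled], y[labeled])
print(w[0])                              # recovered slope, close to 3
```

Any acquisition score can be swapped in for `ensemble_variance`; the citing works below compare several such label-dependent and label-independent choices.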

Cited by 7 publications (5 citation statements) | References 24 publications
“…To show that EMOC [46] is the best active learning method for our filter, we replace EMOC with other label-independent active learning methods for regression: GS [50], k-medoids [50], EGAL [51], EMC [52], variance [46], and GBA [57], and the results are shown in Figure 8. While the difference in scores between EMOC, GS, k-medoids, variance, and GBA is small at the 30th iteration, EMOC outperforms other methods in early iterations; therefore, we conclude that EMOC is the best for our purpose.…”
Section: Active Learning Method
confidence: 99%
“…After selecting L′ pixels n_1, ..., n_{L′}, we can find the next pixel to be selected, n_{L′+1}, using the active learning method. Among existing active learning methods for regression [46], [48]-[57], we can use only label-independent methods, because the labels p_{n_1}, ..., p_{n_L} change during the optimization process. We experimentally find that Kading et al.'s [46] method outperforms the other available methods; therefore, we use it in our method.…”
Section: Active Learning Based Local Filter, Parameter Prediction
confidence: 99%
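Of the label-independent strategies named in these snippets, greedy sampling (GS) is the simplest to sketch: repeatedly pick the pool point whose distance to everything selected so far is largest, so the queries cover the input space without needing any labels. This is an illustrative reading, not the cited implementation; the Euclidean metric and the seed-point choice are assumptions.

```python
# Greedy sampling (GS)-style label-independent selection: each step picks the
# pool point farthest (in feature space) from the set selected so far.
# Illustrative sketch; the cited works may differ in metric and seeding.
import numpy as np

def greedy_sample(X, n_select, seed_idx=0):
    selected = [seed_idx]
    pool = [i for i in range(len(X)) if i != seed_idx]
    while len(selected) < n_select:
        # Distance from each candidate to its nearest already-selected point.
        d = np.min(
            np.linalg.norm(X[pool][:, None] - X[selected][None], axis=2),
            axis=1,
        )
        selected.append(pool.pop(int(np.argmax(d))))
    return selected

X = np.array([[0.0], [0.1], [0.5], [1.0]])
print(greedy_sample(X, 3))   # seed 0.0, then the far end 1.0, then 0.5
```

Because the score depends only on the inputs, the whole query order can be computed before a single label is observed, which is exactly why such methods remain usable when labels change during optimization.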
“…, then selects the next evaluation point that maximizes Q(x_i) when x_i ∈ U is added to the training set Z [31]:…”
Section: Sequential Laplacian Regularized V-Optimal (SLRV)
confidence: 99%
“…However, as the name suggests, these two approaches do not consider the model uncertainty and select the samples only greedily, which may result in high predictive variance (discussed in Section IV). Recently, Zhang et al. proposed a graph-based active learning (GBAL) approach that selects the next evaluation point based on uncertainty information using the L1 measure, which enables it to use any surrogate model [31].…”
Section: Introduction
confidence: 99%
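One model-agnostic way such an L1 uncertainty measure could be realized is as the mean absolute deviation of a bootstrap ensemble's predictions: any surrogate can be refit on resamples, so no model-specific variance formula is needed. This is an illustrative reading, not necessarily GBAL's actual definition; the surrogate, resampling scheme, and function names below are assumptions.

```python
# A model-agnostic reading of an "L1 uncertainty measure": refit an arbitrary
# surrogate on bootstrap resamples and score each candidate by the mean
# absolute deviation of the ensemble's predictions. Illustrative only;
# GBAL's measure may be defined differently.
import numpy as np

rng = np.random.default_rng(1)

def l1_uncertainty(fit, X_lab, y_lab, X_cand, n_boot=20):
    preds = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(X_lab), len(X_lab))
        model = fit(X_lab[idx], y_lab[idx])      # any surrogate works here
        preds.append(model(X_cand))
    preds = np.asarray(preds)                    # shape (n_boot, n_cand)
    return np.mean(np.abs(preds - preds.mean(0)), axis=0)

# Example surrogate: a 1-nearest-neighbour regressor on 1-D inputs.
def fit_1nn(X, y):
    return lambda Xq: y[np.argmin(np.abs(Xq[:, None] - X[None]), axis=1)]

X_lab = np.array([0.0, 1.0, 2.0])
y_lab = np.array([0.0, 1.0, 4.0])
X_cand = np.array([0.1, 1.5, 3.0])
u = l1_uncertainty(fit_1nn, X_lab, y_lab, X_cand)
print(X_cand[np.argmax(u)])   # the least-agreed-upon candidate is queried next
```

Swapping `fit_1nn` for any other regressor leaves the acquisition logic untouched, which matches the snippet's point that an L1-style disagreement score can wrap any surrogate model.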
“…The most popular active learning sample selection strategies include random sampling and committee voting query (Kee et al., 2018; Zhao et al., 2018). These strategies mainly focus on classification and clustering (Zhang et al., 2020).…”
Section: Introduction
confidence: 99%