2017 IEEE 19th International Workshop on Multimedia Signal Processing (MMSP)
DOI: 10.1109/mmsp.2017.8122269
Uncertainty sampling based active learning with diversity constraint by sparse selection

Cited by 17 publications (10 citation statements). References 12 publications.
“…5) as in (Hwa, 2004; Joshi et al., 2009). Given that the difference in degree of certainty for similar examples can be small, uncertainty selection is prone to return similar examples (Wang et al., 2017). To address this issue, some works incorporate measures that exploit the diversity of the examples in the selection process (Sener and Savarese, 2017; Wang et al., 2017; Sinha et al., 2019).…”
Section: Active Learning (mentioning)
confidence: 99%
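To make the uncertainty criterion in these excerpts concrete, here is a minimal margin-based sampler, assuming a scikit-learn-style classifier with a predict_proba method; the names (select_uncertain, X_pool) are illustrative, not drawn from the cited papers.

import numpy as np

def select_uncertain(clf, X_pool, k):
    """Return indices of the k pool samples whose top-two class
    probabilities are closest together (smallest margin = most uncertain)."""
    proba = clf.predict_proba(X_pool)           # shape (n_samples, n_classes)
    sorted_p = np.sort(proba, axis=1)
    margin = sorted_p[:, -1] - sorted_p[:, -2]  # small margin = uncertain
    return np.argsort(margin)[:k]

Because near-duplicate pool samples receive near-identical margins, a top-k batch chosen this way is often redundant, which is precisely the failure mode the diversity constraints above are meant to address.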
“…Given that the difference in degree of certainty for similar examples can be small, uncertainty selection is prone to return similar examples (Wang et al., 2017). To address this issue, some works incorporate measures that exploit the diversity of the examples in the selection process (Sener and Savarese, 2017; Wang et al., 2017; Sinha et al., 2019). Finally, expected-model change selects the examples that would cause the greatest change to the model's output if their labels were known (Freytag et al., 2014; Roy and McCallum, 2001; …).…”
Section: Active Learning (mentioning)
confidence: 99%
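For a binary logistic model, the expected-model-change criterion admits a small closed form: the per-sample log-loss gradient is (p - y) * x, so the expected gradient norm over the unknown label reduces to 2 * p * (1 - p) * ||x||. The sketch below is a hedged illustration under that assumption, not the estimators of Freytag et al. (2014) or Roy and McCallum (2001).

import numpy as np

def expected_gradient_length(w, X_pool):
    # Expected log-loss gradient norm over the unknown binary label:
    # E_y ||(p - y) * x|| = 2 * p * (1 - p) * ||x||.
    p = 1.0 / (1.0 + np.exp(-(X_pool @ w)))     # model's P(y = 1 | x)
    return 2.0 * p * (1.0 - p) * np.linalg.norm(X_pool, axis=1)

def select_by_model_change(w, X_pool, k):
    # Pick the k samples whose labels would move the model the most.
    return np.argsort(-expected_gradient_length(w, X_pool))[:k]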
“…Diversity constraint sampling (Diversity-AL). Diversity sampling methods have been used in information retrieval (Xu et al., 2007) and image classification (Wang et al., 2017b). The core idea is that samples that are highly similar to each other typically yield little new information and thus contribute little to performance.…”
Section: Active Learning (mentioning)
confidence: 99%
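A minimal sketch of this diversity idea: greedily trade uncertainty off against distance to the samples already chosen, so the batch spreads out in feature space. The max-min distance term and the weight alpha are assumptions for illustration, not the sparse-selection formulation of Wang et al. (2017b).

import numpy as np
from scipy.spatial.distance import cdist

def select_diverse_batch(X_pool, uncertainty, k, alpha=0.5):
    # uncertainty: per-sample scores in [0, 1], higher = more uncertain.
    picked = [int(np.argmax(uncertainty))]       # seed with the most uncertain
    for _ in range(k - 1):
        dist = cdist(X_pool, X_pool[picked]).min(axis=1)  # distance to batch
        score = alpha * uncertainty + (1 - alpha) * dist / (dist.max() + 1e-12)
        score[picked] = -np.inf                  # never pick a sample twice
        picked.append(int(np.argmax(score)))
    return picked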
“…For text compression, we suggest AL strategies that maximize the model's coverage and the diversity of the samples. To this end, we build upon work on uncertainty sampling (Peris and Casacuberta, 2018; Wang et al., 2017b) and propose a new strategy to predict the sample diversity at a structural level.…”
Section: Neural Seq2seq Text Compression (mentioning)
confidence: 99%
“…Du et al. proposed a robust multi-label active learning algorithm that introduces the maximum correntropy criterion as the measure of uncertainty [15]. In [16], [17], sparse modeling was incorporated into the sample selection to address the problem of redundant information among uncertain samples. In the query-by-committee method, the query is formulated according to the criterion of minimal agreement.…”
Section: Introduction (mentioning)
confidence: 99%
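The query-by-committee criterion in the last sentence can be sketched with vote entropy as the disagreement measure. Committee members are assumed to be any fitted classifiers exposing a predict method; this is an illustrative sketch, not the algorithm of [15]-[17].

import numpy as np

def vote_entropy(committee, X_pool, n_classes):
    # Stack each member's hard votes: shape (committee_size, n_samples).
    votes = np.stack([clf.predict(X_pool) for clf in committee])
    entropy = np.zeros(X_pool.shape[0])
    for c in range(n_classes):
        frac = (votes == c).mean(axis=0)         # share of votes for class c
        nz = frac > 0
        entropy[nz] -= frac[nz] * np.log(frac[nz])
    return entropy                               # high entropy = least agreement

def select_by_committee(committee, X_pool, n_classes, k):
    return np.argsort(-vote_entropy(committee, X_pool, n_classes))[:k]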