2020
DOI: 10.48550/arxiv.2010.15947
Preprint

PAL: Pretext-based Active Learning

Abstract: When obtaining labels is expensive, the requirement of a large labeled training data set for deep learning can be mitigated by active learning. Active learning refers to the development of algorithms to judiciously pick limited subsets of unlabeled samples that can be sent for labeling by an oracle. We propose an intuitive active learning technique that, in addition to the task neural network (e.g., for classification), uses an auxiliary self-supervised neural network that assesses the utility of an unlabeled …

Cited by 2 publications (12 citation statements)
References 9 publications
“…The main purpose of the self-supervised task module is to extract features that transfer well to the downstream active learning task. Past researchers [17], [18] have found that, compared to recent self-supervised learning methods such as SimCLR [36] and SimSiam [37], rotation prediction [39] performs better as the self-supervised task. The loss function is defined as…”
Section: Self-supervised Task
confidence: 99%
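The rotation-prediction pretext task mentioned in this statement (from [39], Gidaris et al.) trains a classifier to recognize which of four rotations was applied to an input, optimizing a standard 4-way cross-entropy loss. The quoted statement truncates before the loss formula, so the following is only a minimal numpy sketch of that setup; the function names and the classifier-free framing are my own illustration, not code from the paper.

```python
import numpy as np

def make_rotation_batch(img):
    """Build the rotation-prediction pretext batch for one image:
    the four copies rotated by 0, 90, 180, and 270 degrees,
    each labeled with the index of its rotation (0..3)."""
    rotations = [np.rot90(img, k) for k in range(4)]
    labels = np.arange(4)
    return np.stack(rotations), labels

def rotation_loss(logits, labels):
    """4-way cross-entropy over the rotation classes, averaged over
    the batch: -mean(log softmax(logits)[i, labels[i]])."""
    shifted = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

# Example: with uninformative (all-zero) logits the loss is log(4),
# the entropy of a uniform guess over the four rotation classes.
imgs, labels = make_rotation_batch(np.zeros((8, 8)))
loss = rotation_loss(np.zeros((4, 4)), labels)
```

In the cited approaches, this loss is computed by a rotation-prediction head sharing a backbone with the main task network; high pretext loss on an unlabeled sample is then used as a signal of its utility for labeling.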
“…6) Coreset [31]: a representative of distribution-based approaches, which selects data covering the highly diverse regions of the feature distribution. 7) PAL [17]: which adds a self-supervision head to the original classification network and trains the self-supervision task in parallel with the original classification task. 8) PT4AL [18]: which sorts the unlabeled data in descending order of self-supervised task loss and divides them into batches, one for each active learning cycle.…”
Section: Competitive Approaches
confidence: 99%