Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
DOI: 10.18653/v1/2020.emnlp-main.355
ALICE: Active Learning with Contrastive Natural Language Explanations

Abstract: Training a supervised neural network classifier typically requires many annotated training samples. Collecting and annotating a large number of data points is costly and sometimes even infeasible. The traditional annotation process uses a low-bandwidth human-machine communication interface: classification labels, each of which provides only a few bits of information. We propose Active Learning with Contrastive Explanations (ALICE), an expert-in-the-loop training framework that utilizes contrastive natural language explanations…

Cited by 24 publications (22 citation statements)
References 36 publications (31 reference statements)
“…Minimal pairs have also been used to design controlled experiments and probe neural models' ability to capture various linguistic phenomena (Gulordava et al., 2018; Ettinger et al., 2018; Futrell et al., 2019; Gardner et al., 2020; Schuster et al., 2020). Finally, Liang et al. (2020) use contrastive explanations as part of an active learning framework to improve data efficiency. Our work shares the objective of Liang et al. (2020) to improve data efficiency, but is methodologically closer to probing work that uses minimal pairs to represent specific linguistic features.…”
Section: Discovering and Detecting Dialect Features
Confidence: 99%
“…For example, there can be color attributes like "white", size attributes like "large", and action attributes like "standing". Attributes are an important source of information beyond the coarse-grained object classes (Liang et al., 2020c). Each edge in the scene graph denotes a relation between the two connected objects.…”
Section: C2 Additional Dataset Information
Confidence: 99%
“…Moreover, most pool-based AL strategies query class information of instances. Recently, however, Liang et al. [158] proposed the strategy active learning with contrastive natural language explanations (ALICE). It uses queries of the form "How would you differentiate between class y ∈ Ω_Y and class y′ ∈ Ω_Y?"…”
Section: A Active Learning For Classification
Confidence: 99%
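The query form above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the helper names are hypothetical, and the real ALICE framework additionally parses the expert's natural language answer into semantic features for training. The sketch only covers picking the most-confused class pair and phrasing the contrastive query.

```python
# Hypothetical sketch of an ALICE-style contrastive query (not the paper's code).
from itertools import combinations

def most_confused_pair(confusion):
    """Pick the class pair the current model confuses most often.

    `confusion` maps each class to a dict of misclassification counts,
    e.g. confusion["cat"]["dog"] = times a "cat" was predicted as "dog".
    """
    return max(
        combinations(sorted(confusion), 2),
        key=lambda p: confusion[p[0]].get(p[1], 0) + confusion[p[1]].get(p[0], 0),
    )

def alice_query(confusion):
    """Phrase the contrastive question posed to the expert."""
    y, y2 = most_confused_pair(confusion)
    return f"How would you differentiate between class {y} and class {y2}?"

confusion = {"cat": {"dog": 7, "car": 1},
             "dog": {"cat": 5, "car": 0},
             "car": {"cat": 1, "dog": 2}}
print(alice_query(confusion))
# → How would you differentiate between class cat and class dog?
```

Targeting the most-confused pair is the intuition behind querying explanations rather than labels: one answer (e.g. "cats have retractable claws, dogs do not") carries far more information than a single class label.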