Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
DOI: 10.18653/v1/2021.findings-acl.145
Meta-Learning Adversarial Domain Adaptation Network for Few-Shot Text Classification

Abstract: Meta-learning has emerged as a trending technique to tackle few-shot text classification and achieved state-of-the-art performance. However, existing solutions heavily rely on the exploitation of lexical features and their distributional signatures on training data, while neglecting to strengthen the model's ability to adapt to new tasks. In this paper, we propose a novel meta-learning framework integrated with an adversarial domain adaptation network, aiming to improve the adaptive ability of the model and ge…
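The adversarial domain adaptation component mentioned in the abstract is most commonly realized with a gradient reversal layer (GRL): the forward pass is the identity, while the backward pass negates the gradient, so the feature extractor is trained to *confuse* a domain discriminator rather than help it. The toy scalar features below are purely illustrative, and whether MLADA uses exactly this GRL mechanism is an assumption on our part, not something stated in this excerpt:

```python
import numpy as np

# Toy scalar "embeddings" from two domains (hypothetical values).
feats = np.array([-1.0, -0.8, 1.0, 1.2])
domain = np.array([0.0, 0.0, 1.0, 1.0])   # 0 = source, 1 = target

w, b = 1.0, 0.0        # logistic domain discriminator
lr, lam = 0.05, 1.0    # learning rate and GRL scaling factor

def gap(f):
    # distance between the two domains' mean features
    return abs(f[2:].mean() - f[:2].mean())

gap_before = gap(feats)
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(w * feats + b)))   # P(domain = target)
    err = p - domain                             # dL/dlogit for cross-entropy
    # Discriminator step: learn to tell the domains apart.
    w -= lr * (err * feats).mean()
    b -= lr * err.mean()
    # Gradient reversal: the negated gradient -lam * dL/dfeat reaches the
    # "feature extractor", so its descent step *increases* discriminator loss,
    # pushing the two domains' features toward each other.
    feats -= lr * (-lam * err * w)

gap_after = gap(feats)
print(gap_before, gap_after)  # the domain gap shrinks toward confusion
```

The adversarial dynamic is visible in the shrinking gap: the discriminator keeps trying to separate the domains while the reversed gradient drags both feature clusters toward a shared, domain-invariant region.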

Cited by 27 publications (21 citation statements)
References 17 publications (29 reference statements)
“…MLADA [4] is committed to improving the model's adaptability. The text representations are obtained by introducing an adversarial domain adaptation network.…”
Section: Classification Experiments Results (citation type: mentioning)
confidence: 99%
“…In recent years, few-shot learning has been widely applied to computer vision tasks and is emerging as a promising solution to the low-resource regime [2]. Some studies have achieved better results on text classification tasks [3,4] by focusing on the local-to-global paradigm, which minimizes the distance between the support and query distributions to obtain strong performance over the entire dataset. Moreover, feature selection is a way to reduce time consumption.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
“…We evaluate our models on typical 5-way 1-shot and 5-way 5-shot text classification settings. Following the evaluation setup in (Dopierre, Gravier, and Logerais 2021), we report the average accuracy over 600 episodes sampled from the test set for intent classification datasets; and following (Han et al. 2021), we report the average accuracy over 1000 episodes sampled from the test set for news or review classification datasets. We run each experimental setting 5 times.…”
Section: Experimental Settings (citation type: mentioning)
confidence: 99%
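The episodic evaluation described above (sampling N-way K-shot episodes from the test set and averaging accuracy over 600 or 1000 of them) can be sketched as follows. The dataset, the 15-query-per-class size, and the function name are hypothetical placeholders, not details taken from the cited papers:

```python
import random
from collections import defaultdict

def sample_episode(labels, n_way=5, k_shot=1, m_query=15, rng=None):
    """Sample one N-way K-shot episode: (index, class) pairs for the
    support set and the query set, drawn from disjoint examples."""
    rng = rng or random.Random(0)
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)
    classes = rng.sample(sorted(by_class), n_way)   # pick N distinct classes
    support, query = [], []
    for c in classes:
        picked = rng.sample(by_class[c], k_shot + m_query)
        support += [(i, c) for i in picked[:k_shot]]   # K shots per class
        query += [(i, c) for i in picked[k_shot:]]     # M queries per class
    return support, query

# Toy dataset: 10 classes with 30 labeled examples each.
labels = [c for c in range(10) for _ in range(30)]
support, query = sample_episode(labels, n_way=5, k_shot=1, m_query=15)
print(len(support), len(query))  # -> 5 75
```

Reported accuracy is then the mean over many such independently sampled episodes, which is why the cited setups fix the episode count (600 or 1000) and repeat each run 5 times.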
“…Meta-learning has been extensively studied in image classification and has achieved remarkable success (Vinyals et al. 2016; Snell, Swersky, and Zemel 2017; Finn, Abbeel, and Levine 2017; Sung et al. 2018; Hou et al. 2019; Tseng et al. 2020; Liu et al. 2021; Gao et al. 2021). Its effectiveness in image classification motivates the recent application of meta-learning to few-shot text classification (Yu et al. 2018; Geng et al. 2019, 2020; Bao et al. 2020; Han et al. 2021).…”
Section: Introduction (citation type: mentioning)
confidence: 99%
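Of the metric-based approaches cited above, prototypical networks (Snell, Swersky, and Zemel 2017) are among the simplest to sketch: each class prototype is the mean of that class's support embeddings, and a query is assigned to its nearest prototype. The 2-d embeddings below are toy values, not outputs of any trained encoder:

```python
import numpy as np

def prototypes(support_emb, support_y, n_way):
    # Class prototype = mean of that class's support embeddings.
    return np.stack([support_emb[support_y == c].mean(axis=0)
                     for c in range(n_way)])

def classify(query_emb, protos):
    # Assign each query to the nearest prototype (squared Euclidean distance).
    d = ((query_emb[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    return d.argmin(axis=1)

# Toy 2-way 2-shot episode with 2-d embeddings.
sup = np.array([[0.0, 0.0], [0.2, 0.0], [1.0, 1.0], [1.2, 1.0]])
sup_y = np.array([0, 0, 1, 1])
protos = prototypes(sup, sup_y, n_way=2)
preds = classify(np.array([[0.1, 0.1], [1.1, 0.9]]), protos)
print(preds)  # -> [0 1]
```

In a full meta-learning setup the encoder producing these embeddings is trained across many episodes, so that prototypes remain discriminative for classes never seen during training.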
“…It has been receiving increasing attention for its potential to reduce data collection effort and computational costs and to extend to rare cases. To deal with data scarcity in NLU problems, previous research introduces external knowledge (Sui et al., 2021), utilizes meta-learning (Geng et al., 2019; Han et al., 2021), and adopts data augmentation to generate labeled utterances for few-shot classes (Murty et al., 2021; Wei et al., 2021). Recent studies (Radford et al., 2019; Brown et al., 2020) have shown that large-scale pre-trained language models are able to perform NLU tasks in a few-shot learning manner.…”
Section: Related Work (citation type: mentioning)
confidence: 99%