Proceedings of the 1st Workshop on Meta Learning and Its Applications to Natural Language Processing 2021
DOI: 10.18653/v1/2021.metanlp-1.8

Semi-supervised Meta-learning for Cross-domain Few-shot Intent Classification

Abstract: Meta-learning aims to optimize the model's capability to generalize to new tasks and domains. The lack of a data-efficient way to create meta-training tasks has prevented the application of meta-learning to real-world few-shot learning scenarios. Recent studies have proposed unsupervised approaches to create meta-training tasks from unlabeled data for free; e.g., the SMLMT method (Bansal et al., 2020a) constructs unsupervised multiclass classification tasks from unlabeled text by randomly masking words in …
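As a rough illustration of how SMLMT-style tasks can be built from unlabeled text, the Python sketch below samples a few frequent words, masks them in the sentences that contain them, and uses the masked word's identity as the class label. The function name, parameters, and [MASK] token handling are illustrative assumptions, not the authors' implementation.

import random
from collections import defaultdict

MASK = "[MASK]"

def build_smlmt_task(sentences, n_way=5, k_shot=4, seed=0):
    # Illustrative sketch of an SMLMT-style task (Bansal et al., 2020a):
    # pick n_way words, mask them in the sentences that contain them,
    # and treat the masked word's identity as the class label.
    rng = random.Random(seed)
    word_to_sents = defaultdict(list)
    for sent in sentences:
        for word in set(sent.split()):
            word_to_sents[word].append(sent)
    # Only words that occur in at least k_shot sentences can form a class.
    candidates = [w for w, s in word_to_sents.items() if len(s) >= k_shot]
    if len(candidates) < n_way:
        raise ValueError("not enough frequent words to build an n_way task")
    label_words = rng.sample(candidates, n_way)
    examples = []  # (masked_sentence, class_id) pairs
    for class_id, word in enumerate(label_words):
        for sent in rng.sample(word_to_sents[word], k_shot):
            masked = " ".join(MASK if tok == word else tok for tok in sent.split())
            examples.append((masked, class_id))
    rng.shuffle(examples)
    return examples, label_words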

Cited by 6 publications (7 citation statements) | References 11 publications
“…In [102], the authors propose a transferable meta-learning algorithm with a meta task adaptation to minimize the domain divergence and thus facilitate knowledge transfer across domains. To further improve the transferability of cross-domain knowledge, [103] and [104] propose to incorporate semi-supervised techniques into the meta-learning framework. Specifically, [103] combines the representation power of large pre-trained language models (e.g., BERT [33]) with the generalization capability of prototypical networks enhanced by SMLMT [105] to achieve effective generalization and adaptation to tasks from new domains.…”
Section: A Meta-learning From Multimodal Task Distributions (mentioning)
confidence: 99%
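For context on the prototypical-network component mentioned in the statement above, here is a minimal sketch of nearest-prototype classification over sentence embeddings (e.g., produced by a BERT encoder); the function name and signature are assumptions for illustration, not the cited paper's code.

import torch

def prototypical_logits(support_emb, support_labels, query_emb, n_way):
    # Nearest-prototype classification (Snell et al., 2017): prototypes are
    # the mean support embedding per class; queries are scored by negative
    # squared Euclidean distance.
    # support_emb: (n_support, d), query_emb: (n_query, d).
    prototypes = torch.stack([
        support_emb[support_labels == c].mean(dim=0) for c in range(n_way)
    ])                                          # (n_way, d)
    dists = torch.cdist(query_emb, prototypes)  # (n_query, n_way)
    return -dists.pow(2)                        # softmax over these gives class probabilities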
“…Therefore our model not only provides the adaptive ability to the given backbone predictor, as other meta-learning works do, but also provides better probabilistic modeling for uncertainty-aware applications. There are also previous works that combine adversarial training with meta-learning [38], [39], [40], [41]. Our work is different because we adversarially train the generative model itself as a meta learner, rather than using adversarial training as a data-augmentation trick, as a pretrained model, or for adversarial robustness.…”
Section: B Meta Learning (mentioning)
confidence: 99%
“…ICI [33] and the method of Lazarou et al. [34] progressively filter out less trustworthy instances to perform pseudo-labelling, while LST [35] further weights the cherry-picked pseudo-labelled data.…”
Section: Related Work (mentioning)
confidence: 99%
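As a simplified stand-in for the progressive filtering described in the statement above, the sketch below keeps only unlabeled examples whose predicted confidence clears a threshold; the function name and the threshold value are assumptions, and the cited methods use more elaborate selection criteria.

import torch
import torch.nn.functional as F

def select_pseudo_labels(logits, threshold=0.9):
    # Keep only unlabeled examples whose top predicted probability clears
    # the threshold; return their pseudo-labels and the kept indices.
    # logits: (n_unlabeled, n_classes).
    probs = F.softmax(logits, dim=-1)
    conf, pseudo = probs.max(dim=-1)
    keep = conf >= threshold
    return pseudo[keep], keep.nonzero(as_tuple=True)[0]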
“…Knowledge distillation has been employed in FSL methods [9, 13, 18, 34, 35, 41, 42]. These methods typically adopt a model compression strategy, that is, building the student model under the guidance of a teacher model.…”
Section: Related Work (mentioning)
confidence: 99%
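The teacher-student setup mentioned above is commonly trained with a distillation loss; a minimal sketch of the standard temperature-scaled formulation is given below, which is generic and not specific to any of the cited FSL methods.

import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # Standard temperature-scaled distillation loss (Hinton et al., 2015):
    # KL divergence between softened teacher and student distributions,
    # rescaled by T^2 so gradient magnitudes match the unscaled case.
    t = temperature
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    student_logp = F.log_softmax(student_logits / t, dim=-1)
    return F.kl_div(student_logp, teacher_probs, reduction="batchmean") * (t * t)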