Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2021)
DOI: 10.18653/v1/2021.naacl-main.201
A Survey on Recent Approaches for Natural Language Processing in Low-Resource Scenarios

Abstract: Deep neural networks and huge language models are becoming omnipresent in natural language applications. As they are known for requiring large amounts of training data, there is a growing body of work to improve the performance in low-resource settings. Motivated by the recent fundamental changes towards neural models and the popular pre-train and fine-tune paradigm, we survey promising approaches for low-resource natural language processing. After a discussion about the different dimensions of data availabili…

Cited by 137 publications (103 citation statements)
References 155 publications (129 reference statements)
“…In Section 4.1, we find that current learning-based methods do not necessarily outperform traditional retrieval-based methods. We attribute the results to the limited data such as query-API pairs in the query-based API recommendation task, which is a low-resource scenario [16], [34]. We also discover that pre-trained models such as BERT show superior performance in query reformulation in Section 4.2.…”
Section: Low Resource Setting in Query-Based API Recommendation
confidence: 92%
“…The context presented helps explain the growing interest in the field of Low-Resource NLP [4,12], which addresses traditional NLP tasks with the assumption of scarcity of available data. Some approaches to this family of tasks propose semi-supervised methods, such as adding large quantities of unlabeled data to a small labelled dataset [21], or applying cross-lingual annotation transfer learning [2] to leverage annotated data available in languages other than the desired one.…”
Section: Background and Related Work
confidence: 99%
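The self-training idea mentioned above (adding large quantities of unlabeled data to a small labelled dataset) can be sketched minimally as pseudo-labeling: fit a model on the labelled data, label the unlabelled points it is confident about, and refit. The toy 1-D threshold classifier, the data shape, and the `margin` confidence criterion below are illustrative assumptions, not from the survey or the citing paper.

```python
# Pseudo-labeling sketch for semi-supervised learning in low-resource
# settings. A toy 1-D threshold classifier stands in for a real model;
# "confident" means a point lies far from the decision threshold.

def fit_threshold(xs, ys):
    """Pick the threshold that best separates labels 0 and 1 on a 1-D axis."""
    best_t, best_acc = xs[0], -1.0
    for t in sorted(xs):
        acc = sum((x > t) == bool(y) for x, y in zip(xs, ys)) / len(xs)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def self_train(labelled, unlabelled, margin=1.0, rounds=3):
    """Iteratively pseudo-label confident unlabelled points and refit."""
    xs = [x for x, _ in labelled]
    ys = [y for _, y in labelled]
    t = fit_threshold(xs, ys)
    for _ in range(rounds):
        # Only points far from the threshold get a pseudo-label.
        confident = [x for x in unlabelled if abs(x - t) > margin]
        if not confident:
            break
        xs += confident
        ys += [int(x > t) for x in confident]
        unlabelled = [x for x in unlabelled if abs(x - t) <= margin]
        t = fit_threshold(xs, ys)  # refit on labelled + pseudo-labelled data
    return t
```

In practice the same loop is run with a neural classifier and a probability threshold instead of a distance margin; the structure (train, pseudo-label, retrain) is unchanged.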
“…This problem is even more critical in languages other than English: statistics show that English is used by 63.1% of the internet population, while Portuguese, for instance, is used by only 0.7%. This scenario has contributed to the rise of Low-Resource NLP, which aims to develop techniques to deal with low data availability in a specific language or application domain [12].…”
Section: Introduction
confidence: 99%
“…The main finding across all of these challenges was that transformer models became more commonly used [5,15] as they began to dominate the field of information extraction, owing to their general applicability across languages and domains. For an overview of recent approaches to low-resource NLP, we refer the reader to [8].…”
Section: Related Work
confidence: 99%