2021
DOI: 10.48550/arxiv.2111.01322
Preprint

Diverse Distributions of Self-Supervised Tasks for Meta-Learning in NLP

Abstract: Meta-learning considers the problem of learning an efficient learning process that can leverage its past experience to accurately solve new tasks. However, the efficacy of meta-learning crucially depends on the distribution of tasks available for training, and this is often assumed to be known a priori or constructed from limited supervised datasets. In this work, we aim to provide task distributions for meta-learning by considering self-supervised tasks automatically proposed from unlabeled text, to enable la…

Cited by 6 publications (5 citation statements)
References 23 publications
“…[9] Deep learning, a subfield of ML, has also shown great promise in medical image analysis. Deep learning algorithms use multiple layers of neural networks to learn increasingly complex features from medical images [11]. These algorithms have been successful in applications such as image segmentation, where the goal is to identify and outline specific structures within an image.…”
Section: Manually-Designed Algorithms to Machine Learning
confidence: 99%
“…User data privacy is the major issue. To address it, different methods have been proposed. One such method is differential privacy, in which privacy is achieved by adding noise to the data [3]. A drawback of this technique is that it only prevents an adversary from gaining additional information about an individual's data.…”
Section: Related Work
confidence: 99%
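The noise-addition idea in the statement above is commonly realized with the Laplace mechanism. The sketch below is illustrative only and is not from the cited work; the function name `dp_count` and all parameters are hypothetical. It answers a counting query (sensitivity 1) with Laplace noise of scale 1/ε, which yields ε-differential privacy for that query.

```python
import numpy as np

def dp_count(values, epsilon, rng=None):
    """Differentially private count of `values` (illustrative sketch).

    A counting query has sensitivity 1, so adding Laplace noise with
    scale 1/epsilon satisfies epsilon-differential privacy.
    """
    if rng is None:
        rng = np.random.default_rng()
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return len(values) + noise

# Smaller epsilon -> larger noise -> stronger privacy, lower accuracy.
data = [17, 23, 42, 5, 8]
noisy = dp_count(data, epsilon=0.5)
```

Note the trade-off the quoted drawback alludes to: the released count protects any single individual's contribution, but repeated queries consume privacy budget and can still leak aggregate information.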
“…Transformers, exemplified by models like GPT [33], [34] and BERT [35], have gained prominence in code generation tasks [27], [30], [31], enabling models to attend to relevant code contexts and generate code with heightened context awareness [7]. Graph Neural Networks (GNNs) [36] prove instrumental in handling code represented as graphs, effectively capturing relationships between code entities [15]. DL techniques excel in capturing complex patterns and generating code with enhanced accuracy [7].…”
Section: Deep Learning (DL)
confidence: 99%