2020
DOI: 10.48550/arxiv.2012.03460
Preprint

Reprogramming Language Models for Molecular Representation Learning

Abstract: Recent advancements in transfer learning have made it a promising approach for domain adaptation via the transfer of learned representations. This is especially relevant when alternate tasks have limited samples of well-defined and labeled data, which is common in the molecular data domain. This makes transfer learning an ideal approach for molecular learning tasks. While adversarial reprogramming has proven to be a successful method to repurpose neural networks for alternate tasks, most works consider so…
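The recipe the abstract describes keeps a pretrained model's weights frozen and trains only a transformation of the new task's inputs (plus a mapping of the outputs back to task labels). The sketch below illustrates this for a molecular property task; the small randomly initialized Transformer is a stand-in for a real pretrained language model, and all names, sizes, and the mean-pooling choice are illustrative assumptions rather than the paper's exact setup.

```python
# Minimal model-reprogramming sketch: a frozen "language model" is reused by
# training only (i) an embedding map from SMILES tokens into the model's
# input space and (ii) a linear map from its output to task labels.
import torch
import torch.nn as nn

VOCAB_SMILES = 64   # assumed size of a SMILES character vocabulary
D_MODEL = 128       # assumed hidden size of the frozen model
N_CLASSES = 2       # e.g. active / inactive

class ReprogrammedLM(nn.Module):
    def __init__(self):
        super().__init__()
        # Stand-in for a pretrained LM; in practice load a real checkpoint.
        layer = nn.TransformerEncoderLayer(D_MODEL, nhead=4, batch_first=True)
        self.frozen_lm = nn.TransformerEncoder(layer, num_layers=2)
        for p in self.frozen_lm.parameters():
            p.requires_grad = False  # the pretrained weights are never updated

        # Trainable pieces: input reprogramming + output label mapping.
        self.smiles_embed = nn.Embedding(VOCAB_SMILES, D_MODEL)
        self.label_map = nn.Linear(D_MODEL, N_CLASSES)

    def forward(self, smiles_tokens):          # (batch, seq_len) int64
        x = self.smiles_embed(smiles_tokens)   # map SMILES into the LM's space
        h = self.frozen_lm(x)                  # frozen forward pass
        return self.label_map(h.mean(dim=1))   # pool, then map to task labels

model = ReprogrammedLM()
tokens = torch.randint(0, VOCAB_SMILES, (8, 32))  # dummy SMILES batch
print(model(tokens).shape)                        # torch.Size([8, 2])
```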

Cited by 1 publication (2 citation statements)
References 7 publications
“…We provide some notable examples below. Vinod, Chen, and Das (2020) repurposed a language model to predict biochemical sequences; Tsai, Chen, and Ho (2020) reprogrammed an ImageNet classifier to classify medical images; Neekhara et al. (2022) reprogrammed a vision model to classify text sentences and DNA sequences. Oh et al. (2023) extended BAR by generating input-aware visual prompts through an external encoder-decoder model for limited-data recognition.…”
Section: Model Reprogramming (citation type: mentioning)
confidence: 99%
“…The pioneering work of Tsai, Chen, and Ho (2020) demonstrated that through MR, even a CNN initially trained on ImageNet can be swiftly adapted to excel at classifying medical images, interestingly even outperforming the traditional finetuning approach. Subsequent research efforts have extended the idea of MR into various domains, achieving successful adaptation without finetuning (Vinod, Chen, and Das 2020; Yen et al. 2021; Yang, Tsai, and Chen 2021; Neekhara et al. 2022; Chen et al. 2023).…”
Section: Introduction (citation type: mentioning)
confidence: 99%
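A minimal sketch of the input-level reprogramming this statement describes, assuming a frozen torchvision ResNet-18 as the source model: only an additive input "program" is trained, and target labels are read off through a fixed many-to-one mapping over source classes. The class-group assignment and sizes are illustrative assumptions, not the cited papers' exact configurations.

```python
# Input-level model reprogramming of a frozen image classifier: train only an
# additive perturbation on the inputs and reuse a fixed label mapping.
import torch
import torch.nn as nn
from torchvision.models import resnet18

frozen = resnet18(weights=None)  # load ImageNet-pretrained weights in practice
frozen.eval()
for p in frozen.parameters():
    p.requires_grad = False      # the source model is never updated

# Trainable universal "program" added to every (resized) target-domain image.
program = nn.Parameter(torch.zeros(1, 3, 224, 224))

# Many-to-one label mapping: several source classes vote for each target class.
# Illustrative assignment: target class 0 <- sources 0..4, class 1 <- 5..9.
source_groups = [list(range(0, 5)), list(range(5, 10))]

def reprogrammed_logits(x):
    logits = frozen(x + program)                 # frozen forward pass
    return torch.stack(
        [logits[:, g].mean(dim=1) for g in source_groups], dim=1
    )

x = torch.rand(4, 3, 224, 224)                   # dummy target-domain batch
print(reprogrammed_logits(x).shape)              # torch.Size([4, 2])
# Training would optimize `program` alone, e.g. torch.optim.Adam([program]).
```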