ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp49357.2023.10095864
F-PABEE: Flexible-Patience-Based Early Exiting For Single-Label and Multi-Label Text Classification Tasks

Cited by 3 publications (1 citation statement) | References 9 publications
“…Full-model fine-tuning is one of the most widely used methods for utilizing PTMs. However, fine-tuning (Devlin et al., 2019; Zhu et al., 2021a; Zhu, 2021a; Gao et al., 2023; Zhang et al., 2023a) needs to tune all parameters of PTMs for each task, resulting in large GPU memory and storage costs, especially for supersized PTMs (Brown et al., 2020; Wang et al., 2021a). Parameter-efficient tuning (PETuning) is a new fine-tuning paradigm that can reduce the adaptation costs of PTMs by only tuning a very small number of internal or additional parameters (Ding et al., 2022; Zhang et al., 2023b; …).”
Section: Introduction
Mentioning confidence: 99%
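The PETuning paradigm described in the citation statement above can be illustrated with a minimal sketch: freeze every pretrained parameter and train only a small added module. The `Backbone` stand-in and the `BottleneckAdapter` below are hypothetical illustrations of the general idea, not the architecture from the cited works.

```python
import torch
import torch.nn as nn

# Minimal sketch of parameter-efficient tuning (PETuning):
# freeze all pretrained weights and train only a small added module.
# `backbone` and `BottleneckAdapter` are hypothetical stand-ins,
# not the models used in the cited works.

class BottleneckAdapter(nn.Module):
    """Small trainable module inserted after a frozen backbone."""
    def __init__(self, hidden_size: int, bottleneck: int = 16):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual connection keeps the frozen backbone's output
        # intact when the adapter's contribution is small.
        return x + self.up(torch.relu(self.down(x)))

# Stand-in for a pretrained model.
backbone = nn.Sequential(nn.Linear(768, 768), nn.ReLU(), nn.Linear(768, 768))
adapter = BottleneckAdapter(hidden_size=768)

# Freeze every backbone parameter; only the adapter receives gradients.
for p in backbone.parameters():
    p.requires_grad = False

trainable = [p for p in adapter.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable, lr=1e-3)

x = torch.randn(4, 768)
loss = adapter(backbone(x)).pow(2).mean()  # dummy loss for illustration
loss.backward()
optimizer.step()

frozen = sum(p.numel() for p in backbone.parameters())
tuned = sum(p.numel() for p in adapter.parameters())
print(f"tuning {tuned} of {frozen + tuned} parameters")
```

Because only the adapter's parameters are passed to the optimizer, per-task storage shrinks to the adapter weights alone, which is the cost reduction the quoted passage attributes to PETuning.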