2022
DOI: 10.48550/arxiv.2201.00558
Preprint
Which Student is Best? A Comprehensive Knowledge Distillation Exam for Task-Specific BERT Models

Cited by 1 publication (2 citation statements)
References 0 publications
“…We take a case in the Indonesian research environment as Indonesian is an underrepresented language [2], [3]. The lack of datasets, NLP researchers, and the high computational costs [4], [5] lead to slow NLP research progress in this area.…”
Section: Introduction (mentioning)
confidence: 99%
“…Lastly, we investigate simple text-shortening methods which produce better performance. There are several variations to compare: (1) original text, (2) removing stopwords, (3) removing punctuation, (4) removing stopwords and punctuation, (5) removing stopwords and low-frequency words, (6) combining head and tail, and (7) unique words only. The result indicates that removing stopwords outperforms the other methods.…”
Section: Introduction (mentioning)
confidence: 99%
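The citing work above compares several text-shortening variants; two of the simpler ones can be sketched as below. This is a minimal illustration, not the authors' implementation: the stopword list is a toy stand-in, and the real study would use a proper Indonesian stopword list and tokenizer.

```python
import string

# Toy stopword list for illustration only (assumption; not the one used in the paper).
STOPWORDS = {"the", "a", "an", "is", "are", "and", "or", "of", "to", "in"}

def remove_stopwords(text: str) -> str:
    """Variant (2): drop stopword tokens, keep everything else."""
    return " ".join(t for t in text.split() if t.lower() not in STOPWORDS)

def remove_punctuation(text: str) -> str:
    """Variant (3): strip punctuation characters from each token."""
    table = str.maketrans("", "", string.punctuation)
    return " ".join(t for t in (tok.translate(table) for tok in text.split()) if t)

print(remove_stopwords("The model is trained on a large corpus of text."))
# → model trained on large corpus text.
print(remove_punctuation("Hello, world!"))
# → Hello world
```

Shortening inputs this way reduces sequence length, which lowers the computational cost the citing work highlights as a barrier for low-resource NLP research.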