2019
DOI: 10.48550/arxiv.1909.04761
Preprint

MultiFiT: Efficient Multi-lingual Language Model Fine-tuning

Cited by 2 publications (3 citation statements); references 0 publications.
“…USE + SVM: We first adopt an ML-based approach. Instead of TF-IDF features, we obtain contextualized representations of the input using the Universal Sentence Encoder (USE). We then feed the input representations to an SVM model.…”
Section: Comparison With Other Methods (mentioning)
confidence: 99%
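As a rough illustration of the USE + SVM baseline described in that statement, the sketch below encodes sentences with the TensorFlow Hub release of the Universal Sentence Encoder and trains a scikit-learn SVM on the frozen embeddings. The model URL, toy texts, and labels are assumptions for illustration, not the citing paper's actual setup.

```python
import tensorflow_hub as hub
from sklearn.svm import SVC

# Load the Universal Sentence Encoder from TF Hub
# (variant assumed; the cited work does not specify which USE release it uses).
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

# Hypothetical toy data standing in for the citing paper's dataset.
train_texts = ["the service was excellent", "the product broke after a day"]
train_labels = [1, 0]

# Encode each sentence into a fixed-length contextualized vector.
X_train = embed(train_texts).numpy()

# Fit an SVM classifier on the frozen USE representations.
clf = SVC(kernel="rbf")
clf.fit(X_train, train_labels)

# Classify a new input sentence.
X_new = embed(["really disappointing experience"]).numpy()
print(clf.predict(X_new))
```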
“…Recently, rather than leveraging hand-crafted features, automatic extraction of relevant features in the form of distributed representations has become popular [21]. Various previous studies [7,6,20] have shown that Language Model Fine-tuning is a better alternative for classification tasks than other methods.…”
Section: Introduction (mentioning)
confidence: 99%
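For context, a minimal sketch of language-model fine-tuning for classification, using the Hugging Face Transformers Trainer with a multilingual BERT checkpoint. The checkpoint, toy texts, and labels are assumptions for illustration; this is not the setup of MultiFiT (which fine-tunes a QRNN-based ULMFiT-style model) or of the citing studies.

```python
import torch
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

# Hypothetical labelled examples; the cited studies use their own datasets.
texts = ["this film was wonderful", "this film was dreadful"]
labels = [1, 0]

# Pre-trained multilingual checkpoint (an assumption for this sketch).
model_name = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tokenize the raw text into padded input ids and attention masks.
enc = tokenizer(texts, truncation=True, padding=True)

class ToyDataset(torch.utils.data.Dataset):
    """Wraps the tokenized inputs and labels for the Trainer."""
    def __init__(self, encodings, labels):
        self.encodings, self.labels = encodings, labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.encodings.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

# Fine-tune the pre-trained language model end-to-end on the classification objective.
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetune-out",
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=ToyDataset(enc, labels),
)
trainer.train()
```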
“…Strategic knowledge sharing has been shown to improve performance on downstream tasks and languages (Gururangan et al., 2020). This technique is therefore crucial for multilingual applications, as most of the world's languages lack large amounts of labeled data (Eisenschlos et al., 2019; Joshi et al., 2020). However, the performance of multilingual language models on low-resource languages is still limited compared to other languages, since these languages are naturally under-sampled during training (Wu and Dredze, 2020).…”
Section: Transfer Learning For Low-resource Languages (mentioning)
confidence: 99%