2021
DOI: 10.48550/arxiv.2109.07348
Preprint

Cross-lingual Transfer of Monolingual Models

Cited by 3 publications (5 citation statements). References 0 publications.

“…Our experiments build off previous efforts that try to enable crosslingual transfer from pretrained monolingual LLMs to new languages (Artetxe et al., 2018, 2020; Tran, 2020; Reimers and Gurevych, 2020; Gogoulou et al., 2021). For example, BERT shows effective but limited transfer when attached to multi-lingual tokenizers (Reimers and Gurevych, 2020).…”
Section: Related Work (mentioning)
confidence: 90%
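
The transfer setup described in the statement above can be illustrated with a minimal sketch, assuming the HuggingFace Transformers API: a monolingual BERT body is kept frozen while its vocabulary and embedding layer are swapped for a target-language tokenizer and relearned, in the spirit of Artetxe et al. (2020) and Gogoulou et al. (2021). The model and tokenizer names below are illustrative placeholders, not the checkpoints used in the cited papers.

# Sketch: reuse a monolingual BERT's transformer body for a new language by
# swapping its tokenizer/embeddings; only the lexical layer is trained.
from transformers import AutoModelForMaskedLM, AutoTokenizer

# English-only model whose transformer body we want to transfer (placeholder).
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Tokenizer covering the target language; a multilingual one is used here
# purely for illustration (placeholder).
target_tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-uncased")

# Resize the embedding matrix to the new vocabulary. Because token ids no
# longer match the original vocabulary, the embeddings must be relearned
# on target-language text.
model.resize_token_embeddings(len(target_tokenizer))

# Freeze everything except the word embeddings (tied to the MLM output layer),
# so the transformer body is transferred as-is.
for name, param in model.named_parameters():
    param.requires_grad = "word_embeddings" in name

# Continue masked-language-model training on target-language data from here,
# e.g. with Trainer and DataCollatorForLanguageModeling.
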
“…BERT was trained to model semantics with English sub-word units. As Gogoulou et al. [11] show, semantic information can be transferred across language boundaries. FPT-based SLT models can leverage this pre-learned semantic information to better model the sign language utterance s.…”
Section: Representation Power of Neural Sign Language Translation Models (mentioning)
confidence: 99%
“…Artetxe et al. [10] show that the attention patterns learned by BERT on one written language transfer to another written language with minimal fine-tuning. Gogoulou et al. [11] further illustrate how semantic information is transferred between different languages.…”
Section: Related Work (mentioning)
confidence: 99%
“…Ogueji et al. (2021) and Ogunremi et al. (2023) showcase the positive effects of pretraining on languages closer and more related to the target language, even if this means less data than larger pretrained models use, in part because of the possibility of shared vocabulary (Oladipo et al., 2022). Our experiments build off previous efforts that try to enable crosslingual transfer from pretrained monolingual LLMs to new languages (Artetxe et al., 2018, 2020; Tran, 2020; Reimers and Gurevych, 2020; Gogoulou et al., 2021).…”
Section: Related Work (mentioning)
confidence: 99%