2021
DOI: 10.1007/978-3-030-79457-6_46
Hierarchical Transformer Encoders for Vietnamese Spelling Correction

Cited by 6 publications (1 citation statement)
References 6 publications
“…Therefore, with the recent development of deep learning, pre-trained neural-network-based language models can be applied to detect and correct spelling errors, yielding good results thanks to their stronger ability to represent language. A recent investigation (Tran, Dinh, Phan, & Nguyen, 2021) introduced a Hierarchical Transformer model for correcting spelling errors in Vietnamese. The model uses two Transformer encoders, one at the character level and the other at the word level, so that each word is represented from both the character and the word perspective.…”
Section: Theoretical Basis
Confidence: 99%
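The two-level design described in the citation above can be sketched as follows. This is a minimal illustrative sketch, not the authors' exact architecture: the layer sizes, single-layer depth, mean-pooling over characters, and the additive combination of character and word representations are all assumptions made here for brevity.

```python
import torch
import torch.nn as nn


class HierarchicalEncoder(nn.Module):
    """Sketch of a two-level encoder: a character-level Transformer pools
    each word's characters into a vector, which is combined with a word
    embedding and passed through a word-level Transformer encoder."""

    def __init__(self, n_chars=100, n_words=5000, d_model=64):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, d_model)
        self.word_emb = nn.Embedding(n_words, d_model)
        self.char_enc = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=1)
        self.word_enc = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=1)

    def forward(self, char_ids, word_ids):
        # char_ids: (batch, n_words, chars_per_word); word_ids: (batch, n_words)
        b, w, c = char_ids.shape
        chars = self.char_emb(char_ids).view(b * w, c, -1)
        # Encode characters, then mean-pool each word's characters (assumed pooling)
        char_repr = self.char_enc(chars).mean(dim=1).view(b, w, -1)
        # Combine character-level and word-level views, then encode at word level
        return self.word_enc(self.word_emb(word_ids) + char_repr)


enc = HierarchicalEncoder()
out = enc(torch.randint(0, 100, (2, 5, 7)), torch.randint(0, 5000, (2, 5)))
print(out.shape)  # torch.Size([2, 5, 64])
```

The character-level view lets the model recover from misspelled (hence out-of-vocabulary) words, while the word-level view supplies sentence context for choosing the correction.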