2018
DOI: 10.1007/978-3-319-99501-4_10
Chinese Grammatical Error Correction Using Statistical and Neural Models

Cited by 21 publications (14 citation statements)
References 15 publications
“…YouDao [50] incorporated the vanilla Transformer as the main neural translation model. AliGM [55] combined statistical language and machine translation models with a seq2seq-with-attention architecture as its neural machine translation component to solve grammatical problems. BLCU [56] introduced the convolutional sequence-to-sequence model as its main model.…”
Section: Comparison in CGEC Task
confidence: 99%
“…
System                               Precision  Recall  F0.5
YouDao (Fu, Huang, and Duan 2018)        35.24   18.64  29.91
AliGM (Zhou et al. 2018)                 41.00   13.75  29.36
BLCU (Ren, Yang, and…”
Section: Model
confidence: 99%
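The F0.5 scores in the table above follow the standard weighted F-measure used in GEC evaluation, which weights precision twice as heavily as recall. A minimal sketch of the computation (function name is illustrative; the inputs are the percentage values reported in the table):

```python
def f_beta(precision: float, recall: float, beta: float = 0.5) -> float:
    """Weighted F-measure; beta < 1 favors precision over recall."""
    if precision == 0 and recall == 0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Reproduce the reported scores (all values are percentages).
print(round(f_beta(35.24, 18.64), 2))  # YouDao  -> 29.91
print(round(f_beta(41.00, 13.75), 2))  # AliGM   -> 29.36
```

Favoring precision (beta = 0.5) is conventional in GEC because a system that introduces new errors is considered worse than one that leaves some errors uncorrected.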
“…We denote them as GECToR-BERT, GECToR-XLNet, and GECToR (Ensemble), respectively. For the Chinese GEC task, we compare S2A to several of the best-performing systems evaluated on the NLPCC-2018 dataset, including the three top systems in the NLPCC-2018 challenge (YouDao (Fu, Huang, and Duan 2018), AliGM (Zhou et al. 2018), BLCU (Ren, Yang, and Xun 2018)), the seq2seq baseline Char Transformer, and the current state-of-the-art method MaskGEC (Zhao and Wang 2020). Note that the proposed S2A model is orthogonal to MaskGEC, so we also report our results enhanced with MaskGEC's data augmentation method.…”
Section: Baselines
confidence: 99%
“…• AliGM (Zhou et al. 2018): The Chinese GEC system developed by Alibaba, which combines NMT-based approaches, SMT-based approaches, and a rule-based approach together with various modules.…”
Section: Baselines
confidence: 99%