2018
DOI: 10.1007/978-3-030-00810-9_9

Restoring Punctuation and Capitalization Using Transformer Models

Cited by 16 publications (5 citation statements) · References 15 publications

“…Figure 2 illustrates the sequence-to-sequence encoder-decoder architecture used in this paper. We note that a couple of previous works used machine translation approaches to address punctuation restoration (Peitz et al. 2011, Vāravs and Salimbajevs 2018). Similar to this work, these past works frame the punctuation restoration task as a text-to-text task.…”
Section: Related Work (citation type: mentioning)
Confidence: 70%
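
To make the text-to-text framing in this citation statement concrete, here is a minimal sketch using Hugging Face Transformers. The model name, task prefix, and example sentence are illustrative assumptions, not the cited papers' exact setups; an off-the-shelf checkpoint would first need fine-tuning on (unpunctuated, punctuated) sentence pairs.

```python
# A minimal sketch of punctuation restoration framed as a text-to-text
# (sequence-to-sequence) task. Assumes a T5-style encoder-decoder fine-tuned
# on (unpunctuated, punctuated) pairs; "t5-small" and the
# "restore punctuation:" prefix are illustrative placeholders.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-small"  # placeholder; any seq2seq checkpoint fits the framing
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Lowercased, unpunctuated ASR-style text goes in on the source side ...
source = "restore punctuation: hello how are you today i am fine thanks"
inputs = tokenizer(source, return_tensors="pt")

# ... and the punctuated, capitalized text is generated on the target side.
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```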
“…(Garg and Anika, 2018; Che et al., 2016; Żelasko et al., 2018), which have less predictive power. The handful of approaches that make use of Transformer architectures are not bidirectional (Nguyen et al., 2019; Vāravs and Salimbajevs, 2018; Wang et al., 2018). Our model also differs from the above in that it leverages pretraining to reduce training time and increase accuracy.…”
Section: Related Work (citation type: mentioning)
Confidence: 99%
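
By contrast, the bidirectional, pretraining-based approach this citing paper describes is usually cast as token-level classification: a BERT-style encoder reads the whole unpunctuated sequence and tags each token with the punctuation that should follow it. A minimal sketch, assuming a hypothetical label set and backbone (the classifier head below is freshly initialized, so its predictions are random until fine-tuned):

```python
# Sketch of punctuation restoration as token classification with a
# bidirectional pretrained encoder. Label set and backbone are assumptions,
# not the cited paper's exact configuration.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

labels = ["O", "COMMA", "PERIOD", "QUESTION"]  # assumed tag inventory
model_name = "bert-base-uncased"  # assumed backbone
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name, num_labels=len(labels)  # fresh head; fine-tune before use
)

text = "hello how are you today i am fine thanks"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, num_labels)

# One punctuation tag per subword token; after fine-tuning, PERIOD after
# "today" and "thanks" would mark the sentence boundaries.
preds = logits.argmax(dim=-1)[0]
for tok, label_id in zip(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]), preds):
    print(f"{tok:>10} -> {labels[int(label_id)]}")
```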