2023
DOI: 10.1016/j.acorp.2023.100045

Book Review

Cited by 1 publication (3 citation statements) · References 0 publications
“…We first divided the sentences into the aforementioned 29 words and phrases consisting of 1 to 4 Chinese characters. We trained this model on a collection of sentences from the CCL corpus [47] that included transferring pairs (transfers) between those words from the 29-word set. After that, we used a Viterbi decoder to determine the most likely sequence of words given the predicted tonal syllable probabilities from the tone decoder and syllable decoder, as well as the word-sequence probabilities from the natural language model [8]…”
Section: Methods (mentioning)
confidence: 99%
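The decoding step quoted above combines per-step word probabilities (derived from the tone and syllable decoders) with word-to-word transition probabilities from the language model. Below is a minimal Viterbi sketch of that combination, assuming hypothetical array names and shapes (`emission`, `transition`, `initial`); it is an illustration of the general technique, not the implementation from the cited paper.

```python
import numpy as np

def viterbi_decode(emission, transition, initial):
    """Return the most likely word-index sequence.

    emission   : (T, W) per-step word probabilities from the tone/syllable decoders (assumed input)
    transition : (W, W) word-bigram probabilities from the language model (assumed input)
    initial    : (W,)   prior over the first word (assumed input)
    """
    T, W = emission.shape
    log_em = np.log(emission + 1e-12)
    log_tr = np.log(transition + 1e-12)

    # Best log-score of any path ending in each word at step 0.
    score = np.log(initial + 1e-12) + log_em[0]
    backptr = np.zeros((T, W), dtype=int)

    for t in range(1, T):
        # candidate[v, w]: best path ending in word v, then transitioning v -> w.
        candidate = score[:, None] + log_tr
        backptr[t] = np.argmax(candidate, axis=0)
        score = candidate[backptr[t], np.arange(W)] + log_em[t]

    # Trace back the highest-scoring path.
    path = [int(np.argmax(score))]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]
```

Working in log space avoids numerical underflow when many low-probability steps are multiplied together.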
“…We used the CCL corpus from Peking University [47] to distill a training dataset for our domain-specific language model. We first measured the number of transfers between each pair of phrases (the counts of the previous phrase transferring to the first word of the next phrase in the whole CCL corpus)…”
Section: Methods (mentioning)
confidence: 99%
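This second statement describes counting phrase-to-phrase "transfers" in a corpus to build the domain-specific language model. The sketch below shows one way such counts could be accumulated and normalized into transition probabilities; the `segmented_sentences` input and the 29-phrase `vocab` are hypothetical placeholders, and access to the actual CCL corpus is not shown.

```python
from collections import defaultdict

def count_transfers(segmented_sentences, vocab):
    """Count how often each phrase in `vocab` is immediately followed by another phrase in `vocab`.

    segmented_sentences : iterable of phrase lists (assumed, pre-segmented corpus)
    vocab               : the restricted phrase set (e.g. the 29-phrase set)
    """
    vocab = set(vocab)
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in segmented_sentences:
        for prev, nxt in zip(sentence, sentence[1:]):
            if prev in vocab and nxt in vocab:
                counts[prev][nxt] += 1
    return counts

def to_probabilities(counts):
    """Normalize each row of counts into transition probabilities."""
    probs = {}
    for prev, nexts in counts.items():
        total = sum(nexts.values())
        probs[prev] = {nxt: c / total for nxt, c in nexts.items()}
    return probs
```

The resulting transition table is exactly the kind of word-sequence prior that a Viterbi decoder (as sketched earlier) can consume alongside the decoder outputs.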