IRI-2005: IEEE International Conference on Information Reuse and Integration, 2005.
DOI: 10.1109/iri-05.2005.1506496
Tuning statistical machine translation parameters using perplexity

Cited by 11 publications (2 citation statements)
References 4 publications
“…It may not classify exactly, since there can be missing-data and overfitting problems (Witten and Frank, 2000; Duda et al., 2001). Overfitting means "fitting too much the training data and the model starts to degrade performance on test data" (Nabhan and Rafea, 2005). The problem occurs when noisy or erroneous data are added to a data set (refer to Figure 2).…”
Section: Classification Methods in Data Mining
confidence: 99%
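As a rough illustration of the overfitting behaviour quoted above (this is not code from the cited papers), the sketch below assumes scikit-learn is available and uses a synthetic data set with deliberately noisy labels: an unbounded decision tree fits the noisy training data almost perfectly while its accuracy on held-out test data drops, whereas a shallow tree does not.

```python
# Minimal sketch of overfitting on noisy data; all names are illustrative,
# not taken from the cited works.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data with 20% label noise to mimic "noisy or erroneous data".
X, y = make_classification(n_samples=400, n_features=10, flip_y=0.2,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

for depth in (2, None):  # shallow tree vs. unbounded (memorising) tree
    tree = DecisionTreeClassifier(max_depth=depth, random_state=0).fit(X_tr, y_tr)
    print(f"max_depth={depth}: train={tree.score(X_tr, y_tr):.2f}, "
          f"test={tree.score(X_te, y_te):.2f}")
```

The unbounded tree typically reports near-perfect training accuracy but a visibly lower test score, which is the degradation on test data the quoted definition refers to.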
“…MERT is a method that attempts to optimize the model parameters under a more complex evaluation criterion than simply counting incorrect translations, training the model with the same measure that will later be used to evaluate it [18]. The tuning process is done to obtain a better translation model than the one produced in the training stage [19]. The goal of MERT is to find a minimum error rate count on a representative corpus $f_1^S$ with given reference translations $\hat{e}_1^S$ and a set of $K$ different candidate translations $\mathbf{C}_s = \{\mathbf{e}_{s,1}, \ldots, \mathbf{e}_{s,K}\}$.…”
Section: Tuning
confidence: 99%
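The objective described in that statement can be sketched as follows; this is a minimal illustration, not the implementation from [18] or the system tuned in the paper. The feature vectors, per-candidate error counts, and the coarse grid search over weights are hypothetical stand-ins (real MERT uses Och's line search), but the selection rule (pick the highest-scoring candidate per sentence under the current weights and sum its error against the reference) matches the objective quoted above.

```python
# Minimal sketch of the MERT objective; names and data are illustrative.
import itertools

def mert_objective(weights, candidates_per_sentence, errors_per_sentence):
    """Corpus error when each sentence keeps its highest-scoring candidate.

    candidates_per_sentence: per sentence, a list of feature vectors,
        one per candidate translation e_{s,k}.
    errors_per_sentence: matching per-candidate error counts against the
        reference translation.
    """
    total_error = 0
    for feats, errs in zip(candidates_per_sentence, errors_per_sentence):
        scores = [sum(w * f for w, f in zip(weights, fv)) for fv in feats]
        best_k = max(range(len(scores)), key=scores.__getitem__)
        total_error += errs[best_k]
    return total_error

def grid_search(candidates, errors, grid=(-1.0, -0.5, 0.0, 0.5, 1.0), dims=2):
    """Toy exhaustive search over a coarse weight grid (stand-in for line search)."""
    best = None
    for weights in itertools.product(grid, repeat=dims):
        err = mert_objective(weights, candidates, errors)
        if best is None or err < best[1]:
            best = (weights, err)
    return best

if __name__ == "__main__":
    # Two sentences, two candidates each, two features per candidate.
    candidates = [[(0.2, 1.0), (0.9, 0.1)], [(0.5, 0.5), (0.1, 0.8)]]
    errors = [[3, 1], [0, 2]]  # error counts per candidate
    print(grid_search(candidates, errors))
```

The weight vector returned is the one under which the chosen candidates accumulate the fewest errors on the tuning corpus, which is the minimum-error-rate criterion the citing paper describes.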