2022
DOI: 10.12928/telkomnika.v20i6.24248
RoBERTa: language modelling in building Indonesian question-answering systems

Abstract: This research aimed to evaluate the performance of the A Lite BERT (ALBERT), Efficiently Learning an Encoder that Classifies Token Replacements Accurately (ELECTRA), and Robustly Optimized BERT Pretraining Approach (RoBERTa) models to support the development of an Indonesian-language question-and-answer system. The evaluation used Indonesian, Malay, and Esperanto. Esperanto served as a point of comparison with Indonesian because it is an international language that does not belong to any one people or country …
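The encoders compared in the abstract are all drop-in replacements for BERT in an extractive question-answering pipeline. As a minimal sketch, assuming the Hugging Face transformers library and a hypothetical Indonesian RoBERTa checkpoint name (the paper does not publish its weights), usage would look like:

```python
# Minimal sketch: extractive question answering with a RoBERTa-style encoder.
# "indonesian-roberta-base" is a hypothetical checkpoint name; substitute any
# RoBERTa/ALBERT/ELECTRA model fine-tuned for question answering.
from transformers import pipeline

qa = pipeline("question-answering", model="indonesian-roberta-base")

result = qa(
    question="Apa ibu kota Indonesia?",  # "What is the capital of Indonesia?"
    context="Jakarta adalah ibu kota negara Indonesia.",
)
print(result["answer"], round(result["score"], 3))  # e.g. "Jakarta" plus a confidence score
```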

Cited by 2 publications (1 citation statement) · References: 19 publications
“…In this study, the paper on Bidirectional Encoder Representations from Transformers (BERT) tested learning rates of 1e-5 and 5e-5 for both languages, as per the reference. The results obtained were an accuracy of 91.7% and an F1-score of 86.2%, with the conclusion that the RoBERTa model was better suited to implementing the Indonesian question-and-answer system [11].…”
Section: Implementation of Recurrent Neural Network (RNN) for Questio…
Citation type: mentioning
confidence: 83%
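The citing study reports only the two learning rates and the final metrics, not its training code. A hedged sketch of such a learning-rate comparison, assuming the transformers TrainingArguments API (dataset preparation is omitted, and all other hyperparameters are assumptions, not values from the citation):

```python
# Sketch of the learning-rate comparison (1e-5 vs 5e-5) described in the
# citation statement. Only the run configuration is shown; a model, tokenizer,
# and SQuAD-style Indonesian dataset would be supplied to transformers.Trainer.
from transformers import TrainingArguments

for lr in (1e-5, 5e-5):  # the two rates the study tested
    args = TrainingArguments(
        output_dir=f"roberta-qa-lr-{lr}",  # one run directory per rate
        learning_rate=lr,
        num_train_epochs=3,                # assumed; not stated in the citation
        per_device_train_batch_size=16,    # assumed
    )
    print(args.output_dir, args.learning_rate)
```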