2022
DOI: 10.1007/978-981-19-8234-7_14
Error Investigation of Pre-trained BERTology Models on Vietnamese Natural Language Inference

Cited by 1 publication (4 citation statements)
References 21 publications
“…These limitations may reduce the effectiveness of models trained on this dataset when applied on other healthcare topics. Nevertheless, they can be addressed by adding more diverse text sources when selecting the premise sentences, or by combining with other Vietnamese monolingual datasets, e.g., the ViNLI dataset [17].…”
Section: Discussion (mentioning)
Confidence: 99%
“…There are few works for other languages [3,16,23,34], and particularly, for Vietnamese [17]. These works however focus on constructing open-domain datasets and also only for the monolingual problem.…”
Section: Related Work (mentioning)
Confidence: 99%