2021
DOI: 10.28995/2075-7182-2021-20-235-245

Russian SuperGLUE 1.1: Revising the Lessons not Learned by Russian NLP-models

Abstract: In the last year, new neural architectures and multilingual pre-trained models have been released for Russian, which led to performance evaluation problems across a range of language understanding tasks. This paper presents Russian SuperGLUE 1.1, an updated benchmark styled after GLUE for Russian NLP models. The new version includes a number of technical, user-experience and methodological improvements, including fixes of benchmark vulnerabilities left unresolved in the previous version: novel and improved tests…
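As a rough illustration of working with the benchmark's tasks, the sketch below loads one Russian SuperGLUE task through the Hugging Face `datasets` library. The dataset ID `RussianNLP/russian_super_glue`, the config name `danetqa`, and the field names are assumptions not stated in this record, so treat this as a minimal sketch rather than the authors' evaluation pipeline.

```python
# Minimal sketch: loading a single Russian SuperGLUE task for inspection.
# Assumes the Hub dataset ID "RussianNLP/russian_super_glue" and the task
# config "danetqa" (boolean question answering); neither is confirmed above.
from datasets import load_dataset

danetqa = load_dataset("RussianNLP/russian_super_glue", "danetqa")

# The result is a DatasetDict with train/validation/test splits.
print(danetqa)

# Inspect one validation example: a passage, a yes/no question, and a label.
example = danetqa["validation"][0]
print(example["question"])
print(example["passage"])
print(example["label"])  # 0/1 gold label for the boolean answer
```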

Cited by 0 publications
References 9 publications (14 reference statements)
