Proceedings of the 13th International Joint Conference on Natural Language Processing and the 3rd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics, 2023
DOI: 10.18653/v1/2023.ijcnlp-main.23
Assessment of Pre-Trained Models Across Languages and Grammars

Alberto Muñoz-Ortiz,
David Vilares,
Carlos Gómez-Rodríguez

Abstract: We present an approach for assessing how multilingual large language models (LLMs) learn syntax in terms of multi-formalism syntactic structures. We aim to recover constituent and dependency structures by casting parsing as sequence labeling. To do so, we select a few LLMs and study them on 13 diverse UD treebanks for dependency parsing and 10 treebanks for constituent parsing. Our results show that: (i) the framework is consistent across encodings, (ii) pre-trained word vectors do not favor constituency repre…
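Casting dependency parsing as sequence labeling means assigning each word a discrete label from which the tree can be reconstructed. A minimal sketch of one such encoding, relative head position paired with the dependency relation, is shown below; the function names and the toy sentence are illustrative assumptions, not necessarily the exact encodings evaluated in the paper.

```python
# Sketch: dependency parsing as sequence labeling via a
# relative head-position encoding (one label per word).

def encode_relative(heads, rels):
    """Encode a dependency tree as one (offset, relation) label per word.

    heads[i] is the 1-based head index of word i+1 (0 = root);
    the offset is head - position, so a tagger can predict it per token.
    """
    return [(h - i, r) for i, (h, r) in enumerate(zip(heads, rels), start=1)]

def decode_relative(labels):
    """Recover head indices and relations from (offset, relation) labels."""
    heads = [i + off for i, (off, _) in enumerate(labels, start=1)]
    rels = [r for _, r in labels]
    return heads, rels

# Toy example: "She reads books", with "reads" as the root.
heads = [2, 0, 2]               # She -> reads, reads -> ROOT, books -> reads
rels = ["nsubj", "root", "obj"]
labels = encode_relative(heads, rels)
print(labels)                   # [(1, 'nsubj'), (-2, 'root'), (-1, 'obj')]
assert decode_relative(labels) == (heads, rels)
```

Because the label set is finite and per-token, any off-the-shelf sequence tagger over LLM representations can produce these labels, which is what makes the probing framework uniform across syntactic formalisms.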
