Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
DOI: 10.18653/v1/2020.emnlp-main.195

XL-AMR: Enabling Cross-Lingual AMR Parsing with Transfer Learning Techniques

Abstract: Abstract Meaning Representation (AMR) is a popular formalism of natural language that represents the meaning of a sentence as a semantic graph. It is agnostic about how to derive meanings from strings, and for this reason it lends itself well to the encoding of semantics across languages. However, cross-lingual AMR parsing is a hard task, because training data are scarce in languages other than English and the existing English AMR parsers are not directly suited to being used in a cross-lingual setting. In this work we …
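To make the formalism concrete, below is the canonical AMR for "The boy wants to go." in PENMAN notation, decoded into graph triples. This is a minimal sketch using the open-source penman Python library, an illustrative tool choice and not part of XL-AMR itself:

    # pip install penman  (open-source AMR reader/writer; assumed tool, not part of XL-AMR)
    import penman

    # Canonical AMR example: "The boy wants to go."
    # Variables (w, b, g) name graph nodes; :ARG0 and :ARG1 label edges.
    amr_string = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"

    graph = penman.decode(amr_string)
    for source, role, target in graph.triples:
        print(source, role, target)

    # Prints triples such as:
    #   w :instance want-01
    #   w :ARG0 b
    #   g :ARG0 b
    # The re-entrant edge g :ARG0 b (the boy is the agent of both the wanting
    # and the going) is why AMR is a graph rather than a tree.

Because the graph contains no word order and no language-specific function words, the same structure can in principle represent the sentence's translations as well, which is the property cross-lingual AMR parsing builds on.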

Cited by 46 publications (80 citation statements)
References 47 publications (48 reference statements)
“…Zero-shot Cross-Lingual Results. Having parity in the quality and quantity of data across languages is often an unrealistic expectation, especially whenever the task requires expert annotators (Pasini, 2020): this is the reason why, over the last few years, cross-lingual transfer learning techniques and benchmarks have garnered attention in NLP (Barba et al., 2020; Blloshmi et al., 2020; Hu et al., 2020). While transfer learning techniques are becoming increasingly popular, their application to SRL is not straightforward.…”
Section: Results (mentioning)
Confidence: 99%
“…Wang et al. (2018) parsed Chinese AMR with a transition-based system. For cross-lingual AMR parsing, Blloshmi et al. (2020) trained an AMR parser similar to the approach of Zhang et al. (2019b) using cross-lingual transfer learning, outperforming the transition-based cross-lingual AMR parser of Damonte and Cohen (2018) on German, Spanish, Italian, and Chinese.…”
Section: Overview of Approaches (mentioning)
Confidence: 99%
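Approaches in the family of Blloshmi et al. (2020) transfer an English-trained parser to other languages, and a common enabling ingredient is a shared multilingual encoder that maps sentences from different languages into one representation space. The sketch below illustrates that shared-encoder idea with Hugging Face Transformers; the choice of bert-base-multilingual-cased and the mean pooling are assumptions made here for illustration, not XL-AMR's actual architecture:

    # pip install transformers torch
    import torch
    from transformers import AutoTokenizer, AutoModel

    # A shared multilingual encoder: the same weights embed every language,
    # so a graph decoder trained on English token states can in principle be
    # applied to other languages (illustrative sketch, not XL-AMR's code).
    tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    encoder = AutoModel.from_pretrained("bert-base-multilingual-cased")

    sentences = {
        "en": "The boy wants to go.",
        "de": "Der Junge will gehen.",
        "it": "Il ragazzo vuole andare.",
    }

    with torch.no_grad():
        for lang, text in sentences.items():
            inputs = tokenizer(text, return_tensors="pt")
            token_states = encoder(**inputs).last_hidden_state  # (1, seq_len, 768)
            # A parser would decode an AMR graph from token_states; here we
            # mean-pool them only to show that every language lands in the
            # same 768-dimensional space.
            sentence_vector = token_states.mean(dim=1)
            print(lang, tuple(sentence_vector.shape))

In this setup, nothing about the decoder is language-specific: zero-shot transfer amounts to feeding the encoder a non-English sentence at test time.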
“…Evaluated across multiple multilingual and cross-lingual Semantic Word Similarity datasets, Conception shows state-of-the-art results not only compared to concept representations such as NASARI, but also to multilingual word embeddings such as ConceptNet Numberbatch and cross-lingual language models such as XLM. Additionally, our concept representations are particularly robust on resource-poor languages, like Farsi, along the lines of recent work in Semantic Parsing and Semantic Role Labeling aimed at bridging the gap between languages (Blloshmi et al., 2020; …). Finally, Conception can be seamlessly applied to a downstream task: in Word Sense Disambiguation, it improves over state-of-the-art supervised and knowledge-based sense embeddings, showing that Conception encodes information that is still not captured by BERT-based contextualized representations.…”
Section: Discussion (mentioning)
Confidence: 97%