Proceedings of the 11th International Workshop on Semantic Evaluation (SemEval-2017), 2017
DOI: 10.18653/v1/s17-2096

Sheffield at SemEval-2017 Task 9: Transition-based language generation from AMR.

Abstract: This paper describes the submission by the University of Sheffield to the SemEval 2017 Abstract Meaning Representation Parsing and Generation task (SemEval 2017 Task 9, Subtask 2). We cast language generation from AMR as a sequence of actions (e.g., insert/remove/rename edges and nodes) that progressively transform the AMR graph into a dependency parse tree. This transition-based approach relies on the fact that an AMR graph can be considered structurally similar to a dependency tree, with a focus on content r…
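To make the transition-based idea concrete, the following is a minimal sketch of graph-editing actions (insert/remove/rename edges and nodes) applied to a toy AMR-like graph. It is not the authors' implementation: the Graph class, node identifiers, and the particular action sequence are illustrative assumptions. In the described system, the actions to apply are predicted by trained classifiers rather than hard-coded.

```python
# Minimal sketch of graph-editing transition actions (hypothetical; not the paper's code).
from dataclasses import dataclass, field

@dataclass
class Graph:
    nodes: dict = field(default_factory=dict)   # node_id -> label (AMR concept or surface word)
    edges: set = field(default_factory=set)     # (head_id, relation, dependent_id)

    def rename_node(self, node_id, new_label):
        """Relabel a node, e.g. map an AMR concept to a surface word."""
        self.nodes[node_id] = new_label

    def insert_edge(self, head, relation, dep):
        self.edges.add((head, relation, dep))

    def remove_edge(self, head, relation, dep):
        self.edges.discard((head, relation, dep))

# Toy example: start from a tiny AMR-like graph and apply a few actions
# that move it toward a dependency-tree-like structure.
g = Graph(nodes={"w": "want-01", "b": "boy"}, edges={("w", ":ARG0", "b")})
g.rename_node("w", "wants")          # concept -> surface word
g.rename_node("b", "boy")
g.remove_edge("w", ":ARG0", "b")     # drop the semantic relation...
g.insert_edge("w", "nsubj", "b")     # ...and insert a syntactic one
print(g.nodes, g.edges)
```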

Cited by 9 publications (11 citation statements).
References 14 publications (12 reference statements).

“…The participants declined to submit a system description paper. (Lampouras and Vlachos, 2017) This team's method is based on inverting previous work on transition-based parsers, and casts NLG from AMR as a sequence of actions (e.g., insert/remove/rename edges and nodes) that progressively transform the AMR graph into a syntactic parse tree. It achieves this by employing a sequence of four classifiers, each focusing on a subset of the transition actions, and finally realizing the syntactic parse tree into the final sentence.…”
Section: CMU (mentioning)
confidence: 99%
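As a rough illustration of the classifier cascade described in the statement above, the sketch below runs a sequence of stand-in classifiers, each restricted to its own subset of transition actions. The group names, interfaces, and dummy predictions are assumptions for illustration only; the paper does not necessarily partition the actions this way.

```python
# Sketch of a cascade of classifiers over action subsets (hypothetical grouping and interfaces).
ACTION_GROUPS = {
    "node_actions": ["insert_node", "remove_node", "rename_node"],
    "edge_actions": ["insert_edge", "remove_edge", "rename_edge"],
    "reordering":   ["swap_children", "shift"],
    "realization":  ["choose_surface_form"],
}

def make_dummy_classifier(group_name, actions):
    """Stand-in for one trained classifier, restricted to its own action subset."""
    def predict(state):
        # A real classifier would score every action in `actions` given features
        # extracted from the current graph/tree `state`; here we always pass.
        return ("no-op", group_name)
    return predict

classifiers = [make_dummy_classifier(name, acts) for name, acts in ACTION_GROUPS.items()]

state = {"graph": "partially transformed AMR graph"}
for clf in classifiers:      # applied in sequence, as described above
    print(clf(state))
```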
“…While initial work used statistical approaches (Flanigan et al., 2016b; Pourdamghani et al., 2016; Song et al., 2017; Lampouras and Vlachos, 2017; Mille et al., 2017; Gruzitis et al., 2017), recent research has demonstrated the success of deep learning, and in particular the sequence-to-sequence model (Sutskever et al., 2014), which has achieved the state-of-the-art results on AMR-to-text generation (Konstas et al., 2017). One limitation of sequence-to-sequence models, however, is that they require serialization of input AMR graphs, which adds to the challenge of representing graph structure information, especially when the graph is large.…”
Section: Introduction (mentioning)
confidence: 99%
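To illustrate the serialization requirement mentioned in the statement above, here is a minimal sketch of a depth-first linearization of a small AMR-like graph into a token sequence for a sequence-to-sequence model. The data structures and token scheme are assumptions (loosely PENMAN-like), not the pipeline of Konstas et al. (2017). Note how the re-entrant node b is emitted twice, one example of how graph structure information is hard to preserve after serialization.

```python
# Sketch: depth-first serialization of a rooted AMR-like graph (illustrative only).
def linearize(graph, node):
    """Emit the node's concept, then each (:relation ( subtree )) in turn."""
    tokens = [graph["concepts"][node]]
    for relation, child in graph["edges"].get(node, []):
        tokens += [relation, "("] + linearize(graph, child) + [")"]
    return tokens

# Tiny example: (w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01 :ARG0 b))
amr = {
    "concepts": {"w": "want-01", "b": "boy", "g": "go-01"},
    "edges": {"w": [(":ARG0", "b"), (":ARG1", "g")], "g": [(":ARG0", "b")]},
}
print(" ".join(linearize(amr, "w")))
# -> want-01 :ARG0 ( boy ) :ARG1 ( go-01 :ARG0 ( boy ) )
```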
“…In relation to generation methods from Abstract Meaning Representation, it was possible to highlight approaches based on machine translation (Pourdamghani et al., 2016; Ferreira et al., 2017), on transformation to intermediate representations (Lampouras and Vlachos, 2017; Mille et al., 2017), on deep learning models (Konstas et al., 2017; Song et al., 2018), and on rule extraction (from graphs and trees) (Song et al., 2016; Flanigan et al., 2016). Methods based on transformation into intermediate representations focused on transforming AMR graphs into simpler representations (usually dependency trees) and then using an appropriate surface realization system.…”
Section: Natural Language Generation From Abstract Meaning Representation (mentioning)
confidence: 99%
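As a toy illustration of the intermediate-representation route described in the statement above (AMR to a dependency-like tree, then a surface realization step), the sketch below realizes a hand-built dependency-like tree by ordering each head's dependents and emitting words in order. The tree format, ordering labels, and word choices are illustrative assumptions, far simpler than any cited surface realization system.

```python
# Toy surface realizer for a dependency-like tree (hypothetical format; illustration only).
def realize(tree, head):
    """In-order traversal: left dependents, then the head word, then right dependents."""
    left = [dep for side, dep in tree[head]["deps"] if side == "left"]
    right = [dep for side, dep in tree[head]["deps"] if side == "right"]
    words = []
    for dep in left:
        words += realize(tree, dep)
    words.append(tree[head]["word"])
    for dep in right:
        words += realize(tree, dep)
    return words

dep_tree = {
    "wants": {"word": "wants", "deps": [("left", "boy"), ("right", "go")]},
    "boy":   {"word": "the boy", "deps": []},
    "go":    {"word": "to go", "deps": []},
}
print(" ".join(realize(dep_tree, "wants")))   # -> "the boy wants to go"
```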