Proceedings of the 11th International Conference on Natural Language Generation 2018
DOI: 10.18653/v1/w18-6549

Neural sentence generation from formal semantics

Abstract: Sequence-to-sequence models have shown strong performance in a wide range of NLP tasks, yet their applications to sentence generation from logical representations are underdeveloped. In this paper, we present a sequence-to-sequence model for generating sentences from logical meaning representations based on event semantics. We use a semantic parsing system based on Combinatory Categorial Grammar (CCG) to obtain data annotated with logical formulas. We augment our sequence-to-sequence model with masking for pre…

Cited by 3 publications (3 citation statements) · References 25 publications (29 reference statements)
“…These approaches solve a slightly different problem than our approach does, and would likely require significant adaptation to solve the problem of explaining norm-related agent decisions. Nevertheless, comparison with these methods (and with state-of-the-art deep learning methods such as in Manome et al., 2018) is a fruitful topic for future work.…”
Section: Discussion
confidence: 99%
“…This work is summarized by Fiedler (2001a). Recent work focusing on the more general problem of generating text from formal or logical structures includes Manome et al. (2018)'s generation of sentences from logical formulas using a sequence-to-sequence approach, and Pourdamghani et al. (2016)'s generation of sentences from Abstract Meaning Representations by linearizing and using Phrase-Based Machine Translation approaches. Where our approach differs is that rather than aiming to justify logical conclusions via proofs or specify natural language translations of arbitrary logical forms, our approach justifies the decisions of autonomous agents (governed by principles specified in logic) in a way that is understandable to human users.…”
Section: Introduction
confidence: 99%
“…Logic-to-Text Generation Logic-to-text generation is the task of generating NL text, starting from a logical formalism (e.g., propositional logic, description logic, or first-order logic). Although the bulk of recent work on NLG (see e.g., Gatt and Krahmer (2018) for a survey) has focused on other areas, generating text from logic nonetheless has a long tradition, with approaches ranging from rule-based methodologies (Wang, 1980; De Roeck and Lowden, 1986; Calder et al., 1989; Shieber et al., 1989; Shemtov, 1996; Carroll and Oepen, 2005; Mpagouli and Hatzilygeroudis, 2009; Coppock and Baxter, 2010; Butler, 2016; Flickinger, 2016; Kasenberg et al., 2019) to statistical (Wong and Mooney, 2007; Lu and Ng, 2011; Basile, 2015) and neural models (Manome et al., 2018; Hajdik et al., 2019; Chen et al., 2020; Liu et al., 2021; Wang et al., 2021; Lu et al., 2022).…”
Section: Introduction
confidence: 99%