Logic-to-Text Generation

Logic-to-text generation is the task of generating natural-language (NL) text from a logical formalism (e.g., propositional logic, description logic, or first-order logic). Although the bulk of recent work on NLG (see, e.g., Gatt and Krahmer (2018) for a survey) has focused on other areas, generating text from logic has a long tradition, with approaches ranging from rule-based methods (Wang, 1980; De Roeck and Lowden, 1986; Calder et al., 1989; Shieber et al., 1989; Shemtov, 1996; Carroll and Oepen, 2005; Mpagouli and Hatzilygeroudis, 2009; Coppock and Baxter, 2010; Butler, 2016; Flickinger, 2016; Kasenberg et al., 2019) to statistical (Wong and Mooney, 2007; Lu and Ng, 2011; Basile, 2015) and neural models (Manome et al., 2018; Hajdik et al., 2019; Chen et al., 2020; Liu et al., 2021; Wang et al., 2021; Lu et al., 2022).
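To make the task concrete, the following is a minimal illustrative sketch (not taken from any of the cited works) of the rule-based flavor of logic-to-text generation: a recursive verbalizer that maps propositional-logic formulas, here represented as nested tuples, to English templates. All names and the formula encoding are assumptions for illustration only.

```python
def verbalize(formula):
    """Recursively render a propositional-logic formula as English.

    Formulas are nested tuples: ('not', f), ('and', f, g),
    ('or', f, g), ('implies', f, g), or an atomic proposition
    given as a plain string. (Hypothetical encoding for this sketch.)
    """
    if isinstance(formula, str):  # atomic proposition: emit as-is
        return formula
    op = formula[0]
    if op == 'not':
        return f"it is not the case that {verbalize(formula[1])}"
    left, right = verbalize(formula[1]), verbalize(formula[2])
    if op == 'and':
        return f"{left} and {right}"
    if op == 'or':
        return f"{left} or {right}"
    if op == 'implies':
        return f"if {left}, then {right}"
    raise ValueError(f"unknown operator: {op}")

print(verbalize(('implies', ('and', 'it rains', 'it is cold'), 'we stay home')))
# → if it rains and it is cold, then we stay home
```

Statistical and neural approaches replace these hand-written templates with models learned from formula-text pairs, but the input/output contract of the task is the same.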