Findings of the Association for Computational Linguistics: NAACL 2022
DOI: 10.18653/v1/2022.findings-naacl.95
Syntax Controlled Knowledge Graph-to-Text Generation with Order and Semantic Consistency

Abstract: The knowledge graph (KG) stores a large amount of structural knowledge, but it is not easy for humans to understand directly. Knowledge graph-to-text (KG-to-text) generation aims to generate easy-to-understand sentences from the KG while maintaining semantic consistency between the generated sentences and the KG. Existing KG-to-text generation methods phrase this task as a sequence-to-sequence generation task with a linearized KG as input and consider the consistency issue of the generated texts and KG …

Cited by 2 publications (3 citation statements) · References 26 publications (45 reference statements)
“…Graformer (Schmitt et al., 2020) introduces a model that combines relative position information to compute self-attention. Other approaches (Wang et al., 2021b; Liu et al., 2022; Guo et al., 2020; Ribeiro et al., 2020a) first linearize the KG into sequences and then feed them into a sequence-to-sequence (Seq2Seq) model to generate the desired texts. In this paper, we employ the Seq2Seq model with a planning selector to control linearized sequence orders.…”
Section: KG-to-Text Generation (citation type: mentioning, confidence: 99%)
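The linearization step mentioned in this statement is easy to make concrete. Below is a minimal Python sketch, a rough illustration only: the (head, relation, tail) tuple format and the <H>/<R>/<T> marker tokens are assumptions for this example, not the exact input format of any cited model.

```python
# Minimal sketch of KG linearization for Seq2Seq input. The <H>/<R>/<T>
# markers are illustrative assumptions; cited models define their own
# model-specific special tokens.

def linearize_kg(triples):
    """Flatten (head, relation, tail) triples into a single input string."""
    parts = []
    for head, relation, tail in triples:
        parts.extend(["<H>", head, "<R>", relation, "<T>", tail])
    return " ".join(parts)

# Example: two triples about the same entity become one Seq2Seq input line.
kg = [
    ("Alan_Turing", "field", "computer science"),
    ("Alan_Turing", "birthPlace", "London"),
]
print(linearize_kg(kg))
# <H> Alan_Turing <R> field <T> computer science <H> Alan_Turing <R> birthPlace <T> London
```

Note that the order in which triples appear in this string is exactly what a planning selector, as described in the statement above, would control.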
“…Existing works (Zhao et al., 2020) have shown that the linearized order of the given triples affects the quality of the generated text. Previous works mainly use graph traversal or multi-step prediction methods (Su et al., 2021; Liu et al., 2022; Zhao et al., 2020) for triple order generation. One prior work uses the relation-biased BFS (RBFS) strategy to traverse and linearize KGs into sequences.…”
Section: Sequence Order Generation (citation type: mentioning, confidence: 99%)
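The exact relation-biased BFS (RBFS) procedure is not given in this excerpt; as a hedged sketch of the general traversal-based linearization idea, a plain breadth-first ordering over entities might look like the following. The adjacency construction, the root-entity argument, and the example triples are all assumptions for illustration.

```python
from collections import deque

# Plain BFS ordering of triples over entities, starting from a chosen root.
# This is NOT the relation-biased BFS (RBFS) of the cited work, which also
# biases expansion by relation type; it only illustrates how a graph
# traversal fixes the linearized order of the triples.

def bfs_order_triples(triples, root):
    """Return triples ordered by breadth-first traversal from `root`."""
    adjacency = {}
    for head, relation, tail in triples:
        adjacency.setdefault(head, []).append((head, relation, tail))
    ordered, visited, queue = [], {root}, deque([root])
    while queue:
        entity = queue.popleft()
        for head, relation, tail in adjacency.get(entity, []):
            ordered.append((head, relation, tail))
            if tail not in visited:
                visited.add(tail)
                queue.append(tail)
    return ordered

# Example: triples reachable from "Alan_Turing" come out in BFS order,
# so the one-hop triples precede the two-hop "London" triple.
kg = [
    ("London", "country", "United Kingdom"),
    ("Alan_Turing", "birthPlace", "London"),
    ("Alan_Turing", "field", "computer science"),
]
print(bfs_order_triples(kg, "Alan_Turing"))
```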