2020
DOI: 10.48550/arxiv.2004.14813
Preprint

ENT-DESC: Entity Description Generation by Exploring Knowledge Graph

Cited by 4 publications (6 citation statements)
References 18 publications
“…Zhu et al. (2020) propose a fact-aware summarization model to ensure that the content generated by the model conforms to factual logic. The MGCN models (Cheng et al., 2020) adopt multiple graph transformations to obtain context features at different scales, achieving strong performance on KG-to-text tasks. Chen et al. (2020) and Ji et al. (2020) use pretrained language models (PLMs) with knowledge injection to generate content with commonsense.…”
Section: KG-to-Text
confidence: 99%
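The multi-scale graph transformations attributed to MGCN above can be illustrated with a minimal sketch (not the paper's actual implementation): powers of the adjacency matrix give the multi-hop neighborhoods that a graph network could aggregate at different scales. All names here are hypothetical.

```python
# Illustrative sketch, assuming a boolean adjacency matrix over KG nodes:
# k-hop reachability matrices provide context at several scales.
import numpy as np

def multi_hop_adjacencies(adj, num_hops):
    """Return boolean reachability matrices for 1..num_hops hops."""
    hops = []
    current = adj.astype(bool)
    for _ in range(num_hops):
        hops.append(current.copy())
        # One more hop: boolean matrix product with the 1-hop adjacency.
        current = (current.astype(int) @ adj.astype(int)) > 0
    return hops

# Tiny KG: 0 -> 1 -> 2
adj = np.array([[0, 1, 0],
                [0, 0, 1],
                [0, 0, 0]])
hops = multi_hop_adjacencies(adj, 2)
# hops[0] is the 1-hop adjacency; hops[1] marks nodes reachable in 2 hops.
```

A real multi-graph convolutional network would attach learned weights to each hop-specific matrix; this sketch only shows where the multiple scales come from.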
“…ENT-DESC: The ENT-DESC dataset (Cheng et al., 2020) is extracted from Wikipedia, which comprises more than 9.9 million pages. The dataset covers domains such as humans, events, and locations.…”
Section: Dataset
confidence: 99%
“…Recent works for solving this task fall into two main categories. One category uses graph neural networks (Marcheggiani and Perez-Beltrachini, 2018; Ribeiro et al., 2020b; Cheng et al., 2020); the other linearizes the KG (Ribeiro et al., 2020a; Yang et al., 2020; Gardent et al., 2017; Hoyle et al., 2020) and then formulates a sequence-to-sequence generation task with linearized KG nodes as input to generate sentences. E.g., Distiawan et al. (2018) utilize a fixed tree traversal order to directly flatten the KG into a linearized representation.…”
Section: KG-to-Text Generation
confidence: 99%
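The linearization strategy mentioned in the last excerpt can be sketched as a fixed depth-first traversal that flattens a KG into a token sequence for a seq2seq model. This is an illustrative sketch under assumed data structures, not the cited papers' implementation; the entities and relations below are made up.

```python
# Illustrative sketch: flatten a KG (adjacency dict of triples) into a
# token sequence via a fixed depth-first traversal order.
def linearize(kg, root):
    """Depth-first flattening of {head: [(relation, tail), ...]}."""
    tokens = [root]
    for relation, tail in kg.get(root, []):
        tokens.append(relation)
        tokens.extend(linearize(kg, tail))
    return tokens

kg = {
    "Barack_Obama": [("birthPlace", "Honolulu"), ("party", "Democratic_Party")],
    "Honolulu": [("country", "United_States")],
}
print(linearize(kg, "Barack_Obama"))
# ['Barack_Obama', 'birthPlace', 'Honolulu', 'country', 'United_States', 'party', 'Democratic_Party']
```

Because the traversal order is fixed, the same graph always yields the same sequence, which is what lets a standard sequence-to-sequence model consume the KG directly.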