2022
DOI: 10.1007/s00521-022-07965-0
Improving and evaluating complex question answering over knowledge bases by constructing strongly supervised data

Cited by 8 publications (3 citation statements) · References 45 publications
“…To generate language, they examined fine-tuning retrieval-augmented generation models that combine pre-trained parametric and non-parametric memory. The authors of [31] proposed a cross-lingual training approach that applies a generative architecture to a resource-rich language. The authors of [32] explored creating a dataset with generative pre-trained language models in an unsupervised setting, followed by fine-tuning a model under the guidance of the synthesized dataset.…”
Section: Related Work
confidence: 99%
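The combination of parametric and non-parametric memory described in the excerpt above can be illustrated with a toy sketch. This is an assumption for illustration, not the cited system: a trivial word-overlap retriever stands in for the non-parametric memory, and a template function stands in for the parametric generator.

```python
# Toy sketch (illustrative only, not the cited architecture): retrieval-
# augmented generation pairs a non-parametric memory (a retrievable document
# store) with a parametric model (here a trivial template "generator").

documents = {
    "doc1": "The Eiffel Tower is in Paris.",
    "doc2": "Mount Fuji is in Japan.",
}

def retrieve(question):
    """Non-parametric memory: return the document sharing the most words."""
    q_words = set(question.lower().split())
    return max(documents.values(),
               key=lambda d: len(q_words & set(d.lower().split())))

def generate(question, context):
    """Stand-in for a parametric generator conditioned on retrieved context."""
    return f"Q: {question} | context: {context}"

question = "Where is the Eiffel Tower?"
print(generate(question, retrieve(question)))
```

In a real system the retriever would be a dense index and the generator a fine-tuned sequence-to-sequence model, but the division of labor is the same: retrieve evidence, then generate conditioned on it.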
“…Others have proposed dynamic graphs [8], which form a joint subgraph from the question and its entities for reasoning and treat the entire question as a node of the subgraph, but this ultimately did not resolve the reasoning-path errors described above to a great extent. Other studies [14,15] … Specifically, RCAANet first adopts a pre-trained model to obtain question embeddings and relation embeddings. At each hop, the input module acquires attention word features and semantic features.…”
Section: Introduction
confidence: 99%
“…These triplets can be abbreviated as (h, r, t), where h and t represent entities and r denotes the relationship between them. In recent years, KGs have proven useful in a variety of downstream applications, including question answering (Cao et al., 2023), recommendation systems (Hsu et al., 2022), and support for robot actions (Daruna et al., 2022). KGs are classified as dynamic or static knowledge graphs, the difference being whether the former includes temporal information.…”
Section: Introduction
confidence: 99%
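The (h, r, t) triplet structure described in the excerpt above can be sketched in a few lines. This is a minimal illustration with invented example facts, not code from any cited paper: triplets are indexed by (head, relation) so that a one-hop question maps to a dictionary lookup.

```python
# Minimal sketch (illustrative facts only): a knowledge graph as a set of
# (head, relation, tail) triplets, indexed for one-hop question answering.

from collections import defaultdict

triplets = [
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
    ("Paris", "located_in", "France"),
]

# Index facts by (head, relation) so each one-hop query is a lookup.
index = defaultdict(list)
for h, r, t in triplets:
    index[(h, r)].append(t)

def one_hop(head, relation):
    """Return all tails t such that (head, relation, t) is a known fact."""
    return index[(head, relation)]

print(one_hop("Paris", "capital_of"))  # ['France']
```

Multi-hop questions chain such lookups, which is where the hop-wise reasoning discussed in the excerpts comes in.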