2022
DOI: 10.48550/arxiv.2204.09140
Preprint

A Survey on Multi-hop Question Answering and Generation

Abstract: The problem of Question Answering (QA) has attracted significant research interest for a long time. Its relevance to language understanding and knowledge retrieval tasks, along with its simple setting, makes QA crucial for strong AI systems. Recent success on simple QA tasks has shifted the focus to more complex settings. Among these, Multi-Hop QA (MHQA) has been one of the most researched tasks in recent years. The ability to answer multi-hop questions and perform multi-step reasoning can significantly imp…

Cited by 4 publications (4 citation statements)
References 143 publications (368 reference statements)
“…Despite the high performance of recent pretrained language models on question-answering (QA) tasks, solving questions that require multi-hop reasoning is still challenging (Mavi et al, 2022). In this paper, we focus on spatial reasoning over text, which can be described as inferring the implicit spatial relations from the explicit relations described in the text.…”
Section: Introduction
confidence: 99%
“…The embedding-based multi-hop question answering algorithm [5] mainly relies on the semantic similarity between the question and the relations in the knowledge graph, and uses the triple scoring function of the knowledge graph embedding model to construct the answer scoring function. Embedding-based methods can be divided into two types: semantic-parsing-based and graph-neural-network-based multi-hop question answering inference algorithms.…”
Section: The Embedding-based Multi-hop Knowledge Question Answering (...
confidence: 99%
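The statement above does not name a specific embedding model; as a minimal sketch, assuming a TransE-style model where a triple (h, r, t) is scored by −‖h + r − t‖ and the question embedding stands in for the relation embedding, the answer scoring function could look like this (all names and vectors here are illustrative):

```python
import numpy as np

def transe_score(head, relation, tail):
    """TransE-style triple score: higher means (head, relation, tail) is more plausible."""
    return -np.linalg.norm(head + relation - tail)

def rank_answers(head, question_emb, candidates):
    """Rank candidate answer entities for a question about a head entity.

    `question_emb` stands in for the relation embedding, mirroring how
    embedding-based MHQA systems map the question into the relation space
    of the knowledge-graph embedding model.
    """
    scores = {name: transe_score(head, question_emb, emb)
              for name, emb in candidates.items()}
    # Most plausible candidate first.
    return sorted(scores, key=scores.get, reverse=True)
```

A multi-hop variant would apply this scoring step once per hop, using each hop's top-ranked entity as the head of the next hop.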
“…The Transformer model, proposed by Vaswani et al [15], has revolutionized natural language processing with its self-attention mechanism, which allows the model to capture long-range dependencies and contextual information efficiently. Transformers have been successfully applied to a variety of NLP tasks, such as machine translation, text summarization [29], and question-answering [30]. Recently, researchers have started exploring the application of Transformer-based models in the knowledge graph domain.…”
Section: Transformer Attention Mechanism and Its Applications In Know...
confidence: 99%
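The self-attention mechanism referenced above (Vaswani et al.) is the standard scaled dot-product attention, softmax(QKᵀ/√d_k)V; a minimal numpy sketch, not the cited model's implementation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # pairwise query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ V                            # each output is a weighted mix of values
```

Because every query attends to every key in one matrix product, the model captures long-range dependencies without the step-by-step propagation of recurrent networks, which is what makes it attractive for reasoning over knowledge graphs.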