2018
DOI: 10.48550/arxiv.1808.09920
Preprint

Question Answering by Reasoning Across Documents with Graph Convolutional Networks

Cited by 23 publications (45 citation statements)
References 0 publications
“…As multi-hop relations in KBs can be mentioned together in a single text piece, these datasets are not designed for an open-domain setting, which necessitates multi-hop retrieval. Existing methods on these datasets either retrieve passages from a small passage pool pruned for the specific dataset (Dhingra et al., 2020), or focus on a non-retrieval setting where a compact document set is already given (De Cao et al., 2018; Tu et al., 2019; Beltagy et al., 2020). Compared to this research, our work aims to build an efficient multi-hop retrieval model that easily scales to large real-world corpora containing millions of open-domain documents.…”
Section: Related Work (mentioning)
confidence: 99%
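
The excerpt above contrasts retrieval from a small pruned pool with open-domain multi-hop retrieval over millions of documents. Below is a minimal, hypothetical sketch of the retrieve-and-expand loop that idea implies; `encode` and `corpus_index` are placeholder names for a query/passage encoder and a nearest-neighbour index, not components of any cited system.

```python
def multi_hop_retrieve(question, encode, corpus_index, hops=2, top_k=1):
    """Toy multi-hop retrieval loop (illustrative sketch only).

    Each hop folds the newly retrieved passages back into the query so that
    later hops can reach documents the original question alone would not
    surface.  `encode` and `corpus_index.search` are assumed placeholders.
    """
    query = question
    evidence = []
    for _ in range(hops):
        query_vec = encode(query)                         # dense query vector
        passages = corpus_index.search(query_vec, top_k)  # nearest passages
        evidence.extend(passages)
        # Expand the query with the retrieved text for the next hop.
        query = question + " " + " ".join(passages)
    return evidence
```
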
“…Graph Neural Networks Graph neural networks (GNNs) capture the dependencies and relations between nodes connected by edges, propagating features across nodes layer by layer (Scarselli et al., 2008; Kipf & Welling, 2016; Hamilton et al., 2017). GNNs have demonstrated effectiveness in a wide variety of natural language processing tasks such as text classification (Yao et al., 2019), machine translation (Bastings et al., 2017), question answering (Song et al., 2018; De Cao et al., 2018), recommendation (Wu et al., 2019) and information extraction (Li et al., 2020a). For example, the Star Transformer keeps a Transformer backbone but replaces the fully-connected structure in self-attention with a star-like topology, in which every two non-adjacent nodes are connected through a shared relay node.…”
Section: Related Work (mentioning)
confidence: 99%
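
The excerpt describes GNNs propagating features across connected nodes layer by layer. As a concrete illustration, here is a minimal graph convolutional layer in the spirit of Kipf & Welling (2016), written with NumPy; the shapes and the single weight matrix W are illustrative assumptions, not the exact parameterisation of the cited QA models.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One propagation step: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W).

    A: (n, n) binary adjacency over n nodes (e.g. entity mentions).
    H: (n, d_in) node features from the previous layer.
    W: (d_in, d_out) learned projection.
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # symmetric normalisation
    H_next = D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W
    return np.maximum(H_next, 0.0)            # ReLU
```

Stacking k such layers lets information flow across k-hop neighbourhoods of the graph, which is what makes multi-hop reasoning over entities and documents possible.
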
“…A sequential approach is followed by Memory Network-based models, which iteratively store the information gathered from passages in a memory cell [22,34,38]. The works in [4,11,36] use graph convolutional networks [21] to perform multi-hop reasoning.…”
Section: Multi-step Datasets and Reasoning (mentioning)
confidence: 99%
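
The excerpt mentions Memory Network-based models that iteratively store information gathered from passages in a memory cell. The sketch below shows that sequential pattern in its simplest form, with dot-product attention over passage vectors; it is a simplified toy under assumed vector shapes, not the architecture of any of the works cited as [22,34,38].

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_hop_read(passages, question, hops=3):
    """passages: (n, d) passage vectors; question: (d,) question vector."""
    memory = question.copy()                  # memory cell starts at the question
    for _ in range(hops):
        scores = passages @ memory            # relevance of each passage
        attn = softmax(scores)                # attention over passages
        read = attn @ passages                # weighted evidence summary
        memory = memory + read                # store it in the memory cell
    return memory                             # final state used to answer
```
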