Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2019)
DOI: 10.18653/v1/n19-1240
Question Answering by Reasoning Across Documents with Graph Convolutional Networks

Abstract: Most research in reading comprehension has focused on answering questions based on individual documents or even single paragraphs. We introduce a neural model which integrates and reasons over information spread within and across multiple documents. We frame it as an inference problem on a graph. Mentions of entities are nodes of this graph, while edges encode relations between different mentions (e.g., within- and cross-document coreference). Graph convolutional networks (GCNs) are applied to the…
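The abstract describes a graph whose nodes are entity mentions, with edges for within- and cross-document coreference, over which GCN layers propagate information for multi-step reasoning. The following is a minimal sketch of one Kipf-and-Welling-style GCN propagation step over such a mention graph, in NumPy. The node count, feature dimensions, adjacency pattern, and the single shared weight matrix are illustrative assumptions; the paper's actual model also distinguishes edge types, which this sketch omits.

```python
import numpy as np

def gcn_layer(H, A, W):
    """One GCN propagation step in the style of Kipf and Welling (2016):
    H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).

    H: node features, shape (num_nodes, in_dim)
    A: binary adjacency over mention nodes (e.g., coreference edges)
    W: weight matrix, shape (in_dim, out_dim)
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1)                   # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # symmetric normalization
    H_new = D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W
    return np.maximum(H_new, 0.0)             # ReLU

# Toy example: 4 mention nodes with 8-dim features and a hypothetical
# edge pattern standing in for within- and cross-document coreference links.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
W = rng.normal(size=(8, 8))

for _ in range(3):   # stacking layers lets information hop between mentions
    H = gcn_layer(H, A, W)
print(H.shape)       # (4, 8)
```

Stacking several such layers is what allows information to reach mentions that are only indirectly connected, which is how reasoning across documents emerges in this setup.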

Cited by 201 publications (146 citation statements)
References 26 publications
“…Dhingra et al (2018) modify an existing neural QA model to additionally incorporate coreference information provided by a coreference resolution model. De Cao et al (2018) build a graph connecting entities and apply Graph Convolutional Networks (Kipf and Welling, 2016) to perform multi-hop reasoning, which leads to strong results on WIKIHOP. Zhong et al (2019) propose a new neural QA architecture that combines coarse-grained and fine-grained reasoning to achieve very strong results on WIKIHOP.…”
Section: Related Work (mentioning)
Confidence: 99%
“…In previous studies, GCNs are used to encode dependency trees (Marcheggiani and Titov, 2017; Zhang et al, 2018) and cross-document relations (Yasunaga et al, 2017; De Cao et al, 2019) for downstream tasks. Our work is the first to leverage GCNs for encoding conversation structures.…”
Section: Conversational-GCN: Aggregation-based Structure Modeling For… (mentioning)
Confidence: 99%
“…In both cases, all relevant information, barring some linguistic knowledge, is provided or the questions are unanswerable (Rajpurkar et al, 2018). This allows using an attention-based approach of indirectly combining information (Dhingra et al, 2018; Cao et al, 2019; Song et al, 2018).…”
Section: Related Work (mentioning)
Confidence: 99%