Proceedings of the 28th ACM International Conference on Information and Knowledge Management 2019
DOI: 10.1145/3357384.3358165

Incorporating Relation Knowledge into Commonsense Reading Comprehension with Multi-task Learning

Abstract: This paper focuses on how to take advantage of external relational knowledge to improve machine reading comprehension (MRC) with multi-task learning. Most traditional MRC methods assume that the knowledge needed to arrive at the correct answer is present in the given documents. In real-world tasks, however, part of that knowledge may not be mentioned, and machines should be equipped with the ability to leverage external knowledge. In this paper, we integrate relational knowledge into an MRC model for commonsense…
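The abstract describes a joint training objective: a main answer-selection task and auxiliary relation-level tasks sharing one encoder. The sketch below illustrates one plausible shape of such a multi-task setup in PyTorch; the encoder, the two heads, the relation count, and the loss weight `lam` are illustrative assumptions, not the paper's exact architecture.

```python
# Minimal sketch of multi-task MRC with an auxiliary relation task.
# Module names, sizes, and the loss weight are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskMRC(nn.Module):
    def __init__(self, vocab_size=30000, hidden=256, n_relations=34):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        # Shared encoder over the concatenated document/question/candidate tokens.
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        self.answer_head = nn.Linear(2 * hidden, 1)               # main task: score a candidate answer
        self.relation_head = nn.Linear(2 * hidden, n_relations)   # auxiliary task: predict a relation type

    def forward(self, tokens):
        states, _ = self.encoder(self.embed(tokens))
        pooled = states.mean(dim=1)  # simple mean pooling over time
        return self.answer_head(pooled), self.relation_head(pooled)

model = MultiTaskMRC()
tokens = torch.randint(0, 30000, (8, 64))            # dummy batch of token ids
answer_logit, relation_logits = model(tokens)

answer_labels = torch.randint(0, 2, (8, 1)).float()  # dummy correct/incorrect labels
relation_labels = torch.randint(0, 34, (8,))         # dummy relation labels
lam = 0.5                                            # assumed auxiliary-loss weight
loss = (F.binary_cross_entropy_with_logits(answer_logit, answer_labels)
        + lam * F.cross_entropy(relation_logits, relation_labels))
loss.backward()
```

Because both heads read the same pooled representation, gradients from the auxiliary relation task shape the shared encoder, which is the usual rationale for this kind of multi-task setup.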

Cited by 21 publications (17 citation statements)
References 19 publications (34 reference statements)
“…subgraphs from ConceptNet), and attending to it via an attention mechanism when representing the inputs (Bauer et al., 2018; Paul and Frank, 2019; Lin et al., 2019). Alternative approaches include using the knowledge to score answer candidates and prune implausible ones (Lin et al., 2017; Tandon et al., 2018), and training in a multi-task setup via auxiliary tasks pertaining to knowledge (Xia et al., 2019).…”
Section: External Knowledge in Neural Models
confidence: 99%
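The statement above cites attention-based knowledge integration. As a rough illustration of that pattern, the following hedged sketch attends from token representations over embeddings of retrieved knowledge triples (e.g. a ConceptNet subgraph encoded offline); all shapes and names are assumptions, not any cited paper's implementation.

```python
# Illustrative sketch: enrich input-token states with attended knowledge.
import torch
import torch.nn.functional as F

def knowledge_attend(token_states, kb_states):
    """token_states: (T, d) encoder states for the input tokens.
    kb_states: (K, d) embeddings of retrieved knowledge triples.
    Returns token representations concatenated with attended knowledge."""
    scores = token_states @ kb_states.T            # (T, K) dot-product scores
    attn = F.softmax(scores, dim=-1)               # each token attends over KB entries
    context = attn @ kb_states                     # (T, d) knowledge summary per token
    return torch.cat([token_states, context], dim=-1)  # (T, 2d)

enriched = knowledge_attend(torch.randn(12, 128), torch.randn(20, 128))
print(enriched.shape)  # torch.Size([12, 256])
```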
“…To increase the coverage of high-precision world knowledge and facilitate multi-hop reasoning by making intermediate reasoning steps explicit, prior work incorporated KBs (e.g. ConceptNet; Speer and Havasi, 2012) and knowledge-informed models into LM-based models (Xia et al., 2019).…”
Section: Introduction
confidence: 99%
“…Rajani et al. (2019) incorporate the generated explanations into the training of language models for enhancement. Xia et al. (2019) leverage two auxiliary relation-aware tasks to better model the interactions between the question and candidate answers. Chalier et al. (2020) propose a multi-faceted model of commonsense knowledge statements to capture more expressive meta-properties.…”
Section: Commonsense Reasoning Methods
confidence: 99%
“…For applications without available structured knowledge bases, researchers have relied on commonsense aggregated from corpus statistics pulled from unstructured text (Tandon et al., 2018; Lin et al., 2017; Li et al., 2018; Banerjee et al., 2019). More recently, rather than providing relevant commonsense as an additional input to neural networks, researchers have looked into indirectly encoding commonsense knowledge into the parameters of neural networks through pretraining on commonsense knowledge bases (Zhong et al., 2018) or explanations (Rajani et al., 2019), or by using multi-task objectives with commonsense relation prediction (Xia et al., 2019).…”
Section: Description
confidence: 99%