2019
DOI: 10.1609/aaai.v33i01.33017208

Improving Natural Language Inference Using External Knowledge in the Science Questions Domain

Abstract: Natural Language Inference (NLI) is fundamental to many Natural Language Processing (NLP) applications, including semantic search and question answering. The NLI problem has gained significant attention due to the release of large-scale, challenging datasets. Present approaches to the problem largely focus on learning-based methods that use only textual information to classify whether a given premise entails, contradicts, or is neutral with respect to a given hypothesis. Surprisingly, the use of method…
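For readers new to the task, here is a minimal sketch of the three-way classification setup the abstract describes: a model maps a (premise, hypothesis) pair to entailment, contradiction, or neutral. The encoder is a hypothetical stand-in, not the paper's architecture.

```python
# Minimal NLI classification sketch; the encoder is an assumed component.
import torch
import torch.nn as nn

LABELS = ["entailment", "contradiction", "neutral"]

class NLIClassifier(nn.Module):
    def __init__(self, encoder: nn.Module, hidden_dim: int):
        super().__init__()
        self.encoder = encoder                        # any sentence-pair encoder
        self.head = nn.Linear(hidden_dim, len(LABELS))

    def forward(self, premise_ids: torch.Tensor, hypothesis_ids: torch.Tensor):
        # encoder is assumed to return one pooled vector per pair: (batch, hidden_dim)
        pair_repr = self.encoder(premise_ids, hypothesis_ids)
        return self.head(pair_repr)                   # logits over the three labels
```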

Cited by 115 publications (102 citation statements)
References 21 publications
“…Static KG Models. We compare to three static KG variants of our QA framework that model the knowledge module with path/graph encoders: (1) an RN degenerate version of our system, which computes a knowledge embedding by an attention mechanism over the retrieved paths for each question-choice entity pair; (2) Relational Graph Convolutional Networks (RGCN) (Schlichtkrull et al., 2018), which encode local graphs using graph convolutional networks with relation-specific weight matrices; (3) GconAttn (Wang et al., 2019), which models the alignment between entities via attention and pools over all entity embeddings. Link Prediction Model.…”
Section: Baselines
Citation type: mentioning, confidence: 99%
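As a concrete illustration of baseline (2), here is a minimal sketch of an RGCN-style layer in the spirit of Schlichtkrull et al. (2018): each relation type gets its own weight matrix, and incoming messages are averaged per relation. The dense per-relation adjacency encoding and all names are illustrative assumptions, not the cited implementation.

```python
# Sketch of a relational GCN layer with relation-specific weight matrices.
import torch
import torch.nn as nn

class RGCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int, num_relations: int):
        super().__init__()
        # one weight matrix per relation, plus a self-loop transform
        self.rel_weights = nn.Parameter(
            torch.randn(num_relations, in_dim, out_dim) * 0.01
        )
        self.self_weight = nn.Linear(in_dim, out_dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (num_nodes, in_dim) node features
        # adj: (num_relations, num_nodes, num_nodes), one 0/1 matrix per relation
        out = self.self_weight(h)
        for r in range(adj.size(0)):
            deg = adj[r].sum(dim=1, keepdim=True).clamp(min=1)  # avoid div by zero
            msg = (adj[r] @ h) / deg                            # mean over neighbors
            out = out + msg @ self.rel_weights[r]               # relation-specific map
        return torch.relu(out)
```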
“…CBPT (Zhong et al., 2018) is a plug-in method that assembles the predictions of any model with a straightforward way of utilizing pre-trained concept embeddings from ConceptNet. TEXTGRAPH-CAT (Wang et al., 2019c) concatenates the graph-based and text-based representations of the statement and then feeds the result into a classifier. We create sentence templates for generating sentences and then feed the retrieved triples as additional text inputs, as a baseline method TRIPLESTRING.…”
Section: Compared Methods
Citation type: mentioning, confidence: 99%
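For illustration, here is a minimal sketch of the two baselines as the statement describes them: TEXTGRAPH-CAT-style fusion by concatenation, and TRIPLESTRING-style templating of triples into text. The dimensions, MLP head, and template wording are assumptions, not the cited implementations.

```python
# Hypothetical sketch of the two baselines described above.
import torch
import torch.nn as nn

class TextGraphCat(nn.Module):
    """Concatenate text- and graph-based representations of the statement,
    then feed the fused vector to a classifier."""
    def __init__(self, text_dim: int, graph_dim: int, num_classes: int):
        super().__init__()
        self.classifier = nn.Sequential(
            nn.Linear(text_dim + graph_dim, 256),  # hidden size is an assumption
            nn.ReLU(),
            nn.Linear(256, num_classes),
        )

    def forward(self, text_repr: torch.Tensor, graph_repr: torch.Tensor):
        fused = torch.cat([text_repr, graph_repr], dim=-1)  # simple fusion
        return self.classifier(fused)

def triple_to_sentence(head: str, relation: str, tail: str) -> str:
    # TRIPLESTRING-style templating: render a retrieved KG triple as plain
    # text that can be appended to the model's text input.
    return f"{head} {relation.replace('_', ' ')} {tail}."
```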
“…In contrast to our work, they do not explicitly impose graph-structured knowledge on models, but limit its potential to transforming word embeddings into concept embeddings. Some other recent attempts (Zhong et al., 2018; Wang et al., 2019c) to use ConceptNet graph embeddings are adopted and compared in our experiments (§5). Rajani et al. (2019) propose to manually collect more human explanations for correct answers as additional supervision for auxiliary training.…”
Section: Related Work
Citation type: mentioning, confidence: 99%
“…Structured Commonsense Knowledge in Neural Systems: Different approaches have been proposed to extract and integrate external knowledge into neural models for various NLU tasks, such as reading comprehension (RC) (Xu et al., 2017; Mihaylov and Frank, 2018; Weissenborn et al., 2018) and question answering (QA) (Xu et al., 2016; Tandon et al., 2018; Wang et al., 2019). Recently, many works have proposed different ways to extract knowledge from static knowledge graphs (KGs).…”
Section: Related Work
Citation type: mentioning, confidence: 99%