Proceedings of *SEM 2021: The Tenth Joint Conference on Lexical and Computational Semantics
DOI: 10.18653/v1/2021.starsem-1.7
NeuralLog: Natural Language Inference with Joint Neural and Logical Reasoning

Abstract: Deep learning (DL) based language models achieve high performance on various benchmarks for Natural Language Inference (NLI), while symbolic approaches to NLI currently receive less attention. Both approaches (symbolic and DL) have their advantages and weaknesses; however, no existing method combines them in a single system to solve the task of NLI. To merge symbolic and deep learning methods, we propose an inference framework called NeuralLog, which utilizes both a monotonicity-based logical inference engi…
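The abstract's "monotonicity-based logical inference" refers to reasoning over entailment by substituting words in upward- or downward-monotone positions. The sketch below is a minimal, illustrative toy (not the paper's implementation): the lexicon, function names, and the hard-coded polarity of "every" are all assumptions made for demonstration.

```python
# Toy sketch of monotonicity reasoning: "every" is downward monotone in
# its noun argument and upward monotone in its verb argument, so
# "every dog runs" entails "every poodle runs" and "every dog moves".
# The lexicon below is a hypothetical stand-in for a real hyponym resource.

# Toy hyponym relation: child "is a kind of" parent.
HYPONYMS = {"poodle": "dog", "dog": "animal", "run": "move"}

def is_subsumed(a, b):
    """True if a is at or below b in the toy hyponym chain (a ⊑ b)."""
    while a != b:
        if a not in HYPONYMS:
            return False
        a = HYPONYMS[a]
    return True

def entailments(det, noun, verb):
    """Entailed variants of 'DET NOUN VERB' for the quantifier 'every'."""
    results = []
    for word in HYPONYMS:
        # Downward-monotone slot: specialize the noun.
        if word != noun and is_subsumed(word, noun):
            results.append((det, word, verb))
    for word in set(HYPONYMS.values()):
        # Upward-monotone slot: generalize the verb.
        if word != verb and is_subsumed(verb, word):
            results.append((det, noun, word))
    return results
```

For example, `entailments("every", "dog", "run")` yields the specialized-noun variant `("every", "poodle", "run")` and the generalized-verb variant `("every", "dog", "move")`.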


Cited by 18 publications (13 citation statements). References 29 publications (44 reference statements).
“…This parallels the increase in performance we observe from mid-training on semi-synthetic deduction steps, although they do not tackle the full pipeline task. Hu et al. (2020) and Chen et al. (2021b) propose systems which perform single-sentence natural language inference through proof search in the natural logic space. Explicitly using the monotonicity calculus (Icard et al., 2017) and natural logic to generate contrastive examples for semi-synthetic mid-training is a promising future direction that could help to address the monotonicity issues encountered by our step deduction model, as discussed in Section 5.4.…”
Section: Related Work
confidence: 99%
“…To discover potential tasks that can provide essential linguistic information for symbolic inferences, we studied four major logical systems for NLI, all with high accuracy on SICK (Marelli et al. 2014), and several challenge datasets for the NLI task. They include NLI systems based on natural logic (Abzianidze 2020), monotonicity reasoning (Hu et al. 2020; Chen, Gao, and Moss 2021), and theorem proving (Yanaka et al. 2018).…”
Section: Inference Information Probes
confidence: 99%
“…This task probes the graph-based abstract meaning representation for sentences, a type of knowledge found effective in symbolic systems for acquiring paraphrase pairs and selecting correct inference steps (Yanaka et al. 2018; Chen, Gao, and Moss 2021). The task is to construct a semantic graph that captures connections between concepts, modifiers, and relations in a sentence.…”
Section: Semantic Graph Construction (SemGraph)
confidence: 99%
“…Natural logic dates back to the formalisms of Sanchez (1991), but has received more recent treatments and reformulations in (MacCartney and Manning, 2007; Hu and Moss, 2018). Symbolic and hybrid neural/symbolic implementations of the natural logic paradigm have been explored in (Chen et al., 2021; Kalouli et al., 2020; Abzianidze, 2017; Hu et al., 2020).…”
Section: Related Work
confidence: 99%