2021
DOI: 10.1016/j.aiopen.2021.03.001
Neural, symbolic and neural-symbolic reasoning on knowledge graphs

Cited by 50 publications (25 citation statements)
References 22 publications

“…There are six studies in this category, all but one of which leverage relational structure in some manner. A common theme is the use of graph representations and/or GNNs, which aligns with recent research directions proposed by Garcez et al. in [21], as well as Zhang et al. in [24]. We call Level 3 Cooperative, as it is conceptually similar to Reinforcement Learning (RL).…”
Section: NeSy Categories
confidence: 81%
“…Several surveys have already been conducted which cover the overall NeSy landscape going as far back as 2005, and as recently as 2021, so we will not attempt to replicate that here [1, 4, 8, 13–25]. In fact, our understanding of the field is guided by the works of these scholars.…”
Section: Contributions
confidence: 99%
“…However, these works merely target link prediction, but cannot perform CQA like we do, and extensions to target these more general queries are not straightforward. For other less related neural-symbolic methods we refer the reader to (Zhang et al. 2020; Bianchi et al. 2020).…”
Section: Related Work
confidence: 99%
“…On the other hand, embedding methods are essentially tailored towards a form of inductive reasoning: given a number of queries and their answers, they are used to predict answers to other similar queries, but they typically ignore ontological knowledge accompanying KGs. Since large portions of expert knowledge can be conveniently represented using ontologies, the benefits of coupling the ontological reasoning and embedding-based methods for KG completion are evident and have been acknowledged in several works (Bianchi et al. 2020; Zhang et al. 2020; Gutiérrez-Basulto and Schockaert 2018; Kulmanov et al. 2019). However, to the best of our knowledge such coupling for CQA has not been studied so far.…”
Section: Introduction
confidence: 99%
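
The embedding-based, inductive reasoning described in the excerpt above can be illustrated with a minimal sketch (my assumptions, not code from the cited works): a TransE-style scorer that answers a link-prediction query (head, relation, ?) by ranking candidate tail entities by the distance ||h + r - t||. The entities, relation, and dimensionality below are hypothetical, and the embeddings are untrained.

import numpy as np

# Illustrative TransE-style setup (assumed): each entity and relation is a
# dense vector; a triple (h, r, t) is scored by ||h + r - t||, with smaller
# distances meaning more plausible facts.
rng = np.random.default_rng(0)
entities = ["alice", "bob", "carol"]   # hypothetical KG entities
relations = ["knows"]                  # hypothetical relation
dim = 8
ent_emb = {e: rng.normal(size=dim) for e in entities}
rel_emb = {r: rng.normal(size=dim) for r in relations}

def score(head, rel, tail):
    # Lower score = more plausible triple under the TransE assumption h + r ≈ t.
    return float(np.linalg.norm(ent_emb[head] + rel_emb[rel] - ent_emb[tail]))

def predict_tails(head, rel):
    # Rank all entities as candidate answers to the query (head, rel, ?).
    return sorted(entities, key=lambda t: score(head, rel, t))

print(predict_tails("alice", "knows"))  # ranking is arbitrary here, since nothing is trained
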
“…Here, System 2 refers to the System 1/System 2 dual process theory of human reasoning explicated by psychologist and Nobel laureate Daniel Kahneman in his 2011 book "Thinking, Fast and Slow" [77]. AI researchers [6, 51, 87, 96, 104, 152, 164] have drawn many parallels between the characteristics of sub-symbolic and symbolic AI systems and human reasoning with System 1/System 2. Broadly speaking, sub-symbolic (neural, deep-learning) architectures are said to be akin to the fast, intuitive, often biased and/or logically flawed System 1.…”
Section: Introduction
confidence: 99%