2021
DOI: 10.1007/s10994-021-06017-3
Beyond graph neural networks with lifted relational neural networks

Abstract: We introduce a declarative differentiable programming framework, based on the language of Lifted Relational Neural Networks, where small parameterized logic programs are used to encode deep relational learning scenarios through the underlying symmetries. When presented with relational data, such as various forms of graphs, the logic program interpreter dynamically unfolds differentiable computation graphs to be used for the program parameter optimization by standard means. Following from the declarative, relat…
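The abstract's core idea, a lifted rule whose grounding against concrete relational data unfolds a differentiable computation graph, can be illustrated with a rough sketch. This is not the paper's actual LRNN language or interpreter; it assumes a single toy template rule `h(X) :- edge(X, Y), f(Y)` with one shared (lifted) weight `w`, and shows how presenting a concrete graph unfolds a small neural computation per ground atom `h(X)`:

```python
# Illustrative sketch only: a single hypothetical lifted rule
#     h(X) :- edge(X, Y), f(Y).
# with one shared parameter w, unfolded over a concrete graph.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def unfold_h(node, edges, f, w):
    """Ground the rule for X = node: every neighbour Y contributes one
    weighted body grounding; the aggregated sum passes through a
    nonlinearity, yielding one small computation tree per ground atom."""
    contributions = [w * f[y] for (x, y) in edges if x == node]
    return sigmoid(sum(contributions))

# A concrete relational example: a 3-node graph with scalar features f.
edges = [(0, 1), (0, 2), (1, 2)]
f = {0: 0.5, 1: 1.0, 2: -1.0}
w = 2.0  # the single shared (lifted) parameter of the rule

h0 = unfold_h(0, edges, f, w)  # aggregates f(1) and f(2): sigmoid(0) = 0.5
h1 = unfold_h(1, edges, f, w)  # aggregates f(2) only: sigmoid(-2)
print(h0, h1)
```

Because the same weight `w` is reused across all groundings, the unfolded networks share parameters through the underlying symmetry of the rule, which is the sense in which the program is "lifted"; a GNN message-passing layer arises as a special case of such a template.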

Cited by 12 publications (6 citation statements); references 69 publications (66 reference statements).
“…Further related work has introduced, for example, neuro-symbolic approaches that generate textual output in the form of domain-specific languages [46, 47] or logic programs [48, 49], where the description of simple concepts detected in the input examples is the foremost goal [50–52]. Lifted Neural Networks [53] encode deep relational learning to provide insights into complex relations that go beyond the current GNN capabilities. Furthermore, there exist approaches to incorporate logic-based symbolic domain knowledge into GNNs, with the goal to improve their performance [42].…”
Section: Introduction
confidence: 99%
“…The proposed method itself also heavily relies on the completeness of the used theory, in this case being the functional and phenotypical annotations of the S. cerevisiae genome. Potential remedies, while still retaining some training efficiency, could be the use of computationally expensive feature selection techniques or even lifted methods such as LRNNs (Lifted Relational Neural Networks) (Šourek et al. 2021).…”
Section: Discussion
confidence: 99%
“…The great algebraic variety of the t-norm theory has allowed identifying parameterized (i.e. weighted) classes of t-norms [100, 89] that are very close to standard neural computation patterns (e.g. ReLU or sigmoidal layers).…”
Section: nn(nn_burglary, [B]) :- burglary(B), nn(nn_earthquake, [E]…
confidence: 99%
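The t-norm/ReLU correspondence mentioned in this citation statement can be sketched concretely. The specific weighted classes of [100, 89] are not reproduced here; as one standard instance, the Łukasiewicz t-norm T(a, b) = max(0, a + b − 1) is exactly a ReLU unit with unit weights and bias −1, and a hypothetical weighted generalization (clipped back into [0, 1]) is an ordinary ReLU neuron:

```python
# Sketch under assumed notation: truth values live in [0, 1].
def relu(z):
    return max(0.0, z)

def lukasiewicz_tnorm(a, b):
    # classic (unweighted) Lukasiewicz t-norm
    return max(0.0, a + b - 1.0)

def weighted_tnorm(inputs, weights, bias):
    # hypothetical weighted class: a ReLU neuron clipped into [0, 1]
    return min(1.0, relu(sum(w * x for w, x in zip(weights, inputs)) + bias))

a, b = 0.75, 0.75
assert lukasiewicz_tnorm(a, b) == relu(a + b - 1.0)  # identical by definition
print(lukasiewicz_tnorm(a, b))                        # 0.5
print(weighted_tnorm([a, b], [1.0, 1.0], -1.0))       # same value: 0.5
```

With unit weights and bias −1 the two functions coincide, which is the sense in which weighted t-norm classes "are very close to standard neural computation patterns".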