2021
DOI: 10.1007/s10994-021-06090-8

Inclusion of domain-knowledge into GNNs using mode-directed inverse entailment

Abstract: We present a general technique for constructing Graph Neural Networks (GNNs) capable of using multi-relational domain knowledge. The technique is based on mode-directed inverse entailment (MDIE) developed in Inductive Logic Programming (ILP). Given a data instance e and background knowledge B, MDIE identifies a most-specific logical formula ⊥_B(e) that contains all the relational information in B that is related to e. We represent ⊥_B(e) by a "bottom-graph" that can be converted into a form suitable for GNN …
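The construction described in the abstract can be illustrated with a small sketch. The Python example below is purely illustrative and rests on assumptions: each ground literal of ⊥_B(e) becomes a vertex, two vertices are joined whenever their literals share a term, and node features are one-hot encodings of predicate symbols. The example literals and this particular encoding are not taken from the paper.

```python
from itertools import combinations

# Hypothetical bottom clause for an instance e: ground literals
# (predicate, args) drawn from background knowledge B that mention e.
# These literals are made up for illustration only.
bottom_clause = [
    ("atom", ("m1", "a1", "carbon")),
    ("atom", ("m1", "a2", "oxygen")),
    ("bond", ("m1", "a1", "a2", "double")),
]

def bottom_graph(literals):
    """Build a simple 'bottom-graph': one vertex per literal, an edge
    between two vertices whenever their literals share a term."""
    nodes = list(range(len(literals)))
    edges = [(i, j) for i, j in combinations(nodes, 2)
             if set(literals[i][1]) & set(literals[j][1])]
    return nodes, edges

def node_features(literals):
    """One-hot encode each vertex's predicate symbol so the graph can be
    handed to a standard message-passing GNN."""
    preds = sorted({p for p, _ in literals})
    index = {p: k for k, p in enumerate(preds)}
    return [[1.0 if index[p] == k else 0.0 for k in range(len(preds))]
            for p, _ in literals]

nodes, edges = bottom_graph(bottom_clause)
feats = node_features(bottom_clause)
print(nodes, edges, feats)
```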

Cited by 12 publications (21 citation statements)
References: 38 publications

“…This suggests that substantially more experimentation is needed to see if the predictive performance of Simple CRMs can be improved. We note also that the DRM uses substantially more complex features than the Simple CRM (Dash et al., 2022). DRM (500) is a form of MLP called a Deep Relational Machine that has as input Boolean feature-vectors resulting from a stochastic selection of 500 relational features (Dash et al., 2019), and CILP++ is an MLP that has as input Boolean feature-vectors resulting from an exhaustive feature construction technique called Bottom-Clause Propositionalisation. Baseline is the majority class predictor.…”
Section: Additional Results: CRMs as Prediction Machines (mentioning)
confidence: 99%
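For readers unfamiliar with the baselines named in this statement, the following is a rough sketch of the DRM-style setup it describes: an MLP trained on Boolean feature-vectors. Everything here (the random placeholder features, the layer sizes, the use of scikit-learn's MLPClassifier) is an assumption for illustration, not the implementation of Dash et al.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy stand-in for a DRM-style pipeline: each example is described by a
# Boolean vector whose entries record whether a sampled relational feature
# (a first-order clause) holds for the example. The feature evaluations
# below are random placeholders, not real clause evaluations.
rng = np.random.default_rng(0)
n_examples, n_features = 200, 500   # 500 mirrors the "DRM (500)" setting
X = rng.integers(0, 2, size=(n_examples, n_features)).astype(float)
y = rng.integers(0, 2, size=n_examples)

# A small multi-layer perceptron over the Boolean feature-vectors.
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```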
“…There are also higher-level domain-relations that determine the presence of connected, fused structures. Some more details on the background-knowledge can be seen in these recent studies: (Dash et al., 2021, 2022). Dash et al. (2021) is taken as the target model.…”
Section: Trains, Chess (mentioning)
confidence: 99%
“…We first show recent results reported in [10,11]. The experiments reported consider the inclusion of human-selected domain-knowledge for two kinds of deep neural networks (a multi-layer perceptron, or MLP, and a graph neural network, or GNN).…”
Section: Examples from Drug-Design (mentioning)
confidence: 99%