2022
DOI: 10.1016/j.artint.2021.103649
Logic Tensor Networks


Cited by 70 publications
(60 citation statements)
References 51 publications
“…For this reason, a possible future direction would be to design a new post processing stage based on a neural network that draws the line for us, and possibly even go further by filtering with (higher order) axioms from the ontology. Another area of interest for possible future research is integrating existing ontological knowledge directly into the main scene graph generation network, perhaps in the form of a new term in the loss function [14], or through incorporating neurosymbolic propositional and first order logic directly as part of the training process [4].…”
Section: Discussion
confidence: 99%
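The quote above proposes folding ontological knowledge into training as an extra loss term. A minimal sketch of that idea follows; it is an illustrative assumption, not the implementation from [14] or [4] — the function names, the mean aggregator, and the additive `lam`-weighted form are all hypothetical choices.

```python
def axiom_satisfaction(truth_values):
    """Aggregate per-axiom fuzzy truth values in [0, 1] into one score.

    A simple mean aggregator is used here; the cited works study
    several differentiable alternatives.
    """
    return sum(truth_values) / len(truth_values)


def total_loss(task_loss, axiom_truths, lam=0.1):
    """Task loss plus a penalty for violated ontological axioms.

    `lam` weights the logic-consistency term; both the name and the
    additive form are illustrative assumptions.
    """
    return task_loss + lam * (1.0 - axiom_satisfaction(axiom_truths))


# Example: a batch where two axioms are well satisfied and one is not.
loss = total_loss(task_loss=0.42, axiom_truths=[0.9, 0.95, 0.2], lam=0.1)
```

The point of the sketch is only the shape of the objective: the logic term is differentiable, so gradient descent on `total_loss` pushes the network toward both task accuracy and axiom satisfaction.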
“…Another aspect related to scalability is the choice of aggregation function and fuzzy logic operators. Emilie van Krieken et al [14] and Samy Badreddine [4] found substantial differences between differential fuzzy logic operators in terms of computational efficiency, scalability, gradients, and ability to handle exceptions, which are important characteristics in a learning setting. Their analysis lays the groundwork for the present FasterLTN architecture, which incorporates and extends the log-product aggregator analyzed in [14].…”
Section: Related Work
confidence: 99%
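The scalability contrast the quote draws between fuzzy-logic aggregators can be seen in a toy example. The sketch below compares a plain product t-norm aggregation with a log-product aggregation; it is a simplified illustration of the numerical issue, not the aggregator definitions used in [14], and the `eps` clipping constant is an assumed choice.

```python
import math


def product_aggregator(truths):
    """Plain product t-norm aggregation: underflows toward 0 as the
    number of conjuncts grows, which starves gradients."""
    out = 1.0
    for t in truths:
        out *= t
    return out


def log_product_aggregator(truths, eps=1e-7):
    """Log-product aggregation: sums logs instead of multiplying,
    staying well-scaled for optimization. `eps` clips zero truths so
    the log remains finite (an illustrative choice)."""
    return sum(math.log(max(t, eps)) for t in truths)


truths = [0.9] * 200               # 200 mostly-satisfied conjuncts
p = product_aggregator(truths)     # ~7e-10: numerically tiny
lp = log_product_aggregator(truths)  # ~-21.07: usable magnitude
```

Even with every conjunct at 0.9, the raw product collapses to roughly 1e-9, while the log-product stays in a range where gradients remain informative — the property the quote attributes to the log-product aggregator adopted by Faster-LTN.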
“…We first summarize the Faster R-CNN overall architecture (Section 3.1). Then, we introduce the main concepts behind LTNs (Section 3.2) and their application to object detection (Section 3.3), referring the reader to [3,4] for additional details. Finally, the joint training procedure of Faster-LTN is explained in Section 3.4, highlighting the main changes introduced to make end-to-end training feasible.…”
Section: The Faster-LTN Architecture
confidence: 99%
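The LTN concepts the quote refers to — grounding predicates as differentiable functions and quantifiers as aggregators — can be sketched compactly. This is a hypothetical toy grounding, not the architecture from [3,4]: the predicate name `is_cat`, the linear-plus-sigmoid form, and the mean aggregator for `forall` are all illustrative assumptions (real LTN predicates are neural networks over object embeddings).

```python
import math


def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))


def is_cat(features, w, b):
    """A fuzzy predicate grounded as a linear model plus sigmoid,
    returning a truth value in (0, 1)."""
    return sigmoid(sum(wi * fi for wi, fi in zip(w, features)) + b)


def forall(truths):
    """Universal quantifier grounded as a mean aggregator over a
    domain (one of several differentiable choices)."""
    return sum(truths) / len(truths)


# Truth of "forall x: IsCat(x)" over a tiny domain of feature vectors.
domain = [[1.0, 0.2], [0.8, 0.1], [-0.5, 0.9]]
w, b = [2.0, -1.0], 0.0
sat = forall([is_cat(x, w, b) for x in domain])
```

Because `sat` is differentiable in `w` and `b`, maximizing it trains the predicate to satisfy the axiom — the mechanism that lets an LTN head be trained jointly with a detector such as Faster R-CNN.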
“…There has been other work in differentiable neuro-symbolic systems outside of TPRs and VSAs, most notably using first-order logic [20][21][22]. These works represent another powerful alternative approach, though at this time are often more involved in their design and training procedures.…”
Section: Related Work
confidence: 99%