2021
DOI: 10.1007/s10994-021-06058-8
Learning from interpretation transition using differentiable logic programming semantics

Cited by 6 publications (2 citation statements) | References 34 publications
“…Then P is represented by the matrix M_P ∈ [0,1]^{m×n}. Each element a_{kj} in M_P is defined as follows (Gao et al. 2021):…”
Section: A Matrix Representation of LP
Mentioning, confidence: 99%
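As a rough illustration of the matrix encoding quoted above, here is a minimal Python/NumPy sketch. The toy program, the atom ordering, and the convention that a_{kj} = 1/|body(r_k)| when atom j appears in the body of rule k (and 0 otherwise) are assumptions for illustration only; the exact element definition is the one given in Gao et al. (2021).

```python
import numpy as np

# Toy propositional program; rules, atom order, and the 1/|body|
# weighting are illustrative assumptions, not the cited definition.
atoms = ["p", "q", "r"]                  # n columns, one per atom
rules = [("p", ["q", "r"]),              # p <- q AND r
         ("q", ["r"])]                   # q <- r
atom_idx = {a: j for j, a in enumerate(atoms)}

def program_matrix(rules, atoms):
    """Build M_P in [0,1]^{m x n}; row k encodes the body of rule k."""
    M = np.zeros((len(rules), len(atoms)))
    for k, (_, body) in enumerate(rules):
        for a in body:
            M[k, atom_idx[a]] = 1.0 / len(body)   # a_kj = 1/|body(r_k)|
    return M

print(program_matrix(rules, atoms))
# [[0.  0.5 0.5]
#  [0.  0.  1. ]]
```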
“…To employ KAN to learn the SH matrix of an LP, a differentiable logic semantics used in [?] replaces the logical 'or' operator with the product t-norm (Chapter ??, Gao et al. 2021) and uses the differentiable function ϕ(x − 1) in [?] to replace the θ(x) function. The hyperparameter γ controls the slope similarity between the functions ϕ and θ: ϕ(x) = 1 / (1 + e^(−γx)). After the conversion into propositional form, the relational facts are transformed into pairs of interpretation vectors (v_i, v_o).…”
Mentioning, confidence: 99%
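As a rough illustration of the soft threshold described in the statement above, the sketch below compares a hard threshold θ (assumed here to fire when its argument reaches 1) with the sigmoid surrogate ϕ(x − 1), where ϕ(x) = 1 / (1 + e^(−γx)). The sample values of γ and the definition of θ are assumptions for illustration; the precise functions are those in the cited works.

```python
import numpy as np

def theta(x):
    """Hard threshold of the discrete semantics: fires iff x >= 1
    (assumed form for this sketch)."""
    return (np.asarray(x) >= 1.0).astype(float)

def phi(x, gamma=10.0):
    """Sigmoid surrogate: phi(x - 1) stands in for theta(x).
    Larger gamma makes the slope closer to the hard step."""
    return 1.0 / (1.0 + np.exp(-gamma * np.asarray(x)))

x = np.linspace(0.0, 2.0, 9)
print(theta(x))                               # [0 0 0 0 1 1 1 1 1]
print(np.round(phi(x - 1.0), 3))              # close to the step away from x = 1
print(np.round(phi(x - 1.0, gamma=50.0), 3))  # sharper for larger gamma
```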