2004
DOI: 10.1016/j.jal.2004.03.006

A neural implementation of multi-adjoint logic programming

Abstract: A neural-net-based implementation of propositional [0, 1]-valued multi-adjoint logic programming is presented, which extends earlier work on representing logic programs in neural networks carried out in [A.S. d'Avila Garcez et al., Neural-Symbolic Learning Systems: Foundations and Applications, Springer, 2002; S. Hölldobler et al., Appl. Intelligence 11 (1) (1999) 45-58]. Proofs of preservation of semantics are given, which shows that the extension is well-founded. The implementation needs some preproce…
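For orientation (this background is standard in the multi-adjoint literature rather than quoted from the abstract): the algebraic core of multi-adjoint logic programming is the notion of an adjoint pair, a conjunctor $\&_i$ and an implication $\leftarrow_i$ on the truth-value lattice (here $[0,1]$) linked by the adjoint condition

$$
x \le (z \leftarrow_i y) \iff (x \,\&_i\, y) \le z \qquad \text{for all } x, y, z \in [0,1],
$$

and a single program may mix several such pairs. For example, the product t-norm $x \,\&\, y = xy$ forms an adjoint pair with the Goguen implication $z \leftarrow y = \min(1, z/y)$ (taken as $1$ when $y = 0$).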

Cited by 13 publications (7 citation statements)
References 16 publications
“…Following traditional techniques of logic programming, a procedural semantics was given in [13], in which non-determinism was discarded by using reductants. Various computational approaches have been developed for propositional residuated logic programs: on the one hand, there exists a bottom-up neural-like implementation of the fixed-point semantics which calculates the successive iterations of the immediate consequences operator [14]; on the other hand, a goal-oriented, top-down tabulation procedure has been presented in [15], [16]. In this paper we maintain our interest in the use of tabulation (tabling, or memoizing) methods.…”
Section: Introduction
Mentioning confidence: 99%
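To make the bottom-up fixed-point semantics mentioned in the statement above concrete, here is a minimal Python sketch, not the implementation of [14], that iterates an immediate consequences operator for a toy weighted program over [0, 1] with the product t-norm as the only conjunctor; the program, atom names and weights are invented for illustration.

```python
# Minimal sketch of bottom-up fixed-point computation of T_P for a toy
# weighted program over [0,1]. Each rule is (head, weight, [body atoms]),
# read under the product t-norm:
#   T_P(I)(head) = max over rules of  weight * prod(I(b) for b in body).
# Program and atom names are invented for illustration.

from math import prod

RULES = [
    ("p", 0.9, ["q", "r"]),   # p <-0.9- q & r
    ("q", 0.8, []),           # fact: q with truth value 0.8
    ("r", 0.7, []),           # fact: r with truth value 0.7
    ("p", 0.5, []),           # fact: p with truth value 0.5
]

ATOMS = {head for head, _, _ in RULES} | {b for _, _, body in RULES for b in body}

def tp(interp):
    """One application of the immediate consequences operator T_P."""
    new = {}
    for atom in ATOMS:
        values = [w * prod(interp[b] for b in body)
                  for head, w, body in RULES if head == atom]
        new[atom] = max(values, default=0.0)
    return new

# Iterate from the least interpretation until a fixed point is reached;
# for this monotone operator the sequence converges to the least model.
interp = {atom: 0.0 for atom in ATOMS}
while True:
    nxt = tp(interp)
    if all(abs(nxt[a] - interp[a]) < 1e-9 for a in ATOMS):
        break
    interp = nxt

print(interp)  # p = max(0.5, 0.9 * 0.8 * 0.7) = 0.504
```

Because every atom receives its least-model value in the same run, one fixed-point computation answers all ground queries at once, which is the parallelism the neural-like implementation is meant to exploit.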
“…Some such models for the representation of fuzzy logic programs have already been developed (e.g. [HHS04], [MCA04]). When the rules include weights, a connectionist representation has a significant advantage: the neural network may be trained to learn the weights that make the rules best fit the data, so that the problem of extracting symbolic rules from the network is reduced to learning the rule weights.…”
Section: Introduction
Mentioning confidence: 99%
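The learning advantage described in the statement above can be illustrated with a deliberately small sketch (entirely illustrative; none of this code is from [HHS04] or [MCA04]): a single rule whose body atoms are aggregated by a weighted sum, with the weights recovered from data by per-sample gradient descent on a squared-error loss.

```python
# Illustrative sketch: learning the weights of a weighted-sum rule
#   head <- w1*b1 + w2*b2
# from (body values, observed head value) examples by gradient descent.
# Data and the hidden target weights are invented for the example.

import random

random.seed(0)
TRUE_W = (0.3, 0.7)  # hidden weights used to generate the data
data = []
for _ in range(200):
    b1, b2 = random.random(), random.random()
    data.append(((b1, b2), TRUE_W[0] * b1 + TRUE_W[1] * b2))

w1, w2, lr = 0.5, 0.5, 0.1
for _ in range(500):
    for (b1, b2), target in data:
        pred = w1 * b1 + w2 * b2
        err = pred - target
        # gradient of the squared error with respect to each weight
        w1 -= lr * err * b1
        w2 -= lr * err * b2

print(round(w1, 3), round(w2, 3))  # approaches 0.3 and 0.7
```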
“…The recent paradigm of soft computing promotes the use and integration of different approaches for problem solving. The approach presented in [7,9] introduced a hybrid framework for handling uncertainty, expressed in the language of multi-adjoint logic but implemented using ideas from the world of neural networks.…”
Section: Introduction
Mentioning confidence: 99%
“…Several semantics have been proposed for multi-adjoint logic programs but, regarding the implementation, the fixed-point semantics was chosen: given a multi-adjoint logic program P, its meaning (the minimal model) is obtained by iterating the T P operator. At least theoretically, by computing the sequence of iterations of T P one could answer in parallel all the possible queries to P; in order to take advantage of this potential parallelism, a recurrent neural network implementation of T P was introduced in [7], where the truth values belonged to the unit interval and the connectives were the usual t-norms Gödel, product and Łukasiewicz, together with any weighted sum. Later, this neural net was improved in [8] so that it could use any finite ordinal sum of Gödel, product and Łukasiewicz t-norms.…”
Section: Introduction
Mentioning confidence: 99%
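As a rough illustration of the recurrent formulation described in the statement above, the following sketch (invented program; not the construction of [7] or [8]) applies the connectives named in the quote, the Gödel, product and Łukasiewicz t-norms plus a weighted sum, inside one synchronous update of a truth-value vector that is iterated until it stabilizes, the way a recurrent network settles into a stable state.

```python
# Sketch of a recurrent, vectorized T_P update using the connectives
# named above: Goedel (min), product, Lukasiewicz t-norms and a
# weighted sum. The state vector holds one truth value per atom and is
# updated synchronously until it stops changing. Program is invented.

import numpy as np

def godel(xs):       return float(np.min(xs))
def product(xs):     return float(np.prod(xs))
def lukasiewicz(xs): return max(0.0, float(np.sum(xs)) - (len(xs) - 1))
def wsum(weights):
    return lambda xs: float(np.dot(weights, xs))  # weights sum to <= 1

ATOMS = ["p", "q", "r", "s", "t"]
IDX = {a: i for i, a in enumerate(ATOMS)}

# (head, truth degree of the rule, connective, body atoms)
RULES = [
    ("q", 0.8, product, []),                  # fact: q = 0.8
    ("r", 0.9, product, []),                  # fact: r = 0.9
    ("p", 1.0, godel, ["q", "r"]),            # p <- min(q, r)
    ("s", 1.0, lukasiewicz, ["p", "q"]),      # s <- max(0, p + q - 1)
    ("t", 1.0, wsum([0.4, 0.6]), ["p", "r"]), # t <- 0.4*p + 0.6*r
]

def step(state):
    """One synchronous pass: every rule 'neuron' fires once."""
    out = np.zeros_like(state)
    for head, degree, conn, body in RULES:
        body_vals = [state[IDX[b]] for b in body] or [1.0]
        out[IDX[head]] = max(out[IDX[head]], degree * conn(body_vals))
    return out

state = np.zeros(len(ATOMS))
while True:
    nxt = step(state)
    if np.allclose(nxt, state):
        break
    state = nxt

print({a: float(v) for a, v in zip(ATOMS, state.round(3))})
# {'p': 0.8, 'q': 0.8, 'r': 0.9, 's': 0.6, 't': 0.86}
```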