2020
DOI: 10.1016/j.knosys.2020.105972
Interpretable neural networks based on continuous-valued logic and multicriteria decision operators

Cited by 34 publications (20 citation statements)
References 15 publications
“…Our strategy consists of implementing networks based on logical gates, modeled by perceptrons with fixed weights and biases. This hybrid neural model was introduced in [9, 10]. Here, a single perceptron in the NN is activated by so-called squashing activation functions: a differentiable, parametric family of functions that satisfy natural invariance requirements and contain rectified linear units as a particular case [16, 17].…”
Section: Methods
confidence: 99%
“…Preliminary analysis shows that for some applications, it is more advantageous to use different activation functions for different neurons, i.e., to select a family of activation functions instead, and to select the parameters of the activation functions of different neurons during training. Specifically, this was shown for a special family of squashing activation functions that contain rectified linear neurons as a particular case; see, e.g., [2]-[4]. Functions from this family have the form…”
Section: Shall We Go Beyond Rectified Linear Activation Functions?
confidence: 98%
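The family of squashing activations quoted above can be illustrated with a minimal sketch. The parameterization below is an assumption for illustration (the exact form in the cited works may differ): it smoothly approximates the cut function min(max(x, 0), 1), sharpening as the parameter beta grows, so hard piecewise-linear (rectified) behavior appears in the limit.

```python
import math

def softplus(t: float) -> float:
    """Numerically stable ln(1 + e^t)."""
    return max(t, 0.0) + math.log1p(math.exp(-abs(t)))

def squashing(x: float, beta: float = 50.0) -> float:
    """Smooth, differentiable approximation of the cut function min(max(x, 0), 1).

    Illustrative parameterization only: as beta -> infinity this converges to
    the cut function, matching the described limit behavior of the family.
    """
    return (softplus(beta * x) - softplus(beta * (x - 1.0))) / beta
```

For large beta the function is close to 0 below the interval, close to the identity inside it, and close to 1 above it, e.g. `squashing(-1.0) ≈ 0`, `squashing(0.5) ≈ 0.5`, `squashing(2.0) ≈ 1`.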
“…Our strategy consists of implementing networks based on logical gates, modeled by perceptrons with fixed weights and biases. This hybrid neural model was introduced in (Csiszár et al., 2020a). Here, a single perceptron in the NN is activated by so-called squashing activation functions: a differentiable, parametric family of functions that satisfy natural invariance requirements and contain rectified linear units as a particular case (Urenda et al., 2020a; Zeltner et al., 2020).…”
Section: Continuous-valued Logic Multi-criteria Decision Operators An
confidence: 99%
“…Thus, the perceptrons in the neural network's hidden layers can model a threshold-based nilpotent operator (Csiszár et al., 2020a, 2020b): a conjunction, a disjunction, or even an aggregative operator. This means that the weights of the first layer are to be learned, while the hidden layers of the pre-designed neural block work as logical operators with frozen weights and biases.…”
Section: Continuous-valued Logic Multi-criteria Decision Operators An
confidence: 99%
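The frozen-weight logical layer described above can be sketched minimally. Assuming the nilpotent (Łukasiewicz) operators — conjunction max(0, x + y − 1) and disjunction min(1, x + y) — a perceptron with fixed weights (1, 1), a fixed bias, and a cut-style activation realizes them; the names and the hard clipping activation here are illustrative, not the authors' exact construction.

```python
def cut(t: float) -> float:
    """Cut function: clip to [0, 1] (the hard limit of a squashing activation)."""
    return min(max(t, 0.0), 1.0)

def logic_perceptron(x: float, y: float, bias: float) -> float:
    """Perceptron with frozen weights (1, 1) and a fixed bias.

    With bias = -1.0 it models nilpotent (Lukasiewicz) conjunction;
    with bias = 0.0 it models nilpotent disjunction. Illustrative sketch.
    """
    return cut(1.0 * x + 1.0 * y + bias)

def nilpotent_and(x: float, y: float) -> float:
    return logic_perceptron(x, y, bias=-1.0)

def nilpotent_or(x: float, y: float) -> float:
    return logic_perceptron(x, y, bias=0.0)
```

On crisp inputs these reduce to Boolean AND/OR (`nilpotent_and(1.0, 1.0)` is `1.0`, `nilpotent_and(1.0, 0.0)` is `0.0`), while on graded truth values they interpolate, e.g. `nilpotent_and(0.7, 0.6)` is about `0.3`. Freezing such weights is what lets the block act as a fixed, interpretable logical operator while only the preceding layer is trained.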