2021
DOI: 10.12700/aph.18.2.2021.2.2
Towards Fast and Understandable Computations: Which “And”- and “Or”-Operations Can Be Represented by the Fastest (i.e., 1-Layer) Neural Networks? Which Activation Functions Allow Such Representations?

Cited by 7 publications (6 citation statements) | References 0 publications
“…Combining extreme learning machines with the continuous logical background can be a very promising direction towards more interpretable, transparent, and safe machine learning. Supplemental research is also in progress aiming to investigate which "And"- and "Or"-operations can be represented by the fastest (i.e., 1-layer) neural networks, and which activation functions allow such representations [1].…”
Section: Discussion
confidence: 99%
“…Note that the general operator for ν = 1 is conjunctive, for ν = 0 it is disjunctive, and for ν = ν* = f⁻¹(1/2) it is self-dual. As a benefit of using this general operator, a conjunction, a disjunction, and an aggregative operator differ only in one parameter of the general operator in Equation (1). Additionally, the parameter ν has the semantic meaning of the level of expectation: maximal for the conjunction, neutral for the aggregation, and minimal for the disjunction.…”
Section: Nilpotent Logical Systems
confidence: 99%
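The behavior described above can be sketched in code. This is a minimal illustration of a nilpotent general operator with expectation level ν, assuming the identity generator f(t) = t (the generator choice, the clamping helper, and all function names here are illustrative assumptions, not the cited paper's implementation):

```python
def clamp(x):
    """Cut the value back to the unit interval [0, 1] (nilpotent truncation)."""
    return max(0.0, min(1.0, x))

def general_operator(x, y, nu, f=lambda t: t, f_inv=lambda t: t):
    """Nilpotent general operator with expectation level nu.

    With the identity generator f, nu = 1 yields the Lukasiewicz
    conjunction max(0, x + y - 1), nu = 0 the disjunction
    min(1, x + y), and nu = f_inv(1/2) = 0.5 a self-dual aggregation.
    """
    return f_inv(clamp(f(x) + f(y) - f(nu)))
```

This makes the quoted point concrete: the conjunction, disjunction, and aggregative operator differ only in the single parameter ν.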
“…More specifically, "and" neurons are implemented through t-norms, while "or" neurons are implemented using t-conorms (s-norms). In [28], the authors analyzed which versions of fuzzy techniques provide the best approximation to neural data processing and, vice versa, which activation function makes the results of neural processing best describable in fuzzy terms. Interestingly, they concluded that the best activation function in this sense is exactly the ReLU function.…”
Section: Neural Network Based on Nilpotent Fuzzy Logic and MCDM Tools
confidence: 99%
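The connection between ReLU and one-layer fuzzy "and"/"or" neurons can be illustrated with the Łukasiewicz operations, which a single ReLU unit represents exactly (a sketch under that assumption; the function names are illustrative, not from the cited papers):

```python
def relu(x):
    """Rectified linear unit: max(0, x)."""
    return max(0.0, x)

def and_neuron(x, y):
    """Lukasiewicz "and" as one ReLU neuron: max(0, x + y - 1)."""
    return relu(x + y - 1.0)

def or_neuron(x, y):
    """Lukasiewicz "or" as one ReLU neuron:
    1 - max(0, 1 - x - y), which equals min(1, x + y) on [0, 1]."""
    return 1.0 - relu(1.0 - x - y)
```

On Boolean inputs both neurons reproduce the classical truth tables, and on fuzzy inputs in [0, 1] they compute the Łukasiewicz t-norm and t-conorm, matching the quoted conclusion that ReLU is the activation that makes one-layer neural "and"/"or" representations possible.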