Self-organized Operational Neural Networks with Generative Neurons
2021
DOI: 10.1016/j.neunet.2021.02.028


Cited by 59 publications (53 citation statements)
References 26 publications
“…Owing to this, unlike ONNs, the generative neurons inside a Self-ONN layer can be parallelized more efficiently, leading to a considerable reduction in computational complexity and time. The generalized formulations of the forward-propagation through a Self-ONN neuron and back-propagation training of the Self-ONNs are described in [25], [29].…”
Section: II. 1D Self-Organized Operational Neural Network
Citation type: mentioning
confidence: 99%
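The efficiency claim above follows from the form of the generative neuron: its nodal operator is a truncated Maclaurin series, so all Q terms can be evaluated with ordinary, highly parallel convolutions over element-wise powers of the input. The following is a minimal PyTorch sketch of that idea; the class name GenerativeConv2d, the default order Q = 3, and the single fused convolution are illustrative assumptions, not the reference formulation of [25], [29].

import torch
import torch.nn as nn

class GenerativeConv2d(nn.Module):
    """Sketch of a 2D generative-neuron layer.

    Each kernel connection applies a truncated Maclaurin series
    sum_{q=1..Q} w_q * y^q instead of the single linear term of a CNN
    kernel. Stacking the element-wise powers of the input lets every
    order be computed by one ordinary convolution, which is why
    generative neurons parallelize better than ONN neurons, each of
    which needs its own fixed nodal operator.
    """

    def __init__(self, in_channels, out_channels, kernel_size, q_order=3):
        super().__init__()
        self.q_order = q_order
        # One weight bank per power q, fused into a single convolution.
        self.conv = nn.Conv2d(in_channels * q_order, out_channels,
                              kernel_size, padding=kernel_size // 2)

    def forward(self, y):
        # Build [y, y^2, ..., y^Q] along the channel dimension.
        powers = torch.cat([y ** q for q in range(1, self.q_order + 1)], dim=1)
        return self.conv(powers)

# Usage: a tanh keeps inputs in [-1, 1] so the higher powers stay bounded.
layer = GenerativeConv2d(in_channels=3, out_channels=16, kernel_size=3)
x = torch.tanh(torch.randn(2, 3, 32, 32))
print(layer(x).shape)   # torch.Size([2, 16, 32, 32])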
“…Recent studies [19]–[21] have pointed out that CNNs with a homogeneous network configuration based on a first-order neuron model cannot adequately learn problems with a complex and highly nonlinear solution space unless a sufficiently high network depth and complexity (i.e., variants of CNNs) are accommodated. Recently, Self-Organized Operational Neural Networks (Self-ONNs) have been proposed to achieve a high level of heterogeneity, with a self-organized operator optimization capability that maximizes learning performance [25]. The superior regression capability of Self-ONNs in image segmentation, restoration, and denoising was demonstrated in recent studies [25], [29].…”
Section: Introduction
Citation type: mentioning
confidence: 99%
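For context, the contrast the quoted passage draws between first-order neurons and generative neurons can be written compactly. The notation below (Q for the series order, w_q for the learnable coefficients) follows the usual Self-ONN convention and is reproduced here as a reminder, not quoted from the citing paper.

% Nodal operator of a generative neuron: a Q-th order truncated
% Maclaurin series whose coefficients are learned for every kernel element.
\psi(w, y) = \sum_{q=1}^{Q} w_q \, y^{q}
% Setting Q = 1 leaves only the linear term w_1 y, i.e. the first-order
% (convolutional) neuron model that the quoted studies identify as
% limiting for highly nonlinear problems.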
“…To address these limitations, Operational Neural Networks (ONNs) [21]–[24] have recently been proposed as a heterogeneous network model encapsulating distinct nonlinear neurons. ONNs are derived from Generalized Operational Perceptrons (GOPs) [14], [20], which can learn problems where MLPs entirely fail.…”
Section: Introduction
Citation type: mentioning
confidence: 99%
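Since GOPs are mentioned only in passing above, the following minimal NumPy sketch illustrates the neuron model they introduce: a nodal operator applied per connection, a pool operator across connections, then an activation. The specific operator choices here (sinusoid nodal, median pool) are examples of the kind of library entries used in [14], [20], not an authoritative or exhaustive set.

import numpy as np

# Minimal sketch of a Generalized Operational Perceptron (GOP) neuron.
# A GOP replaces the perceptron's fixed multiply-and-sum with a nodal
# operator psi and a pool operator P drawn from an operator library.
NODAL = {
    "mul": lambda w, y: w * y,            # nodal operator of a classical perceptron
    "sine": lambda w, y: np.sin(w * y),   # an example nonlinear nodal operator
}
POOL = {
    "sum": lambda z: z.sum(),             # pool operator of a classical perceptron
    "median": lambda z: np.median(z),     # an example alternative pool operator
}

def gop_neuron(y, w, b, nodal="sine", pool="median", activation=np.tanh):
    """y, w: 1-D arrays of previous-layer outputs and this neuron's weights."""
    z = NODAL[nodal](w, y)                # element-wise nodal operation per connection
    return activation(POOL[pool](z) + b)  # pool across connections, then activate

# With nodal="mul" and pool="sum" this reduces to an ordinary MLP neuron.
y = np.array([0.2, -0.5, 0.9])
w = np.array([0.7, 1.3, -0.4])
print(gop_neuron(y, w, b=0.1))
print(gop_neuron(y, w, b=0.1, nodal="mul", pool="sum"))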
“…Instead, during training, each generative neuron in a Self-ONN can customize the nodal operator, 𝚿, of each kernel connection to maximize the learning performance. This yields a heterogeneity level far beyond that of ONNs, and the traditional “weight optimization” becomes an “operator generation” process, as detailed in [24]. The superior regression capability of Self-ONNs in image segmentation, restoration, and denoising has been demonstrated in recent studies; however, they have not been evaluated on a classification problem.…”
Section: Introduction
Citation type: mentioning
confidence: 99%
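To make the “operator generation” phrasing concrete, here is a small, self-contained PyTorch sketch (the toy target, the two scalar connections, and Q = 3 are chosen purely for illustration): because the trainable parameters are the per-connection Maclaurin coefficients, an ordinary gradient step reshapes each connection's nodal operator 𝚿 rather than merely rescaling a fixed linear kernel.

import torch

torch.manual_seed(0)
Q = 3
coeffs = torch.randn(2, Q, requires_grad=True)   # one coefficient set per connection

def neuron(y):                                   # y: (batch, 2) bounded inputs
    powers = torch.stack([y ** q for q in range(1, Q + 1)], dim=-1)  # (batch, 2, Q)
    return (powers * coeffs).sum(dim=(1, 2))     # sum over connections and orders

y = torch.tanh(torch.randn(64, 2))
target = torch.sin(3.0 * y[:, 0]) * y[:, 1]      # a toy nonlinear target
opt = torch.optim.SGD([coeffs], lr=0.1)

for _ in range(200):
    opt.zero_grad()
    loss = ((neuron(y) - target) ** 2).mean()
    loss.backward()
    opt.step()                                   # updates the operators themselves

# Each row of `coeffs` now defines a different polynomial nodal operator for
# its connection, i.e. the heterogeneity the quoted statement refers to.
print(coeffs.detach())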