2009
DOI: 10.1007/s00500-009-0502-5

The novel aggregation function-based neuron models in complex domain

Abstract: The computational power of a neuron lies in the spatial grouping of synapses belonging to any dendrite tree. Attempts to give a mathematical representation to the grouping process of synapses continue to be a fascinating field of work for researchers in the neural network community. In the literature, we generally find neuron models that comprise a summation, radial basis, or product aggregation function as the basic unit of a feed-forward multilayer neural network. All these models and their corresponding networks…

Cited by 24 publications (11 citation statements).
References 17 publications (19 reference statements).
“…To support analysis in the complex domain (second-generation neural networks), various representation methods have been developed [1,3,5,6,8,10,12,14]. Further, various computational learning algorithms have been fully developed and introduced by many researchers [15,16,20,21,27,29].…”
Section: Results
confidence: 99%
“…Benvenuto and Piazza (1992) [8] published a different version of CBP that extends the real activation function to the complex domain. From 2009 to 2012, Tripathi carried out extensive studies in the field of novel complex-valued neural computing [10]. A. Hirose and S. Yoshida.…”
Section: Introduction
confidence: 99%
“…The higher the complexity of an ANN, the more computation- and memory-intensive it can be. The number of neurons to be used in an ANN is a function of the mapping or classifying power of the neuron itself [7,8]. Therefore, in the case of high-dimensional problems, it is imperative to look for a higher-dimensional neuron model that can directly process high-dimensional information.…”
Section: D Vector-valued Neuron
confidence: 99%
“…The weights and threshold values used in complex neural networks are all complex numbers, and the split-type output function F_C of a complex-valued neuron is defined as

F_C(z) = F_R(x) + i F_R(y)   … Eq. 1

where z = x + iy is the complex-valued net input to the neuron, i denotes the imaginary unit, and F_R(u) = 1/(1 + exp(-u)); that is, the real and imaginary parts of the complex-valued output of the neuron are the sigmoid functions of the real part x and the imaginary part y of the net input z, respectively [7]. Using the sigmoidal activation function separately for the real and imaginary parts ensures that the magnitudes of the real and imaginary parts of F_C(z) are bounded, but the function is no longer holomorphic (analytic), because the Cauchy-Riemann equations do not hold; in the theory of complex-valued functions, Liouville's theorem states that a complex analytic function cannot be bounded on all of the complex plane unless it is a constant function.…”
Section: Machine Learning Through Complex-valued Neural Network
confidence: 99%
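The split-type activation described in the statement above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the cited work; the weight, input, and threshold values are made up for demonstration:

```python
import numpy as np

def split_sigmoid(z):
    """Split-type complex activation F_C(z) = F_R(x) + i*F_R(y):
    the real sigmoid F_R(u) = 1/(1 + exp(-u)) is applied separately
    to the real part x and imaginary part y of z = x + iy."""
    sigmoid = lambda u: 1.0 / (1.0 + np.exp(-u))
    return sigmoid(np.real(z)) + 1j * sigmoid(np.imag(z))

# A single complex-valued neuron (illustrative values only)
w = np.array([0.5 - 0.2j, -0.3 + 0.8j])   # complex weights
x = np.array([1.0 + 1.0j, 0.5 - 0.5j])    # complex inputs
theta = 0.1 + 0.1j                         # complex threshold
z = np.dot(w, x) + theta                   # complex net input
y = split_sigmoid(z)
# Real and imaginary parts of y each lie in (0, 1), i.e. the output
# is bounded, even though split_sigmoid is not holomorphic.
print(y)
```

Boundedness of both components is exactly why this split construction is used instead of a fully analytic activation, which by Liouville's theorem could not be bounded on the whole complex plane unless constant.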