2018
DOI: 10.1016/j.ins.2017.09.057
Introducing quaternion multi-valued neural networks with numerical examples

Cited by 32 publications (15 citation statements)
References 28 publications
“…Because training an ELM is formulated as a least-squares problem, we provided tools for training general hypercomplex-valued ELM models using real linear algebra operations. Moreover, the real-valued formulation given by (31) depends almost solely on the multiplication table of the underlying hypercomplex algebra. Because its multiplication table uniquely determines a hypercomplex algebra, the definitions presented in this work allow for the implementation of ELMs in any hypercomplex algebra.…”
Section: Discussion
confidence: 99%
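The least-squares idea quoted above can be made concrete for the quaternion case. The sketch below (my own toy illustration, not the cited authors' code) maps a quaternion to the 4×4 real matrix of left multiplication, so a quaternion equation q·x = t becomes an ordinary real least-squares problem:

```python
import numpy as np

def phi_L(q):
    """Real 4x4 matrix of left multiplication by quaternion q = (a, b, c, d)."""
    a, b, c, d = q
    return np.array([[a, -b, -c, -d],
                     [b,  a, -d,  c],
                     [c,  d,  a, -b],
                     [d, -c,  b,  a]])

def qmul(q, p):
    """Direct Hamilton product of quaternions q and p."""
    a, b, c, d = q
    e, f, g, h = p
    return np.array([a*e - b*f - c*g - d*h,
                     a*f + b*e + c*h - d*g,
                     a*g - b*h + c*e + d*f,
                     a*h + b*g - c*f + d*e])

q = np.array([1.0, 2.0, 3.0, 4.0])
p = np.array([0.5, -1.0, 2.0, 0.0])

# Left multiplication by q is real-linear, so phi_L(q) @ p equals q * p.
assert np.allclose(phi_L(q) @ p, qmul(q, p))

# The quaternion problem  min ||q*x - t||  becomes a real least-squares solve:
t = np.array([1.0, 0.0, 0.0, 1.0])
x, *_ = np.linalg.lstsq(phi_L(q), t, rcond=None)
assert np.allclose(phi_L(q) @ x, t)  # q is nonzero, hence invertible
```

The entries of phi_L are read directly off the quaternion multiplication table, which is the sense in which the real-valued formulation "depends almost solely on the multiplication table" of the algebra.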
“…With regards to computational complexity, the hypercomplex-valued ELMs are much more time demanding than their real counterpart. The computational burden is mainly due to the transformations Φ_L and Φ_R required in (16), (19), and (31). For example, in our experiment, the training phase is carried out in 3.25 s and 283.76 s for the real and hypercomplex-valued models, respectively.…”
Section: Color Image Auto-encoding
confidence: 99%
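The cost of those transformations comes from replacing every quaternion entry with a 4×4 real block. The following sketch is my own illustration of the general shape of such a lifting (the name Phi_L and its exact definition are assumptions, not the paper's code); it shows how an m×n quaternion matrix grows into a 4m×4n real one:

```python
import numpy as np

def block(a, b, c, d):
    """4x4 real block for one quaternion entry a + bi + cj + dk."""
    return np.array([[a, -b, -c, -d],
                     [b,  a, -d,  c],
                     [c,  d,  a, -b],
                     [d, -c,  b,  a]])

def Phi_L(A):
    """Lift an (m, n, 4) array of quaternion entries to a 4m x 4n real matrix."""
    m, n, _ = A.shape
    out = np.zeros((4 * m, 4 * n))
    for i in range(m):
        for j in range(n):
            out[4*i:4*i+4, 4*j:4*j+4] = block(*A[i, j])
    return out

A = np.random.default_rng(0).standard_normal((100, 50, 4))
R = Phi_L(A)
print(R.shape)  # (400, 200): each quaternion entry became a 4x4 real block
```

Each quaternion entry holds 4 reals but expands into a 16-real block, so the lifted problem stores 4× the data and its least-squares solve works on matrices with 4× the rows and columns, which is consistent with the large training-time gap reported above.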
“…On the other hand, since quaternion-valued neural networks are superior to real-valued and complex-valued neural networks in processing high-dimensional data and spatial transforms, in recent years the study of quaternion-valued neural networks has become a hot issue. However, due to the noncommutativity of quaternion multiplication, the main method for studying the dynamical behavior of quaternion-valued neural network systems is to decompose them into real-valued or complex-valued systems.…”
Section: Introduction
confidence: 99%
“…When it comes to 3-D and 4-D data sources, such as measurements from seismometers, ultrasonic anemometers, and inertial body sensors, quaternions in the quaternion domain H have inherent advantages over real vectors in representing 3-D and 4-D data, owing to their natural ability to encode cross-channel correlation and to model rotation and orientation accurately [14]. To take advantage of the quaternion representation, a number of quaternion-valued models have been proposed, such as the quaternion LMS algorithm [15], the quaternion nonlinear adaptive filter [16], the quaternion Kalman filter [17], the quaternion independent component analysis algorithm [18], the quaternion support vector machine [19], quaternion multi-valued neural networks [20], and the quaternion ELM (QELM) [21]. Analogous to the complex case, augmented quaternion statistics reveal that the covariance matrix is not adequate to capture the full second-order statistics of a quaternion vector [22], [23], which makes Quaternion Widely Linear (QWL) processing the optimal linear processing for general quaternion-valued signals.…”
Section: Introduction
confidence: 99%
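The rotation-modeling advantage this excerpt attributes to quaternions can be shown in a few lines. The sketch below uses the standard Hamilton product and the sandwich rotation q v q* (textbook formulas, not code from any of the cited works):

```python
import numpy as np

def qmul(q, p):
    """Hamilton product of quaternions q = (a, b, c, d) and p = (e, f, g, h)."""
    a, b, c, d = q
    e, f, g, h = p
    return np.array([a*e - b*f - c*g - d*h,
                     a*f + b*e + c*h - d*g,
                     a*g - b*h + c*e + d*f,
                     a*h + b*g - c*f + d*e])

def rotate(v, axis, angle):
    """Rotate 3-D vector v by `angle` radians about unit `axis` via q v q*."""
    axis = np.asarray(axis, dtype=float) / np.linalg.norm(axis)
    q = np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    # Embed v as a pure quaternion (0, v), sandwich, then read off the vector part.
    return qmul(qmul(q, np.concatenate([[0.0], v])), q_conj)[1:]

# Rotating the x-axis by 90 degrees about z yields the y-axis.
out = rotate(np.array([1.0, 0.0, 0.0]), [0, 0, 1], np.pi / 2)
```

A single unit quaternion thus encodes both the rotation axis and angle, and the three imaginary components carry the three data channels jointly, which is the cross-channel coupling the excerpt refers to.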