2016
DOI: 10.1002/tee.22394
Hyperbolic Hopfield neural networks with four‐state neurons

Abstract: In recent years, applications of neural networks with Clifford algebra have become widespread. Clifford algebra is also referred to as geometric algebra and is useful for dealing with geometric objects. Hopfield neural networks with Clifford algebras, such as complex numbers and quaternions, have been proposed. However, it has been difficult to construct Hopfield neural networks from Clifford algebras whose signature has a positive part, such as hyperbolic numbers. Hyperbolic numbers are a useful algebra for dealing with…

Cited by 11 publications (11 citation statements). References 38 publications.
“…The stability condition for HHNNs is symmetric connections w_ij = w_ji. Therefore, we obtain u_ij = u_ji and v_ij = v_ji.…”
Section: Hyperbolic Hopfield Neural Network
confidence: 93%
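The symmetric-connection condition quoted above can be checked componentwise. The following is a minimal sketch, assuming each hyperbolic weight is written as w_ij = u_ij + v_ij·j with the unipotent unit j (j² = +1); the function names and example matrices are illustrative and not taken from the cited papers.

```python
import numpy as np

# Minimal sketch (not the cited papers' exact formulation): a hyperbolic number
# w = u + v*j with unipotent unit j (j**2 = +1) is stored as the pair (u, v).
def hyp_mul(a, b):
    """Multiply two hyperbolic numbers a = (u1, v1) and b = (u2, v2)."""
    u1, v1 = a
    u2, v2 = b
    # (u1 + v1 j)(u2 + v2 j) = (u1*u2 + v1*v2) + (u1*v2 + v1*u2) j
    return (u1 * u2 + v1 * v2, u1 * v2 + v1 * u2)

def is_symmetric(U, V, tol=1e-12):
    """Check the quoted condition w_ij = w_ji, i.e. u_ij = u_ji and v_ij = v_ji."""
    return np.allclose(U, U.T, atol=tol) and np.allclose(V, V.T, atol=tol)

# Illustrative 3-neuron weight matrix with entries w_ij = U[i, j] + V[i, j]*j.
U = np.array([[ 0.0, 0.5, -0.2],
              [ 0.5, 0.0,  0.8],
              [-0.2, 0.8,  0.0]])
V = np.array([[ 0.0,  0.1,  0.3],
              [ 0.1,  0.0, -0.4],
              [ 0.3, -0.4,  0.0]])
print(hyp_mul((1.0, 2.0), (3.0, 1.0)))  # -> (5.0, 7.0)
print(is_symmetric(U, V))               # -> True
```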
“…Kobayashi proposed several models of hyperbolic HNN (HHNN), i.e. multistate and split HHNNs. The stability condition for the CHNNs and multistate HHNNs is that the mutual connections are conjugate.…”
Section: Introduction
confidence: 99%
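For contrast, the conjugate-connection condition mentioned for CHNNs and multistate HHNNs means w_ji = conj(w_ij), i.e. the weight matrix is Hermitian. A minimal illustrative sketch (names and values are assumptions, not from the cited works):

```python
import numpy as np

# Illustrative sketch: "mutual connections are conjugate" means w_ji = conj(w_ij),
# i.e. the complex-valued weight matrix is Hermitian.
def is_conjugate_symmetric(W, tol=1e-12):
    return np.allclose(W, W.conj().T, atol=tol)

W = np.array([[0.0,    1 + 2j],
              [1 - 2j, 0.0   ]])
print(is_conjugate_symmetric(W))  # -> True
```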
“…We also introduce a broad family of hypercomplex-valued activation functions and provide an important theorem concerning the stability (in the sense of Lyapunov) of discrete-time hypercomplex-valued Hopfield-type neural networks. In fact, the theorem presented in this paper can be applied to the stability analysis of many discrete-time HHNNs from the literature, including complex-valued (Jankowski et al., 1996; Zhou & Zurada, 2014; Kobayashi, 2017b), hyperbolic-valued (Kobayashi, 2013, 2016b), dual-numbered (Kobayashi, 2018a), tessarine-valued (Isokawa et al., 2010; Kobayashi, 2018c), quaternion-valued (Isokawa et al., 2008a; Valle & de Castro, 2018), and octonion-valued Hopfield neural networks (de Castro & Valle, 2018a).…”
Section: Contributions and Organization of the Paper
confidence: 99%
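Such Lyapunov-type results are typically built around a real-valued energy that does not increase under asynchronous updates. Below is a sketch of one assumed standard energy, written for the complex-valued case for brevity; it is not necessarily the exact functional of the cited paper.

```python
import numpy as np

# Assumed standard Lyapunov energy for the complex-valued case (illustrative):
#   E(x) = -1/2 * Re(x^H W x),  with W Hermitian and zero diagonal.
def energy(W, x):
    return -0.5 * np.real(np.vdot(x, W @ x))

W = np.array([[0.0,    1 - 1j],
              [1 + 1j, 0.0   ]])                 # Hermitian, zero diagonal
x = np.exp(1j * np.array([0.0, np.pi / 2]))      # unit-magnitude neuron states
print(energy(W, x))                              # -> -1.0
```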
“…Neural networks based on hyperbolic numbers have been an active topic of research since the early 2000s [20,21,22]. Hyperbolic-valued Hopfield neural networks, in particular, have been extensively investigated in the last decade by Kobayashi, Kuroe, and collaborators [43,45,81,82,46,47,48]. Hyperbolic-valued Hopfield neural networks are usually synthesized using either Hebbian learning [45,48] or the projection rule [47].…”
Section: Hyperbolic-valued Multistate RCNNs
confidence: 99%
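As a companion to the statement above, here is a minimal Hebbian-learning sketch, written for the complex-valued case for brevity; the hyperbolic-valued rule in the cited works uses the analogous outer-product construction with hyperbolic conjugation. Function and variable names are illustrative assumptions.

```python
import numpy as np

# Minimal Hebbian (outer-product) learning sketch for a complex-valued
# Hopfield network; illustrative only.
def hebbian(patterns):
    """patterns: (P, N) array of unit-magnitude states, one pattern per row."""
    P, N = patterns.shape
    W = patterns.T @ patterns.conj() / N   # w_ij = (1/N) * sum_p x_pi * conj(x_pj)
    np.fill_diagonal(W, 0.0)               # no self-connections
    return W

X = np.exp(1j * np.array([[0.0,       np.pi, np.pi / 2],
                          [np.pi / 2, 0.0,   np.pi    ]]))
W = hebbian(X)
print(np.allclose(W, W.conj().T))  # -> True: Hermitian by construction
```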
“…On the other hand, the projection rule may fail to satisfy the stability conditions imposed on the synaptic weights [47]. Examples of the activation functions employed in hyperbolic-valued Hopfield neural networks include the split-sign function [81] and the directional multistate activation function [46]. In the following, we address the stability of hyperbolic-valued RCNNs with the directional multistate activation function, which corresponds to the csgn function given by (30).…”
Section: Hyperbolic-valued Multistate RCNNs
confidence: 99%
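The directional multistate (complex-signum, csgn) activation mentioned above quantizes the phase of its input to one of K states. The sketch below uses an assumed standard form; the exact definition, e.g. equation (30) of the citing paper, may place the representative state differently.

```python
import numpy as np

# Assumed standard form of a directional multistate ("complex-signum", csgn)
# activation with K states: the argument of z is quantized to one of K
# equally spaced phases on the unit circle.
def csgn(z, K):
    phase = np.angle(z) % (2 * np.pi)
    k = np.floor(phase * K / (2 * np.pi))    # sector index 0 .. K-1
    return np.exp(1j * 2 * np.pi * k / K)    # representative state of the sector

print(csgn(0.3 + 0.9j, K=4))  # -> one of the four states {1, 1j, -1, -1j}
```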