2002
DOI: 10.1109/tcsi.2002.800481

Existence and characterization of limit cycles in nearly symmetric neural networks

Abstract: It is known that additive neural networks with a symmetric interconnection matrix are completely stable, i.e., each trajectory converges toward some equilibrium point. This paper addresses the fundamental question of robustness of complete stability of additive neural networks with respect to small perturbations of the nominal symmetric interconnections. It is shown that in the general case, complete stability is not robust. More precisely, the paper considers a class of neural networks, and gives a necessary …
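The complete-stability property described in the abstract can be sketched numerically. The sketch below uses the standard additive (Hopfield-type) model dx/dt = -x + W·tanh(x); the matrix values, step size, and function names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Additive neural network: dx/dt = -x + W @ tanh(x).
# With a SYMMETRIC interconnection matrix W, every trajectory converges
# to some equilibrium point (complete stability). The values below are
# illustrative only.
W = np.array([[0.5, 0.2],
              [0.2, 0.5]])   # symmetric: W == W.T

def simulate(W, x0, dt=0.01, steps=5000):
    """Forward-Euler integration of the additive network."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + W @ np.tanh(x))
    return x

x_final = simulate(W, [1.0, -0.5])
# Here -I + W is Hurwitz (eigenvalues -0.3 and -0.7), so the origin is
# the unique globally attracting equilibrium and the state decays to it.
print(np.linalg.norm(x_final))
```

Since the spectral norm of W is below 1 and tanh is 1-Lipschitz, this particular choice is a contraction, so the trajectory settles at the origin regardless of the initial condition.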

Cited by 42 publications (8 citation statements) · References 24 publications (66 reference statements)
“…Such nonrobust CNNs are able to display nonconvergent dynamics, including bifurcations, large-size periodic oscillations and even complex attractors, close to symmetry. Analogous results have been singled out for general neural networks with an additive neuron interconnecting structure [Di Marco et al, 2002]. On the other hand, in the CNNs community it is widely believed that nonrobust cases are expected to be exceptional in the class of symmetric CNNs.…”
Section: Introduction
confidence: 55%
“…If we choose N = {V_2}, the quantity E_m defined in Property 1 is given by E_m = 5.2%. It is also seen that for errors in the interconnections satisfying A < E_m = 5.2%, V_2 is not an equilibrium point of (5). Hence, on the basis of Theorem 1, we have that (5) is globally convergent to M = {x*}…”
Section: An Application Example
confidence: 90%
“…Indeed, recent work has shown that there are classes of nominally symmetric additive neural networks for which convergence is not robust with respect to perturbations due to tolerances [4,5]. Namely, even arbitrarily small perturbations cause the birth of large-size non-vanishing oscillations in the long-run behaviour of the trajectories, a highly undesirable situation which makes the networks not useful for solving signal processing tasks.…”
Section: Introduction
confidence: 99%
“…As is well-known, RNNs may exhibit limit cycle behavior [44], a property which can be exploited for designing robot controllers [55,71]. However, a potential drawback of this approach is that if an RNN enters into a limit cycle, i.e., a self-excited periodic oscillation, it can become insensitive or react unpredictably to changes in the input signals; in this sense, external control of the system may be lost.…”
Section: Introduction
confidence: 99%
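The self-excited periodic oscillation mentioned in the last statement can be sketched with the same additive model, dx/dt = -x + W·tanh(x), once the coupling is made asymmetric. The matrix below is an illustrative assumption, not taken from the cited works:

```python
import numpy as np

# Additive model dx/dt = -x + W @ tanh(x) with an ASYMMETRIC coupling:
# the origin becomes an unstable focus, the saturating nonlinearity keeps
# trajectories bounded, and (by Poincare-Bendixson, since the origin is
# the only equilibrium here) the network settles into a limit cycle.
# W is illustrative only.
W = np.array([[2.0, -2.0],
              [2.0,  2.0]])   # asymmetric: W != W.T

def trajectory(W, x0, dt=0.01, steps=10000):
    """Forward-Euler integration, returning the full state history."""
    x = np.array(x0, dtype=float)
    hist = np.empty((steps, len(x)))
    for k in range(steps):
        x = x + dt * (-x + W @ np.tanh(x))
        hist[k] = x
    return hist

hist = trajectory(W, [0.1, 0.0])
tail = hist[-2000:, 0]                    # last 20 time units of x1
sign_changes = np.count_nonzero(np.diff(np.sign(tail)))
print(sign_changes, np.abs(tail).max())  # persistent O(1) oscillation
```

Counting sign changes of one state variable over the tail of the run is a crude but effective check that the oscillation is self-sustained rather than a decaying transient.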