Extended Hamiltonian Learning on Riemannian Manifolds: Theoretical Aspects

2011
DOI: 10.1109/tnn.2011.2109395
Abstract: This paper introduces a general theory of extended Hamiltonian (second-order) learning on Riemannian manifolds, as an instance of learning by constrained criterion optimization. The dynamical learning equations are derived within the general framework of the extended-Hamiltonian stationary-action principle and are expressed in a coordinate-free fashion. A theoretical analysis is carried out in order to compare the features of the dynamical learning theory with the features exhibited by the gradient-based ones. In …
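For reference while reading the excerpts below, the second-order learning dynamics of this theory are usually summarized as a damped geodesic flow. The following coordinate-free form is a sketch consistent with the abstract, with mu denoting a generic damping (viscosity) coefficient and V the learning criterion; it is not the paper's exact notation:

```latex
% Damped second-order (extended Hamiltonian) learning flow on a Riemannian
% manifold M with criterion V : M -> R; nabla is the Levi-Civita connection
% and mu > 0 a damping coefficient (sketch, not the paper's exact notation):
\begin{align}
  \dot{x} &= v, \\
  \nabla_{\dot{x}} v &= -\mu \, v - \operatorname{grad}_x V .
\end{align}
```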

Cited by 32 publications (27 citation statements)
References 55 publications
“…To overcome this difficulty, Fiori generalized the EGA and proposed the EHA on manifolds. In particular, on the real symplectic group, the EHA can be expressed by Fiori: …”
Section: Optimization on the Real Symplectic Group (mentioning, confidence: 99%)
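The equation itself is truncated in the excerpt above. As a rough illustration of what a discretized second-order update on the real symplectic group can look like, here is a minimal Python sketch; the Lie-algebra projection, damping coefficient mu, step size dt, and exponential retraction are generic assumptions, not the exact EHA formulas of the cited works:

```python
import numpy as np
from scipy.linalg import expm

def sympl_form(n):
    """Standard symplectic form J = [[0, I], [-I, 0]] on R^(2n)."""
    I, Z = np.eye(n), np.zeros((n, n))
    return np.block([[Z, I], [-I, Z]])

def proj_sp(W, J):
    """Projection onto the symplectic Lie algebra sp(2n) = {H : H = J H^T J}."""
    return 0.5 * (W + J @ W.T @ J)

def eha_like_step(A, X, egrad, J, mu=0.5, dt=0.05):
    """One damped second-order step on Sp(2n): velocity update with
    friction mu, then a retraction along a one-parameter subgroup.
    Illustrative only; not the exact EHA update of the cited papers."""
    G = proj_sp(np.linalg.solve(A, egrad), J)  # pull Euclidean gradient to sp(2n)
    Xn = X + dt * (-mu * X - G)                # damped velocity update
    An = A @ expm(dt * Xn)                     # stays on Sp(2n) since Xn is in sp(2n)
    return An, Xn
```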
“…Eventually, the present research aims at investigating a least-squares problem on the real symplectic group where the difference between 2 points is induced by the geodesic distance. To solve such optimization problem, some effective iterative methods are proposed, like the Euclidean gradient algorithm (EGA) and the extended Hamiltonian algorithm (EHA) developed in Fiori. In this paper, we study the geodesic-based Riemannian-steepest-descent algorithm (RSDA) to solve a least-squares problem and show some simulations to illustrate the efficiency of our algorithm.…”
Section: Introduction (mentioning, confidence: 99%)
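For orientation, a geodesic-based Riemannian steepest-descent iteration follows the standard template below, where exp is the Riemannian exponential map and eta_k a step size; the specific step-size rule of the RSDA in the excerpt is not reproduced here:

```latex
% Generic geodesic-based steepest descent on a Riemannian manifold
% (template form; eta_k is a step size, exp the Riemannian exponential map):
x_{k+1} = \exp_{x_k}\!\bigl( -\eta_k \, \operatorname{grad} f(x_k) \bigr)
```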
“…References [16,17] introduced a general theory of extended Hamiltonian (second-order) learning on Riemannian manifolds, especially as an instance of learning by constrained criterion function optimization on the matrix manifolds. For the Lorentz group, the extended Hamiltonian algorithm can be expressed by Ȧ = X, …”
Section: Extended Hamiltonian Algorithm on the Lorentz Group (mentioning, confidence: 99%)
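The excerpt truncates after the first equation of the pair. The same generic scheme as in the symplectic sketch above carries over, with only the Lie-algebra projection changed; the following Python sketch is again an illustrative assumption (Minkowski signature, mu, dt) rather than the cited paper's exact rule:

```python
import numpy as np
from scipy.linalg import expm

ETA = np.diag([-1.0, 1.0, 1.0, 1.0])  # Minkowski form defining the Lorentz group O(1,3)

def proj_lorentz(W):
    """Projection onto the Lorentz algebra {X : X^T ETA + ETA X = 0}."""
    return 0.5 * (W - ETA @ W.T @ ETA)

def eha_like_step_lorentz(A, X, egrad, mu=0.5, dt=0.05):
    """Damped second-order step mirroring 'Adot = X' (written here in
    left-trivialized form, Adot = A X); illustrative only."""
    G = proj_lorentz(np.linalg.solve(A, egrad))  # gradient pulled to the algebra
    Xn = X + dt * (-mu * X - G)                  # velocity update with friction mu
    An = A @ expm(dt * Xn)                       # retraction keeping A in the group
    return An, Xn
```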
“…The considered averaging optimization problem could be solved numerically via a geodesic-based Riemannian-steepest-descent algorithm (RSDA). Furthermore, the devised method RSDA is compared with the line-search algorithm, the Euclidean gradient algorithm (EGA), and the second-order learning algorithm, the extended Hamiltonian algorithm (EHA), proposed in [16,17].…”
Section: Introduction (mentioning, confidence: 99%)
“…It is noticeable that this approach has been studied from different perspectives; for instance, within the optimization framework the discretization of appropriate continuous dynamical systems, whose trajectory minimizes the target, has resulted in a promising "ODE-approach to nonlinear programming" [14,15], as well as machine learning by optimization on Riemannian manifolds [16,17]. Also, promising results have been presented for learning systems endowed with either a Lie-group structure [18] or a pseudo-Riemannian metric [19], and machine learning by dynamical systems on manifolds [20,21]. In Section 4 we present discrete gradient methods [22], which by construction respect the energy-diminishing feature of gradient systems regardless of the step size.…”
Section: Introduction (mentioning, confidence: 99%)
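The energy-diminishing property mentioned in the last excerpt follows from the defining identity of a discrete gradient, V(y) - V(x) = <DG(x, y), y - x>. Below is a minimal Python sketch using Gonzalez's midpoint discrete gradient; this particular construction, the fixed-point solver, and the step size h are illustrative assumptions, since the cited work [22] may use a different discrete gradient:

```python
import numpy as np

def gonzalez_dg(V, gradV, x, y, eps=1e-12):
    """Gonzalez (midpoint) discrete gradient: by construction it satisfies
    V(y) - V(x) = <DG(x, y), y - x> exactly."""
    d = y - x
    gm = gradV(0.5 * (x + y))
    nrm2 = d @ d
    if nrm2 < eps:
        return gm
    return gm + ((V(y) - V(x) - gm @ d) / nrm2) * d

def dg_step(V, gradV, x, h=0.5, iters=50):
    """One discrete-gradient step for the gradient flow xdot = -gradV(x):
    solve y = x - h * DG(x, y) by fixed-point iteration."""
    y = x - h * gradV(x)  # explicit Euler initial guess
    for _ in range(iters):
        y = x - h * gonzalez_dg(V, gradV, x, y)
    return y

# At the fixed point, y - x = -h * DG(x, y), so the defining identity gives
# V(y) - V(x) = -h * ||DG||^2 <= 0 for every h > 0: the energy decreases
# regardless of the step size, exactly the feature the excerpt refers to.
```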