2016
DOI: 10.1007/s10107-016-0992-8

Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity

Abstract: In a real Hilbert space H, we study the fast convergence properties as t → +∞ of the trajectories of the second-order evolution equation ẍ(t) + (α/t)ẋ(t) + ∇Φ(x(t)) = 0, where ∇Φ is the gradient of a convex continuously differentiable function Φ : H → ℝ, and α is a positive parameter. In this inertial system, the viscous damping coefficient α/t vanishes asymptotically in a moderate way. For α > 3, we show that any trajectory converges weakly to a minimizer of Φ, assuming only that argmin Φ ≠ ∅. The strong conver…
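The dynamics described in the abstract can be illustrated numerically. Below is a minimal sketch (not from the paper) that integrates ẍ(t) + (α/t)ẋ(t) + ∇Φ(x(t)) = 0 with a simple semi-implicit Euler scheme for the toy objective Φ(x) = ½‖x‖², whose unique minimizer is 0. The function name `inertial_trajectory`, the choice α = 4 (> 3, as the theorem requires), and the step size and horizon are all illustrative assumptions.

```python
import numpy as np

def inertial_trajectory(grad, x0, alpha=4.0, t0=1.0, dt=1e-3, T=50.0):
    """Integrate x''(t) + (alpha/t) x'(t) + grad(x(t)) = 0 from rest at x0,
    using a simple semi-implicit Euler scheme (an illustrative discretization)."""
    x = np.asarray(x0, dtype=float).copy()
    v = np.zeros_like(x)  # start at rest
    t = t0
    while t < T:
        v += dt * (-(alpha / t) * v - grad(x))  # velocity update with vanishing alpha/t damping
        x += dt * v                             # position update
        t += dt
    return x

# Toy objective Phi(x) = 0.5*||x||^2, so grad Phi(x) = x and argmin Phi = {0}.
x_final = inertial_trajectory(lambda x: x, x0=[2.0, -1.0])
```

For this quadratic the trajectory oscillates with decaying amplitude (roughly t^(−α/2)), so by t = 50 the iterate is already very close to the minimizer.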

Cited by 230 publications (374 citation statements)
References 30 publications
“…Irrespective of the state space selection, under convexity and suitable smoothness assumptions on f, for p ≥ 2 the solutions of (2) minimize the sub-optimality measure f(x₁(t)) [7]. Under further conditions on f it can also be established that x(t) converges to x* [12]. However, while this type of convergence result is instrumental for the understanding of system (3), it does not provide information on the robustness properties of the system under small but persistent time-varying disturbances, which could be of an adversarial nature.…”
Section: On the Uniform Convergence Properties of the Accelerated…
confidence: 99%
“…Theorem 3.1: Suppose that Assumption 3.1 holds and consider the HAND-1. Then the following holds: (a) Every maximal solution is complete and the set A, given by (12), is UGAS. (b) For each δ ∈ ℝ₊ and each compact set K₀ ⊂ ℝ²ⁿ there exist ε* ∈ ℝ₊ and T ∈ ℝ₊ such that for every perturbation e(t) satisfying supₜ |e(t)| ≤ ε* and every initial condition z(0, 0) ∈ K₀ × [T_min, T_max], the solutions of the perturbed dynamics (9) satisfy |z(t, j)|_A ≤ δ for all (t, j) ∈ dom(z) such that t + j ≥ T.…”
Section: A Hybrid Regularization for Radially Unbounded Convex Funct…
confidence: 99%
“…For a deeper discussion of this matter, see e.g. [1,28] as well as [21,22] and the many references therein.…”
Section: The Heavy Ball Perturbation
confidence: 99%
“…Incorporating previous gradient information alleviates zigzagging behavior compared to methods that use only current gradient information. In recent years, authors have demonstrated that many optimization algorithms of this kind can be seen as discretizations of the trajectories of differential equations derived from the field of continuous dynamical systems, e.g., [1,24,28,31]. It has been shown that inertial approaches are very effective and exhibit good convergence properties.…”
Section: Introduction
confidence: 99%
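The discretization viewpoint mentioned in the excerpt above can be sketched concretely. The momentum iteration below is the standard Nesterov-style scheme associated with the α/t damping (the continuous limit studied by Su, Boyd, and Candès); the helper name `nesterov_like` and all parameter values are illustrative choices, not taken from the cited works.

```python
import numpy as np

def nesterov_like(grad, x0, alpha=3.0, step=0.1, iters=500):
    """Momentum iteration obtained by discretizing x'' + (alpha/t) x' + grad Phi(x) = 0:
        y_k     = x_k + (k-1)/(k+alpha-1) * (x_k - x_{k-1})
        x_{k+1} = y_k - step * grad(y_k)
    The step size is assumed to satisfy step <= 1/L for an L-smooth Phi."""
    x_prev = np.asarray(x0, dtype=float).copy()
    x = x_prev.copy()
    for k in range(1, iters + 1):
        y = x + (k - 1) / (k + alpha - 1) * (x - x_prev)  # inertial extrapolation
        x_prev, x = x, y - step * grad(y)                 # gradient step at y
    return x

# Phi(x) = 0.5*x^2 (grad = identity); the iterates approach the minimizer 0.
x_star = nesterov_like(lambda x: x, x0=[5.0])
```

The vanishing momentum coefficient (k−1)/(k+α−1) → 1 mirrors the vanishing viscosity α/t of the continuous system.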
“…The dominant iterative regularization method for solving (1) is arguably the Landweber method, given by f^δ_{k+1} = f^δ_k + Δt K*(y^δ − K f^δ_k), Δt ∈ (0, 2/‖K*K‖), k = 0, 1, 2, … (2) with some starting element f₀ ∈ Q, where K* denotes the adjoint operator of K. The continuous analogue of (2) as Δt tends to zero is known as asymptotic regularization or Showalter's method (see, e.g., [24,25]). It takes the form of a first-order evolution equation…
Section: Introduction
confidence: 99%
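As a concrete illustration of iteration (2) from the excerpt above, here is a minimal sketch of the Landweber method for a matrix operator K. The toy problem and all names are illustrative assumptions, not taken from the cited work.

```python
import numpy as np

def landweber(K, y, steps=500, dt=None):
    """Landweber iteration f_{k+1} = f_k + dt * K^T (y - K f_k),
    with dt chosen inside (0, 2/||K^T K||) to guarantee convergence."""
    if dt is None:
        dt = 1.0 / np.linalg.norm(K.T @ K, 2)  # spectral norm; safely inside (0, 2/||K*K||)
    f = np.zeros(K.shape[1])
    for _ in range(steps):
        f += dt * K.T @ (y - K @ f)  # gradient step on 0.5*||K f - y||^2
    return f

# Toy well-posed problem: exact data for f_true = (1, -2).
K = np.array([[2.0, 0.0], [0.0, 1.0]])
y = K @ np.array([1.0, -2.0])
f_rec = landweber(K, y)
```

With noisy data y^δ the iteration would instead be stopped early (discrepancy principle), since the iteration count acts as the regularization parameter.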