2021 European Control Conference (ECC)
DOI: 10.23919/ecc54610.2021.9654966
Local linear convergence of stochastic higher-order methods for convex optimization

Cited by 2 publications (9 citation statements) | References 12 publications
“…, w_r, and use the same decorrelation procedure as in FastICA. For this stochastic method one can derive the following convergence and descent results (see [11], [12]). Theorem 1: Assume that the functions f_i have Lipschitz second derivatives over the unit ball B.…”
Section: B. Stochastic Second-Order Taylor-Based Methods (mentioning)
confidence: 99%
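For context on the decorrelation step mentioned in this excerpt: in FastICA-style methods, after the directions w_1, …, w_r are updated they are typically re-orthonormalized by symmetric decorrelation, W ← (W Wᵀ)^(−1/2) W, where the rows of W are the w_i. A minimal NumPy sketch of that step (the function name and interface are illustrative assumptions, not taken from the cited papers):

import numpy as np

def symmetric_decorrelation(W):
    # Symmetric decorrelation as used in FastICA: W <- (W W^T)^{-1/2} W,
    # which makes the rows of W (the directions w_1, ..., w_r) orthonormal.
    eigvals, U = np.linalg.eigh(W @ W.T)   # W W^T is symmetric positive definite
    inv_sqrt = U @ np.diag(1.0 / np.sqrt(eigvals)) @ U.T
    return inv_sqrt @ W

After each stochastic second-order update of the individual directions, the w_i would be stacked as the rows of W and passed through this step before the next iteration.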
“…FastICA is one of the most widely used ICA algorithms and is based on full-batch Newton-type iterations [7]. The second algorithm is a stochastic Newton-type method with a proper cubic regularization term that guarantees descent, originally developed in [11].…”
Section: Optimization Algorithms for ICA (mentioning)
confidence: 99%
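As a rough illustration of a cubic-regularized stochastic Newton-type step like the one this excerpt describes (the names, the mini-batch interface, and the gradient-descent subproblem solver below are assumptions for the sketch, not the exact method of [11]):

import numpy as np

def stochastic_cubic_newton_step(w, grad_i, hess_i, batch, M, inner_steps=50, lr=0.1):
    # One Newton-type step with cubic regularization on a sampled mini-batch:
    # approximately minimize m(s) = g^T s + 0.5 s^T H s + (M/6) ||s||^3 over s.
    g = np.mean([grad_i(w, i) for i in batch], axis=0)   # mini-batch gradient
    H = np.mean([hess_i(w, i) for i in batch], axis=0)   # mini-batch Hessian
    s = np.zeros_like(w)
    for _ in range(inner_steps):
        # gradient of the cubic model; (M/2)*||s||*s is the gradient of (M/6)*||s||^3
        model_grad = g + H @ s + 0.5 * M * np.linalg.norm(s) * s
        s -= lr * model_grad
    return w + s

The cubic term is what yields a descent guarantee for the model step when M is chosen large enough; how M is set and how the subproblem is solved are where concrete implementations of such methods differ.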