2016
DOI: 10.48550/arxiv.1611.01146
Preprint
Finding Approximate Local Minima Faster than Gradient Descent

Cited by 7 publications (23 citation statements)
References 0 publications
“…Reference | Oracle | Iterations | Simplicity | Non-stochastic: [1,6] | Hessian-vector product | Õ(log n/… It is worth highlighting that our gradient-descent based algorithm enjoys the following nice features:…”
Section: Setting
confidence: 99%
“…Similarly to Algorithm 1 and Algorithm 2, the renormalization step (Line 6 in Algorithm 3) only guarantees that the value y_t does not scale exponentially during the algorithm, and does not affect … z_0 ← 0; for t = 1, …, T_s do: sample θ^(1), θ^(2), …, θ^(m) ∼ D; …”
Section: C2 Proof Details Of Escaping Saddle Points Using Algorithm
confidence: 99%
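The quoted statement notes that renormalization only keeps the iterate's magnitude from growing exponentially and does not change what the algorithm computes. A minimal sketch of why (not the cited paper's Algorithm 3; the matrix and step count are illustrative assumptions): iterating a fixed linear map with and without per-step normalization yields the same final direction.

```python
import numpy as np

# Illustrative sketch: per-step renormalization of an iterated product
# only controls the magnitude of the iterate; its direction is unchanged.
M = np.array([[2.0, 0.3],
              [0.1, 1.5]])
y = np.array([1.0, 1.0])   # un-normalized iterate (norm grows exponentially)
z = y.copy()               # renormalized iterate

for _ in range(50):
    y = M @ y
    z = M @ z
    z /= np.linalg.norm(z)  # renormalize each step

dir_y = y / np.linalg.norm(y)  # normalize only once, at the end
print(np.allclose(dir_y, z, atol=1e-8))  # True: identical direction
```

Mathematically the two runs apply the same matrices, and normalization commutes with scaling, so the directions coincide exactly; numerically they agree to floating-point precision.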
“…
Reference | Queries | Oracle
[24,56] | O(1/ε^1.5) | Hessian
[1,15] | Õ(log n/ε^1.75) | Hessian-vector product
[42,43] | Õ(log^4 n/ε^2) | Gradient
[44] | Õ(log^6 n/ε^1.75) | Gradient
this work | Õ(log^2 n/ε^1.75) | Quantum evaluation

Table 1: A summary of the state-of-the-art works on finding approximate second-order stationary points using different oracles. The query complexities are highlighted in terms of the dimension n and the error ε.…”
Section: Reference
confidence: 99%
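The table above distinguishes methods by their oracle, and the Hessian-vector-product rows reflect a core idea behind these algorithms: a negative-curvature direction can be extracted from Hessian-vector products alone, without forming the Hessian. A hedged sketch under simplifying assumptions (a quadratic objective whose Hessian is an explicit matrix A, a hand-picked spectral shift, and fixed iteration count; none of this is the papers' actual method): power iteration on shift·I − A returns the eigenvector of A with the most negative eigenvalue.

```python
import numpy as np

def hvp(A, v):
    """Hessian-vector product for the quadratic f(x) = 0.5 * x^T A x,
    whose Hessian is exactly A. In practice this oracle is implemented
    via automatic differentiation, never by materializing A."""
    return A @ v

def most_negative_curvature(A, shift=10.0, iters=500, seed=0):
    """Power iteration on (shift*I - A). If shift exceeds A's largest
    eigenvalue, the top eigenvector of (shift*I - A) is the eigenvector
    of A with the smallest (most negative) eigenvalue."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = shift * v - hvp(A, v)   # (shift*I - A) v: one HVP per step
        v = w / np.linalg.norm(w)
    curvature = v @ hvp(A, v)       # Rayleigh quotient v^T A v
    return v, curvature

A = np.diag([3.0, 1.0, -2.0])       # saddle direction along the last axis
v, lam = most_negative_curvature(A)
print(round(lam, 4))                # ≈ -2.0, the most negative eigenvalue
```

Each iteration costs one Hessian-vector product, which is why the table's HVP-based entries pay only logarithmic factors in the dimension n rather than the cost of forming or factoring an n×n Hessian.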