Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining 2022
DOI: 10.1145/3534678.3539379
Partial-Quasi-Newton Methods

Cited by 2 publications (2 citation statements)
References 19 publications
“…We use the Euclidean norm λ(z) = ‖F(z)‖₂ to measure the convergence of our algorithm. The advantage of block updates in SR-k methods results in faster superlinear convergence than the methods of Liu and Luo [22]. Following the analysis of SR-k methods for convex optimization, we obtain the results for solving nonlinear equations as follows.…”
Section: A. The Proofs in Section …
Citation type: mentioning (confidence: 98%)
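The quoted passage pairs a quasi-Newton iteration for nonlinear equations with the stopping measure λ(z) = ‖F(z)‖₂. Below is a minimal sketch of that template using the classical Broyden rank-1 update, not the cited paper's block SR-k update; the function names and the test system are illustrative assumptions.

```python
import numpy as np

def broyden_solve(F, z0, tol=1e-10, max_iter=100):
    """Solve F(z) = 0 with Broyden's rank-1 ("good") update.

    Convergence is measured by lambda(z) = ||F(z)||_2, matching the
    quoted passage. Note: this is the classical Broyden method, not
    the cited paper's block SR-k update; it only shows the template.
    """
    z = np.asarray(z0, dtype=float)
    B = np.eye(z.size)                    # Jacobian approximation
    Fz = F(z)
    for _ in range(max_iter):
        if np.linalg.norm(Fz) <= tol:     # lambda(z) = ||F(z)||_2
            break
        s = np.linalg.solve(B, -Fz)      # quasi-Newton step: B s = -F(z)
        z_new = z + s
        F_new = F(z_new)
        y = F_new - Fz
        B += np.outer(y - B @ s, s) / (s @ s)   # rank-1 secant correction
        z, Fz = z_new, F_new
    return z

# Illustrative system (an assumption, not from the paper):
# F(z) = (z0^2 + z1 - 3, z0 + z1^2 - 5)
root = broyden_solve(lambda z: np.array([z[0]**2 + z[1] - 3.0,
                                         z[0] + z[1]**2 - 5.0]),
                     z0=[1.0, 1.0])
```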
“…where f : ℝᵈ → ℝ is smooth and strongly self-concordant. Quasi-Newton methods [2,3,4,6,8,33,36] are widely recognized for their fast convergence rates and efficient updates, and have attracted growing attention in many fields such as statistics [1,16,37], economics [20,24], and machine learning [12,15,19,22,23]. Unlike standard Newton methods, which need to compute the Hessian and its inverse, quasi-Newton methods follow the descent direction given by the following scheme…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
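The quoted scheme is truncated in the citation snippet; the generic template it refers to is x_{t+1} = x_t − H_t ∇f(x_t), where H_t approximates the inverse Hessian. Below is a minimal sketch using the standard BFGS inverse-Hessian update, assuming a unit step size and an illustrative quadratic test problem; all names are assumptions, not the cited paper's code.

```python
import numpy as np

def quasi_newton(grad, x0, tol=1e-8, max_iter=200):
    """Generic quasi-Newton iteration x <- x - H @ grad(x).

    H approximates the inverse Hessian and is refreshed by the
    standard BFGS rank-2 update, so neither the Hessian nor its
    inverse is ever computed explicitly.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - H @ g                 # descent step along -H grad(x)
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        rho = 1.0 / (y @ s)               # assumes curvature y's > 0
        I = np.eye(n)
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)        # BFGS inverse-Hessian update
        x, g = x_new, g_new
    return x

# Illustrative strongly convex quadratic (an assumption):
# f(x) = 0.5 x^T A x - b^T x, so grad f(x) = A x - b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_min = quasi_newton(lambda x: A @ x - b, x0=np.zeros(2))
```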