2017
DOI: 10.48550/arxiv.1705.03952
Preprint

Superlinearly Convergent Asynchronous Distributed Network Newton Method

Cited by 4 publications (10 citation statements). References 22 publications.
“…One example of local but not global strong convexity is the objective function of logistic regression [1], i.e., [29] is quasi-Newton and therefore allows each f_i to be a general convex function, yet it requires, like other second-order algorithms in [25], [27], [28], [30]-[33], each ∇f_i to be (globally) Lipschitz continuous, which is unnecessary for problem (1) under Assumption 1. Moreover, the local Lipschitz continuity of ∇²f_i in Assumption 1 is weaker than the three times continuous differentiability of f_i in [33] and the (global) Lipschitz continuity of ∇²f_i in [25], [26], [30], [31].…”
Section: B. In-Network Optimization
confidence: 99%
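To make the local-but-not-global strong convexity example concrete, the unregularized logistic loss can be written as below; the notation (samples a_j, labels b_j, sigmoid σ) is generic and not taken from the cited papers.

```latex
% Unregularized logistic loss over samples (a_j, b_j) with labels b_j \in \{-1,+1\}:
f(x) \;=\; \sum_{j=1}^{m} \log\!\bigl(1 + e^{-b_j a_j^{\top} x}\bigr),
\qquad
\nabla^2 f(x) \;=\; \sum_{j=1}^{m} \sigma(b_j a_j^{\top} x)\bigl(1-\sigma(b_j a_j^{\top} x)\bigr)\, a_j a_j^{\top},
\qquad \sigma(t) = \frac{1}{1+e^{-t}}.
% Each weight lies in (0, 1/4] and vanishes as |a_j^{\top} x| \to \infty, so no single
% constant \mu > 0 lower-bounds the Hessian everywhere (no global strong convexity),
% while on any compact set the Hessian stays uniformly positive definite whenever the
% a_j span the whole space (local strong convexity).
```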
“…Comparison in convergence rates: We compare the convergence rate results of DEAN and the existing decentralized second-order methods [25]-[33]. Like the inexact second-order methods D-BFGS [29], NN [30], ANN [31], and DQN [32], the particular DEAN in Theorem 3 linearly converges to a suboptimal solution. We also provide an explicit error bound for this suboptimal solution, which is near zero when the step-sizes are very small, whereas [29]-[32] do not offer such bounds.…”
Section: Convergence Analysis
confidence: 99%
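Results of the "linear convergence to a suboptimal solution" type quoted above can be summarized schematically as follows; C, ρ, and e(α) are generic placeholders rather than the cited papers' actual constants, and the comparison made in the excerpt concerns whether the residual e(α) is characterized explicitly.

```latex
% Schematic "linear convergence to a suboptimal solution" bound with step-size \alpha:
\| x^{k} - x^{\star} \| \;\le\; C \, \rho^{k} \, \| x^{0} - x^{\star} \| \;+\; e(\alpha),
\qquad \rho \in (0,1), \qquad e(\alpha) \to 0 \ \text{as} \ \alpha \to 0.
% Geometric decay of the initial error plus a residual floor controlled by the step-size.
```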
“…While this technique cannot be used directly in distributed optimization due to the non-sparsity of the Hessian inverse, there exist ways of using second-order information to approximate the Newton step in distributed settings. This has been done for consensus optimization problems reformulated via both penalty-based methods [17], [24] and dual-based methods [20], as well as the more recent primal-dual methods [22], [25]. These approximate Newton methods exhibit faster convergence relative to their corresponding first-order methods.…”
Section: Introduction
confidence: 99%
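The penalty-based approximate Newton methods mentioned in this excerpt replace the exact (non-sparse) Hessian inverse with a Hessian splitting and a truncated matrix series, so that each extra term costs one additional round of neighbor communication. The NumPy sketch below illustrates only that truncation idea on a toy, centralized, diagonally dominant Hessian; the function name, matrix construction, and constants are illustrative assumptions, not the algorithm of any cited paper.

```python
import numpy as np

def nn_k_direction(H, g, K):
    """Approximate the Newton direction d = -H^{-1} g with a truncated series.

    Split H = D - B (D: diagonal part, B: negated off-diagonal part) and use
        H^{-1} ~= sum_{k=0}^{K} (D^{-1} B)^k D^{-1},
    which is valid when the spectral radius of D^{-1} B is below 1 (guaranteed
    here by strict diagonal dominance). In penalty-based decentralized methods,
    each additional term corresponds to one more round of neighbor communication.
    """
    D_inv = np.diag(1.0 / np.diag(H))   # invert only the diagonal (local) part
    B = np.diag(np.diag(H)) - H         # off-diagonal (neighbor) coupling
    d = -D_inv @ g                      # K = 0 term: pure diagonal scaling
    term = d.copy()
    for _ in range(K):
        term = D_inv @ (B @ term)       # one extra "hop" of information
        d += term
    return d

# Toy check against the exact Newton direction on a strictly diagonally dominant Hessian.
rng = np.random.default_rng(0)
n = 6
off = rng.uniform(-0.3, 0.3, size=(n, n))
H = 0.5 * (off + off.T)
np.fill_diagonal(H, 4.0)                # strictly row-dominant by construction
g = rng.standard_normal(n)
exact = -np.linalg.solve(H, g)
for K in (0, 1, 2, 5):
    print(K, np.linalg.norm(nn_k_direction(H, g, K) - exact))  # error shrinks with K
```

In a genuinely decentralized implementation, D would hold each node's local Hessian block and B the coupling induced by the network weight matrix, so the K-th series term becomes available only after K exchanges with neighbors.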