2016
DOI: 10.1109/tsp.2016.2602803
Weighted ADMM for Fast Decentralized Network Optimization

Cited by 41 publications (40 citation statements)
References 36 publications
“…Lemma IV.1. [First-order Optimality Condition [16]] Under Assumptions 1 and 2, the following two statements are equivalent:…”
Section: Convergence of Non-Private MR-ADMM
confidence: 99%
“…Existing approaches to decentralizing the above problem primarily consist of subgradient-based algorithms [7]-[9] and ADMM-based algorithms [10]-[16]. It has been shown that ADMM-based algorithms can converge at the rate of O(1/k)…”
Section: Introduction
confidence: 99%
“…A distributed algorithm combining a linearization approach with ADMM has been proposed in [109], while quadratic approximations have been explored in [110]. A fast distributed ADMM algorithm for quadratic problems is devised in [111]. A more general ADMM framework is considered in [112], where an explicit convergence rate has been provided.…”
Section: Discussion and References
confidence: 99%
“…This problem has found wide applications in various domains, ranging from rendezvous in multi-agent systems [1], support vector machines [2] and classification [3] in machine learning, source localization in sensor networks [4], to data regression in statistics [5], [6]. To solve the optimization problem (1) in a decentralized manner, different algorithms have been proposed in recent years, including the distributed (sub)gradient algorithm [7], augmented Lagrangian methods (ALM) [8], and the alternating direction method of multipliers (ADMM) as well as its variants [8]-[11]. Among existing approaches, ADMM has attracted tremendous attention due to its wide applications [9] and fast convergence rate in both primal and dual iterations [11].…”
Section: Introduction
confidence: 99%
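The citation statements above concern consensus-form decentralized optimization, where agents jointly minimize a sum of local costs subject to agreement on a shared variable. As a rough illustration of the plain (unweighted) consensus ADMM iteration those works build on, here is a minimal sketch with hypothetical quadratic local costs f_i(x) = 0.5·(x − a_i)² and a single scalar consensus variable; it does not reproduce the paper's weighted variant or its network topology handling.

```python
# Minimal consensus-ADMM sketch (scaled dual form), assuming quadratic
# local costs f_i(x) = 0.5 * (x - a_i)^2. Example data a_i is hypothetical.

def consensus_admm(a, rho=1.0, iters=100):
    """Each agent i holds a local copy x_i; the constraint x_i = z
    couples them. Returns the consensus value z (here, the mean of a)."""
    n = len(a)
    x = [0.0] * n   # local primal variables
    u = [0.0] * n   # scaled dual variables
    z = 0.0         # consensus variable
    for _ in range(iters):
        # x-update: argmin f_i(x) + (rho/2)(x - z + u_i)^2 has a
        # closed form for the quadratic f_i above
        x = [(a[i] + rho * (z - u[i])) / (1.0 + rho) for i in range(n)]
        # z-update: average of x_i + u_i across agents
        z = sum(x[i] + u[i] for i in range(n)) / n
        # dual update: ascent on the consensus residual x_i - z
        u = [u[i] + x[i] - z for i in range(n)]
    return z

print(consensus_admm([1.0, 2.0, 6.0]))  # converges to the mean, 3.0
```

For these quadratic costs the minimizer of the sum is the mean of the a_i, which the iteration recovers; the O(1/k) rate quoted above refers to the general convex case, while the paper's weighted formulation aims to accelerate this baseline.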