2019
DOI: 10.1109/tifs.2018.2855169

ADMM Based Privacy-Preserving Decentralized Optimization

Abstract: This paper considers the problem of privacy-preservation in decentralized optimization, in which N agents cooperatively minimize a global objective function that is the sum of N local objective functions. We assume that each local objective function is private and only known to an individual agent. To cooperatively solve the problem, most existing decentralized optimization approaches require participating agents to exchange and disclose estimates to neighboring agents. However, this results in leakage of priva…

Cited by 157 publications (108 citation statements). References 69 publications.
“…Some of them employ cryptography-based methods in the protocol to hide the private information [32]–[35]. A recent work [34] uses partially homomorphic cryptography in ADMM-based distributed learning to preserve data privacy, but the proposed approach cannot prevent leakage of private user data through the final learned models. In contrast, our approach provides differential privacy in the final trained machine learning models.…”
Section: Related Work
confidence: 99%
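
The partially homomorphic approach referred to above can be illustrated with additively homomorphic (Paillier) encryption, under which a neighbor can form weighted combinations of an agent's estimate without ever seeing the plaintext. The sketch below uses the third-party python-paillier (`phe`) package; the mixing weight `w_ij` and the estimates `x_i`, `x_j` are illustrative assumptions, and this is a minimal sketch of the general idea, not the protocol of the paper itself.

```python
# Minimal sketch of additively homomorphic aggregation with the
# third-party `phe` (python-paillier) package. Illustrative only:
# NOT the paper's protocol; weights and estimates are assumptions.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Agent i encrypts its current estimate before disclosing it.
x_i = 0.73
c_i = public_key.encrypt(x_i)

# A neighbor scales the ciphertext by a public weight and blends in its
# own plaintext estimate -- both operations are supported on Paillier
# ciphertexts without decryption.
w_ij = 0.5          # assumed mixing weight
x_j = 1.21          # neighbor's local estimate (assumed)
c_out = c_i * w_ij + x_j * (1.0 - w_ij)

# Only the key holder recovers the blended value.
print(private_key.decrypt(c_out))  # 0.5*0.73 + 0.5*1.21 = 0.97
```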
“…Theorem III.1 [Sufficient Condition]: Consider the modified ADMM defined by (20)–(21). Let {f(t), Λ(t)} be the outputs at each iteration and {f*, Λ*} be a pair satisfying (22)–(23).…”
Section: B. Convergence Analysis
confidence: 99%
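
The excerpt cites equations (20)–(23) that are not reproduced in this report, so their exact form cannot be recovered here. As a purely generic sketch of the objects such a sufficient condition involves, the saddle-point relation for an augmented Lagrangian of a linearly constrained problem can be written as follows; the objective F, constraint matrix A, and penalty ρ are placeholders, not the paper's definitions.

```latex
% Generic saddle-point condition; NOT the paper's equations (20)-(23).
% F, A, and rho are placeholder assumptions for illustration only.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
For $\min_{f} F(f)$ subject to $Af = 0$, with augmented Lagrangian
\[
  L_{\rho}(f,\Lambda) \;=\; F(f) + \langle \Lambda,\, Af \rangle
  + \frac{\rho}{2}\,\lVert Af \rVert^{2},
\]
a pair $(f^{*},\Lambda^{*})$ is a saddle point if
\[
  L_{\rho}(f^{*},\Lambda) \;\leq\; L_{\rho}(f^{*},\Lambda^{*})
  \;\leq\; L_{\rho}(f,\Lambda^{*})
  \qquad \text{for all } f \text{ and } \Lambda,
\]
and convergence results of this type show that the ADMM iterates
$\{f(t),\Lambda(t)\}$ approach such a pair.
\end{document}
```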
“…Existing approaches to decentralizing the above problem primarily consist of subgradient-based algorithms [7]–[10] and ADMM-based algorithms [11]–[21]. It has been shown that ADMM-based algorithms can converge at the rate of O(1/k) while subgradient-based algorithms typically converge at the rate of O(1/√k), where k is the number of iterations [16].…”
Section: Introduction
confidence: 99%
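
The rate gap can be seen on a toy problem. The sketch below runs a diminishing-step gradient method against textbook global-consensus ADMM (in the form of Boyd et al.) on min_x Σ_i 0.5(x − a_i)²; the penalty ρ, the step size, and the data are arbitrary assumptions, and this is not the specific decentralized algorithms of the cited works themselves.

```python
# Toy contrast of O(1/sqrt(k)) diminishing-step (sub)gradient descent
# vs. O(1/k)-type global-consensus ADMM (Boyd et al. form) on the
# illustrative problem min_x sum_i 0.5*(x - a_i)^2. All constants are
# assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=10)          # local data held by N = 10 agents
x_star = a.mean()                # global minimizer

# --- (sub)gradient method with step c / sqrt(k) ---
x = 0.0
for k in range(1, 201):
    grad = np.sum(x - a)                       # gradient of the global objective
    x -= 0.1 / np.sqrt(k) * grad / len(a)
print("subgradient error:", abs(x - x_star))

# --- global-consensus ADMM ---
rho = 1.0
xi = np.zeros_like(a)            # per-agent primal variables
u = np.zeros_like(a)             # scaled dual variables
z = 0.0                          # shared consensus variable
for k in range(200):
    # closed-form x_i-update for the quadratic local objective
    xi = (a + rho * (z - u)) / (1.0 + rho)
    z = np.mean(xi + u)
    u = u + xi - z
print("ADMM error:       ", abs(z - x_star))
```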
“…Hence, these cryptographic approaches are not suitable for distributed optimization, which typically needs many iterations to converge. Indeed, [44] is the only work we are aware of that studies privacy-preserving decentralized optimization. Different from their proposed algorithm, which is based on ADMM and partially homomorphic cryptography, our construction of the privacy-preserving part is non-cryptographic and simpler.…”
Section: B. Numerical Results
confidence: 99%
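
One common non-cryptographic device of the kind alluded to is zero-sum random masking: each agent perturbs the value it discloses, but the masks cancel in aggregate, so averaging-based updates are unaffected. The sketch below is only a generic illustration of this flavor of technique, not the cited paper's actual construction; the noise scale and data are assumptions.

```python
# Generic zero-sum masking sketch: individual disclosed values are
# obscured, yet the aggregate (average) is preserved exactly.
# Illustrative assumptions throughout; NOT the cited construction.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=5)                 # private local estimates

masks = rng.normal(scale=10.0, size=5)
masks -= masks.mean()                  # force the masks to sum to zero

shared = x + masks                     # what agents actually disclose

print(np.allclose(shared.mean(), x.mean()))  # True: aggregate preserved
print(shared - x)                            # individual values are hidden
```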