2016
DOI: 10.1007/s40314-016-0371-3
A class of customized proximal point algorithms for linearly constrained convex optimization

Cited by 30 publications (26 citation statements)
References 26 publications
“…Obviously, the efficiency of the ALM depends heavily on the solvability of the x-subproblem, that is, on whether the core x-subproblem has a closed-form solution. Unfortunately, in many real applications [1,3,8,9] the coefficient matrix A is not the identity matrix (or does not satisfy AA^T = I_m), which makes this subproblem of the ALM difficult or even infeasible to solve. To overcome this difficulty, Yang and Yuan [14] proposed a linearized ALM that linearizes the x-subproblem so that its closed-form solution can be derived easily.…”
Section: Introduction
Confidence: 99%
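The linearized ALM mentioned in the statement above can be sketched compactly. The following is an illustrative toy, not the implementation of [14]: it assumes the basis-pursuit instance min ||x||_1 s.t. Ax = b, for which linearizing the quadratic penalty turns each x-subproblem into a soft-thresholding step even when AA^T ≠ I_m.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: componentwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linearized_alm(A, b, beta=1.0, tau=None, iters=2000):
    """Sketch of a linearized ALM for min ||x||_1 s.t. Ax = b.

    The penalty (beta/2)||Ax - b||^2 is linearized at x^k and a proximal
    term (tau/2)||x - x^k||^2 is added, so the x-update is a single
    soft-thresholding step regardless of the structure of A.
    """
    m, n = A.shape
    if tau is None:
        # tau must exceed beta * ||A||^2 (spectral norm squared) for convergence.
        tau = 1.01 * beta * np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    lam = np.zeros(m)
    for _ in range(iters):
        grad = A.T @ (lam + beta * (A @ x - b))   # gradient of linearized terms at x^k
        x = soft_threshold(x - grad / tau, 1.0 / tau)
        lam = lam + beta * (A @ x - b)            # multiplier (dual) update
    return x
```

A typical use would draw a random A, a sparse ground truth x, and check that the feasibility residual ||Ax - b|| shrinks over the iterations.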
“…Recently, Cai et al. [2] designed a PPA with a relaxation step for the model (1.1) with p = 2, and analyzed its global convergence and worst-case sublinear convergence rate in detail. More recently, by introducing parameters into the metric proximal matrix, an extended parameterized PPA based on [15] was developed for two-block separable convex programming [1]; its effectiveness and robustness were demonstrated, in comparison with two popular algorithms, on a sparse vector optimization problem from statistical learning.…”
Section: Introduction
Confidence: 99%
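The relaxation step mentioned in the statement above admits a short generic sketch. The following is an assumed toy example, not the algorithm of [2]: a proximal point iteration x^{k+1} = x^k + γ(prox(x^k) − x^k) with relaxation factor γ ∈ (0, 2), applied to the hypothetical objective f(x) = ||x − c||_1, whose proximal map is soft-thresholding of x − c shifted back by c.

```python
import numpy as np

def relaxed_ppa(prox, x0, gamma=1.5, iters=50):
    """Generic PPA with an over-relaxation step.

    x^{k+1} = x^k + gamma * (prox(x^k) - x^k), gamma in (0, 2);
    gamma = 1 recovers the classical proximal point iteration.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = prox(x)                  # proximal (resolvent) step
        x = x + gamma * (y - x)      # relaxation step
    return x

# Hypothetical target c for the toy objective f(x) = ||x - c||_1;
# its prox with unit parameter is shifted soft-thresholding.
c = np.array([3.0, -2.0, 0.5])

def prox_f(x, t=1.0):
    d = x - c
    return c + np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

x_star = relaxed_ppa(prox_f, np.zeros(3))
```

Since the unique minimizer of f is x = c, the relaxed iteration should drive x_star to c; γ > 1 typically accelerates the contraction once the iterate is close.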
“…Cai et al [5] also proposed a R-PPA for solving (1) and analyzed its global convergence with a worst-case linear convergence rate. Based on the results of [5,11,15], Ma and Ni [23] recently revisited the application of PPA for solving the basis pursuit and matrix completion problem. Our new proposed Parameterized PPA (P-PPA) can be actually regarded as more general extensions of the algorithms developed in [15] and [23] which does not make use of the separable structure of the objective function in (1).…”
Section: Introductionmentioning
confidence: 99%
“…The major contributions of this paper are summarized as follows. First, the proximal matrix in our proposed P-PPA is more general and flexible than those in the previous work [15,23], owing to additional parameters that are introduced to account for the problem structure rather than a single fixed objective function. Second, with properly chosen algorithm parameters, the new P-PPA can significantly outperform some state-of-the-art methods, such as the ADMM [1] and the R-PPA [11], for solving separable convex optimization problems, especially when the problem size is large and highly accurate solutions are required.…”
Section: Introduction
Confidence: 99%