2016
DOI: 10.1063/1.4965314
Scaled first-order methods for a class of large-scale constrained least square problems

Cited by 4 publications (4 citation statements)
References 17 publications
“…Proposition 3.1 ensures that SGP algorithm converges without restrictive assumptions on the step-length parameter α_k and the diagonal scaling matrix D_k, whose choices can be directed to accelerate the convergence rate of the scheme. Even if the theoretical convergence rate O(1/k) on the objective function values is lower than the rate O(1/k^2) of some optimal first-order methods exploiting extrapolation/inertial steps [28,5,2,21], the practical performance of SGP method, achievable by suitable selections of D_k and α_k, is very well comparable with the convergence rate of the optimal algorithms [9,30,8,13]. In the following we provide the updating rules for D_k and α_k that allow SGP to efficiently solve problem (5).…”
Section: A Scaled Gradient Approach for CT Image Reconstruction (mentioning)
Confidence: 98%
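The excerpt above names the two ingredients of the SGP iteration, the step-length α_k and the diagonal scaling D_k. The sketch below illustrates the generic scheme for a nonnegatively constrained least-squares problem; the split-gradient scaling, the Barzilai-Borwein-type step length, and the bound-sequence constants are illustrative choices drawn from the SGP literature, not the specific updating rules of the cited paper.

```python
import numpy as np

def sgp_nnls(A, b, n_iter=100):
    """Minimal sketch of a scaled gradient projection (SGP) iteration for
    min 0.5 * ||A x - b||^2  subject to  x >= 0.
    Scaling, step length and bound sequence are illustrative choices."""
    m, n = A.shape
    x = np.ones(n)
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    alpha = 1.0 / L                        # safe initial step length
    for k in range(n_iter):
        grad = A.T @ (A @ x - b)
        # bound sequence shrinking towards 1, so the variable metric becomes
        # asymptotically close to the identity (illustrative constants)
        rho = 1.0 + 1e3 / (k + 1) ** 2.1
        # split-gradient style diagonal scaling, clipped to [1/rho, rho]
        d = np.clip(x / (A.T @ (A @ x) + 1e-12), 1.0 / rho, rho)
        x_new = np.maximum(x - alpha * d * grad, 0.0)   # projection onto x >= 0
        # Barzilai-Borwein-type step length computed in the scaled metric
        s = x_new - x
        y = A.T @ (A @ s)
        if s @ y > 0:
            alpha = float(np.clip((s @ (s / d)) / (s @ y), 1e-10, 1e10))
        x = x_new
    return x

# usage on a small nonnegative test problem
rng = np.random.default_rng(0)
A = np.abs(rng.standard_normal((40, 20)))
x_true = np.abs(rng.standard_normal(20))
x_rec = sgp_nnls(A, A @ x_true, n_iter=200)
```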
“…Following the suggestions in [8,13], the parameter ρ_{k+1} is chosen as ρ_{k+1} = 1 + 10^{15}/(k + 1)^{2.1}.…”
Section: A Scaled Gradient Approach for CT Image Reconstruction (mentioning)
Confidence: 99%
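The quoted rule defines the bound sequence ρ_{k+1} = 1 + 10^{15}/(k + 1)^{2.1}. The snippet below is a direct transcription of that formula; applying the bound by clipping the diagonal of D_k to [1/ρ_k, ρ_k] is an assumption taken from the standard SGP setup, not stated in the excerpt.

```python
import numpy as np

def rho(k):
    # bound-sequence rule quoted above: rho_{k+1} = 1 + 1e15 / (k + 1)**2.1;
    # since the exponent 2.1 exceeds 1, sum_k (rho_k - 1) is finite, the kind
    # of condition that typically appears in SGP convergence analyses
    return 1.0 + 1e15 / (k + 1) ** 2.1

def bounded_scaling(d_raw, k):
    # assumption: the diagonal entries of D_k are clipped to [1/rho_k, rho_k]
    r = rho(k)
    return np.clip(d_raw, 1.0 / r, r)
```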
“…It has been shown in [13] and [8] that the presence of a variable metric in first-order methods can significantly improve the performance in solving problem (5.10) with respect to their standard nonscaled versions. For this reason we only report the results obtained by comparing the variable metric GP method SGP with different choices for the step length parameter.…”
Section: Reconstruction of Fiber Orientation Distribution in Diffusio... (mentioning)
Confidence: 99%
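The excerpt contrasts the variable-metric (scaled) GP step with its standard nonscaled counterpart. The sketch below places the two update formulas side by side for a nonnegativity constraint, which is assumed here only for illustration; for that constraint the projection in the scaled metric still reduces to a componentwise maximum with zero.

```python
import numpy as np

def gp_step(x, grad, alpha):
    # standard (nonscaled) gradient projection: P_Omega(x - alpha * grad)
    return np.maximum(x - alpha * grad, 0.0)

def sgp_step(x, grad, alpha, d):
    # variable-metric step: the diagonal scaling d = diag(D_k) rebalances the
    # components before projecting, P_Omega(x - alpha * D_k * grad); for the
    # nonnegative orthant the scaled projection is still a componentwise max
    return np.maximum(x - alpha * d * grad, 0.0)
```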
“…In this work, we propose to solve (1) by an accelerated gradient scheme belonging to the class of Scaled Gradient Projection (SGP) methods [18,19]. The SGP methods have been recently applied to low-sampled X-ray cone beam CT (CBCT) image reconstruction, with very good results in terms of image accuracy [13,14,20]. In particular, in [14] the authors proposed an SGP method for X-ray CT image reconstruction and applied it to a phantom simulation using a geometry different from the DBT limited angles.…”
(mentioning)
Confidence: 99%