An Inexact Successive Quadratic Approximation Method for Convex L-1 Regularized Optimization
Preprint, 2013
DOI: 10.48550/arxiv.1309.3529

Cited by 4 publications (12 citation statements); references 0 publications.
Citation statement types: 2 supporting, 10 mentioning, 0 contrasting.
Citing publications by year: 2014 (2), 2015 (2).
“…Combining the two inequalities gives the desired result. The proof of the next result mimics the analysis of Byrd et al. [5]. Lemma 3.9.…”
Section: 4 (supporting)
confidence: 65%
“…Recently, Byrd et al. [5] analyze the inexact proximal Newton method with a more stringent adaptive stopping condition…”
Section: Inexact Proximal (mentioning)
confidence: 99%
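The adaptive stopping condition referred to in this statement ties the accuracy demanded of the inner subproblem solver to the outer iteration's own optimality measure. The following is a minimal Python sketch of an inexact successive quadratic approximation loop for min_x f(x) + λ‖x‖₁ in that spirit. The ISTA-style residual, the constant η, and the simple backtracking rule are illustrative assumptions, not the exact conditions of Byrd et al. [5].

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def residual(x, g, lam, alpha=1.0):
    # ISTA-style optimality measure for f(x) + lam*||x||_1: it vanishes
    # exactly at minimizers, so it can serve as a stopping quantity.
    return np.linalg.norm(x - soft_threshold(x - alpha * g, alpha * lam)) / alpha

def inexact_sqa(f, grad_f, hess_f, lam, x0, eta=0.1,
                max_outer=100, max_inner=500, tol=1e-6):
    # Outer loop: build the quadratic model
    #   q_k(d) = g^T d + 0.5 d^T H d + lam*||x_k + d||_1
    # and minimize it inexactly with ISTA. The inner solver stops as soon
    # as the model's own residual drops below eta times the outer residual
    # (hypothetical adaptive test; the cited paper uses a related but more
    # stringent condition).
    x = x0.astype(float)
    for _ in range(max_outer):
        g, H = grad_f(x), hess_f(x)
        outer_res = residual(x, g, lam)
        if outer_res <= tol:
            break
        L = np.linalg.eigvalsh(H).max()  # Lipschitz constant of the model gradient
        d = np.zeros_like(x)
        for _ in range(max_inner):
            mg = g + H @ d               # gradient of the smooth model part
            if residual(x + d, mg, lam, 1.0 / L) <= eta * outer_res:
                break                    # adaptive stopping: inner tied to outer
            d = soft_threshold(x + d - mg / L, lam / L) - x
        # Simple backtracking on the full objective to globalize the step
        # (a simplification; a sufficient-decrease test is used in practice).
        F = lambda y: f(y) + lam * np.abs(y).sum()
        step, F_x = 1.0, F(x)
        while F(x + step * d) > F_x - 1e-12 and step > 1e-10:
            step *= 0.5
        x = x + step * d
    return x
```

The key point the sketch illustrates is that the inner tolerance is not fixed: far from a solution the subproblem is solved crudely, and the accuracy tightens automatically as the outer residual shrinks.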
“…Based on this result, it is not difficult to prove that if γ^k is chosen by a suitable line-search procedure on V, convergence of Algorithm 1 to stationary points of Problem (1) (in the sense of Theorem 1) is guaranteed. Note that standard line-search methods proposed for smooth functions cannot be applied to V (due to the nonsmooth part G); one needs to rely on more sophisticated procedures, e.g., in the spirit of those proposed in [6], [10], [12], [37]. We provide next an example of line-search rule that can be used to compute γ^k while guaranteeing convergence of Algorithms 1 and 2 [instead of using rules i)–iii) in Theorem 1]; because of space limitations, we consider only the case of exact solutions (i.e., ε_i^k = 0 in Algorithm 1 and ε_{pi}^k = 0 in Algorithms 2 and 3).…”
Section: Examples and Special Cases (mentioning)
confidence: 99%
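The more sophisticated procedures this statement alludes to handle the nonsmooth term G explicitly in the acceptance test. Below is a minimal Python sketch of one standard rule of this type, a backtracking test for composite objectives V = F + G in the spirit of the procedures cited above; the function names and the parameters σ and β are illustrative assumptions, not the exact rule of the cited paper.

```python
import numpy as np

def nonsmooth_armijo(F, grad_F, G, x, d, sigma=0.25, beta=0.5, max_backtracks=50):
    # Backtracking rule for V(x) = F(x) + G(x), with F smooth and G convex
    # but nonsmooth (e.g., G = lam*||.||_1). A plain Armijo test would need
    # the gradient of V, which does not exist; instead the predicted
    # decrease evaluates G at the full-step trial point:
    #   Delta = grad F(x)^T d + G(x + d) - G(x)   (<= 0 for a descent d)
    # and gamma = beta^m is accepted once
    #   V(x + gamma*d) <= V(x) + sigma * gamma * Delta.
    Gx = G(x)
    Vx = F(x) + Gx
    delta = grad_F(x) @ d + G(x + d) - Gx
    gamma = 1.0
    for _ in range(max_backtracks):
        if F(x + gamma * d) + G(x + gamma * d) <= Vx + sigma * gamma * delta:
            return gamma
        gamma *= beta
    return gamma
```

By convexity of G, G(x + γd) ≤ (1 − γ)G(x) + γG(x + d), so Δ upper-bounds the directional change of V and the test is satisfiable for small enough γ whenever d is a descent direction; this is what lets the rule work without a gradient of V.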
“…Of course, the above procedure will likely be more efficient in terms of iterations than the one based on diminishing step-size rules [as (6)]. However, performing a line-search on a multicore architecture requires some shared memory and coordination among the cores/processors; therefore we do not consider this variant further.…”
Section: Examples and Special Cases (mentioning)
confidence: 99%
“…And indeed, recent years have witnessed a flurry of research activity aimed at developing solution methods that are simple (for example, based solely on matrix/vector multiplications) yet capable of converging to a good approximate solution in reasonable time. It is hardly possible here to even summarize the huge amount of work done in this field; we refer the reader to the recent works [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19] as entry points to the literature.…”
Section: Introduction (mentioning)
confidence: 99%