2015
DOI: 10.1007/s10589-015-9812-y

On how to solve large-scale log-determinant optimization problems

Abstract: We propose a proximal augmented Lagrangian method and a hybrid method, i.e., employing the proximal augmented Lagrangian method to generate a good initial point and then employing the Newton-CG augmented Lagrangian method to get a highly accurate solution, to solve large-scale nonlinear semidefinite programming problems whose objective functions are a sum of a convex quadratic function and a log-determinant term. We demonstrate that the algorithms can supply a high quality solution efficiently even for some il…
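For orientation, the problem class described in the abstract (a convex quadratic plus a log-determinant term over the positive definite cone) can be written as follows. This is an assumed rendering in our own notation, not a formula reproduced on this page:

```latex
% Assumed form of the problem class from the abstract: a convex
% quadratic plus a log-determinant term, with linear constraints.
% Q is a self-adjoint positive semidefinite operator, A a linear
% map, C a symmetric matrix, and mu > 0 -- all notation is ours.
\min_{X \succ 0} \;\; \tfrac{1}{2}\langle X, \mathcal{Q}(X)\rangle
  + \langle C, X\rangle - \mu \log\det X
\qquad \text{s.t.} \quad \mathcal{A}(X) = b
```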

Cited by 8 publications (5 citation statements). References 39 publications.
“…Accordingly, many algorithms for solving optimization problems including the logdet function have been studied extensively so far. For example, see [16,17,18].…”
Section: Introduction (mentioning)
confidence: 99%
“…They demonstrated that regularization using ‖·‖₂ or ‖·‖∞ norms instead of ‖·‖₁ in (P) is more suitable for the structured models/problems. Wang [17] first generated an initial point using the proximal augmented Lagrangian method, then applied the Newton-CG augmented Lagrangian method to problems with an additional convex quadratic term in (P). Li and Xiao [13] employed the symmetric Gauss-Seidel-type ADMM in the same framework as [18].…”
Section: Introduction (mentioning)
confidence: 99%
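The problem (P) quoted in these excerpts is not reproduced on this page. A common regularized log-determinant model consistent with them, the sparse inverse covariance (graphical lasso) formulation, is sketched below as an assumed reconstruction; per the excerpt above, later work swaps the ‖·‖₁ term for ‖·‖₂ or ‖·‖∞, or adds a convex quadratic term to the smooth part:

```latex
% Assumed form of (P): sparse inverse covariance estimation, with
% S a sample covariance matrix and rho > 0 a regularization weight.
\min_{X \succ 0} \;\; \langle S, X \rangle - \log\det X + \rho\,\|X\|_1
```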
“…A more recent work by Zhang et al. [21] shows that (P) with simple constraints such as X_ij = 0 for (i, j) ∈ Ω can be converted into a more computationally tractable problem for large values of ρ. Among the methods mentioned here, only the methods discussed in [18,19,17] can handle problems as general as (P).…”
Section: Introduction (mentioning)
confidence: 99%
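To make the constraint pattern from that excerpt concrete, here is a minimal sketch of a (P)-style log-determinant estimation problem with hard zero constraints X_ij = 0 for (i, j) ∈ Ω, solved with the off-the-shelf CVXPY modeling layer rather than the specialized methods of the cited works; S, Ω, and the problem size are made-up inputs:

```python
# Minimal sketch: log-determinant MLE with hard zero constraints
# X_ij = 0 for (i, j) in Omega. CVXPY stands in for the specialized
# solvers discussed in the citing papers; all inputs are synthetic.
import numpy as np
import cvxpy as cp

n = 5
rng = np.random.default_rng(0)
A = rng.standard_normal((n, 2 * n))
S = A @ A.T / (2 * n)                     # synthetic sample covariance
Omega = [(0, 3), (1, 4)]                  # off-diagonal entries forced to zero

X = cp.Variable((n, n), symmetric=True)
objective = cp.Minimize(cp.trace(S @ X) - cp.log_det(X))
constraints = [X[i, j] == 0 for (i, j) in Omega]
constraints += [X[j, i] == 0 for (i, j) in Omega]  # keep symmetry explicit
cp.Problem(objective, constraints).solve()
print(np.round(X.value, 3))
```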
“…Moreover, below we introduce two standard assumptions for the log-determinant problem [37] and the Gibbs sampler [54], respectively. Assumption 4.2.…”
Section: Theoretical Results (mentioning)
confidence: 99%
“…The augmented Lagrangian method is exploited in [37] to solve the log-determinant optimization problem. It combines the proximal augmented Lagrangian and the Newton-CG augmented Lagrangian methods in a hybrid approach.…”
Section: Related Work (mentioning)
confidence: 99%
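To illustrate the coarse-then-fine structure of that hybrid, here is a minimal runnable analogue, not the paper's augmented Lagrangian machinery: an unconstrained model min ⟨S, X⟩ − log det X (whose exact minimizer is X = S⁻¹) is first solved loosely with nonlinear CG to get a warm start, then refined with SciPy's Newton-CG. All names, tolerances, and the test problem are our own choices:

```python
# Coarse-then-fine analogue of the hybrid idea (a sketch, not the
# paper's method): warm-start with loose nonlinear CG, then refine
# with Newton-CG. Model: min_X <S, X> - log det X, minimizer inv(S).
# Parameterizing X = L L^T (L lower triangular) keeps X positive
# semidefinite along the iterates.
import numpy as np
from scipy.optimize import minimize

n = 6
rng = np.random.default_rng(0)
B = rng.standard_normal((n, n))
S = B @ B.T + n * np.eye(n)              # well-conditioned SPD input

def unpack(v):
    L = np.tril(v.reshape(n, n))
    return L, L @ L.T

def f(v):                                # objective <S, X> - log det X
    _, X = unpack(v)
    return np.trace(S @ X) - np.linalg.slogdet(X)[1]

def grad(v):                             # gradient w.r.t. L: 2 (S - X^{-1}) L
    L, X = unpack(v)
    return np.tril(2.0 * (S - np.linalg.inv(X)) @ L).ravel()

v0 = np.eye(n).ravel()
warm = minimize(f, v0, jac=grad, method="CG",
                options={"gtol": 1e-2})          # phase 1: cheap warm start
fine = minimize(f, warm.x, jac=grad, method="Newton-CG",
                options={"xtol": 1e-10})         # phase 2: high accuracy
_, X = unpack(fine.x)
print("max deviation from inv(S):", np.max(np.abs(X - np.linalg.inv(S))))
```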