2013
DOI: 10.1016/j.neucom.2012.10.014
Low-rank quadratic semidefinite programming

Cited by 3 publications (5 citation statements)
References 26 publications
“…In every iteration, one can pick the coordinate of the largest real value in the descent direction −∇f(w) and greedily decrease the objective function. Similar to the low-rank factorization M = LL^T, one can employ the non-negative representation w = v ⊙ v and refine v using some efficient non-convex optimization approaches [2].…”
Section: Connection with Shalev-Shwartz et al.'s Algorithm
confidence: 99%
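As a rough illustration of the reparametrization mentioned in this statement (not the cited paper's implementation), the sketch below assumes a generic smooth objective f and plain gradient descent: writing w = v ⊙ v keeps w entrywise non-negative by construction, much as the factorization M = LL^T keeps M positive semidefinite. The objective, gradient, step size, and iteration count are all placeholders.

```python
import numpy as np

# Minimal sketch of the non-negative reparametrization w = v ⊙ v.
# f and grad_f are toy placeholders, NOT taken from the cited paper.

def f(w):
    # toy smooth objective with minimizer w = 1
    return 0.5 * np.sum((w - 1.0) ** 2)

def grad_f(w):
    return w - 1.0

def refine_nonneg(v0, steps=200, lr=0.1):
    """Gradient descent on v, where w = v * v is non-negative by construction.
    Chain rule: d f(v*v) / dv = 2 * v * grad_f(v*v)."""
    v = v0.copy()
    for _ in range(steps):
        w = v * v
        v -= lr * 2.0 * v * grad_f(w)
    return v * v

w_star = refine_nonneg(0.3 * np.ones(5))
print(w_star)  # approaches the minimizer of f over the non-negative orthant
```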
“…We use BILGO-KTA and BILGO-LMNN to denote the BILGO solver for the two metric learning optimization problems, respectively. Moreover, we use "L-BFGS + exact line search", as suggested in [2], to improve the intermediate result every 5 iterations of BILGO-KTA, giving rise to its local search version BILGO-KTA-LS.…”
Section: Accuracy and Efficiency on Metric Learning
confidence: 99%
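The "improve the intermediate result every 5 iterations" setup quoted here amounts to interleaving a cheap outer update with a periodic local refinement. The sketch below is hypothetical: outer_step, the toy objective, and the schedule are stand-ins, and SciPy's L-BFGS-B (with its own inexact line search) is substituted for the quoted "L-BFGS + exact line search".

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical interleaving of an outer solver with periodic L-BFGS polishing.
# The objective and outer_step are illustrative stand-ins only.

def objective(x):
    # returns (value, gradient) of a toy quadratic with minimizer x = 2
    return 0.5 * np.sum((x - 2.0) ** 2), (x - 2.0)

def outer_step(x, lr=0.05):
    # placeholder for one cheap iteration of the outer solver
    _, g = objective(x)
    return x - lr * g

x = np.zeros(10)
for it in range(1, 21):
    x = outer_step(x)
    if it % 5 == 0:
        # local search: polish the current iterate with L-BFGS
        res = minimize(lambda z: objective(z)[0], x,
                       jac=lambda z: objective(z)[1], method="L-BFGS-B")
        x = res.x
print(objective(x)[0])
```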
“…Meanwhile, there are several methods available for solving some special cases of (1), including the inexact interior-point methods [4,5,9,10], the alternating direction methods [7,22,30], the quasi-Newton method [11], the inexact semi-smooth Newton-CG method [21], the inexact accelerated proximal gradient (IAPG) method [20], and other methods [8,24].…”
Section: Introduction
confidence: 99%