2017
DOI: 10.1109/tsp.2017.2679687

A Nonconvex Splitting Method for Symmetric Nonnegative Matrix Factorization: Convergence Analysis and Optimality

Abstract: Symmetric nonnegative matrix factorization (SymNMF) has important applications in data analytics problems such as document clustering, community detection, and image segmentation. In this paper, we propose a novel nonconvex variable splitting method for solving SymNMF. The proposed algorithm is guaranteed to converge to the set of Karush-Kuhn-Tucker (KKT) points of the nonconvex SymNMF problem. Furthermore, it achieves a global sublinear convergence rate. We also show that the algorithm can be efficiently impl…
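As a rough illustration of the variable-splitting idea described in the abstract (a toy sketch, not the paper's exact algorithm or its convergence-guaranteed update rules), SymNMF seeks a nonnegative factor H with A ≈ H Hᵀ; introducing a second copy W of the factor and penalizing their difference yields easy alternating subproblems:

```python
import numpy as np

def symnmf_split(A, k, rho=1.0, iters=300, seed=0):
    """Toy sketch of SymNMF via variable splitting: approximate A ≈ H H^T
    by keeping two copies W and H of the factor, penalizing ||W - H||_F^2,
    and alternating a closed-form unconstrained W-step with a
    projected-gradient H-step that enforces H >= 0."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    H = rng.random((n, k))
    for _ in range(iters):
        # W-step: minimize ||A - W H^T||_F^2 + rho ||W - H||_F^2 exactly
        W = (A @ H + rho * H) @ np.linalg.inv(H.T @ H + rho * np.eye(k))
        # H-step: one projected-gradient step on the same objective
        # (gradient uses A = A^T; step size from the Lipschitz constant)
        grad = 2 * (H @ (W.T @ W) - A @ W) + 2 * rho * (H - W)
        step = 1.0 / (2 * (np.linalg.norm(W.T @ W, 2) + rho))
        H = np.maximum(0.0, H - step * grad)
    return H
```

The splitting decouples the nonnegativity constraint (handled by a cheap projection in the H-step) from the bilinear fitting term (handled in closed form in the W-step); the penalty ρ drives the two copies back together.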


Cited by 40 publications (25 citation statements)
References 47 publications
“…Developing stochastic optimization algorithms for both stages amenable to large-scale implementations is also pertinent. Generalizing SPARTA and our analytical results to robust sparse PR and matrix recovery with outliers constitute worthwhile future directions too [60,61,20].…”
Section: Discussion (mentioning)
confidence: 88%
“…To address the optimization problem of the CDE model, inspired by the Nonconvex Splitting framework [Lu et al, 2017], we design an effective NS-Alternating algorithm. In this algorithm, we first reformulate problem (6) and alternately solve the subproblems of the reformulation.…”
Section: Optimization: NS-Alternating (mentioning)
confidence: 99%
“…L(H, V; Λ) is the augmented Lagrangian. Note that the proximal term ||V − V^(t)||_F^2 and the penalty parameter ξ^(t) are added following the study [Lu et al, 2017]. The optimization w.r.t.…”
Section: Representation Matrix Subproblem (mentioning)
confidence: 99%
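The proximal construction quoted above can be made concrete in a small sketch. The excerpt names only the ingredients (an augmented Lagrangian L(H, V; Λ), a proximal term ||V − V^(t)||_F^2, and a parameter ξ^(t)); the specific Lagrangian below, with fitting term ||A − H Vᵀ||_F² and penalty ρ, is an illustrative assumption, not the cited paper's exact formulation:

```python
import numpy as np

def prox_V_update(A, H, V_t, Lam, rho, xi):
    """One proximal V-subproblem of an assumed ADMM-style splitting
    L(H, V; Lam) = ||A - H V^T||_F^2 + <Lam, H - V> + (rho/2)||H - V||_F^2,
    augmented with the proximal term (xi/2)||V - V_t||_F^2.
    Setting the gradient w.r.t. V to zero gives the linear system
    V (2 H^T H + (rho + xi) I) = 2 A^T H + Lam + rho H + xi V_t."""
    k = H.shape[1]
    G = 2 * H.T @ H + (rho + xi) * np.eye(k)   # k x k, symmetric PD
    RHS = 2 * A.T @ H + Lam + rho * H + xi * V_t
    # Solve V G = RHS for V (G is symmetric, so G^T = G)
    return np.linalg.solve(G, RHS.T).T
```

Because the proximal term keeps the quadratic strongly convex even when HᵀH is rank-deficient, the subproblem always has a unique closed-form solution, which is the practical point of adding ξ^(t).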
“…Then, according to Point 1, the step-size should show an upward trend during the iterative process. However, the traditional approach and some related approaches [29,30] all decrease the step-size during the iterative process, which does not meet our requirement.…”
(mentioning)
confidence: 92%