2007
DOI: 10.1162/neco.2007.19.10.2756
Projected Gradient Methods for Nonnegative Matrix Factorization

Abstract: Non-negative matrix factorization (NMF) can be formulated as a minimization problem with bound constraints. Although bound-constrained optimization has been studied extensively in both theory and practice, so far no study has formally applied its techniques to NMF. In this paper, we propose two projected gradient methods for NMF, both of which exhibit strong optimization properties. We discuss efficient implementations and demonstrate that one of the proposed methods converges faster than the popular multiplicative…

Cited by 1,481 publications (1,110 citation statements). References 14 publications.
“…The non-negativity constraint generally leads to a sparse, part-based representation of the original data set, which is often semantically more meaningful than other factorization methods. While it is difficult to find the optimal solution of NMF because of the non-convexity of the objective function, several efficient approximation algorithms exist (e.g., multiplicative update [4], projected gradient descent [5]). Here we use multiplicative update because of its simplicity.…”
Section: B. Feature Grouping
confidence: 99%
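The multiplicative-update rule this statement prefers for its simplicity can be sketched in a few lines. The following is a minimal illustration of Lee and Seung's Frobenius-norm updates, not code from the cited paper; the function name, iteration count, and epsilon are illustrative choices.

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, eps=1e-10, seed=0):
    """Lee-Seung multiplicative updates for V ~ W @ H (Frobenius loss).

    V: (m, n) nonnegative matrix; r: target rank.
    A minimal sketch; defaults here are illustrative, not from the paper.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # Elementwise multiplicative updates keep W and H nonnegative.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factor a small random nonnegative matrix.
V = np.abs(np.random.default_rng(1).random((6, 5)))
W, H = nmf_multiplicative(V, r=2)
err = np.linalg.norm(V - W @ H)
```

The updates never subtract, so nonnegativity of the factors is preserved automatically, which is exactly the simplicity the quoted authors cite.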
“…is the Frobenius inner product. Parameters β and σ in our experiments have been set to β = 0.1 and σ = 0.01, which is an efficient parameter selection, as has been verified in other studies [24], [25].…”
Section: Maximum Margin Projection Matrix Update For Fixed W
confidence: 80%
“…An efficient approach for setting an appropriate value of the learning-step parameter λ_t based on the Armijo rule [23] is presented in [24], and it is also adopted in this work. According to this strategy, the learning step takes the form λ_t = β^{g_t}, where g_t is the first non-negative integer value found satisfying (15), where the operator ⟨·, ·⟩…”
Section: Maximum Margin Projection Matrix Update For Fixed W
confidence: 99%
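The Armijo-style step-size rule described in this statement (λ_t = β^{g_t}, with g_t the first non-negative integer meeting a sufficient-decrease test) can be sketched as follows. The quoted condition (15) is not reproduced on this page, so the sufficient-decrease test below is the standard projected-gradient form and should be read as an assumption; all names are illustrative.

```python
import numpy as np

def projected_armijo_step(f, grad, x, beta=0.1, sigma=0.01, max_tries=20):
    """One projected-gradient step with Armijo-style backtracking.

    Tries step sizes lambda = beta**g for g = 0, 1, 2, ... and accepts the
    first one satisfying a sufficient-decrease condition of the form
        f(x_new) - f(x) <= sigma * <grad f(x), x_new - x>,
    where x_new = max(x - lambda * grad f(x), 0) projects onto x >= 0 and
    <., .> is the (Frobenius) elementwise inner product.
    A sketch under the assumptions stated above, not the exact rule (15).
    """
    g0 = grad(x)
    fx = f(x)
    for g in range(max_tries):
        lam = beta ** g
        x_new = np.maximum(x - lam * g0, 0.0)  # projection onto the nonnegative orthant
        d = x_new - x
        if f(x_new) - fx <= sigma * np.sum(g0 * d):
            return x_new
    return x_new  # fall back to the smallest step tried

# Usage on a simple quadratic f(x) = 0.5 * ||x - c||^2 with c partly negative,
# so the constrained minimizer is max(c, 0).
c = np.array([1.0, -2.0, 3.0])
f = lambda x: 0.5 * np.sum((x - c) ** 2)
grad = lambda x: x - c
x = np.zeros(3)
for _ in range(50):
    x = projected_armijo_step(f, grad, x)
```

Note that β = 0.1 and σ = 0.01 match the values the quoted authors report using in their experiments.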
“…, 2006). In our work, we applied the projected gradient method for NMF presented by Chih-Jen Lin (Lin, 2007). Owing to its good convergence, this algorithm works much faster than the multiplicative method presented by Lee and Seung (Lee & Seung, 2001).…”
Section: Non-negative Matrix Factorization In Brief
confidence: 99%
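The projected-gradient idea this statement refers to, a gradient step on each factor followed by projection onto the nonnegative orthant, can be sketched minimally as below. This is a simplified fixed-step illustration, not Lin's actual algorithm (which selects step sizes with the Armijo rule and solves alternating subproblems); the step size and iteration count are assumptions.

```python
import numpy as np

def nmf_projected_gradient(V, r, n_outer=300, step=0.02, seed=0):
    """A minimal alternating projected-gradient sketch for V ~ W @ H.

    Each update takes a gradient step on 0.5 * ||V - W H||_F^2 and
    projects back onto the nonnegative orthant with np.maximum.
    A fixed-step sketch only; Lin's method chooses steps adaptively.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_outer):
        # Gradient of the loss w.r.t. H, then projection onto H >= 0.
        H = np.maximum(H - step * (W.T @ (W @ H - V)), 0.0)
        # Same gradient step and projection for W.
        W = np.maximum(W - step * ((W @ H - V) @ H.T), 0.0)
    return W, H

# Usage: factor a small random nonnegative matrix.
V = np.abs(np.random.default_rng(1).random((6, 5)))
W, H = nmf_projected_gradient(V, r=2)
```

Unlike the multiplicative rule, the projection can set entries exactly to zero, which is one reason projected-gradient variants tend to converge faster in practice.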