2008
DOI: 10.1016/j.patcog.2007.09.010
SVD based initialization: A head start for nonnegative matrix factorization

Abstract: This article was published in an Elsevier journal. The attached copy is furnished to the author for non-commercial research and education use, including for instruction at the author's institution, sharing with colleagues and providing to institution administration. Other uses, including reproduction and distribution, or selling or licensing copies, or posting to personal, institutional or third party websites are prohibited. In most cases authors are permitted to post their version of the article (e.g. in Word …

Cited by 589 publications (434 citation statements)
References 28 publications
“…As found out, the trick is to seek the strategy amongst other constrained low-rank factorization schemes like clustering and SVD. Boutsidis and Gallopoulos (2008) suggested the NNDSVD, an algorithm that contains no randomization and that can readily be combined with all existing NMF algorithms.…”
Section: NMF Using Projected Gradient Methods
confidence: 99%
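The NNDSVD strategy cited above can be sketched in a few lines of NumPy. This is a minimal, illustrative version (the function name and the small normalization tolerance are my own; the published algorithm additionally prescribes specific variants for filling zero entries), showing the core idea: build nonnegative starting factors deterministically from the signed parts of the truncated SVD.

```python
import numpy as np

def nndsvd_init(A, k):
    """Minimal NNDSVD-style initialization sketch for NMF.

    Computes a truncated SVD of a nonnegative matrix A and builds
    nonnegative starting factors W (m x k) and H (k x n) from the
    positive/negative sections of the singular vectors.
    Fully deterministic: no randomization involved.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    m, n = A.shape
    W = np.zeros((m, k))
    H = np.zeros((k, n))

    # The leading singular vector pair of a nonnegative matrix can be
    # taken entrywise nonnegative (Perron-Frobenius), up to a sign flip.
    W[:, 0] = np.sqrt(s[0]) * np.abs(U[:, 0])
    H[0, :] = np.sqrt(s[0]) * np.abs(Vt[0, :])

    eps = 1e-12  # guard against division by a zero norm (assumption)
    for j in range(1, k):
        u, v = U[:, j], Vt[j, :]
        up, un = np.maximum(u, 0), np.maximum(-u, 0)
        vp, vn = np.maximum(v, 0), np.maximum(-v, 0)
        # Keep whichever signed section pair captures more mass.
        p_mass = np.linalg.norm(up) * np.linalg.norm(vp)
        n_mass = np.linalg.norm(un) * np.linalg.norm(vn)
        if p_mass >= n_mass:
            uu = up / (np.linalg.norm(up) + eps)
            vv = vp / (np.linalg.norm(vp) + eps)
            sigma = p_mass
        else:
            uu = un / (np.linalg.norm(un) + eps)
            vv = vn / (np.linalg.norm(vn) + eps)
            sigma = n_mass
        W[:, j] = np.sqrt(s[j] * sigma) * uu
        H[j, :] = np.sqrt(s[j] * sigma) * vv
    return W, H
```

Because the construction is deterministic, the same (W, H) pair is produced on every call, which is exactly why it can replace multiple random restarts when seeding any existing NMF algorithm.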
“…This truncation creates a matrix V r that is the best approximation of V of rank at most r in terms of the Frobenius norm (according to the Schmidt-Eckart-Young-Mirsky theory, see also Boutsidis and Gallopoulos, 2008). Considering the non-negativity constraint, each C (i) in (9) must be approximated by its non-negative section C (i)+ (i.e.…”
Section: NMF Using Projected Gradient Methods
confidence: 99%
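The rank-r truncation described in this excerpt can be checked numerically. The short sketch below (matrix sizes and variable names are my own) verifies the Schmidt-Eckart-Young-Mirsky property — the SVD truncation V_r is the best Frobenius-norm approximation of rank at most r — and applies the nonnegative-section clipping mentioned in the quote:

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((20, 15))
r = 4

# Rank-r truncation via SVD: the best rank-r approximation of V
# in the Frobenius norm (Schmidt-Eckart-Young-Mirsky).
U, s, Wt = np.linalg.svd(V, full_matrices=False)
V_r = U[:, :r] * s[:r] @ Wt[:r, :]

# The truncation error equals the norm of the discarded singular values.
best_err = np.linalg.norm(V - V_r)
assert np.isclose(best_err, np.linalg.norm(s[r:]))

# Any other rank-r factorization does at least as badly.
B, C = rng.random((20, r)), rng.random((r, 15))
assert best_err <= np.linalg.norm(V - B @ C)

# Nonnegative section: clip negatives to satisfy the NMF constraint.
V_r_plus = np.maximum(V_r, 0)
assert (V_r_plus >= 0).all()
```

Note that clipping to the nonnegative section generally increases the approximation error slightly; the point of the cited construction is that the result is a valid, well-informed nonnegative starting point rather than an optimal factorization.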
“…The sensitivity of NMF-like algorithms to the choice of initial factors has been noted by a number of authors [12,13]. While stochastic initialization is widely used in this context, ideally we would like to produce a single integrated clustering without requiring multiple runs of the integration process.…”
Section: Integration by Matrix Factorization
confidence: 99%
“…While stochastic initialization is widely used in this context, ideally we would like to produce a single integrated clustering without requiring multiple runs of the integration process. Therefore to initialize the integration process, we populate the pair (P, H) by employing the deterministic NNDSVD strategy described by Boutsidis & Gallopoulos [13]. This strategy applies two sequential SVD processes to the matrix X to produce a pair of initial factors.…”
Section: Integration by Matrix Factorization
confidence: 99%