2021
DOI: 10.3390/math9091006
A Review on Initialization Methods for Nonnegative Matrix Factorization: Towards Omics Data Experiments

Abstract: Nonnegative Matrix Factorization (NMF) has acquired a relevant role in the panorama of knowledge extraction, thanks to the peculiarity that non-negativity applies to both bases and weights, which allows meaningful interpretations and is consistent with the natural human part-based learning process. Nevertheless, most NMF algorithms are iterative, so initialization methods affect convergence behaviour, the quality of the final solution, and NMF performance in terms of the residual of the cost function. Studies …

Cited by 28 publications (10 citation statements)
References 79 publications (99 reference statements)
“…Reproducibility is yet another challenge for NMF. Alternating least squares requires an initialization, such as a random or non-negative double SVD model (NNDSVD) (Esposito, 2021). While NNDSVD is "robust", it differs fundamentally in nature from NMF, and any non-random initialization can trap updates into a local minimum even if random noise is added to the model and zeros are filled.…”
Section: Discussion
confidence: 99%
“…The non‐negative constraint is particularly useful for facilitating the interpretation of latent factors within spectral data. The majority of NMF methods are iterative and converge to a local minimum; however, the initialisation of the algorithm is important in determining the outputs, and random initialisation can influence the convergence and stability of the final solution [31]. Herein, we used a method shown to generate sparse initial factors [25], although other approaches are available [31].…”
Section: Results
confidence: 99%
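The trade-off these statements describe, between seed-dependent random initialization and deterministic SVD-based (NNDSVD) initialization, can be illustrated with scikit-learn's NMF implementation. This is a minimal sketch: the matrix, rank, and iteration budget are arbitrary choices for demonstration, not taken from the paper.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
X = np.abs(rng.standard_normal((50, 30)))  # arbitrary nonnegative data matrix

# Random initialization: the factorization depends on the seed,
# so reproducibility requires fixing random_state.
nmf_rand = NMF(n_components=5, init="random", random_state=0, max_iter=500)
W_rand = nmf_rand.fit_transform(X)
err_rand = nmf_rand.reconstruction_err_

# NNDSVD initialization: deterministic and SVD-based, but (as the
# quoted statement notes) a non-random start can steer the iterates
# toward one particular local minimum.
nmf_svd = NMF(n_components=5, init="nndsvd", max_iter=500)
W_svd = nmf_svd.fit_transform(X)
err_svd = nmf_svd.reconstruction_err_
```

Running the NNDSVD variant twice gives identical factors, whereas the random variant gives identical factors only when `random_state` is fixed.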
“…Our implementation of NMF is a modification of the implementation found in Python's Scikit-Learn package (41). We manually initialize W and H, as opposed to the automated method found in the package, due to the wide variety of initialization methods possible for NMF (42,43), and to maintain consistent, precise, and easily reproducible control over the initial conditions of our models. For the analyses found in this paper, we apply non-negative dual singular value decomposition (nndsvd), a consistent and efficient initialization method (44).…”
Section: Dimensionality Reduction (DR) Methods
confidence: 99%
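The manual-initialization approach described in that last statement maps onto scikit-learn's `init="custom"` option, which lets the caller supply explicit W and H starting factors. The sketch below is a hypothetical minimal example, not the cited authors' actual pipeline; the mean-scaled uniform factors are a stand-in for whatever scheme one wants to control, such as NNDSVD factors computed separately.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(42)
X = np.abs(rng.standard_normal((40, 20)))  # stand-in nonnegative data matrix
k = 4

# Explicit initial factors, scaled so that W0 @ H0 roughly matches X's mean.
# Any initialization scheme (random, NNDSVD, ...) can be plugged in here.
avg = np.sqrt(X.mean() / k)
W0 = avg * rng.random((X.shape[0], k))
H0 = avg * rng.random((k, X.shape[1]))

# init="custom" hands full control of the starting point to the caller,
# which is what makes runs exactly reproducible.
model = NMF(n_components=k, init="custom", max_iter=500)
W = model.fit_transform(X, W=W0, H=H0)
H = model.components_
```

Because the starting point is fixed by the caller rather than drawn inside the library, rerunning this code reproduces the same factorization exactly.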