2017
DOI: 10.1109/tsp.2016.2620967

Online Nonnegative Matrix Factorization With Outliers

Abstract: We propose a unified and systematic framework for performing online nonnegative matrix factorization in the presence of outliers. Our framework is particularly suited to large-scale data. We propose two solvers based on projected gradient descent and the alternating direction method of multipliers. We prove that the sequence of objective values converges almost surely by appealing to the quasi-martingale convergence theorem. We also show the sequence of learned dictionaries converges to the set of stationary points…
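The abstract names two solvers, projected gradient descent and ADMM, for an online objective with an explicit outlier term. As a minimal sketch of that flavor of method, assuming a squared loss plus an ℓ1 penalty on a per-sample outlier vector (the names `online_step`, `lam`, and `lr_w` are illustrative, not the paper's):

```python
import numpy as np

def soft_threshold(x, lam):
    """Elementwise soft-thresholding, the proximal operator of lam * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def online_step(v, W, lam=0.1, n_inner=50, lr_w=1e-3):
    """One online iteration on a single sample v (shape (F,)) with dictionary
    W (shape (F, K)): alternate projected-gradient updates of the nonnegative
    code h with exact soft-thresholding updates of the outlier vector r, then
    take one projected-gradient step on W (kept nonnegative)."""
    F, K = W.shape
    h, r = np.zeros(K), np.zeros(F)
    step_h = 1.0 / (np.linalg.norm(W, 2) ** 2 + 1e-12)  # 1 / Lipschitz const.
    for _ in range(n_inner):
        h = np.maximum(h - step_h * (W.T @ (W @ h + r - v)), 0.0)
        r = soft_threshold(v - W @ h, lam)
    W = np.maximum(W - lr_w * np.outer(W @ h + r - v, h), 0.0)
    return W, h, r
```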

Cited by 33 publications (39 citation statements) | References 106 publications
“…To solve (2) by considering each batch sequentially, we leverage the stochastic majorization-minimization framework [31], in which solving (2) is separated into two steps: (1) learning H_q, namely non-negative encoding, and (2) updating W, namely dictionary update. Specifically, given a current batch V_q and the previously optimized dictionary W_{q−1}, we first learn the coefficient matrix H_q by…”
Section: Online NMF (mentioning)
confidence: 99%
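A compact sketch of this two-step loop in the stochastic majorization-minimization style of [31]; the helpers `encode` and `online_nmf`, and the use of accumulated sufficient statistics for the dictionary update, are assumed stand-ins for the citing paper's actual solvers:

```python
import numpy as np

def encode(V, W, n_iter=100):
    """Step (1), nonnegative encoding: find H >= 0 minimizing ||V - W H||_F^2
    for a fixed dictionary W, via projected gradient descent."""
    H = np.zeros((W.shape[1], V.shape[1]))
    step = 1.0 / (np.linalg.norm(W, 2) ** 2 + 1e-12)
    for _ in range(n_iter):
        H = np.maximum(H - step * (W.T @ (W @ H - V)), 0.0)
    return H

def online_nmf(batches, K, seed=0):
    """Process batches sequentially: encode H_q under W_{q-1}, accumulate the
    sufficient statistics A = sum H H^T and B = sum V H^T, then refresh W
    column by column (block coordinate descent on the majorizing surrogate)."""
    rng = np.random.default_rng(seed)
    F = batches[0].shape[0]
    W = rng.random((F, K)) + 1e-3
    A, B = np.zeros((K, K)), np.zeros((F, K))
    for V in batches:
        H = encode(V, W)              # step (1): non-negative encoding
        A += H @ H.T                  # step (2): dictionary update from
        B += V @ H.T                  # accumulated sufficient statistics
        for k in range(K):
            W[:, k] = np.maximum(
                W[:, k] + (B[:, k] - W @ A[:, k]) / (A[k, k] + 1e-12), 0.0)
    return W
```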
“…These algorithms can be gathered into two main categories, depending on the assumptions made on the endmembers. In [22,23,24,25], the endmembers do not vary from one sample to another, while in [26,27,28,29,30,31] the endmembers may evolve between successive samples. In particular, Incremental NMF (INMF) [26] assumes that the endmembers evolve slowly between two consecutive acquisitions; this is now the most widely adopted assumption in on-line NMF algorithms.…”
Section: On-line NMF Methods (mentioning)
confidence: 99%
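To make the slow-evolution assumption concrete, here is a toy sketch; this is an assumed illustrative form, not the INMF update of [26]. The new dictionary is warm-started from W_{q−1}, refit to the new acquisition with multiplicative updates, and only allowed to drift a small step `alpha` away from its previous value:

```python
import numpy as np

def slow_drift_update(W_prev, v, alpha=0.05, n_iter=30, eps=1e-12):
    """Encode the new nonnegative acquisition v under W_{q-1} with
    multiplicative updates, then move the dictionary a small step (alpha)
    toward the refit so that W_q stays close to W_{q-1}."""
    W = W_prev.copy()
    h = np.full(W.shape[1], 1e-2)          # strictly positive initialization
    for _ in range(n_iter):
        h *= (W.T @ v) / (W.T @ (W @ h) + eps)   # multiplicative encoding
    W_refit = W * np.outer(v, h) / (np.outer(W @ h, h) + eps)
    W = (1.0 - alpha) * W_prev + alpha * W_refit  # slow endmember evolution
    return W, h
```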
“…The second contribution is algorithm design, in which we replace the on-line multiplicative update rules with an optimization method based on the Alternating Direction Method of Multipliers (ADMM) [25]; ADMM has proved superior to multiplicative updates in both reconstruction accuracy and convergence rate [34,35,36].…”
Section: Main Contributions (mentioning)
confidence: 99%
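For context, a generic sketch of how ADMM handles the kind of nonnegative least-squares subproblem that multiplicative updates would otherwise solve; the splitting below is a standard formulation, not necessarily the exact one used in [25]:

```python
import numpy as np

def admm_nnls(W, v, rho=1.0, n_iter=100):
    """ADMM for  min_h 0.5 * ||v - W h||^2  s.t.  h >= 0,
    using the splitting h = z with z constrained to the nonnegative orthant."""
    K = W.shape[1]
    Wtv = W.T @ v
    # Factor (W^T W + rho I) once; every h-update reuses the factorization.
    L = np.linalg.cholesky(W.T @ W + rho * np.eye(K))
    z = np.zeros(K)
    u = np.zeros(K)                       # scaled dual variable
    for _ in range(n_iter):
        rhs = Wtv + rho * (z - u)
        h = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        z = np.maximum(h + u, 0.0)        # projection onto h >= 0
        u += h - z                        # dual update
    return z
```

Caching the Cholesky factorization is the usual reason ADMM converges quickly here: each iteration costs only two triangular solves plus a projection.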
“…By revisiting (26), (27) and (29), it is obvious that we are essentially using the hyper-parameters c and d to control the regularization parameter of the ℓ1-norm penalty term, which is not straightforward. A simpler and more direct alternative is to replace c and d with a hyper-parameter γ that controls the sparse matrix directly, in line with the approaches in [7], [18], and [22]. In this setting, the optimization problem in (26) is rewritten as (32), and the update for the sparse matrix in (27) is rewritten as…”
Section: F. Simplification for Parameter Tuning and Computation (mentioning)
confidence: 99%
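When a single hyper-parameter γ acts directly on the sparse matrix through an ℓ1 penalty, the sparse-matrix update typically reduces to closed-form soft-thresholding. A minimal sketch, assuming the penalty takes the form 0.5‖V − WH − S‖² + γ‖S‖₁ (the citing paper's equations (26), (27), and (32) are not reproduced here):

```python
import numpy as np

def sparse_matrix_update(V, W, H, gamma):
    """For fixed W and H, the minimizer of
    0.5 * ||V - W H - S||_F^2 + gamma * ||S||_1
    is elementwise soft-thresholding of the residual, so gamma directly sets
    how aggressively entries of the sparse matrix S are zeroed out."""
    R = V - W @ H                                  # residual left for S
    return np.sign(R) * np.maximum(np.abs(R) - gamma, 0.0)
```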
“…Previous studies have shown that NMF is sensitive to outliers in the data matrix [6]. Outliers with large errors, if not handled properly, will dominate the objective function and cause severe distortion in the learned bases [7]. Outliers also degrade the pruning of trivial bases, which is important for automatically estimating the model order; it therefore becomes much more challenging to determine the ground-truth model order from observations corrupted by outliers.…”
Section: Introduction (mentioning)
confidence: 99%