Proceedings of the 18th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 2012
DOI: 10.1145/2339530.2339582
Fast Bregman divergence NMF using Taylor expansion and coordinate descent

Abstract: Non-negative matrix factorization (NMF) provides a lower rank approximation of a matrix. Due to the nonnegativity imposed on the factors, it gives a latent structure that is often more physically meaningful than other lower rank approximations such as the singular value decomposition (SVD). Most of the algorithms proposed in the literature for NMF have been based on minimizing the Frobenius norm. This is partly due to the fact that the minimization problem based on the Frobenius norm provides much more flexibility in alge…
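The Frobenius-norm NMF the abstract refers to can be sketched with the classic multiplicative updates. This is a minimal illustration, not the paper's coordinate-descent algorithm; the matrix sizes, rank, and iteration count are arbitrary assumptions.

```python
import numpy as np

# Minimal sketch of Frobenius-norm NMF via multiplicative updates.
# Sizes and rank are illustrative assumptions, not from the paper.
rng = np.random.default_rng(0)
A = rng.random((20, 15))           # nonnegative data matrix
k = 4                              # assumed target rank
W = rng.random((20, k)) + 1e-3
H = rng.random((k, 15)) + 1e-3

for _ in range(200):
    # Elementwise multiplicative updates preserve nonnegativity.
    H *= (W.T @ A) / (W.T @ W @ H + 1e-12)
    W *= (A @ H.T) / (W @ H @ H.T + 1e-12)

# Relative Frobenius reconstruction error of the rank-k approximation.
err = np.linalg.norm(A - W @ H, "fro") / np.linalg.norm(A, "fro")
```

Because the updates multiply by nonnegative ratios, `W` and `H` stay nonnegative throughout, which is the property that makes the factors interpretable compared with an SVD.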

Cited by 37 publications (32 citation statements)
References 29 publications (46 reference statements)
“…Depending on the probabilistic model of the underlying data, NMF can be formulated with various divergences. Formulations and algorithms based on Kullback-Leibler divergence [67,79], Bregman divergence [24,68], Itakura-Saito divergence [29], and Alpha and Beta divergences [21,22] have been developed. For discussion on nonnegative rank as well as the geometric interpretation of NMF, see Lin and Chu [72], Gillis [34], and Donoho and Stodden [27].…”
Section: Conclusion and Discussion
confidence: 99%
“…Matrices W and H are found by solving an optimization problem defined with the Frobenius norm, the Kullback-Leibler divergence [67,68], or other divergences [24,68]. In this paper, we focus on NMF based on the Frobenius norm, which is the most commonly used formulation:…”
Section: Introduction
confidence: 99%
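The formulation this quoted passage trails into is the standard Frobenius-norm NMF objective, reproduced here for completeness from the common definition rather than the citing paper's exact notation:

```latex
\min_{W \ge 0,\; H \ge 0} \; \lVert A - WH \rVert_F^2
```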
“…In a generalized way, the Bregman divergence D_φ is used as the objective function to be minimized [21,22]. Considering only separable Bregman divergences, (5) where φ(.)…”
Section: Factors Using NNMF
confidence: 99%
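The quote's equation (5) was lost in extraction. The separable Bregman divergence it invokes has the following standard form, where φ is a strictly convex, differentiable function applied elementwise; this reconstruction follows the common definition rather than the citing paper's exact notation:

```latex
D_\varphi(X \,\|\, Y) \;=\; \sum_{i,j} \Bigl[\, \varphi(X_{ij}) - \varphi(Y_{ij}) - \varphi'(Y_{ij})\,\bigl(X_{ij} - Y_{ij}\bigr) \Bigr]
```

Choosing φ(x) = x² recovers the squared Frobenius norm, and φ(x) = x log x recovers the (generalized) Kullback-Leibler divergence.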
“…To enable such a column-wise update even in ONMF, we derive a set of column-wise orthogonal constraints, taking into consideration both nonnegativity and orthogonality at the same time. Furthermore, we show that the column-wise orthogonal constraint can also be applied to the column-wise update algorithm called scalar Block Coordinate Descent for solving Bregman divergence NMF (sBCD-NMF) (Li et al. 2012), where the Frobenius norm in (1) is replaced with the more general Bregman divergence. This sBCD-ONMF algorithm is the first algorithm to solve ONMF with Bregman divergence.…”
Section: Introduction
confidence: 99%
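Column-wise updates of the kind sBCD-NMF generalizes can be illustrated with a HALS-style coordinate descent for the Frobenius case. This is a hedged sketch under assumed sizes, not the algorithm of Li et al. (2012), which extends such scalar/column-wise updates to general Bregman divergences via Taylor expansion.

```python
import numpy as np

# HALS-style column-wise coordinate descent for Frobenius-norm NMF.
# Sizes, rank, and iteration count are illustrative assumptions.
rng = np.random.default_rng(1)
A = rng.random((30, 20))
k = 5
W = rng.random((30, k))
H = rng.random((k, 20))

for _ in range(100):
    AHt, HHt = A @ H.T, H @ H.T
    for j in range(k):
        # Exact minimizer over column j of W, then project onto >= 0.
        W[:, j] += (AHt[:, j] - W @ HHt[:, j]) / max(HHt[j, j], 1e-12)
        W[:, j] = np.maximum(W[:, j], 0.0)
    AtW, WtW = A.T @ W, W.T @ W
    for j in range(k):
        # Exact minimizer over row j of H, then project onto >= 0.
        H[j, :] += (AtW[:, j] - H.T @ WtW[:, j]) / max(WtW[j, j], 1e-12)
        H[j, :] = np.maximum(H[j, :], 0.0)

err = np.linalg.norm(A - W @ H, "fro") / np.linalg.norm(A, "fro")
```

Each inner step solves a one-column (or one-row) nonnegative least-squares subproblem in closed form, which is what makes per-column constraints, such as the column-wise orthogonality discussed above, natural to enforce at each step.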
“…In Sect. 4, we incorporate the column-wise orthogonal constraint into sBCD-NMF, proposed by Li et al. (2012), in order to obtain the sBCD-ONMF algorithm. In Sect.…”
Section: Introduction
confidence: 99%