2006 IEEE International Conference on Acoustics, Speech and Signal Processing Proceedings
DOI: 10.1109/icassp.2006.1661352

New Algorithms for Non-Negative Matrix Factorization in Applications to Blind Source Separation

Cited by 152 publications (121 citation statements) | References 3 publications
“…Kullback-Leibler (KL) divergence between V and WH, denoted D(V || WH), was found to work well for audio source separation in [2], so we will restrict ourselves to KL divergence in this paper. Generalization to other objective functions using the techniques described in [4] is straightforward.…”
Section: Algorithm
confidence: 99%
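The quote above refers to the KL-divergence NMF objective D(V || WH). For concreteness, here is a minimal NumPy sketch of the standard multiplicative updates (Lee-Seung style) that decrease this divergence; the iteration count, random seeding, and eps guard are illustrative assumptions, not details from the cited papers.

```python
import numpy as np

def nmf_kl(V, rank, n_iter=200, eps=1e-9, seed=0):
    """Minimize D(V || WH) with the standard multiplicative updates."""
    rng = np.random.default_rng(seed)
    n_f, n_t = V.shape
    W = rng.random((n_f, rank)) + eps
    H = rng.random((rank, n_t)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        # H update: H <- H * (W^T (V / WH)) / (W^T 1)
        H *= (W.T @ (V / WH)) / (W.sum(axis=0)[:, None] + eps)
        WH = W @ H + eps
        # W update: W <- W * ((V / WH) H^T) / (1 H^T)
        W *= ((V / WH) @ H.T) / (H.sum(axis=1)[None, :] + eps)
    return W, H
```

Both updates are multiplicative, so W and H stay non-negative as long as they are initialized non-negative.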
“…During training, we assume availability of a clean speech spectrogram, V_speech, of size n_f × n_st, and a clean (speech-free) noise spectrogram, V_noise, of size n_f × n_nt, where n_f is the number of frequency bins, n_st is the number of speech frames, and n_nt is the number of noise frames. Different objective functions lead to different variants of NMF, a number of which are described in [4]. Kullback-Leibler (KL) divergence between V and WH, denoted D(V || WH), was found to work well for audio source separation in [2], so we will restrict ourselves to KL divergence in this paper.…”
Section: Algorithm
confidence: 99%
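Under the training setup quoted above, a plausible sketch of the dictionary-learning step (assuming the nmf_kl helper from the earlier block; the spectrogram sizes and ranks here are illustrative stand-ins, not values from the cited work) learns separate speech and noise bases and stacks them:

```python
import numpy as np

# Stand-in clean training spectrograms; in practice these are magnitude
# STFTs of clean speech (n_f x n_st) and speech-free noise (n_f x n_nt).
rng = np.random.default_rng(0)
V_speech = rng.random((257, 500)) + 1e-9   # n_f = 257, n_st = 500
V_noise = rng.random((257, 300)) + 1e-9    # n_f = 257, n_nt = 300

W_speech, _ = nmf_kl(V_speech, rank=40)    # speech dictionary
W_noise, _ = nmf_kl(V_noise, rank=20)      # noise dictionary
W = np.hstack([W_speech, W_noise])         # joint dictionary, n_f x 60
```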
“…There are several ways to achieve some control of the sparsity. In this paper, we follow the approach proposed in [7] and [9] for KL cost functions, in which the NMF is regularized using non-linear projections based on (3). Applying this procedure, the regularized learning rules are the following,…”
Section: Non-Negative Matrix Factorization (NMF)
confidence: 99%
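The regularized learning rules themselves are elided from the quote, so the sketch below is a generic sparse-KL variant with an L1 penalty on the activations, not the exact non-linear-projection rules of [7] and [9]; the lam value and the basis-normalization step are assumptions.

```python
import numpy as np

def sparse_nmf_kl(V, rank, lam=0.1, n_iter=200, eps=1e-9, seed=0):
    """KL-divergence NMF with an L1 penalty lam * sum(H) on the activations.

    Generic sparse variant for illustration only; the cited papers use
    different (projection-based) regularized learning rules.
    """
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], rank)) + eps
    H = rng.random((rank, V.shape[1])) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        # The L1 term shows up as +lam in the denominator of the H update.
        H *= (W.T @ (V / WH)) / (W.sum(axis=0)[:, None] + lam)
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (H.sum(axis=1)[None, :] + eps)
        # Keep basis columns at unit L1 norm so the penalty cannot be
        # dodged by shrinking H and growing W; rescale H to compensate.
        scale = W.sum(axis=0, keepdims=True) + eps
        W /= scale
        H *= scale.T
    return W, H
```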
“…However, recent NMF-based techniques in speech processing report better results by using NMF with KL divergence [6], [4]. For this reason, in this paper, we propose an NMF-based method for speech denoising which combines the use of the KL divergence with sparseness constraints following the procedure described in [7]. This paper is organized as follows: Section 2 introduces the mathematical background of NMF; in Section 3 we present the speech denoising process using NMF.…”
Section: Introduction
confidence: 99%
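Given a trained joint dictionary W = [W_speech, W_noise] as in the earlier sketch, a common separation step fixes W, fits the activations on the noisy spectrogram, and keeps the speech part with a Wiener-style mask. This is one standard recipe, not necessarily the exact procedure of the citing paper; the function name and mask form are assumptions.

```python
import numpy as np

def denoise(V_noisy, W, n_speech, n_iter=100, eps=1e-9, seed=0):
    """Fix the trained joint dictionary W, fit activations H on the noisy
    spectrogram with the KL update for H only, then mask out the noise."""
    rng = np.random.default_rng(seed)
    H = rng.random((W.shape[1], V_noisy.shape[1])) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (V_noisy / WH)) / (W.sum(axis=0)[:, None] + eps)
    V_speech_hat = W[:, :n_speech] @ H[:n_speech]     # speech reconstruction
    return V_noisy * (V_speech_hat / (W @ H + eps))   # Wiener-style mask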
“…Atoms within the same group must co-occur by definition. Along with the co-occurrence constraints, to improve the likelihood of co-occurrence within groups, we also impose a smoothness constraint on S using established NMF modifications [7,8]. Fig.…”
Section: That Ss
confidence: 99%
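One established way to impose the smoothness constraint mentioned above is to penalize frame-to-frame changes in the activations. The sketch below shows such a penalized KL objective; the squared-difference form and the alpha weight are generic illustrative choices, and the exact modification used in [7, 8] may differ.

```python
import numpy as np

def smooth_kl_objective(V, W, H, alpha=0.1, eps=1e-9):
    """Generalized KL reconstruction cost plus a squared frame-to-frame
    difference penalty encouraging temporally smooth activations."""
    WH = W @ H + eps
    kl = np.sum(V * np.log((V + eps) / WH) - V + WH)
    smooth = np.sum(np.diff(H, axis=1) ** 2)  # penalize activation jumps
    return kl + alpha * smooth
```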