2018
DOI: 10.3390/s18051371

Unsupervised Learning for Monaural Source Separation Using Maximization–Minimization Algorithm with Time–Frequency Deconvolution †

Abstract: This paper presents an unsupervised learning algorithm for sparse nonnegative matrix factor time–frequency deconvolution with optimized fractional β-divergence. The β-divergence is a group of cost functions parametrized by a single parameter β. The Itakura–Saito divergence, Kullback–Leibler divergence and Least Square distance are special cases that correspond to β=0, 1, 2, respectively. This paper presents a generalized algorithm that uses a flexible range of β that includes fractional values. It describes a …
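As a point of reference for the abstract, the sketch below evaluates the β-divergence for integer and fractional β. It is a generic illustration of the standard definition, not the paper's optimized NMF deconvolution updates; the function name and NumPy usage are this note's own.

```python
import numpy as np

def beta_divergence(x, y, beta):
    """Element-wise beta-divergence d_beta(x | y), summed over all entries.

    Assumes x, y > 0.
    beta = 0  -> Itakura-Saito divergence
    beta = 1  -> (generalized) Kullback-Leibler divergence
    beta = 2  -> half the squared Euclidean (least-squares) distance
    Other (fractional) beta values use the general formula.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if beta == 0:
        return np.sum(x / y - np.log(x / y) - 1.0)
    if beta == 1:
        return np.sum(x * np.log(x / y) - x + y)
    return np.sum(
        (x ** beta + (beta - 1.0) * y ** beta - beta * x * y ** (beta - 1.0))
        / (beta * (beta - 1.0))
    )
```

For example, `beta_divergence(V, W @ H, beta=1.5)` would score a (hypothetical) nonnegative factorization `W @ H` of a magnitude spectrogram `V` under a fractional β.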

Cited by 9 publications (11 citation statements) | References: 41 publications

Citation statements (ordered by relevance):

“…Therefore, μ and λ need to be set appropriately so that the final optimization result of equation (16) is consistent with equation (15). Direction 2: combining the distance metric function and the negative entropy contrast function, we reduce the constrained conditions and establish two forms of cost functions F(w), according to the different transformation forms of min ε(y, r):…”
Section: Enhanced ICA with Reference
confidence: 99%
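The quoted passage combines a negentropy contrast with a distance-to-reference penalty into a single cost F(w). The sketch below shows one hypothetical form of such a combination, written from the general ICA-with-Reference literature; the weight mu, the log-cosh nonlinearity, and the mean-square distance ε(y, r) are assumptions, not the citing paper's equations (15) and (16).

```python
import numpy as np

def eica_r_cost(w, X, r, mu=1.0):
    """Hypothetical one-unit cost F(w) for ICA with reference:
    negentropy contrast of y = w^T X minus a weighted distance to the
    reference signal r. X is assumed whitened, shape (channels, samples).
    """
    w = w / np.linalg.norm(w)            # keep the unit-norm constraint
    y = w @ X                            # extracted source estimate, shape (samples,)
    # Negentropy approximation J(y) ~ (E[G(y)] - E[G(nu)])^2 with G(u) = log cosh(u)
    # and nu a standard Gaussian sample.
    nu = np.random.default_rng(0).standard_normal(y.size)
    J = (np.mean(np.log(np.cosh(y))) - np.mean(np.log(np.cosh(nu)))) ** 2
    eps = np.mean((y - r) ** 2)          # distance metric epsilon(y, r)
    return J - mu * eps                  # one possible "combined" form of F(w)
```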
“…The cost function in equation (17) and the cost function in equation (13) are reciprocal relations, while the cost function in equation (18) and the cost function in equation (14) have opposite signs. Then, the corresponding EICA-R scheme is given by the following expressions: the scheme of (19) and (20) corresponds to the scheme of (15) and (16). It also faces the problem of the division operation or of iteratively setting the scaling factor.…”
Section: Enhanced ICA with Reference
confidence: 99%
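The "reciprocal" and "opposite-sign" relations mentioned here only flip the optimization direction; a minimal numerical check (the toy cost below is not one of the cited paper's equations (13) through (20)):

```python
import numpy as np

# For a strictly positive cost F(w), maximizing F, minimizing 1/F (its
# reciprocal form) and minimizing -F (its sign-flipped form) all select
# the same optimum; only the update direction and scaling change.
F = lambda w: np.exp(-(w - 2.0) ** 2) + 1e-6
grid = np.linspace(-5, 5, 10001)
w_max_F   = grid[np.argmax(F(grid))]
w_min_inv = grid[np.argmin(1.0 / F(grid))]
w_min_neg = grid[np.argmin(-F(grid))]
assert np.isclose(w_max_F, w_min_inv) and np.isclose(w_max_F, w_min_neg)
```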
“…However, when a fault becomes severe, it is not consistent with the development of the fault. The correlation coefficient [27,28] and Kullback-Leibler (K-L) divergence [29-31] do not work well for identifying faults. Recently, sample entropy (SE) [32,33], approximate entropy (AE) [34,35] and fuzzy entropy (FE) [36,37] were introduced into the fault diagnosis domain.…”
Section: Introduction
confidence: 99%
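Of the measures named in the quoted passage, sample entropy has a compact standard definition; the sketch below follows that generic definition and is not code from references [32,33]. The defaults m = 2 and r = 0.2·std are common conventions, not values taken from the cited works.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D signal (standard definition).

    Counts pairs of templates of length m (B) and m+1 (A) whose Chebyshev
    distance is <= r, excluding self-matches, and returns -ln(A / B).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    if r is None:
        r = 0.2 * np.std(x)

    def count_pairs(length):
        num = n - m                  # same number of templates for both lengths
        templates = np.array([x[i:i + length] for i in range(num)])
        total = 0
        for i in range(num):
            d = np.max(np.abs(templates - templates[i]), axis=1)
            total += np.count_nonzero(d <= r) - 1   # drop the self-match
        return total

    b = count_pairs(m)
    a = count_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

Lower values indicate a more regular, predictable signal; higher values indicate more irregularity, which is the property these entropy measures exploit for fault indicators.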
“…In source separation, it is more realistic to consider the effect of the surrounding environment, such as reflections of the sources. To address this issue, researchers have considered convolutive mixtures [1-7] instead of instantaneous mixtures [8-11]. However, the convolutive mixture is modeled under the narrowband approximation [4], which is not valid when the mixing filter length is greater than the Short-Time Fourier Transform (STFT) window length, as is the case in a reverberant environment.…”
Section: Introduction
confidence: 99%
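The narrowband approximation referred to in the last passage replaces time-domain convolution with a per-frequency-bin multiplication of STFTs. The sketch below checks that approximation numerically for a single channel; the signal length, the 64-tap filter, and SciPy's STFT defaults are this note's assumptions, not settings from the cited works.

```python
import numpy as np
from scipy.signal import stft

rng = np.random.default_rng(0)
fs, nperseg = 16000, 1024
s = rng.standard_normal(4 * fs)                               # source signal
h = np.exp(-np.arange(64) / 16.0) * rng.standard_normal(64)   # short mixing filter
x = np.convolve(s, h)[: s.size]                               # convolutive observation

# Narrowband approximation: STFT{h * s}[f, t] ~ H[f] * STFT{s}[f, t],
# which only holds when the filter is much shorter than the STFT window.
_, _, S = stft(s, fs=fs, nperseg=nperseg)
_, _, X = stft(x, fs=fs, nperseg=nperseg)
H = np.fft.rfft(h, nperseg)            # filter response on the STFT frequency grid
rel_err = np.linalg.norm(X - H[:, None] * S) / np.linalg.norm(X)
print(f"relative error of the narrowband approximation: {rel_err:.3f}")
# The error grows as len(h) approaches or exceeds nperseg, i.e. the reverberant case.
```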