2015
DOI: 10.1109/taslp.2015.2450494

Multi-Channel Audio Source Separation Using Multiple Deformed References

Abstract: We present a general multi-channel source separation framework where additional audio references are available for one (or more) source(s) of a given mixture. Each audio reference is another mixture which is supposed to contain at least one source similar to one of the target sources. Deformations between the sources of interest and their references are modeled in a linear manner using a generic formulation. This is done by adding transformation matrices to an excitation-filter model, hence affecting different…

Cited by 21 publications (30 citation statements)
References 27 publications
“…and one can easily see that, while the new penalty keeps the group sparsity property thanks to Ψ_gr(H) defined in (12), it prevents (when γ_j > 0) the supergroups from vanishing, since if ‖H^(j)‖_1 tends to zero, then −log ‖H^(j)‖_1 tends to +∞. This formulation generalizes the group sparsity constraint in the sense that (13) reduces to (12) for γ_j = 0.…”
Section: Group Sparsity
confidence: 95%
“…For the separation to be feasible, we require that every learned source model has a corresponding non-zero activation; however, this constraint is not enforced by the group sparsity penalty in (12), where it can happen that a group of different sources is fit together using the same source model, instead of separately using their designated models, rendering their separation impossible. We observed this "source vanishing" phenomenon in practice, as illustrated in Fig.…”
Section: B. Relative Group Sparsity Constraints and Parameter Estimation
confidence: 99%
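The penalty discussed in these excerpts can be sketched numerically. The paper's exact Eq. (12) is not reproduced here, so the log-of-ℓ1-norm shape of the group term Ψ_gr(H) below is an illustrative assumption; the relative variant then subtracts a γ_j·log ‖H^(j)‖_1 term per supergroup, which grows without bound as a supergroup's activations shrink and thus discourages "source vanishing":

```python
import numpy as np

def group_sparsity_penalty(H, groups, lam=1.0, eps=1e-9):
    # Group sparsity term Psi_gr(H): a log-penalty on the l1 norm of each
    # group of rows of the activation matrix H. The exact form of Eq. (12)
    # in the paper is assumed here for illustration.
    return lam * sum(np.log(eps + np.abs(H[g]).sum()) for g in groups)

def relative_group_sparsity_penalty(H, groups, supergroups,
                                    lam=1.0, gammas=None, eps=1e-9):
    # Eq. (13)-style relative penalty (illustrative): keeps the group
    # sparsity term but subtracts gamma_j * log(||H^(j)||_1) for each
    # supergroup j. As ||H^(j)||_1 -> 0 this term tends to +inf,
    # preventing a whole supergroup from vanishing.
    if gammas is None:
        gammas = [1.0] * len(supergroups)
    penalty = group_sparsity_penalty(H, groups, lam, eps)
    for gamma, sg in zip(gammas, supergroups):
        penalty -= gamma * np.log(eps + np.abs(H[sg]).sum())
    return penalty
```

Setting every γ_j = 0 makes the second term disappear, so the relative penalty coincides with the plain group penalty, mirroring the quoted remark that (13) reduces to (12) for γ_j = 0.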