2014
DOI: 10.1109/tit.2013.2291876

Blind Multilinear Identification

Abstract: We discuss a technique that allows blind recovery of signals or blind identification of mixtures in instances where such recovery or identification was previously thought to be impossible: (i) closely located or highly correlated sources in antenna array processing, (ii) highly correlated spreading codes in CDMA radio communication, (iii) nearly dependent spectra in fluorescence spectroscopy. This has important implications: in the case of antenna array processing, it allows for joint …

Cited by 78 publications (88 citation statements); references 65 publications.
“…In fact, the set of tensors of rank at most ξ is not closed if ξ > 1. Examples of this lack of closedness have been provided in the literature [15,16], which suffice to prove it. In other words, it may happen that, for a given tensor, any rank-r approximation of it admits another, strictly better rank-r approximation, so that no best one exists.…”
Section: Existence (mentioning)
confidence: 95%
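To make the non-closedness quoted above concrete, here is a small numerical sketch (an illustrative Python/NumPy example of my own, not code from the cited works) of the classical construction: a rank-3 tensor that is the limit of rank-2 tensors, so that no best rank-2 approximation of it exists.

import numpy as np

def outer3(a, b, c):
    # rank-one tensor a ⊗ b ⊗ c
    return np.einsum('i,j,k->ijk', a, b, c)

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])

# Target tensor T = a⊗a⊗b + a⊗b⊗a + b⊗a⊗a has rank 3.
T = outer3(a, a, b) + outer3(a, b, a) + outer3(b, a, a)

# The rank-2 sequence T_n = n*(a + b/n)⊗(a + b/n)⊗(a + b/n) - n*a⊗a⊗a converges to T.
for n in (1, 10, 100, 1000):
    Tn = n * outer3(a + b / n, a + b / n, a + b / n) - n * outer3(a, a, a)
    print(f"n = {n:5d}   ||T_n - T||_F = {np.linalg.norm(Tn - T):.2e}")

# The Frobenius error decays like 1/n: the infimum of ||T - X|| over rank-2 tensors X
# is zero, but it is not attained by any rank-2 tensor.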
“…Define the three coherences μ_A, μ_B, and μ_C associated with the matrices A, B, and C, respectively. It has indeed been shown in [15,17] that under the constraint:…”
Section: Existence (mentioning)
confidence: 99%
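The constraint referred to above is truncated in this citation statement; it is expressed in terms of the coherences of the factor matrices. As a reminder, the coherence of a matrix is the largest absolute inner product between two distinct normalized columns. A minimal sketch, assuming NumPy and a helper name coherence of my own choosing:

import numpy as np

def coherence(M):
    # coherence = max |<m_i, m_j>| over distinct normalized columns m_i, m_j of M
    cols = M / np.linalg.norm(M, axis=0, keepdims=True)  # normalize each column
    G = np.abs(cols.T @ cols)                            # absolute cosines between columns
    np.fill_diagonal(G, 0.0)                             # discard the trivial diagonal
    return G.max()

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4))
print("mu_A =", coherence(A))  # near 0 for nearly orthogonal columns, near 1 for nearly collinear ones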
“…These include: (i) impose orthogonality between columns of factor matrices [20] (in Blind Source Separation, this takes the form of a spatial prewhitening); (ii) impose orthogonality between decomposable tensors [45]; (iii) prevent divergence by bounding the coefficients λ_r [61], [54]; (iv) if the tensor is nonnegative, use a nonnegative CP [54]; (v) impose a minimal angle between columns of factor matrices [55]; (vi) compute an exact CP of another tensor⁸, which has undergone a multilinear compression via truncated HOSVD [21], [11]; (vii) compute another decomposition where the core tensor is block diagonal instead of diagonal [26], [79]; (viii) compute a Joint Approximate Diagonalization (JAD) of matrix slices, which may be viewed as another decomposition where the core tensor is not diagonal [62], [87], [89], [2], [86], [51], [20], [30], [56], [69], [14], as depicted in Figure 1. The drawbacks of this family of approaches, which is becoming increasingly popular, are threefold.…”
Section: Approximate Decompositions (mentioning)
confidence: 99%
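Remedy (vi) in the list above, multilinear compression via a truncated HOSVD, can be sketched as follows (an illustrative NumPy implementation written under my own assumptions, not the code of references [21], [11]): the tensor is projected onto the dominant subspaces of its three unfoldings, and a CP can then be computed on the much smaller core.

import numpy as np

def unfold(T, mode):
    # mode-n unfolding of a 3rd-order tensor: rows indexed by the chosen mode
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def truncated_hosvd(T, ranks):
    # dominant left singular vectors of each unfolding, then the compressed core
    U = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
         for m, r in enumerate(ranks)]
    core = np.einsum('ijk,ia,jb,kc->abc', T, U[0], U[1], U[2])
    return U, core

# A rank-4 CP tensor has multilinear rank at most (4, 4, 4), so the compression below
# is numerically lossless and an exact CP can be computed on the 4 x 4 x 4 core.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 4))
B = rng.standard_normal((30, 4))
C = rng.standard_normal((40, 4))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

U, core = truncated_hosvd(T, (4, 4, 4))
T_hat = np.einsum('abc,ia,jb,kc->ijk', core, U[0], U[1], U[2])
print("core shape:", core.shape, "relative error:",
      np.linalg.norm(T - T_hat) / np.linalg.norm(T))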
“…As it requires only mild assumptions to be essentially unique, the CPD provides a means for blindly and jointly identifying the components of multilinear models, which arise in many real-world applications; see [1][2][3] for some examples.…”
Section: Introduction (mentioning)
confidence: 99%
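As an illustration of this blind identification property, the sketch below (a minimal example of my own, not the authors' algorithm) fits a CPD by alternating least squares and recovers the factors of a synthetic multilinear model from the tensor alone, up to the usual column permutation and scaling ambiguities.

import numpy as np

def unfold(T, mode):
    # mode-n unfolding of a 3rd-order tensor
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(X, Y):
    # column-wise Kronecker product of X (I x R) and Y (J x R), giving an (I*J) x R matrix
    return np.einsum('ir,jr->ijr', X, Y).reshape(-1, X.shape[1])

def cp_als(T, rank, iters=500, seed=0):
    # plain alternating least squares for a 3rd-order CP decomposition
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, rank)) for s in T.shape)
    for _ in range(iters):
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C

# Synthetic noiseless rank-3 multilinear model with random factor matrices.
rng = np.random.default_rng(3)
A0, B0, C0 = (rng.standard_normal((s, 3)) for s in (10, 12, 14))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)

A, B, C = cp_als(T, rank=3)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print("relative fit error:", np.linalg.norm(T - T_hat) / np.linalg.norm(T))
# Under the usual uniqueness conditions, the recovered columns match A0, B0, C0
# up to permutation and scaling.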