2014
DOI: 10.1109/lsp.2013.2294862
Localization of More Sources Than Sensors via Jointly-Sparse Bayesian Learning

Cited by 30 publications (14 citation statements)
References 11 publications
“…Lemma 1. Let the estimator λ be as defined in (4). Then, under Assumptions 1 and 2 with c = c′ = c″ = 1, we have that…”
Section: A. The Estimator (mentioning, confidence: 99%)
“…However, none of the above works addresses the question of the tradeoff between m and n when m < k. Initial works considering the m < k regime were [21] and [4], followed by [18] and [23], where it was empirically demonstrated that when multiple samples are available, it is possible to operate in the m < k regime. However, the analysis in [4] is done under two fairly restrictive conditions. The first is an orthogonality assumption on the data vectors that requires ∑_{i=1}^{n} X_i X_i^⊤ to be diagonal.…”
Section: Introduction (mentioning, confidence: 99%)
“…Recently it was shown that, given the true dictionary A and a data segment Y_s ∈ ℝ^{M×L_s}, where L_s is the length of the segment in data frames, M-SBL (multiple-measurement sparse Bayesian learning) applied directly to Y_s can identify active sources under the assumption that the sources are uncorrelated within the time segment [22]. The number of sources identified in this case is not limited by the number of channels M: up to 1 ≤ k ≤ M(M+1)/2 sources can be recovered.…”
Section: Related Work (mentioning, confidence: 99%)
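The M-SBL scheme referenced in the quote above can be illustrated with a minimal sketch. This is not the cited authors' implementation; it is a standard EM-style M-SBL iteration (Wipf–Rao form) on a synthetic multiple-measurement model Y = AX + noise, with an easy k < M test problem whose dimensions and variable names are purely illustrative. The learned row-variance hyperparameters γ indicate which dictionary rows are active:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, k, L = 8, 20, 3, 50           # sensors, dictionary atoms, active sources, snapshots
A = rng.standard_normal((M, N))
A /= np.linalg.norm(A, axis=0)       # unit-norm dictionary columns
support = np.array([2, 7, 15])       # true active rows (illustrative choice)
X = np.zeros((N, L))
X[support] = rng.standard_normal((k, L))
sigma2 = 1e-3
Y = A @ X + np.sqrt(sigma2) * rng.standard_normal((M, L))

gamma = np.ones(N)                   # row-wise variance hyperparameters
for _ in range(200):
    G = np.diag(gamma)
    Sy = sigma2 * np.eye(M) + A @ G @ A.T          # marginal covariance of each snapshot
    Sy_inv = np.linalg.inv(Sy)
    mu = G @ A.T @ Sy_inv @ Y                      # posterior mean of X (N x L)
    # diagonal of posterior covariance: gamma_i - gamma_i^2 * a_i^T Sy^{-1} a_i
    diag_cov = gamma - gamma**2 * np.einsum('mi,mn,ni->i', A, Sy_inv, A)
    gamma = (mu**2).mean(axis=1) + diag_cov        # EM update of the hyperparameters
    gamma = np.maximum(gamma, 1e-12)               # numerical floor

est_support = np.sort(np.argsort(gamma)[-k:])
print(est_support)
```

In this benign regime the off-support γ_i shrink toward zero while the active rows retain large variances, so the top-k hyperparameters recover the support. The k > M regime discussed in the quote additionally requires the sources to be uncorrelated over the segment.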
“…Under the framework of BCS, signal reconstruction is mainly achieved by sparse Bayesian learning (SBL) [6]. SBL has seen substantial development in recent research, including root SBL [7], variational SBL [8], jointly sparse SBL [9], and the grid evolution method [15]. SBL uses a multilayered hierarchical prior that is iterated to “learn” new information and update the hyperparameters.…”
Section: Introduction (mentioning, confidence: 99%)
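The “multilayered assumption frame” mentioned in the quote above refers to the standard two-layer SBL hierarchy (notation here is the conventional one, not taken from the cited works): a zero-mean Gaussian prior on each coefficient with its own variance hyperparameter, and a conjugate hyperprior on those variances, with the hyperparameters updated iteratively by type-II maximum likelihood.

```latex
% Two-layer SBL hierarchy (standard form; notation illustrative)
p(\mathbf{y}\mid\mathbf{x})
  = \mathcal{N}\!\left(\mathbf{y}\mid A\mathbf{x},\ \sigma^{2} I\right),
\qquad
p(\mathbf{x}\mid\boldsymbol{\gamma})
  = \prod_{i=1}^{N} \mathcal{N}\!\left(x_i \mid 0,\ \gamma_i\right),
\qquad
p(\gamma_i) = \mathrm{Inv\text{-}Gamma}(\gamma_i \mid a,\ b).
```

Iterating between the posterior over x and the evidence-maximizing update of γ (and, optionally, σ²) is what the quoted passage describes as “learning” new information at each step.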