2017
DOI: 10.1016/j.dsp.2017.05.003
Sparse Bayesian learning using correlated hyperparameters for recovery of block sparse signals

Cited by 8 publications (11 citation statements)
References 17 publications
“…where κ is a single fixed constant. In practice, the relevance between the neighbour blocks is inhomogeneous, meaning that κs shall be a set of various coefficients [31]. Thus, in the first layer, we modify Equation (10) and develop an adaptive coupled pattern with the form…”
Section: Probability Distribution Of X
confidence: 99%
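The adaptive coupled pattern quoted above can be sketched numerically. The function below is an illustrative assumption, not the cited paper's exact model: each coefficient's prior precision mixes its own hyperparameter γᵢ with its neighbours' γᵢ₋₁ and γᵢ₊₁, weighted by a per-index coupling coefficient κᵢ (a set of varying coefficients rather than the single fixed constant κ the quote criticises).

```python
import numpy as np

def coupled_prior_variance(gamma, kappa):
    """Hypothetical pattern-coupled prior variances for block-sparse SBL.

    gamma : per-coefficient hyperparameters (precisions).
    kappa : per-coefficient coupling weights to the two neighbours.
    """
    gamma = np.asarray(gamma, dtype=float)
    kappa = np.asarray(kappa, dtype=float)
    left = np.roll(gamma, 1)
    left[0] = 0.0          # no left neighbour at the boundary
    right = np.roll(gamma, -1)
    right[-1] = 0.0        # no right neighbour at the boundary
    precision = gamma + kappa * (left + right)
    return 1.0 / precision
```

With all κᵢ = 0 this collapses to the standard uncoupled SBL prior, which is the sense in which the coupled pattern generalises it.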
“…In this part, the expectation-maximisation method is invoked to estimate η1 and η2. The expectation of the log-distribution of η1 and η2 is obtained with Equation (12) [31].…”
Section: Estimation Of Coupled Parameters η1 And η2
confidence: 99%
“…Experiment 1: This paper compares the reconstruction results of the block-sparse Bayesian learning fast marginalized-likelihood maximization algorithm (BSBL-FM) [23], the orthogonal matching pursuit algorithm (OMP) [24], and the MHSPCS model for multi-channel HS signals. At the same compression ratio, the running time, PSNR, and reconstruction similarity coefficient of the three algorithms were tested.…”
Section: Experiments And Analysis
confidence: 99%
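Of the baselines named in the quoted experiment, OMP is the simplest to sketch. The following is a minimal, assumed implementation of standard OMP (greedy atom selection with a least-squares refit each iteration); it is not the cited papers' code, and the function name and signature are illustrative.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: pick k columns of A to explain y.

    A : (m, n) sensing/dictionary matrix.
    y : (m,) measurement vector.
    k : assumed sparsity level.
    """
    residual = y.astype(float).copy()
    support = []
    x_s = np.zeros(0)
    for _ in range(k):
        # Select the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # Least-squares refit on the enlarged support.
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x
```

BSBL-FM additionally exploits block structure and intra-block correlation, which is why it typically outperforms plain OMP on block-sparse signals at the cost of more computation per iteration.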
“…A source is sparse in a given representation domain if most of its elements are close to zero. The CS technique requires sparsity of the sources, which restricts its application [9]. In [10], the authors proposed to apply the CS technique to solve underdetermined real-valued finite-alphabet source recovery problems.…”
confidence: 99%