2016
DOI: 10.1109/tip.2016.2556582
Two-Dimensional Pattern-Coupled Sparse Bayesian Learning via Generalized Approximate Message Passing

Abstract: We consider the problem of recovering 2D block-sparse signals with unknown cluster patterns. The 2D block-sparse patterns arise naturally in many practical applications, such as foreground detection and inverse synthetic aperture radar imaging. To exploit the underlying block-sparse structure, we propose a 2D pattern-coupled hierarchical Gaussian prior model. The proposed pattern-coupled hierarchical Gaussian prior model imposes a soft coupling mechanism among neighboring coefficients through their shared hype…
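The soft coupling the abstract describes can be sketched from the 1D pattern-coupled prior this work builds on; the notation below is an illustrative reconstruction, not quoted from the paper. Each coefficient's prior precision mixes its own hyperparameter with those of its neighbors:

```latex
p(x \mid \boldsymbol{\alpha}) \;=\; \prod_{i} \mathcal{N}\!\left( x_i \,\middle|\, 0,\; \left( \alpha_i + \beta\,\alpha_{i-1} + \beta\,\alpha_{i+1} \right)^{-1} \right)
```

so a small hyperparameter at one location also lowers the prior precision of its neighbors, encouraging clustered supports; the 2D version couples each pixel with its left, right, upper, and lower neighbors.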

Cited by 75 publications (48 citation statements)
References 30 publications (65 reference statements)
“…To efficiently solve the reformulated optimization problem, we apply a nature-inspired metaheuristic algorithm called the Beetle Antennae Search (BAS) algorithm. We leverage the properties of metaheuristic algorithms in general, i.e., their well-known ability to efficiently solve complex nonlinear non-convex optimization problems [41]-[45]. Metaheuristic algorithms have found application in several practical situations [46]-[52].…”
Section: B. Aim and Organization of Our Work
confidence: 99%
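The excerpt above names Beetle Antennae Search without detail. A minimal sketch of its core rule, for a generic objective to minimize; the function name, step-decay schedule, and defaults here are illustrative assumptions, not taken from the cited work:

```python
import numpy as np

def beetle_antennae_search(f, x0, steps=200, d0=1.0, step0=1.0, eta=0.95):
    """Minimal BAS sketch: at each iteration the 'beetle' probes the
    objective at two antennae x +/- d*b along a random unit direction b,
    then steps toward the side with the smaller objective value."""
    x = np.asarray(x0, dtype=float)
    d, step = d0, step0
    best_x, best_f = x.copy(), f(x)
    for _ in range(steps):
        b = np.random.randn(x.size)
        b /= np.linalg.norm(b) + 1e-12              # random unit direction
        xl, xr = x + d * b, x - d * b               # left / right antennae
        x = x - step * b * np.sign(f(xl) - f(xr))   # move toward the smaller value
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
        d, step = eta * d + 0.01, eta * step        # shrink sensing length and step
    return best_x, best_f

# usage: minimize a simple quadratic from a non-optimal start
xb, fb = beetle_antennae_search(lambda x: float(np.sum(x ** 2)), np.ones(3))
```

The gradient-free probe-and-step structure is what makes BAS attractive for the non-convex, non-differentiable objectives the excerpt mentions.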
“…The ELBO can be re-expressed as

$$\mathcal{L}(q, \Theta) = \int q(\theta)\,\ln \frac{p(y \mid x, \beta)\, p(x \mid \alpha_1, \alpha_2, \kappa, v)\, p(\alpha_1)\, p(\alpha_2)\, p(\beta)\, p(\kappa)}{q(\theta)}\, d\theta \qquad (24)$$

Because the variables $\{x_i\}$ in $p(y \mid x, \beta)$ are non-factorizable, updating the approximate posterior distribution $q(x)$ becomes intractable. The high computational cost involved in obtaining $q(x)$, owing to the computation of the inverse of a $K \times K$ matrix at each iteration, inhibits the application of the conventional SBL method.…”
Section: Overview of Variational Inference
confidence: 99%
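The K × K inverse the excerpt refers to is the posterior covariance under a standard linear-Gaussian SBL model, y = Φx + noise. A minimal sketch of that per-iteration step; the function name and the direct use of a dense inverse are illustrative assumptions, not the cited paper's implementation:

```python
import numpy as np

def sbl_posterior(Phi, y, alpha, beta):
    """Gaussian posterior of x under an SBL model with per-coefficient
    prior precisions alpha and noise precision beta. The covariance
    requires inverting a K x K matrix (K = number of coefficients),
    the O(K^3) step that makes conventional SBL costly per iteration."""
    K = Phi.shape[1]
    Sigma = np.linalg.inv(beta * Phi.T @ Phi + np.diag(alpha))  # K x K inverse
    mu = beta * Sigma @ Phi.T @ y                               # posterior mean
    return mu, Sigma
```

Avoiding exactly this inverse is the motivation for message-passing approximations such as the GAMP scheme used in the paper.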
“…First, our model can be readily applied to either SMV or MMV problems, while the original PC-SBL algorithm proposed in [28] solves the clustered-pattern SMV case. Although it has recently been extended via generalized approximate message passing (GAMP) to solve 2D problems [44], it needs some extra modifications to be used for MMVs. Second, our model uses the Bernoulli-Gaussian prior and promotes the clustering pattern by adding hyperpriors on the supports of the solution, while in [28], this task is performed on the variances of the solution components.…”
Section: Background and Introduction
confidence: 99%