2016
DOI: 10.1109/tit.2016.2602222

Group-Sparse Model Selection: Hardness and Relaxations

Abstract: Group-based sparsity models are instrumental in linear and non-linear regression problems. The main premise of these models is the recovery of "interpretable" signals through the identification of their constituent groups, which can also provably translate into substantial savings in the number of measurements for linear models in compressive sensing. In this paper, we establish a combinatorial framework for group-model selection problems and highlight the underlying tractability issues. In particular, …

Cited by 19 publications (50 citation statements)
References 38 publications (128 reference statements)
“…This model is strongly related to (and can in fact be viewed as a special case of) the joint sparsity model [21,22,26], where one considers a signal consisting of several "channels" (such as the three color channels of an RGB image) and assumes that nonzero coefficients appear at the same location within each of the channels. A generalization of the block sparsity model is the group sparsity model, where the groups of nonzero coefficients are allowed to overlap [3,34].…”
Section: Introduction
confidence: 99%
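
The group-sparse model described in this excerpt can be made concrete by checking whether a signal's support is covered by a union of at most G (possibly overlapping) groups. The following is a minimal Python sketch, not code from the cited works; the function name is_group_sparse and the brute-force search over group subsets are assumptions made purely for illustration.

from itertools import combinations
import numpy as np

def is_group_sparse(x, groups, G, tol=0.0):
    # Illustrative sketch (not from the paper): test membership in the group-sparse
    # model, i.e. whether the support of x is covered by a union of at most G of the
    # given groups. Groups may overlap; brute force, so only suitable for tiny examples.
    support = set(np.flatnonzero(np.abs(np.asarray(x, dtype=float)) > tol))
    for g in range(G + 1):
        for combo in combinations(range(len(groups)), g):
            cover = set().union(*(set(groups[i]) for i in combo)) if combo else set()
            if support <= cover:
                return True
    return False

# Tiny example with overlapping groups on 6 coordinates.
groups = [[0, 1, 2], [2, 3, 4], [4, 5]]
x = np.array([0.0, 0.0, 1.5, -0.3, 0.0, 0.0])  # support {2, 3} lies inside group [2, 3, 4]
print(is_group_sparse(x, groups, G=1))  # True
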
“…This does not correspond to a matroid constraint, but it can nevertheless be handled using dynamic programming [20].…”
Section: Definition
confidence: 99%
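
The dynamic program cited as [20] handles acyclic overlapping group structures, which is more general than what is shown here. As a hedged illustration of the underlying idea only, the Python sketch below assumes disjoint groups and selects at most G of them, with total size at most K, so as to maximize the captured signal energy via a knapsack-style dynamic program. The function name and interface are invented for this example.

import numpy as np

def best_disjoint_group_support(x, groups, K, G):
    # Illustrative sketch, assuming DISJOINT groups: pick at most G groups whose
    # total size is at most K, maximizing the captured energy sum_{i in S} x_i^2,
    # with a standard knapsack-style dynamic program over the groups.
    x = np.asarray(x, dtype=float)
    gains = [float(np.sum(x[list(g)] ** 2)) for g in groups]
    sizes = [len(g) for g in groups]
    # tables[i][g, k]: best energy using the first i groups, <= g groups, <= k coordinates.
    tables = [np.zeros((G + 1, K + 1))]
    for gain, size in zip(gains, sizes):
        prev = tables[-1]
        cur = prev.copy()
        for g in range(1, G + 1):
            for k in range(size, K + 1):
                cur[g, k] = max(cur[g, k], prev[g - 1, k - size] + gain)
        tables.append(cur)
    # Backtrack which groups were selected.
    chosen, g, k = [], G, K
    for i in range(len(groups), 0, -1):
        if tables[i][g, k] != tables[i - 1][g, k]:  # group i-1 must have been taken
            chosen.append(i - 1)
            g, k = g - 1, k - sizes[i - 1]
    return tables[-1][G, K], sorted(chosen)

# Example call: best_disjoint_group_support(x, [[0, 1], [2, 3], [4, 5]], K=4, G=2)
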
“…This problem is in general NP-hard, but there exist special cases, such as the block model [7] and the acyclic overlapping group model [5], which can be solved exactly in polynomial time. Furthermore, convex relaxations such as the group lasso [1], [2] and the latent group lasso [3], [4] provide tractable proxies to the discrete problem, with the disadvantage though of not being able to obtain the entire solution set [5].…”
Section: Introduction
confidence: 99%
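
As a hedged illustration of how group lasso relaxations are typically optimized, the sketch below implements the proximal operator of the non-overlapping group-lasso penalty, i.e. block soft-thresholding, which is the basic step inside proximal-gradient solvers. It is not the latent group lasso and not code from the cited references; the names are chosen only for the example.

import numpy as np

def group_lasso_prox(x, groups, lam):
    # Proximal operator of lam * sum_g ||x_g||_2 for NON-overlapping groups:
    # each group is shrunk toward zero by block soft-thresholding.
    out = np.array(x, dtype=float)
    for g in groups:
        idx = list(g)
        norm = np.linalg.norm(out[idx])
        out[idx] *= max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
    return out

# A proximal-gradient step for min_x 0.5*||A x - b||^2 + lam * sum_g ||x_g||_2 would be
# x = group_lasso_prox(x - step * A.T @ (A @ x - b), groups, step * lam).
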
“…This has led to the sparse group lasso norm [11] and also to the general dynamic program for acyclic group structures that finds a K-sparse solution covered by at most G groups [5].…”
Section: Introduction
confidence: 99%
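
For the sparse group lasso penalty, and again only for non-overlapping groups, the proximal operator is known to decompose into elementwise soft-thresholding followed by group-wise shrinkage. The sketch below is a minimal illustration under that assumption; it is not taken from the cited works.

import numpy as np

def sparse_group_lasso_prox(x, groups, lam1, lam2):
    # Proximal operator of lam1*||x||_1 + lam2*sum_g ||x_g||_2 for NON-overlapping
    # groups: soft-threshold each entry, then block soft-threshold each group.
    xf = np.asarray(x, dtype=float)
    z = np.sign(xf) * np.maximum(np.abs(xf) - lam1, 0.0)
    for g in groups:
        idx = list(g)
        norm = np.linalg.norm(z[idx])
        z[idx] *= max(0.0, 1.0 - lam2 / norm) if norm > 0 else 0.0
    return z
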