2011
DOI: 10.1109/lsp.2011.2139204

Multidimensional Shrinkage-Thresholding Operator and Group LASSO Penalties

Abstract: The scalar shrinkage-thresholding operator (SSTO) is a key ingredient of many modern statistical signal processing algorithms, including sparse inverse problem solutions, wavelet denoising, and JPEG2000 image compression. In these applications, it is customary to select the threshold of the operator by solving a scalar sparsity-penalized quadratic optimization. In this work, we present a natural multidimensional extension of the scalar shrinkage-thresholding operator. Similarly to the scalar case, the threshol…
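The abstract is cut off above, but the operators it names are standard. The scalar shrinkage-thresholding operator is the soft-thresholding map that solves a one-dimensional l1-penalized quadratic, and its usual multidimensional (block) analogue, the proximal operator of a group lasso penalty ||x||_2, shrinks the whole vector toward the origin along its own direction. The sketch below shows both in this common form; it is a minimal illustration, not the paper's exact formulation or threshold-selection rule, and the function names are illustrative only.

```python
import numpy as np

def soft_threshold(x, lam):
    """Scalar shrinkage-thresholding: argmin_t 0.5*(t - x)**2 + lam*|t|."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def block_soft_threshold(x, lam):
    """Multidimensional (block) shrinkage-thresholding:
    argmin_t 0.5*||t - x||_2**2 + lam*||t||_2, the prox of a group lasso penalty."""
    norm = np.linalg.norm(x)
    if norm <= lam:
        return np.zeros_like(x)          # the whole block is shrunk to zero
    return (1.0 - lam / norm) * x        # otherwise shrink toward the origin

# Example: a 3-dimensional block is either shrunk or vanishes entirely.
x = np.array([3.0, -4.0, 0.0])           # ||x||_2 = 5
print(block_soft_threshold(x, 2.0))      # -> [ 1.8 -2.4  0. ]
print(block_soft_threshold(x, 6.0))      # -> [ 0.  0.  0.]
```

Note the key difference from applying the scalar operator coordinate-wise: the block form either zeroes an entire vector or keeps all of its coordinates, which is what makes it useful for group-sparse penalties.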

Cited by 36 publications (45 citation statements) · References 20 publications
“…Consider the joint distribution of random processes , each of length . We will consider approximations of the form (8) where selects the parent. Let denote the set of all such approximations.…”
Section: Main Result: Best Parent and Causal Dependence Tree Approximation
confidence: 99%
“…"Group Lasso" is a method to infer the causal relationships between multivariate auto-regressive models [6]. Bolstad [8]. Tan and Willsky analyzed sample complexity for identifying the topology of a tree structured network of LTI systems [9].…”
Section: B Related Work
confidence: 99%
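The statement above points to group-lasso-based inference of causal (Granger) structure in multivariate autoregressive (VAR) models. As a rough illustration of how such a penalty is typically assembled, the sketch below groups all lagged coefficients from a candidate parent series j to a target series i into one block and penalizes the sum of the blocks' l2 norms, so that an entire candidate causal link can be driven exactly to zero. The variable names and this specific objective are illustrative assumptions, not the formulation used in [6] or [8].

```python
import numpy as np

def group_lasso_var_objective(X, A, lam):
    """Least-squares VAR fit plus a group lasso penalty over causal links.

    X   : (T, n) array of n time series observed over T samples.
    A   : (L, n, n) array of VAR coefficients; A[l, i, j] maps series j at
          lag l+1 to series i.  The group for the link j -> i is A[:, i, j].
    lam : penalty weight.
    """
    L, n, _ = A.shape
    T = X.shape[0]
    # One-step-ahead predictions for t = L, ..., T-1.
    preds = np.zeros((T - L, n))
    for l in range(L):
        preds += X[L - 1 - l:T - 1 - l] @ A[l].T
    residual = X[L:] - preds
    fit = 0.5 * np.sum(residual ** 2)
    # Group penalty: one l2 norm per (parent j, target i) pair across all lags.
    penalty = lam * np.sum(np.linalg.norm(A, axis=0))
    return fit + penalty
```

Minimizing such an objective with a block shrinkage-thresholding (proximal) step is what ties this line of work back to the operator discussed in the abstract.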
“…Unique to the OID formulation are the binary optimization variables {z_h}; finding the optimal (sub)set of inverters to dispatch involves the solution of combinatorially many subproblems. Nevertheless, a computationally affordable convex reformulation was developed in [11], by leveraging sparsity-promoting regularization [26] and semidefinite relaxation (SDR) techniques [12], [23], [27] as briefly described next.…”
Section: B Centralized Optimization Strategy
confidence: 99%
“…Specifically, the number of inverters operating under OID decreases as λ is increased [26]. Key to developing a relaxation of the OID task is to express powers and voltage magnitudes as linear functions of the outer-product Hermitian matrix V := vv^H, and to reformulate the OID problem with cost and constraints that are linear in V, as well as the constraints V ⪰ 0 and rank(V) = 1 [12], [23], [27].…”
Section: B Centralized Optimization Strategy
confidence: 99%
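The semidefinite relaxation described in this excerpt can be illustrated in a few lines. The sketch below (using cvxpy, with a toy Hermitian cost matrix and a single placeholder constraint, none of which come from [11] or the OID papers) lifts the decision vector v into the Hermitian matrix V = vv^H, keeps only constraints linear in V together with V ⪰ 0, and drops the nonconvex rank-one constraint, which is exactly the SDR step the excerpt refers to.

```python
import numpy as np
import cvxpy as cp

# Toy dimensions and data; in the OID setting v would collect nodal voltage
# phasors and the matrices below would encode power-flow quantities.
n = 4
rng = np.random.default_rng(0)
C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
C = (C + C.conj().T) / 2          # Hermitian cost matrix, so trace(C V) is real
A = np.eye(n)                     # placeholder Hermitian constraint matrix
b = 1.0

# Lifted variable V stands in for the rank-one outer product v v^H.
V = cp.Variable((n, n), hermitian=True)
constraints = [V >> 0,                              # V is positive semidefinite
               cp.real(cp.trace(A @ V)) <= b]       # a constraint linear in V
# The nonconvex rank(V) == 1 constraint is dropped: this is the SDR step.
prob = cp.Problem(cp.Minimize(cp.real(cp.trace(C @ V))), constraints)
prob.solve()                      # needs an SDP-capable solver, e.g. SCS
```

If the solver returns a (near) rank-one optimizer, v can be recovered from its leading eigenvector; otherwise rounding or randomization schemes are applied, as is standard practice with SDR.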