2016
DOI: 10.1007/s12532-016-0105-y

Revisiting compressed sensing: exploiting the efficiency of simplex and sparsification methods

Abstract: We propose two approaches to solve large-scale compressed sensing problems. The first approach uses the parametric simplex method to recover very sparse signals by taking a small number of simplex pivots, while the second approach reformulates the problem using Kronecker products to achieve faster computation via a sparser problem formulation. In particular, we focus on the computational aspects of these methods in compressed sensing. For the first approach, if the true signal is very sparse and we initialize …
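The first approach targets the standard basis-pursuit linear program, min ||x||_1 subject to Ax = b. The sketch below is a minimal illustration of that LP only: a generic solver (scipy.optimize.linprog) stands in for the authors' parametric simplex method, and the problem sizes and random sensing matrix are illustrative assumptions, not the paper's experiments.

    # Minimal sketch of the basis-pursuit LP (min ||x||_1 s.t. Ax = b).
    # NOTE: uses a generic LP solver as a stand-in, not the paper's
    # parametric simplex method; all sizes here are toy assumptions.
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    m, n, k = 40, 100, 5                      # measurements, signal length, sparsity
    A = rng.standard_normal((m, n))
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    b = A @ x_true

    # Split x = u - v with u, v >= 0 and minimize 1'(u + v) subject to A(u - v) = b.
    c = np.ones(2 * n)
    A_eq = np.hstack([A, -A])
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n), method="highs")
    x_hat = res.x[:n] - res.x[n:]
    print("recovery error:", np.linalg.norm(x_hat - x_true))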

Cited by 4 publications (1 citation statement)
References 44 publications
“…Kronecker products have many applications in signal processing, semidefinite programming, and quantum computing. They have also been used extensively in the theory and applications of linear matrix equations such as Sylvester equations and Lyapunov problems [21], in some compressed sensing applications using sparsification [23], and in constructing convex relaxations of non-convex sets [1]. Tensor product preconditioners used in the conjugate gradient method [20] and image restoration [15], low-rank tensor approximations arising in the context of certain quantum chemistry systems [22], quantum many-body systems [12], and high-dimensional partial differential equations [2,3] are among many other applications that utilize the rich properties of tensor products, which can transfer the structure of the individual elements to the product itself.…”
Section: Introduction (mentioning)
confidence: 99%
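The structure-transfer property noted at the end of this statement is what makes Kronecker reformulations attractive computationally. A minimal numpy check of two such properties, using arbitrary illustrative factors (not matrices from any of the cited works), is sketched below.

    # Toy check, under assumed small matrices, of how a Kronecker product
    # inherits structure from its factors.
    import numpy as np

    rng = np.random.default_rng(1)
    B = np.diag(rng.standard_normal(4))        # sparse factor
    C = np.triu(rng.standard_normal((3, 3)))   # structured factor
    K = np.kron(B, C)

    # nnz(B kron C) = nnz(B) * nnz(C): factor sparsity carries over to the product.
    print(np.count_nonzero(K) == np.count_nonzero(B) * np.count_nonzero(C))

    # (B kron C) vec(X) = vec(C X B^T) with column-major vec, so products with
    # the large Kronecker matrix only ever need the small factors.
    X = rng.standard_normal((3, 4))
    lhs = K @ X.flatten(order="F")
    rhs = (C @ X @ B.T).flatten(order="F")
    print(np.allclose(lhs, rhs))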