1995
DOI: 10.1117/12.217610

Title: Examples of basis pursuit

Abstract: The Time-Frequency and Time-Scale communities have recently developed a large number of overcomplete waveform dictionaries. Decomposition into overcomplete systems is not unique, and several methods for decomposition have been proposed, including the Method of Frames, Matching Pursuit, and, for special dictionaries, the Best Orthogonal Basis. Basis Pursuit is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest ℓ1 norm of coefficients…
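To make the principle concrete, here is a minimal sketch of Basis Pursuit as a linear program: the ℓ1 minimization is split into positive and negative parts and handed to a generic LP solver. The spikes-plus-DCT dictionary and the 3-atom test signal are illustrative choices, not taken from the paper.

```python
# Basis Pursuit sketch: among all coefficient vectors alpha with
# Phi @ alpha == s, find the one of smallest l1 norm, via an LP.
import numpy as np
from scipy.fft import idct
from scipy.optimize import linprog

n = 64
# Overcomplete dictionary: spikes (identity) plus an orthonormal DCT basis.
# This particular dictionary is an illustrative assumption.
Phi = np.hstack([np.eye(n), idct(np.eye(n), norm="ortho")])

# Synthesize a signal from 3 dictionary atoms.
alpha_true = np.zeros(2 * n)
alpha_true[[5, 70, 100]] = [1.0, -2.0, 1.5]
s = Phi @ alpha_true

# Split alpha = u - v with u, v >= 0, so min ||alpha||_1 becomes the LP:
#   min 1^T (u + v)  subject to  [Phi, -Phi] [u; v] = s.
p = Phi.shape[1]
res = linprog(c=np.ones(2 * p),
              A_eq=np.hstack([Phi, -Phi]), b_eq=s,
              bounds=[(0, None)] * (2 * p), method="highs")
alpha_bp = res.x[:p] - res.x[p:]
print("max recovery error:", np.max(np.abs(alpha_bp - alpha_true)))
```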

Cited by 34 publications (34 citation statements) · References 1 publication (3 reference statements)

“…Vice versa, any minimizer θ* of the problem (2) corresponds to one (or more) AMP fixed points of the form (θ*, R*, b*). This equation always admits at least one solution, since b ↦ (r; b) is continuous in b ≥ 0, with (r; 0) = 0 and (for ρ strictly convex) (r; ∞) = 1; cf. Proposition 6.3.…”
Section: Relation To M-estimation (mentioning, confidence: 99%)
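The existence claim in the quoted statement is an intermediate value argument: a continuous map of b that starts at 0 and tends to 1 must cross any level in (0, 1). A hedged numerical sketch, with a stand-in function g in place of the paper's calibration map (which is not named in the snippet):

```python
# Illustration only: g below is a stand-in, NOT the function from the
# cited paper. It is continuous with g(0) = 0 and g(b) -> 1 as b -> inf,
# so g(b) = t has a solution for any t in (0, 1).
def g(b):
    return b / (1.0 + b)

def solve_calibration(t, lo=0.0, hi=1.0, tol=1e-10):
    # Grow the bracket until g(hi) exceeds the target, then bisect.
    while g(hi) < t:
        hi *= 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if g(mid) < t else (lo, mid)
    return 0.5 * (lo + hi)

b_star = solve_calibration(0.5)
print(b_star, g(b_star))  # b_star = 1.0 for this stand-in g
```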
“…n/p(n) → δ ∈ (0, 1)), so that even in the noiseless case, the equations Y = Xθ would be underdetermined. In the p > n setting, it became popular to use ℓ1-penalized least squares (the Lasso [7, 34]). That series of papers considered the Lasso convex optimization problem in the case of X with iid N(0, 1/n) entries (just as here) and followed the same 3-step strategy we use here, namely: (1) introducing an AMP algorithm; (2) obtaining the asymptotic distribution of AMP by state evolution; and (3) showing that AMP agrees with the Lasso solution in the large-n limit.…”
Section: Underlying Tools (mentioning, confidence: 99%)
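For readers unfamiliar with step (1) of that strategy, the following is a minimal AMP sketch for the Lasso under an iid N(0, 1/n) design. The threshold schedule (a fixed multiple alpha of the estimated effective noise level) is a common simple choice, not necessarily the tuning analyzed in the cited papers.

```python
# Approximate Message Passing (AMP) sketch for the Lasso.
import numpy as np

def amp_lasso(y, A, alpha=2.0, iters=30):
    n, p = A.shape
    x = np.zeros(p)
    z = y.copy()
    eta = lambda u, t: np.sign(u) * np.maximum(np.abs(u) - t, 0.0)  # soft threshold
    for _ in range(iters):
        tau = np.linalg.norm(z) / np.sqrt(n)   # effective noise estimate
        x_new = eta(x + A.T @ z, alpha * tau)  # threshold the pseudo-data
        # Residual with the Onsager correction term (||x||_0 / n) * z.
        z = y - A @ x_new + (np.count_nonzero(x_new) / n) * z
        x = x_new
    return x

rng = np.random.default_rng(1)
n, p = 500, 1000
A = rng.normal(scale=1 / np.sqrt(n), size=(n, p))
x0 = np.zeros(p)
x0[rng.choice(p, 50, replace=False)] = rng.normal(size=50)
y = A @ x0 + 0.01 * rng.normal(size=n)
print("MSE:", np.mean((amp_lasso(y, A) - x0) ** 2))
```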
“…Both y and A are known, both x₀ and z₀ are unknown, and we seek an approximation to x₀. A very popular approach estimates x₀ via the solution x₁,λ of the following convex optimization problem: (P_{2,λ,1}) minimize ½‖y − Ax‖₂² + λ‖x‖₁. Thousands of articles use or study this approach, which has variously been called LASSO, Basis Pursuit, or, more prosaically, ℓ1-penalized least-squares [Tib96, CD95, CDS98]. There is a clear need to understand the extent to which (P_{2,λ,1}) accurately recovers x₀.…”
Section: Introduction (mentioning, confidence: 99%)
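The problem (P_{2,λ,1}) can be solved by many standard methods. As one illustration (not the solver used in the cited work), a short ISTA sketch, which alternates a gradient step on the quadratic term with soft thresholding:

```python
# ISTA sketch for (P_{2,lambda,1}): minimize (1/2)||y - A x||_2^2 + lam*||x||_1.
import numpy as np

def ista(y, A, lam, iters=500):
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        u = x - A.T @ (A @ x - y) / L          # gradient step on the quadratic term
        x = np.sign(u) * np.maximum(np.abs(u) - lam / L, 0.0)  # prox of lam*||.||_1
    return x

# Example call with placeholder data.
A = np.random.default_rng(3).normal(size=(50, 100))
y = A[:, :3].sum(axis=1)
x_hat = ista(y, A, lam=0.1)
print("nonzeros:", np.count_nonzero(np.abs(x_hat) > 1e-6))
```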
“…Hence we can obtain the minimizer as a function of ỹᵢ by adding θᵢ to the graph in figure (19) and flipping the axis. The result is plotted in the next figure.…”
Section: 2 (mentioning, confidence: 99%)
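Figure (19) itself is not reproduced in this snippet. As a hedged illustration of the kind of scalar map being described, assume the per-coordinate problem is ½(ỹ − θ)² + λ|θ|; its minimizer as a function of ỹ is the soft-threshold curve, checked below against brute force:

```python
# Scalar minimizer of (1/2)*(y_tilde - theta)**2 + lam*abs(theta),
# as a function of y_tilde (an assumed form, for illustration).
import numpy as np

def minimizer(y_tilde, lam):
    # Closed form: shrink y_tilde toward 0 by lam, clipping at 0.
    return np.sign(y_tilde) * np.maximum(np.abs(y_tilde) - lam, 0.0)

# Sanity check against a brute-force grid search.
grid = np.linspace(-5, 5, 100001)
y_t, lam = 1.7, 0.6
brute = grid[np.argmin(0.5 * (y_t - grid) ** 2 + lam * np.abs(grid))]
print(minimizer(y_t, lam), brute)  # both approximately 1.1
```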
“…The LASSO (Least Absolute Shrinkage and Selection Operator), presented in [18] and also known as Basis Pursuit DeNoising (BPDN) [19, 20], is arguably the most successful method for sparse regression. The LASSO estimator is defined in terms of an optimization problem…”
Section: 2 (mentioning, confidence: 99%)
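The optimization problem in question is the standard Lasso program, which in scikit-learn's convention reads min over w of (1/(2n))‖y − Xw‖₂² + α‖w‖₁. A minimal usage sketch with placeholder data (not from the cited work):

```python
# Solving the Lasso program with scikit-learn; data and alpha are placeholders.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 200))
beta = np.zeros(200)
beta[:5] = 1.0
y = X @ beta + 0.1 * rng.normal(size=100)

# sklearn minimizes (1/(2*n_samples)) * ||y - X w||_2^2 + alpha * ||w||_1.
model = Lasso(alpha=0.1).fit(X, y)
print("nonzero coefficients:", np.count_nonzero(model.coef_))
```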