1999
DOI: 10.1049/ip-vis:19990445
Forward sequential algorithms for best basis selection

Abstract: Recently, the problem of signal representation in terms of basis vectors from a large, "overcomplete", spanning dictionary has been the focus of much research. Achieving a succinct, or "sparse", representation is known as the problem of best basis representation. We consider methods which seek to solve this problem by sequentially building up a basis set for the signal. Three distinct algorithm types have appeared in the literature, which we term Basic Matching Pursuit (BMP), Order Recursive Matching Pursuit (O…

Cited by 120 publications (101 citation statements)
References 37 publications
“…Then keep the dictionary D fixed and find U according to some sparseness constraint rule. In practice, to find the coefficients 'w', vector selection algorithms such as the Matching Pursuit algorithm [11,12], Basic Matching Pursuit (BMP) [13], Orthogonal Matching Pursuit (OMP) [14,15], or Order Recursive Matching Pursuit (ORMP) [16] are used. This stage is computationally intensive, since it needs to update each dictionary, particularly for large training sets.…”
Section: Phase 1: Sparse Coding (mentioning, confidence: 99%)
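The vector-selection step this excerpt describes can be illustrated with a minimal Basic Matching Pursuit sketch in Python/NumPy. This is a hypothetical illustration, not code from the cited paper; it assumes a dictionary `D` with unit-norm columns, and the function name and interface are ours:

```python
import numpy as np

def matching_pursuit(y, D, n_iters=10):
    """Basic Matching Pursuit sketch (assumes unit-norm columns in D):
    repeatedly pick the atom most correlated with the residual and
    subtract its contribution."""
    residual = y.astype(float).copy()
    w = np.zeros(D.shape[1])
    for _ in range(n_iters):
        correlations = D.T @ residual          # inner products with all atoms
        k = np.argmax(np.abs(correlations))    # best-matching atom
        w[k] += correlations[k]                # accumulate its coefficient
        residual -= correlations[k] * D[:, k]  # remove its contribution
    return w, residual
```

BMP never revisits earlier coefficients, which is why the later excerpts contrast it with the orthogonal variants (OMP, ORMP) that re-project onto all selected atoms at each step.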
“…Coding coefficients x_k are computed via the orthogonal projection of y on D_k (step 8). This is often carried out recursively by different methods using the current correlation value C^k_{m_k}: QR factorization [7], Cholesky factorization [6], or block matrix inversion [14]. The obtained coefficients vector […] Different stopping criteria (step 11) can be used: a threshold on k for the number of iterations, a threshold on the relative root MSE (rRMSE) ‖ǫ^k‖₂ / ‖y‖₂, or a threshold on the decrease in the rRMSE.…”
Section: Review of Orthogonal Matching Pursuit (mentioning, confidence: 99%)
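The OMP loop this excerpt outlines (correlation-based selection, orthogonal projection, rRMSE stopping test) can be sketched as follows. This hypothetical Python/NumPy version replaces the recursive QR/Cholesky updates the excerpt mentions with a plain least-squares solve at each step; the names are ours:

```python
import numpy as np

def omp(y, D, rrmse_tol=1e-6, max_iters=None):
    """Orthogonal Matching Pursuit sketch: select the atom most correlated
    with the residual, then recompute ALL coefficients by orthogonally
    projecting y onto the selected atoms (via least squares here)."""
    max_iters = max_iters or D.shape[0]
    support = []
    residual = y.astype(float).copy()
    x = np.zeros(0)
    y_norm = np.linalg.norm(y)
    for _ in range(max_iters):
        k = int(np.argmax(np.abs(D.T @ residual)))             # selection step
        support.append(k)
        x, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)  # orthogonal projection
        residual = y - D[:, support] @ x
        if np.linalg.norm(residual) / y_norm < rrmse_tol:      # rRMSE stopping rule
            break
    w = np.zeros(D.shape[1])
    w[support] = x
    return w
```

In practice the incremental QR or Cholesky factorizations cited in the excerpt avoid re-solving the full least-squares problem at every iteration.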
“…Here, we use modified matching pursuit (MMP) [12] which selects each vector (in series) to minimize the residual representation error. The simpler matching pursuit (MP) algorithm is more computationally efficient, but provides less accurate reconstruction.…”
Section: Modified Matching Pursuit (MMP): Greedy Vector Selection (mentioning, confidence: 99%)
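The residual-minimizing selection rule the excerpt attributes to MMP can be sketched directly. This is a hypothetical, deliberately brute-force Python/NumPy illustration: the cited work uses efficient recursive updates rather than one least-squares solve per candidate atom:

```python
import numpy as np

def select_min_residual_atom(y, D, support):
    """One MMP/ORMP-style selection step (a sketch, not the paper's
    recursion): try each unused atom, orthogonally project y onto the
    enlarged set, and keep the atom giving the smallest residual."""
    best_k, best_err = None, np.inf
    for k in range(D.shape[1]):
        if k in support:
            continue
        A = D[:, support + [k]]                      # candidate atom set
        x, *_ = np.linalg.lstsq(A, y, rcond=None)    # orthogonal projection
        err = np.linalg.norm(y - A @ x)              # residual representation error
        if err < best_err:
            best_k, best_err = k, err
    return best_k, best_err
```

Choosing the atom that minimizes the post-projection residual is what makes this rule costlier per step than BMP's simple correlation maximization, in line with the accuracy/cost trade-off the excerpt describes.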
“…The MP algorithm is more computationally efficient but provides less accurate reconstruction. More details and comparisons can be found in [12,29].…”
Section: Modified Matching Pursuit (MMP): Greedy Vector Selection (mentioning, confidence: 99%)