2012
DOI: 10.1109/tsp.2012.2187285

Greedy Sparse RLS

Cited by 38 publications (19 citation statements). References 22 publications.
“…Image restoration, as the solution of an ill-posed inverse problem, typically relies on content models of sparse, locally supported signal segments or components in a space- and scale-varying context. Instead of giving equal weight to all of the sensed data when computing a user-defined solution, only locally defined data, a predefined geometric component, or the like may be weighted more heavily to estimate an adaptively and locally sparse solution [30]. Moreover, adaptive regularization can be applied with a family of priors that accounts for the geometric properties of the image [31] or corresponds to an ROI preselected by the sensing procedure [32].…”
Section: Framework of the Proposed Methods
confidence: 99%
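To make the locally adaptive sparse regularization described in the excerpt above concrete, the sketch below solves a weighted l1-regularized least-squares restoration problem with ISTA, where a per-coefficient weight plays the role of the spatially varying prior. The function name, parameter values, and choice of solver are illustrative assumptions and are not taken from [30], [31], or [32].

```python
import numpy as np

def weighted_ista(A, b, weights, step=None, n_iter=200):
    """Sketch of spatially weighted l1-regularized restoration via ISTA.

    Solves min_x 0.5*||A x - b||^2 + sum_i weights[i]*|x[i]|, where a larger
    weight enforces sparsity more strongly at that location, mimicking a
    locally adaptive prior. Illustrative only; not the method of [30]-[32].
    """
    if step is None:
        step = 1.0 / (np.linalg.norm(A, 2) ** 2)   # 1/L with L = largest singular value squared
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                   # gradient of the quadratic data term
        z = x - step * grad                        # gradient descent step
        thr = step * np.asarray(weights)
        x = np.sign(z) * np.maximum(np.abs(z) - thr, 0.0)  # weighted soft-thresholding
    return x
```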
“…The so-called SPARLS algorithm is introduced in [14] using an expectation-maximization (EM) approach. The work of [15] proposes an adaptive version of the greedy least-squares method that uses partial orthogonalization. The work of [16] modifies the RLS algorithm by incorporating a general convex function of the system parameters, resulting in the l0-RLS and l1-RLS algorithms.…”
Section: Introduction
confidence: 99%
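The excerpt above describes the sparsity-aware RLS variants only at a high level. As a minimal sketch of the general idea, assuming a standard RLS recursion followed by a zero-attracting shrinkage step derived from the l1 subgradient (the exact recursions of [14]-[16] differ), one could write:

```python
import numpy as np

def l1_rls_sketch(x_stream, d_stream, order, lam=0.99, delta=1e2, gamma=5e-4):
    """Illustrative l1-penalized RLS-style filter (not the exact algorithm of [16]).

    A conventional RLS gain/covariance recursion is followed by a
    zero-attracting shrinkage step (l1 subgradient) that pushes small
    taps toward zero, which is the sparsity mechanism referred to above.
    """
    w = np.zeros(order)                 # filter taps
    P = delta * np.eye(order)           # inverse correlation matrix estimate
    for x, d in zip(x_stream, d_stream):
        x = np.asarray(x, dtype=float)
        k = P @ x / (lam + x @ P @ x)   # RLS gain vector
        e = d - w @ x                   # a priori estimation error
        w = w + k * e                   # conventional RLS tap update
        P = (P - np.outer(k, x @ P)) / lam
        w = w - gamma * np.sign(w)      # zero-attracting (l1 subgradient) step
    return w
```

On a synthetic sparse channel, the shrinkage step drives inactive taps toward zero while the RLS recursion tracks the active ones; gamma trades off sparsity against bias and is a tuning assumption here.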
“…These algorithms are particularly useful for sparse system identification and sparse channel parameter estimation [9,10]. Well-known examples include the zero-attracting least mean square (ZA-LMS) [3], the reweighted ZA-LMS (RZA-LMS) [3], and sparse recursive least squares [8]. Many variants of sparse adaptive filtering algorithms have also been developed in [11-13,19,26,27].…”
Section: Introduction
confidence: 99%
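The ZA-LMS and RZA-LMS algorithms mentioned in this excerpt follow the well-known form of an LMS gradient step plus a zero-attraction term. A minimal sketch, with illustrative step sizes and regularization constants that are not taken from [3], is:

```python
import numpy as np

def za_lms(x_stream, d_stream, order, mu=0.01, rho=5e-4, reweighted=False, eps=10.0):
    """Zero-attracting LMS; set reweighted=True for the RZA-LMS variant.

    Both use w <- w + mu*e*x minus a zero-attraction term, which is the
    subgradient of an l1 (ZA) or reweighted-l1 (RZA) penalty on the taps.
    Parameter values are illustrative, not those of [3].
    """
    w = np.zeros(order)
    for x, d in zip(x_stream, d_stream):
        x = np.asarray(x, dtype=float)
        e = d - w @ x                       # instantaneous estimation error
        w = w + mu * e * x                  # standard LMS gradient step
        if reweighted:
            w -= rho * np.sign(w) / (1.0 + eps * np.abs(w))  # RZA-LMS attractor
        else:
            w -= rho * np.sign(w)           # ZA-LMS attractor
    return w
```

The reweighted attractor shrinks small taps more aggressively than large ones, which is why RZA-LMS is usually reported to outperform plain ZA-LMS on very sparse systems.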