2022
DOI: 10.48550/arxiv.2201.08620
Preprint
Extended Randomized Kaczmarz Method for Sparse Least Squares and Impulsive Noise Problems

Abstract: The Extended Randomized Kaczmarz method is a well-known iterative scheme which can find the Moore-Penrose inverse solution of a possibly inconsistent linear system and requires only one additional column of the system matrix in each iteration in comparison with the standard randomized Kaczmarz method. Also, the Sparse Randomized Kaczmarz method has been shown to converge linearly to a sparse solution of a consistent linear system. Here, we combine both ideas and propose an Extended Sparse Randomized Kaczmarz m…

Cited by 2 publications (5 citation statements)
References 25 publications (53 reference statements)
“…Fixing the relaxation parameters w_{k,i} = 1 for all iterations k and indices i leads to the standard RSK method. In [23], an extension of the RSK with linear expected convergence has been proposed for solving sparse least squares and impulsive noise problems while requiring only one additional column of the system matrix in each iteration. For consistent systems, the iterates of the standard RSK method converge in expectation to the solution of the regularized Basis Pursuit problem…”
Section: Related Work
confidence: 99%
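The Sparse Randomized Kaczmarz (RSK) iteration referred to above can be sketched as follows. This is a minimal illustration of the usual form of the method (a Kaczmarz step on an auxiliary variable followed by componentwise soft-thresholding, with the relaxation parameter fixed at 1 as in the quoted passage); the function and variable names are hypothetical, not taken from the cited papers.

```python
import numpy as np

def soft_threshold(z, lam):
    # Componentwise shrinkage: S_lam(z) = sign(z) * max(|z| - lam, 0)
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def sparse_randomized_kaczmarz(A, b, lam=0.1, iters=5000, seed=0):
    """Sketch of the sparse randomized Kaczmarz iteration:
    sample a row i with probability proportional to ||a_i||^2,
    take a Kaczmarz step on the auxiliary variable z, then
    soft-threshold z to obtain the (sparse) iterate x."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.sum(A**2, axis=1)
    probs = row_norms_sq / row_norms_sq.sum()  # standard row-sampling probabilities
    z = np.zeros(n)
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        residual = A[i] @ x - b[i]
        z -= (residual / row_norms_sq[i]) * A[i]  # relaxation w_{k,i} = 1
        x = soft_threshold(z, lam)
    return x
```

For a consistent system, iterates of this scheme converge in expectation to the solution of the regularized Basis Pursuit problem, i.e. the minimizer of lam*||x||_1 + (1/2)||x||_2^2 subject to Ax = b, consistent with the statement quoted above.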
“…Like the algorithms in the works [24,13,5,23,20], our approach combines two randomized iterative algorithms. Specifically, for the consistent case (b ∈ range(AB)), we propose using the RK algorithm to solve the subsystem Ay = b followed by the RRK algorithm to solve the minimization problem (3) as shown in Algorithm 4, and call it the RK-RRK algorithm.…”
Section: The Proposed Algorithms
confidence: 99%
“…(1/2)‖x‖₂² + λ‖x‖₁, we call the resulting algorithm the RK-RSK algorithm. In the following remark, we give the relationship between the ExSRK algorithm [20] and the RK-RSK algorithm.…”
Section: The RK-RRK Algorithm for the Case b ∈ range(AB)
confidence: 99%