2022
DOI: 10.1007/s11075-022-01473-x

Faster randomized block sparse Kaczmarz by averaging

Abstract: The standard randomized sparse Kaczmarz (RSK) method is an algorithm for computing sparse solutions of linear systems of equations. It uses sequential updates and thus does not take advantage of parallel computation. In this work, we introduce a parallel (mini-batch) version of RSK based on averaging several Kaczmarz steps. Naturally, this method allows for parallelization, and we show that it can also leverage large overrelaxation. We prove linear expected convergence and show that, given that parallel computa…
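To make the averaging idea from the abstract concrete, here is a minimal NumPy sketch of a mini-batch sparse Kaczmarz iteration. This is an illustrative reconstruction, not the paper's exact algorithm: the function names, the sampling by squared row norms, and the overrelaxation parameter `eta` are assumptions. Each iteration averages several randomized Kaczmarz directions on a dual variable `z` and applies soft shrinkage to obtain a sparse primal iterate `x`.

```python
import numpy as np

def soft_shrink(z, lam):
    # Soft-thresholding operator: S_lam(z) = sign(z) * max(|z| - lam, 0)
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def averaged_sparse_kaczmarz(A, b, lam=0.1, batch=10, eta=1.0, iters=500, seed=0):
    """Illustrative sketch (assumed details, not the paper's exact method):
    each iteration averages `batch` randomized sparse Kaczmarz steps on the
    dual variable; `eta` plays the role of an overrelaxation parameter."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    z = np.zeros(n)           # dual variable
    x = soft_shrink(z, lam)   # sparse primal iterate
    row_norms = np.sum(A**2, axis=1)
    probs = row_norms / row_norms.sum()   # sample rows prop. to squared norm
    for _ in range(iters):
        idx = rng.choice(m, size=batch, p=probs)
        # Average the individual Kaczmarz directions over the mini-batch;
        # the batched steps are independent and could run in parallel.
        step = np.mean([(A[i] @ x - b[i]) / row_norms[i] * A[i] for i in idx],
                       axis=0)
        z -= eta * step
        x = soft_shrink(z, lam)
    return x
```

In this sketch the averaging makes each update a mean of independent single-row steps, which is what permits parallel evaluation of the mini-batch and, per the abstract, larger overrelaxation than a single sequential step would tolerate.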

Cited by 15 publications (7 citation statements)
References 42 publications
“…Inspired by the works of Petra [34] and Tondji et al. [35], we show that the sparse Kaczmarz algorithm is a particular instance of the coordinate descent method applied to an unconstrained dual problem corresponding to a regularized MCP problem (8). Since the conjugate of the function $F_\delta(x)$ is…”
Section: Convergence
confidence: 87%
“…Inspired by the works of Petra [34] and Tondji et al. [35], we show that the sparse Kaczmarz algorithm is a particular instance of the coordinate descent method applied to an unconstrained dual problem corresponding to a regularized MCP problem (8). Since the conjugate of the function $F_\delta(x)$ is
$$
\begin{aligned}
F_\delta^{\ast}(z) &= \max_{x\in\mathbb{R}^n}\left(\langle z, x\rangle - F_\delta(x)\right) \\
&= \sum_{i=1}^{n} \max_{x_i\in\mathbb{R}}\left(z_i x_i - \lambda\, l_\delta(x_i) - \tfrac{1}{2}x_i^2\right) \\
&= \sum_{i=1}^{n} f_\delta^{\ast}(z_i),
\end{aligned}
$$
where $f_\delta^{\ast}(z_i)=$…”
Section: The Sparse Kaczmarz Methods
confidence: 94%
“…We would like to thank Dr. Lionel N. Tondji et al. for sending us their conference paper (Tondji et al., 2021) and kindly reminding us that they also independently proposed the weighted randomized sparse Kaczmarz method. This work was supported by the National Natural Science Foundation of China (No.11971480, No.61977065), the Natural Science Fund of Hunan for Excellent Youth (No.2020JJ3038), and the Fund for NUDT Young Innovator Awards (No.20190105).…”
Section: Discussion
confidence: 99%
“…There exist extensive studies on the mirror descent method and its stochastic variants in optimization, see [6,9,30,31,40,42] for instance. The existing works either depend crucially on the finite-dimensionality of the underlying spaces or establish only error estimates in terms of objective function values, and therefore they are not applicable to our Algorithm 1 for ill-posed problems.…”
Section: Description of the Methods
confidence: 99%