2021
DOI: 10.1016/j.sciaf.2021.e00880

A three-term Polak-Ribière-Polyak derivative-free method and its application to image restoration

Cited by 11 publications (11 citation statements)
References 39 publications
“…The SDYCG algorithm will be used to address this problem. Similar methods have been employed to deal with this problem; see [34][35][36]. All codes in this work are written in Matlab R2014a and run on an HP core i5, 8th Gen personal computer.…”
Section: Application in Signal Recovery (mentioning)
confidence: 99%
“…where b ∈ ℝᵐ is an observed signal, v₀ ∈ ℝⁿ is the unknown signal, ω is the noise and A ∈ ℝᵐˣⁿ (m ≪ n) is a linear operator. In order to address problem (33), one of the tools usually employed is the ℓ₁-regularization. The restoration is obtained by solving the famous machine learning problem: the LASSO problem, that is,…”
Section: Application to Signal Recovery (mentioning)
confidence: 99%
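
The excerpt above refers to the ℓ₁-regularized least-squares (LASSO) model, min over v ∈ ℝⁿ of ½‖Av − b‖₂² + μ‖v‖₁ with μ > 0. As context only, the sketch below solves this model with plain iterative shrinkage-thresholding (ISTA); it is not the derivative-free method of the cited paper, and the function names, iteration count, and step-size rule are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (component-wise soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista(A, b, mu, n_iter=500):
    # Minimize 0.5 * ||A v - b||_2^2 + mu * ||v||_1 by iterative
    # shrinkage-thresholding with the fixed step 1 / ||A||_2^2.
    v = np.zeros(A.shape[1])
    t = 1.0 / np.linalg.norm(A, 2) ** 2       # 2-norm = largest singular value
    for _ in range(n_iter):
        grad = A.T @ (A @ v - b)              # gradient of the smooth term
        v = soft_threshold(v - t * grad, t * mu)
    return v
```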
“…The m × n matrix A is obtained by first filling it with independent samples of a standard Gaussian distribution and then ortho-normalizing the rows. The observation b is generated by (33), where ω is the Gaussian noise distributed as N(0, δ²I) and δ ≥ 0 is the standard deviation (SD). We set μ = 0.01‖Aᵀb‖∞.…”
Section: Application to Signal Recovery (mentioning)
confidence: 99%
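
A minimal sketch of the test-instance generation quoted above, assuming illustrative dimensions (m = 240, n = 1024, a 40-sparse ±1 spike signal) and a reduced QR of Aᵀ for the row ortho-normalization; none of these defaults, nor the helper name make_instance, come from the cited work.

```python
import numpy as np

def make_instance(m=240, n=1024, k=40, delta=0.01, seed=0):
    # Build a compressed-sensing test problem in the spirit of the setup
    # quoted above; all defaults here are illustrative assumptions.
    rng = np.random.default_rng(seed)
    # Gaussian matrix whose rows are then ortho-normalized (reduced QR of A^T).
    G = rng.standard_normal((m, n))
    Q, _ = np.linalg.qr(G.T)                  # Q: n x m, orthonormal columns
    A = Q.T                                   # rows of A are orthonormal
    # k-sparse ground-truth signal with random +/-1 spikes.
    v0 = np.zeros(n)
    support = rng.choice(n, size=k, replace=False)
    v0[support] = rng.choice([-1.0, 1.0], size=k)
    # Observation b = A v0 + omega, with Gaussian noise of standard deviation delta.
    b = A @ v0 + delta * rng.standard_normal(m)
    # Regularization parameter as in the excerpt: mu = 0.01 * ||A^T b||_inf.
    mu = 0.01 * np.linalg.norm(A.T @ b, np.inf)
    return A, b, v0, mu

# The returned (A, b, mu) can be fed to any l1 solver, e.g. the ISTA sketch above.
A, b, v0, mu = make_instance()
```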
“…Exploiting the simplicity and low storage requirement of the conjugate gradient method [1,2], in recent times, several authors have extended many conjugate gradient algorithms designed to solve unconstrained optimization problems to solve large-scale nonlinear equations (1.6) (see [3–11, 17–33, 36]). For instance, using the projection scheme of Solodov and Svaiter [35], Xiao and Zhu [38] extended the Hager and Zhang conjugate descent (CG DESCENT) method to solve (1.6).…”
Section: Introduction (mentioning)
confidence: 99%
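
For context on the projection scheme of Solodov and Svaiter mentioned in this excerpt, the sketch below shows one generic iteration of a derivative-free hyperplane-projection step: a backtracking line search along a given direction d, followed by projection onto the hyperplane through the trial point. The line-search rule is one common variant and the constants (sigma, beta) are illustrative assumptions, not taken from any paper cited here.

```python
import numpy as np

def projection_step(F, x, d, sigma=1e-4, beta=0.5, max_backtracks=30):
    # One iteration of a derivative-free projection method for F(x) = 0.
    # Backtracking line search: find alpha with
    #   -F(z)^T d >= sigma * alpha * ||F(z)|| * ||d||^2,  where z = x + alpha * d.
    alpha = 1.0
    for _ in range(max_backtracks):
        z = x + alpha * d
        Fz = F(z)
        if -Fz @ d >= sigma * alpha * np.linalg.norm(Fz) * np.linalg.norm(d) ** 2:
            break
        alpha *= beta
    Fz_norm2 = Fz @ Fz
    if Fz_norm2 == 0.0:
        return z                              # z already solves F(z) = 0
    # Project x onto the hyperplane {y : F(z)^T (y - z) = 0}, which separates
    # x from the solution set when F is monotone.
    return x - (Fz @ (x - z)) / Fz_norm2 * Fz

# Tiny example with the monotone map F(y) = 2y + sin(y) (unique root at 0)
# and the residual-based direction d = -F(x).
x = np.array([2.0, -1.0])
for _ in range(50):
    x = projection_step(lambda y: 2.0 * y + np.sin(y), x, -(2.0 * x + np.sin(x)))
print(x)   # the iterates approach the unique root at the origin
```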