2018
DOI: 10.48550/arxiv.1803.03797
Preprint
Efficient FPGA Implementation of Conjugate Gradient Methods for Laplacian System using HLS

Cited by 2 publications (2 citation statements). References 0 publications.
“…The problem (11) can be solved alternately for Z and S. The resulting problem in Z is an unconstrained least-squares problem, which can be solved using the linear conjugate gradient algorithm (for various preconditioned CG approaches, see [4][5][6][7][8][9][12][13][14][15][16][17][18][19][20][21][22][23][24]), and the problem over S is a non-negative least-squares problem, which can be solved using [37]. The cost g(U) and the gradient ∇g(U) can be easily computed following Lemma 3; hence, we employ Riemannian conjugate gradient to solve the outer optimization problem over U.…”
Section: Nonnegative Tensor Completion
confidence: 99%
“…This is a sparse linear system in Z, which can be solved using the linear conjugate gradient method. For various preconditioned CG approaches, see [53,33,34,35,36,37,38,42,48,52,41,43,44,45,46,47,49,50,51]. Problem (9) has only one term involving S. Hence, the optimization problem over S reduces to…”
Section: Convex Optimization Problem
confidence: 99%
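Both citation statements invoke the linear conjugate gradient method for a symmetric positive-definite system. As a minimal sketch of that technique (illustrative only; the function name, tolerances, and the small Laplacian-like example system are assumptions, not taken from the cited papers):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A z = b for a symmetric positive-definite matrix A
    using the (unpreconditioned) linear conjugate gradient method."""
    z = np.zeros_like(b)
    r = b - A @ z            # initial residual
    p = r.copy()             # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # exact line-search step length
        z += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:       # converged on residual norm
            break
        p = r + (rs_new / rs_old) * p   # conjugate direction update
        rs_old = rs_new
    return z

# Example on a small SPD, Laplacian-like tridiagonal system (assumed data)
A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
b = np.array([1.0, 2.0, 3.0])
z = conjugate_gradient(A, b)
```

In exact arithmetic, CG converges in at most n iterations for an n×n system; the preconditioned variants surveyed in the references cited above accelerate convergence when A is ill-conditioned.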