2012
DOI: 10.1103/physrevb.85.045103
Optimizing large parameter sets in variational quantum Monte Carlo

Abstract: We present a technique for optimizing hundreds of thousands of variational parameters in variational quantum Monte Carlo. By introducing iterative Krylov subspace solvers and by multiplying by the Hamiltonian and overlap matrices as they are sampled, we remove the need to construct and store these matrices and thus bypass the most expensive steps of the stochastic reconfiguration and linear method optimization techniques. We demonstrate the effectiveness of this approach by using stochastic reconfiguration to …

Cited by 127 publications (144 citation statements)
References 30 publications
“…To reduce the cost, one can solve the SR equation iteratively by the conjugate gradient (CG) method, so that the explicit construction of S_kl is not required [33]. We only need to realize the matrix-vector multiplication with S_kl. As a result, the computational cost is reduced from O(n_s n_p^2 + n_p^3) to O(n_s n_p n_iter), and the memory cost is reduced from O(n_s n_p + n_p^2) to O(n_s n_p).…”
Section: Summary and Discussion
confidence: 99%
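The matrix-free CG approach described in this citing passage can be sketched in a few lines. The following is a minimal NumPy illustration, not the paper's actual implementation: the names `sr_matvec`, `conjugate_gradient`, and the sample matrix `O` (one row of sampled logarithmic parameter derivatives per configuration) are hypothetical. The point is that the overlap matrix S = ⟨OᵀO⟩ − ⟨O⟩ᵀ⟨O⟩ is never formed; each CG iteration needs only matrix-vector products, at O(n_s n_p) cost per iteration.

```python
import numpy as np

def sr_matvec(O, v):
    """Apply the SR overlap matrix S = <O^T O>/n_s - <O>^T <O> to a vector v
    using only the sampled derivative data, without ever forming S."""
    n_s = O.shape[0]
    O_bar = O.mean(axis=0)
    # O @ v is an n_s-vector; O.T @ (...) is an n_p-vector: cost O(n_s * n_p)
    return O.T @ (O @ v) / n_s - O_bar * (O_bar @ v)

def conjugate_gradient(matvec, b, tol=1e-10, max_iter=500):
    """Solve A x = b for symmetric positive-definite A, given only the
    matrix-vector product `matvec`."""
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```

Memory usage is dominated by the n_s × n_p sample matrix itself, matching the O(n_s n_p) scaling quoted above; in a real VMC code a regularization shift is typically added to the diagonal of S before solving.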
“…To optimize a large number of parameters, one can solve the SR equation iteratively by the conjugate gradient (CG) method [33]. The detailed implementation is described in Appendix B.…”
Section: Variational Monte Carlo
confidence: 99%
“…1,9 For larger systems, several other methods have been extensively used to study the 1D and 2D Hubbard models as well as their strong coupling versions. 10 Among such approximations, we have the quantum Monte Carlo, 11,12 the variational Monte Carlo, 13 the density matrix renormalization group [14][15][16][17] as well as approximations based on matrix product and tensor network states. 18 Both the dynamical mean field theory and its cluster extensions [19][20][21][22][23][24][25][26] have made important contributions to our present knowledge of the Hubbard model.…”
confidence: 99%
“…In principle, this can be easily cured by particle number projection, which converts the Bogoliubov vacuum into an antisymmetrized geminal power. [57][58][59][60] Practically, the performance of CT-MP2 at long distances can be improved by adding a level shift to the diagonal part of Ĥ_0, such that no negative eigenvalues appear in Eq. (22).…”
Section: Results
confidence: 99%