1997
DOI: 10.1103/PhysRevLett.79.1173

Stochastic Gradient Approximation: An Efficient Method to Optimize Many-Body Wave Functions

Abstract: A novel, efficient optimization method for physical problems is presented. The method utilizes the noise inherent in stochastic functions. As an application, an algorithm for the variational optimization of quantum many-body wave functions is derived. The numerical results indicate superior performance when compared to traditional techniques.
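
The abstract describes the method only in outline. As a rough illustration of the core idea (a noisy Monte Carlo gradient estimate driving a damped parameter update), here is a minimal toy sketch in Python; the quadratic "energy surface" and the noise model are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_energy_gradient(theta, n_samples=200):
    """Toy stand-in for a Monte Carlo estimate of dE/dtheta.

    The 'true' energy surface is taken to be E(theta) = (theta - 1)**2,
    and sampling noise is modelled as Gaussian with variance ~ 1/n_samples.
    In a real variational calculation this estimate would come from
    sampling |psi_theta|^2.
    """
    true_grad = 2.0 * (theta - 1.0)
    return true_grad + rng.normal(scale=1.0 / np.sqrt(n_samples))

theta = 0.0
for t in range(1, 1001):
    mu_t = 0.5 / t                     # decaying step size
    theta -= mu_t * noisy_energy_gradient(theta)

print(theta)  # approaches the minimum at theta = 1 despite the noise
```
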

Cited by 75 publications (87 citation statements: 1 supporting, 86 mentioning, 0 contrasting)
References 13 publications
Citing publications span 1999–2024

“…Firstly, unlike local optimization methods, the GO method updates all tensors simultaneously, and the noise in the gradient may help avoid local minima, which has some similarity to the simulated annealing technique [25]. Secondly, the MC sweeps can be easily and massively parallelized. Thirdly, it is easy to handle systems with interactions beyond nearest neighbors, such as the J1-J2 square Heisenberg model.…”
Section: Methods (mentioning)
confidence: 99%
“…For example, by choosing the step size to decrease as μ_t ∝ 1/t^α, with N fixed and 0 < α ≤ 1, we are guaranteed convergence to some local minimum [18]. Equivalently, the noise could be reduced at each step by increasing N ∝ t^β, or some combination of both.…”
Section: B. Optimization (mentioning)
confidence: 99%
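
The two noise-control strategies mentioned in the excerpt can be written down directly; the helper names below are hypothetical:

```python
def step_size(t, mu0=0.1, alpha=0.8):
    """Decaying step size mu_t = mu0 / t**alpha with 0 < alpha <= 1,
    the Robbins-Monro-type schedule that guarantees convergence to a
    local minimum for a fixed sample size N."""
    return mu0 / t**alpha

def sample_size(t, n0=100, beta=0.5):
    """Growing sample size N_t = n0 * t**beta: instead of shrinking the
    step, reduce the variance of each gradient estimate (or combine
    both strategies)."""
    return int(n0 * t**beta)
```
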
“…Reintroducing the plain (not importance-sampled) projection operator F = 1 − τ(H − E_0), we can rewrite the reweighting factor for a simple Gutzwiller wavefunction into Eq. (21), with the ratio F̄(R′, R)/F(R′, R) = 1 for R′ = R, and many such polynomials; therefore the order of the polynomial representing Eq. (20) increases with the number n of Monte Carlo steps. It is therefore not practical to directly estimate the ever-increasing number of coefficients for the reweighting factor.…”
Section: B. Correlated Sampling (mentioning)
confidence: 99%
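
For context, the reweighting factor discussed here plays the role of the importance weight in correlated sampling: configurations sampled from one distribution are reused to estimate expectation values under another. A generic reweighted estimator (not the cited paper's specific polynomial construction) looks like:

```python
import numpy as np

def reweighted_mean(values, log_p_new, log_p_old):
    """Estimate <A> under p_new from samples drawn from p_old:

        <A>_new ~= sum_i w_i * A_i / sum_i w_i,
        with w_i = p_new(R_i) / p_old(R_i).

    Log-weights are shifted by their maximum for numerical stability.
    """
    log_w = np.asarray(log_p_new) - np.asarray(log_p_old)
    w = np.exp(log_w - log_w.max())
    values = np.asarray(values)
    return np.sum(w * values) / np.sum(w)
```

The difficulty the excerpt describes is that, for the projected wavefunction, these weights become polynomials whose order grows with the number of Monte Carlo steps, so tracking their coefficients directly becomes impractical.
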
“…General methods for achieving this are correlated sampling [15,20] or the recently developed stochastic gradient approximation [21]. Exploiting the particular form of Gutzwiller wavefunctions, we find a different approach, which is equivalent to correlated sampling: we observe that expectation values can be rewritten as the quotient of two polynomials (i.e., a rational function) in the Gutzwiller parameters.…”
Section: Introduction (mentioning)
confidence: 99%
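
The observation can be made concrete with a small worked equation. For a Gutzwiller wavefunction ψ_g(R) = g^{D(R)} ψ_0(R), where D(R) counts doubly occupied sites, a diagonal observable satisfies (a sketch consistent with the excerpt, not a quotation from the paper):

```latex
\langle A \rangle(g)
  = \frac{\sum_R |\psi_g(R)|^2 \, A(R)}{\sum_R |\psi_g(R)|^2}
  = \frac{\sum_R g^{2D(R)} \, |\psi_0(R)|^2 \, A(R)}
         {\sum_R g^{2D(R)} \, |\psi_0(R)|^2}
```

Both numerator and denominator are polynomials in g², so a single Monte Carlo run can accumulate the coefficient of each power of g² and then evaluate ⟨A⟩(g) at any g, which is what makes the approach equivalent to correlated sampling.
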