2017
DOI: 10.1021/acs.jctc.7b00119
A Blocked Linear Method for Optimizing Large Parameter Sets in Variational Monte Carlo

Abstract: We present a modification to variational Monte Carlo's linear method optimization scheme that addresses a critical memory bottleneck while maintaining compatibility with both the traditional ground state variational principle and our recently-introduced variational principle for excited states. For wave function ansatzes with tens of thousands of variables, our modification reduces the required memory per parallel process from tens of gigabytes to hundreds of megabytes, making the methodology a much better fit…

Cited by 28 publications (56 citation statements)

References 41 publications
“…One important limitation of the LM comes when the number of variational parameters rises to 10,000 or more, at which point the contributions to H and S made by each Markov chain become cumbersome to store in memory, especially when running one Markov chain per core on a large parallel system in which per-core memory is limited. QMCPACK currently addresses this memory bottleneck using the blocked LM [35], a recent algorithm that separates the variable space into blocks, estimates the most important variable-change directions within each block, and then uses these directions to construct a reduced and vastly more memory efficient LM eigenvalue problem to generate an update direction in the overall variable space. Like excited state targeting, this is a new feature that can be expected to evolve in time, and has been made openly available to the community in the spirit of rapid dissemination.…”
Section: Handling Large Parameter Sets
confidence: 99%
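The block / reduce / solve structure of the blocked LM described in the quoted passage can be sketched as follows. This is a minimal toy illustration, not the QMCPACK implementation: the function name `blocked_lm_update` and the small symmetric stand-ins for the linear-method H and S matrices are assumptions made so that SciPy's symmetric generalized eigensolver applies (in real VMC these matrices are accumulated from Monte Carlo samples and H is not symmetric).

```python
import numpy as np
from scipy.linalg import eigh

def blocked_lm_update(H, S, n_blocks=4, dirs_per_block=2):
    """Schematic blocked-LM update (toy illustration only).

    H, S: symmetric stand-ins for the linear-method Hamiltonian and
    overlap matrices in the (1 + n_params)-dimensional basis of the
    current wave function (index 0) and its parameter derivatives.
    """
    n = H.shape[0] - 1                       # number of variational parameters
    blocks = np.array_split(np.arange(1, n + 1), n_blocks)

    # 1) Within each block, solve that block's small generalized
    #    eigenproblem and keep a few low-lying eigenvectors as the
    #    "important" variable-change directions for the block.
    directions = []
    for idx in blocks:
        sub = np.concatenate(([0], idx))     # include the wave function itself
        _, v = eigh(H[np.ix_(sub, sub)], S[np.ix_(sub, sub)])
        for k in range(dirs_per_block):
            d = np.zeros(n + 1)
            d[sub] = v[:, k]
            directions.append(d)
    B = np.array(directions).T               # (n+1) x (n_blocks * dirs_per_block)

    # 2) Project H and S into the span of the retained directions,
    #    giving a reduced eigenvalue problem that is vastly smaller
    #    (and cheaper to store) than the full one.
    Hr, Sr = B.T @ H @ B, B.T @ S @ B
    _, v = eigh(Hr, Sr)

    # 3) Expand the lowest reduced eigenvector back to the full space;
    #    its parameter components give the overall update direction.
    full = B @ v[:, 0]
    return full[1:] / full[0]                # normalize by the psi component
```

The memory saving in the real algorithm comes from never forming the full H and S: each Markov chain only needs its blocks' submatrices plus the small reduced matrices.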
“…This enables the expected strong coupling between them to be handled more accurately by the LM-style diagonalization within that block. While the BLM has been successfully applied up to about 25,000 parameters and found to closely reproduce the results of the standard LM, [6] it remains a relatively new method, and the present study will provide additional data on its efficacy.…”
Section: The Linear Method
confidence: 73%
“…A more extensive description of the BLM and its precise memory usage can be found in its original paper. [6] We divide parameters evenly among blocks, but one could implement the use of tailored blocks of varying sizes. It is advisable to choose the block size to be large enough to keep important parameters of the same type, such as all of those for a Jastrow factor, within the same block.…”
Section: The Linear Method
confidence: 99%
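The two block-division strategies mentioned in the quoted passage (even splitting versus tailored blocks that keep parameters of one type together) can be sketched as follows. The parameter labels here are hypothetical, chosen only to illustrate the grouping; a real VMC ansatz might mix Jastrow, CI, and orbital parameters.

```python
import numpy as np

# Hypothetical parameter types for illustration only.
param_types = ["jastrow"] * 6 + ["ci"] * 10

# Even division, as used in the text: np.array_split keeps block sizes
# within one of each other even when the count does not divide evenly.
even_blocks = np.array_split(np.arange(len(param_types)), 3)

# A tailored division that keeps all parameters of one type (e.g. a
# Jastrow factor) inside a single block, as the text recommends.
tailored_blocks = [
    [i for i, t in enumerate(param_types) if t == "jastrow"],
    [i for i, t in enumerate(param_types) if t == "ci"],
]
```

The point of the tailored split is that strongly coupled parameters of the same type are then diagonalized together within one block rather than across a block boundary.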
“…Although we have chosen to test this strategy using Ω as the variational principle and the VMC linear method [18, 30-34] as the wave function update method, we expect it to be effective for other variational principles and update methods as well.…”
Section: Transformations Between Variational Principles
confidence: 99%