1997
DOI: 10.1103/physreve.55.1162
Pivot method for global optimization

Abstract: A pivot algorithm for the location of a global minimum of a multiple-minimum problem is presented. The pivot method uses a series of randomly placed probes in phase space, moving the worst probes to be near better probes iteratively until the system converges. The approach chooses nearest-neighbor pivot probes to search the entire phase space by using a nonlocal distribution for the placement of the relocated probes. To test the algorithm, a standard suite of functions is given, as well as the energies and geo…
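The probe-relocation scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes uniform initial placement, a Gaussian relocation distribution (the paper uses a nonlocal distribution), and illustrative parameter names (`n_probes`, `sigma`, etc.):

```python
import numpy as np

def pivot_minimize(f, bounds, n_probes=20, n_iters=200, sigma=0.1, seed=0):
    """Sketch of a nearest-neighbour pivot search.

    Probes start uniformly at random in the box `bounds`; each iteration,
    the worse half of the probes is relocated near the better half (each
    bad probe's nearest "pivot" probe) by a Gaussian displacement.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    probes = rng.uniform(lo, hi, size=(n_probes, lo.size))
    for _ in range(n_iters):
        values = np.array([f(p) for p in probes])
        order = np.argsort(values)
        good, bad = order[: n_probes // 2], order[n_probes // 2 :]
        for b in bad:
            # pivot: the nearest probe among the better half
            dists = np.linalg.norm(probes[good] - probes[b], axis=1)
            pivot = probes[good[np.argmin(dists)]]
            # relocate near the pivot; clip back into the search box
            probes[b] = np.clip(pivot + rng.normal(0.0, sigma * (hi - lo)), lo, hi)
    values = np.array([f(p) for p in probes])
    return probes[np.argmin(values)], float(values.min())
```

On a smooth test function such as the sphere, the probe cloud contracts toward the minimum because relocated probes that beat the current median join the "good" half on the next iteration.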

Cited by 39 publications (30 citation statements); references 19 publications.
“…The nearest neighbour pivot method [28] with a Gaussian distribution as the pivoting distribution was used to perform the weighted least-squares fitting. A weighting factor was assigned to each point based on the energy of the point and relative to the point lowest in energy along the containing radial line.…”
Section: Surface fitting (mentioning)
confidence: 99%
“…-Pivot strategy. This method is inspired by the work of Serra et al (1997). Let us denote the best location of the particle by p, the best position of the informers of the particle by g and the objective function by f.…”
Section: The Strategies Of Displacement (mentioning)
confidence: 99%
“…However, in a quantum computer, with enough qubits available, we can perform full optimisation for all atoms. For example, for larger LJ clusters, if we had larger qubits, we could incorporate the partial knowledge that we had by starting with the structure of the smaller (M − k) clusters and adding k additional particles at random [9,42]. In a previous work [9], using the pivot method we have shown that the computational cost (CPU time) scales as M^2.9 with the number of L-J particles to be minimised.…”
(mentioning)
confidence: 97%
“…where Δx_i is a randomly generated vector according to a particular distribution such as a Gaussian distribution [9].…”
(mentioning)
confidence: 99%
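The displacement update quoted above, x_new = x_i + Δx_i with Δx_i drawn from a Gaussian, can be illustrated with a short sketch; the probe position and the width `sigma` below are illustrative values, not taken from the cited work:

```python
import numpy as np

rng = np.random.default_rng(1)
x_i = np.array([0.5, -1.2, 2.0])                # current probe position (illustrative)
sigma = 0.3                                      # assumed Gaussian width
dx_i = rng.normal(0.0, sigma, size=x_i.shape)    # Δx_i drawn from N(0, σ²) per component
x_new = x_i + dx_i                               # relocated probe
```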