“…For example, the search for optimal (single-objective) or efficient (multiobjective) parameter sets may be carried out manually by trial and error or by relying on well-established optimization algorithms, such as hill climbing, simplex minimization, the Levenberg–Marquardt method, the L-BFGS method, evolutionary algorithms, multiobjective evolutionary algorithms, multiobjective particle swarm optimization, and multiobjective genetic algorithms. These parameter-calibration workflows also frequently rely on training surrogate models (SMs), such as Gaussian process regressors (GPRs), the multistate Bennett acceptance ratio (MBAR), pair correlation function rescaling (PCFR), radial basis functions (RBFs), and thermodynamic reweighting, to replace the simulations involved in the estimation of the target properties, thus reducing the underlying computational cost. Other strategies with similar goals include the use of statistical-mechanical fluctuation formulas, whereby the derivatives of the (mechanical) target properties with respect to the parameters are obtained from an analytical expression (which requires a single simulation per derivative estimate), as opposed to finite-difference approaches (which require multiple simulations per derivative estimate).…”
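To make the surrogate-model idea concrete, the following is a minimal, self-contained sketch (not taken from the source): a Gaussian-process regressor is fitted to a handful of "simulation" results and then minimized in place of the expensive simulation. The function `simulate_property`, the single-parameter setting, the RBF length scale, and all numerical values are illustrative assumptions, not details from the article.

```python
import numpy as np

# Hypothetical expensive "simulation": maps a force-field parameter theta
# to a target property. The functional form is purely illustrative.
def simulate_property(theta):
    return (theta - 1.3) ** 2 + 0.5 * np.sin(3.0 * theta)

def rbf_kernel(a, b, length=0.5):
    # Squared-exponential (RBF) covariance between parameter values.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

# 1. Run a small number of real simulations to build training data.
theta_train = np.linspace(0.0, 3.0, 8)
y_train = simulate_property(theta_train)

# 2. Fit the GP surrogate (noise-free interpolation; jitter for stability).
K = rbf_kernel(theta_train, theta_train) + 1e-8 * np.eye(theta_train.size)
alpha = np.linalg.solve(K, y_train)

def surrogate(theta_query):
    # GP posterior mean: a cheap stand-in for the simulation.
    return rbf_kernel(theta_query, theta_train) @ alpha

# 3. Optimize the cheap surrogate instead of the expensive simulation.
grid = np.linspace(0.0, 3.0, 1001)
best_theta = grid[np.argmin(surrogate(grid))]
print(f"surrogate minimum near theta = {best_theta:.2f}")
```

In practice the loop would iterate: the surrogate's proposed optimum is verified with one additional real simulation, the new point is added to the training set, and the surrogate is refitted, which is what keeps the total number of expensive simulations low.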