“…In practice, many optimization problems are prone to becoming trapped in local optima and therefore call for global optimization methods. [1][2][3] There are many well-established numerical methods for optimization, such as steepest-descent methods, quasi-Newton methods, conjugate-gradient methods, simplex methods, etc. However, in order to locate the global optimum, they all require that the objective function be unimodal, and the gradient-based methods additionally require that it be differentiable.…”
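The pitfall described above can be illustrated with a minimal sketch: plain gradient descent (a steepest-descent variant) applied to a one-dimensional multimodal function converges to whichever minimum lies in the basin of the starting point, so it may return a local optimum rather than the global one. The objective function, step size, and starting points below are illustrative choices, not taken from the source.

```python
def f(x):
    # Illustrative multimodal objective with two minima: a shallow local
    # minimum near x ~ 1.35 and the global minimum near x ~ -1.47.
    return x**4 - 4 * x**2 + x

def grad_f(x):
    # Analytic derivative of f.
    return 4 * x**3 - 8 * x + 1

def gradient_descent(x, lr=0.01, steps=500):
    # Plain steepest descent: repeatedly steps downhill along the local
    # gradient, with no mechanism to escape a local basin of attraction.
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

x_local = gradient_descent(2.0)    # starts in the basin of the local minimum
x_global = gradient_descent(-2.0)  # starts in the basin of the global minimum
print(f"from x0= 2.0: x = {x_local:.3f}, f(x) = {f(x_local):.3f}")
print(f"from x0=-2.0: x = {x_global:.3f}, f(x) = {f(x_global):.3f}")
```

Both runs use identical update rules; only the starting point differs, yet the first run terminates at the inferior local minimum, which is exactly why global optimization techniques are needed for multimodal objectives.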