Abstract. We investigate the statistical properties of the cut sizes generated by heuristic algorithms which approximately solve the graph bisection problem. On an ensemble of sparse random graphs, we find empirically that the distribution of cut sizes found by "local" algorithms becomes peaked as the number of vertices in the graphs becomes large. Evidence is given that this distribution tends towards a Gaussian whose mean and variance scale linearly with the number of vertices. Given the distribution of cut sizes associated with each heuristic, we provide a ranking procedure which takes into account both the quality of the solutions and the speed of the algorithms. This procedure is demonstrated for a selection of local graph bisection heuristics.

Algorithms for combinatorial optimization problems fall into two broad classes. Exact algorithms, such as branch-and-bound or branch-and-cut, form the first class; they determine (exactly) the optimum of the cost function which is to be minimized. However, for NP-hard problems they require large computational resources, and in particular large computation times. The second class consists of "heuristic" algorithms; these are not guaranteed to find the optimal (lowest-cost) solution, nor even a solution very close to the optimum, but in practice they find good approximate solutions very fast. For problems in science, one's main interest is in the optimal solution, so an exact algorithm is required. However, for many engineering applications the heuristic approach may be preferable. There are several reasons for this: (i) the computational resources may simply be insufficient to solve the instances of interest by exact methods; (ii) the cost function one wants to minimize may be computationally very demanding, and limited resources force one to use an approximate cost function instead. This is the rule rather than the exception with very complex systems such as VLSI. If the true cost function cannot be used, there is little point in finding the true optimum for the wrong problem.
(iii) Heuristic algorithms typically generate numerous "good enough" solutions, thus providing information about the statistical properties of low-cost solutions. This information can in turn be used to generate better heuristics, or to find new criteria for guiding the branching in exact algorithms such as branch-and-bound. For almost any combinatorial optimization problem it is very easy to devise heuristic algorithms which perform quite well; this is probably why so many such algorithms have been proposed to date. Usually they fall into just a few families, the most popular of which are local search, simulated annealing, tabu search, and evolutionary computation. The practitioner is frequently confronted with the problem of choosing which method to use, and would therefore like to rank these algorithms and determine which one is best for his "instance" (the set of parameters which completely specify the cost function). A difficulty then arises because most heuristic algorithms are stochastic, so that they can give many different solutions for a single instance. In general, the dist...
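The cut-size statistics described above can be reproduced in miniature. The following Python sketch is our illustration, not code from the paper: it builds a sparse random graph, runs a naive greedy pair-swap bisection heuristic (using the classic Kernighan-Lin swap-gain formula) from many random starts, and reports the sample mean and variance of the resulting cut sizes. All function names and parameter values are invented for the example.

```python
import random


def random_graph(n, avg_degree, seed=None):
    """Sparse Erdos-Renyi random graph, returned as adjacency lists."""
    rng = random.Random(seed)
    p = avg_degree / (n - 1)
    adj = [[] for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].append(v)
                adj[v].append(u)
    return adj


def cut_size(adj, side):
    """Number of edges whose endpoints lie on opposite sides of the bisection."""
    return sum(1 for u in range(len(adj)) for v in adj[u]
               if u < v and side[u] != side[v])


def swap_gain(adj, side, u, v):
    """Decrease in cut size obtained by swapping u and v across the cut
    (the Kernighan-Lin gain: D(u) + D(v) - 2 if u,v are adjacent)."""
    def d(x):  # external degree minus internal degree of vertex x
        ext = sum(1 for y in adj[x] if side[y] != side[x])
        return 2 * ext - len(adj[x])
    return d(u) + d(v) - (2 if v in adj[u] else 0)


def local_search_bisection(adj, rng):
    """Greedy pair-swap descent: start from a random balanced bisection and
    swap any improving pair of vertices until a local minimum is reached."""
    n = len(adj)
    side = [0] * (n // 2) + [1] * (n - n // 2)
    rng.shuffle(side)  # random balanced starting bisection
    improved = True
    while improved:
        improved = False
        zeros = [u for u in range(n) if side[u] == 0]
        ones = [u for u in range(n) if side[u] == 1]
        for u in zeros:
            for v in ones:
                if swap_gain(adj, side, u, v) > 0:
                    side[u], side[v] = 1, 0
                    improved = True
                    break  # side-membership lists are stale; rescan
            if improved:
                break
    return cut_size(adj, side)


if __name__ == "__main__":
    rng = random.Random(0)
    adj = random_graph(60, 4.0, seed=1)
    cuts = [local_search_bisection(adj, rng) for _ in range(30)]
    mean = sum(cuts) / len(cuts)
    var = sum((c - mean) ** 2 for c in cuts) / len(cuts)
    print(f"cut sizes over 30 restarts: mean = {mean:.1f}, variance = {var:.1f}")
```

Running the heuristic from many random starts on one instance yields an empirical cut-size distribution; repeating over an ensemble of graphs of growing size is what would let one probe the peaking and Gaussian scaling discussed in the abstract.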