2022
DOI: 10.1038/s42256-022-00468-6
Combinatorial optimization with physics-inspired graph neural networks

Abstract: Combinatorial optimization problems are pervasive across science and industry. Modern deep learning tools are poised to solve these problems at unprecedented scales, but a unifying framework that incorporates insights from statistical physics is still outstanding. Here we demonstrate how graph neural networks can be used to solve combinatorial optimization problems. Our approach is broadly applicable to canonical NP-hard problems in the form of quadratic unconstrained binary optimization problems, such as maxi…
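As a hedged illustration of the QUBO formulation the abstract refers to (this is not code from the paper): MaxCut on a graph G = (V, E) can be encoded as minimizing the Hamiltonian H(x) = Σ_{(i,j)∈E} (2·x_i·x_j − x_i − x_j) over binary x, and the physics-inspired GNN approach relaxes x to soft probabilities and minimizes the differentiable loss pᵀQp. The toy graph, sigmoid parametrization, and plain gradient steps below are illustrative assumptions standing in for the GNN.

```python
import numpy as np

# Hypothetical toy graph (a triangle plus one pendant edge), chosen for illustration.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
n = 4

# Build the QUBO matrix Q so that H(x) = x^T Q x for binary x in {0,1}^n:
# each edge contributes 2*x_i*x_j (off-diagonal) and -x_i - x_j (diagonal,
# using x_i^2 = x_i for binary variables).
Q = np.zeros((n, n))
for i, j in edges:
    Q[i, j] += 1.0
    Q[j, i] += 1.0
    Q[i, i] -= 1.0
    Q[j, j] -= 1.0

def qubo_energy(x, Q):
    """Hamiltonian value x^T Q x; each cut edge contributes -1, so lower is better."""
    return float(x @ Q @ x)

# Brute-force ground state, feasible only for tiny n; max cut size = -min energy.
best_energy = min(
    qubo_energy(np.array([(b >> k) & 1 for k in range(n)]), Q)
    for b in range(2 ** n)
)

# Relaxation sketch: soft assignments p = sigmoid(theta), gradient descent on p^T Q p
# (a plain parameter vector stands in for the GNN's node embeddings here).
rng = np.random.default_rng(0)
theta = rng.standard_normal(n)
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-theta))
    grad_p = (Q + Q.T) @ p               # d(p^T Q p)/dp
    theta -= 0.5 * grad_p * p * (1 - p)  # chain rule through the sigmoid
x_rounded = (1.0 / (1.0 + np.exp(-theta)) > 0.5).astype(float)
```

For this toy graph the triangle admits at most a 2-edge cut and the pendant edge is always cuttable, so the brute-force minimum energy is −3 (a 3-edge cut). Rounded relaxations carry no optimality guarantee, which is precisely the point of contention in the exchange quoted below.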

Cited by 86 publications (34 citation statements) · References 112 publications
“…
We provide a comprehensive reply to the comment written by Chiara Angelini and Federico Ricci-Tersenghi [arXiv:2206.13211] and argue that the comment singles out one particular nonrepresentative example problem, entirely focusing on the maximum independent set (MIS) on sparse graphs, for which greedy algorithms are expected to perform well. Conversely, we highlight the broader algorithmic development underlying our original work [1], and (within our original framework) provide additional numerical results showing sizable improvements over our original results, thereby refuting the comment's performance statements. We also provide results showing run-time scaling superior to the results provided by Angelini and Ricci-Tersenghi.
…”
mentioning
confidence: 51%
“…The comment by Angelini and Ricci-Tersenghi is exclusively focused on the maximum independent set (MIS) problem for sparse random d-regular graphs with low densities between ∼10⁻⁶ and ∼10⁻³, in line with one of the comment's main references, entitled "Monte Carlo algorithms are very effective in finding the largest independent set in sparse random graphs." However, the comment leaves out the fact that we have also provided results for standard MaxCut benchmark instances based on the publicly available (and dense) Gset data set, as provided in Table I of our paper [1]. We report on a wide array of benchmark results, including results based on (i) an SDP solver using dual scaling (DSDP), (ii) Breakout Local Search (BLS), (iii) a Tabu Search metaheuristic (KHLWG), and (iv) a recurrent graph neural network (GNN) architecture for maximum constraint satisfaction problems (RUN-CSP).…”
mentioning
confidence: 99%
“…It would also be interesting to extend our work to optimization problems with continuous variables [45] using models that can handle continuous variables [46,47]. Additionally, one could incorporate graph autoregressive networks in the VCA scheme to take the graph structure of some optimization problems into account [48,49]. There is also flexibility in tuning the temperature cooling schedules to potentially improve the performance of VCA [50].…”
Section: Discussion
mentioning
confidence: 99%
“…Schuetz et al [19] considered the graph coloring problem, which is the zero-temperature version of the benchmark problem we propose to use in this work, and found that a graph neural network (GNN) can propose moves that allow one to efficiently find a proper coloring with performance comparable to (but not outperforming) state-of-the-art local search algorithms. Additionally, GNNs have been shown to be successful at solving discrete combinatorial problems [35], but they do not provide much advantage over classical greedy algorithms and can sometimes even perform worse [36,37]. Finally, Inack et al [20] showed that the machine-learning-assisted simulated annealing scheme does not work on a glassy problem with a rough energy landscape.…”
Section: State of the Art
mentioning
confidence: 99%