2022
DOI: 10.48550/arxiv.2210.16934
Preprint

Learning to Compare Nodes in Branch and Bound with Graph Neural Networks

Abstract: Branch-and-bound approaches in integer programming require ordering portions of the space to explore next, a problem known as node comparison. We propose a new siamese graph neural network model to tackle this problem, where the nodes are represented as bipartite graphs with attributes. Similar to prior work, we train our model to imitate a diving oracle that plunges towards the optimal solution. We evaluate our method by solving the instances in a plain framework where the nodes are explored according to thei…
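The core idea in the abstract — score two branch-and-bound nodes with shared (siamese) weights and explore the higher-scoring node first — can be sketched as follows. This is an illustrative toy, not the authors' model: the hand-picked feature vectors and linear scoring function are stand-ins for the paper's GNN embedding of each node's bipartite variable–constraint graph.

```python
def embed(node_features, weights):
    """Shared 'tower': map a node's feature vector to a scalar score.
    The real model runs a GNN over the node's bipartite graph instead
    of this linear stand-in."""
    return sum(w * x for w, x in zip(weights, node_features))

def compare_nodes(node_a, node_b, weights):
    """Siamese comparison: both nodes are scored with the SAME weights.
    Returns -1 if node_a should be explored first, else 1."""
    return -1 if embed(node_a, weights) >= embed(node_b, weights) else 1

# Hypothetical learned parameters and node summaries (e.g. depth,
# bound gap, fractionality) -- purely illustrative values.
weights = [0.5, -0.2, 1.0]
node_a = [1.0, 0.0, 2.0]
node_b = [0.5, 1.0, 0.5]
print(compare_nodes(node_a, node_b, weights))  # -1: prefer node_a
```

Weight sharing is what makes the comparator consistent: because both nodes pass through the same scoring function, the induced ordering over open nodes is transitive and can drive a priority queue in the solver's node-selection loop.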

Cited by 1 publication (1 citation statement) · References 17 publications
“…Graph Neural Networks (GNN) have been applied to address many problems in power systems [27]. For example, applying GNNs to learn branching in B&B algorithms has been demonstrated to effectively reduce the running time of MIP solvers [28][29][30]. DeepMind [31] builds on the GNN model proposed in [28], which is constructed from the MIP formulation, and applies it to neural diving and neural branching for problems including electric grid optimization [32].…”
Section: Introduction
confidence: 99%