2018
DOI: 10.1016/j.ejor.2017.08.025
Logical and inequality implications for reducing the size and difficulty of quadratic unconstrained binary optimization problems

Cited by 26 publications (18 citation statements)
References 17 publications
“…The mean and standard deviation were computed over five runs. Before solving the QUBO problems, we applied preprocessing techniques, reducing their size and difficulty [11]. This proved effective and eliminated a great many variables.…”
Section: Results For the Quantum Version Of Bidvit
confidence: 99%
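The preprocessing referred to in this excerpt fixes variables through logical implications among the QUBO coefficients [11]. Below is a minimal sketch of one simple first-order fixing rule of this kind; the specific rule, function name, and example matrix are illustrative assumptions, not the full set of implications developed in the cited paper.

```python
import numpy as np

def fix_variables(Q):
    """Single pass of a simple first-order fixing rule for min x^T Q x, x in {0,1}^n.

    Returns {index: value} for variables that can be fixed.  Sketch of the general
    idea only; the rules in [11] are broader (logical and inequality implications,
    applied iteratively until no further reduction is possible).
    """
    n = Q.shape[0]
    fixed = {}
    for i in range(n):
        # Coefficient multiplying x_i once the other variables are chosen:
        #   Q[i, i] + sum_{j != i} (Q[i, j] + Q[j, i]) * x_j
        off = np.delete(Q[i, :] + Q[:, i], i)
        lo = Q[i, i] + off[off < 0].sum()  # smallest achievable coefficient
        hi = Q[i, i] + off[off > 0].sum()  # largest achievable coefficient
        if lo >= 0:
            fixed[i] = 0  # x_i = 1 can never improve the objective
        elif hi <= 0:
            fixed[i] = 1  # x_i = 1 can never worsen the objective
    return fixed

# Small illustration: x_0 is fixed to 0 (positive diagonal, no couplings),
# x_2 is fixed to 1 (its negative diagonal outweighs its couplings).
Q = np.array([[3.0, 0.0, 0.0],
              [0.0, -1.0, 2.0],
              [0.0, 2.0, -5.0]])
print(fix_variables(Q))  # {0: 0, 2: 1}
```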
“…Solutions to (P5) can be approximated using heuristics such as simulated annealing [32], path relinking [25], tabu search [25], and parallel tempering [38]. Before solving (P5), it is advisable to reduce its size and difficulty by making use of logical implications among the coefficients [11]. This involves fixing every variable that corresponds to a node that has no neighbours to one, as it is necessarily included in an ε-dense subset.…”
Section: Chunk Coarsening
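The reduction step described in this excerpt fixes the variable of every isolated node to one, since such a node is always part of an ε-dense subset. A hypothetical sketch of that step is given below, assuming variables are indexed by graph nodes and x_v = 1 means node v is selected; the adjacency representation and naming are assumptions, not the cited formulation.

```python
def fix_isolated_nodes(adjacency):
    """Fix x_v = 1 for every node with no neighbours before invoking a QUBO solver.

    `adjacency` maps each node to the set of its neighbours.  An isolated node is
    necessarily contained in an ε-dense subset, so its variable can be fixed to 1
    and removed from the problem handed to the heuristic.
    """
    return {v: 1 for v, nbrs in adjacency.items() if not nbrs}

# Illustrative graph: node 'd' has no neighbours and is fixed up front.
adjacency = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}, "d": set()}
print(fix_isolated_nodes(adjacency))  # {'d': 1}
```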
“…implemented the methods from Glover et al. (18) for this article. However, we do not provide details, but rather concentrate on MaxCut reduction techniques in the following.…”
Section: Post-processing
confidence: 99%
“…For testing we use the QUBO instances presented in [20] and [21]. The algorithms were implemented in Python 3.6.…”
Section: Computational Experiments
confidence: 99%
“…The experiments were performed on a 3.40 GHz Intel Core i7 processor with 16 GB RAM running 64-bit Windows 7. The datasets described in [20] have 1000 nodes, while the ORLIB instances [21] have 1000 and 2500 nodes. Our experiments utilize a path relinking and tabu search based QUBO solver.…”
Section: Computational Experiments
confidence: 99%