2013
DOI: 10.1007/978-3-642-40994-3_28
SNNAP: Solver-Based Nearest Neighbor for Algorithm Portfolios

Abstract: The success of portfolio algorithms in competitions in the area of combinatorial problem solving, as well as in practice, has motivated interest in the development of new approaches to determine the best solver for the problem at hand. Yet, although there are a number of ways in which this decision can be made, it always relies on a rich set of features to identify and distinguish the structure of the problem instances. In this thesis, however, it is first shown how n…

Cited by 12 publications (9 citation statements)
References 20 publications
“…Numerical results on both SAT and CSP domains show that the number of features can be significantly reduced while often providing considerable performance gains. Moreover, in [8] the authors introduce SNNAP (Solver-based Nearest Neighbors for Algorithm Portfolios), an alternative view of ISAC which uses the existing features to predict the best three solvers for a particular instance. A brand-new classifier that selects solvers based on a Cost-Sensitive Hierarchical Clustering (CSHC) model is presented in [23].…”
Section: Related Work
confidence: 99%
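The statement above describes SNNAP's two-stage idea: first predict each portfolio solver's performance on a new instance, then select a solver via nearest neighbours computed in the *predicted performance* space rather than the raw feature space. A minimal sketch of that pipeline follows; the data is synthetic, and a trivial 1-NN lookup stands in for the learned regression models of the actual system.

```python
import numpy as np

# Synthetic stand-in data: 20 training instances, 5 features, 3 solvers.
rng = np.random.default_rng(0)
X_train = rng.random((20, 5))   # instance features
Y_train = rng.random((20, 3))   # observed runtimes of the 3 solvers
x_new = rng.random(5)           # features of a new instance

# Stage 1: predict each solver's runtime on the new instance.
# (SNNAP trains one regression model per solver; a 1-NN lookup in
# feature space is used here only as an illustrative stand-in.)
nearest = np.argmin(np.linalg.norm(X_train - x_new, axis=1))
y_pred = Y_train[nearest]       # predicted performance vector

# Stage 2: find the k training instances whose performance vectors are
# closest to the prediction, then pick the solver that wins most often
# among those neighbours.
k = 5
dists = np.linalg.norm(Y_train - y_pred, axis=1)
neighbours = np.argsort(dists)[:k]
winners = Y_train[neighbours].argmin(axis=1)
chosen = int(np.bincount(winners, minlength=3).argmax())
```

The actual SNNAP distance emphasises agreement on which solvers rank best, not raw Euclidean distance between runtime vectors; the sketch only conveys the overall structure.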
“…Unfortunately, we have not seen significant improvements: all the performance gains were below 1%. However, it is worth noting that merely removing constant features and scaling the rest to a given range could lead to major enhancements with small computational effort. Consider, for instance, Figure 3, which shows a comparison between the performances obtained by 3S on Benchmark B using normalized and non-normalized features.…”
Section: Features Preprocessing
confidence: 99%
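The preprocessing this statement credits with "major enhancements" is simple: drop constant feature columns and min-max scale the remaining ones. A minimal sketch, with illustrative data rather than the authors' actual feature matrix:

```python
import numpy as np

def preprocess(F):
    """Drop constant feature columns, then min-max scale each
    remaining column to [0, 1]. A sketch of the preprocessing
    described above, not the authors' exact pipeline."""
    keep = F.std(axis=0) > 0            # constant columns carry no signal
    F = F[:, keep]
    lo, hi = F.min(axis=0), F.max(axis=0)
    return (F - lo) / (hi - lo)

# Toy feature matrix: the first column is constant and gets dropped.
F = np.array([[1.0, 5.0, 2.0],
              [1.0, 7.0, 4.0],
              [1.0, 6.0, 6.0]])
G = preprocess(F)
print(G.shape)  # → (3, 2)
```

In practice the scaling parameters (`lo`, `hi`) must be computed on training instances only and reused at test time, otherwise the feature space shifts between training and selection.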
“…Another approach based on k-NN is SNNAP (Collautti, Malitsky, Mehta, & O'Sullivan, 2013), which first predicts the performance of each algorithm with regression models and then uses this information for a k-NN approach in the predicted performance space. As for 3S, SNNAP was inspired by the ISAC approach.…”
Section: Related Work
confidence: 99%
“…Especially on SAT problems, automatic algorithm selectors have achieved impressive performance improvements in the last decade. SATzilla (Xu et al. 2008; 2007; 2012) predicted algorithm performance by means of ridge regression until 2009 and nowadays uses a pairwise voting scheme based on random forests; ISAC (Kadioglu et al. 2010) clusters instances in the instance feature space and uses a nearest-neighbour approach on cluster centers for algorithm selection; 3S (Kadioglu et al. 2011; Malitsky et al. 2013) uses k-NN in the feature space and introduces pre-solving schedules computed by Integer Linear Programming and cost-sensitive clustering; SNNAP (Collautti et al. 2013) predicts algorithm performance based on instance features and chooses an algorithm based on the similarity of the predicted performances. All these systems are specialized in a single approach.…”
Section: Related Work
confidence: 99%
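The pairwise voting scheme attributed to SATzilla above trains one classifier per pair of solvers and lets each pair cast a vote for its predicted winner. A toy sketch of that voting structure, with synthetic data and a trivial nearest-neighbour lookup standing in for the per-pair random forests:

```python
import numpy as np

# Synthetic stand-in data: 30 instances, 4 features, 3 solvers.
rng = np.random.default_rng(1)
n_solvers = 3
X_train = rng.random((30, 4))
runtimes = rng.random((30, n_solvers))
x_new = rng.random(4)

votes = np.zeros(n_solvers)
for i in range(n_solvers):
    for j in range(i + 1, n_solvers):
        # "Classifier" for the pair (i, j): which of the two solvers was
        # faster on the training instance nearest to x_new?  SATzilla
        # trains a random forest per pair; this 1-NN lookup only
        # illustrates where each pairwise vote comes from.
        nn = np.argmin(np.linalg.norm(X_train - x_new, axis=1))
        winner = i if runtimes[nn, i] <= runtimes[nn, j] else j
        votes[winner] += 1

# The solver with the most pairwise wins is selected.
selected = int(votes.argmax())
```

With 3 solvers there are 3 pairs, so exactly 3 votes are cast; the scheme scales quadratically in the number of solvers, which is acceptable for the small portfolios typical of SAT competitions.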