2011
DOI: 10.1016/j.asoc.2010.06.015
Hybrid optimization with improved tabu search

Cited by 46 publications (22 citation statements)
References 47 publications
“…The results of [25,[44][45][46] showed that optimization with TS can still become trapped in a local optimum. To obtain the best result, one needs to run several simulations, using a different number of neighbors and a different tabu-list size in each simulation, and take the most optimal value found.…”
Section: Tabu Search
confidence: 96%
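The multi-run strategy this excerpt describes — several independent tabu-search runs, each with a different neighborhood size and tabu-list length, keeping the overall best — can be sketched as follows. This is a minimal, hypothetical illustration on a toy objective; the function names, parameter values, and objective are assumptions, not the cited paper's method:

```python
import random

def tabu_search(objective, start, neighbor, n_neighbors=10, tenure=5,
                iters=100, seed=0):
    """Minimal tabu search sketch (illustrative, not the paper's algorithm)."""
    rng = random.Random(seed)
    current = best = start
    tabu = []  # short-term memory of recently visited points, bounded by `tenure`
    for _ in range(iters):
        candidates = [neighbor(current, rng) for _ in range(n_neighbors)]
        # Exclude tabu candidates; aspiration: allow a tabu move that beats the best.
        allowed = [c for c in candidates
                   if c not in tabu or objective(c) < objective(best)]
        if not allowed:
            continue
        current = min(allowed, key=objective)  # greedy step among allowed moves
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)
        if objective(current) < objective(best):
            best = current
    return best

# Toy 1-D objective; neighbors are small integer steps.
f = lambda x: (x - 7) ** 2
step = lambda x, rng: x + rng.choice([-3, -2, -1, 1, 2, 3])

# Several simulations with different neighbor counts and tabu tenures,
# as the cited works suggest; keep the best result across runs.
results = [tabu_search(f, start=50, neighbor=step, n_neighbors=k, tenure=t, seed=s)
           for s, (k, t) in enumerate([(5, 3), (10, 5), (20, 7)])]
best = min(results, key=f)
```

Varying `n_neighbors` and `tenure` across runs changes how aggressively the search explores, which is why repeated runs with different settings reduce the chance that every run stalls in the same local optimum.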
“…SNMRVU algorithm can obtain better function values if we apply one termination criterion, which is that the number of function evaluations is ≤ 50,000.

       7.14e-11   9.06e-10   3.34e-09
F 3    5.57e-11   6.56e-11   2.12e-09
F 4    (0.0)      (0.0)      (0.0)
F 5    6.01e-07   1.54e-05   4.32e-05
F 6    0          (0.0)      (0.0)
F 7    3.85e-07   2.11e-06   9.85e-06
F 8    4.9e-07    1.42e-14   2.81e-14
F 9    (0.0)      (0.0)      (0.0)
F 10   4.3e-07    3.12e-04   1.12e-04…”
Section: Comparison Between SNMRVU, SSR and SSC On Functions With
confidence: 99%
“…Many promising methods have been proposed to solve the problem stated in Equation 1, for example, genetic algorithms [14,24], particle swarm optimization [23,28], ant colony optimization [31], tabu search [9,10], differential evolution [2,6], scatter search [15,20], and variable neighborhood search [11,27]. Despite the efficiency of these methods when applied to low- and middle-dimensional problems, e.g., D < 100, many of them suffer from the curse of dimensionality when applied to high-dimensional problems.…”
Section: Introduction
confidence: 99%
“…Metaheuristic approaches have been developed by drawing inspiration from nature, physics, and human behavior. In recent years, many of these algorithms and their improved variants have been successfully applied to various engineering optimization problems [12][13][14][15][16]. A common feature of meta-heuristic approaches is that they combine rules and randomness to imitate natural phenomena.…”
Section: Introduction
confidence: 99%
“…A common feature of meta-heuristic approaches is that they combine rules and randomness to imitate natural phenomena. These phenomena include the biological evolutionary process (e.g., the Genetic Algorithm (GA) [17][18] and Differential Evolution (DE) [12][13]), animal behavior (e.g., Particle Swarm Optimization (PSO) [14] and the Ant Colony Algorithm (ACA) [15][16]), and the physical annealing process (e.g., Simulated Annealing (SA) [1,19]). These algorithms are approximate optimization approaches that include mechanisms for escaping local optima.…”
Section: Introduction
confidence: 99%
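The "rules plus randomness" combination the excerpt describes is easiest to see in the Metropolis acceptance rule of Simulated Annealing, which the excerpt names. Below is a minimal, hypothetical sketch (the function names, toy objective, and parameter values are illustrative assumptions, not taken from any of the cited works): the deterministic rule always accepts improvements, while randomness occasionally accepts worse moves, which is the mechanism for departing from a local optimum.

```python
import math
import random

def anneal(objective, start, neighbor, T0=2.0, cooling=0.98, iters=300, seed=0):
    """Minimal simulated-annealing sketch (illustrative only)."""
    rng = random.Random(seed)
    current = best = start
    T = T0
    for _ in range(iters):
        cand = neighbor(current, rng)
        delta = objective(cand) - objective(current)
        # Metropolis rule: always accept improvements (delta <= 0);
        # accept worse moves with probability exp(-delta/T), which
        # shrinks as the temperature T cools.
        if delta <= 0 or rng.random() < math.exp(-delta / T):
            current = cand
        if objective(current) < objective(best):
            best = current
        T *= cooling
    return best

# Toy 1-D objective with a local optimum near x = 0 (value 1)
# and a global optimum at x = 5 (value 0).
f = lambda x: min(x * x + 1, (x - 5) ** 2)
move = lambda x, rng: x + rng.uniform(-1, 1)

result = anneal(f, start=0.0, neighbor=move)
```

Early on, the high temperature makes uphill moves likely, letting the search climb out of the basin around x = 0; as T decays, the rule hardens into pure greedy descent, which is why the cooling schedule matters as much as the acceptance rule itself.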