2013
DOI: 10.1016/j.ins.2012.12.043

An improved adaptive binary Harmony Search algorithm

Cited by 115 publications (61 citation statements)
References 34 publications
“…are calculated. On the other hand, the performance of the proposed IBBA-RSS algorithm is compared with six different algorithms that are reported in [37]: NGHS1 [36], SBHS [37], BHS [38], DBHS [39], ABHS [40] and ABHS1 [41]. Table 2 shows the comparisons between the proposed algorithm and six algorithms, where best results are highlighted in bold.…”
Section: Low-dimensional 0-1 Knapsack Problems (mentioning)
confidence: 99%
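The comparison above scores binary Harmony Search variants on low-dimensional 0-1 knapsack instances. As a minimal sketch of the kind of objective such binary algorithms are evaluated on, the following Python snippet uses a hypothetical 10-item instance and a simple penalty for overweight selections; the instance data and penalty scheme are illustrative assumptions, not the benchmark sets or repair methods used in the cited works.

import random

# Hypothetical 10-item instance (illustrative values, not a benchmark set).
profits  = [20, 35, 12, 9, 41, 7, 25, 18, 30, 15]
weights  = [10, 22, 8, 5, 27, 4, 16, 12, 19, 9]
capacity = 60

def knapsack_fitness(x):
    """Total profit of the selected items; overweight (infeasible)
    selections are penalised so the search is pushed back toward
    the capacity constraint."""
    profit = sum(p for p, bit in zip(profits, x) if bit)
    weight = sum(w for w, bit in zip(weights, x) if bit)
    if weight <= capacity:
        return profit
    return profit - max(profits) * (weight - capacity)  # simple linear penalty

# Evaluate one random binary harmony.
x = [random.randint(0, 1) for _ in range(len(profits))]
print(x, knapsack_fitness(x))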
“…Modifications of the harmony memory consideration rate have been studied in many works [30,31]. From these references, it can be concluded that the harmony memory consideration rate should be given a small value to increase the diversity of the harmony memory and to strengthen the global search when solutions differ greatly.…”
Section: Improvements On Harmony Memory Consideration (mentioning)
confidence: 99%
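To make the role of the harmony memory consideration rate (HMCR) concrete, the sketch below improvises one new binary harmony bit by bit: with probability HMCR a bit is copied from a randomly chosen memory member (and possibly pitch-adjusted by flipping), otherwise it is drawn at random, so a smaller HMCR produces more random bits and more diversity. The function name, memory contents, and parameter values are illustrative assumptions, not the adaptive scheme proposed in the paper.

import random

def improvise(harmony_memory, hmcr, par):
    """Build one new binary harmony bit by bit.

    hmcr: probability of copying a bit from a random memory member
          (small hmcr -> more random bits -> more diversity / global search)
    par:  probability of flipping a memory-derived bit (pitch adjustment)
    """
    n = len(harmony_memory[0])
    new_harmony = []
    for j in range(n):
        if random.random() < hmcr:
            bit = random.choice(harmony_memory)[j]   # memory consideration
            if random.random() < par:
                bit = 1 - bit                        # pitch adjustment (bit flip)
        else:
            bit = random.randint(0, 1)               # purely random bit
        new_harmony.append(bit)
    return new_harmony

memory = [[random.randint(0, 1) for _ in range(8)] for _ in range(5)]
print(improvise(memory, hmcr=0.7, par=0.1))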
“…Parameters are set as follows: ingen = 100, 0 = 0.3, = 5, visual = 1.5, 1 = 0.3, 2 = 0.5, 1 = 0.6, 2 = 0.4, 1 = 0.6, 2 = 0.4, max gen = 100000, and = 30; for other parameters, refer to [24,30,32].…”
Section: Verification Tests By Benchmark (mentioning)
confidence: 99%
“…Moreover, it has few mathematical requirements, and derivative information is not needed because it uses a stochastic random search [2][3][4]. HS has been successfully applied to various areas, including binary-coded optimization problems [5], reaction kinetic parameter estimation [6], power economic load dispatch [7], cost minimization [8], damage detection [9], feature selection [10], machine learning [11], and classification [12].…”
Section: Introduction (mentioning)
confidence: 99%
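As a rough illustration of why Harmony Search needs no derivative information, the following is a minimal continuous HS loop on a toy sphere function: candidate values are drawn from memory or at random and only the objective value is compared, so gradients never enter the picture. The parameter values and test function are illustrative assumptions, not settings recommended in the cited works.

import random

def sphere(x):
    return sum(v * v for v in x)

def harmony_search(obj, dim=5, lower=-5.0, upper=5.0,
                   hms=10, hmcr=0.9, par=0.3, bw=0.05, iters=2000):
    # Initialise harmony memory with random solutions (no gradients needed).
    memory = [[random.uniform(lower, upper) for _ in range(dim)] for _ in range(hms)]
    fitness = [obj(h) for h in memory]
    for _ in range(iters):
        new = []
        for j in range(dim):
            if random.random() < hmcr:
                v = random.choice(memory)[j]          # memory consideration
                if random.random() < par:
                    v += random.uniform(-bw, bw)      # pitch adjustment
            else:
                v = random.uniform(lower, upper)      # random selection
            new.append(min(max(v, lower), upper))     # clamp to bounds
        f = obj(new)
        worst = max(range(hms), key=lambda i: fitness[i])
        if f < fitness[worst]:                        # replace worst memory member
            memory[worst], fitness[worst] = new, f
    best = min(range(hms), key=lambda i: fitness[i])
    return memory[best], fitness[best]

best_x, best_f = harmony_search(sphere)
print(best_f)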