2018
DOI: 10.1007/s00500-018-3473-6
Best neighbor-guided artificial bee colony algorithm for continuous optimization problems

Cited by 64 publications (34 citation statements)
References 43 publications
“…Simulation studies are done on a computer with C++ and i7 8 GB RAM hardware. For SSEABC, ABC [21], MABC [35], NABC [36], and APABC [37], the results are obtained with 100 independent runs with 7500 function evaluations (FEs) for each sample. A Gaussian white noise signal with 100 samples is applied as input signal to both the unknown system and the IIR filter for each algorithm run.…”
Section: Simulation Results
confidence: 99%
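The system-identification setup described in this excerpt — a 100-sample Gaussian white-noise signal fed to both an unknown system and a candidate IIR filter, with fitness measured on the output mismatch — can be sketched as follows. The filter orders and coefficients below are hypothetical stand-ins for illustration, not values from the cited experiments:

```python
import random

def iir_output(b, a, x):
    """Direct-form IIR filter with a[0] = 1:
    y[n] = sum_k b[k] x[n-k] - sum_{k>=1} a[k] y[n-k]."""
    y = []
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        acc -= sum(a[k] * y[n - k] for k in range(1, len(a)) if n - k >= 0)
        y.append(acc)
    return y

def mse_fitness(candidate_b, candidate_a, unknown_b, unknown_a, n_samples=100):
    """Fitness of a candidate filter: mean squared error between the
    unknown system's response and the candidate's response to the same
    Gaussian white-noise input (one evaluation = one 100-sample run)."""
    x = [random.gauss(0.0, 1.0) for _ in range(n_samples)]  # white-noise input
    d = iir_output(unknown_b, unknown_a, x)       # unknown-system output
    y = iir_output(candidate_b, candidate_a, x)   # candidate-filter output
    return sum((di - yi) ** 2 for di, yi in zip(d, y)) / n_samples

random.seed(1)
# Hypothetical first-order unknown system and a mismatched candidate.
err = mse_fitness([0.4], [1.0, -0.5], [0.5], [1.0, -0.6])
```

An optimizer such as ABC would minimize `mse_fitness` over the candidate coefficients, budgeted at 7500 such evaluations per run in the excerpt's protocol.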
“…(1)  Begin
(2)  /* Build classification model based on DMSDL-QBSA-RF */
(3)  Initialize the positions of N birds using equations (16) and (17): X_i (i = 1, 2, ..., N)
(4)  Calculate fitness f(X_i) (i = 1, 2, ..., N); set X_i to be P_i and find P_gbest
(5)  While iter < iter_max + 1 do
(6)    For i = 1 : n_tree
(7)      Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)      Select m_try attributes randomly at each leaf node, compare the attributes, and select the best one
(9)      Recursively generate each decision tree without pruning operations
(10)   End For
(11)   Update classification accuracy of RF: evaluate f(X_i)
(12)   Update gbest and P_gbest
(13)   [n_best, m_best] = gbest
(14)   iter = iter + 1
(15) End While
(16) /* Classify using RF model */
(17) For i = 1 : n_best
(18)   Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)   Select m_best attributes randomly at each leaf node, compare the attributes, and select the best one
(20)   Recursively generate each decision tree without pruning operations
(21) End
(22) Return DMSDL-QBSA-RF classification model
(23) Classify the test dataset using equation (20)
(24) Calculate OOB error
(25) End
ALGORITHM 2: DMSDL-QBSA-RF classification model.…”
Section: Practical Application
confidence: 99%
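The two random-forest ingredients named in the listing — a bootstrap training set of size N drawn with replacement, and a random subset of m_try candidate attributes at each split — can be illustrated with a minimal sketch. All data values and attribute names below are made up for illustration:

```python
import random

def bootstrap_sample(data, n):
    """Draw a training set of size n by random sampling with
    replacement (the Bootstrap step in the listing)."""
    return [random.choice(data) for _ in range(n)]

def random_attribute_subset(attributes, m_try):
    """Pick m_try candidate attributes for a split; the split test
    would then compare these and keep the best one."""
    return random.sample(attributes, m_try)

random.seed(0)
data = list(range(10))               # stand-in dataset of 10 records
sample = bootstrap_sample(data, len(data))
attrs = random_attribute_subset(["a", "b", "c", "d"], 2)
```

In the excerpt's scheme, the swarm optimizer tunes exactly these two hyperparameters (the tree count n_tree and m_try), retraining the forest to evaluate each candidate setting.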
“…The simulation results have shown that the proposed approach has the best performance on some complex functions. Peng et al. [18] also developed a hybrid approach, which uses a best-neighbor-guided solution search strategy within the ABC algorithm. The experimental results indicate that the proposed ABC is very competitive and outperforms the other algorithms.…”
Section: Introduction
confidence: 99%
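The exact update rule of the best-neighbor-guided strategy is not reproduced in this excerpt; a plausible minimal sketch, assuming a greedy ABC-style update that perturbs one randomly chosen dimension toward the best-fitness neighbor, is:

```python
import random

def best_neighbor_update(population, fitness, i, dim_count):
    """Illustrative best-neighbor-guided candidate update (assumed form,
    not the paper's exact equation): modify one dimension of solution i,
    guided by the best-fitness member among the other solutions."""
    neighbors = [j for j in range(len(population)) if j != i]
    k = min(neighbors, key=lambda j: fitness(population[j]))  # best neighbor (minimization)
    j = random.randrange(dim_count)          # random dimension to perturb
    phi = random.uniform(-1.0, 1.0)
    candidate = list(population[i])
    candidate[j] = population[k][j] + phi * (population[i][j] - population[k][j])
    return candidate

def sphere(x):
    """Simple benchmark objective: f(x) = sum of squares."""
    return sum(v * v for v in x)

random.seed(3)
pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(5)]
cand = best_neighbor_update(pop, sphere, 0, 2)
if sphere(cand) < sphere(pop[0]):  # greedy selection, as ABC variants typically use
    pop[0] = cand
```

The greedy accept-if-better step mirrors standard ABC practice; the guidance term itself is an assumption standing in for the strategy of Peng et al.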
“…In summary, our scheme significantly outperforms the existing method in almost all aspects. Therefore, our scheme may be applied in many scenarios with range query, such as cloud computing [37], [38], recommendation system [39], real-time graph stream [40], ciphertext query [41], [42], privacy-preserving for location-based services [43], continuous optimization [44], energy scheduling and optimization [45], etc.…”
Section: Comparison of Search Efficiency
confidence: 99%