2019
DOI: 10.1109/access.2019.2937136
Firefly Algorithm With Luciferase Inhibition Mechanism

Abstract: The firefly algorithm (FA) was proposed by Yang, inspired by the way fireflies communicate through flashes. As an efficient swarm intelligence algorithm, FA has been successfully applied to real-world applications. However, FA employs a full attraction model in which fireflies are selected sequentially, so in the worst case each firefly is attracted to every other firefly. This makes FA prone to falling into local optima and leads to high computational complexity. Inspired by the principle o…
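The abstract's point about the full attraction model is easiest to see in code. Below is a minimal NumPy sketch of the standard FA update for a minimization problem; the parameter values, the sphere test function, and the name firefly_step are illustrative choices, not taken from the paper. The nested loop is where each firefly may be pulled toward every brighter firefly, which gives the quadratic attraction cost and over-attraction behaviour the paper targets.

import numpy as np

def firefly_step(pop, fitness, alpha=0.2, beta0=1.0, gamma=1.0, bounds=(-5.0, 5.0)):
    """One full-attraction FA iteration for a minimization problem (illustrative sketch)."""
    n, dim = pop.shape
    new_pop = pop.copy()
    for i in range(n):
        for j in range(n):
            if fitness[j] < fitness[i]:  # firefly j is brighter, so firefly i moves toward it
                r2 = np.sum((new_pop[i] - pop[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)           # attractiveness decays with squared distance
                step = alpha * (np.random.rand(dim) - 0.5)   # random-walk component
                new_pop[i] = new_pop[i] + beta * (pop[j] - new_pop[i]) + step
        new_pop[i] = np.clip(new_pop[i], *bounds)
    return new_pop

# Toy usage on the sphere function: in the worst case every firefly is attracted
# to every other firefly, the O(N^2) behaviour the abstract points out.
rng = np.random.default_rng(0)
pop = rng.uniform(-5.0, 5.0, size=(20, 10))
for _ in range(50):
    fitness = np.sum(pop ** 2, axis=1)
    pop = firefly_step(pop, fitness)
print(np.min(np.sum(pop ** 2, axis=1)))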

Cited by 12 publications (4 citation statements)
References 51 publications
“…(1) Begin
(2) /* Build classification model based on DMSDL-QBSA-RF */
(3) Initialize the positions of N birds using equations (16) and (17): X_i (i = 1, 2, ..., N);
(4) Calculate fitness: f(X_i) (i = 1, 2, ..., N); set X_i to be P_i and find P_gbest;
(5) While iter < iter_max + 1 do
(6)   For i = 1 : n_tree
(7)     Give each tree a training set of size N by random sampling with replacement based on Bootstrap;
(8)     Select m_try attributes randomly at each leaf node, compare the attributes, and select the best one;
(9)     Recursively generate each decision tree without pruning operations;
(10)  End For
(11)  Update classification accuracy of RF: Evaluate f(X_i);
(12)  Update gbest and P_gbest;
(13)  [n_best, m_best] = gbest;
(14)  iter = iter + 1;
(15) End While
(16) /* Classify using RF model */
(17) For i = 1 : n_best
(18)   Give each tree a training set of size N by random sampling with replacement based on Bootstrap;
(19)   Select m_best attributes randomly at each leaf node, compare the attributes, and select the best one;
(20)   Recursively generate each decision tree without pruning operations;
(21) End
(22) Return DMSDL-QBSA-RF classification model
(23) Classify the test dataset using equation (20);
(24) Calculate OOB error;
(25) End
ALGORITHM 2: DMSDL-QBSA-RF classification model.…”
Section: Practical Application (mentioning)
confidence: 99%
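For orientation, the quoted pseudocode couples a swarm optimizer (DMSDL-QBSA) with a random forest whose tree count (n_tree) and per-split attribute count (m_try) are set from the swarm's best position. The sketch below, assuming scikit-learn's RandomForestClassifier, mirrors only that fitness-and-tune structure: the actual DMSDL-QBSA position updates (equations (16) and (17)) are not given here, so they are stood in for by random candidate sampling, and helper names such as rf_fitness and tune_rf are hypothetical.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def rf_fitness(position, X, y):
    """Fitness of one candidate: cross-validated accuracy of a random forest whose
    tree count and per-split feature count are decoded from the candidate position."""
    n_tree = max(int(round(position[0])), 1)
    m_try = min(max(int(round(position[1])), 1), X.shape[1])
    clf = RandomForestClassifier(n_estimators=n_tree, max_features=m_try,
                                 bootstrap=True)  # bagging, as in steps (7) and (18)
    return cross_val_score(clf, X, y, cv=3).mean()

def tune_rf(X, y, n_candidates=5, iters=5, n_tree_range=(10, 200)):
    """Stand-in for the swarm loop in steps (5)-(15): propose (n_tree, m_try) pairs,
    keep the best-scoring one, then refit a final forest with OOB scoring enabled."""
    rng = np.random.default_rng(0)
    best_pos, best_fit = None, -np.inf
    for _ in range(iters):
        for _ in range(n_candidates):
            pos = np.array([rng.uniform(*n_tree_range), rng.uniform(1, X.shape[1])])
            fit = rf_fitness(pos, X, y)
            if fit > best_fit:
                best_pos, best_fit = pos, fit
    n_best = max(int(round(best_pos[0])), 1)
    m_best = min(max(int(round(best_pos[1])), 1), X.shape[1])
    final = RandomForestClassifier(n_estimators=n_best, max_features=m_best,
                                   bootstrap=True, oob_score=True).fit(X, y)
    return final, 1.0 - final.oob_score_  # fitted model and OOB error, as in step (24)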
“…Liu et al. [16] presented a multi-strategy brain storm optimization (BSO) with dynamic parameter adjustment that is more competitive than other related algorithms. Peng et al. [17] proposed an FA with a luciferase inhibition mechanism to improve the effectiveness of firefly selection. The simulation results showed that the proposed approach achieves the best performance on some complex functions.…”
Section: Introduction (mentioning)
confidence: 99%
“…To verify the effectiveness of the proposed CoFA-BCR, the original FA [50] and two other state-of-the-art FA variants (i.e., RaFA [51] and LiFA [52]) are employed for comparison; these competitors replace only the CoFA component, while the other parts remain the same as in CoFA-BCR. Tables 2 to 4 show the experimental results of all algorithms on the different datasets.…”
Section: Experimental Study (mentioning)
confidence: 99%
“…The poor performance of the algorithm in solving complex optimization problems was thereby improved. Inspired by the firefly emission principle, Hu Peng et al. proposed an improved FA with a luciferase inhibition mechanism (LiFA) [25], which reduces the computational complexity of the algorithm. In 2017, Jiaxu Ning et al. proposed an improved ACO with hybrid strengthened pheromone updating and smoothing mechanisms (ACOH) [26], which improved the global search capability and convergence rate of the algorithm to a certain extent.…”
Section: Introduction (mentioning)
confidence: 99%