2019
DOI: 10.1016/j.asoc.2019.105576

JayaX: Jaya algorithm with xor operator for binary optimization


Cited by 83 publications (35 citation statements)
References 47 publications
“…(1) Input: the data set after feature extraction
(2) Output: best feature subset
(3) Get the training set and test set after the features have been extracted;
(4) Initialize the population: initial positions, velocities and fitness values of the particles, as well as the local optimum pbest and global optimum gbest;
(5) for k = 1 → maxIterations do
(6)   for i = 1 → swarmSize do
(7)     Update the inertia weight using equation (10);
(8)     for j = 1 → dimension do
(9)       Update a and p;
(10)      Update A and C using equations (19) and (20), respectively;
(11)      Update ϖt1 and ϖt2 using equations (13) and (14), respectively;
(12)      Update the particle velocity using equation (11);
(13)      Update the particle position using equation (18);
(14)    end for
(15)  end for
(16)  for i = 1 → swarmSize do
(17)    Update the fitness value fitness_i;
(18)    Update the local optimum pbest;
(19)  end for
(20)  Update the global optimum gbest;
(21) end for
(22) Get the optimal feature subset selected by NBPSOSEE;
(23) Delete a feature f from the current feature subset using equation (26);
(24) Update the optimal feature subset using equation (27);
(25) Repeat steps 23 and 24 until the termination condition is met;
ALGORITHM 2: NBPSOSEE-SBS feature selection.
The logistic regression model is trained on the training set with the optimal feature subset selected by the NBPSOSEE-SBS algorithm.…”
Section: Results (mentioning)
confidence: 99%
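The excerpt above gives only the high-level loop of the citing paper's NBPSOSEE-SBS procedure; its specific update equations (10)-(27) are not reproduced here. As rough orientation, the following is a minimal Python sketch of a generic sigmoid-transfer binary PSO wrapper for feature selection, assuming a hypothetical logistic-regression fitness and a placeholder inertia-weight schedule; it is not the paper's exact method and it omits the SBS refinement of steps 23-25.

import numpy as np
from sklearn.linear_model import LogisticRegression

def fitness(mask, X_tr, y_tr, X_te, y_te):
    # Hypothetical wrapper fitness: test accuracy of a logistic regression
    # trained on the selected feature columns only.
    if not mask.any():
        return 0.0
    clf = LogisticRegression(max_iter=1000).fit(X_tr[:, mask], y_tr)
    return clf.score(X_te[:, mask], y_te)

def binary_pso_select(X_tr, y_tr, X_te, y_te, swarm=20, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    dim = X_tr.shape[1]
    pos = rng.integers(0, 2, (swarm, dim)).astype(bool)   # bit = feature kept
    vel = rng.uniform(-1.0, 1.0, (swarm, dim))
    fit = np.array([fitness(p, X_tr, y_tr, X_te, y_te) for p in pos])
    pbest, pbest_fit = pos.copy(), fit.copy()
    gbest = pbest[pbest_fit.argmax()].copy()
    for k in range(iters):
        w = 0.9 - 0.5 * k / iters                         # placeholder inertia schedule
        for i in range(swarm):
            r1, r2 = rng.random(dim), rng.random(dim)
            vel[i] = (w * vel[i]
                      + 2.0 * r1 * (pbest[i].astype(float) - pos[i])
                      + 2.0 * r2 * (gbest.astype(float) - pos[i]))
            # Sigmoid transfer function turns velocity into a bit probability.
            pos[i] = rng.random(dim) < 1.0 / (1.0 + np.exp(-vel[i]))
        fit = np.array([fitness(p, X_tr, y_tr, X_te, y_te) for p in pos])
        better = fit > pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        gbest = pbest[pbest_fit.argmax()].copy()
    return gbest                                          # boolean feature mask

The final logistic-regression step in the quoted passage corresponds to retraining the model on the returned feature mask.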
“…Yang et al. [23] proposed a novel binary Jaya optimization algorithm, integrated with the lambda-iteration method to transform the dual objectives of economy and emission commitment into a single-objective problem. Aslan et al. [24] proposed the JayaX binary optimization algorithm, which replaces Jaya's solution-update rule with an XOR operator; compared with state-of-the-art algorithms, it produces better-quality results on binary optimization problems. The whale optimization algorithm (WOA) [25,26] was proposed with a wrapper-based method to reach the optimal subset of features and effectively improve classification accuracy.…”
Section: Introduction (mentioning)
confidence: 99%
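Since the indexed JayaX paper is only summarized in the excerpt above, the sketch below is an illustrative, hedged reading of an XOR-driven binary Jaya-style loop in Python: it keeps Jaya's "move toward the best, away from the worst" idea and its greedy replacement, but the bit-flip rule shown is a stand-in, not the exact JayaX update from Aslan et al. [24].

import numpy as np

def xor_step(x, best, worst, rng, p_best=0.5, p_worst=0.25):
    # Illustrative XOR-based move (stand-in for the JayaX rule): copy some
    # bits where x disagrees with the best solution, and flip some bits
    # where x agrees with the worst solution.
    toward_best = (x ^ best) & (rng.random(x.size) < p_best)
    away_worst = ~(x ^ worst) & (rng.random(x.size) < p_worst)
    return x ^ toward_best ^ away_worst

def binary_jaya(objective, dim, pop_size=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, (pop_size, dim)).astype(bool)
    fit = np.array([objective(p) for p in pop])
    for _ in range(iters):
        best = pop[fit.argmin()].copy()        # minimization
        worst = pop[fit.argmax()].copy()
        for i in range(pop_size):
            cand = xor_step(pop[i], best, worst, rng)
            f = objective(cand)
            if f < fit[i]:                     # greedy selection, as in Jaya
                pop[i], fit[i] = cand, f
    return pop[fit.argmin()], fit.min()

# Toy usage: minimize the number of zero bits (a OneMax-style objective).
best_mask, best_cost = binary_jaya(lambda b: int((~b).sum()), dim=50)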
“…In this study, the shuffled frog leaping algorithm and the tree seed algorithm, which are continuous optimization methods, are proposed. Aslan et al. [29] describe a discrete optimization method in detail. All algorithms run under the same conditions, and the stopping criterion is the number of function evaluations (FEs).…”
Section: Proposed Algorithms (mentioning)
confidence: 99%
“…Some optimization-based local search algorithms are the Tabu Search (TS) algorithm [34] and the Simulated Annealing (SA) algorithm [35]. Recently, optimization-based metaheuristic algorithms have been applied to many different optimization problems because they are problem-independent, have a simple structure, and are easily adaptable to any optimization problem [36]. Some of these metaheuristic algorithms used for solving data clustering problems are the Genetic Algorithm (GA) [37,38], Teacher Learning Based Optimization (TLBO) [39], Ant Colony Optimization (ACO) [40], Artificial Bee Colony (ABC) [3,41], Gravitational Search Algorithm (GSA) [42], Particle Swarm Optimization (PSO) [43], Grey Wolf Optimizer (GWO) [4], and Cuckoo Search (CS) [44,45] algorithms.…”
Section: Introduction (mentioning)
confidence: 99%