2015 · DOI: 10.1007/s00500-015-1820-4
A dynamic multi-objective evolutionary algorithm using a change severity-based adaptive population management strategy

Abstract: In addition to the need to simultaneously optimize several competing objectives, many real-world problems are also dynamic in nature. Such problems are called dynamic multi-objective optimization problems (DMOPs). Applying evolutionary algorithms to dynamic optimization problems has attracted considerable attention among researchers; however, most existing work is restricted to the single-objective case. In this work, we propose an adaptive hybrid population management strategy using memory, local search and random…

Cited by 119 publications (43 citation statements)
References 39 publications
“…The helpful information offered by the archive can assist in handling neighboring sub-problems through cooperation. Azzouz et al. proposed an adaptive strategy for managing hybrid populations with memory, local-search and random strategies to effectively tackle DMOPs, which guarantees rapid convergence and good diversity [45]. Koo et al. proposed a selective memory technique, which selects a partial retrieval based on diversity in the decision space to maintain effective memories [46].…”
Section: Dynamic Multi-objective Optimization Algorithms
confidence: 99%
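As a rough illustration of such a hybrid response to an environment change, the following sketch rebuilds a population from three sources: stored memory solutions, local perturbations of them, and fresh random solutions. The function names and the 50/30/20 split are assumptions for the example, not the exact scheme of [45].

```python
import random

# Illustrative sketch (not the authors' exact algorithm): after a change,
# rebuild the population from memory solutions, local-search-style
# perturbations of them, and fresh random solutions.

def rebuild(memory, pop_size, lower=0.0, upper=1.0, step=0.05):
    n_mem = pop_size // 2                 # reuse stored solutions as-is
    n_ls = int(0.3 * pop_size)            # perturb some of them locally
    n_rand = pop_size - n_mem - n_ls      # inject random diversity
    reused = memory[:n_mem]
    perturbed = [min(upper, max(lower, x + random.uniform(-step, step)))
                 for x in memory[:n_ls]]
    fresh = [random.uniform(lower, upper) for _ in range(n_rand)]
    return reused + perturbed + fresh

random.seed(0)
memory = [0.5] * 20                       # toy 1-D "memory" of past optima
pop = rebuild(memory, pop_size=20)
```

The memory portion preserves convergence, the perturbed portion acts as a cheap local search around past optima, and the random portion guards diversity when the change is large.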
“…slow convergence and poor diversity when the environment changes. As a result, the authors in [1] proposed an adaptive hybrid population management strategy using memory, local-search and random strategies to effectively handle environmental dynamicity in DMOPs. The special feature of this algorithm is that it can adjust the number of memory and random solutions according to the change severity.…”
Section: Related Work
confidence: 99%
“…It has been shown that when the optimal solution repeatedly returns to a previous position, or when the environment changes periodically, this algorithm saves computing time and biases the search process, making it very efficient. In [15], Azzouz et al. proposed an adaptive hybrid population management strategy based on a technique that measures the severity of environmental changes and, according to that measure, adjusts the number of memory, local-search (LS) and random solutions.…”
Section: Related Work
confidence: 99%
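A common way to quantify change severity, consistent with the statements above, is to re-evaluate a set of stored "sentinel" solutions after a change and average how much their objective values shifted. This is a minimal sketch under assumed function names, not the paper's exact measure.

```python
import random

# Hypothetical sketch: change severity as the mean absolute shift in
# objective values of sentinel solutions across an environment change.

def objective_before(x):
    # old environment: minimize distance to 0.25
    return abs(x - 0.25)

def objective_after(x):
    # new environment: the optimum has shifted to 0.75
    return abs(x - 0.75)

def change_severity(sentinels, f_old, f_new):
    """Average absolute shift in objective values over the sentinels."""
    return sum(abs(f_new(x) - f_old(x)) for x in sentinels) / len(sentinels)

random.seed(0)
sentinels = [random.random() for _ in range(20)]
severity = change_severity(sentinels, objective_before, objective_after)
```

A small `severity` suggests reusing mostly memory solutions, while a large one suggests injecting more random solutions, which is the adjustment the cited strategy performs.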
“…
Input: the dynamic multi-objective optimization function F(X)
Output: POSs, the POSs of F(X)
1:  Randomly initialize a population Pop_0
2:  POS_0 = DMOEA(Pop_0)
3:  POSs = POS_0
4:  Train an SVM classifier SC_S using P_g ∈ POS_0 and N_g ∉ POS_0
5:  Randomly generate solutions {xy_1, …, xy_p} of the function F(X)_1
6:  if xy_i passes the recognition of the SVM SC_S then
7:      put xy_i into Pop_1
8:  end
9:  POS_1 = DMOEA(Pop_1)
10: POS_t = POS_1
11: for t = 1 to n do
12:     PSAMPLES_t = POS_t
13:     train SC_S using P_g ∈ PSAMPLES_t and N_g ∈ NSAMPLES_t
14:     randomly generate solutions {xy_1, …, xy_p} of the function F(X)_{t+1}
15:     if xy_i passes the recognition of the SVM SC_S then
16:         put xy_i into Pop_{t+1}
17:     end
18:     POS_{t+1} = DMOEA(Pop_{t+1})
19:     POSs = POSs ∪ POS_{t+1}
20: end
21: return POSs…”
Section: Empirical Study
confidence: 99%
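The quoted listing trains a classifier on solutions inside versus outside the previous POS and keeps only randomly generated candidates the classifier accepts as seeds for the next population. The sketch below illustrates that filtering step; a nearest-centroid rule stands in for the SVM SC_S (an assumption made to keep the example dependency-free), and all names are illustrative.

```python
import random

# Dependency-free sketch of classifier-filtered population seeding.
# A nearest-centroid rule substitutes for the SVM in the listing above.

def train_centroids(positives, negatives):
    """'Train' by computing the mean of each class (stand-in for SVM fit)."""
    pos_c = sum(positives) / len(positives)
    neg_c = sum(negatives) / len(negatives)
    return pos_c, neg_c

def accepts(x, pos_c, neg_c):
    """Candidate is 'positive' if it lies closer to the positive centroid."""
    return abs(x - pos_c) < abs(x - neg_c)

random.seed(2)
positives = [0.8 + 0.05 * random.random() for _ in range(10)]  # near old POS
negatives = [0.1 + 0.05 * random.random() for _ in range(10)]  # far from it
pos_c, neg_c = train_centroids(positives, negatives)

candidates = [random.random() for _ in range(100)]
next_pop = [x for x in candidates if accepts(x, pos_c, neg_c)]
```

Only candidates predicted to resemble the previous POS survive the filter, so the next population starts near promising regions rather than from scratch, which is the intent of lines 14–16 of the listing.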