2020
DOI: 10.1016/j.ijleo.2020.164978
Optimization of five-parameter BRDF model based on hybrid GA-PSO algorithm

Cited by 30 publications (9 citation statements). References 5 publications.
“…Owing to the combination of MOPSO and GDM, MOPSO-GDM achieves best performance in feature selection compared with NSGA, MOPSO and another improved PSO: MOPSO-M algorithms, especially in high-dimensional space optimization ( Figure 7 ). From past studies, hybrid algorithms are commonly used as improved algorithms, such as GA-PSO (Gupta et al, 2019 ; Liu et al, 2020 ), BAPSO (Almadhor et al, 2021 ), PSO-GWO (Dahmani and Yebdri, 2020 ), and hybrid multi-objective optimization algorithms: NSGA-MOPSO (Shuaipeng et al, 2017 ; Xie et al, 2022 ). Nevertheless, most hybrid algorithms directly combined the whole of two algorithms, without considering the complexity of the algorithm (Wang et al, 2022 ).…”
Section: Discussion (mentioning)
confidence: 99%
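The GA-PSO hybridization this statement refers to can be sketched minimally as follows. This is an illustrative sketch only, not the method of the indexed paper or of any cited work: every function name, parameter value, and the toy objective are assumptions. The idea is simply that each generation runs a standard PSO velocity/position update and then applies GA-style crossover and mutation to the same swarm.

```python
import random

def sphere(x):
    """Toy objective to minimize: sum of squares (an assumption, not the BRDF fit)."""
    return sum(v * v for v in x)

def ga_pso(obj, dim=5, swarm=20, iters=200, w=0.7, c1=1.5, c2=1.5,
           pc=0.5, pm=0.05, seed=0):
    """Minimal GA-PSO hybrid: PSO update, then GA crossover and mutation."""
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    V = [[0.0] * dim for _ in range(swarm)]
    pbest = [x[:] for x in X]
    pval = [obj(x) for x in X]
    g = min(range(swarm), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]

    for _ in range(iters):
        # PSO step: inertia + cognitive (personal-best) + social (global-best) terms.
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
        # GA step: arithmetic crossover between adjacent pairs of particles.
        for i in range(0, swarm - 1, 2):
            if rng.random() < pc:
                a = rng.random()
                for d in range(dim):
                    xi, xj = X[i][d], X[i + 1][d]
                    X[i][d] = a * xi + (1 - a) * xj
                    X[i + 1][d] = (1 - a) * xi + a * xj
        # GA step: Gaussian mutation on individual genes.
        for i in range(swarm):
            for d in range(dim):
                if rng.random() < pm:
                    X[i][d] += rng.gauss(0, 0.5)
        # Update personal and global bests (bests only ever improve).
        for i in range(swarm):
            f = obj(X[i])
            if f < pval[i]:
                pbest[i], pval[i] = X[i][:], f
                if f < gval:
                    gbest, gval = X[i][:], f
    return gbest, gval

best, val = ga_pso(sphere)
```

Note how this directly combines the whole of both algorithms in sequence, which is exactly the design pattern the quoted statement criticizes for ignoring algorithmic complexity.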
“…In this study, to overcome the lack of exploitation ability in the genetic algorithm (GA), slow convergence, premature convergence and the tendency to fall into the local optimal solution in particle swarm optimization (PSO) [ 46 , 47 , 48 ], a novel combined method, AsyLnCPSO-GA, was presented and introduced to select the optimal feature combination, then fed to the naïve Bayesian classifier. Owing to the combination of AsyLnCPSO and GA, AsyLnCPSO-GA achieved best performance in feature selection compared with PSO and the improved PSO–AsyLnCPSO ( Figure 7 , Figure 8 , Figure 9 and Figure 10 ) algorithms.…”
Section: Discussion (mentioning)
confidence: 99%
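The feature-selection setting described in this statement (a swarm searching over feature subsets, with the chosen subset scored by a classifier) can be illustrated with a minimal binary PSO. This is a toy sketch under heavy assumptions: the real AsyLnCPSO-GA uses asymmetric learning factors and a GA stage and scores subsets with a naïve Bayesian classifier, whereas the synthetic fitness function here is invented purely for illustration.

```python
import math
import random

def fitness(mask, informative=(0, 2, 4)):
    """Toy subset score (an assumption, not the paper's classifier): reward
    selecting the 'informative' features, lightly penalize subset size."""
    hits = sum(1 for i in informative if mask[i])
    return hits - 0.1 * sum(mask)

def binary_pso(n_feat=8, swarm=15, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Binary PSO over 0/1 feature masks, using a sigmoid transfer function."""
    rng = random.Random(seed)
    X = [[rng.randint(0, 1) for _ in range(n_feat)] for _ in range(swarm)]
    V = [[0.0] * n_feat for _ in range(swarm)]
    pbest = [x[:] for x in X]
    pval = [fitness(x) for x in X]
    g = max(range(swarm), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(n_feat):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                # Sigmoid transfer: larger velocity -> higher chance the bit is 1.
                X[i][d] = 1 if rng.random() < 1 / (1 + math.exp(-V[i][d])) else 0
            f = fitness(X[i])
            if f > pval[i]:
                pbest[i], pval[i] = X[i][:], f
                if f > gval:
                    gbest, gval = X[i][:], f
    return gbest, gval

mask, score = binary_pso()
```

In a real pipeline the returned mask would index into the dataset's columns, and the fitness would be cross-validated classifier accuracy rather than this synthetic score.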
“…PSO is a stochastic population-based algorithm that is easier to implement than GA, as this latter requires excessive effort to tune a large number of parameters. Additionally, PSO exploits a larger search space than GA [ 63 ]. Moreover, the particles in PSO have a memory term which is important for the algorithm.…”
Section: Algorithms for Facility Location in VFC (mentioning)
confidence: 99%
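The "memory term" this statement mentions is the cognitive component of the standard PSO velocity update, in which each particle remembers its own best-visited position. In the usual notation (symbols here follow common PSO convention, not any specific cited paper):

```latex
v_i^{t+1} = w\,v_i^{t}
          + c_1 r_1 \left( p_i - x_i^{t} \right)   % cognitive term: particle i's memory of its personal best p_i
          + c_2 r_2 \left( g - x_i^{t} \right)     % social term: attraction to the global best g
```

where $w$ is the inertia weight, $c_1, c_2$ are acceleration coefficients, and $r_1, r_2$ are uniform random numbers in $[0,1]$. GA individuals carry no such per-individual memory between generations, which is the contrast the quoted passage draws.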