“…T, t are the maximum and current iteration steps, respectively. 13,32 It can be concluded that the cosine decreasing weight decreases slowly in the early stage and rapidly in the late stage. The large weight in the early search steps helps accelerate the global convergence speed.…”
“…To solve this problem, a cosine decreasing weight function was added to this formula, as shown in the following equation, where ω max and ω min are the maximum and minimum weights, respectively, and T, t are the maximum and current iteration steps, respectively 13,32 …”
Section: Parameter Optimization Process
confidence: 99%
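The snippet above describes the weight's behavior but the equation itself was dropped during extraction. The sketch below shows one common cosine-decreasing inertia weight that is consistent with the described behavior (maximum weight at the start, minimum at the end, decreasing slowly early and rapidly late); the cited paper's exact formula is not shown in this snippet, so this form is an assumption.

```python
import math

def cosine_decreasing_weight(t, T, w_max=0.9, w_min=0.4):
    """Cosine-decreasing inertia weight (illustrative form only).

    Equals w_max at t = 0 and w_min at t = T. Its rate of decrease
    is proportional to sin(pi*t/(2*T)), so it decreases slowly in
    the early iterations and rapidly in the late ones, matching the
    behavior described in the quoted text.
    """
    return w_min + (w_max - w_min) * math.cos(math.pi * t / (2 * T))
```

For example, with T = 100 the weight loses far more value between iterations 75 and 100 than between 0 and 25, which is exactly the "slow early, fast late" schedule the snippet attributes to the cosine weight.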
“…Sharma introduced the inertia weight into the FA, defining the weight as a function of iteration, that is, a time-varying inertia weight. 13 Zhang and Li respectively put forward the linear decreasing inertia weight and the Gaussian decreasing inertia weight to balance global search ability and local optimization ability. 14,15 Compared with previous hybrid parameter extraction methods, we can see that most parameter extraction methods do not optimize all parameters at the same time; instead, they directly extract the external parameters and optimize the internal parameters with algorithms, so as to avoid the low efficiency and accuracy of a multielement model, with the result that the unoptimized parameters may not reach their ideal values.…”
Section: Introduction
confidence: 99%
“…So, the modified firefly algorithm (MFA) is proposed to avoid these problems. Sharma introduced the inertia weight into the FA, defining the weight as a function of iteration, that is, a time-varying inertia weight 13 . Zhang and Li respectively put forward the linear decreasing inertia weight and the Gaussian decreasing inertia weight to balance global search ability and local optimization ability 14,15 …”
In this article, an improved firefly algorithm is proposed to extract the parameters of the small-signal model of GaN HEMTs on SiC substrates. First, the initial values of the parameters are extracted directly from pinch-off, unbiased cold S-parameters and hot S-parameters. Second, an improved firefly algorithm is proposed to optimize the parameters. Finally, to verify this method, the performance of the traditional algorithm is compared with that of the modified firefly algorithm in convergence speed, execution time, and accuracy, which shows that the proposed method is more suitable for the extraction of high-dimensional parameter models.
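The abstract above describes a two-stage flow: direct extraction of initial parameter values, then algorithmic optimization against measured S-parameters. The paper's exact fitness function is not given in this snippet; the sketch below shows one common choice, a mean relative error between modeled and measured S-parameters, purely as an illustration of what such an optimizer would minimize.

```python
import numpy as np

def s_param_error(model_s, measured_s):
    """Illustrative fitness for small-signal model extraction.

    Mean relative error between modeled and measured (complex)
    S-parameters over all frequency points. The cited paper's exact
    error metric is not shown in the snippet; this is an assumed,
    commonly used form.
    """
    model_s = np.asarray(model_s, dtype=complex)
    measured_s = np.asarray(measured_s, dtype=complex)
    return float(np.mean(np.abs(model_s - measured_s) / np.abs(measured_s)))
```

An optimizer such as the modified firefly algorithm would then search the parameter space for the model whose simulated S-parameters drive this error toward zero.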
“…Subsequently, Yang improved the quality of the FA by introducing chaos into the standard FA and increased its accuracy by dynamically adjusting its parameters [ 15 ]. Sharma introduced the inertia weight into the FA; this strategy can overcome the tendency to fall into local optima and the slow convergence of optimization problems [ 16 ]. Farahani and other scholars proposed a Gaussian distribution FA, which used an adaptive step size and improved the overall position of the firefly population through a Gaussian distribution [ 17 ].…”
The firefly algorithm (FA) is a heuristic algorithm inspired by natural phenomena. The FA has attracted a lot of attention due to its effectiveness in dealing with various global optimization problems. However, it can easily fall into a local optimum or suffer from low accuracy when solving high-dimensional optimization problems. To improve the performance of the FA, this paper adds a self-adaptive logarithmic inertia weight to the updating formula of the FA and introduces a minimum attractiveness for each firefly, which greatly improves the convergence speed and balances the global exploration and local exploitation capabilities of the FA. Additionally, a step-size decreasing factor is introduced to dynamically adjust the random step-size term. When the search dimension is high, the random step size becomes very small. This strategy enables the FA to explore solutions more accurately. This improved FA (LWFA) was evaluated on ten benchmark test functions under different dimensions (D = 10, 30, and 100) and on standard IEEE CEC 2010 benchmark functions. Simulation results show that the performance of the improved FA is superior to the standard FA and other algorithms, i.e., particle swarm optimization, the cuckoo search algorithm, the flower pollination algorithm, the sine cosine algorithm, and other modified FAs. The LWFA also shows high performance and optimal efficiency on a number of optimization problems.
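The abstract names three modifications: an inertia weight on the current position, a minimum attractiveness floor, and a step size that shrinks with iteration and dimension. A minimal sketch of one such position update is below; the exact formulas (the paper's logarithmic weight and decreasing factor) are not given in this snippet, so the specific expressions used here are assumptions chosen only to exhibit the described structure.

```python
import math
import random

def lwfa_move(x_i, x_j, t, T, beta0=1.0, beta_min=0.2, gamma=1.0, alpha=0.5):
    """One firefly position update in the spirit of the described LWFA.

    Structure (all concrete formulas are illustrative assumptions):
      - inertia weight w on the current position, decreasing from 1 to 0
        via a logarithmic schedule;
      - attractiveness with a floor beta_min, so distant fireflies still
        attract each other;
      - random step scaled down by iteration progress and dimension d.
    """
    d = len(x_i)
    r2 = sum((a - b) ** 2 for a, b in zip(x_i, x_j))
    beta = beta_min + (beta0 - beta_min) * math.exp(-gamma * r2)
    w = 1.0 - math.log(1.0 + (math.e - 1.0) * t / T)  # 1 at t=0, 0 at t=T
    step = alpha * (1.0 - t / T) / d                  # shrinks with t and d
    return [w * a + beta * (b - a) + step * (random.random() - 0.5)
            for a, b in zip(x_i, x_j)]
```

With `beta_min > 0`, the attraction term never vanishes even when `r2` is large, which is one way to read the "minimum attractiveness" idea; dividing the step by `d` reflects the claim that the random step becomes very small in high dimensions.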
Intelligent optimization algorithms based on swarm principles have been widely researched in recent times. The Firefly Algorithm (FA) is an intelligent swarm algorithm for global optimization problems. In the literature, the FA has been seen as one of the most efficient and robust optimization algorithms. However, the solution search space used in the FA is insufficient, and its strategy for generating candidate solutions yields good exploration ability but poor exploitation performance. Although there are many modifications and hybridizations of the FA with other optimization algorithms, there is still room for improvement. Therefore, in this paper, we first propose a modification of the FA by introducing a stepping-ahead parameter. Second, we design a hybrid of the modified FA with the Covariance Matrix Adaptation Evolution Strategy (CMAES) to improve exploitation while retaining good exploration. Traditionally, hybridization has meant combining two algorithms in terms of structure only, without taking preference into account. To address this, preference in terms of the user and the problem (time complexity) is considered: CMAES is used within the FA's loop to avoid extra computation time. This way, the structure of the algorithm and the strength of the individual solutions are both exploited. In this paper, the FA is modified first and later combined with CMAES to solve selected global optimization benchmark problems. The effectiveness of the new hybridization is shown through performance analysis.
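The key structural point of this abstract is that the local optimizer runs inside the FA loop rather than as a separate post-processing phase. The sketch below shows only that control-flow pattern; the FA move is heavily simplified, and `local_refine` is a hypothetical stand-in for a CMAES refinement step (a real CMAES implementation is well beyond this snippet's scope).

```python
import random

def hybrid_fa_local(objective, population, n_iter, local_refine):
    """Structural sketch of an FA loop with embedded local refinement.

    `local_refine(objective, x)` stands in for a CMAES-style
    exploitation step applied to the current best inside each FA
    iteration, as the abstract describes. The simplified move below
    (half-step toward the best plus small noise) is illustrative only.
    """
    best = min(population, key=objective)
    for t in range(n_iter):
        for i, x in enumerate(population):
            # Simplified attraction toward the current best.
            population[i] = [a + 0.5 * (b - a) + 0.01 * (random.random() - 0.5)
                             for a, b in zip(x, best)]
        best = min(population + [best], key=objective)
        best = local_refine(objective, best)  # exploitation inside the loop
    return best
```

Because the best-so-far is carried along and only ever replaced by something at least as good (before refinement), the returned solution's objective value never exceeds that of the initial best, regardless of the noise.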
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations–citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.