2018
DOI: 10.1155/2018/6381610
STATCOM Estimation Using Back-Propagation, PSO, Shuffled Frog Leap Algorithm, and Genetic Algorithm Based Neural Networks

Abstract: Different optimization techniques are used for the training and fine-tuning of feed-forward neural networks for the estimation of STATCOM voltages and reactive powers. In the first part, the paper presents voltage regulation in IEEE buses using the Static Compensator (STATCOM) and discusses efficient ways to solve power systems featuring STATCOM via load flow equations. The load flow equations are solved using iterative algorithms such as the Newton-Raphson method. In the second part, the paper focuses on t…
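For context, the Newton-Raphson load-flow iteration named in the abstract takes the standard textbook form below (a generic statement, not reproduced from the paper), where x collects the bus voltage magnitudes and angles, F(x) = 0 are the power mismatch equations, and J is the Jacobian of F:

x^{(k+1)} = x^{(k)} - \left[ J\!\left(x^{(k)}\right) \right]^{-1} F\!\left(x^{(k)}\right)

The iteration is repeated until the mismatch norm \|F(x^{(k)})\| falls below a chosen tolerance.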


Citation statements
Cited by 19 publications (12 citation statements) | References 56 publications
“…To counteract the aforementioned limitation of gradient-based algorithms, metaheuristics have been employed as global search methods to improve the ANN training phase. Various metaheuristic algorithms, such as cuckoo search optimization [41], bat optimization [42], monarch butterfly optimization [43], the shuffled frog leap algorithm [44], the kidney-inspired algorithm [45], and an improved particle swarm optimization [46], have recently been proposed and investigated. Previous studies show improved performance of metaheuristic-assisted models compared to traditional models.…”
Section: Introduction
confidence: 99%
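To make the "metaheuristic-assisted training" idea in the excerpt above concrete, here is a minimal Python sketch in which PSO searches the weight space of a small feed-forward network in place of back-propagation. The network shape, swarm parameters, and toy data are illustrative assumptions, not taken from the cited paper:

import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative; the paper itself estimates STATCOM voltages/reactive power).
X = rng.uniform(-1, 1, size=(64, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

N_HIDDEN = 8
DIM = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1   # W1 + b1 + W2 + b2

def forward(w, X):
    # Decode a flat weight vector and run the 2-8-1 network.
    i = 0
    W1 = w[i:i + 2 * N_HIDDEN].reshape(2, N_HIDDEN); i += 2 * N_HIDDEN
    b1 = w[i:i + N_HIDDEN]; i += N_HIDDEN
    W2 = w[i:i + N_HIDDEN]; i += N_HIDDEN
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def loss(w):
    # Mean squared error of the network defined by weight vector w.
    return np.mean((forward(w, X) - y) ** 2)

# Standard PSO over the weight space: each particle is one candidate weight vector.
n_particles, iters = 30, 200
w_inertia, c1, c2 = 0.7, 1.5, 1.5
pos = rng.uniform(-1, 1, size=(n_particles, DIM))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([loss(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
    vel = w_inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([loss(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("best training MSE:", pbest_f.min())

Because the fitness function only evaluates the network (no gradients), the same loop structure accommodates any of the metaheuristics listed above by swapping the position-update step.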
“…On the other hand, the BP method is the standard training method for ANNs; however, it is prone to getting stuck in local minima and also exhibits a slower convergence rate toward the optimum solution [38]. PSO performs better than BP and GA in terms of convergence rate [39,40]. According to the above results, we compared our proposed "CLAHE+MGLCM+PSONN" method with the state-of-the-art approaches: NBC [9], WN+SVM [10], ELM [11], and CLAHE+ELM.…”
Section: Discussion
confidence: 94%
“…Instead, PSO uses a simple formula to update the position of each particle. On the other hand, the BP method is the standard training method for ANNs; however, it is prone to getting stuck in local minima and also exhibits a slower convergence rate toward the optimum solution. PSO performs better than BP and GA in terms of convergence rate.…”
Section: Discussion
confidence: 99%
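The "simple formula" this excerpt refers to is, in the standard PSO formulation (a textbook statement, not quoted from the paper):

v_i^{(k+1)} = w\, v_i^{(k)} + c_1 r_1 \left( p_i - x_i^{(k)} \right) + c_2 r_2 \left( g - x_i^{(k)} \right), \qquad x_i^{(k+1)} = x_i^{(k)} + v_i^{(k+1)}

where w is the inertia weight, c_1 and c_2 are acceleration coefficients, r_1 and r_2 are uniform random numbers in [0, 1], p_i is particle i's personal best position, and g is the swarm's global best.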
“…After the pre-training stage, the resulting network is fine-tuned using the BP algorithm to reach the global optimum [20]. We take the sum of squared errors of the outputs as the loss function, defined as…”
Section: B. DBN Training
confidence: 99%
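The excerpt is truncated before the formula; the sum-of-squared-errors loss it names is conventionally written as follows (a standard form, not recovered from the citing paper), where y_k^{(n)} is the k-th network output for sample n and t_k^{(n)} is the corresponding target:

E = \frac{1}{2} \sum_{n=1}^{N} \sum_{k} \left( y_k^{(n)} - t_k^{(n)} \right)^2

The factor 1/2 is a common convention that cancels when differentiating for back-propagation.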