2022
DOI: 10.1007/s44196-022-00156-8
An Improved Sea Lion Optimization for Workload Elasticity Prediction with Neural Networks

Abstract: The work in this paper presents a study into nature-inspired optimization applied to workload elasticity prediction using neural networks. Currently, the trend is for proactive decision support in increasing or decreasing the available resource in cloud computing. The aim is to avoid overprovision leading to resource waste and to avoid resource under-provisioning. The combination of optimization and neural networks has potential for the performance, accuracy, and stability of the prediction solution. In this c…

Cited by 8 publications (4 citation statements)
References 67 publications (79 reference statements)
“…Learning algorithms are divided into two main groups: local search (LS) and global search algorithms. Backpropagation algorithms (BPs) [3] and Extreme Learning Machines (ELM) [4] belong to the first group, commonly used for weight optimisation, while Evolutionary Algorithms (EAs) [5] belong to the second. This second group is usually referred to as Neuroevolution [6], the application of metaheuristics such as EAs to the evolution of ANNs, also known in the literature as Evolutionary Artificial Neural Networks (EANNs) [7]–[9], in which both the weights and the ANN architecture are optimised.…”
Section: Introduction (mentioning)
confidence: 99%
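The neuroevolution idea in the statement above — optimising ANN weights with an evolutionary algorithm instead of backpropagation — can be sketched minimally. The network layout (2-4-1), population size, and mutation scale below are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a classic non-linearly-separable problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def forward(w, x):
    # Unpack a flat genome into a hypothetical 2-4-1 network.
    W1, b1 = w[:8].reshape(2, 4), w[8:12]
    W2, b2 = w[12:16].reshape(4, 1), w[16]
    h = np.tanh(x @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2).ravel() - b2))

def fitness(w):
    return np.mean((forward(w, X) - y) ** 2)  # lower is better

# (mu + lambda)-style evolution over the weight vector only:
# no gradients, just selection, elitism, and Gaussian mutation.
pop = rng.normal(size=(20, 17))
for gen in range(200):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[:5]]             # keep the best 5
    children = parents[rng.integers(0, 5, 15)] \
        + rng.normal(scale=0.3, size=(15, 17))        # mutated offspring
    pop = np.vstack([parents, children])

best = min(pop, key=fitness)
print(fitness(best))
```

A full EANN approach, as the statement notes, would also evolve the architecture (layer sizes, connectivity), not just the weights as this sketch does.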
“…In this work, the SLO algorithm acts as a hyperparameter optimizer. SLO was proposed to solve global-scale optimization problems [24]. It simulates the hunting behavior of sea lions, including how they use their tails and whiskers to detect, encircle, and capture prey.…”
Section: Hyperparameter Tuning Using SLO Algorithm (mentioning)
confidence: 99%
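A minimal sketch of SLO-style position updates on a toy objective, assuming the commonly cited encircling and dwindling-spiral formulation; the objective function, population size, decay schedule, and 50/50 phase switch are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    return float(np.sum(x ** 2))  # toy objective to minimise

dim, n, iters = 5, 30, 200
pop = rng.uniform(-10, 10, size=(n, dim))

for t in range(iters):
    fit = np.array([sphere(p) for p in pop])
    leader = pop[np.argmin(fit)].copy()   # best sea lion ~ prey position
    c = 2 - 2 * t / iters                 # control factor decays 2 -> 0
    for i in range(n):
        if rng.random() < 0.5:
            # Encircling phase: move relative to the leader; large c
            # explores, small c contracts onto the leader.
            b = 2 * rng.random(dim)
            dist = np.abs(b * leader - pop[i])
            pop[i] = leader - dist * c
        else:
            # Dwindling-circle attack: oscillate around the leader.
            m = rng.uniform(-1, 1)
            pop[i] = np.abs(leader - pop[i]) * np.cos(2 * np.pi * m) + leader

best = min(pop, key=sphere)
print(sphere(best))
```

Used as a hyperparameter optimizer, as in the cited work, each position vector would encode hyperparameters and `sphere` would be replaced by a validation-loss evaluation.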
“…Once the features are constructed, they are input to an LSTM classifier that assigns the appropriate class. LSTM is a specialized RNN model that retains RNN features while exploiting a sequence of memory cells to improve learning on time series and to handle arbitrary input data [23]. Furthermore, it captures long-term dependencies in the input data and prevents the gradient from vanishing as information propagates, a significant improvement for capturing the dynamic changes of a time sequence.…”
Section: LSTM-Based Classification (mentioning)
confidence: 99%
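The memory-cell mechanism described above can be made concrete with a single LSTM cell step. This is a from-scratch sketch of the standard gate equations with arbitrary small dimensions, not the classifier used in the cited work:

```python
import numpy as np

rng = np.random.default_rng(2)

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step: gates decide what the memory cell forgets,
    writes, and exposes, which is what preserves long-range information."""
    z = W @ x + U @ h + b                    # all four gate pre-activations
    i, f, o, g = np.split(z, 4)
    i, f, o = (1 / (1 + np.exp(-v)) for v in (i, f, o))  # input/forget/output
    c_new = f * c + i * np.tanh(g)           # additive cell update: gradients
    h_new = o * np.tanh(c_new)               # flow through f, easing vanishing
    return h_new, c_new

d_in, d_h = 3, 4
W = rng.normal(scale=0.1, size=(4 * d_h, d_in))
U = rng.normal(scale=0.1, size=(4 * d_h, d_h))
b = np.zeros(4 * d_h)

h, c = np.zeros(d_h), np.zeros(d_h)
for x in rng.normal(size=(10, d_in)):        # run over a 10-step sequence
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, c.shape)
```

For classification, the final hidden state `h` would typically feed a softmax layer; here the weights are random and untrained, so only the mechanics are shown.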