2017
DOI: 10.1007/978-3-319-58253-5_1
Hidden Markov Model Classifier for the Adaptive Particle Swarm Optimization

Cited by 20 publications (9 citation statements)
References 28 publications
“…On the other hand, recent studies have examined how hidden Markov models can improve optimization algorithms. In [75], the authors used inter-particle distances to determine the state of the particle swarm optimizer. The states were inspired by [38,76].…”
Section: Related Work
confidence: 99%
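The swarm "state" mentioned in the excerpt above is typically derived from the evolutionary factor of adaptive PSO, computed from inter-particle distances. A minimal sketch, assuming the standard APSO definition (mean distance of each particle to all others, with d_g the mean distance of the globally best particle); this is an illustration, not the exact procedure of [75]:

```python
import numpy as np

def evolutionary_factor(positions, best_index):
    """Evolutionary factor f in [0, 1] from inter-particle distances.

    positions  : (n_particles, n_dims) array of particle positions
    best_index : index of the globally best particle
    """
    n = len(positions)
    # Mean Euclidean distance of each particle to every other particle.
    diffs = positions[:, None, :] - positions[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=2))
    mean_d = dists.sum(axis=1) / (n - 1)

    d_g, d_min, d_max = mean_d[best_index], mean_d.min(), mean_d.max()
    if d_max == d_min:          # degenerate swarm: all particles coincide
        return 0.0
    return float((d_g - d_min) / (d_max - d_min))
```

A value of f near 1 indicates the best particle is far from the rest (exploration), while f near 0 indicates the swarm has clustered around it (convergence).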
“…Therefore, we apply a discretization process to the evolutionary factor f of each inner phase of the PSO. The discretization process used in this work is defined in [75] and corresponds to identifying the interval to which the calculated evolutionary factor belongs. The seven defined intervals are:…”
Section: HMM-PSO Integration
confidence: 99%
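The discretization step described in this excerpt amounts to binning f into one of seven discrete states. The excerpt truncates before listing the intervals, so the equal-width bins below are purely illustrative placeholders; the actual seven intervals are defined in [75]:

```python
def discretize_f(f, n_intervals=7):
    """Map an evolutionary factor f in [0, 1] to a discrete state index.

    Equal-width bins are a placeholder; [75] defines its own seven intervals.
    """
    if not 0.0 <= f <= 1.0:
        raise ValueError("evolutionary factor must lie in [0, 1]")
    # Clamp so that f == 1.0 falls into the last bin rather than out of range.
    return min(int(f * n_intervals), n_intervals - 1)
```

The resulting state index is what serves as the discrete observation fed to the HMM.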
“…As a result, the RSM is integrated with the FEA to build the mathematical models in the present study. Then the PSO algorithm [36][37][38][39] is utilized. PSO is a multi-objective optimization technique that gives high-precision results and fast convergence for linear and weakly non-linear models.…”
Section: Introduction
confidence: 99%
“…Especially the Hidden Markov Model (HMM) [18]. The success of HMMs is due to their ability to deal with variability by means of stochastic modeling.…”
Section: Introduction
confidence: 99%
“…The main idea is to predict the best cooling-law parameter based on the history of the run. To do that, we first train the HMM by updating its parameters [18]. Then we proceed to a classification step through the Viterbi algorithm [18], which gives the most probable cooling-law parameter.…”
Section: Introduction
confidence: 99%
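The classification step in the last excerpt, recovering the most probable hidden-state sequence with the Viterbi algorithm, can be sketched as follows. The model parameters in the usage example are hypothetical placeholders, not those of the cited work:

```python
import numpy as np

def viterbi(obs, start_p, trans_p, emit_p):
    """Most probable hidden-state sequence for a discrete-emission HMM.

    obs     : sequence of observation indices
    start_p : (n_states,) initial state probabilities
    trans_p : (n_states, n_states) transition probabilities
    emit_p  : (n_states, n_symbols) emission probabilities
    """
    n_states = len(start_p)
    # delta[t, s]: log-probability of the best path ending in state s at time t
    delta = np.full((len(obs), n_states), -np.inf)
    back = np.zeros((len(obs), n_states), dtype=int)

    delta[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for t in range(1, len(obs)):
        for s in range(n_states):
            scores = delta[t - 1] + np.log(trans_p[:, s])
            back[t, s] = int(np.argmax(scores))
            delta[t, s] = scores[back[t, s]] + np.log(emit_p[s, obs[t]])

    # Backtrack from the best final state.
    path = [int(np.argmax(delta[-1]))]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t, path[-1]])
    return path[::-1]
```

In the HMM-based scheme described above, the decoded state sequence is what drives the choice of the cooling-law parameter; parameter estimation itself would use a training procedure such as Baum-Welch [18].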