1996
DOI: 10.1007/3-540-61108-8_52
Optimizing hidden Markov models with a genetic algorithm

Cited by 12 publications (4 citation statements)
References 7 publications
“…Reference [202] uses a hidden Markov model (HMM) classifier based on the HS algorithm for classification of motor imagery ERP signals. The success or failure of the HMM largely depends on the structure and parameters chosen for it [203]; therefore, HS is used to train the HMM. Using non-optimal parameters typically causes over-fitting in the HMM [202].…”
Section: Harmony Search (HS)
confidence: 99%
“…The modeling of MSA is performed by training conventional HMMs, which is a computationally hard problem. Conventional HMMs are generally trained with the Baum-Welch algorithm [45] and, more recently, with a variety of meta-heuristic approaches such as evolutionary algorithms [46-48], simulated annealing [49,50] and PSO [51,52].…”
Section: Hidden Markov Model for Multiple Sequence Alignment
confidence: 99%
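The citation above contrasts Baum-Welch training with meta-heuristic alternatives. Both approaches score a candidate model by the likelihood of the observed sequences, which the forward algorithm computes. As a minimal sketch (not from the paper; a toy two-state, two-symbol model with made-up numbers), the forward likelihood could look like:

```python
import numpy as np

def forward_likelihood(pi, A, B, obs):
    """P(obs | model) via the forward algorithm.

    pi : (N,)  initial state distribution
    A  : (N,N) transitions, A[i, j] = P(next state j | state i)
    B  : (N,M) emissions,   B[i, k] = P(symbol k | state i)
    obs: sequence of symbol indices
    """
    alpha = pi * B[:, obs[0]]          # initialise with the first symbol
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate, then weight by emission
    return alpha.sum()

# Hypothetical parameters, purely for illustration.
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3],
               [0.4, 0.6]])
B  = np.array([[0.9, 0.1],
               [0.2, 0.8]])
print(forward_likelihood(pi, A, B, [0, 1, 0]))  # ≈ 0.10893
```

Baum-Welch iterates expected state counts to re-estimate `A` and `B`; the meta-heuristics cited above instead search the parameter space directly, using this likelihood as the fitness function.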
“…Most of these have been concerned with multiple sequence alignments [1,12,13,17,18,25,27,32,36,37,42,44,50-53,57-59,64,66,68,73], although a number have targeted other representations [20,28,29,33,34,39,40,56,60,63,65,71,72]. Most MSA approaches can be divided into two classes: those which directly evolve alignments e.g.…”
Section: Evolutionary Computation Approaches
confidence: 99%
“…Thomsen [63] and Won et al. [71] have used similar approaches for other biosequence applications. EAs have also been used to evolve HMMs for non-biological sequence classification problems [16,41,60]. In [16], a GA is used to optimise HMM parameter values, and in [41] a GA is used to optimise both topology and parameter values; in both cases this leads to more accurate models than those produced using Baum-Welch.…”
Section: Evolutionary Computation Approaches
confidence: 99%
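The GA-over-HMM-parameters idea in the last citation can be sketched generically. The snippet below is not the cited paper's method: it is a plain truncation-selection GA (an assumption for illustration) that perturbs and renormalises the probability matrices, scoring each candidate by forward likelihood. All names and settings (population size, mutation scale, generation count) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def likelihood(pi, A, B, obs):
    """Forward-algorithm likelihood P(obs | model)."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

def _norm(m):
    """Renormalise so each row is a probability distribution."""
    return m / m.sum(axis=-1, keepdims=True)

def random_model(n_states, n_symbols):
    return (_norm(rng.random(n_states)),
            _norm(rng.random((n_states, n_states))),
            _norm(rng.random((n_states, n_symbols))))

def mutate(model, scale=0.1):
    """Add Gaussian noise, keep entries positive, renormalise rows."""
    return tuple(_norm(np.abs(m + scale * rng.normal(size=m.shape)))
                 for m in model)

def ga_train(obs, n_states=2, n_symbols=2, pop=20, gens=50):
    population = [random_model(n_states, n_symbols) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda m: -likelihood(*m, obs))
        elite = population[:pop // 4]  # truncation selection
        population = elite + [mutate(elite[rng.integers(len(elite))])
                              for _ in range(pop - len(elite))]
    return max(population, key=lambda m: likelihood(*m, obs))

# Fit a toy model to a short alternating sequence.
pi, A, B = ga_train([0, 0, 1, 1, 0, 0, 1, 1])
```

Unlike Baum-Welch, this search makes no use of gradients or expected counts, which is why such GAs can also evolve topology (as in [41]): a candidate only needs to be scoreable, not differentiable.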