2013
DOI: 10.1007/s11222-013-9399-z

MCMC implementation for Bayesian hidden semi-Markov models with illustrative applications

Abstract: Hidden Markov models (HMMs) are flexible, well-established models useful in a diverse range of applications. However, one potential limitation of such models lies in their inability to explicitly structure the holding times of each hidden state. Hidden semi-Markov models (HSMMs) are more useful in the latter respect as they incorporate additional temporal structure by explicit modelling of the holding times. However, HSMMs have generally received less attention in the literature, mainly due to their intensive c…
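To make the distinction in the abstract concrete, here is a minimal simulation sketch (not from the paper; the two-state setup, negative-binomial holding times, and Gaussian emissions are all illustrative assumptions). In an HMM the holding time of each state is implicitly geometric, whereas the HSMM below draws an explicit holding time before each regime switch.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_hsmm(T, trans, holding_sampler, emit_sampler, init_state=0):
    """Simulate T observations from a hidden semi-Markov chain.

    trans           : K x K transition matrix with a zero diagonal, since
                      staying in a state is governed by the holding time.
    holding_sampler : function state -> integer holding time >= 1.
    emit_sampler    : function state -> one observation.
    """
    states, obs = [], []
    state = init_state
    while len(obs) < T:
        d = holding_sampler(state)          # explicit holding time
        for _ in range(d):
            states.append(state)
            obs.append(emit_sampler(state))
            if len(obs) == T:
                break
        state = rng.choice(len(trans), p=trans[state])  # jump to a new state
    return np.array(states), np.array(obs)

# Hypothetical two-state example: negative-binomial holding times and
# Gaussian emissions (an HMM would force the holding times to be geometric).
trans = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
holding = lambda s: 1 + rng.negative_binomial(5, 0.3)
emit = lambda s: rng.normal(loc=(0.0, 3.0)[s], scale=1.0)

states, obs = simulate_hsmm(200, trans, holding, emit)
```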

Cited by 15 publications (22 citation statements)
References 38 publications
“…If both these distributions are geometric, then this is an HMM where the scalar parameters 0 < ϕ_S, ϕ_T < 1 define the 2×2 transition matrix. Note that τ ≠ 0, so self-transitions are not allowed, as this would conflict with the very definition of holding times between well-defined regimes (Economou et al., 2013). This implies that neither M_S nor M_T is a special case of M_ST, which is a point that we return to later.…”
Section: Modelling Framework
confidence: 99%
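A small sketch of the equivalence invoked in this statement, assuming ϕ_S and ϕ_T are the per-step probabilities of leaving regimes S and T (the exact parameterisation in the citing paper may differ, and all numerical values below are hypothetical): geometric holding times collapse the semi-Markov chain to a plain two-state Markov chain whose transition matrix is built directly from ϕ_S and ϕ_T.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical leave-probabilities for the two regimes S and T.
phi_S, phi_T = 0.2, 0.4

# With geometric holding times, the HSMM collapses to an ordinary HMM whose
# 2x2 transition matrix is determined by phi_S and phi_T: the diagonal holds
# the stay-probabilities 1 - phi.
P = np.array([[1 - phi_S, phi_S],
              [phi_T, 1 - phi_T]])

# Empirical check: run lengths of this Markov chain are geometric, with mean
# holding time 1/phi in each state.
chain = [0]
for _ in range(200_000):
    chain.append(int(rng.choice(2, p=P[chain[-1]])))

def run_lengths(chain):
    """Lengths of consecutive runs, grouped by the state of each run."""
    lengths = {0: [], 1: []}
    cur, count = chain[0], 1
    for s in chain[1:]:
        if s == cur:
            count += 1
        else:
            lengths[cur].append(count)
            cur, count = s, 1
    return lengths

lens = run_lengths(chain)
print(np.mean(lens[0]), 1 / phi_S)   # both roughly 5.0
print(np.mean(lens[1]), 1 / phi_T)   # both roughly 2.5
```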
“…For a given value of θ, Economou et al. (2013) provided an efficient algorithm for computing the sum in the equation and used this to fit HSMMs in a Bayesian setting by using MCMC sampling; here we adopt this algorithm to fit model M_ST as a Bayesian model. This is the so-called forward algorithm (in HMM jargon), which sequentially computes the probability distribution p{C(t) | ρ_{1:t}, θ} of the latent states at each time step t given the data up to t, from which the likelihood is computed as a by-product.…”
Section: Modelling Framework
confidence: 99%
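The following is a generic, normalised forward-filter sketch of the kind this statement refers to. It is not the paper's exact HSMM recursion (which augments the latent state with information about the elapsed holding time); the array names and shapes are assumptions.

```python
import numpy as np

def forward_filter(log_emission, P, pi0):
    """Normalised forward recursion for a discrete latent chain.

    log_emission : (T, K) array of log p(rho_t | C(t) = k, theta)
    P            : (K, K) per-step transition matrix
    pi0          : (K,) initial state distribution

    Returns the filtering distributions p{C(t) | rho_{1:t}, theta} and the
    log-likelihood, accumulated as a by-product of the normalisation.
    """
    T, K = log_emission.shape
    alpha = np.zeros((T, K))
    loglik = 0.0
    pred = pi0
    for t in range(T):
        unnorm = pred * np.exp(log_emission[t])   # predict, then weight by emission
        norm = unnorm.sum()
        alpha[t] = unnorm / norm
        loglik += np.log(norm)
        pred = alpha[t] @ P                       # one-step prediction for t + 1
    return alpha, loglik
```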
“…Based on the results of the forward algorithm, we can sample the whole sequence of hidden states z_i and durations d_i one block at a time. Inspired by the recursive algorithm given in [16], we give a modified version using RUP to generate the duration times of hidden states for a text. Let…”
Section: Forward-backward Gibbs Sampler
confidence: 99%
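A hedged sketch of the backward-sampling step that such a forward-backward Gibbs sampler relies on, reusing the alpha and P names assumed in the forward-filter sketch above; sampling the duration times d_i as in the cited recursion is not shown.

```python
import numpy as np

def backward_sample(alpha, P, rng=None):
    """Draw one latent path from its joint posterior given the filtering
    distributions alpha from a forward pass (forward-filtering,
    backward-sampling for a plain hidden Markov chain).
    """
    if rng is None:
        rng = np.random.default_rng()
    T, K = alpha.shape
    path = np.empty(T, dtype=int)
    path[T - 1] = rng.choice(K, p=alpha[T - 1])
    for t in range(T - 2, -1, -1):
        # p(C(t) = j | C(t+1) = k, rho_{1:t}) is proportional to alpha[t, j] * P[j, k]
        w = alpha[t] * P[:, path[t + 1]]
        path[t] = rng.choice(K, p=w / w.sum())
    return path
```

Within one MCMC iteration, one would typically run the forward filter given the current parameters, draw a latent path with backward_sample, and then update the parameters conditional on that path.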