2009
DOI: 10.1051/ps:2007048
Number of hidden states and memory: a joint order estimation problem for Markov chains with Markov regime

Antoine Chambaz,
Catherine Matias

Abstract: This paper deals with order identification for Markov chains with Markov regime (MCMR) in the context of finite alphabets. We define the joint order of an MCMR process in terms of the number k of states of the hidden Markov chain and the memory m of the conditional Markov chain. We study the properties of penalized maximum likelihood estimators for the unknown order (k, m) of an observed MCMR process, relying on information-theoretic arguments. The novelty of our work lies in the joint estimation…
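The abstract describes selecting the joint order (k, m) by maximizing a penalized likelihood. A minimal sketch of that selection step, assuming the maximized log-likelihoods have already been computed for each candidate pair (the `loglik` values, the BIC-style penalty, and the parameter-count formula below are illustrative assumptions, not the paper's exact penalty):

```python
import math

def select_order(logliks, n, c=1.0, alphabet_size=2):
    """Pick the (k, m) maximizing a penalized log-likelihood.

    logliks: dict mapping (k, m) -> maximized log-likelihood of a model
             with k hidden states and conditional memory m (assumed
             precomputed by some fitting routine, not shown here)
    n:       sample size
    c:       penalty scale (BIC-style choice c = 1 used by default)
    """
    def n_params(k, m):
        # Free parameters: hidden-chain transition probabilities plus
        # conditional emission probabilities for each (state, context).
        return k * (k - 1) + k * alphabet_size**m * (alphabet_size - 1)

    def criterion(km):
        k, m = km
        # Penalized likelihood: larger models pay a log(n)-scaled price.
        return logliks[km] - c * n_params(k, m) * math.log(n) / 2.0

    return max(logliks, key=criterion)
```

The penalty grows with both k and m, so a richer model is selected only when its likelihood gain outweighs the log(n)-scaled cost; this is the usual mechanism by which such estimators avoid overestimating the order.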

Cited by 3 publications (4 citation statements)
References 18 publications (21 reference statements)
“…We offer a new proof, based on Kruskal's theorem, of this well-known result. This provides an interesting alternative to Petrie's more direct approach, and one that might extend to more complex frameworks, such as Markov chains with Markov regime, where no identifiability results are known (see, for instance, [9]). Moreover, as a by-product, our approach establishes a new bound on the number of consecutive variables needed, such that the marginal distribution for a generic HMM uniquely determines the full probability distribution.…”
Section: Parameter Identifiability Of Finite Mixtures Of Finite Measures
confidence: 99%
“…This independence in both rows and columns leads to the tensor decomposition of B i . Now since A has full row rank, (9) implies that B i does as well.…”
Section: Proofs
confidence: 99%
“…algorithmic procedures), the statistical properties of these models are slightly more difficult to obtain (see e.g. Chambaz and Matias, 2009, for model selection issues). As for the convergence properties of the MLE, only the article by Douc, Moulines and Rydén (2004) considers the autoregressive case (instead of HMM), explaining why we focus on their results in our context.…”
Section: Introduction
confidence: 99%
“…Based on this earlier work, Gassiat and Boucheron [10] introduced considerable advances: they proved the strong consistency of the penalized estimator without assuming a priori upper bounds on the number of states; in addition, they showed that the probabilities of underestimating as well as of overestimating decay at an exponential rate in the sample size. For AR-MR processes with observations belonging to a finite set, the techniques introduced by Gassiat and Boucheron were further used by Chambaz and Matias [4] to show consistency of the joint estimator of the number of states of the hidden chain and the memory of the observed process.…”
Section: Introduction
confidence: 99%