2009
DOI: 10.1214/09-aos689
Identifiability of parameters in latent structure models with many observed variables

Abstract: While hidden class models of various types arise in many statistical applications, it is often difficult to establish the identifiability of their parameters. Focusing on models in which there is some structure of independence of some of the observed variables conditioned on hidden ones, we demonstrate a general approach for establishing identifiability utilizing algebraic arguments. A theorem of J. Kruskal for a simple latent-class model with finite state space lies at the core of our results, though we apply…
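The latent-class model at the core of the abstract can be illustrated numerically. Below is a minimal sketch (not the authors' code; the dimensions, variable names, and random parameters are illustrative assumptions): three observed variables are independent given a hidden class, so their joint distribution is a three-way array P[i,j,k] = Σ_h π_h A[i,h] B[j,h] C[k,h], and Kruskal's theorem says this decomposition is essentially unique when the Kruskal ranks of A, B, C sum to at least 2r + 2.

```python
# Sketch of the latent-class three-way array behind Kruskal's theorem.
# Dimensions (4 states per observed variable, r = 3 hidden classes) and the
# random parameters are illustrative assumptions, not from the paper.
import itertools
import numpy as np

rng = np.random.default_rng(0)

def kruskal_rank(M, tol=1e-8):
    """Largest k such that every set of k columns of M is linearly independent."""
    n_cols = M.shape[1]
    for k in range(n_cols, 0, -1):
        if all(np.linalg.matrix_rank(M[:, list(c)], tol=tol) == k
               for c in itertools.combinations(range(n_cols), k)):
            return k
    return 0

r = 3                                      # number of hidden classes
pi = rng.dirichlet(np.ones(r))             # class proportions
A = rng.dirichlet(np.ones(4), size=r).T    # columns: P(X1 = i | class h)
B = rng.dirichlet(np.ones(4), size=r).T    # columns: P(X2 = j | class h)
C = rng.dirichlet(np.ones(4), size=r).T    # columns: P(X3 = k | class h)

# Joint distribution: P[i, j, k] = sum_h pi_h * A[i,h] * B[j,h] * C[k,h]
P = np.einsum('h,ih,jh,kh->ijk', pi, A, B, C)

# Kruskal's sufficient condition for essential uniqueness of the
# decomposition (hence identifiability up to relabeling the classes):
condition = kruskal_rank(A) + kruskal_rank(B) + kruskal_rank(C) >= 2 * r + 2
```

For generic (randomly drawn) parameters each 4 x 3 factor matrix has full Kruskal rank 3, so the condition 3 + 3 + 3 >= 8 holds, which is the "generic identifiability" phenomenon the paper exploits.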

Cited by 393 publications (538 citation statements)
References 44 publications (93 reference statements)
“…More recently, Hall and Zhou [2003] showed that mixtures of two arbitrary distributions are generally identified as soon as three measurements are available, provided the component distributions are linearly independent and the outcomes satisfy a conditional-independence restriction. Allman, Matias and Rhodes [2009] have demonstrated that this result carries over to mixtures of more components. Their approach can be applied to a more general class of latent structures that feature some form of conditional independence, such as hidden Markov models with finite state spaces (see Petrie [1969] for seminal work on this) and random-graph models.…”
mentioning
confidence: 92%
“…However, in contrast to Allman, Matias and Rhodes [2009], we go beyond appealing to Kruskal's results to claim identification of the decomposition. Moreover, we show that a simple transformation of the array leads to a set of multilinear restrictions that identify the parameters of the latent structure at hand in a constructive manner.…”
mentioning
confidence: 98%
“…Thus in answering generic identifiability questions one need only consider Markov equivalence classes of DAGs. In Section 4 we revisit the fundamental result due to Kruskal [13], as developed in Allman et al [10] for identifiability questions. We give explicit identifiability procedures for the DAG to which this result applies most directly (model 3-0), and also for the DAG of model 4-3b.…”
mentioning
confidence: 99%