2017
DOI: 10.1007/s00357-017-9221-2

Multivariate Response and Parsimony for Gaussian Cluster-Weighted Models

Abstract: A family of parsimonious Gaussian cluster-weighted models is presented. This family concerns a multivariate extension to cluster-weighted modelling that can account for correlations between multivariate responses. Parsimony is attained by constraining parts of an eigendecomposition imposed on the component covariance matrices. A sufficient condition for identifiability is provided and an expectation-maximization algorithm is presented for parameter estimation. Model performance is investigated on both synthetic…
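The abstract does not spell out the decomposition, but a common choice in the model-based clustering literature (and the assumption behind this sketch) is the eigendecomposition of each component covariance matrix

    \Sigma_g = \lambda_g \, \Gamma_g \, \Delta_g \, \Gamma_g^\top

where \lambda_g = |\Sigma_g|^{1/p} governs the volume of component g, \Gamma_g is the orthogonal matrix of eigenvectors (orientation), and \Delta_g is a diagonal matrix of scaled eigenvalues with |\Delta_g| = 1 (shape). Constraining each of the three terms to be equal or variable across components yields a family of parsimonious models.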

Cited by 63 publications (39 citation statements) · References 78 publications (86 reference statements)
“…To allow for a direct comparison of the competing models, all the algorithms are initialized by providing the initial quantities λ_{z_i}^{(0)}, i = 1, …, n: nine times using a random initialization and once with a k-means initialization (as implemented by the kmeans() function for R). The solution maximizing the observed-data log-likelihood among these 10 runs is then selected; see Dang et al. For alternative initialization strategies, see Bagnato & Punzo.…”
Section: Results (mentioning, confidence: 99%)
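The multi-start strategy in this excerpt is easy to reproduce. The sketch below is a minimal R illustration, not the authors' code: fit_cwm_em() is a hypothetical fitting function, assumed to run the EM algorithm from initial component labels and to return a list with a loglik element (the observed-data log-likelihood at convergence); only kmeans() comes from base R (package stats).

    # Minimal sketch of the 10-run initialization strategy: nine random
    # starts plus one k-means start, keeping the best solution.
    multi_start_em <- function(X, G, fit_cwm_em, n_random = 9) {
      runs <- vector("list", n_random + 1)
      # Nine runs from random initial label assignments
      for (r in seq_len(n_random)) {
        z0 <- sample(seq_len(G), nrow(X), replace = TRUE)
        runs[[r]] <- fit_cwm_em(X, z0)
      }
      # One run initialized from k-means, as in the excerpt
      z0 <- kmeans(X, centers = G)$cluster
      runs[[n_random + 1]] <- fit_cwm_em(X, z0)
      # Select the run maximizing the observed-data log-likelihood
      logliks <- vapply(runs, function(f) f$loglik, numeric(1))
      runs[[which.max(logliks)]]
    }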
“…We also evaluate the performance of MLNMs on the ais (Australian Institute of Sport) data set (Cook & Weisberg), a real benchmark data set which is often used (in some or all of its variables) for illustration in the model-based clustering literature (see, e.g., Galimberti and Soffritti; Morris et al.; Murray, Browne, & McNicholas; Tortora et al.; Azzalini et al.; Dang et al.). The data set contains measurements on n = 202 athletes, subdivided into k = 2 groups (100 female and 102 male), and is available in the R packages alr3 (Weisberg) and sn (Azzalini).…”
Section: Results (mentioning, confidence: 99%)
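The data set can be loaded directly in R. A small check, assuming the sn package is installed (the column name sex is taken from the sn documentation):

    # Load the ais benchmark data from the sn package
    # (the alr3 package ships a version of the same data set)
    data(ais, package = "sn")
    dim(ais)        # 202 rows, one per athlete
    table(ais$sex)  # the two groups: female and male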
“…Therefore, it could be of interest to evaluate the decision boundaries for mixtures of regressions when, for example, Gaussian assumptions are made about the distribution of Y|x in each mixture component. Finally, again in the case of multiple response variables, parsimonious variants of mixtures of (linear) regressions, under Gaussian assumptions, have recently been proposed by imposing constraints on the eigendecomposed component covariance matrices of Y|x (see Dang & McNicholas, 2015, in the case of fixed covariates and Dang, Punzo, McNicholas, Ingrassia, & Browne, 2014, in the case of random covariates). Also in this case, in line with the studies on linear and quadratic discriminant analysis, it could be of interest to evaluate the impact of these constraints on the resulting decision boundaries.…”
Section: Discussion (mentioning, confidence: 99%)
“…However, identifiability results exist for some special cases of this class in the case of a single outcome (d_Y = 1). For example, general conditions for identifiability of mixtures of linear models with fixed or random covariates are available. Generalizations of these results to mixtures of GLMs are also available in the case of fixed covariates and in the case of random covariates of a mixed type.…”
Section: Methods (mentioning, confidence: 99%)