2013
DOI: 10.48550/arxiv.1308.3381
Preprint

High dimensional Sparse Gaussian Graphical Mixture Model

Cited by 5 publications (6 citation statements)
References 0 publications
“…For sparse Gaussian mixture models, Raftery and Dean (2006); Maugis et al (2009); Pan and Shen (2007); Maugis and Michel (2008); Städler et al (2010); Maugis and Michel (2011); Krishnamurthy (2011); Ruan et al (2011); He et al (2011); Lee and Li (2012); Lotsi and Wit (2013); Malsiner-Walli et al (2013); Azizyan et al (2013); Gaiffas and Michel (2014) study the problems of clustering and feature selection, but most either lack efficient algorithms to attain the proposed estimators or do not have finite-sample guarantees. Verzelen and Arias-Castro (2017) and Azizyan et al (2015) establish efficient algorithms for detection, feature selection, and clustering with finite-sample guarantees.…”
Section: Introduction (mentioning)
confidence: 99%
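To make the setting concrete: many of the methods cited above combine the Gaussian mixture likelihood with a sparsity penalty on the component means, so that feature selection corresponds to coordinates that are zero across all components. A minimal sketch of such an objective, with notation chosen here for illustration (it is not the exact estimator of any single cited paper):

\[
\max_{\pi,\mu,\Sigma}\; \sum_{i=1}^{n} \log\!\Bigl(\sum_{k=1}^{K} \pi_k\, \phi(x_i;\mu_k,\Sigma_k)\Bigr) \;-\; \lambda \sum_{k=1}^{K} \lVert \mu_k \rVert_1 ,
\]

where \(\phi(\cdot;\mu,\Sigma)\) is the Gaussian density, \(\pi_k\) are the mixing weights, and \(\lambda\) controls how many coordinates of the (centered) data influence the clustering.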
“…Rodríguez et al (2011) and Talluri et al (2014) develop a Bayesian framework for estimating infinite mixtures of sparse Gaussian graphical models, in which different prior distributions on the inverse covariance are employed. Krishnamurthy (2011), Lotsi and Wit (2013), Gao et al (2016), and Lee and Xue (2017) present methods for estimating mixture models with sparse precision matrices via penalized likelihood estimation and lasso-type penalty functions. Compared to these approaches, our proposed framework parameterizes the mixture of Gaussians directly in terms of the component covariance matrices.…”
Section: Discussion (mentioning)
confidence: 99%
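For the penalized-likelihood methods mentioned in this statement, the lasso-type penalty is typically placed on the component precision matrices rather than the means. A hedged sketch of the generic objective (the notation is illustrative, not taken from a specific cited paper):

\[
\max_{\pi,\mu,\Theta}\; \sum_{i=1}^{n} \log\!\Bigl(\sum_{k=1}^{K} \pi_k\, \phi\bigl(x_i;\mu_k,\Theta_k^{-1}\bigr)\Bigr) \;-\; \lambda \sum_{k=1}^{K} \sum_{i \neq j} \bigl\lvert (\Theta_k)_{ij} \bigr\rvert ,
\]

where \(\Theta_k\) is the precision (inverse covariance) matrix of component \(k\); zeros in \(\Theta_k\) encode conditional independencies, i.e. missing edges in that component's graph. Objectives of this form are commonly optimized with an EM algorithm whose M-step reduces to a graphical-lasso problem for each component.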
“…Lastly, state-of-the-art prediction methods for time series [22,15,17,25] commonly assume that the relations between the past values of the variables and the present value of each variable are constant in time. An idea similar to TAGM was proposed with Gaussian Mixture Models (GMMs) [7], where GMMs were combined with GGMs [18]. The use of GMMs, however, does not allow sequentiality to be modeled explicitly, so this approach is not suited to the analysis of time series.…”
Section: Related Work (mentioning)
confidence: 99%
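The GMM-plus-GGM combination referred to above can be illustrated in a few lines. The following is a minimal sketch of the general idea (soft clustering with a Gaussian mixture, then a sparse precision matrix per component), not the actual procedure of [7] or [18]; the data, component count, and regularization strength are placeholders:

import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.covariance import graphical_lasso

# Toy data: n samples, p variables. Note that no temporal
# structure is modeled here, which is exactly the limitation
# the quoted statement points out.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))

# Step 1: soft-cluster the observations with a Gaussian mixture.
gmm = GaussianMixture(n_components=2, covariance_type="full",
                      random_state=0).fit(X)
resp = gmm.predict_proba(X)  # responsibilities, shape (n, K)

# Step 2: for each component, estimate a sparse precision matrix
# (a GGM) from the responsibility-weighted empirical covariance.
precisions = []
for k in range(gmm.n_components):
    w = resp[:, k] / resp[:, k].sum()      # normalized weights
    mu = w @ X                              # weighted component mean
    Xc = X - mu
    emp_cov = (Xc * w[:, None]).T @ Xc      # weighted covariance
    cov_k, prec_k = graphical_lasso(emp_cov, alpha=0.1)
    precisions.append(prec_k)               # zeros = missing edges

Each entry of precisions is one component's estimated conditional-independence graph; a full method would iterate the two steps inside a single EM loop rather than running them once in sequence.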
“…Infer temporal-dependent conditional dependencies among variables: TAGM relaxes the chunk assumption common to the inference methods available in the literature [13,24], thus inferring a time-varying network that adapts at each observation. Note that TAGM assumes sequentiality of the states, while in [18,7] the authors combined GGMs with Gaussian Mixture Models. That approach is more prone to clustering the points into classes, and can be seen as an unsupervised extension of the Joint Graphical Lasso [6].…”
Section: Introduction (mentioning)
confidence: 99%
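For reference, and assuming [6] denotes the standard joint graphical lasso formulation, the supervised problem estimates one precision matrix per known class jointly:

\[
\max_{\{\Theta_k \succ 0\}}\; \sum_{k=1}^{K} n_k \Bigl(\log\det\Theta_k - \operatorname{tr}(S_k \Theta_k)\Bigr) \;-\; P\bigl(\{\Theta_k\}\bigr) ,
\]

where \(S_k\) and \(n_k\) are the empirical covariance and sample size of class \(k\), and \(P\) is a penalty coupling the precision matrices across classes, e.g. a fused penalty \(\lambda_1 \sum_k \sum_{i \neq j} \lvert(\Theta_k)_{ij}\rvert + \lambda_2 \sum_{k<k'} \sum_{i,j} \lvert(\Theta_k)_{ij} - (\Theta_{k'})_{ij}\rvert\). The "unsupervised extension" described in the quoted statement replaces the known class memberships with mixture responsibilities estimated by the GMM.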