2017
DOI: 10.1080/01621459.2016.1247002
Block-Diagonal Covariance Selection for High-Dimensional Gaussian Graphical Models

Abstract: Gaussian graphical models are widely utilized to infer and visualize networks of dependencies between continuous variables. However, inferring the graph is difficult when the sample size is small compared to the number of variables. To reduce the number of parameters to estimate in the model, we propose a non-asymptotic model selection procedure supported by strong theoretical guarantees based on an oracle-type inequality and a minimax lower bound. The covariance matrix of the model is approximated b…
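To make the parameter reduction mentioned in the abstract concrete: if the p variables are partitioned into K mutually independent blocks of sizes p_1, …, p_K, the number of free covariance parameters drops from p(p+1)/2 to the sum of the within-block counts. The notation below is illustrative and is not taken from the paper.

% Free covariance parameters: full matrix vs. block-diagonal
% (illustrative notation, not taken from the paper).
\[
\underbrace{\frac{p(p+1)}{2}}_{\text{full } \Sigma}
\;\ge\;
\sum_{k=1}^{K} \frac{p_k(p_k+1)}{2},
\qquad \text{where } p_1 + \dots + p_K = p .
\]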

Cited by 31 publications (54 citation statements)
References 50 publications (90 reference statements)
“…In particular, when sample coskewness elements are similar, it may be possible to reduce the MSE by replacing the corresponding sample estimates by the sample average of these elements. This intuition is similar to that behind the diagonal covariance matrix in Ledoit and Wolf (2004) or the block-diagonal structure in Devijver and Gallopin (2017), and it has a positive effect even if the restrictions are wrong. For p = 2, the structured coskewness matrix based on the independence and equal-marginal assumption of Ledoit and Wolf (2004) can be written as follows:…”
Section: Structured Coskewness Estimation (mentioning, confidence: 60%)
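The intuition quoted above, replacing similar sample estimates by their common average, can be sketched as shrinkage toward a structured target. The Python function below is a hypothetical illustration, not the estimator of Ledoit and Wolf (2004) or of the citing paper; the name shrink_to_average and the weight parameter are assumptions made for the sketch.

import numpy as np

def shrink_to_average(sample_elements, weight=0.5):
    # Convex combination of each raw estimate and the group average:
    # weight=0 keeps the raw sample estimates, weight=1 replaces them
    # all by their common average (the "structured" target).
    # Illustrative only; not the estimator from the cited works.
    sample_elements = np.asarray(sample_elements, dtype=float)
    target = sample_elements.mean()
    return (1.0 - weight) * sample_elements + weight * target

# Example: three noisy estimates of coskewness elements assumed to be similar
print(shrink_to_average([0.12, 0.18, 0.15], weight=0.7))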
“…Then, the result is still meaningful, as the equivalence between the Kullback-Leibler divergence and the Hellinger distance is well known. Note that the parameter τ appears in the oracle-type inequality (12), but also in the penalty term (11). We may construct a larger penalty independent of τ, achieving an oracle-type inequality, but the rate will be larger.…”
Section: 2 (mentioning, confidence: 99%)
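For readers without the paper at hand, the generic shape of such an oracle-type result is sketched below; the exact penalty, the constants, and the role of the parameter τ in equations (11) and (12) are specific to the paper and are not reproduced here.

% Schematic shape of a penalized-likelihood oracle-type inequality.
% Constants, the penalty pen(m), and the role of the parameter tau
% follow the cited paper; this is only the generic form of such bounds.
\[
\mathbb{E}\!\left[ d_H^2\big(f^\star, \hat f_{\hat m}\big) \right]
\;\le\;
C \min_{m \in \mathcal{M}}
\Big\{ \inf_{f_m \in S_m} \mathrm{KL}\big(f^\star, f_m\big) + \operatorname{pen}(m) \Big\}
+ \frac{C'}{n},
\]
where $d_H$ is the Hellinger distance, $\mathrm{KL}$ the Kullback--Leibler divergence,
$S_m$ the model indexed by $m$ in the collection $\mathcal{M}$, and $\hat m$ the model
selected by the penalized criterion.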
“…However, if q is large, we are unable to estimate these covariance matrices unless we make some additional assumptions. One way to allow for correlations might be to construct block-diagonal covariance matrices as in [12]. This strategy, however, will not be further considered here.…”
Section: Introduction (mentioning, confidence: 99%)
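As a rough illustration of the strategy mentioned in this statement, the Python sketch below assembles a block-diagonal covariance estimate from a given partition of the q variables; the partition blocks is treated as known here, whereas the procedure in [12] (Devijver and Gallopin) selects it from the data by a penalized criterion.

import numpy as np

def block_diagonal_covariance(X, blocks):
    # Estimate covariances within each block of variables and set all
    # cross-block covariances to zero. The partition `blocks` is assumed
    # to be given; the cited work selects it via model selection.
    n, q = X.shape
    sigma = np.zeros((q, q))
    for idx in blocks:
        idx = np.asarray(idx)
        sigma[np.ix_(idx, idx)] = np.cov(X[:, idx], rowvar=False)
    return sigma

# Example: q = 5 variables split into two independent blocks
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
print(block_diagonal_covariance(X, blocks=[[0, 1, 2], [3, 4]]))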
“…There is a growing literature on block-diagonal estimation, including: covariance matrix and graphical model estimation (Marlin and Murphy, 2009; Pavlenko et al., 2012; Tan et al., 2015; Hyodo et al., 2015; Sun et al., 2015; Egilmez et al., 2017; Devijver and Gallopin, 2018; Kumar et al., 2019; Broto et al., 2019), community detection (Nie et al., 2016), co-clustering (Han et al., 2017; Nie et al., 2017), subspace clustering (Feng et al., 2014; Lu et al., 2018), principal components analysis (Asteris et al., 2015), bipartite cross-correlation clustering (Dewaskar et al., 2020), neural network regularization (Tam and Dunson, 2020), and multi-view clustering (Carmichael, 2020).…”
Section: Introduction (mentioning, confidence: 99%)
“…A variety of approaches are used to estimate block-diagonally structured parameters. Some methods exploit the structure of particular statistical models (Asteris et al., 2015; Tan et al., 2015; Devijver and Gallopin, 2018). Bayesian approaches to block-diagonal estimation are based on priors that promote block-diagonal structure (Mansinghka et al., 2006; Marlin and Murphy, 2009).…”
Section: Introduction (mentioning, confidence: 99%)