2015
DOI: 10.1016/j.csda.2014.11.015

The cluster graphical lasso for improved estimation of Gaussian graphical models

Abstract: The task of estimating a Gaussian graphical model in the high-dimensional setting is considered. The graphical lasso, which involves maximizing the Gaussian log likelihood subject to a lasso penalty, is a well-studied approach for this task. A surprising connection between the graphical lasso and hierarchical clustering is introduced: the graphical lasso in effect performs a two-step procedure, in which (1) single linkage hierarchical clustering is performed on the variables in order to identify connected components…

Cited by 41 publications (43 citation statements)
References 28 publications
“…[HSDR14] have improved the computational cost of the two-step procedure by using a quadratic approximation, and have proved the superlinear convergence of their algorithm. [TWS15] have noticed that the first step, i.e. the detection of connected components by thresholding, is equivalent to performing a single linkage clustering on the variables.…”
Section: Introduction
confidence: 99%
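The equivalence noted in the citation above, between thresholding the covariance to detect connected components and cutting a single-linkage dendrogram, can be checked directly. The sketch below is a minimal illustration, not the authors' code: the 4×4 covariance matrix and the cut level `tau` are invented for the example, and the entries are assumed scaled like correlations so that `1 - |S_ij|` is a valid dissimilarity.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components
from scipy.spatial.distance import squareform

# Hypothetical empirical covariance, scaled like a correlation matrix.
S = np.array([
    [1.0, 0.8, 0.1, 0.0],
    [0.8, 1.0, 0.0, 0.1],
    [0.1, 0.0, 1.0, 0.9],
    [0.0, 0.1, 0.9, 1.0],
])
tau = 0.5  # thresholding level, playing the role of the lasso penalty

# View 1: threshold |S| at tau and take connected components of the graph.
adj = (np.abs(S) > tau).astype(int)
np.fill_diagonal(adj, 0)
n_comp, labels = connected_components(csr_matrix(adj), directed=False)

# View 2: single-linkage clustering on the dissimilarity 1 - |S_ij|,
# cutting the dendrogram at height 1 - tau; merges below the cut
# correspond to edges with |S_ij| above tau.
d = 1.0 - np.abs(S)
np.fill_diagonal(d, 0.0)
Z = linkage(squareform(d), method="single")
clusters = fcluster(Z, t=1.0 - tau, criterion="distance")

# Both views split the variables into the same two blocks: {0,1} and {2,3}.
```

Here both partitions agree by construction: an edge survives the threshold exactly when the corresponding single-linkage merge happens below the cut.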
“…Several algorithms have been proposed to solve the problem Equation . To date, the most popular approach is the graphical lasso (glasso), where a solution to (Equation ) is found by solving a series of coupled regression problems in an iterative fashion . A special property of glasso is that the estimated precision matrix is always positive definite as long as the algorithm is initialized with a positive definite matrix such as a shrinkage estimator .…”
Section: Sparse Precision Matrix Estimation
confidence: 99%
“…To date, the most popular approach is the graphical lasso (glasso), where a solution to (Equation 19) is found by solving a series of coupled regression problems in an iterative fashion. 27,75–77 A special property of glasso is that the estimated precision matrix is always positive definite as long as the algorithm is initialized with a positive definite matrix such as a shrinkage estimator. 23 Hsieh et al propose a quadratic approximation to the objective in Equation 19 to reduce the computational load.…”
Section: The Graphical Lasso
confidence: 99%
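The positive-definiteness property of the glasso estimate mentioned in these excerpts can be illustrated with an off-the-shelf solver. The sketch below uses scikit-learn's `GraphicalLasso` rather than any implementation from the cited papers, and the block-structured covariance, sample size, and penalty `alpha` are invented for the example.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Hypothetical block-structured covariance: two independent pairs of variables.
cov = np.array([
    [1.0, 0.6, 0.0, 0.0],
    [0.6, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.6],
    [0.0, 0.0, 0.6, 1.0],
])
X = rng.multivariate_normal(np.zeros(4), cov, size=500)

# Penalized maximum likelihood: maximize log det(Theta) - tr(S Theta)
# minus an l1 penalty on the off-diagonal entries of Theta; scikit-learn
# solves this with coordinate-descent regressions, one column at a time.
model = GraphicalLasso(alpha=0.2).fit(X)
Theta = model.precision_

# The estimated precision matrix is symmetric positive definite:
# all of its eigenvalues are strictly positive.
eigvals = np.linalg.eigvalsh(Theta)
```

The eigenvalue check at the end is exactly the property the quoted passage highlights: glasso returns a valid (invertible, positive definite) precision matrix, not merely a sparse symmetric one.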
“…Furthermore, there is an interesting connection to hierarchical clustering. Specifically, the connected components correspond to the subtrees obtained when we apply single linkage agglomerative clustering to S and then cut the dendrogram at level τ (Tan et al.). Single linkage clustering is sometimes not very attractive in practice, as it can produce long and stringy clusters and hence components of very unequal size.…”
Section: The Component Lasso
confidence: 99%
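The chaining behaviour of single linkage complained about in this last excerpt is easy to reproduce. In the sketch below (covariance values and cut level invented for the example), a chain of strong pairwise links 0-1, 1-2, 2-3 pulls all four variables into one connected component even though variables 0 and 3 are nearly uncorrelated, while a fifth variable is left as a singleton: components of very unequal size, as described above.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

# Hypothetical covariance: strong links 0-1, 1-2, 2-3 form a chain,
# but variables 0 and 3 themselves are nearly uncorrelated, and
# variable 4 is only weakly correlated with everything.
S = np.full((5, 5), 0.05)
np.fill_diagonal(S, 1.0)
for i, j in [(0, 1), (1, 2), (2, 3)]:
    S[i, j] = S[j, i] = 0.7

tau = 0.5
adj = (np.abs(S) > tau).astype(int)
np.fill_diagonal(adj, 0)
n_comp, labels = connected_components(csr_matrix(adj), directed=False)

# Chaining: variables 0-3 land in a single component of size 4,
# while variable 4 is a singleton, despite |S[0,3]| being only 0.05.
sizes = np.bincount(labels)
```

Since cutting the single-linkage dendrogram gives the same partition as this thresholding, the stringy component of size 4 is precisely the kind of unbalanced cluster the quoted passage warns about.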