2016
DOI: 10.1080/01621459.2015.1034319

Sparse Regression Incorporating Graphical Structure Among Predictors

Abstract: With the abundance of high dimensional data in various disciplines, sparse regularized techniques are very popular these days. In this paper, we make use of the structure information among predictors to improve sparse regression models. Typically, such structure information can be modeled by the connectivity of an undirected graph using all predictors as nodes of the graph. Most existing methods use this undirected graph edge-by-edge to encourage the regression coefficients of corresponding connected predictor…

Cited by 31 publications (77 citation statements)
References 45 publications
“…In other words, node i only contributes to the estimation of regression coefficients associated with its neighbours, and so learning the support of V^(i) is analogous to learning the structure of the predictor graph. Yu & Liu (2016) exploit this relation in the sparse regression incorporating graphical structure among predictors (SRIG) model.…”
Section: Methods and Motivation (mentioning)
confidence: 99%
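A minimal sketch of the support relation described in this statement, assuming NumPy; the toy graph, variable names, and 1-based printing are illustrative choices, not taken from the paper.

```python
import numpy as np

# Adjacency matrix of a toy predictor graph on p = 3 nodes with edges 1-2 and 2-3.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
p = A.shape[0]

# Closed neighbourhood of node i: {i} together with its graph neighbours.
# Each latent vector V^(i) is allowed to be nonzero only on this set, so its
# support is read directly off the corresponding row of the adjacency matrix.
supports = [np.flatnonzero(A[i] + np.eye(p, dtype=int)[i]) for i in range(p)]
for i, s in enumerate(supports):
    print(f"V^({i + 1}) may be nonzero only on predictors {list(s + 1)}")
# -> V^(1): [1, 2], V^(2): [1, 2, 3], V^(3): [2, 3]
```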
“…The SRIG model of Yu & Liu (2016) assumes that the predictor graph structure is known. As such, the support of V^(i) is also known and estimation of the SRIG coefficients proceeds by solving
$$\underset{\boldsymbol{\beta},\, V^{(i)}:\, i = 1, \dots, p}{\arg\min}\;\; \frac{1}{2n}\left\lVert Y - X\boldsymbol{\beta}\right\rVert_2^2 + \lambda \sum_{i=1}^{p} \tau_i \left\lVert V^{(i)}\right\rVert_2,$$
where $\boldsymbol{\beta} = \sum_{i=1}^{p} V^{(i)}$, $\lambda \geq 0$ is a tuning parameter and $\tau_i$ is a weight for the $i$th predictor.…”
Section: Methods and Motivation (mentioning)
confidence: 99%
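A minimal sketch of this objective using cvxpy, assuming the toy graph from the earlier snippet; the simulated data, the weights τ_i (set here to the square root of the group size), and the value of λ are illustrative assumptions, not the authors' implementation or settings.

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.standard_normal((n, p))
Y = X @ np.array([1.0, 1.0, 0.0]) + 0.1 * rng.standard_normal(n)

# Closed neighbourhoods of the toy chain graph 1-2-3 (0-based indices).
supports = [[0, 1], [0, 1, 2], [1, 2]]
tau = [np.sqrt(len(s)) for s in supports]   # weight choice assumed for illustration
lam = 0.1

# One latent vector V^(i) per predictor, constrained to vanish off its neighbourhood.
V = [cp.Variable(p) for _ in range(p)]
constraints = []
for i, s in enumerate(supports):
    outside = [j for j in range(p) if j not in s]
    if outside:
        constraints.append(V[i][outside] == 0)

beta = sum(V)                                              # beta = sum_i V^(i)
loss = cp.sum_squares(Y - X @ beta) / (2 * n)              # (1/2n) ||Y - X beta||_2^2
penalty = lam * sum(tau[i] * cp.norm(V[i], 2) for i in range(p))
prob = cp.Problem(cp.Minimize(loss + penalty), constraints)
prob.solve()
print("estimated beta:", beta.value)
```

Because the groups overlap (predictor 2 appears in every neighbourhood here), the penalty acts on the latent vectors rather than on β directly, which is what lets whole neighbourhoods be zeroed out jointly.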