Sparse Laplacian Shrinkage with the Graphical Lasso Estimator for Regression Problems (2021)
DOI: 10.1007/s11749-021-00779-7

Cited by 4 publications (2 citation statements)
References 28 publications
“…In practical data analysis, acquiring the precise network structure represented by the Laplacian matrix poses significant challenges and often proves unattainable. Prior studies primarily focused on employing Laplacian matrices derived solely from the data itself [22]. In this study, the network computed directly from the correlation structure of the input gene expression data is referred to as a reference-free network.…”
Section: Bi-network Regularization Model
Confidence: 99%
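A minimal sketch (in Python, not from either paper) of what such a reference-free network could look like: the graph Laplacian is derived solely from the correlation structure of the expression matrix, with the correlation threshold being an illustrative assumption rather than a value taken from the cited work.

```python
import numpy as np

def correlation_laplacian(X, threshold=0.3):
    """Graph Laplacian of a correlation network built from the data alone.

    X         : (n_samples, n_genes) expression matrix.
    threshold : absolute-correlation cutoff for keeping an edge
                (an illustrative choice, not a value from the papers).
    """
    corr = np.corrcoef(X, rowvar=False)   # gene-by-gene correlation matrix
    adj = np.abs(corr)                    # edge weights from |correlation|
    np.fill_diagonal(adj, 0.0)            # no self-loops
    adj[adj < threshold] = 0.0            # sparsify the network
    degree = np.diag(adj.sum(axis=1))     # degree matrix D
    return degree - adj                   # unnormalized Laplacian L = D - A
```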
“…Further, there are many studies of the Laplacian penalty. For example, Xia et al. [21] proposed a two-step method called Sparse Laplacian Shrinkage with the Graphical Lasso Estimator. In contrast, the proposed method applies the Laplacian penalty within the iterative algorithm.…”
Section: Model and Algorithm
Confidence: 99%
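The two-step idea cited above can be sketched roughly as follows; this is an assumption-laden illustration, not the authors' implementation. scikit-learn's GraphicalLasso stands in for the first step, the sparsity-inducing part of the penalty is approximated by a simple soft-threshold, and the tuning parameters alpha, lam1, and lam2 are placeholders.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

def two_step_laplacian_regression(X, y, alpha=0.1, lam1=0.1, lam2=1.0):
    # Step 1: graphical lasso on the predictors gives a sparse precision
    # matrix, whose off-diagonal magnitudes define a network among genes.
    prec = GraphicalLasso(alpha=alpha).fit(X).precision_
    adj = np.abs(prec)
    np.fill_diagonal(adj, 0.0)
    lap = np.diag(adj.sum(axis=1)) - adj   # Laplacian of the estimated network

    # Step 2: Laplacian-penalized least squares,
    #   minimize ||y - X b||^2 + lam2 * b' L b   (closed form below),
    # followed by soft-thresholding with lam1 as a crude stand-in for the
    # sparsity-inducing part of the penalty.
    p = X.shape[1]
    b = np.linalg.solve(X.T @ X + lam2 * lap + 1e-8 * np.eye(p), X.T @ y)
    return np.sign(b) * np.maximum(np.abs(b) - lam1, 0.0)
```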