2013
DOI: 10.5705/ss.2011.281

The degrees of freedom of the Lasso for general design matrix

Abstract: In this paper, we investigate the degrees of freedom (dof) of penalized ℓ1 minimization (also known as the Lasso) for linear regression models. We give a closed-form expression of the dof of the Lasso response. Namely, we show that for any given Lasso regularization parameter λ and any observed data y belonging to a set of full (Lebesgue) measure, the cardinality of the support of a particular solution of the Lasso problem is an unbiased estimator of the degrees of freedom. This is achieved without the need of …

Cited by 24 publications (26 citation statements)
References 37 publications (61 reference statements)
“…Let further $|I| = k$ and $P_I \in \mathbb{R}^{k \times n}$ be a projector onto $I$ and $A_I$ the restriction of $A$ to $I$. We have that $\mathrm{df}_\alpha = \|\hat{x}_\alpha(y)\|_0 = k$ and $\mathrm{gdf}_\alpha = \mathrm{tr}(\Pi B_{[J]})$, $B_{[J]} := P_I (A_I^* A_I)^{-1} P_I^*$, as shown, e.g., in [39, 14, 12], which allows us to compute PSURE (7) and SURE (9). Notice that while $\hat{x}_\alpha(y)$ is a continuous function of $\alpha$ [7], PSURE and SURE are discontinuous at all $\alpha$ where the support $I$ changes.…”
Section: Numerical Studies for Non-quadratic Regularization
confidence: 99%
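The trace formula quoted above can be evaluated directly once a Lasso solution and its support are known. The NumPy sketch below is only illustrative: the design `A`, the support `I`, and the weighting matrix `Pi` are placeholders, and embedding the k×k inverse Gram matrix of $A_I$ into an n×n matrix is one reading of the quoted definition of $B_{[J]}$.

```python
import numpy as np

def lasso_dof(x_hat, tol=1e-10):
    """df estimate from the excerpt: the support size of a Lasso solution."""
    return int(np.sum(np.abs(x_hat) > tol))

def generalized_dof(A, support, Pi):
    """Evaluate tr(Pi @ B), where B is the n x n matrix whose (I, I) block is
    (A_I^T A_I)^{-1} and which is zero elsewhere (one reading of B_[J] above)."""
    n = A.shape[1]
    A_I = A[:, support]                      # restriction of A to the support I
    G_inv = np.linalg.inv(A_I.T @ A_I)       # assumes A_I has full column rank
    B = np.zeros((n, n))
    B[np.ix_(support, support)] = G_inv
    return float(np.trace(Pi @ B))

# Toy sanity check (all values are placeholders): with Pi = A^T A the trace
# collapses to |I|, i.e. the generalized quantity matches df = support size.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
support = [1, 4, 7]
print(generalized_dof(A, support, A.T @ A))  # essentially 3.0 up to rounding
```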
“…the lasso), the Jacobian matrix depends on the support (set of non-zero coefficients) of any lasso solution $\hat{x}(y, \theta)$. An estimator of the DOF can then be retrieved from the number of non-zero entries of this solution [23, 65, 73]. These results have in turn been extended to more general sparsity-promoting regularizations [20, 40, 61, 64–66, 72], and spectral regularizations (e.g.…”
Section: Closed-form SURE
confidence: 99%
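To make the connection to SURE concrete, the sketch below scans a grid of regularization levels and evaluates the classical SURE expression $\|y - X\hat{\beta}\|_2^2 - n\sigma^2 + 2\sigma^2\,\widehat{\mathrm{df}}$, with $\widehat{\mathrm{df}}$ taken to be the support size as in the cited results. It assumes the noise level σ is known, uses scikit-learn's Lasso solver (whose `alpha` differs from the penalty above by a sample-size scaling), and all data and grid values are placeholders.

```python
import numpy as np
from sklearn.linear_model import Lasso

def sure_lasso(X, y, sigma, alphas, tol=1e-10):
    """SURE(alpha) = ||y - X beta_hat||^2 - n*sigma^2 + 2*sigma^2*df,
    with df estimated by the support size of the Lasso solution."""
    n = len(y)
    scores = []
    for alpha in alphas:
        beta = Lasso(alpha=alpha, fit_intercept=False).fit(X, y).coef_
        df = int(np.sum(np.abs(beta) > tol))     # support size as df estimate
        rss = float(np.sum((y - X @ beta) ** 2))
        scores.append(rss - n * sigma**2 + 2 * sigma**2 * df)
    return np.asarray(scores)

# Toy example with known noise level sigma; all values below are placeholders.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 30))
beta_true = np.zeros(30)
beta_true[:5] = 2.0
sigma = 1.0
y = X @ beta_true + sigma * rng.standard_normal(100)
alphas = np.logspace(-3, 0, 25)
best_alpha = alphas[np.argmin(sure_lasso(X, y, sigma, alphas))]
print(f"alpha minimizing SURE: {best_alpha:.4f}")
```

As the first excerpt notes, this criterion is discontinuous in the regularization level wherever the support changes, so a grid search (rather than smooth optimization) is the natural way to use it.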
“…where $d_{\mathrm{lava}}(y, X)$ is the lava estimator on the data $(y, X)$ and $d_{\mathrm{lasso}}(K_{\lambda_2}^{1/2} y, K_{\lambda_2}^{1/2} X)$ is the lasso estimator on the data $(K_{\lambda_2}^{1/2} y, K_{\lambda_2}^{1/2} X)$ with the penalty level $\lambda_1$. The almost differentiability of the map $y \mapsto d_{\mathrm{lasso}}(K_{\lambda_2}^{1/2} y, K_{\lambda_2}^{1/2} X)$ follows from the almost differentiability of the map $u \mapsto d_{\mathrm{lasso}}(u, K_{\lambda_2}^{1/2} X)$, which holds by the results in Dossal et al. (2011) and Tibshirani and Taylor (2012).…”
Section: Degrees of Freedom and SURE
confidence: 73%
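The almost-everywhere differentiability invoked here is what makes Stein-type identities applicable: at generic data the divergence of the Lasso fit map $y \mapsto X\hat{\beta}(y)$ equals the support size. The finite-difference check below is a rough numerical illustration on assumed toy data; it again uses scikit-learn's Lasso, and the agreement can break down near points where the active set changes.

```python
import numpy as np
from sklearn.linear_model import Lasso

def lasso_fit(X, y, alpha):
    """Fit map y -> X beta_hat(y) for a fixed design X and fixed alpha."""
    beta = Lasso(alpha=alpha, fit_intercept=False, tol=1e-8).fit(X, y).coef_
    return X @ beta

def divergence_fd(X, y, alpha, eps=1e-3):
    """Finite-difference estimate of div_y(X beta_hat(y)) = sum_i d(fit_i)/d(y_i)."""
    base = lasso_fit(X, y, alpha)
    div = 0.0
    for i in range(len(y)):
        y_pert = y.copy()
        y_pert[i] += eps
        div += (lasso_fit(X, y_pert, alpha)[i] - base[i]) / eps
    return div

# Toy comparison of divergence vs. support size (placeholders throughout).
rng = np.random.default_rng(2)
X = rng.standard_normal((60, 15))
y = X[:, :3] @ np.array([1.5, -2.0, 1.0]) + 0.5 * rng.standard_normal(60)
alpha = 0.1
beta_hat = Lasso(alpha=alpha, fit_intercept=False, tol=1e-8).fit(X, y).coef_
print("support size:", int(np.sum(np.abs(beta_hat) > 1e-10)))
print("finite-difference divergence:", round(divergence_fd(X, y, alpha), 2))
```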