The title Lasso was suggested by Tibshirani [6] as a colourful name for a technique of variable selection which requires the minimization of a sum of squares subject to an l1 bound on the solution. This bound then plays the role of the selection parameter. Here a descent method for solving the constrained problem is formulated; a homotopy method, in which the constraint bound becomes the homotopy parameter, is developed to completely describe the possible selection regimes; and it is suggested that modified Gram-Schmidt applied to the augmented design matrix provides an effective basis for implementing the suggested algorithms.
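As a concrete illustration of the constrained problem itself (not the descent or homotopy algorithm developed in the paper), the l1-bounded least squares problem can be solved approximately by projected gradient descent onto the l1 ball. The function names below are illustrative, not from the paper:

```python
import numpy as np

def project_l1_ball(v, t):
    # Euclidean projection of v onto the set {x : ||x||_1 <= t}
    # (standard sort-and-threshold construction).
    if np.abs(v).sum() <= t:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]          # magnitudes, descending
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(u) + 1) > (css - t))[0][-1]
    theta = (css[rho] - t) / (rho + 1.0)  # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def lasso_projected_gradient(X, y, t, n_iter=5000):
    # minimize (1/2)||y - X b||^2 subject to ||b||_1 <= t
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)
        b = project_l1_ball(b - grad / L, t)
    return b
```

For small enough t the iterate lands on the boundary of the l1 ball and typically has exact zeros, which is the selection behaviour the abstract describes.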
Proposed by Tibshirani (1996), the least absolute shrinkage and selection operator (LASSO) estimates a vector of regression coefficients by minimizing the residual sum of squares subject to a constraint on the l1-norm of the coefficient vector. The LASSO estimator typically has one or more zero elements and thus shares characteristics of both shrinkage estimation and variable selection. In this article we treat the LASSO as a convex programming problem and derive its dual. Consideration of the primal and dual problems together leads to important new insights into the characteristics of the LASSO estimator and to an improved method for estimating its covariance matrix. Using these results we also develop an efficient algorithm for computing LASSO estimates which is usable even in cases where the number of regressors exceeds the number of observations. An S-Plus library based on this algorithm is available from StatLib.
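For the penalized (Lagrangian) form of the problem — a standard reformulation, not necessarily the parametrization used in the paper — the dual can be derived as follows:

```latex
\min_{\beta}\; \tfrac12\|y - X\beta\|_2^2 + \lambda\|\beta\|_1 .
```

Introducing the residual $r = y - X\beta$ as an explicit variable with multiplier $u$ and minimizing the Lagrangian over $r$ and $\beta$ gives the dual

```latex
\max_{u}\; u^{\top}y - \tfrac12\|u\|_2^2
\quad\text{subject to}\quad \|X^{\top}u\|_{\infty} \le \lambda ,
```

with $u = r$ at the optimum. The dual variable is thus the residual vector, and the constraint $\|X^{\top}r\|_{\infty} \le \lambda$ encodes the familiar optimality condition that correlations of inactive regressors with the residual are bounded by $\lambda$.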
Abstract. A modification of the classical technique of Prony for fitting sums of exponential functions to data is considered. The method maximizes the likelihood for the problem (unlike the usual implementation of Prony's method, which is not even consistent for transient signals), proves to be remarkably effective in practice, and is supported by an asymptotic stability result. Novel features include a discussion of the problem parametrization and its implications for consistency. The asymptotic convergence proofs are made possible by an expression for the algorithm in terms of circulant divided difference operators.
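For context, the classical Prony technique that this paper modifies can be sketched as follows — a textbook linear-prediction version, not the maximum-likelihood variant proposed in the paper, and the function name is illustrative:

```python
import numpy as np

def prony(y, p):
    # Classical Prony: fit y[k] ~ sum_j a_j * z_j**k with p exponentials.
    N = len(y)
    # Step 1: linear prediction, y[k] = -c_1 y[k-1] - ... - c_p y[k-p],
    # solved in least squares for the coefficients c.
    A = np.column_stack([y[p - 1 - j : N - 1 - j] for j in range(p)])
    c = np.linalg.lstsq(A, -y[p:], rcond=None)[0]
    # Step 2: the exponential bases are the roots of the characteristic
    # polynomial z**p + c_1 z**(p-1) + ... + c_p.
    z = np.roots(np.concatenate(([1.0], c)))
    # Step 3: linear least squares for the amplitudes a.
    V = np.vander(z, N, increasing=True).T  # V[k, j] = z_j**k
    a = np.linalg.lstsq(V, y.astype(complex), rcond=None)[0]
    return z, a
```

The two least-squares stages make the weakness the abstract alludes to visible: the prediction step minimizes an equation error rather than the fitting error, which is why the usual implementation is not a maximum-likelihood (or even consistent) estimator for noisy transient signals.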
An analysis is given of the computational properties of Fisher's method of scoring for maximizing likelihoods and solving estimating equations based on quasi-likelihoods. Consistent estimation of the true parameter vector is shown to be important if a fast rate of convergence is to be achieved, but if this condition is met then the algorithm is very attractive. This link between the performance of the scoring algorithm and the adequacy of the underlying problem modelling is stressed. The effect of linear constraints on performance is discussed, and examples of likelihood and quasi-likelihood calculations are presented.
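As a minimal sketch of the iteration being analysed — Fisher scoring applied to a Poisson log-linear model, where the scoring update coincides with iteratively reweighted least squares; the example and function name are my own, not from the paper:

```python
import numpy as np

def poisson_scoring(X, y, n_iter=25):
    # Fisher scoring for Poisson regression with log link:
    #   beta <- beta + I(beta)^{-1} s(beta),
    # where s(beta) = X'(y - mu) is the score and
    # I(beta) = X' diag(mu) X is the Fisher information, mu = exp(X beta).
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        score = X.T @ (y - mu)
        info = X.T @ (mu[:, None] * X)
        beta = beta + np.linalg.solve(info, score)
    return beta
```

At convergence the score vanishes, and when the Poisson model is adequate for the data the iteration converges rapidly — the dependence of convergence rate on model adequacy that the abstract stresses.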