2012
DOI: 10.1186/1753-6561-6-s2-s10

Genomic selection using regularized linear regression models: ridge regression, lasso, elastic net and their extensions

Abstract: Background: Genomic selection (GS) is emerging as an efficient and cost-effective method for estimating breeding values using molecular markers distributed over the entire genome. In essence, it involves estimating the simultaneous effects of all genes or chromosomal segments and combining the estimates to predict the total genomic breeding value (GEBV). Accurate prediction of GEBVs is a central and recurring challenge in plant and animal breeding. The existence of a bewildering array of approaches for predictin…
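
The abstract's core idea, estimating all marker effects simultaneously and summing them into a GEBV, can be illustrated with ridge regression, one of the penalized methods named in the title. The sketch below is not taken from the paper: the simulated marker matrix, the penalty value lam, and the dimensions are arbitrary assumptions for illustration only.

```python
# Minimal ridge-regression sketch of genomic prediction (illustrative only;
# the simulated data, penalty and dimensions are assumptions, not from the paper).
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 1000                      # individuals, markers (p >> n is typical in GS)
X = rng.binomial(2, 0.3, size=(n, p)).astype(float)   # genotypes coded 0/1/2
X -= X.mean(axis=0)                   # centre marker columns
beta_true = np.zeros(p)
beta_true[rng.choice(p, 50, replace=False)] = rng.normal(0, 0.5, 50)
y = X @ beta_true + rng.normal(0, 1.0, n)             # simulated phenotypes

lam = 10.0                            # ridge penalty (would normally be tuned, e.g. by CV)
# Estimate all marker effects simultaneously, then combine them into GEBVs:
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
gebv = X @ beta_hat                   # genomic estimated breeding values
print("correlation(true genetic value, GEBV):",
      np.corrcoef(X @ beta_true, gebv)[0, 1].round(3))
```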

Cited by 279 publications (211 citation statements)
References 27 publications
“…Methods include genomic BLUP (GBLUP) and its extension (VanRaden 2008; Aguilar et al 2010; Christensen and Lund 2010); penalized regression methods such as ridge regression, Lasso, and elastic net (ENet) (Usai et al 2009; Li and Sillanpaa 2012a; Ogutu et al 2012); Bayesian regression methods such as BayesA and BayesB (Meuwissen et al 2001; de los Campos et al 2009; Hayashi and Iwata 2010; Habier et al 2011); non-parametric regression methods to capture non-additive genetic effects (Gianola et al 2006; Gianola and van Kaam 2008; Long et al 2010; Ober et al 2011); methods developed in the field of machine learning such as support vector machine and random forest (RForest) (Long et al 2011a; Ogutu et al 2011); and regression methods based on dimension reduction (Solberg et al 2009; Long et al 2011b). Ridge regression and its equivalent GBLUP, BayesA and BayesB, and Bayesian lasso (Blasso; Park and Casella 2008) are popular methods, and have been evaluated in many studies (reviewed in de los …).”
Section: Introduction (citation type: mentioning; confidence: 99%)
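
The quoted passage refers to ridge regression and "its equivalent GBLUP". That equivalence follows from the push-through matrix identity X(X'X + λI)⁻¹X' = XX'(XX' + λI)⁻¹, and can be checked numerically; the sketch below uses simulated genotypes and an arbitrary penalty, none of which come from the cited studies.

```python
# Numerical check that marker-based ridge regression and a GBLUP-style
# formulation give identical fitted values (illustrative sketch; the simulated
# data and the penalty are assumptions, not from the cited studies).
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 500
X = rng.binomial(2, 0.4, size=(n, p)).astype(float)
X -= X.mean(axis=0)
y = rng.normal(size=n)
lam = 5.0

# Ridge regression on markers: fitted values X (X'X + lam I)^-1 X' y
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
fit_ridge = X @ beta_hat

# GBLUP-style formulation with the (unscaled) genomic relationship matrix G = XX':
# fitted values G (G + lam I)^-1 y, identical by the push-through identity.
G = X @ X.T
fit_gblup = G @ np.linalg.solve(G + lam * np.eye(n), y)

print(np.allclose(fit_ridge, fit_gblup))   # True
```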
“…On the other hand, as lambda grew, the regularization term had a more substantial effect and fewer variables remained in the model, because more and more coefficients were shrunk to exactly zero. Lasso simultaneously selected relevant predictive variables and optimally estimated their effects [29].…”
Section: Results (citation type: mentioning; confidence: 99%)
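
The behaviour described in this excerpt, larger penalties driving more lasso coefficients to exactly zero, is easy to reproduce. The sketch below uses scikit-learn's Lasso on simulated data; the penalty grid (scikit-learn's alpha plays the role of lambda) and the data dimensions are arbitrary choices, not taken from the cited study.

```python
# Sketch: the number of non-zero lasso coefficients shrinks as the penalty grows.
# The simulated data and the alpha grid are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n, p = 150, 400
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:20] = 1.0                          # only 20 truly relevant predictors
y = X @ beta_true + rng.normal(size=n)

for alpha in [0.01, 0.1, 0.5, 1.0]:           # scikit-learn's name for the penalty
    model = Lasso(alpha=alpha, max_iter=10000).fit(X, y)
    n_selected = np.sum(model.coef_ != 0)
    print(f"alpha={alpha}: {n_selected} non-zero coefficients")
```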
“…In principle, various machine learning methods or Bayesian methods can be applied in the construction of PGS, as they have been applied in the estimation of breeding values in animal studies (Meuwissen et al., 2001; Abraham et al., 2013; Szymczak et al., 2009; Habier et al., 2011; Pirinen et al., 2013; Erbe et al., 2012; Ogutu et al., 2012; Zhou et al., 2013). These methods do not require the assumption of SNP independence or near-independence, and have been shown to perform better than simple PGS in simulation settings.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
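
As this excerpt notes, penalized whole-genome regression can be used to build a polygenic score (PGS) without assuming SNP independence, because all SNPs enter the model jointly. A hedged elastic-net sketch follows; the simulated genotypes, hyper-parameters, and train/test split are assumptions for illustration and do not reproduce the procedure of any cited study.

```python
# Sketch: building a polygenic score with elastic net fitted to all SNPs jointly,
# so no SNP-independence assumption is needed. Data and hyper-parameters are
# simulated/arbitrary, purely for illustration.
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n, p = 300, 800
X = rng.binomial(2, 0.25, size=(n, p)).astype(float)   # toy SNP genotypes coded 0/1/2
effects = np.zeros(p)
effects[rng.choice(p, 40, replace=False)] = rng.normal(0, 0.3, 40)
y = X @ effects + rng.normal(size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
enet = ElasticNet(alpha=0.1, l1_ratio=0.5, max_iter=10000).fit(X_tr, y_tr)
pgs = X_te @ enet.coef_ + enet.intercept_               # polygenic score for held-out individuals
print("correlation(PGS, phenotype) in test set:",
      np.round(np.corrcoef(pgs, y_te)[0, 1], 3))
```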