2011
DOI: 10.1214/11-ejs608
Automatic grouping using smooth-threshold estimating equations

Abstract: Use of a redundant statistical model is common in practical data analysis. A widely investigated form of redundancy is the inclusion of irrelevant predictors, which is resolved by setting their coefficients to zero. On the other hand, it is also useful to consider overlapping parameters whose values are similar. Grouping, by regarding a set of parameters as a single parameter, contributes to building a parsimonious parameterization and increases estimation accuracy through dimension reduction. The paper proposes a data ada…

Cited by 7 publications (9 citation statements)
References 24 publications
“…Note that for each covariate, the total number of constraints is s = K + C(K, 2). Ueki (2009) and Ueki and Kawasaki (2011) considered a similar problem of variable grouping in the much simpler setting of a single cross-sectional study (i.e., K = 1, m = 1), where the ℓ2-norm penalty for the group lasso was used. Equivalently, we may write the above fused lasso penalty P(β) in matrix notation: P(β) = ‖Dβ‖_1 = ‖WBβ‖_1, where ‖·‖_1 is the L1-norm on R^(sp), B is an sp × Kp matrix that defines sp constraints involving p covariates across K studies, and W is an sp × sp diagonal matrix containing all weights corresponding to the constraints in B.…”
Section: Formulation and Methods
confidence: 99%
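The penalty quoted above can be sketched numerically. The snippet below is an illustration under stated assumptions, not the cited authors' implementation: it builds a constraint matrix B for K studies and p covariates, with K sparsity rows and C(K, 2) pairwise-difference (fusion) rows per covariate, and evaluates P(β) = ‖WBβ‖_1 with the weight matrix W taken as the identity for simplicity.

```python
import numpy as np
from itertools import combinations

def constraint_matrix(K, p):
    """Build an sp x Kp constraint matrix B: for each covariate,
    K sparsity rows (one per study) plus C(K, 2) pairwise-difference rows."""
    s = K + K * (K - 1) // 2  # constraints per covariate
    B = np.zeros((s * p, K * p))
    for j in range(p):
        row = j * s
        # sparsity constraints: beta_{k,j} itself
        for k in range(K):
            B[row, k * p + j] = 1.0
            row += 1
        # fusion constraints: beta_{k,j} - beta_{l,j}
        for k, l in combinations(range(K), 2):
            B[row, k * p + j] = 1.0
            B[row, l * p + j] = -1.0
            row += 1
    return B

def fused_penalty(beta, B, W=None):
    """P(beta) = ||W B beta||_1; W defaults to identity weights."""
    v = B @ beta
    if W is not None:
        v = W @ v
    return np.abs(v).sum()

K, p = 3, 2
B = constraint_matrix(K, p)
# identical coefficients across studies: fusion rows contribute zero,
# so only the K sparsity terms per covariate remain
beta = np.tile([1.0, 2.0], K)  # coefficients stacked study-by-study
print(fused_penalty(beta, B))  # 3*|1| + 3*|2| = 9.0
```

When coefficients agree across studies the fusion terms vanish, which is exactly why this penalty encourages grouping of per-study coefficients.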
“…For this step, we employ the smooth-thresholding developed by one of us [53,54] that applies to the ridge logistic regression method, termed the ridge smooth-thresholding logistic regression. We explain the procedure in detail in what follows.…”
Section: Methods
confidence: 99%
“…We examine the applicability of existing sparse regression methods for prediction of bank telemarketing success. In particular, we consider the lasso, the smoothly clipped absolute deviation (SCAD; Fan and Li, 2001), the minimax concave penalty (MCP; Zhang, 2010), and the smooth-threshold estimating equations (STEE; Ueki, 2009; Ueki and Kawasaki, 2011). Most of the popular sparse regression methods attempt to set irrelevant regression coefficients of predictor variables to zero.…”
Section: Introduction
confidence: 99%
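The statement above turns on sparse regression "setting irrelevant regression coefficients to zero". A textbook way to see how this happens is the soft-thresholding operator used in lasso coordinate-descent updates; the sketch below is illustrative and is not code from the cited papers.

```python
import numpy as np

def soft_threshold(z, lam):
    """Lasso soft-thresholding: shrink each value toward zero by lam,
    and set values with |z| <= lam exactly to zero."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

z = np.array([2.5, -0.3, 0.1, -1.8])
print(soft_threshold(z, 0.5))  # [ 2.   0.   0.  -1.3]
```

Coefficients whose magnitude falls below the threshold become exactly zero, which is the mechanism behind the variable selection the citing paper describes.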
“…Most of the popular sparse regression methods attempt to set irrelevant regression coefficients of predictor variables to zero. On the other hand, the method of Ueki and Kawasaki (2011) aims at automatic grouping, which is an alternative to sparse modeling for variable selection. Automatic grouping gathers similar regression coefficients into a single one.…”
Section: Introduction
confidence: 99%
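To make "gathering similar regression coefficients into a single one" concrete, here is a toy stand-in for automatic grouping: merge coefficients whose sorted values lie within a tolerance of their neighbor and replace each group by its mean. This is an illustrative heuristic only, not the STEE procedure of Ueki and Kawasaki (2011); the function name and tolerance are assumptions for the sketch.

```python
import numpy as np

def group_coefficients(beta, tol=0.05):
    """Merge coefficients whose sorted values chain within tol of each
    other into one group, replacing each group by its mean."""
    order = np.argsort(beta)
    groups, current = [], [order[0]]
    for i in order[1:]:
        if beta[i] - beta[current[-1]] <= tol:
            current.append(i)   # extend the current group
        else:
            groups.append(current)
            current = [i]       # start a new group
    groups.append(current)
    out = np.empty_like(beta)
    for g in groups:
        out[g] = beta[g].mean()  # one shared value per group
    return out

beta = np.array([0.98, 1.02, 1.00, -0.51, -0.49])
print(group_coefficients(beta))  # [ 1.   1.   1.  -0.5 -0.5]
```

Replacing a group of nearly equal coefficients by a single shared value is the dimension-reduction idea the abstract describes; the actual method determines groups data-adaptively rather than by a fixed tolerance.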