2023
DOI: 10.1007/978-3-031-36021-3_47
Improving Group Lasso for High-Dimensional Categorical Data

Cited by 3 publications
(1 citation statement)
References 16 publications
“…where λ ≥ 0 is a tuning parameter that controls the amount of penalization, ω_l = |G_l| normalizes across groups of different sizes, and ∥·∥₂ denotes the L2 norm of a vector. Meier et al. [13] established the asymptotic consistency of the Group Lasso for logistic regression, Wang et al. [14] analyzed its rates of convergence, Blazere et al. [15] stated oracle inequalities, and Kwemou [16] and Nowakowski [17] studied non-asymptotic oracle inequalities. Furthermore, Zhang et al. [18] studied L_{p,q} regularization penalty estimates for logistic regression.…”
Section: Introduction (mentioning)
Confidence: 99%
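The penalty described in the excerpt, λ Σ_l ω_l ∥β_{G_l}∥₂ with ω_l = |G_l|, can be sketched in a few lines of NumPy. This is an illustrative computation only, not the cited authors' implementation; the function name `group_lasso_penalty` and the toy data are assumptions, and note that many formulations instead use ω_l = sqrt(|G_l|).

```python
import numpy as np

def group_lasso_penalty(beta, groups, lam):
    """Compute lam * sum_l w_l * ||beta[G_l]||_2.

    Weights w_l = |G_l| normalize across group sizes, following the
    excerpt above (w_l = sqrt(|G_l|) is another common choice).
    beta   : 1-D coefficient vector
    groups : list of index lists partitioning the coefficients
    lam    : non-negative tuning parameter controlling penalization
    """
    return lam * sum(len(g) * np.linalg.norm(beta[g]) for g in groups)

# Toy example: 5 coefficients split into two groups.
beta = np.array([1.0, -2.0, 0.0, 3.0, 4.0])
groups = [[0, 1, 2], [3, 4]]
penalty = group_lasso_penalty(beta, groups, lam=0.1)
```

Because the penalty is a sum of (unsquared) L2 norms over groups, it zeroes out entire groups at once, which is what makes the Group Lasso suited to categorical predictors encoded as blocks of dummy variables.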