2005
DOI: 10.1109/tpami.2005.127
Sparse multinomial logistic regression: fast algorithms and generalization bounds

Abstract: Recently developed methods for learning sparse classifiers are among the state-of-the-art in supervised learning. These methods learn classifiers that incorporate weighted sums of basis functions with sparsity-promoting priors encouraging the weight estimates to be either significantly large or exactly zero. From a learning-theoretic perspective, these methods control the capacity of the learned classifier by minimizing the number of basis functions used, resulting in better generalization. This paper…

Cited by 747 publications (589 citation statements)
References 32 publications (57 reference statements)
“…In [29] the MM method was applied to the least absolute deviation regression problem, leading to a viable surrogate for the absolute value penalty. By combining these ideas, [8] provides an iterative algorithm for the maximization of the log-likelihood function with a sparsity-inducing penalty in the multinomial case; this is the method used here in its binomial form. The resulting algorithm requires no more computational resource than the iteratively re-weighted least-squares method conventionally used for solving the logistic regression problem.…”
Section: Methods and Theory
confidence: 99%
“…The resulting algorithm requires no more computational resource than the iteratively re-weighted least-squares method conventionally used for solving the logistic regression problem. The authors of [8] have provided a fairly comprehensive, downloadable package [30]; however, it is easily programmed in the Matlab™ environment [31], which is the approach adopted here.…”
Section: Methods and Theory
confidence: 99%
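The statement above notes that the bound-optimization update costs no more per iteration than ordinary iteratively re-weighted least squares. A minimal Python sketch of that idea for the binomial case, with the l1 penalty handled by a quadratic majorizer (the function name, step count, and `eps` smoothing are illustrative assumptions, not the authors' exact algorithm or package):

```python
import numpy as np

def sparse_logistic_mm(X, y, lam=1.0, n_iter=200, eps=1e-2):
    """l1-penalised binomial logistic regression via a simple MM loop:
    at each step the penalty |b_j| is majorised by the quadratic
    (b_j^2 + b0_j^2) / (2 |b0_j|), so the update reduces to a
    reweighted-ridge / IRLS-style Newton solve. A sketch only."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(n_iter):
        z = np.clip(X @ beta, -30.0, 30.0)       # guard against exp overflow
        p = 1.0 / (1.0 + np.exp(-z))             # predicted probabilities
        w = p * (1.0 - p)                        # IRLS weights
        D = lam / (np.abs(beta) + eps)           # curvature of the l1 surrogate
        H = X.T @ (w[:, None] * X) + np.diag(D)  # penalised Hessian
        g = X.T @ (y - p) - D * beta             # penalised gradient
        beta = beta + np.linalg.solve(H, g)      # Newton step on the surrogate
    return beta
```

Each iteration is a single d-by-d linear solve, the same cost as one IRLS step for unpenalized logistic regression, which is the point the citing authors make.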
“…with equality if and only if β = β* (Krishnapuram et al., 2005) and L(β|β*, x) is differentiable with respect to β. At the t-th iteration of the IRWLS procedure, we have…”
Section: L1 Regularized Varying Coefficient Model
confidence: 99%
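The equality condition quoted above refers to the standard majorize–minimize surrogate for the absolute-value penalty. A sketch of the bound in reconstructed notation (the symbols below are inferred from the surrounding quotes, not copied from the cited paper):

```latex
% Quadratic majoriser of the l1 penalty at the current iterate \beta^{*}
% (valid for \beta^{*}_j \neq 0):
|\beta_j| \;\le\; \frac{\beta_j^{2} + (\beta^{*}_j)^{2}}{2\,|\beta^{*}_j|},
\qquad \text{with equality iff } |\beta_j| = |\beta^{*}_j|.
```

Because the right-hand side is quadratic in β, minimizing the resulting surrogate objective at each iteration is a ridge-type problem, which is what makes each IRWLS step tractable.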
“…Sparse multinomial logistic regression methods are available [11]. More recently, the introduction of the LORSAL (logistic regression via variable splitting and augmented Lagrangian) algorithm [12] has opened the door to dealing with larger data sets and numbers of classes.…”
Section: Introduction
confidence: 99%