The 2011 International Joint Conference on Neural Networks (IJCNN 2011)
DOI: 10.1109/ijcnn.2011.6033554
Group lasso regularized multiple kernel learning for heterogeneous feature selection

Cited by 4 publications (1 citation statement); citing publications span 2012–2023.
References 15 publications.
"…For heterogeneous variable selection problems, it has been observed that each variable prefers a different set of base kernels which best represent its property/distribution for recognition purposes [22]. As a result, the use of our proposed GL-MKL can be extended for variable selection purposes.…"
Section: GL-MKL for Heterogeneous Variable Selection
Citation type: mentioning (confidence: 95%)
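The statement above refers to the core idea of group lasso regularized MKL for heterogeneous feature selection: each feature is given its own small set of base kernels, and a group-lasso penalty over the kernel weights, grouped by feature, zeroes out all kernels of an unselected feature at once. The following is a minimal illustrative sketch of that grouping and penalty only, not the authors' implementation; the toy data, kernel choices, bandwidths, and the single proximal (group soft-thresholding) step are assumptions made for illustration.

```python
# Sketch: group-lasso penalty over per-feature kernel weights (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))          # 50 samples, 4 heterogeneous features (toy data)

def base_kernels_for_feature(x):
    """Base kernels computed on a single feature column x of shape (n,)."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return [
        x[:, None] * x[None, :],      # linear kernel
        np.exp(-d2 / 0.5),            # RBF kernel, small bandwidth (assumed value)
        np.exp(-d2 / 5.0),            # RBF kernel, large bandwidth (assumed value)
    ]

# Group structure: kernel weights are grouped by the feature they belong to.
kernels, groups = [], []
for j in range(X.shape[1]):
    ks = base_kernels_for_feature(X[:, j])
    groups.append(list(range(len(kernels), len(kernels) + len(ks))))
    kernels.extend(ks)

beta = rng.uniform(size=len(kernels))  # current kernel weights (toy values)

def group_lasso_penalty(beta, groups):
    """Mixed L1/L2 penalty: sum over groups g of sqrt(|g|) * ||beta_g||_2."""
    return sum(np.sqrt(len(g)) * np.linalg.norm(beta[g]) for g in groups)

def group_soft_threshold(beta, groups, lam):
    """Proximal operator of the group-lasso penalty: shrinks each feature's
    kernel-weight group and zeroes whole groups (i.e. drops whole features)."""
    out = beta.copy()
    for g in groups:
        norm = np.linalg.norm(beta[g])
        scale = max(0.0, 1.0 - lam * np.sqrt(len(g)) / norm) if norm > 0 else 0.0
        out[g] = scale * beta[g]
    return out

print("group-lasso penalty:", group_lasso_penalty(beta, groups))
beta_new = group_soft_threshold(beta, groups, lam=0.4)
selected = [j for j, g in enumerate(groups) if np.linalg.norm(beta_new[g]) > 0]
print("features kept after one proximal step:", selected)
```

Because the penalty is applied group-wise rather than weight-wise, a feature is either represented by some combination of its own base kernels or removed entirely, which is what makes the formulation suitable for heterogeneous feature selection.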