2014
DOI: 10.1016/j.ins.2014.01.008
Embedded local feature selection within mixture of experts

Cited by 51 publications (25 citation statements)
References 31 publications (10 reference statements)
“…We studied the two-class classification problem in these experiments and used logistic regression models as the local expert models. The proposed trace norm regularized MOE model is compared with the L1 norm regularized MOE model [22], support vector machines (SVM), linear logistic regression with L1 norm regularization, an SVM ensemble with bagging, and AdaBoost using decision trees as weak classifiers. The parameters of these methods are selected with 3-fold cross-validation using grid search.…”
Section: Methods
mentioning, confidence: 99%
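A minimal sketch of how such a baseline comparison with 3-fold cross-validated grid search might look in scikit-learn. The dataset, parameter grids, and hyperparameter choices below are illustrative assumptions, not the cited study's actual setup.

```python
# Hypothetical sketch: 3-fold grid search over the baseline classifiers
# named in the citation statement (L1 logistic regression, SVM, bagged
# SVMs, AdaBoost with decision-tree weak learners). Data and grids are
# placeholders, not taken from the cited experiments.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

baselines = {
    "l1_logreg": (
        LogisticRegression(penalty="l1", solver="liblinear"),
        {"C": [0.01, 0.1, 1, 10]},
    ),
    "svm": (SVC(), {"C": [0.1, 1, 10], "gamma": ["scale", 0.1]}),
    "svm_bagging": (
        BaggingClassifier(estimator=SVC()),
        {"n_estimators": [5, 10, 20]},
    ),
    "adaboost_trees": (
        AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=1)),
        {"n_estimators": [50, 100, 200]},
    ),
}

for name, (model, grid) in baselines.items():
    search = GridSearchCV(model, grid, cv=3)  # 3-fold CV, as in the quote
    search.fit(X, y)
    print(name, search.best_params_, round(search.best_score_, 3))
```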
“…[20]. Recently, many novel MOE methods have been proposed to handle high-dimensional data [21][22][23]. In this paper, a new trace norm regularized MOE model is proposed.…”
Section: Introduction
mentioning, confidence: 99%
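For orientation, a generic regularized mixture-of-experts objective in the spirit of these models (an illustrative sketch of the standard MOE form, not the exact formulation of the cited paper or of [22]): a softmax gating network mixes the experts, and a norm penalty on the stacked parameters yields sparsity (L1 norm) or low rank (trace norm).

```latex
% Illustrative regularized MOE objective (assumed generic form, not the
% papers' notation). g_k: softmax gate, w_k: k-th expert's weights,
% W = [w_1, ..., w_K]: stacked parameter matrix, \lambda: regularization.
\min_{W,\,V}\; -\sum_{i=1}^{n} \log \sum_{k=1}^{K}
  g_k(\mathbf{x}_i; V)\, p\!\left(y_i \mid \mathbf{x}_i; \mathbf{w}_k\right)
  \;+\; \lambda\, \Omega(W),
\quad
g_k(\mathbf{x}; V) = \frac{\exp(\mathbf{v}_k^{\top}\mathbf{x})}
                          {\sum_{j=1}^{K}\exp(\mathbf{v}_j^{\top}\mathbf{x})},
\quad
\Omega(W) \in \{\,\|W\|_1,\ \|W\|_*\,\}.
```

Choosing the L1 penalty drives individual weights to zero (embedded per-expert feature selection), while the trace norm couples the experts through a low-rank parameter matrix.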
“…In general, feature selection algorithms are classified into two categories: those based on the filter model [8] and those based on the wrapper model [27]. A number of feature selection methods based on the wrapper model have been proposed in the literature for classification [30,38] and clustering [52]. In supervised learning, one can generally use the wrapper model to construct a classifier and use a criterion (e.g., accuracy) to evaluate how well the features predict class labels via that classifier [10,35,36,40].…”
Section: Related Work
mentioning, confidence: 99%
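A minimal sketch of wrapper-style feature selection as described in that passage: greedily add the feature that most improves the cross-validated accuracy of a classifier. The forward-selection strategy and the logistic-regression scorer are illustrative choices, not the specific algorithms of the cited papers.

```python
# Hypothetical wrapper-model sketch: forward selection driven by the
# 3-fold cross-validated accuracy of a classifier (the wrapper's criterion).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=15, random_state=0)

selected, remaining = [], list(range(X.shape[1]))
best_score = 0.0
while remaining:
    # Score each candidate feature when added to the current subset.
    scores = {
        j: cross_val_score(LogisticRegression(max_iter=1000),
                           X[:, selected + [j]], y, cv=3).mean()
        for j in remaining
    }
    j_best = max(scores, key=scores.get)
    if scores[j_best] <= best_score:  # stop once no feature improves accuracy
        break
    best_score = scores[j_best]
    selected.append(j_best)
    remaining.remove(j_best)

print("selected features:", selected, "cv accuracy:", round(best_score, 3))
```

Unlike a filter method, which ranks features by a model-free statistic, this loop re-trains the classifier for every candidate subset, which is exactly what makes wrapper methods accurate but expensive.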