2013
DOI: 10.1109/tcyb.2013.2260736

The Relevance Sample-Feature Machine: A Sparse Bayesian Learning Approach to Joint Feature-Sample Selection

Abstract: This paper introduces a novel sparse Bayesian machine-learning algorithm for embedded feature selection in classification tasks. Our proposed algorithm, called the relevance sample feature machine (RSFM), is able to simultaneously choose the relevance samples and also the relevance features for regression or classification problems. We propose a separable model in feature and sample domains. Adopting a Bayesian approach and using Gaussian priors, the learned model by RSFM is sparse in both sample and …
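For intuition about the separable sample/feature structure the abstract describes, the sketch below illustrates the idea as a kernel expansion with one weight per training sample and one relevance weight per feature. It is a minimal, illustrative sketch: the ARD-style kernel, the variable names, and the toy data are assumptions, not the paper's exact formulation; it only shows how zeroing entries of either weight vector prunes samples or features.

```python
import numpy as np

def weighted_kernel(X, X_train, psi):
    # RBF-style kernel with one relevance weight per feature (an assumed ARD
    # form): features whose psi entry is zero drop out of the distance.
    diff = X[:, None, :] - X_train[None, :, :]               # shape (N, M, D)
    return np.exp(-np.sum((psi ** 2) * diff ** 2, axis=2))   # shape (N, M)

def predict(X, X_train, w, psi):
    # Separable weighting: per-sample weights w and per-feature weights psi.
    # Sparsity in w prunes training samples; sparsity in psi prunes features.
    return weighted_kernel(X, X_train, psi) @ w

# Toy usage: 5 training samples, 3 features, sparsity in both domains.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(5, 3))
w = np.array([0.0, 1.2, 0.0, -0.7, 0.0])   # only two "relevance samples" remain
psi = np.array([1.0, 0.0, 0.5])            # the second feature is pruned
print(predict(rng.normal(size=(2, 3)), X_train, w, psi))
```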

Cited by 52 publications (20 citation statements)
References 32 publications (64 reference statements)
“…But, few of them consider a joint formulation (Mohsenzadeh et al, 2013). Authors in Mohsenzadeh et al (2013) extend the classic relevance vector machine (RVM) formulation by adding two parameter sets for feature and sample selection in a Bayesian graphical inference model. They consider sparsity in both feature and sample domains, as we do, but instead they solve the problem in a marginal likelihood maximization procedure.…”
Section: Introduction (mentioning)
confidence: 99%
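For context on the "marginal likelihood maximization procedure" mentioned above: in standard RVM-style sparse Bayesian learning, the objective is the evidence obtained by integrating out the weights, and RSFM is described as maximizing a joint version over its sample and feature parameter sets. The expression below follows the usual RVM convention and is a sketch of the standard objective, not the paper's exact joint derivation:

$$
p(\mathbf{t}\mid\boldsymbol{\alpha},\sigma^{2})
= \int p(\mathbf{t}\mid\mathbf{w},\sigma^{2})\,p(\mathbf{w}\mid\boldsymbol{\alpha})\,\mathrm{d}\mathbf{w}
= \mathcal{N}\!\bigl(\mathbf{t}\mid\mathbf{0},\ \sigma^{2}\mathbf{I}+\boldsymbol{\Phi}\mathbf{A}^{-1}\boldsymbol{\Phi}^{\top}\bigr),
\qquad \mathbf{A}=\operatorname{diag}(\alpha_{1},\dots,\alpha_{N}).
$$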
“…However, their work was only focused on unifying frameworks and placed the generalization on broader scale. Mohsenzadeh et al [15] utilized a sparse Bayesian learning approach for feature sample selection. Their proposed relevance sample feature machine (RSFM) is an extension of RVM algorithm.…”
Section: Related Study (mentioning)
confidence: 99%
“…7,9,15,22,23 In spite of using kernel method to increase data separability and its reasonable performance in neural decoding, it remains susceptible to the curse of dimensionality. [24][25][26][27] Researchers have applied both supervised and unsupervised dimensionality reduction methods to map neuroimaging data into a lower feature space prior to the classification phase. 28 These techniques diminish the curse of dimensionality and classifier complexity, avoid overfitting, 29,30 and enhance the generalization performance of the classifier.…”
Section: Introduction (mentioning)
confidence: 99%
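As a concrete illustration of the reduce-then-classify workflow this last statement describes, the sketch below uses PCA as the (assumed) unsupervised dimensionality reduction step in front of a standard classifier; the data are synthetic stand-ins, not neuroimaging recordings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for high-dimensional data (many features, few samples).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 500))
y = rng.integers(0, 2, size=100)

# Unsupervised dimensionality reduction (PCA) feeding a classifier; keeping
# the reduction inside a pipeline avoids leaking test data into the mapping.
clf = make_pipeline(PCA(n_components=20), LogisticRegression(max_iter=1000))
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```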