2014 22nd International Conference on Pattern Recognition
DOI: 10.1109/icpr.2014.240

Feature Selection Scheme Based on Zero-Sum Two-Player Game

Abstract: We propose a new filter methodology for feature selection using the concept of game theory, whereby features are assimilated to players. In this game-theoretical context, a strategy corresponds to a particular affinity between a group of features forming a cluster, and the payoff function is computed from the weighted distance between a feature and a cluster. A zero-sum two-player game problem is solved through a global combination of pairwise features. Finally, each feature is represented by the value of th…
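The abstract describes solving a zero-sum two-player game over pairwise features, with payoffs derived from weighted feature-to-cluster distances. As an illustration only (the paper's exact payoff construction is not given here), the sketch below solves a generic zero-sum two-player game by linear programming with SciPy; the function name game_value and the example payoff matrix are hypothetical, not taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def game_value(payoff):
    """Solve a zero-sum two-player game by linear programming.

    Returns the value of the game and the row player's optimal
    mixed strategy for the given payoff matrix (rows = row player).
    """
    payoff = np.asarray(payoff, dtype=float)
    m, n = payoff.shape

    # Decision variables: m strategy probabilities followed by the game value v.
    # Objective: maximise v, i.e. minimise -v.
    c = np.zeros(m + 1)
    c[-1] = -1.0

    # For each opponent column j: v - sum_i p_i * payoff[i, j] <= 0.
    A_ub = np.hstack([-payoff.T, np.ones((n, 1))])
    b_ub = np.zeros(n)

    # Probabilities sum to one.
    A_eq = np.zeros((1, m + 1))
    A_eq[0, :m] = 1.0
    b_eq = np.array([1.0])

    # Probabilities are non-negative; the game value is unbounded in sign.
    bounds = [(0, None)] * m + [(None, None)]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[-1], res.x[:m]

# Hypothetical 2x2 payoff matrix between two features (e.g. built from
# weighted feature-to-cluster distances, as the abstract suggests).
value, strategy = game_value([[1.0, -0.5], [-0.2, 0.8]])
print(value, strategy)
```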

Cited by 1 publication (1 citation statement)
References 13 publications (1 reference statement)
“…This paper focuses on the first issue, namely selecting input variables in an attempt to maximize the performance of a classifier on previously unseen data. It is worth noting that a large number of algorithms have been proposed in the literature for feature subset selection [3][4][5][6][7][8]. However, most of the proposed methodologies exhibit a common drawback: the selected features depend strongly on the application at hand, and there is no general feature selection scheme that provides an optimal classification with different data (e.g.…”
Section: Introduction
confidence: 99%