2010
DOI: 10.1016/j.neucom.2009.11.032

Approximate k-NN delta test minimization method using genetic algorithms: Application to time series

Abstract: In many real-world problems, the existence of irrelevant input variables (features) hinders the predictive quality of the models used to estimate the output variables. In particular, time series prediction often involves building large regressors of artificial variables that can contain irrelevant or misleading information. Many techniques have arisen to confront the problem of accurate variable selection, including both local and global search strategies. This paper presents a method based on genetic algorithms…
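The quantity being minimised, the Delta Test, is a nearest-neighbour estimator of the output noise variance: for each sample, find the nearest neighbour of its input vector and average the halved squared differences of the corresponding outputs. A minimal brute-force sketch (the function name and interface are illustrative, not taken from the paper):

```python
def delta_test(X, y):
    """Delta Test: estimate the output noise variance as half the mean
    squared difference between each output y[i] and the output of the
    nearest neighbour of X[i] in input space (brute-force search)."""
    n = len(X)
    total = 0.0
    for i in range(n):
        # index of the nearest neighbour of X[i] (squared Euclidean distance)
        nn = min((j for j in range(n) if j != i),
                 key=lambda j: sum((a - b) ** 2 for a, b in zip(X[i], X[j])))
        total += (y[i] - y[nn]) ** 2
    return total / (2 * n)
```

In variable selection, the estimate is computed on each candidate subset of input variables, and the subset yielding the smallest value is preferred.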

Cited by 14 publications (11 citation statements); References 43 publications.
“…Phase 1: Selection of the best inputs by means of an evolutionary computation based on the Delta Test [13].…”
Section: Evolutionary Optimization Methods (mentioning)
confidence: 99%
“…In order to undertake the feature selection, it was important to avoid involving a neural network, so that the selection of variables would be independent of the network topology. Before carrying out this study, we had optimised the Delta Test using genetic algorithms for regression problems with a single output [13], but this approach can be extended to classification problems with multiple outputs. That phase of the methodology was not the particular focus of this study.…”
Section: Evolutionary Optimization Methods (mentioning)
confidence: 99%
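The GA-based optimisation described in the statement above evolves binary masks over the input variables. A minimal, self-contained sketch of such a selection loop (all names, genetic operators, and parameter values here are illustrative assumptions, not the authors' implementation):

```python
import random

def ga_select(fitness, n_vars, pop_size=20, gens=30, p_mut=0.1, seed=0):
    """Minimal generational GA over binary variable-selection masks.
    `fitness(mask)` returns a cost to minimise, e.g. the Delta Test
    computed on the variables the mask selects."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]              # truncation selection (elitist)
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_vars)        # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ (rng.random() < p_mut) for g in child]  # bit-flip mutation
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)
```

With `fitness` set to the Delta Test evaluated on the selected variables, the returned mask approximates the subset minimising the estimated noise variance, without ever training a neural network.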
“…Another algorithm which could be an alternative solution to this problem is the k-nearest-neighbour search algorithm, one of the best-known learning algorithms used for data clustering and classification. It is explained briefly in [17]: "Nearest neighbour search is an optimization technique for finding closest points in metric spaces. Specifically, given a set of n reference points R and query point q, both in the same metric space V" [17].…”
Section: Other Machine Learning Algorithms (mentioning)
confidence: 99%
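The nearest-neighbour search described in the quote — given reference points R and a query q, find the closest point(s) — can be sketched in a few lines for small reference sets (brute force; function names are illustrative):

```python
def nearest_neighbour(R, q):
    """Return the point of the reference set R closest to the query q
    (linear scan over squared Euclidean distances)."""
    return min(R, key=lambda r: sum((a - b) ** 2 for a, b in zip(r, q)))

def k_nearest(R, q, k):
    """Return the k closest reference points, nearest first."""
    return sorted(R, key=lambda r: sum((a - b) ** 2 for a, b in zip(r, q)))[:k]
```

For large reference sets, the linear scan is replaced by tree- or hash-based indices — the approximate-search concern that the surveyed paper's "approximate k-NN" refers to.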
“…Experimental evidence comparing it to alternative methods is also provided. Choosing an efficient search scheme for high-dimensional tasks is mostly left as a practical matter of implementation, and several papers specifically about optimising the Delta Test have also been published in the literature [13], [14], [15], [16], [17], [18], [19].…”
Section: Introduction (mentioning)
confidence: 99%