2011
DOI: 10.1007/978-3-642-20980-2_7

Selecting Machine Learning Algorithms Using the Ranking Meta-Learning Approach

Abstract: In this work, we present the use of Ranking Meta-Learning approaches to ranking and selecting algorithms for problems of time series forecasting and clustering of gene expression data. Given a problem (forecasting or clustering), the Meta-Learning approach provides a ranking of the candidate algorithms according to the characteristics of the problem's dataset. The best-ranked algorithm can be returned as the selected one. In order to evaluate the Ranking Meta-Learning proposal, prototypes were imple…
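The abstract's core mechanism, mapping a dataset's characteristics (meta-features) to a ranking of candidate algorithms, can be sketched in a few lines. The following is a minimal illustration assuming a k-NN meta-learner (one of the meta-learners discussed in the citation statements below); the algorithm names, meta-feature values, and rankings are all hypothetical, and the rank-aggregation rule (average rank over the neighbors) is one common choice, not necessarily the paper's.

```python
import numpy as np

# Minimal sketch of ranking meta-learning with a k-NN meta-learner.
# Each stored meta-example pairs a dataset's meta-feature vector with
# the observed ranking of the candidate algorithms on that dataset
# (1 = best). All names and numbers here are hypothetical.

ALGORITHMS = ["ARIMA", "MLP", "SVR"]  # hypothetical candidate pool

meta_features = np.array([            # one row per known dataset
    [0.8, 0.1, 120.0],
    [0.3, 0.7,  45.0],
    [0.5, 0.4,  80.0],
    [0.9, 0.2, 200.0],
])
rankings = np.array([                 # observed algorithm ranks per dataset
    [1, 2, 3],
    [3, 1, 2],
    [2, 1, 3],
    [1, 3, 2],
])

def recommend(query, k=3):
    """Rank the candidate algorithms for a new dataset by averaging
    the rankings of its k nearest meta-examples (Euclidean distance)."""
    dists = np.linalg.norm(meta_features - query, axis=1)
    nearest = np.argsort(dists)[:k]
    avg_rank = rankings[nearest].mean(axis=0)
    return [ALGORITHMS[i] for i in np.argsort(avg_rank)]

# The first element plays the role of the "selected" algorithm.
print(recommend(np.array([0.7, 0.2, 100.0])))
```

In practice the query vector would hold statistics extracted from the new time series or gene expression dataset, and the stored rankings would come from actually running the candidate algorithms on the known datasets.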

Cited by 20 publications (6 citation statements). References 35 publications. Citing publications span 2012–2024.

Citation statements (ordered by relevance):

“…We conducted our experiments with k = 3, meaning we might obtain very different results if k were assigned a different value. The conclusion is the same as that reported by Prudencio et al. (2011) and Chao et al. (2020), where k-NN is used as a meta-learner with both a small number of meta-examples and a large number of meta-examples (some of them generated using datasetoids); comparing the two shows that k-NN performs better as a meta-learner when the number of meta-examples is small, since k-NN is sensitive to the irrelevant meta-examples potentially produced by datasetoids.…”
Section: Results (supporting)
confidence: 92%
“…Similarly, in [37], developers defined 32 meta-examples (each one corresponding to a dataset) described by a set of meta-features for a gene expression dataset, together with a vector holding the ranking of the clustering algorithms for that dataset. To measure the similarity between meta-features, they used Euclidean distance, Pearson correlation, and cosine similarity.…”
Section: Review on Algorithm Selection Using Meta-Learning (mentioning)
confidence: 99%
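For concreteness, the three meta-feature similarity measures named in that statement can be written out directly. This is a small illustrative sketch, not code from [37]; the function names are hypothetical, and meta-features are assumed to be plain numeric vectors.

```python
import numpy as np

# Three ways to compare two meta-feature vectors, as mentioned above.
# Euclidean is a distance (smaller = more similar); Pearson correlation
# and cosine are similarities (larger = more similar).

def euclidean_distance(a, b):
    return float(np.linalg.norm(a - b))

def pearson_correlation(a, b):
    return float(np.corrcoef(a, b)[0, 1])  # off-diagonal of the 2x2 matrix

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

x = np.array([0.8, 0.1, 120.0])
y = np.array([0.9, 0.2, 200.0])
print(euclidean_distance(x, y), pearson_correlation(x, y), cosine_similarity(x, y))
```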
“…Later, Nakhaeizadeh and Schnabl (1997), followed by Keller et al. (2000) and Brazdil and Soares (2000), adopted similar methods. In 2011, R. B. C. Prudencio, M. C. P. de Souto, and T. B. Ludermir applied the ranking meta-learning method to time series forecasting and gene expression data clustering (Prudêncio et al., 2011). In 2017, Finn et al. introduced meta-learning to the study of fast adaptation of deep networks (Finn and Levine, 2017).…”
Section: Introduction (mentioning)
confidence: 99%