2014
DOI: 10.1007/978-3-319-08729-0_28

Meta-learning: Can It Be Suitable to Automatise the KDD Process for the Educational Domain?

Cited by 8 publications (9 citation statements)
References 7 publications
“…The methods are implemented by several algorithms, and several were used in the selected papers. The most widely used Decision Tree algorithm is C4.5 [Zorrilla and Garcia-Saiz 2014], [Romero et al 2013a], [Hu et al 2014], followed by Simple Cart [Zorrilla and Garcia-Saiz 2014], [Romero et al 2013a], [Hu et al 2014], and Random Trees [Romero et al 2008], [Romero et al 2013a], [Hu et al 2014]. Other popular algorithms are: K-means clustering [Moradi et al 2014], [Jovanovic et al 2012], [Pardos et al 2012], [Mogus et al 2012], [Sorour et al 2014]; K-nearest neighbor (KNN) [Zorrilla and Garcia-Saiz 2014], [Kotsiantis et al 2010], [Minaei-Bidgoli et al 2003], [Gamulin et al 2016]; and JRip [Zorrilla and Garcia-Saiz 2014], [Márquez-Vera et al 2013].…”
Section: Q01 Which Data Mining Techniques and Methods Were Used? (mentioning)
confidence: 99%
“…Regarding what meta-features to use, in general, measurable properties of data sets and algorithms are chosen, for instance, statistical or information-theoretical measures [47], landmarkers [48], or model properties such as the average ratio of bias, variance error, or their sensitivity to noise [49], among others. The data context and its complexity for the learning task are also used [50].…”
Section: Related Work (mentioning)
confidence: 99%
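The kinds of meta-features mentioned in the excerpt above can be illustrated with a short, self-contained sketch. The snippet below is not code from the cited works; the function name simple_meta_features and the particular measures chosen (class entropy, mean attribute correlation, and a decision-stump landmarker) are my own assumptions about what such a characterisation might look like:

```python
# Minimal sketch: a few statistical / information-theoretic meta-features
# and one landmarker for a tabular classification data set.
# Illustrative only; not taken from the cited papers.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

def simple_meta_features(X, y):
    n_instances, n_features = X.shape

    # Information-theoretic measure: entropy of the class distribution.
    _, counts = np.unique(y, return_counts=True)
    probs = counts / counts.sum()
    class_entropy = float(-np.sum(probs * np.log2(probs)))

    # Statistical measure: mean absolute pairwise correlation of attributes.
    corr = np.corrcoef(X, rowvar=False)
    mean_abs_corr = float(np.abs(corr[np.triu_indices_from(corr, k=1)]).mean())

    # Landmarker: cross-validated accuracy of a very simple learner
    # (a decision stump), used as a cheap probe of the data set.
    stump_landmarker = float(
        cross_val_score(DecisionTreeClassifier(max_depth=1), X, y, cv=5).mean()
    )

    return {
        "n_instances": n_instances,
        "n_features": n_features,
        "class_entropy": class_entropy,
        "mean_abs_corr": mean_abs_corr,
        "stump_landmarker": stump_landmarker,
    }
```

In a meta-learning setting, such a vector would typically be computed once per historical data set and stored alongside the algorithms that performed best on it.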
“…We also consider the extraction of meta-features in the sense of some recent works [57,50] that have used a set of data complexity metrics (provided by the DCoL tool [58], which measures characteristics of the data independently of the learning method) as meta-features. A description of the DCoL meta-features is included in Appendix B.…”
Section: Extraction Of Meta-features (mentioning)
confidence: 99%
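Since DCoL itself is an external library, the flavour of the complexity metrics mentioned above can be conveyed with a stand-alone sketch. The function below is an illustrative re-implementation of one classic measure in that family, the maximum Fisher's discriminant ratio (commonly labelled F1); it is not the DCoL code, and the name fisher_discriminant_ratio is my own:

```python
# Illustrative sketch of one data complexity measure in the spirit of DCoL:
# the maximum Fisher's discriminant ratio (F1) for a two-class problem.
# Not the DCoL implementation itself.
import numpy as np

def fisher_discriminant_ratio(X, y):
    """Maximum over features of (mu1 - mu2)^2 / (var1 + var2)."""
    classes = np.unique(y)
    assert len(classes) == 2, "this sketch assumes a binary problem"
    a, b = X[y == classes[0]], X[y == classes[1]]
    numerator = (a.mean(axis=0) - b.mean(axis=0)) ** 2
    denominator = a.var(axis=0) + b.var(axis=0) + 1e-12  # guard against zero variance
    return float(np.max(numerator / denominator))
```

Higher values indicate that at least one attribute separates the two classes well, i.e. a less complex problem for the learner.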
“…The set of algorithms recommended for the new data set then coincided with the set of algorithms of its nearest neighbor. A more recent work that applied meta-learning to EDM was the paper by Zorrilla and Garcia [53], where meta-learning is used to build a recommender that helps instructors (as non-expert data miners) apply the right DM algorithm to their data sets. It is also worth noting the work by Zapata et al [54], where meta-learning techniques are used in the field of learning object recommendation in order to automatically obtain or predict the final ratings.…”
Section: Related Work (mentioning)
confidence: 99%
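The nearest-neighbour recommendation scheme summarised at the start of the excerpt above can be sketched in a few lines. The snippet below is a hypothetical illustration (the names recommend_algorithms, known_meta and known_best_algorithms are my own): a new data set is described by its meta-feature vector, the closest previously characterised data set is found, and that neighbour's best-performing algorithms are returned:

```python
# Minimal sketch of nearest-neighbour algorithm recommendation over
# meta-feature vectors. Names and values are hypothetical.
import numpy as np

def recommend_algorithms(new_meta, known_meta, known_best_algorithms):
    """
    new_meta: 1-D array of meta-features for the new data set
    known_meta: 2-D array, one row of meta-features per known data set
    known_best_algorithms: list of sets of algorithm names, one per row of known_meta
    """
    distances = np.linalg.norm(known_meta - new_meta, axis=1)
    nearest = int(np.argmin(distances))
    return known_best_algorithms[nearest]

# Hypothetical usage:
# recommend_algorithms(np.array([0.4, 1.2]),
#                      np.array([[0.3, 1.0], [0.9, 2.5]]),
#                      [{"C4.5", "JRip"}, {"KNN"}])
```

In practice the meta-feature dimensions would usually be normalised before computing distances, since the individual measures live on very different scales.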