“…In this study, we employed the following algorithms: k-nearest neighbors (kNN; Fix & Hodges, 1951; Cover & Hart, 1967; Kotsiantis, 2007); Elastic-Net (for regression; Tikhonov, 1943; Santosa & William, 1986; Tibshirani, 1996; Witten & Frank, 2005; Zou & Hastie, 2005); Gaussian Process (Rasmussen & Williams, 2006); support vector machines (Vapnik, 1998; Hsu & Lin, 2002; Karatzoglou et al., 2006); tree-based algorithms such as random forest and adaptive boosting, or AdaBoost (Ho, 1995; Breiman, 1996a, 1996b; Freund & Schapire, 1997; Breiman, 2001a; Kotsiantis, 2014; Sagi & Rokach, 2018); logistic regression (for classification; Cramer, 2002); naïve Bayes (for classification; Rennie et al., 2003; Hastie et al., 2009); and artificial neural networks (ANN; Curry, 1944; Rosenblatt, 1961; Rumelhart et al., 1986; Hastie et al., 2009; Lemaréchal, 2012). For details of these algorithms other than Gaussian Process, including their functionality and parameters, as well as for two application examples similar to those of this study, see Zhang et al. (2021, 2022). For Gaussian Process, details can be found in Rasmussen and Williams (2006).…”
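
As a purely illustrative sketch (the excerpt does not state which software or settings the study used), an algorithm suite like the one listed above could be assembled and cross-validated in Python with scikit-learn roughly as follows; the dataset, model choices, and every parameter value here are placeholder assumptions, not taken from the original study.

    # Hypothetical illustration only: assumes a scikit-learn workflow,
    # which the excerpt does not specify.
    from sklearn.datasets import make_classification, make_regression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC
    from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
    from sklearn.gaussian_process import GaussianProcessClassifier
    from sklearn.linear_model import LogisticRegression, ElasticNet
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neural_network import MLPClassifier

    # Toy data standing in for the study's features and labels.
    X, y = make_classification(n_samples=500, n_features=10, random_state=0)

    # Classification algorithms named in the passage.
    classifiers = {
        "kNN": KNeighborsClassifier(n_neighbors=5),
        "SVM": SVC(kernel="rbf", C=1.0),
        "Random forest": RandomForestClassifier(n_estimators=200, random_state=0),
        "AdaBoost": AdaBoostClassifier(n_estimators=100, random_state=0),
        "Gaussian process": GaussianProcessClassifier(random_state=0),
        "Logistic regression": LogisticRegression(max_iter=1000),
        "Naive Bayes": GaussianNB(),
        "ANN (MLP)": MLPClassifier(hidden_layer_sizes=(50,), max_iter=1000,
                                   random_state=0),
    }

    for name, clf in classifiers.items():
        # Standardize features before distance- and gradient-based learners.
        model = make_pipeline(StandardScaler(), clf)
        scores = cross_val_score(model, X, y, cv=5)
        print(f"{name}: mean CV accuracy = {scores.mean():.3f}")

    # Elastic-Net, used for regression in the passage, on a toy regression task.
    Xr, yr = make_regression(n_samples=500, n_features=10, noise=0.1, random_state=0)
    enet = make_pipeline(StandardScaler(), ElasticNet(alpha=0.1, l1_ratio=0.5))
    print(f"Elastic-Net: mean CV R^2 = {cross_val_score(enet, Xr, yr, cv=5).mean():.3f}")
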