“…At present, the classification methods applied for grade prediction include random forest (RF) (19,34,37,40), logistic regression (LR) (35,36), naïve Bayes (NB) (35), support vector machine (SVM) (35,37,38,40), eXtreme gradient boosting (XGBoost) (37), multilayer perceptron (MLP) (37), and linear discriminant analysis (LDA) (38,39) (Table 3). Among these algorithms, the best numerical prediction performance was achieved by the tree-based XGBoost classifier, which, using a combination of features derived from multiple MRI sequences, yielded a high AUC of 0.97, a sensitivity of 1.0, and a specificity of 0.97 (37). The most widely used algorithms, however, are RF and SVM. RF is an ensemble method that aggregates multiple decision-tree classifiers built from identically distributed, independent random vectors (37,51), whereas SVM is a non-linear classifier that iteratively constructs a hyperplane, or a series of hyperplanes in a high-dimensional feature space, that separates the different classes (52,53).…”
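As an illustration only, and not the pipeline of any cited study, the comparison of classifier families described above can be sketched with scikit-learn on synthetic data. The feature matrix here is a hypothetical stand-in for radiomic features extracted from MRI sequences, and the binary label stands in for low- versus high-grade tumors; XGBoost itself needs the external `xgboost` package, so scikit-learn's `GradientBoostingClassifier` is used as a dependency-free tree-boosting stand-in.

```python
# Hedged sketch: comparing the classifier families named in the text on a
# synthetic binary "grade" task. The data are NOT real radiomic features.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for a radiomic feature matrix (rows = patients)
X, y = make_classification(n_samples=300, n_features=20,
                           n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

classifiers = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "NB": GaussianNB(),
    # probability=True enables predict_proba for AUC computation
    "SVM": SVC(kernel="rbf", probability=True, random_state=0),
    # stand-in for XGBoost, avoiding the external xgboost dependency
    "GB": GradientBoostingClassifier(random_state=0),
    "MLP": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                         random_state=0),
    "LDA": LinearDiscriminantAnalysis(),
}

aucs = {}
for name, clf in classifiers.items():
    clf.fit(X_tr, y_tr)
    aucs[name] = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {aucs[name]:.3f}")
```

In practice, the reported AUCs in the cited studies come from cross-validated evaluation on clinical cohorts; a held-out split on synthetic data, as here, only demonstrates the comparison mechanics.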