“…d. KNN hyperparameters: "n_neighbors": list(range(1, 31)), "weights": ["uniform", "distance"], "metric": ["euclidean", "manhattan"]. e. XGBoost hyperparameters: "n_estimators": [50, 100, 200], "learning_rate": [0.001, 0.01, 0.1, 1], "max_depth": [3, 4, 5], "subsample": [0.5, 0.7, 1.0], "colsample_bytree": [0.5, 0.7, 1.0]. f. AdaBoost hyperparameters: "n_estimators": [50, 100, 200], "learning_rate": [0.001, 0.01, 0.1, 1], "base_estimator": [DecisionTreeClassifier(max_depth=d) for d in [1, 2, 3]].…”
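The grids above can be sketched as scikit-learn-style parameter dictionaries fed to a grid search. This is a minimal illustration, not the authors' exact pipeline: the iris dataset, the 3-fold CV, and the choice to search only the KNN grid here are assumptions for brevity (the XGBoost grid is shown as a plain dict so the example does not require the `xgboost` package, and the AdaBoost key is `estimator`, the name scikit-learn >= 1.4 uses in place of `base_estimator`).

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# d. KNN grid, as listed in the text.
knn_grid = {
    "n_neighbors": list(range(1, 31)),
    "weights": ["uniform", "distance"],
    "metric": ["euclidean", "manhattan"],
}

# e. XGBoost grid (keys follow xgboost's scikit-learn wrapper, XGBClassifier).
xgb_grid = {
    "n_estimators": [50, 100, 200],
    "learning_rate": [0.001, 0.01, 0.1, 1],
    "max_depth": [3, 4, 5],
    "subsample": [0.5, 0.7, 1.0],
    "colsample_bytree": [0.5, 0.7, 1.0],
}

# f. AdaBoost grid; "estimator" replaces the older "base_estimator" name.
ada_grid = {
    "n_estimators": [50, 100, 200],
    "learning_rate": [0.001, 0.01, 0.1, 1],
    "estimator": [DecisionTreeClassifier(max_depth=d) for d in [1, 2, 3]],
}

# Illustrative search over the KNN grid on a stand-in dataset (iris).
X, y = load_iris(return_X_y=True)
search = GridSearchCV(KNeighborsClassifier(), knn_grid, cv=3, n_jobs=-1)
search.fit(X, y)
print(search.best_params_)
```

`GridSearchCV` exhaustively evaluates every combination (here 30 × 2 × 2 = 120 candidates × 3 folds), so the larger XGBoost grid (3 × 4 × 3 × 3 × 3 = 324 candidates) is noticeably more expensive to search.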