Improving the performance of classifiers is the realm of feature mapping, prototype selection, and kernel function transformations; these techniques aim to reduce the complexity of models while also improving their accuracy. In particular, our objective is to combine them to transform the data into a more convenient distribution, such that simple algorithms, like Naïve Bayes or k-Nearest Neighbors, can produce competitive classifiers. In this paper, we introduce a family of classifiers based on feature mapping and kernel functions, orchestrated by a model selection scheme that excels in performance. We provide an extensive experimental comparison of our methods with sixteen popular classifiers on more than thirty benchmarks, supporting our claims. Beyond their competitive performance, our statistical tests also found that our methods differ significantly from one another, supporting our claim of a compelling family of classifiers.