“…Minimizing theoretical bounds on the generalization error is an alternative approach to kernel selection. The widely used theoretical estimates usually introduce some measure of the complexity of the hypothesis space, such as the VC dimension (Vapnik 2000), the radius-margin bound (Vapnik 2000), maximal discrepancy (Bartlett, Boucheron, and Lugosi 2002), Rademacher complexity (Bartlett and Mendelson 2002), the compression coefficient (Luxburg, Bousquet, and Schölkopf 2004), eigenvalue perturbation (Liu, Jiang, and Liao 2013), spectral perturbation stability (Liu and Liao 2014a), kernel stability (Liu and Liao 2014b), and the covering number (Ding and Liao 2014). Unfortunately, the specific values of most of these measures are difficult to estimate (Nguyen and Ho 2007), which makes them hard to use for kernel selection in practice.…”
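One of the listed measures that does admit a closed-form empirical estimate is Rademacher complexity. As an illustrative sketch (not drawn from any of the cited papers), the following computes the empirical Rademacher complexity of the unit ball of an RKHS: for random sign vectors s, the supremum of (1/n)·Σ sᵢ f(xᵢ) over ‖f‖ ≤ 1 equals (1/n)·√(sᵀKs), which is averaged by Monte Carlo. The RBF kernel and all parameter values here are arbitrary choices for the demonstration.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def empirical_rademacher(K, n_draws=2000, rng=None):
    # For the unit ball {f : ||f||_H <= 1} of the RKHS with Gram matrix K,
    # sup_f (1/n) sum_i s_i f(x_i) = (1/n) * sqrt(s^T K s) by the
    # reproducing property; average this over random sign vectors s.
    rng = np.random.default_rng(rng)
    n = K.shape[0]
    s = rng.choice([-1.0, 1.0], size=(n_draws, n))
    quad = np.einsum('ij,jk,ik->i', s, K, s)  # s_i^T K s_i per draw
    return np.mean(np.sqrt(np.maximum(quad, 0.0))) / n
```

By Jensen's inequality the estimate is at most √(tr K)/n, the standard kernel-trace bound; comparing the estimate across candidate kernels (e.g. several RBF widths) is the kind of complexity-based ranking that bound-minimizing kernel selection relies on.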