Abstract. Recent approaches to the classification of text, images, and other types of structured data have launched the quest for positive definite (p.d.) kernels on probability measures. In particular, kernels based on the Jensen-Shannon (JS) divergence and other information-theoretic quantities have been proposed. We introduce new JS-type divergences by extending the two building blocks of the JS divergence: convexity and Shannon's entropy. These divergences are then used to define new information-theoretic kernels on measures. In particular, we introduce a new concept of q-convexity, for which a Jensen q-inequality is proved. Based on this inequality, we introduce the Jensen-Tsallis q-difference, a nonextensive generalization of the Jensen-Shannon divergence. Furthermore, we provide denormalization formulae for entropies and divergences, which we use to define a family of nonextensive information-theoretic kernels on measures. This family, grounded in nonextensive entropies, extends the Jensen-Shannon divergence kernels and allows assigning weights to the kernel's arguments.
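As a point of reference for the central object named above, the following is a minimal sketch of the Jensen-Tsallis q-difference in the weighted two-distribution case, stated under the standard definition of the Tsallis entropy; the exact form used in the paper (in particular the exponent q on the weights, which reflects the Jensen q-inequality) is an assumption here, given in the body of the text:
\[
  S_q(p) \;=\; \frac{1}{q-1}\Bigl(1 - \sum_{x} p(x)^q\Bigr),
  \qquad
  T_q^{\pi}(p_1, p_2) \;=\; S_q\bigl(\pi_1 p_1 + \pi_2 p_2\bigr)
    \;-\; \Bigl(\pi_1^{\,q}\, S_q(p_1) + \pi_2^{\,q}\, S_q(p_2)\Bigr),
\]
where $\pi = (\pi_1, \pi_2)$ are nonnegative weights summing to one. As $q \to 1$, $S_q$ recovers Shannon's entropy, and $T_1^{(1/2,\,1/2)}$ recovers the Jensen-Shannon divergence.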