The aim of this paper is to introduce new statistical criteria for estimation, suitable for inference in models with common continuous support. The proposal follows a renewed interest in divergence-based inference tools embedding the most classical ones, such as maximum likelihood, chi-square, or Kullback–Leibler. General pseudodistances with a decomposable structure are considered; they allow the definition of minimum pseudodistance estimators without the use of nonparametric density estimators. A special class of pseudodistances indexed by α > 0, leading as α ↓ 0 to the Kullback–Leibler divergence, is presented in detail. The corresponding estimation criteria are developed and their asymptotic properties are studied. The estimation method is then extended to regression models. Finally, some examples based on Monte Carlo simulations are discussed.
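To make the idea of a minimum divergence criterion that avoids nonparametric density estimation concrete, the following is a minimal sketch. It does not reproduce this paper's pseudodistance family; instead it uses the closely related density power divergence of Basu et al. with tuning index α > 0, which likewise recovers maximum likelihood as α ↓ 0 and is computable from the model density alone. The function names and the normal location–scale example are illustrative assumptions, and the grid search stands in for a proper numerical optimizer.

```python
import numpy as np

def dpd_criterion(mu, sigma, x, alpha):
    """Empirical density power divergence criterion (Basu et al. form) for a
    normal model: integral of f^(1+alpha) minus (1 + 1/alpha) * mean(f^alpha).
    No nonparametric density estimate of the data is needed."""
    # Closed-form value of the integral of the N(mu, sigma^2) density
    # raised to the power 1 + alpha.
    integral = (2.0 * np.pi * sigma**2) ** (-alpha / 2.0) / np.sqrt(1.0 + alpha)
    dens = np.exp(-(x - mu) ** 2 / (2.0 * sigma**2)) / np.sqrt(2.0 * np.pi * sigma**2)
    return integral - (1.0 + 1.0 / alpha) * np.mean(dens ** alpha)

def fit_normal_dpd(x, alpha, mus, sigmas):
    """Minimum divergence estimate by exhaustive grid search (illustrative only)."""
    best_mu, best_sigma, best_val = None, None, np.inf
    for mu in mus:
        for sigma in sigmas:
            val = dpd_criterion(mu, sigma, x, alpha)
            if val < best_val:
                best_mu, best_sigma, best_val = mu, sigma, val
    return best_mu, best_sigma

# Simulated sample with 5% gross outliers: a moderate alpha downweights them,
# illustrating the robustness motivation behind such criteria.
rng = np.random.default_rng(0)
x = rng.normal(1.0, 2.0, size=1000)
x[:50] = 25.0
mu_hat, sigma_hat = fit_normal_dpd(
    x, alpha=0.5,
    mus=np.linspace(0.0, 2.0, 81),
    sigmas=np.linspace(1.0, 4.0, 121),
)
```

For α near zero the criterion approaches the negative log-likelihood up to constants, so the fit approaches maximum likelihood; larger α trades efficiency for robustness against the contaminating observations.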