The aim of this paper is to introduce new statistical criteria for estimation, suitable for inference in models with a common continuous support. This proposal is in line with the renewed interest in divergence-based inference tools embedding the most classical ones, such as maximum likelihood, chi-square, or Kullback-Leibler. General pseudodistances with a decomposable structure are considered; they allow the definition of minimum pseudodistance estimators without recourse to nonparametric density estimators. A special class of pseudodistances indexed by α > 0, leading as α ↓ 0 to the Kullback-Leibler divergence, is presented in detail. The corresponding estimation criteria are developed and their asymptotic properties are studied. The estimation method is then extended to regression models. Finally, some examples based on Monte Carlo simulations are discussed.
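For orientation, a minimal sketch of one pseudodistance family of this type, written in the style of the γ-divergence literature (the notation p, q, λ, R_α is ours, and this need not be the paper's exact definition): for densities p, q with respect to a dominating measure λ and α > 0,
\[
R_\alpha(p,q) \;=\; \frac{1}{\alpha(\alpha+1)}\,\ln\!\int p^{\alpha+1}\,d\lambda \;+\; \frac{1}{\alpha+1}\,\ln\!\int q^{\alpha+1}\,d\lambda \;-\; \frac{1}{\alpha}\,\ln\!\int p\,q^{\alpha}\,d\lambda ,
\]
which is nonnegative by Hölder's inequality, vanishes only at p = q, and tends to the Kullback-Leibler divergence \(\int p \ln(p/q)\,d\lambda\) as α ↓ 0. Taking q = p_θ and replacing the cross term \(\int p\,p_\theta^{\alpha}\,d\lambda\) by its empirical counterpart yields a criterion involving only the model densities,
\[
\widehat{\theta}_\alpha \;=\; \arg\min_{\theta}\;\Big\{ \tfrac{1}{\alpha+1}\,\ln\!\int p_\theta^{\alpha+1}\,d\lambda \;-\; \tfrac{1}{\alpha}\,\ln \tfrac{1}{n}\sum_{i=1}^{n} p_\theta(X_i)^{\alpha} \Big\},
\]
which makes clear why no nonparametric density estimator is needed; letting α ↓ 0 formally recovers the maximum likelihood estimator.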
The class of dual φ-divergence estimators (introduced in Broniatowski and Keziou (2009) [5]) is explored with respect to robustness through the influence function approach. For scale and location models, this class is investigated in terms of robustness and asymptotic relative efficiency. Some hypothesis tests based on dual divergence criteria are proposed and their robustness properties are studied. The empirical performance of these estimators and tests is illustrated by Monte Carlo simulation for both non-contaminated and contaminated data.
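As background, a sketch of the dual representation underlying these estimators, in the form usual in the dual divergence literature (the notation D_φ, Θ, t, P_n is ours, not necessarily the paper's exact statement): for a convex φ with φ(1) = 0, the divergence between p_θ and p_{θ₀} admits the variational form
\[
D_\phi(\theta,\theta_0) \;=\; \sup_{t\in\Theta}\left\{ \int \phi'\!\Big(\frac{p_\theta}{p_t}\Big)\, dP_\theta \;-\; \int \Big[ \frac{p_\theta}{p_t}\,\phi'\!\Big(\frac{p_\theta}{p_t}\Big) - \phi\!\Big(\frac{p_\theta}{p_t}\Big) \Big]\, dP_{\theta_0} \right\},
\]
with the supremum attained at t = θ₀. A dual φ-divergence estimator is obtained, for a fixed escort parameter θ, by replacing \(P_{\theta_0}\) with the empirical measure \(P_n\) and taking the maximizing t. Because the criterion involves only model densities evaluated at the sample points, its influence function can be computed explicitly, which underlies the robustness analysis.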
This paper is devoted to robust hypothesis testing based on saddlepoint approximations in the framework of general parametric models. As is well known, two main problems can arise when using classical tests. First, models are approximations of reality, and slight deviations from them can render classical tests based on these models unreliable. Second, even if a model is correctly chosen, classical tests rest on first-order asymptotic theory, which can lead to inaccurate p-values when the sample size is moderate or small. To overcome these problems, robust tests based on dual divergence estimators and saddlepoint approximations, which perform well in small samples, are proposed.
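For orientation, the classical first-order saddlepoint approximation (Daniels' formula for the density of a sample mean; the paper treats more general test statistics, so this is only an illustration, with K and ŝ our notation): if K(s) = ln E[e^{sX}] is the cumulant generating function of X, then
\[
f_{\bar X_n}(x) \;\approx\; \sqrt{\frac{n}{2\pi\, K''(\hat s)}}\; \exp\!\big\{ n\big[K(\hat s) - \hat s\, x\big] \big\}, \qquad \text{where } \hat s \text{ solves } K'(\hat s) = x .
\]
Its relative error is typically O(n⁻¹), compared with O(n⁻¹ᐟ²) for the normal approximation, which is why saddlepoint-based tests can deliver accurate p-values even for small samples.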