2001
DOI: 10.1093/biomet/88.3.865

A comparison of related density-based minimum divergence estimators

Abstract: This paper compares the minimum divergence estimator of Basu, Harris, Hjort and Jones (1998) to a competing minimum divergence estimator which turns out to be equivalent to a method proposed from a different perspective by Windham (1995). Both methods can be applied for any parametric model, contain maximum likelihood as a special case, and can be extended to regression settings. Theoretical calculations are given to compare efficiencies under model conditions, and robustness properties are studied…
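To make the comparison concrete, here is a minimal sketch, under our own assumptions, of the first of the two estimators: the minimum density power divergence (DPD) estimator of Basu, Harris, Hjort and Jones (1998), fitted to a normal model. The model choice, the function names (dpd_objective, fit_dpd), and the tuning value alpha = 0.25 are illustrative assumptions, not from the paper; alpha → 0 recovers maximum likelihood, while larger alpha trades efficiency for robustness.

```python
# Minimal sketch of the minimum density power divergence (DPD)
# estimator of Basu et al. (1998) for N(mu, sigma^2).  Illustrative
# only: the model, names, and alpha value are our assumptions.
import numpy as np
from scipy import optimize, stats

def dpd_objective(params, x, alpha):
    # Empirical DPD criterion for alpha > 0:
    #   H_n(theta) = int f^(1+alpha) dx - (1 + 1/alpha) * mean(f(x_i)^alpha)
    # For the normal density the integral has the closed form
    #   (2*pi*sigma^2)^(-alpha/2) / sqrt(1 + alpha).
    mu, log_sigma = params
    sigma = np.exp(log_sigma)              # keeps sigma positive
    f = stats.norm.pdf(x, mu, sigma)
    integral = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return integral - (1 + 1 / alpha) * np.mean(f ** alpha)

def fit_dpd(x, alpha=0.25):
    # alpha -> 0 approaches maximum likelihood; larger alpha is more robust.
    start = np.array([np.median(x), np.log(x.std())])
    res = optimize.minimize(dpd_objective, start, args=(x, alpha),
                            method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])

# 5% contamination pulls the sample mean and standard deviation, but
# barely moves the DPD fit with a moderate alpha.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])
print(fit_dpd(x, alpha=0.25))   # roughly (0, 1)
print(x.mean(), x.std())        # shifted toward the outliers
```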

Cited by 114 publications (143 citation statements)
References 16 publications
“…For example, if the tuning parameter γ is large, then the gamma divergence is strongly robust but can be less efficient; conversely, if the tuning parameter takes a small positive value, the method is less robust but more efficient [35]. The relation between efficiency and the tuning parameter was discussed by several authors (see Jones et al. [85], Basu et al. [67], and Fujisawa and Eguchi [35]).…”
Section: Conclusion and Discussion (mentioning)
confidence: 99%
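The gamma divergence cited here as [35] (Fujisawa and Eguchi) admits a similarly small sketch. The code below is our own hypothetical illustration, not code from the cited work: it maximizes the empirical gamma-cross-entropy for a normal model, with the tuning parameter gamma playing exactly the robustness/efficiency role the excerpt describes.

```python
# Minimal sketch (not code from [35]) of the gamma-divergence estimator
# of Fujisawa and Eguchi for N(mu, sigma^2): maximize the empirical
# gamma-cross-entropy.  Names and the default gamma are our assumptions.
import numpy as np
from scipy import optimize, stats

def neg_gamma_cross_entropy(params, x, gamma):
    # Objective (to maximize):
    #   (1/gamma) * log(mean(f(x_i)^gamma))
    #     - (1/(1+gamma)) * log(int f^(1+gamma) dx)
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    f = stats.norm.pdf(x, mu, sigma)
    integral = (2 * np.pi * sigma**2) ** (-gamma / 2) / np.sqrt(1 + gamma)
    value = (np.log(np.mean(f ** gamma)) / gamma
             - np.log(integral) / (1 + gamma))
    return -value  # minimize the negative to maximize the objective

def fit_gamma_divergence(x, gamma=0.5):
    # Large gamma: strongly robust, less efficient; small gamma: the reverse.
    start = np.array([np.median(x), np.log(x.std())])
    res = optimize.minimize(neg_gamma_cross_entropy, start,
                            args=(x, gamma), method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])
```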
“…In the context of probability distributions, these weighting factors may control the influence of the likelihood ratios p_it/q_it. It has been shown in [55] and [56] that the parameter β determines a tradeoff between robustness to outliers (for β > 0) and efficiency (for β near 0). In the special case β = 1 the Euclidean distance is obtained, which is known to be more robust and less efficient than the Kullback-Leibler divergence (β = 0).…”
Section: Why Is AB-Divergence Potentially Robust? (mentioning)
confidence: 99%
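The β tradeoff described in this excerpt can be checked numerically. The following self-contained sketch is our own illustration (not code from [55] or [56]), assuming a normal location model with known scale and a contaminated sample: β near 0 tracks the efficient sample mean and is dragged by outliers, while larger β resists them.

```python
# Self-contained numerical check of the beta tradeoff (our illustration):
# minimum-DPD location estimate for N(mu, 1) via a simple grid search,
# which avoids local minima of the criterion under contamination.
import numpy as np
from scipy import stats

def dpd_location(x, beta, sigma=1.0):
    # Minimize the empirical DPD criterion over mu (sigma known).
    def crit(mu):
        f = stats.norm.pdf(x, mu, sigma)
        integral = (2 * np.pi * sigma**2) ** (-beta / 2) / np.sqrt(1 + beta)
        return integral - (1 + 1 / beta) * np.mean(f ** beta)
    grid = np.linspace(x.min(), x.max(), 2001)
    return min(grid, key=crit)

rng = np.random.default_rng(1)
clean = rng.normal(0.0, 1.0, 200)
dirty = np.concatenate([clean, rng.normal(8.0, 1.0, 20)])  # ~9% outliers
for beta in (0.01, 0.2, 0.5, 1.0):
    print(beta, round(dpd_location(clean, beta), 3),
          round(dpd_location(dirty, beta), 3))
# beta near 0 tracks the efficient sample mean (and is dragged by the
# outliers on `dirty`); larger beta stays near 0 on both samples.
```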
“…A wide class of parametric minimum divergence estimators may be found in Basu et al. (1998) and Jones et al. (2001). Other divergence-based estimators are found in Menéndez et al. (1995, 2001).…”
Section: Divergence-based Inference (mentioning)
confidence: 99%