2003
DOI: 10.1007/bf02517812

A new class of metric divergences on probability spaces and its applicability in statistics

Abstract: Dissimilarities, metric divergences, minimum distance estimators,

Cited by 180 publications (141 citation statements)
References 22 publications
“…Section 5 is devoted to the class of f-divergences of perimeter type, introduced and studied in Österreicher and Vajda (2003) and Vajda (2009). It is based on the class of entropies due to Arimoto (1971) and contains, next to the f-divergence given by (1), the total variation distance and a symmetrized version of the I-divergence, also the squared Hellinger distance (with f(u) = (√u − 1)²) and the squared Puri–Vincze distance.…”
Section: F. Österreicher (mentioning)
confidence: 99%
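For concreteness, here is a minimal sketch (not code from the cited paper; the function name is illustrative) of how such f-divergences can be evaluated for discrete distributions, using the generic form D_f(P, Q) = Σᵢ qᵢ f(pᵢ/qᵢ) and the choice f(u) = (√u − 1)² mentioned above for the squared Hellinger distance:

```python
import numpy as np

def f_divergence(p, q, f):
    """Discrete f-divergence D_f(P, Q) = sum_i q_i * f(p_i / q_i).

    Assumes p and q are probability vectors on the same finite support,
    with q_i > 0 wherever p_i > 0.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = q > 0  # terms with p_i = q_i = 0 contribute nothing
    return float(np.sum(q[mask] * f(p[mask] / q[mask])))

p = [0.2, 0.5, 0.3]
q = [0.3, 0.4, 0.3]

# f(u) = (sqrt(u) - 1)^2: squared Hellinger distance,
# which simplifies to sum_i (sqrt(p_i) - sqrt(q_i))^2
print(f_divergence(p, q, lambda u: (np.sqrt(u) - 1.0) ** 2))

# f(u) = |u - 1|: total variation distance V(P, Q) = sum_i |p_i - q_i|
print(f_divergence(p, q, lambda u: np.abs(u - 1.0)))
```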
“…In effect, the use of this weight in forming a convex combination of the two estimators can be viewed as pursuing the objective of minimizing the variation in the resultant estimator (12). If we make use of the optimum weight in the optimal convex estimator in (12), the result comes out in the form of a Stein-like estimator [20][21], where, for a given sample of data, shrinkage is from …”
Section: Empirical Calculation of the Weight (mentioning)
confidence: 99%
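To make the idea concrete, the sketch below (an illustration under stated assumptions, not the citing authors' procedure; optimal_convex_weight is a hypothetical helper) computes the variance-minimizing weight for a convex combination λT₁ + (1 − λ)T₂ of two estimators of the same quantity, using the standard result λ* = (Var T₂ − Cov(T₁, T₂)) / (Var T₁ + Var T₂ − 2 Cov(T₁, T₂)):

```python
import numpy as np

def optimal_convex_weight(t1, t2):
    """Variance-minimizing weight lambda* for lambda*T1 + (1 - lambda*)T2,
    estimated from paired replicates (e.g. bootstrap draws) of T1 and T2:
    lambda* = (Var T2 - Cov(T1, T2)) / (Var T1 + Var T2 - 2 Cov(T1, T2)).
    """
    c = np.cov(t1, t2)  # 2x2 sample covariance matrix
    return (c[1, 1] - c[0, 1]) / (c[0, 0] + c[1, 1] - 2.0 * c[0, 1])

# Toy replicates of two estimators of the same target (here, a mean of 1.0)
rng = np.random.default_rng(0)
t1 = 1.0 + rng.normal(0.0, 0.5, size=2000)  # higher-variance estimator
t2 = 1.0 + rng.normal(0.0, 0.2, size=2000)  # lower-variance estimator

lam = optimal_convex_weight(t1, t2)
combined = lam * t1 + (1.0 - lam) * t2
# The combination's variance is no larger than either estimator's alone
print(lam, combined.var(ddof=1), t1.var(ddof=1), t2.var(ddof=1))
```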
“…Although the Jensen-Shannon divergence does not satisfy the triangle inequality of a metric, the square root of the divergence does have the metric property (as shown in [27,28]).…”
Section: Description of JS Divergence (mentioning)
confidence: 99%
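As a small numerical illustration of that metric property (a sketch, not taken from the citing paper): scipy.spatial.distance.jensenshannon already returns the square root of the Jensen-Shannon divergence, so the triangle inequality can be checked directly on example distributions:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon  # returns sqrt(JSD)

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.8, 0.1])
r = np.array([0.3, 0.3, 0.4])

d_pq = jensenshannon(p, q)
d_qr = jensenshannon(q, r)
d_pr = jensenshannon(p, r)

# The square-rooted divergence satisfies the triangle inequality on this
# triple; the raw divergence d_pq**2 is not itself a metric.
assert d_pr <= d_pq + d_qr
print(d_pq, d_qr, d_pr, d_pq ** 2)
```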