2022
DOI: 10.48550/arxiv.2203.00863
Preprint

A Unifying Framework for Some Directed Distances in Statistics

Abstract: Density-based directed distances, particularly known as divergences, between probability distributions are widely used in statistics as well as in the adjacent research fields of information theory, artificial intelligence and machine learning. Prominent examples are the Kullback-Leibler information distance (relative entropy), which is, for example, closely connected to the omnipresent maximum likelihood estimation method, and Pearson's χ²-distance, which is, for example, used for the celebrated chi-square goodness-of-fit test. …
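To make the two divergences named in the abstract concrete, here is a minimal sketch (not from the paper itself) of the Kullback-Leibler information distance and Pearson's χ²-distance between two finite discrete probability distributions; the function names and the example distributions are illustrative assumptions:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler information distance (relative entropy):
    D(P || Q) = sum_i p_i * log(p_i / q_i).
    Terms with p_i = 0 contribute 0 by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def pearson_chi2(p, q):
    """Pearson's chi-square distance:
    chi2(P, Q) = sum_i (p_i - q_i)^2 / q_i."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

# Illustrative distributions (assumed, not taken from the paper):
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # strictly positive since p != q
print(pearson_chi2(p, q))
print(kl_divergence(p, p))  # 0.0: the distance of a distribution to itself
```

Both quantities are "directed" in the sense the abstract uses: they are generally not symmetric in their arguments, i.e. `kl_divergence(p, q) != kl_divergence(q, p)` in general.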

Cited by 0 publications

References 155 publications