2005
DOI: 10.1016/j.camwa.2004.07.017
A symmetric information divergence measure of the Csiszár's f-divergence class and its bounds

Cited by 16 publications (12 citation statements)
References 9 publications
“…Regardless of the outbreak simulated, OSF = 1 can be considered as a kind of threshold below which the loss of information increasingly impairs simulation quality in terms of goodness-of-fit and variance. As an overall measure of simulation GOF, we used and recommend the χ²-divergence, which is also a measure of the loss of information during the simulation process [49, 50].…”
Section: Results (mentioning)
Confidence: 99%
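The χ²-divergence invoked in the excerpt above has the standard Pearson form χ²(P, Q) = Σᵢ (pᵢ − qᵢ)²/qᵢ. A minimal sketch of such a goodness-of-fit comparison between an observed and a simulated distribution follows; the arrays and function name are illustrative and are not taken from the cited study.

```python
import numpy as np

def chi_square_divergence(p, q, eps=1e-12):
    """Pearson chi-square divergence sum_i (p_i - q_i)^2 / q_i between two
    discrete probability distributions given as same-length arrays."""
    p = np.asarray(p, dtype=float)
    q = np.clip(np.asarray(q, dtype=float), eps, None)  # avoid division by zero
    return float(np.sum((p - q) ** 2 / q))

# Illustrative "observed" vs. "simulated" distributions (made-up numbers,
# not data from the cited paper).
observed = np.array([0.50, 0.30, 0.15, 0.05])
simulated = np.array([0.45, 0.32, 0.17, 0.06])
print(chi_square_divergence(observed, simulated))
```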
“…Corresponding to the Kumar and Chhina [10] divergence measure, we proposed the divergence measure for FSs as follows:…”
Section: New Divergence for FSs (mentioning)
Confidence: 99%
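The measure cited here belongs to the Csiszár f-divergence class named in the paper's title, D_f(P‖Q) = Σᵢ qᵢ f(pᵢ/qᵢ) for a convex generator f with f(1) = 0. A brief sketch of that general template is given below with two familiar generators; the specific symmetric Kumar–Chhina generator is defined in the cited paper and is not reproduced here.

```python
import numpy as np

def f_divergence(p, q, f, eps=1e-12):
    """Generic Csiszar f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i)
    for a convex generator f with f(1) = 0."""
    p = np.asarray(p, dtype=float)
    q = np.clip(np.asarray(q, dtype=float), eps, None)
    return float(np.sum(q * f(p / q)))

# Two well-known members of the class (illustrative choices only):
kl_gen = lambda t: t * np.log(np.clip(t, 1e-12, None))  # Kullback-Leibler
chi2_gen = lambda t: (t - 1.0) ** 2                      # Pearson chi-square

p = np.array([0.4, 0.4, 0.2])
q = np.array([0.3, 0.5, 0.2])
print(f_divergence(p, q, kl_gen), f_divergence(p, q, chi2_gen))
```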
“…In the literature, various information measures have been proposed such that each definition enjoys some definite axiomatic or heuristic postulates, which lead to their extensive applications in different disciplines. A conventional categorization distinguishes these measures as parametric, non-parametric, and entropy-type measures of information [10]. Parametric measures determine the amount of information delivered by the object regarding an unknown parameter α and are functions of α.…”
Section: Introduction (mentioning)
Confidence: 99%
“…Another direct application of the method improves Theorem 34 in [1], which is an upper bound on Rényi's divergence in terms of the variational distance and relative information maximum, while providing a simpler proof for this type of inequality. Vajda's well-known "range of values theorem" (see [7]–[11]) is also recovered as an application.…”
Section: Introduction (mentioning)
Confidence: 99%
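The quantities compared in this last excerpt can be evaluated numerically for discrete distributions: Rényi's divergence D_α(P‖Q) = (α − 1)⁻¹ log Σᵢ pᵢ^α qᵢ^(1−α) and the variational (total variation) distance ½ Σᵢ |pᵢ − qᵢ|. The sketch below only computes the two quantities; the particular bound of Theorem 34 in [1] is not reproduced, and the example arrays are assumptions for illustration.

```python
import numpy as np

def renyi_divergence(p, q, alpha, eps=1e-12):
    """Renyi divergence D_alpha(P||Q) = log(sum_i p_i^alpha * q_i^(1-alpha)) / (alpha - 1)
    for alpha > 0, alpha != 1 (alpha -> 1 recovers Kullback-Leibler)."""
    p = np.clip(np.asarray(p, dtype=float), eps, None)
    q = np.clip(np.asarray(q, dtype=float), eps, None)
    return float(np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0))

def total_variation(p, q):
    """Variational (total variation) distance 0.5 * sum_i |p_i - q_i|."""
    return 0.5 * float(np.sum(np.abs(np.asarray(p, float) - np.asarray(q, float))))

p = np.array([0.4, 0.4, 0.2])
q = np.array([0.3, 0.5, 0.2])
print(renyi_divergence(p, q, alpha=2.0), total_variation(p, q))
```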