2011
DOI: 10.1016/j.physa.2011.03.012
Scaled Bregman divergences in a Tsallis scenario

Abstract: There exist two different versions of the Kullback-Leibler divergence (K-Ld) in Tsallis statistics, namely the usual generalized K-Ld and the generalized Bregman K-Ld, and problems have been encountered in trying to reconcile them. A condition for consistency between these two generalized K-Ld forms is derived by recourse to the additive duality of Tsallis statistics. It is also shown that the usual generalized K-Ld subjected to this additive duality, known as the dual generalized K-Ld, is a scaled Bregman divergence…
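The objects named in the abstract can be sketched numerically. The following is a minimal illustration, assuming the common conventions ln_q(x) = (x^(1-q) - 1)/(1-q) for the Tsallis q-logarithm, D_q(p‖r) = -Σ_i p_i ln_q(r_i/p_i) for the usual generalized K-Ld, and q* = 2 - q for the additive duality; the function names are illustrative, not from the paper:

```python
import numpy as np

def ln_q(x, q):
    """Tsallis q-logarithm: (x**(1-q) - 1)/(1-q); reduces to ln(x) as q -> 1."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x**(1.0 - q) - 1.0) / (1.0 - q)

def generalized_kld(p, r, q):
    """Usual generalized K-Ld (one common convention): -sum_i p_i * ln_q(r_i / p_i)."""
    p, r = np.asarray(p, dtype=float), np.asarray(r, dtype=float)
    return -np.sum(p * ln_q(r / p, q))

def dual_generalized_kld(p, r, q):
    """Dual generalized K-Ld: the usual form evaluated at the dual index q* = 2 - q."""
    return generalized_kld(p, r, 2.0 - q)
```

At q = 1 both forms reduce to the standard Kullback-Leibler divergence, and since q* = 2 - q is an involution, applying the additive duality twice recovers the usual generalized K-Ld.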

Cited by 4 publications (7 citation statements); references 36 publications (89 reference statements).
“…The extension of the generalization of the results derived in Ref. [14] is presented in Sections 4 and 5 of this Letter.…”
Section: Introduction (confidence: 71%)
“…Note that for mutual-information-based models, defining the scaled Bregman information as the normal-averages expectation of the dual generalized K-Ld [14], the Pythagorean theorem derived for the dual generalized K-Ld in this Letter provides the foundation to extend both the optimality of the minimum Bregman information principle [12], [32], which has immense utility in machine learning and allied disciplines, and the Bregman projection theorem to the case of deformed statistics. Finally, the Pythagorean theorem and the minimum dual generalized K-Ld principle developed in this Letter serve as a basis to generalize the concept of I-projections [33] to the case of deformed statistics.…”
Section: Introduction (confidence: 99%)
“…It is noteworthy to mention that the additive duality was recently employed to successfully demonstrate that the dual generalized Kullback-Leibler divergence is a scaled Bregman divergence [24,25]. This paper derives the necessary conditions to reconcile the dual Tsallis maximum entropy principle with the asymptotic frequencies obtained from large deviation theory (i.e.…”
confidence: 98%