2002
DOI: 10.2969/jmsj/1191593914
Free relative entropy for measures and a corresponding perturbation theory

Cited by 7 publications (15 citation statements). References 14 publications.
“…Lemma 2 is reminiscent of the well-known result that if the sigma-fields satisfy , then (29) As in the incremental-channel proof of [1, Th. 1], consider the setup in Fig.…”
Section: Lemma (mentioning)
confidence: 94%
“…To motivate the first expression, note that if is discrete, then (1) leads to (66) More generally, applying Theorem 1 to the conditional relative entropy and using (29), it is easy to obtain the following result.…”
Section: Lemma (mentioning)
confidence: 99%
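
The inline condition on the sigma-fields and equation (29) referenced in the two excerpts above are not reproduced on this page. For orientation only, the "well-known result" such statements usually invoke is the monotonicity of relative entropy under passing to a sub-sigma-field; whether (29) in the citing paper takes exactly this form is an assumption here:

\[
\mathcal{F}_1 \subseteq \mathcal{F}_2 \;\Longrightarrow\; D\!\left(P|_{\mathcal{F}_1} \,\middle\|\, Q|_{\mathcal{F}_1}\right) \;\le\; D\!\left(P|_{\mathcal{F}_2} \,\middle\|\, Q|_{\mathcal{F}_2}\right),
\]

where P|_F denotes the restriction of the measure P to the sub-sigma-field F.
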
“…We do not call this the "free relative entropy" introduced in [11], a slightly different relative entropy-like quantity Σ(µ, ν) for two probability measures in the framework of free probability. Indeed, the free relative entropy Σ(µ, ν) for µ, ν ∈ M(R) is defined as…”
Section: 2 (mentioning)
confidence: 99%
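
The excerpt above truncates the definition of Σ(µ, ν). As a sketch only, the free relative entropy of [11] is commonly written as the logarithmic energy of the signed measure µ − ν; the exact normalization used in [11] is not shown in the excerpt, so the following form is an assumption:

\[
\Sigma(\mu,\nu) \;=\; -\iint \log|x-y| \; d(\mu-\nu)(x)\, d(\mu-\nu)(y), \qquad \mu,\nu \in \mathcal{M}(\mathbb{R}),
\]

which is nonnegative whenever it is well defined, since µ − ν has total mass zero.
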
“…Note that Σ_Q(µ) is regarded as the relative version of the free entropy Σ(µ) introduced by D. Voiculescu [28], just as the classical relative entropy is the relative version of the Boltzmann-Gibbs entropy. (The "free relative entropy" Σ(µ, ν) for two measures was introduced in [11] from a slightly different viewpoint.) In this paper the relative free entropy Σ_Q(µ) is also introduced for µ ∈ M(T), the probability measures on the 1-dimensional torus T, relative to a real continuous function Q on T. An important fact is that the relative free entropy Σ_Q(µ) is the rate function (the so-called weighted logarithmic integral, up to an additive constant) of a large deviation principle for the empirical eigenvalue distribution of a certain random matrix.…”
Section: Introduction (mentioning)
confidence: 99%
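
As a sketch of the "weighted logarithmic integral" mentioned in the excerpt above, and assuming the standard weighted-potential normalization (the constant B below is the usual normalizing constant, which the excerpt leaves unspecified), the relative free entropy has the shape

\[
\Sigma_Q(\mu) \;=\; \int Q(x)\, d\mu(x) \;-\; \iint \log|x-y|\, d\mu(x)\, d\mu(y) \;-\; B,
\]

so that it vanishes exactly at the equilibrium measure of the potential Q, consistent with its role as the rate function of a large deviation principle for the empirical eigenvalue distribution.
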