1993
DOI: 10.1007/bf02100050
The analogues of entropy and of Fisher's information measure in free probability theory, I

Abstract: Analogues of the entropy and Fisher information measure for random variables in the context of free probability theory are introduced. Monotonicity properties and an analogue of the Cramér-Rao inequality are proved.


Cited by 225 publications (172 citation statements)
References 18 publications
“…entries [21]. Voiculescu has also introduced the free entropy (to be more precise, the free differential entropy); in the case of a univariate density p, it takes the form [23], [24]

    χ(p) = ∫∫ log|x − y| p(x) p(y) dx dy + 3/4 + (1/2) log 2π,    (91)

while the free Fisher information is defined as

    Φ(p) = (4π²/3) ∫ p(x)³ dx,    (92)

which is equal to the reciprocal of the variance in the case of the semicircle law. Among all distributions with second moment equal to σ², (91) is maximized by the semicircular law, which attains the value…”
Section: Free Relative Entropy
confidence: 99%
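The two univariate formulas quoted above can be checked numerically. A minimal sketch, assuming the forms Φ(p) = (4π²/3)∫p³ dx for the free Fisher information and the free Cramér-Rao inequality Φ · Var ≥ 1 (function names here are illustrative, not from the paper): the semicircle law attains equality, while the uniform law gives a strict inequality.

```python
import numpy as np

def trap(y, x):
    # Plain trapezoidal rule (avoids NumPy version differences in trapz/trapezoid).
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def free_fisher_information(density, a, b, n=100001):
    # Univariate free Fisher information: Phi(p) = (4*pi^2/3) * integral of p(x)^3 dx
    x = np.linspace(a, b, n)
    p = density(x)
    return (4.0 * np.pi**2 / 3.0) * trap(p**3, x)

def variance(density, a, b, n=100001):
    x = np.linspace(a, b, n)
    p = density(x)
    m = trap(x * p, x)
    return trap((x - m)**2 * p, x)

# Standard semicircle law: variance 1, support [-2, 2]
semicircle = lambda x: np.sqrt(np.clip(4.0 - x**2, 0.0, None)) / (2.0 * np.pi)
# Uniform law on [-1, 1], variance 1/3
uniform = lambda x: np.full_like(x, 0.5)

phi_sc = free_fisher_information(semicircle, -2.0, 2.0)
cr_uniform = free_fisher_information(uniform, -1.0, 1.0) * variance(uniform, -1.0, 1.0)
print(phi_sc)      # ≈ 1.0 = reciprocal of the variance: equality in the free Cramér-Rao bound
print(cr_uniform)  # ≈ pi^2/9 ≈ 1.097 > 1: strict inequality for a non-semicircular law
```

For the uniform law the product can be done by hand: Φ = (4π²/3) · 2 · (1/2)³ = π²/3 and Var = 1/3, so Φ · Var = π²/9 ≈ 1.097.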
“…In the case of a single variable a ∈ M_sa whose distribution measure (with respect to τ) is µ, χ(a) coincides with the free entropy Σ(µ) := ∫∫ log |x − y| dµ(x) dµ(y) of µ introduced in [15], up to an additive constant: …”
Section: Preliminaries
confidence: 99%
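The logarithmic energy Σ(µ) above is a double integral with an integrable log singularity on the diagonal, which a midpoint grid handles if the diagonal cells are dropped. For the variance-one semicircle law its value is −1/4 (the logarithmic potential of the semicircle is x²/4 − 1/2 on [−2, 2], so Σ = E[x²]/4 − 1/2). A minimal numerical sketch (the function name is illustrative):

```python
import numpy as np

def free_entropy_sigma(density, a, b, n=2500):
    # Sigma(mu) = double integral of log|x - y| dmu(x) dmu(y), midpoint rule;
    # the integrable log singularity on the diagonal is simply omitted.
    dx = (b - a) / n
    x = a + (np.arange(n) + 0.5) * dx
    w = density(x) * dx                      # cell masses, sum ~ 1
    diff = np.abs(x[:, None] - x[None, :])
    np.fill_diagonal(diff, 1.0)              # log 1 = 0 drops the diagonal cells
    return float(w @ np.log(diff) @ w)

# Standard semicircle law: variance 1, support [-2, 2]
semicircle = lambda x: np.sqrt(np.clip(4.0 - x**2, 0.0, None)) / (2.0 * np.pi)
sigma_sc = free_entropy_sigma(semicircle, -2.0, 2.0)
print(sigma_sc)   # ≈ -0.25
```

The diagonal-strip omission costs O(Δ log(1/Δ)), so a few thousand grid cells already give two to three correct digits.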
“…Biane and R. Speicher [5] for µ ∈ M(R), the probability measures on R, relative to a real continuous function Q on R with a certain growth condition. Note that Σ_Q(µ) is regarded as the relative version of the free entropy Σ(µ) introduced by D. Voiculescu [28], just as the classical relative entropy is the relative version of the Boltzmann-Gibbs entropy. (The “free relative entropy” Σ(µ, ν) for two measures was introduced in [11] from a slightly different viewpoint.)…”
Section: Introduction
confidence: 99%
“…In §2 of this paper we reprove Biane and Voiculescu's free TCI in a slightly more general setting by making use of random matrix approximation, and furthermore give a free TCI for measures on T in a similar way. Our initial motivation was to find another proof of Biane and Voiculescu's TCI by use of random matrix approximation along the lines of Voiculescu's heuristics in [28], and to justify their TCI as the right free analogue from the viewpoint of random matrix theory. In §2.1 we prove the free TCI…”
Section: Introduction
confidence: 99%