2004
DOI: 10.1137/s1064827503426711

Computation of the Entropy of Polynomials Orthogonal on an Interval

Abstract: We give an effective method to compute the entropy for polynomials orthogonal on a segment of the real axis that uses as input data only the coefficients of the recurrence relation satisfied by these polynomials. This algorithm is based on a series expression for the mutual energy of two probability measures naturally connected with the polynomials. The particular case of Gegenbauer polynomials is analyzed in detail. These results are applied also to the computation of the entropy of spherical harmonics, impor…
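The paper's series-based algorithm itself is not reproduced in this excerpt. As an illustrative sketch only (not the authors' method), the entropy in question can be evaluated directly by quadrature for a concrete case. The snippet below computes the Shannon entropy of the Rakhmanov density p_n(x)^2 w(x) for orthonormal Chebyshev polynomials of the first kind (the λ → 0 Gegenbauer case), using the substitution x = cos θ, under which the density becomes (2/π) cos²(nθ) dθ. The function name and the midpoint-rule discretization are choices made here for illustration.

```python
import numpy as np

def chebyshev_rakhmanov_entropy(n, num=200_000):
    """Shannon entropy of the Rakhmanov density of the orthonormal
    degree-n Chebyshev polynomial of the first kind (n >= 1).

    With x = cos(theta), p_n(x)^2 w(x) dx pulls back to f(theta) dtheta,
    f(theta) = (2/pi) cos^2(n*theta), and the entropy is
        S = -int_0^pi f(theta) * ln( f(theta) / sin(theta) ) dtheta.
    A midpoint rule avoids the integrable log singularities at 0 and pi.
    """
    h = np.pi / num
    theta = (np.arange(num) + 0.5) * h            # midpoints in (0, pi)
    f = (2.0 / np.pi) * np.cos(n * theta) ** 2    # transformed density
    g = f / np.sin(theta)                          # density in x, pulled back
    # f * ln(g) -> 0 wherever f -> 0, so mask the zeros of f explicitly
    integrand = np.where(f > 0, f * np.log(np.where(f > 0, g, 1.0)), 0.0)
    return -h * integrand.sum()
```

For this weight the integral can also be done in closed form via the Fourier expansion of ln sin θ, giving S_n = ln π − 1 − 1/(2n), which provides a convenient check: the numerical value approaches the universal limit ln π − 1 as n grows.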

Cited by 43 publications (49 citation statements)
References 32 publications
“…Let us recall that the angular entropy S(Y_{l,{μ}}), which appears in the basic expressions (92), (94) and (95), can be numerically evaluated by the highly efficient algorithm of Buyarov et al [53] and analytically calculated in a few cases, as already discussed in subsection 2.5. In particular, for (ns)-Rydberg states one has that…”
mentioning
confidence: 99%
“…The former has been examined theoretically [9,6] and numerically [3] for general orthogonal polynomials, while for the latter the Fisher information associated with translations of the variable (i.e., the locality Fisher information) has been studied both analytically [39] and numerically [11]. Here we extend this study by means of the computation of a more general concept, the parameter-based Fisher information of the polynomials ỹ_n(x; ).…”
Section: Parameter-based Fisher Information of Jacobi and Laguerre Polynomials
mentioning
confidence: 95%
“…In particular, he has shown that these probability densities govern the asymptotic behavior of the ratio p_{n+1}(x)/p_n(x) as n → ∞. On the other hand, these two fundamental and applied reasons have motivated an increasing interest in the determination of the spreading of the classical orthogonal polynomials {p_n(x)} throughout their interval of orthogonality by means of the information-theoretic measures of their corresponding Rakhmanov densities [3, 6-11, 39]. The Shannon information entropy of these densities has been examined numerically [3].…”
Section: Introduction
mentioning
confidence: 99%
“…[3] for a recent summary of the main achievements. The numerical computation of this entropic integral on finite intervals is most conveniently done by the effective method of Buyarov et al [4].…”
Section: Introduction
mentioning
confidence: 99%