2011
DOI: 10.1007/s10543-011-0363-z

Computation of connection coefficients and measure modifications for orthogonal polynomials

Abstract: We observe that polynomial measure modifications for families of univariate orthogonal polynomials imply sparse connection coefficient relations. We therefore propose connecting L² expansion coefficients between a polynomial family and a modified family by a sparse transformation. Accuracy and conditioning of the connection and its inverse are explored. The connection and recurrence coefficients can simultaneously be obtained as the Cholesky decomposition of a matrix polynomial involving the Jacobi matrix; th…
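
To make the Cholesky remark in the abstract concrete, here is a minimal sketch assuming the simplest case, a linear Christoffel modification dν = (x − c) dµ for the Legendre measure; it is not the paper's general matrix-polynomial algorithm. One Cholesky step on the Jacobi matrix yields both a sparse (bidiagonal) connection factor and the recurrence coefficients of the modified measure, which the script checks against moments of (x + 2) dx.

```python
import numpy as np

# Minimal sketch, assuming a linear Christoffel modification d(nu) = (x - c) d(mu)
# with x - c > 0 on the support; not the paper's general matrix-polynomial algorithm.
# Here mu is the Legendre measure on [-1, 1] and c = -2.

def legendre_jacobi(N):
    """N x N symmetric tridiagonal Jacobi matrix of the orthonormal Legendre family."""
    k = np.arange(1, N)
    b = k / np.sqrt(4.0 * k**2 - 1.0)       # off-diagonal recurrence coefficients
    return np.diag(b, 1) + np.diag(b, -1)   # diagonal entries are all zero

N, c = 12, -2.0
J = legendre_jacobi(N)

# Cholesky factor of the shifted Jacobi matrix, J - c I = L L^T.  L is lower
# bidiagonal and is itself the connection matrix between the two orthonormal
# families, p_n = L[n, n] q_n + L[n, n-1] q_{n-1}, so the connection is sparse.
L = np.linalg.cholesky(J - c * np.eye(N))

# Reversing the factors gives the Jacobi matrix of the modified measure
# (x - c) d(mu); only the leading (N-1) x (N-1) block of the truncation is exact.
J_mod = (L.T @ L + c * np.eye(N))[:-1, :-1]

# Sanity check via Golub-Welsch: the Gauss rule built from J_mod must reproduce
# moments of (x + 2) dx on [-1, 1], computed here with a long Gauss-Legendre rule.
nodes, vecs = np.linalg.eigh(J_mod)
weights = 4.0 * vecs[0, :] ** 2             # total mass of (x + 2) dx on [-1, 1] is 4
x_gl, w_gl = np.polynomial.legendre.leggauss(60)
for m in range(6):
    exact = np.sum(w_gl * (x_gl + 2.0) * x_gl**m)
    assert np.isclose(weights @ nodes**m, exact)
```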

Cited by 9 publications (9 citation statements); references 30 publications (39 reference statements).
“…The statistical significance of main effects and interactions was assessed using Wald statistics. We used orthogonal polynomials, computed via the Cholesky decomposition to avoid high correlation between original “time” scale variables, which can cause estimation problems (Narayan & Hesthaven, 2011). SAS 9.3 (SAS Institute Inc., Cary, NC, USA) was used for the analysis.…”
Section: Methods (mentioning)
confidence: 99%
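
A hypothetical illustration of the Cholesky-based orthogonalization described in this statement, not the cited study's SAS procedure: powers of a raw time variable are nearly collinear, and Cholesky-factoring their Gram matrix produces orthonormal polynomial regressors that span the same trend space.

```python
import numpy as np

# Hypothetical illustration, not the cited study's SAS procedure: powers of a raw
# "time" variable are nearly collinear; Cholesky-factoring their Gram matrix gives
# orthonormal polynomial regressors that span the same cubic-trend space.
t = np.linspace(0.0, 1.0, 200)                 # raw time variable
X = np.vander(t, N=4, increasing=True)         # columns 1, t, t^2, t^3

G = X.T @ X                                    # Gram matrix of the raw powers
R = np.linalg.cholesky(G).T                    # G = R^T R with R upper triangular
Q = np.linalg.solve(R.T, X.T).T                # Q = X R^{-1}, so Q^T Q = I

print(np.corrcoef(X[:, 1:].T).round(3))        # raw powers: correlations near 1
print(np.allclose(Q.T @ Q, np.eye(4)))         # orthogonalized regressors: True
```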
“…We note that O(N log(N)) fast algorithms can be applied to obtain B_{k,j}; see, e.g., [2] for β = δ = 0 and [27,32] for β, δ > −1 when x_j are Gauss-Chebyshev quadrature points (δ = γ = −1/2).…”
Section: Regularity (mentioning)
confidence: 99%
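
As a hedged illustration of one such O(N log(N)) transform at Gauss-Chebyshev points (δ = γ = −1/2), and not the specific algorithms of [2], [27], or [32]: values at the Chebyshev nodes can be converted to Chebyshev coefficients with a single DCT.

```python
import numpy as np
from scipy.fft import dct

# Illustrative only: the classical O(N log N) Chebyshev transform at Gauss-Chebyshev
# points x_j = cos(pi*(j+1/2)/N).  Values f(x_j) are mapped to coefficients c_k with
# f(x) = sum_k c_k T_k(x) via a type-II discrete cosine transform.

def chebyshev_coeffs(f_vals):
    N = len(f_vals)
    c = dct(f_vals, type=2) / N    # DCT-II: y_k = 2 * sum_j f_j cos(pi*k*(2j+1)/(2N))
    c[0] /= 2.0
    return c

N = 16
j = np.arange(N)
x = np.cos(np.pi * (j + 0.5) / N)          # Gauss-Chebyshev nodes
f = lambda y: np.exp(y) * np.sin(3 * y)
c = chebyshev_coeffs(f(x))

# Check: the degree-(N-1) Chebyshev series interpolates f at the nodes.
assert np.allclose(np.polynomial.chebyshev.chebval(x, c), f(x))
```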
“…where p(x) is a polynomial, non-negative on the support of µ. This is a well-studied problem [10,6,7,19]. In particular, one may reduce the problem to iterating over modifications by linear and quadratic polynomials.…”
Section: Measure Modifications (mentioning)
confidence: 99%
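
A small sketch of the reduction mentioned here (illustrative only, not code from [10,6,7,19]): a modification polynomial p, non-negative on the support, can be split numerically into real linear factors and quadratic factors built from complex-conjugate root pairs, so each low-degree modification can be applied in turn.

```python
import numpy as np

# Illustrative sketch: split a modification polynomial into real linear factors and
# quadratic factors from complex-conjugate root pairs, so the measure modification
# can be applied one low-degree factor at a time.
def linear_quadratic_factors(coeffs):
    """coeffs: polynomial coefficients, highest degree first (np.roots convention)."""
    roots = np.roots(coeffs)
    real_roots = sorted(r.real for r in roots if abs(r.imag) < 1e-10)
    upper_half = [r for r in roots if r.imag >= 1e-10]        # one root per conjugate pair
    linear = [np.array([1.0, -r]) for r in real_roots]         # factors (x - r)
    quadratic = [np.array([1.0, -2.0 * r.real, abs(r) ** 2])   # (x - r)(x - conj(r))
                 for r in upper_half]
    return linear, quadratic

# Example: p(x) = (x - 2)(x - 3)(x^2 + 1), which is positive on [-1, 1].
p = np.polymul(np.polymul([1.0, -2.0], [1.0, -3.0]), [1.0, 0.0, 1.0])
lin, quad = linear_quadratic_factors(p)
print([f.round(6) for f in lin])    # (x - 2) and (x - 3)
print([f.round(6) for f in quad])   # x^2 + 1
```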
“…, M + A + 2(n − j) with modification factor (u − (x_{j,n} − x))². (19), and return F̂^c_n(x; µ^(α,ρ)) given by (20). Algorithm 3: Computation of F̂^c_n(x) for µ^(α,ρ)_{HF} corresponding to a half-line Freud weight.…”
Section: Computing F̂^c_n (mentioning)
confidence: 99%