Proceedings of the 36th International Symposium on Symbolic and Algebraic Computation 2011
DOI: 10.1145/1993886.1993909
Diversification improves interpolation

Abstract: We consider the problem of interpolating an unknown multivariate polynomial with coefficients taken from a finite field or as numerical approximations of complex numbers. Building on the recent work of Garg and Schost, we improve on the best-known algorithm for interpolation over large finite fields by presenting a Las Vegas randomized algorithm that uses fewer black box evaluations. Using related techniques, we also address numerical interpolation of sparse polynomials with complex coefficients, and provide t…
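The titular diversification can be illustrated with a small sketch (the prime, the example polynomial, and all names below are our own illustrative choices, not taken from the paper): replacing f(x) by f(αx) for a random α scales the coefficient of x^e by α^e, so terms that share a coefficient in f almost surely receive distinct coefficients in f(αx), which helps match images of the same term across different reductions.

```python
import random

p = 10007                      # illustrative prime; arithmetic is in GF(p)
# f(x) = 4x^3 + 4x^11 + 4x^20: three terms with identical coefficients
terms = {3: 4, 11: 4, 20: 4}

# Random diversifier alpha in [2, p-2]; f(alpha*x) has coefficients
# a_e * alpha^e mod p for each term a_e * x^e of f.
alpha = random.randrange(2, p - 1)
diversified = {e: (c * pow(alpha, e, p)) % p for e, c in terms.items()}

coeffs = list(diversified.values())
print(len(set(coeffs)) == len(coeffs))  # distinct coefficients
```

For this particular choice of p and exponent set, any α in [2, p−2] already separates the three coefficients; in general the separation holds with high probability over the choice of α.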

Cited by 31 publications (57 citation statements). References 32 publications.
“…One may assume that some polynomial residues f(ω_i) mod (x^(p−1) + · · · + 1) are faulty. The Chinese remaindering of the term exponents with several p can be done by diversification [9]. The sparsest shift algorithms in [6] can be modified similarly.…”
Section: Future Work
confidence: 99%
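The Chinese remaindering of term exponents mentioned in the quote above can be sketched as follows (the exponent, the primes, and the function name are illustrative, not from the paper): each reduction reveals an exponent modulo a small prime, and CRT reassembles the true exponent once the product of the moduli exceeds it.

```python
from math import prod

def crt(residues, moduli):
    # Chinese Remainder Theorem for pairwise-coprime moduli:
    # returns the unique x mod prod(moduli) with x = r_i (mod m_i).
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(Mi, -1, m): modular inverse
    return x % M

e = 12345                              # hypothetical term exponent
primes = [101, 103, 107]               # product 1113121 > e
residues = [e % p for p in primes]
print(crt(residues, primes))           # -> 12345
```

Faulty residues, as contemplated in the quote, would have to be detected or outvoted before such a reconstruction; this sketch assumes all residues are correct.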
“…Hybrid symbolic-numeric algorithms have traditionally assumed that the input scalars have been perturbed by some relative error. For instance, if one allows substantial oversampling, a sparse polynomial can be recovered from noisy numerical values where each value has a relative error of 0.5 [9], perhaps even 0.99 with still more oversampling. Here we consider errors in the Hamming-distance sense, i.e., several incorrect values without any assumption on their accuracy.…”
Section: Introduction
confidence: 99%
“…Finding coefficients after finding the exponents is quite straightforward. Giesbrecht and Roche [13] studied polynomial interpolation over a remainder black box, which is somewhat similar to the modular black box introduced in [12]. A Las Vegas randomized algorithm that uses fewer black box evaluations was presented in [13] for this remainder black box.…”
Section: Introduction
confidence: 99%
“…Giesbrecht and Roche [13] studied polynomial interpolation over a remainder black box, which is somewhat similar to the modular black box introduced in [12]. A Las Vegas randomized algorithm that uses fewer black box evaluations was presented in [13] for this remainder black box. Arnold, Giesbrecht and Roche [2] devised a Monte Carlo algorithm for interpolating polynomials represented by straight-line programs.…”
Section: Introduction
confidence: 99%
“…Our investigations are motivated by numeric sparse polynomial interpolation algorithms [11, 12, 20, 13]. In the Ben-Or/Tiwari version of this algorithm, one first computes, for a sparse polynomial f(x) ∈ C[x], the linear generator for h_ℓ = f(ω^(ℓ+1)), ℓ = 0, 1, 2, .…”
Section: Introduction
confidence: 99%
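The Ben-Or/Tiwari step quoted above, finding a linear generator for the evaluation sequence h_ℓ = f(ω^(ℓ+1)), can be sketched numerically for a tiny example (the polynomial, N, and t below are our own choices; real implementations use Berlekamp–Massey and far more careful numerics):

```python
import numpy as np

# Hypothetical example: f(x) = 3x^2 + 5x^7 with t = 2 terms,
# exponents assumed below N.
N = 16
w = np.exp(2j * np.pi / N)              # primitive N-th root of unity
t = 2
h = [3 * w**(2 * (l + 1)) + 5 * w**(7 * (l + 1)) for l in range(2 * t)]

# The generator Lambda(z) = z^t + c_{t-1} z^{t-1} + ... + c_0 annihilates h;
# its coefficients solve a t x t Hankel system built from the evaluations.
H = np.array([[h[i + j] for j in range(t)] for i in range(t)])
rhs = -np.array([h[t + i] for i in range(t)])
c = np.linalg.solve(H, rhs)             # c = (c_0, ..., c_{t-1})

# The roots of Lambda are w^{e_j}; read the exponents off their angles.
roots = np.roots([1.0] + list(c[::-1]))
exps = sorted(round(np.angle(r) / (2 * np.pi / N)) % N for r in roots)
print(exps)  # -> [2, 7]
```

Once the exponents are known, the coefficients follow from a (transposed Vandermonde) linear solve, which matches the quote's remark that finding coefficients after the exponents is straightforward.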