We consider the problem of reconstructing an unknown function f on a domain X from samples of f at n points drawn at random with respect to a given measure ρX. Given a sequence of linear spaces (Vm)m>0 with dim(Vm) = m ≤ n, we study least squares approximations from the spaces Vm. It is well known that such approximations can be inaccurate when m is too close to n, even when the samples are noiseless. Our main result provides a criterion on m that describes the amount of regularization needed to ensure that the least squares method is stable and that its accuracy, measured in L2(X, ρX), is comparable to the best approximation error of f by elements of Vm. We illustrate this criterion for various approximation schemes, such as trigonometric polynomials, with ρX being the uniform measure, and algebraic polynomials, with ρX being either the uniform or the Chebyshev measure. For such examples we also prove similar stability results using deterministic samples that are equispaced with respect to these measures.
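The setting above can be illustrated with a minimal numerical sketch (not the paper's method): we fit a smooth function from n random uniform samples by least squares in an m-dimensional trigonometric space, keeping m well below n, which is the stable regime the abstract refers to. The target function f and the sizes n, m are illustrative choices.

```python
import numpy as np

def trig_basis(x, m):
    """Real trigonometric basis 1, cos(2*pi*k*x), sin(2*pi*k*x) on [0, 1]; m columns."""
    cols = [np.ones_like(x)]
    k = 1
    while len(cols) < m:
        cols.append(np.cos(2 * np.pi * k * x))
        if len(cols) < m:
            cols.append(np.sin(2 * np.pi * k * x))
        k += 1
    return np.column_stack(cols)

def least_squares_fit(f, n, m, rng):
    """Sample f at n uniform random points and solve the m-dim least squares problem."""
    x = rng.uniform(0.0, 1.0, size=n)   # random samples w.r.t. the uniform measure
    A = trig_basis(x, m)
    coeffs, *_ = np.linalg.lstsq(A, f(x), rcond=None)
    return coeffs

rng = np.random.default_rng(0)
f = lambda x: np.exp(np.sin(2 * np.pi * x))
coeffs = least_squares_fit(f, n=500, m=11, rng=rng)   # m << n: stable regime

# estimate the L2(rho_X) error on a fine grid
xs = np.linspace(0.0, 1.0, 2001)
err = np.sqrt(np.mean((trig_basis(xs, 11) @ coeffs - f(xs)) ** 2))
print(f"L2 error with m=11, n=500: {err:.2e}")
```

Taking m close to n instead (say m = 450) makes the system ill-conditioned and the fit unreliable between sample points, which is the instability the abstract's criterion on m is designed to rule out.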
Abstract. We prove the following Whitney estimate. Given 0 < p ≤ ∞, r ∈ N, and d ≥ 1, there exists a constant C(d, r, p), depending only on these three parameters, such that for every bounded convex domain Ω ⊂ R^d and each function f ∈ Lp(Ω),

E_{r−1}(f, Ω)_p ≤ C(d, r, p) ω_r(f, diam(Ω))_p,

where E_{r−1}(f, Ω)_p is the degree of approximation by polynomials of total degree r − 1, and ω_r(f, ·)_p is the modulus of smoothness of order r. Estimates like this can be found in the literature, but with constants that depend in an essential way on the geometry of the domain; in particular, the domain is assumed to be a Lipschitz domain, and the above constant C depends on the minimal head-angle of the cones associated with the boundary. The estimates we obtain allow us to extend to the multivariate case the results on bivariate "skinny" B-spaces of Karaivanov and Petrushev on characterizing nonlinear approximation from nested triangulations. In a sense, our results were anticipated by Karaivanov and Petrushev.
The Binary Space Partition (BSP) technique is a simple and efficient method to adaptively partition a given initial domain to match the geometry of a given input function. As such, the BSP technique has been widely used by practitioners, but until now no rigorous mathematical justification for it has been offered. Here we attempt to put the technique on sound mathematical foundations, and we offer an enhancement of the BSP algorithm in the spirit of what we are going to call geometric wavelets. This new approach to sparse geometric representation is based on recent developments in the theory of multivariate nonlinear piecewise polynomial approximation. We provide numerical examples of n-term geometric wavelet approximations of known test images and compare them with dyadic wavelet approximation. We also discuss applications to image denoising and compression.
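The adaptive-partition idea can be seen in a toy 1-D analogue (illustrative only, not the paper's algorithm): repeatedly split the cell whose best constant fit is worst, choosing the cut that minimizes the children's total squared error, so the partition automatically refines near the function's singularities. The test signal and sizes are arbitrary choices.

```python
import numpy as np

def cell_error(y):
    """Squared error of the best constant approximation on a cell's samples."""
    return float(np.sum((y - y.mean()) ** 2)) if y.size else 0.0

def best_split(y, lo, hi):
    """Interior cut minimizing the summed squared error of the two children."""
    return min(range(lo + 1, hi),
               key=lambda m: cell_error(y[lo:m]) + cell_error(y[m:hi]))

def adaptive_partition(y, n_cells):
    """Greedily refine half-open index ranges into n_cells adaptive cells."""
    cells = [(0, y.size)]
    while len(cells) < n_cells:
        lo, hi = max(cells, key=lambda c: cell_error(y[c[0]:c[1]]))
        if hi - lo < 2:                      # nothing left to split
            break
        cells.remove((lo, hi))
        m = best_split(y, lo, hi)
        cells += [(lo, m), (m, hi)]
    return sorted(cells)

x = np.linspace(0.0, 1.0, 1024)
y = np.where(x < 0.3, 0.0, 1.0) + 0.1 * x    # a jump at x = 0.3
cells = adaptive_partition(y, n_cells=8)
approx = np.concatenate([np.full(hi - lo, y[lo:hi].mean()) for lo, hi in cells])
print("max error:", float(np.abs(approx - y).max()))
```

The first cut lands at the jump and later cuts subdivide the remaining smooth slope, whereas a fixed dyadic partition of the same size would waste cells away from the singularity; this adaptivity to geometry is what the n-term geometric wavelet approximations exploit in 2-D.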
Abstract. We study nonlinear m-term approximation with regard to a redundant dictionary D in a Hilbert space H. It is known that the Pure Greedy Algorithm (or, more generally, the Weak Greedy Algorithm) provides for each f ∈ H and any dictionary D an expansion into a series f = Σ_{j=1}^∞ c_j(f) g_j(f), with g_j(f) ∈ D.
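A minimal sketch of the Pure Greedy Algorithm in R^d (a finite-dimensional stand-in for H): at each step pick the dictionary element with the largest inner product with the current residual and subtract its projection, which produces exactly an expansion f = Σ_j c_j g_j plus a residual. The random dictionary and target f are illustrative.

```python
import numpy as np

def pure_greedy(f, dictionary, steps):
    """Run the Pure Greedy Algorithm; dictionary rows are unit-norm atoms."""
    residual = f.astype(float)
    coeffs, picks = [], []
    for _ in range(steps):
        inner = dictionary @ residual        # <residual, g> for every atom g
        j = int(np.argmax(np.abs(inner)))    # best-matching atom
        c = float(inner[j])
        residual = residual - c * dictionary[j]   # subtract projection on g_j
        coeffs.append(c)
        picks.append(j)
    return coeffs, picks, residual

rng = np.random.default_rng(1)
atoms = rng.normal(size=(50, 8))
atoms /= np.linalg.norm(atoms, axis=1, keepdims=True)   # unit-norm dictionary
f = rng.normal(size=8)
coeffs, picks, residual = pure_greedy(f, atoms, steps=30)
print("residual norm after 30 steps:", float(np.linalg.norm(residual)))
```

Each step makes the new residual orthogonal to the chosen atom, so the residual norm is non-increasing; the rate at which it decays for general dictionaries is precisely the kind of question such abstracts address.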