2019
DOI: 10.48550/arxiv.1912.07996
Preprint

Tensor Rank bounds for Point Singularities in $\mathbb{R}^3$

Abstract: We analyze rates of approximation by quantized, tensor-structured representations of functions with isolated point singularities in $\mathbb{R}^3$. We consider functions in countably normed Sobolev spaces with radial weights and analytic- or Gevrey-type control of weighted semi-norms. Several classes of boundary value and eigenvalue problems from science and engineering are discussed whose solutions belong to the countably normed spaces. It is shown that quantized, tensor-structured approximations of functions in these…

Cited by 2 publications (2 citation statements). References 2 publications.
“…The quantized tensor train (QTT) decomposition introduced in [43,63] allows the representation complexity of multidimensional tensors to be reduced further, to logarithmic scale in the volume size, O(d log n). QTT tensor decompositions are nowadays widely used in the solution of multidimensional problems in numerical analysis; see, for example, [65,17,37,38,53,11] and [49,19,66,58,1,54,58], as well as the recent books on tensor numerical methods in computational quantum chemistry and in scientific computing [41,44] and references therein.…”
Section: Discussion
confidence: 99%
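The O(d log n) scaling mentioned above can be illustrated with a minimal numpy sketch (not taken from the paper or the citing works; `tt_svd`, the tolerance, and the test function are illustrative): samples of a function on a grid of 2^d points are "quantized" into a d-way binary tensor and compressed by sequential truncated SVDs. For a separable function such as exp(x) on a uniform grid, every QTT rank equals 1, so 2^d values are stored in O(d) parameters.

```python
import numpy as np

def tt_svd(tensor, tol=1e-10):
    """Tensor-train (TT) decomposition by sequential truncated SVDs."""
    dims = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = max(1, int(np.sum(s > tol * s[0])))  # truncate small singular values
        cores.append(u[:, :r].reshape(rank, dims[k], r))
        mat = (s[:r, None] * vt[:r]).reshape(r * dims[k + 1], -1)
        rank = r
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract the TT cores back into the full tensor."""
    res = cores[0]
    for c in cores[1:]:
        res = np.tensordot(res, c, axes=([res.ndim - 1], [0]))
    return res.reshape([c.shape[1] for c in cores])

# "Quantize" 2**d samples of the separable function exp(x) into a d-way
# binary tensor; its exact QTT ranks are 1, so the SVDs truncate hard.
d = 12
x = np.linspace(0.0, 1.0, 2**d)
tensor = np.exp(x).reshape((2,) * d)
cores = tt_svd(tensor)
ranks = [c.shape[2] for c in cores[:-1]]
err = np.max(np.abs(tt_to_full(cores) - tensor))
```

For non-separable functions the QTT ranks are larger but, for the singular functions analyzed in the paper, still bounded polylogarithmically in the accuracy.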
“…Tensor numerical methods [1,2] provide means to overcome the exponential increase of numerical complexity in the problem dimension d, owing to their intrinsic feature of reducing the computational cost of multilinear algebra on rank-structured data to merely linear scaling in both the grid size n and the dimension d. They emerged as a bridge between the algebraic tensor decompositions initiated in chemometrics [3][4][5][6][7][8][9][10] and the nonlinear approximation theory of separable low-rank representations of multivariate functions and operators [11][12][13]. The canonical [14,15], Tucker [16], tensor train (TT) [17,18], and hierarchical Tucker (HT) [19] formats are the most commonly used rank-structured parametrizations in applications of modern tensor numerical methods.…”
Section: Introduction
confidence: 99%
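Of the formats named above, the Tucker format admits a particularly compact sketch via the truncated higher-order SVD (HOSVD): one SVD per mode unfolding gives the factor matrices, and contracting them against the tensor gives the core. The code below is an illustrative numpy implementation, not taken from the cited references; `hosvd` and the tolerance are assumptions for the example.

```python
import numpy as np

def hosvd(tensor, tol=1e-10):
    """Truncated higher-order SVD: Tucker factors from mode unfoldings."""
    factors = []
    for mode in range(tensor.ndim):
        unfold = np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)
        u, s, _ = np.linalg.svd(unfold, full_matrices=False)
        r = max(1, int(np.sum(s > tol * s[0])))  # mode-wise rank truncation
        factors.append(u[:, :r])
    core = tensor
    for u in factors:
        # Contracting axis 0 each time cycles the modes, so after ndim
        # contractions the core axes come back in their original order.
        core = np.tensordot(core, u, axes=([0], [0]))
    return core, factors

def tucker_to_full(core, factors):
    """Reassemble the full tensor from the Tucker core and factors."""
    rec = core
    for u in factors:
        rec = np.tensordot(rec, u, axes=([0], [1]))
    return rec

# A separable (rank-(1,1,1)) tensor compresses to a 1x1x1 Tucker core.
a = np.exp(np.linspace(0.0, 1.0, 8))
b = np.cos(np.linspace(0.0, 1.0, 9))
c = np.sin(np.linspace(0.1, 1.0, 10))
tensor = np.einsum('i,j,k->ijk', a, b, c)
core, factors = hosvd(tensor)
err = np.max(np.abs(tucker_to_full(core, factors) - tensor))
```

The Tucker core is d-dimensional, so its storage still grows exponentially in d; the TT and HT formats mentioned in the quote avoid this by chaining the factorization, which is what the QTT rank bounds of the paper exploit.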