2022
DOI: 10.1002/nla.2467
Computing f‐divergences and distances of high‐dimensional probability density functions

Abstract: Very often, in the course of uncertainty quantification tasks or data analysis, one has to deal with high‐dimensional random variables. Here the interest is mainly to compute characterizations like the entropy, the Kullback–Leibler divergence, more general f‐divergences, or other such characteristics based on the probability density. The density is often not available directly, and it is a computational challenge to just represent it in a numerically feasible fashion in case the dimension is even modera…
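To make concrete what such characterizations require, the sketch below evaluates a general f‐divergence between two densities discretized on a full tensor grid. This is not the paper's method; the function names and test densities are assumptions for illustration only. It also makes plain why the full-grid route fails in high dimension: the grid holds n^d values, which is exactly what motivates the compressed low‐rank tensor representations the abstract refers to.

```python
import numpy as np

def f_divergence(p, q, f):
    """D_f(p || q) = sum_x q(x) * f(p(x) / q(x)), with p, q on a common grid.

    Assumes strictly positive, normalized densities stored as full tensors,
    so memory is n**d floats -- feasible only for small dimension d.
    """
    return np.sum(q * f(p / q))

kl = lambda t: t * np.log(t)                   # generator of the Kullback-Leibler divergence
hellinger = lambda t: (np.sqrt(t) - 1.0)**2    # generator of twice the squared Hellinger distance

# Usage: d = 3, n = 50 points per axis (already 125,000 grid values).
rng = np.random.default_rng(1)
p = rng.random((50, 50, 50)); p /= p.sum()
q = rng.random((50, 50, 50)); q /= q.sum()
print(f_divergence(p, q, kl), f_divergence(p, q, hellinger))
```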

Cited by 2 publications (2 citation statements)
References 133 publications (413 reference statements)
“…NISQ superconducting devices struggle to include entanglement among qubit triplets and structures more complex than pairs [25], leading to short-range entanglement that gives tensor network techniques a comparable computational capability [26]. This successful application to computational complexity reduction in the quantum framework intertwines with widespread applications in data science [27–30] and hierarchical tensor geometry [31–33]. In machine learning, hierarchical structures typify certain information filtering procedures, naturally framing tensor network renormalization for these applications, as done by means of wavelets [34].…”
Section: Introduction
Mentioning, confidence: 99%
“…For problems of data analysis or uncertainty quantification one must deal with high‐dimensional random variables and the corresponding probability density function. Litvinenko et al. [3] propose to represent the discretized probability density in a low‐rank tensor format, which, however, makes those tasks computationally difficult in which some function of the probability density is required that needs a point‐wise representation of the tensor. The arising computational problem becomes tractable when the compressed data are considered as an element of an associative, commutative algebra with an inner product, and the corresponding matrix algorithms are used.…”
Mentioning, confidence: 99%
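The algebra view mentioned in this excerpt can be made concrete with a small sketch. The following is not the authors' implementation; it assumes a CP (canonical polyadic) tensor format, stored as a list of factor matrices, and shows the two operations that stay cheap in that format: the Hadamard (point‐wise) product, whose rank multiplies, and the Euclidean inner product, which never forms the full tensor.

```python
import numpy as np

# Minimal sketch (not the authors' code): a d-dimensional tensor in CP format
# is a list of factor matrices U[k] of shape (n_k, R); the full tensor is
# sum_r U[0][:, r] x U[1][:, r] x ... x U[d-1][:, r].

def cp_hadamard(U, V):
    """Hadamard product of two CP tensors; the result has rank R * S,
    one column per pair (r, s) of input columns."""
    return [np.einsum('ir,is->irs', Uk, Vk).reshape(Uk.shape[0], -1)
            for Uk, Vk in zip(U, V)]

def cp_inner(U, V):
    """Euclidean inner product <U, V> without forming the full tensors:
    multiply the mode-wise Gram matrices entry-wise, then sum."""
    M = np.ones((U[0].shape[1], V[0].shape[1]))
    for Uk, Vk in zip(U, V):
        M *= Uk.T @ Vk
    return M.sum()

# Usage: two rank-2 CP tensors in d = 4 dimensions, 10 points per mode.
rng = np.random.default_rng(0)
U = [rng.random((10, 2)) for _ in range(4)]
V = [rng.random((10, 2)) for _ in range(4)]
W = cp_hadamard(U, V)       # rank grows to 2 * 2 = 4
print(cp_inner(U, V))       # scalar, cost O(d * n * R * S)
```

Point‐wise functions such as the logarithm needed for the Kullback–Leibler divergence are not closed in this format, which is precisely the difficulty the excerpt describes and which the cited matrix algorithms are meant to overcome.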