2017
DOI: 10.1561/2200000067
Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 2 Applications and Future Perspectives

Cited by 187 publications (130 citation statements)
References 210 publications
“…Now, dimensionality reduction is an essential element of the engineering (the "practical man") approach to mathematical modeling [4]. Many model reduction methods were developed and successfully implemented in applications, from various versions of principal component analysis to approximation by manifolds, graphs, and complexes [5][6][7], and low-rank tensor network decompositions [8,9]. Various reasons and forms of the curse of dimensionality were classified and studied, from the obvious combinatorial explosion (for example, for n binary Boolean attributes, to check all the combinations of values we have to analyze 2^n cases) to more sophisticated distance concentration: in a high-dimensional space, the distances between randomly selected points tend to concentrate near their mean value, and the neighbor-based methods of data analysis become useless in their standard forms [10,11]. Many "good" polynomial time algorithms become useless in high dimensions. Surprisingly, however, and despite the expected challenges and difficulties, common-sense heuristics based on the simple and the most straightforward methods "can yield results which are almost surely optimal" for high-dimensional problems [12].…”
mentioning
confidence: 99%
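The distance-concentration effect described in the quote above is easy to verify numerically. The following minimal Python/NumPy sketch (sample sizes and dimensions are arbitrary illustrative choices, not taken from the cited works) draws random points in [0,1]^d and shows the relative spread of pairwise distances shrinking as d grows:

    import numpy as np

    # Distance concentration: for random points in [0,1]^d, the ratio
    # std/mean of pairwise distances shrinks as d grows, which is why
    # neighbor-based methods lose discriminative power in high dimensions.
    rng = np.random.default_rng(0)
    n = 200
    for d in (2, 10, 100, 1000, 10000):
        X = rng.random((n, d))
        sq = (X ** 2).sum(axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T   # squared pairwise distances
        dist = np.sqrt(np.maximum(d2[np.triu_indices(n, k=1)], 0.0))
        print(f"d={d:6d}  relative spread = {dist.std() / dist.mean():.4f}")

Typical output shows the ratio falling roughly like 1/sqrt(d), matching the quoted observation that nearest-neighbor distinctions wash out in high dimensions.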
“…where |p̃^[n]⟩ acts as the predicted label corresponding to the nth sample. Based on these, we derive a highly efficient training algorithm inspired by MERA [11]. The cost function to be minimized is chosen as…”
Section: Tree TN and the Algorithm
mentioning
confidence: 99%
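For orientation only, here is a minimal NumPy sketch of a forward pass through a binary tree tensor network (TTN) classifier of the kind the quoted passage refers to. It is a structural illustration under assumed sizes, with random (non-isometric) tensors and a commonly used feature map; it is not the authors' MERA-inspired training algorithm, whose cost function is truncated above:

    import numpy as np

    # Sketch of a binary tree tensor network (TTN) classifier pass.
    # Each input sample is encoded as a list of d-dimensional feature
    # vectors (one per pixel/site), then coarse-grained layer by layer.
    # All sizes, tensors, and the feature map are illustrative assumptions.
    d, n_sites, n_classes = 2, 8, 10
    rng = np.random.default_rng(0)

    def feature_map(x):
        # Encode a scalar pixel x in [0, 1] as a 2-vector (a common TN encoding).
        return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

    # One rank-3 tensor per pair of sites per layer: shape (d, d, d),
    # merging two child vectors into one parent vector. (A trained TTN
    # would constrain these to be isometries; they are random here.)
    layers, sites = [], n_sites
    while sites > 2:
        layers.append(rng.standard_normal((sites // 2, d, d, d)))
        sites //= 2
    top = rng.standard_normal((n_classes, d, d))   # top tensor -> label vector

    def predict(pixels):
        vecs = [feature_map(x) for x in pixels]
        for layer in layers:
            vecs = [np.einsum('ijk,j,k->i', layer[m], vecs[2 * m], vecs[2 * m + 1])
                    for m in range(len(vecs) // 2)]
        p = np.einsum('cjk,j,k->c', top, vecs[0], vecs[1])  # plays the role of the predicted label
        return int(np.argmax(np.abs(p)))

    print(predict(rng.random(n_sites)))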
“…As one of the most powerful numerical tools for studying quantum many-body systems [6][7][8][9], tensor networks (TNs) have drawn increasing attention. For instance, TNs have recently been applied to solve machine learning problems such as dimensionality reduction [10,11] and handwriting recognition [12,13]. Just as a TN allows the numerical treatment of difficult physical systems by providing layers of abstraction, deep learning has achieved similarly striking advances in automated feature extraction and pattern recognition using a hierarchical representation [14].…”
Section: Introduction
mentioning
confidence: 99%
“…If we directly use (7) to compute X_{n+1}, we need to store and update each entry of X_n during the iterative process. The resulting time and space complexities are prohibitive.…”
Section: Optimization of Calculation
mentioning
confidence: 99%
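As a back-of-the-envelope illustration of the storage problem the quote describes (generic, since the update (7) itself is not reproduced here): keeping an m x m iterate X_n entrywise costs O(m^2) memory, while a rank-r factorization X_n ≈ U_n V_n^T, a standard remedy not attributed here to the cited paper, needs only O(mr):

    # Hypothetical dense fixed-point iteration X_{n+1} = f(X_n) on an
    # m x m matrix: entrywise storage is O(m^2), while a rank-r
    # factorization X_n ≈ U_n @ V_n.T needs only O(m r) entries.
    m, r = 10_000, 5
    dense_entries = m * m            # 100,000,000 entries per iterate
    factored_entries = 2 * m * r     # 100,000 entries for U_n and V_n
    print(dense_entries // factored_entries)   # 1000x fewer entries stored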
“…Moreover, tensors have also been applied to clustering and classification in some recent studies [5,6]. A comprehensive survey of the applications of tensors can be found in [7]. In these applications, a crucial task is to fill in the missing values of the tensor, namely, tensor completion.…”
Section: Introduction
mentioning
confidence: 99%
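To make the tensor-completion task concrete, here is a minimal NumPy sketch that fits a rank-R CP model to the observed entries of a partially known tensor by gradient descent and then predicts the missing ones; the sizes, rank, and optimizer are illustrative assumptions, not a method taken from the cited works:

    import numpy as np

    # Tensor completion sketch: fit a rank-R CP model
    # X ≈ sum_r a_r ∘ b_r ∘ c_r to the observed entries only,
    # then read off predictions for the missing entries.
    rng = np.random.default_rng(0)
    I, J, K, R = 20, 20, 20, 3

    # Ground-truth low-rank tensor with 70% of entries missing.
    A0, B0, C0 = (rng.standard_normal((n, R)) for n in (I, J, K))
    X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
    mask = rng.random((I, J, K)) < 0.3             # True where observed

    A, B, C = (0.1 * rng.standard_normal((n, R)) for n in (I, J, K))
    lr = 1e-3
    for _ in range(5000):
        E = mask * (np.einsum('ir,jr,kr->ijk', A, B, C) - X)   # residual on observed entries
        A -= lr * np.einsum('ijk,jr,kr->ir', E, B, C)
        B -= lr * np.einsum('ijk,ir,kr->jr', E, A, C)
        C -= lr * np.einsum('ijk,ir,jr->kr', E, A, B)

    miss = ~mask
    rel_err = (np.linalg.norm(miss * (np.einsum('ir,jr,kr->ijk', A, B, C) - X))
               / np.linalg.norm(miss * X))
    print(f"relative error on missing entries: {rel_err:.3f}")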