2015
DOI: 10.1007/s00371-015-1130-y

Lossy volume compression using Tucker truncation and thresholding

Abstract: Tensor decompositions, in particular the Tucker model, are a powerful family of techniques for dimensionality reduction and are being increasingly used for compactly encoding large multidimensional arrays, images and other visual data sets. In interactive applications, volume data often needs to be decompressed and manipulated dynamically; when designing data reduction and reconstruction methods, several parameters must be taken into account, such as the achievable compression ratio, approximation error and re…

Cited by 34 publications (31 citation statements)
References 27 publications
“…Progressive tensor rank reduction (the so-called truncation; see later sections) has been shown to reveal features and structural details at different scales in volume data [21]. Further recent efforts in the context of tensor compression include [2,9,[22][23][24] for interactive volume rendering and visualization, [25] for 3D displays, [26] for integral histograms of images and volumes, and [27][28][29][30] for reflectance fields, among others. The large-scale renderer TAMRESH [23] resembles block-transform coding in that the input volume is partitioned in small multiresolution cubic bricks; each brick is then compressed as a separate HOSVD core.…”
Section: Compressors Based on Tensor Decomposition
confidence: 99%
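The truncated Tucker pipeline the quote refers to (rank truncation of an HOSVD, with each factor holding the leading singular vectors of a mode unfolding) can be sketched in plain NumPy. This is an illustrative sketch, not the paper's or TAMRESH's implementation; `hosvd` and `reconstruct` are our names.

```python
import numpy as np

def hosvd(x, ranks):
    # Truncated higher-order SVD of a 3-way array: for each mode n,
    # keep the leading left singular vectors of the mode-n unfolding.
    factors = []
    for n, r in enumerate(ranks):
        unfolding = np.moveaxis(x, n, 0).reshape(x.shape[n], -1)
        u, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(u[:, :r])
    # Core = input contracted with the transposed factor matrices.
    core = x
    for n, u in enumerate(factors):
        core = np.moveaxis(np.tensordot(core, u.T, axes=(n, 1)), -1, n)
    return core, factors

def reconstruct(core, factors):
    # Mode-n products of the core with each factor matrix.
    x = core
    for n, u in enumerate(factors):
        x = np.moveaxis(np.tensordot(x, u, axes=(n, 1)), -1, n)
    return x

vol = np.random.rand(16, 16, 16)
core, factors = hosvd(vol, ranks=(8, 8, 8))
approx = reconstruct(core, factors)
err = np.linalg.norm(vol - approx) / np.linalg.norm(vol)
```

Lowering `ranks` is the "progressive rank reduction" of the quote: smaller cores discard fine-scale detail first, which is what reveals structure at different scales.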
“…The large-scale renderer TAMRESH [23] resembles block-transform coding in that the input volume is partitioned in small multiresolution cubic bricks; each brick is then compressed as a separate HOSVD core. Recently, Tucker core hard thresholding combined with factor matrix quantization was shown [2] to yield better compression rate than slice-wise truncating the core. These points have motivated the compressor proposed here.…”
Section: Compressors Based on Tensor Decomposition
confidence: 99%
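The core hard thresholding mentioned in the quote — keeping only the largest-magnitude Tucker core coefficients instead of truncating whole core slices — can be sketched as follows. `hard_threshold` is our name for this illustrative routine; ties at the cutoff magnitude are all kept, so slightly more than `k` entries may survive.

```python
import numpy as np

def hard_threshold(core, k):
    # Zero out all but the k largest-magnitude core coefficients.
    flat = np.abs(core).ravel()
    if k >= flat.size:
        return core.copy()
    # k-th largest magnitude, found via a partial sort.
    cutoff = np.partition(flat, flat.size - k)[flat.size - k]
    return np.where(np.abs(core) >= cutoff, core, 0.0)

core = np.arange(27, dtype=float).reshape(3, 3, 3) - 13.0  # values -13..13
sparse = hard_threshold(core, 6)
```

Because Tucker core energy concentrates in few coefficients, thresholding by magnitude tends to preserve more signal per stored coefficient than discarding entire trailing slices, which is the comparison the cited result [2] makes.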
“…The work by Ballester‐Ripoll and Pajarola [BRP16] aimed to improve the standard Tucker decomposition. They first presented core truncation, where they discarded the least significant ranks, or segments of the basis function matrices, and corresponding segments of the reduced core tensor.…”
Section: Lossy Compression
confidence: 99%
“…The work by Ballester-Ripoll and Pajarola [BRP16] aimed to improve the standard Tucker decomposition. They first presented core truncation, where they discarded the least significant ranks. Given the input data/tensor X, the least-squares fitting results in a core tensor G and three basis factor matrices A, B and C; their product is considered an approximation of X.…”
Section: Tensor Decomposition
confidence: 99%
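For intuition on the compression ratios such a Tucker model (core G plus factors A, B, C) can target, a naive element-count estimate is shown below. This ignores quantization and entropy coding, which the actual compressors add, so it is only a rough upper-structure bound; the function name is ours.

```python
import numpy as np

def tucker_element_ratio(shape, ranks):
    # Elements in the full volume vs. elements stored by the Tucker
    # model: a core of size r1*r2*r3 plus one (I_n x r_n) factor per mode.
    full = int(np.prod(shape))
    model = int(np.prod(ranks)) + sum(i * r for i, r in zip(shape, ranks))
    return full / model

# 256^3 volume at ranks (32, 32, 32):
# core 32768 elements + factors 3 * 256*32 = 24576, total 57344.
ratio = tucker_element_ratio((256, 256, 256), (32, 32, 32))
```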