Abstract: Tensor decompositions, in particular the Tucker model, are a powerful family of techniques for dimensionality reduction and are increasingly used for compactly encoding large multidimensional arrays, images and other visual data sets. In interactive applications, volume data often needs to be decompressed and manipulated dynamically; when designing data reduction and reconstruction methods, several parameters must be taken into account, such as the achievable compression ratio, approximation error and re…
“…Progressive tensor rank reduction (so-called truncation; see later sections) has been shown to reveal features and structural details at different scales in volume data [21]. Further recent efforts in the context of tensor compression include [2,9,22–24] for interactive volume rendering and visualization, [25] for 3D displays, [26] for integral histograms of images and volumes, and [27–30] for reflectance fields, among others. The large-scale renderer TAMRESH [23] resembles block-transform coding in that the input volume is partitioned into small multiresolution cubic bricks; each brick is then compressed as a separate HOSVD core.…”
Section: Compressors Based On Tensor Decomposition
Citation type: mentioning (confidence: 99%)
“…The large-scale renderer TAMRESH [23] resembles block-transform coding in that the input volume is partitioned into small multiresolution cubic bricks; each brick is then compressed as a separate HOSVD core. Recently, Tucker core hard thresholding combined with factor matrix quantization was shown [2] to yield better compression rates than slice-wise truncation of the core. These points have motivated the compressor proposed here.…”
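The comparison this snippet describes — keeping the largest-magnitude core coefficients anywhere in the core (hard thresholding) versus keeping a leading sub-core (slice-wise truncation) — can be sketched in a few lines of numpy. The toy core with a decaying "hot corner" and the fixed coefficient budget below are illustrative assumptions, not the cited authors' actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a Tucker/HOSVD core: energy concentrated near the
# low-index "hot corner", mimicked here by a 1/(1+i+j+k) decay.
n = 8
decay = 1.0 + np.indices((n, n, n)).sum(axis=0)
core = rng.standard_normal((n, n, n)) / decay

budget = 64  # number of core coefficients either scheme may keep

# (a) Slice-wise rank truncation: keep only the leading 4x4x4 sub-core.
truncated = np.zeros_like(core)
truncated[:4, :4, :4] = core[:4, :4, :4]

# (b) Hard thresholding: keep the `budget` largest-magnitude entries,
# wherever they sit in the core.
thresh = np.sort(np.abs(core), axis=None)[-budget]
hard = np.where(np.abs(core) >= thresh, core, 0.0)

err_trunc = np.linalg.norm(core - truncated)
err_hard = np.linalg.norm(core - hard)
# For an equal coefficient budget, hard thresholding is never worse in
# Frobenius norm: it keeps the provably optimal support.
assert err_hard <= err_trunc
```

The advantage comes from the fact that significant coefficients are concentrated near, but not confined to, the hot corner; a fixed sub-core inevitably spends budget on small entries while discarding large ones outside it.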
Section: Compressors Based On Tensor Decomposition
Citation type: mentioning (confidence: 99%)
“…[20], [22], [23] and [24]. Right: the full core approach first considered in [2] and here extended into a full-fledged compressor with adaptive thresholding and bit-plane coding.…”
Fig. 1. (a) Original: a 512³ isotropic turbulence volume [1] (512MB); (b) visually identical result after 10:1 compression (51.2MB); (c) result after extreme 300:1 compression (1.71MB).

Abstract — Memory and network bandwidth are decisive bottlenecks when handling high-resolution multidimensional data sets in visualization applications, and they increasingly demand suitable data compression strategies. We introduce a novel lossy compression algorithm for multidimensional data over regular grids. It leverages the higher-order singular value decomposition (HOSVD), a generalization of the SVD to three dimensions and higher, together with bit-plane, run-length and arithmetic coding to compress the HOSVD transform coefficients. Our scheme degrades the data particularly smoothly and achieves lower mean squared error than other state-of-the-art algorithms at low-to-medium bit rates, as required in data archiving and management for visualization purposes. Further advantages of the proposed algorithm include very fine bit-rate selection granularity and the ability to manipulate data at very small cost in the compression domain, for example to reconstruct filtered and/or subsampled versions of all (or selected parts) of the data set.
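The bit-plane stage mentioned in the abstract can be illustrated with a minimal sketch: quantized coefficient magnitudes are transmitted one bit plane at a time, most significant first, so the reconstruction refines progressively and the stream can be cut at any point. The uniform 8-bit quantizer below is an illustrative assumption, not the paper's actual coder (which additionally applies run-length and arithmetic coding to the planes):

```python
import numpy as np

rng = np.random.default_rng(1)
# Decaying transform coefficients standing in for sorted HOSVD core entries.
coeffs = rng.standard_normal(32) * np.exp(-0.2 * np.arange(32))

# Uniform scalar quantization to 8-bit magnitudes; signs are kept aside.
nbits = 8
scale = (2**nbits - 1) / np.abs(coeffs).max()
q = np.round(np.abs(coeffs) * scale).astype(np.uint16)
sign = np.sign(coeffs)

# Transmit bit planes MSB-first; the decoder refines after each plane.
errors = []
recon_q = np.zeros_like(q)
for b in range(nbits - 1, -1, -1):
    recon_q |= ((q >> b) & 1) << b   # merge in this bit plane
    recon = sign * recon_q / scale   # dequantize with the known signs
    errors.append(np.linalg.norm(coeffs - recon))

# Error is non-increasing as planes arrive, down to the quantization floor.
assert all(e2 <= e1 + 1e-12 for e1, e2 in zip(errors, errors[1:]))
```

Truncating the stream after any plane (or mid-plane, in a real coder) still yields a valid coarser approximation, which is what gives bit-plane coders their very fine rate-selection granularity.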
“…The work by Ballester‐Ripoll and Pajarola [BRP16] aimed to improve the standard Tucker decomposition. They first presented core truncation, where they discarded the least significant ranks, or segments of the basis function matrices, and corresponding segments of the reduced core tensor.…”
Section: Lossy Compression
Citation type: mentioning (confidence: 99%)
“…The work by Ballester-Ripoll and Pajarola [BRP16] aimed to improve the standard Tucker decomposition. They first presented core truncation, where they discarded the least significant ranks, or segments of the basis function matrices. Given the input data/tensor X, the least-squares fitting results in a core tensor G and three basis factor matrices A, B and C; G, together with the factor matrices, is considered an approximation of X.…”
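The decomposition this snippet describes — a core tensor G plus three factor matrices A, B and C fitted from the data — corresponds for an SVD-based fit to the plain HOSVD, which can be sketched as follows. The function names and the random test tensor are illustrative, not taken from the cited work:

```python
import numpy as np

def unfold(t, mode):
    """Mode-n unfolding: the given mode becomes rows, all others columns."""
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def mode_mult(t, m, mode):
    """n-mode product: multiply matrix m along the given mode of tensor t."""
    return np.moveaxis(np.tensordot(m, np.moveaxis(t, mode, 0), axes=1), 0, mode)

def hosvd(x):
    """Plain (untruncated) HOSVD: one SVD per mode unfolding gives the
    factor matrices; projecting x onto them gives the core."""
    factors = [np.linalg.svd(unfold(x, mode), full_matrices=False)[0]
               for mode in range(x.ndim)]
    core = x
    for mode, u in enumerate(factors):
        core = mode_mult(core, u.T, mode)
    return core, factors

rng = np.random.default_rng(2)
x = rng.standard_normal((4, 5, 6))
core, (a, b, c) = hosvd(x)

# Reconstruct X from G and the factors; exact here, since no rank was truncated.
recon = core
for mode, u in enumerate((a, b, c)):
    recon = mode_mult(recon, u, mode)
assert np.allclose(recon, x)
```

Core truncation as described above then amounts to dropping trailing columns of A, B and C and the corresponding slices of G before reconstruction, trading accuracy for a smaller representation.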
Data reduction is increasingly applied to scientific data for numerical simulations, scientific visualization and data analysis. It is most often used to lower I/O and storage costs, and sometimes to lower in-memory data size as well. In this paper, we consider five categories of data reduction techniques based on their information loss: (1) truly lossless, (2) near lossless, (3) lossy, (4) mesh reduction and (5) derived representations. We then survey available techniques in each of these categories, summarize their properties from a practical point of view and discuss relative merits within each category. We believe this work will enable simulation scientists and visualization/data analysis scientists to decide which data reduction techniques will be most helpful for their needs.
Tensor decomposition methods and multilinear algebra are powerful tools for coping with the challenges posed by multidimensional and multivariate data in computer graphics, image processing and data visualization, in particular with respect to compact representation and processing of increasingly large-scale data sets. Initially proposed as an extension of the concept of matrix rank to three and more dimensions, tensor decomposition methods have found applications in a remarkably wide range of disciplines. We briefly review the main concepts of tensor decompositions and their application to multidimensional visual data, and include a first outlook on porting these techniques to multivariate data such as vector and tensor fields.