ACM SIGGRAPH 2006 Research Posters (SIGGRAPH '06), 2006
DOI: 10.1145/1179622.1179768
Azimuth-rotated vector quantization for BTF compression

Cited by 2 publications (3 citation statements) · References 3 publications
“…Although all the above-mentioned methods allow very fast BTF rendering, they mostly achieve relatively moderate compression ratios, less than 1:200. Only some of them allow fast GPU implementation, only two ([LM01, KM06]) use a variant of vector quantization, and only a few of them allow importance sampling of the BTF without reconstruction. Our method is designed to provide all the features required for CPU-based and GPU-based rendering algorithms.…”
Section: Previous Work
confidence: 99%
“…This method was also applied for compression of psychophysically reduced BTF data in [FCGH08]. Another BTF vector quantization approach, based on azimuthal rotation of the resampled data F_x, was mentioned in [KM06]. [LM01] introduced a BTF recognition method that captured surface appearance under different illumination and viewing conditions by using three-dimensional (3D) textons, constructed by K-means clustering of responses to selective linear filters applied at individual planar positions in the BTF.…”
Section: Previous Work
confidence: 99%
“…Combining factorization with a clustering method like K-means [MMK03, TZL*02] applied to the latent representation enables the use of fewer coefficients per cluster. Other factorization techniques include hierarchical tensor decomposition methods applied to the high-dimensional BTF [WWS*05, RK09] and vector quantization methods based on codebooks [KM06, HFM10, EV14].…”
Section: Related Work
confidence: 99%
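As a rough illustration of "factorization plus clustering of the latent representation" mentioned in the statement above, the hedged sketch below projects BTF-like data with PCA, clusters the coefficients with K-means, and then re-factorizes each cluster with very few components. All shapes, ranks, and cluster counts are assumptions, not the settings of [MMK03] or [TZL*02].

```python
# Hedged sketch: global PCA factorization, K-means in the latent space,
# then a small per-cluster factorization so each cluster needs only a
# handful of coefficients per texel.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
T, D = 2048, 256                     # texels x flattened view/light samples
btf = rng.random((T, D)).astype(np.float32)

# 1) Global factorization: project every texel into a low-dim latent space.
pca = PCA(n_components=32).fit(btf)
latent = pca.transform(btf)          # (T, 32)

# 2) Cluster texels by their latent coefficients.
km = KMeans(n_clusters=16, n_init=4, random_state=0).fit(latent)

# 3) Per cluster, re-factorize with very few components.
per_cluster = {}
for c in range(16):
    members = btf[km.labels_ == c]
    if len(members) == 0:            # guard against an empty cluster
        continue
    local = PCA(n_components=min(4, len(members))).fit(members)
    per_cluster[c] = (local, local.transform(members))
```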