ACM SIGGRAPH 2003 Sketches & Applications
DOI: 10.1145/965400.965402

An efficient spatio-temporal architecture for animation rendering


Cited by 24 publications (36 citation statements)
References 27 publications
“…These algorithms only use two discrete camera locations (one per eye location) instead of a full camera line. Havran et al [16] divide shaders into a view-dependent and a view-independent part, and reuse the latter of these between frames in animation. This technique resembles our shader reuse over the camera line.…”
Section: Previous Work
confidence: 99%
“…In the simplest implementation, we divide the shader into a view-dependent (V_D) and a view-independent (V_I) part [16,20]. For every back tracing sample, the V_D part of the BRDF is evaluated.…”
Section: Shading Reuse
confidence: 99%
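The shader split described in these excerpts can be pictured with a small sketch. The Python below is only an illustration, not the paper's implementation; the sample-id keyed cache and the Phong-style diffuse/specular BRDF are assumptions made for the example. The view-independent diffuse term is computed once per sample and reused in later frames, while the view-dependent specular term is re-evaluated for every back-tracing sample.

```python
import math

# Minimal sketch: shading split into a view-independent (V_I) diffuse term,
# cached per sample and reused across frames, and a view-dependent (V_D)
# specular term re-evaluated for every back-tracing sample.
# The vector representation, cache layout, and Phong-style BRDF are
# illustrative assumptions, not taken from the paper.

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

view_independent_cache = {}  # sample_id -> cached V_I radiance

def shade_sample(sample_id, normal, light_dir, view_dir,
                 kd, ks, shininess, light_radiance):
    # V_I part: depends only on geometry and lighting, so compute it once
    # for this sample and reuse the cached value in later frames.
    if sample_id not in view_independent_cache:
        n_dot_l = max(dot(normal, light_dir), 0.0)
        view_independent_cache[sample_id] = kd * n_dot_l * light_radiance

    # V_D part: depends on the camera, so it must be re-evaluated
    # for every back-tracing sample in every frame.
    half = [l + v for l, v in zip(light_dir, view_dir)]
    length = math.sqrt(dot(half, half)) or 1.0
    half = [h / length for h in half]
    n_dot_h = max(dot(normal, half), 0.0)
    view_dependent = ks * (n_dot_h ** shininess) * light_radiance

    return view_independent_cache[sample_id] + view_dependent
```

Only the second term changes when the camera moves, which is what makes reusing the cached part across frames (or across a camera line, as in the citing work) attractive.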
“…Ward and Simmons [WS99] and Bala et al [BDT99] store and reuse previously rendered rays. Havran et al [HDM03] calculate the temporal interval over which a given sample will remain visible in an offline animation and reproject that sample during the interval, recalculating shading for all reprojected samples in every frame.…”
Section: Related Work
confidence: 99%
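The reprojection scheme attributed to Havran et al. [HDM03] can be sketched roughly as follows. This is not their code: the Sample layout, the placeholder pinhole camera, and the shade callback are assumptions for illustration. Each sample carries the precomputed frame interval in which its hit point stays visible; within that interval the sample is only reprojected to its new image position, while its shading is recomputed in every frame.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    world_pos: tuple   # 3D hit point; fixed for the sample's lifetime
    first_frame: int   # first frame of the precomputed visibility interval
    last_frame: int    # last frame of that interval

def project(camera, world_pos):
    # Placeholder pinhole projection; a real renderer would use its own camera model.
    cx, cy, cz, focal = camera
    x, y, z = world_pos
    depth = z - cz
    return (round(focal * (x - cx) / depth), round(focal * (y - cy) / depth))

def render_frame(frame, camera, samples, shade):
    image = {}
    for s in samples:
        # Reuse the sample only while it is known to remain visible.
        if not (s.first_frame <= frame <= s.last_frame):
            continue
        pixel = project(camera, s.world_pos)
        # Shading is recomputed every frame, even for reprojected samples.
        image[pixel] = shade(s.world_pos, camera, frame)
    return image
```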
“…In this section we consider a view-dependent algorithm called bi-directional path tracing (BPT) [Lafortune 1996; Veach 1997] which we extend to handle dynamic environments [Havran et al. 2003]. In this algorithm, the bookkeeping of global illumination samples is organized in the image space.…”
Section: Spatio-temporal Bi-directional Path Tracing
confidence: 99%
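The image-space bookkeeping mentioned here can be read as a per-pixel store of light-path contributions tagged with the frames for which they remain valid. The class below is a schematic sketch under that reading; the name, record layout, and simple averaging rule are invented for illustration and are not taken from the paper.

```python
from collections import defaultdict

class ImageSpaceSampleStore:
    """Per-pixel store of global-illumination contributions tagged with the
    frame range in which they remain valid (illustrative layout only)."""

    def __init__(self):
        # (x, y) pixel -> list of (first_frame, last_frame, contribution) records
        self.pixels = defaultdict(list)

    def add(self, x, y, first_frame, last_frame, contribution):
        self.pixels[(x, y)].append((first_frame, last_frame, contribution))

    def resolve(self, x, y, frame):
        # Average all stored contributions that are valid for the requested frame.
        valid = [c for (f0, f1, c) in self.pixels[(x, y)] if f0 <= frame <= f1]
        return sum(valid) / len(valid) if valid else 0.0
```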