2023
DOI: 10.48550/arxiv.2301.10241
Preprint

K-Planes: Explicit Radiance Fields in Space, Time, and Appearance

Cited by 3 publications (6 citation statements). References 0 publications.
“…The core component of NFM is the Spatially Sparse Neural Fields (SSNF), a hybrid INR employing a pyramid of multi-resolution, spatially sparse feature grids to maintain long-range buffers of spatiotemporal velocity fields. We show that for this purpose, our SSNF offers improved accuracy, training speed, and memory efficiency compared to state-of-the-art representations (e.g., Instant NGP [Müller et al 2022], K-Planes [Fridovich-Keil et al 2023], and SIREN [Sitzmann et al 2020]), reducing the fitting error by over 70%. Leveraging this effective neural structure, we compute high-quality bidirectional flow maps by marching the SSNF forward and backward in time, and consolidate them with the impulse-based fluid simulation method to drastically reduce simulation errors by over 90% with respect to analytical solutions.…”
Section: Introduction (mentioning)
confidence: 91%
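
As context for the comparison above, the following is a minimal sketch of the mechanism such hybrid INRs share: learned feature grids at several resolutions are interpolated at a query point, and the per-level features are concatenated before a small decoder. This is not NFM's actual SSNF; the grids here are dense rather than spatially sparse, and the resolutions, feature width, function names, and random initialization are illustrative assumptions only.

```python
import numpy as np

def trilinear_lookup(grid, p):
    """Trilinearly interpolate an (R, R, R, F) feature grid at p in [0, 1]^3."""
    R = grid.shape[0]
    x = p * (R - 1)                         # continuous grid coordinates
    i0 = np.floor(x).astype(int)
    i1 = np.minimum(i0 + 1, R - 1)
    w = x - i0                              # per-axis interpolation weights
    out = 0.0
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                idx = ((i1 if dx else i0)[0], (i1 if dy else i0)[1], (i1 if dz else i0)[2])
                weight = ((w[0] if dx else 1 - w[0]) *
                          (w[1] if dy else 1 - w[1]) *
                          (w[2] if dz else 1 - w[2]))
                out = out + weight * grid[idx]
    return out

def pyramid_features(grids, p):
    """Concatenate interpolated features from every pyramid level at point p."""
    return np.concatenate([trilinear_lookup(g, p) for g in grids])

# Hypothetical coarse-to-fine pyramid of dense grids (the real SSNF is spatially sparse).
rng = np.random.default_rng(0)
levels = [rng.standard_normal((r, r, r, 4)) for r in (8, 16, 32)]
print(pyramid_features(levels, np.array([0.3, 0.7, 0.5])).shape)  # (12,) -> small MLP decoder
```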
“…However, pure neural representations suffer from their considerable time cost, and follow-up works have focused on hybrid INRs featuring classical data structures like dense [Sun et al 2022] and sparse voxel grids [Chabra et al 2020; Liu et al 2020; Martel et al 2021; Takikawa et al 2021]. Recently, plane-based data structures [Chen et al 2022; Fridovich-Keil et al 2023] have also been leveraged to good effect. Most relevant to our work is Instant NGP [Müller et al 2022], which […] [Park et al 2021a,b; Pumarola et al 2021] and scene flow fields [Du et al 2021; Li et al 2021b; Xian et al 2021].…”
Section: Implicit Neural Representation (mentioning)
confidence: 99%
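
The plane-based structure cited here, [Fridovich-Keil et al 2023], is the K-Planes model this report indexes: a d-dimensional input is projected onto each of its C(d, 2) coordinate-pair planes (six planes for space-time), features are bilinearly interpolated from every plane, and the per-plane features are combined by elementwise (Hadamard) multiplication before a linear or MLP decoder. The sketch below is single-scale with arbitrary resolution and feature width, and omits the paper's multi-scale space planes and decoder.

```python
import itertools
import numpy as np

def bilinear_lookup(plane, uv):
    """Bilinearly interpolate an (R, R, F) feature plane at uv in [0, 1]^2."""
    R = plane.shape[0]
    x = uv * (R - 1)
    i0 = np.floor(x).astype(int)
    i1 = np.minimum(i0 + 1, R - 1)
    w = x - i0
    return ((1 - w[0]) * (1 - w[1]) * plane[i0[0], i0[1]] +
            w[0] * (1 - w[1]) * plane[i1[0], i0[1]] +
            (1 - w[0]) * w[1] * plane[i0[0], i1[1]] +
            w[0] * w[1] * plane[i1[0], i1[1]])

def kplanes_features(planes, q):
    """Hadamard-combine interpolated features from all C(4,2)=6 coordinate-pair planes.

    q = (x, y, z, t) in [0, 1]^4; planes is a dict keyed by axis pairs.
    """
    feat = np.ones(next(iter(planes.values())).shape[-1])
    for (a, b), plane in planes.items():
        feat *= bilinear_lookup(plane, q[[a, b]])   # project q onto plane (a, b)
    return feat                                     # fed to a linear or MLP decoder

# Hypothetical planes: one (64, 64, 32) feature grid per axis pair of (x, y, z, t).
rng = np.random.default_rng(0)
planes = {pair: rng.standard_normal((64, 64, 32))
          for pair in itertools.combinations(range(4), 2)}
print(kplanes_features(planes, np.array([0.1, 0.4, 0.8, 0.25])).shape)  # (32,)
```

The multiplicative combination is the design choice the K-Planes paper argues for over addition: with the time planes initialized near one, static regions reduce to their purely spatial features.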
“…Another category of approaches learns a time-varying deformation of 3D points into a static canonical scene (Pumarola et al, 2021; Park et al, 2021a; Tretschk et al, 2021; Park et al, 2021b). Some approaches accelerate NeRFs on dynamic scenes using explicit voxel grids (Fang et al, 2022) or tensor factorization (Cao & Johnson, 2023; Fridovich-Keil et al, 2023).…”
Section: Related Work (mentioning)
confidence: 99%
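
A minimal sketch of the canonical-space deformation idea mentioned above: a time-conditioned network offsets each 3D point into a static canonical frame, where an ordinary radiance field is then queried. The two networks below are untrained, random-weight stand-ins with hypothetical sizes, not the architecture of any cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def tiny_mlp(widths):
    """Random-weight stand-in for a small MLP (illustration only, untrained)."""
    Ws = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(widths[:-1], widths[1:])]
    def f(x):
        for W in Ws[:-1]:
            x = np.tanh(x @ W)
        return x @ Ws[-1]
    return f

deform_net = tiny_mlp([4, 64, 3])     # (x, y, z, t) -> offset into the canonical frame
canonical_net = tiny_mlp([3, 64, 4])  # canonical (x, y, z) -> (r, g, b, sigma)

def dynamic_radiance(p, t):
    """Query a static canonical field at the point warped by a time-conditioned offset."""
    p_canonical = p + deform_net(np.append(p, t))
    return canonical_net(p_canonical)

print(dynamic_radiance(np.array([0.2, -0.1, 0.5]), t=0.3))  # (r, g, b, sigma), untrained
```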
“…Several studies adopt a hybrid representation [3,4,10,14,31,33,38,44,52,62] to speed up the reconstruction and rendering. They employ explicit data structures such as discrete voxel grids [14,52], decomposed tensors [3,4,13], hash maps [38], etc. to store features or spherical harmonics, enabling fast convergence and inference.…”
Section: Related Work (mentioning)
confidence: 99%
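
Of the explicit structures listed above, the hash-map variant is the least self-explanatory; the sketch below shows one way such a table can store features, with integer voxel corners hashed into a fixed-size array. This is a nearest-corner, single-resolution simplification: systems like Instant NGP blend the eight surrounding corners across many resolutions, and the table size, feature width, resolution, and hash primes here are illustrative assumptions.

```python
import numpy as np

TABLE_SIZE, FEATURE_DIM, RESOLUTION = 2 ** 14, 2, 128
PRIMES = np.array([1, 2654435761, 805459861], dtype=np.uint64)  # per-axis mixing constants

rng = np.random.default_rng(0)
table = rng.standard_normal((TABLE_SIZE, FEATURE_DIM))  # trainable feature table

def hashed_feature(p):
    """Look up the feature stored for the voxel corner nearest to p in [0, 1]^3."""
    corner = np.round(p * (RESOLUTION - 1)).astype(np.uint64)
    index = int(np.bitwise_xor.reduce(corner * PRIMES)) % TABLE_SIZE
    return table[index]   # decoded downstream into density / spherical harmonics

print(hashed_feature(np.array([0.2, 0.6, 0.9])))
```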