2021
DOI: 10.1136/bmj.n389

Peter Bradbury



Cited by 17 publications (27 citation statements: 0 supporting, 27 mentioning, 0 contrasting).
References 0 publications.
“…Our deferred NeRF model is based on JAXNeRF [11], an implementation of NeRF in JAX [3]. As in NeRF, we apply a positional encoding [41] to positions and view directions.…”
Section: Implementation Details
mentioning
confidence: 99%
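
The quoted pipeline applies a NeRF-style positional encoding to positions and view directions before feeding them to the network. A minimal JAX sketch of that encoding, assuming the standard sin/cos frequency bands (the band count is a generic default, not taken from the cited implementation):

import jax.numpy as jnp

def positional_encoding(x, num_freqs=10):
    """Map coordinates of shape (..., d) to (..., 2 * num_freqs * d)."""
    freqs = 2.0 ** jnp.arange(num_freqs) * jnp.pi   # frequency bands 2^k * pi
    angles = x[..., None] * freqs                   # (..., d, num_freqs)
    enc = jnp.concatenate([jnp.sin(angles), jnp.cos(angles)], axis=-1)
    return enc.reshape(*x.shape[:-1], -1)

# Example: a batch of 3-D positions with 10 frequency bands -> 60 features.
pts = jnp.zeros((4, 3))
print(positional_encoding(pts).shape)               # (4, 60)

NeRF itself uses fewer frequency bands for view directions than for positions; the sketch above leaves that choice as a parameter.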
“…high area-to-volume ratio for the sake of power delivery and heat removal. Inability to co-locate various circuits and memories may force existence of long wires, so called buses.…”
Section: Layer Type
mentioning
confidence: 99%
“…A small variant of PokeBNN achieves 70.5% top-1 with 2.6 ACE, more than 3x reduction in cost; a larger PokeBNN achieves 75.6% top-1 with 7.8 ACE, more than 5% improvement in accuracy without increasing the cost. PokeBNN implementation in JAX/Flax [6,18] and reproduction instructions are open sourced.…”
mentioning
confidence: 99%
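
ACE, the hardware-inspired cost metric the quote refers to, charges each multiply-accumulate the product of its two operand bit widths, which is what makes 1-bit layers so much cheaper than 8-bit ones. A minimal sketch of that bookkeeping (the layer tuples below are hypothetical, not numbers from the paper):

def ace(layers):
    """layers: iterable of (num_macs, weight_bits, activation_bits) tuples."""
    return sum(macs * wb * ab for macs, wb, ab in layers)

# Hypothetical example: an 8-bit stem convolution plus a binarized block.
layers = [
    (118_000_000, 8, 8),  # 8x8-bit MACs cost 64 units each
    (115_000_000, 1, 1),  # 1x1-bit MACs cost 1 unit each
]
print(f"ACE = {ace(layers):,}")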
“…We use the bfloat16 ResNet50 v1.5 model as our baseline and implemented our quantized model on top of the JAX [4] MLPerf ResNet50 submission. Figure 3 shows the ResNet50 architecture used in this work.…”
Section: ResNet50
mentioning
confidence: 99%
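
Porting a bfloat16 baseline to a quantized model largely comes down to replacing its matmuls and convolutions with integer versions plus scales. A minimal JAX sketch of the basic building block, symmetric per-tensor int8 quantization (an illustration of the general technique, not the paper's code):

import jax.numpy as jnp

def quantize_int8(w):
    """Return (int8 values, scale) under symmetric per-tensor quantization."""
    scale = jnp.max(jnp.abs(w)) / 127.0
    q = jnp.clip(jnp.round(w / scale), -127, 127).astype(jnp.int8)
    return q, scale

w = jnp.asarray([[0.3, -1.2], [0.9, 0.05]], dtype=jnp.bfloat16)
q, scale = quantize_int8(w.astype(jnp.float32))
print(q.astype(jnp.float32) * scale)  # dequantized approximation of w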
“…We implemented a collection of quantization techniques and quantized neural networks on top of the JAX framework [4] and the Flax library [14] to enable fast experimentation, and used it to run the experiments in this paper. All code is at https://github.com/google-research/google-research/tree/master/aqt.…”
Section: Quantization Library in JAX and Flax
mentioning
confidence: 99%
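
Quantization-aware training libraries of this kind typically rest on a "fake quantization" primitive: quantize-dequantize in the forward pass while letting gradients pass straight through. A minimal JAX sketch of that pattern (symmetric per-tensor scaling is an assumption here; this is not the AQT API itself):

import jax
import jax.numpy as jnp

def fake_quant(x, bits=8):
    """Simulate symmetric integer quantization with straight-through gradients."""
    max_int = 2.0 ** (bits - 1) - 1.0
    scale = jnp.max(jnp.abs(x)) / max_int
    q = jnp.round(x / scale) * scale          # quantize-dequantize
    return x + jax.lax.stop_gradient(q - x)   # gradient w.r.t. x stays 1

x = jnp.linspace(-1.0, 1.0, 5)
print(fake_quant(x, bits=4))
print(jax.grad(lambda v: fake_quant(v, bits=4).sum())(x))  # all ones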