2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW)
DOI: 10.1109/iccvw.2019.00447
AI Benchmark: All About Deep Learning on Smartphones in 2019

Abstract: The performance of mobile AI accelerators has been evolving rapidly in the past two years, nearly doubling with each new generation of SoCs. The current 4th generation of mobile NPUs is already approaching the results of CUDA-compatible Nvidia graphics cards presented not long ago, which together with the increased capabilities of mobile deep learning frameworks makes it possible to run complex and deep AI models on mobile devices. In this paper, we evaluate the performance and compare the results of all chipse…

Cited by 180 publications (101 citation statements)
References 52 publications
“…CNN-LSTM based models also have a larger decrease (≈ 3%) compared to the LSTM. This is because quantised inference was used in this work instead of floating-point inference [29]. Quantised inference converts the model from a 16-bit floating-point type to int-8 format to reduce the size and random-access memory (RAM) consumption by a factor of 4 [29].…”
Section: Performance of Selected DNN Models With and Without Rule-based Reconstruction Methods
confidence: 99%
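The quantisation step this citation refers to can be reproduced in outline with TensorFlow Lite post-training quantisation. The sketch below is a minimal illustration rather than the cited work's actual pipeline: the SavedModel path "model_dir", the calibration generator, and the input shape are placeholder assumptions.

```python
# Minimal sketch of full-integer (int8) post-training quantisation with
# TensorFlow Lite, one common way to obtain the size/RAM reduction the
# citation describes. Paths and shapes are placeholders, not taken from
# the cited work.
import numpy as np
import tensorflow as tf

def representative_data_gen():
    # Yield a few calibration batches so the converter can estimate
    # int8 quantisation ranges; the (1, 128, 9) shape is a stand-in.
    for _ in range(100):
        yield [np.random.rand(1, 128, 9).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("model_dir")  # placeholder path
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
# Force full-integer quantisation of weights and activations.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_int8_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_int8_model)
```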
“…This is why most previous studies on resource-constrained systems, such as embedded devices and smartphones, used only CNNs. Recent advances in smartphone hardware and DNN optimisation technologies on mobile platforms have largely overcome this obstacle [29]. With the increasing computing power of mobile devices, computationally intensive models such as LSTMs can now be run on a device like a smartphone.…”
Section: Related Work
confidence: 99%
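To illustrate what running such a converted model on-device involves, the sketch below invokes a .tflite file with the TensorFlow Lite interpreter. The file name and dummy input are placeholder assumptions; on Android the same model would typically be executed through the TFLite runtime, optionally with an NNAPI or GPU delegate providing the hardware acceleration benchmarked in the paper.

```python
# Minimal sketch: invoking a converted .tflite model with the TensorFlow
# Lite interpreter. The model file and input contents are placeholders.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model_int8.tflite")  # placeholder file
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

# Build a dummy input matching the model's expected shape and dtype.
dummy_input = np.zeros(input_details["shape"], dtype=input_details["dtype"])
interpreter.set_tensor(input_details["index"], dummy_input)
interpreter.invoke()
prediction = interpreter.get_tensor(output_details["index"])
print(prediction.shape)
```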
“…3.1.3 On-demand Image Delivery. As chipsets on commodity devices gradually become more powerful [2,20], many applications can run fully on-device, avoiding the latency and privacy issues of offloading. In this direction, Lee et al. [29] proposed MobiSR (Fig.…”
Section: Video
confidence: 99%