2022 57th International Scientific Conference on Information, Communication and Energy Systems and Technologies (ICEST)
DOI: 10.1109/icest55168.2022.9828625
A Survey of Three Types of Processing Units: CPU, GPU and TPU

Cited by 10 publications (5 citation statements)
References 3 publications
“…Python, on the other hand, is a high-level performance programming language that is widely used in machine learning (Raschka et al, 2020). TPU is a type of circuit known as ‘application-specific integrated circuits’ (ASICs) developed by Google for accelerated machine learning, and it is faster in computing than GPU (Nikolić et al, 2022). Keras is a neural network application programming interface (API) for Python that runs on top of the machine-learning platform TensorFlow (Parvat et al, 2017).…”
Section: Methods (mentioning)
Confidence: 99%
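To illustrate the relationship the excerpt describes between Keras and TensorFlow, the following is a minimal sketch of defining and compiling a small network through the Keras API bundled with TensorFlow. The layer sizes, loss, and optimizer are arbitrary placeholders and are not taken from the cited papers.

```python
# Minimal sketch: Keras is the high-level neural network API that runs on
# top of TensorFlow; a model is defined, compiled, and summarized here.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),  # placeholder sizes
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```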
“…A possible improvement of the system is to replace the Raspberry Pi 4B with a more power-efficient SBC, such as a Raspberry Pi Zero 2 W (according to [ 81 ], it has half the power consumption in idle mode), and to use an efficient co-processor for the SDR operations such as a field-programmable gate array (FPGA) [ 82 ], a dedicated digital signal processor (DSP) [ 83 ], a graphics processing unit (GPU) [ 84 ], or tensor processing unit (TPU) [ 85 , 86 ]. The core concept is to leverage the strengths of different architectures for efficient processing [ 87 ]. However, all such approaches require significant hardware and software redesign, which results in higher development costs.…”
Section: Hardware Design (mentioning)
Confidence: 99%
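As one possible shape of the co-processor offloading the excerpt discusses, below is a minimal sketch that hands inference from the single-board computer's CPU to a TPU accelerator through the TensorFlow Lite runtime and its Edge TPU delegate. The model file name and the dummy input are hypothetical placeholders; the cited system is not claimed to be implemented this way.

```python
# Minimal sketch: running inference on an Edge TPU co-processor via the
# TensorFlow Lite runtime and the Edge TPU delegate (hypothetical model file).
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

interpreter = Interpreter(
    model_path="model_edgetpu.tflite",                      # placeholder path
    experimental_delegates=[load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

# Feed one dummy tensor shaped and typed as the model expects.
dummy = np.zeros(input_details["shape"], dtype=input_details["dtype"])
interpreter.set_tensor(input_details["index"], dummy)
interpreter.invoke()
result = interpreter.get_tensor(output_details["index"])
```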
“…Several papers have analyzed the cloud version of the TPU. Specifically, [18,19] provide an in-depth survey of the differences and particularities of CPUs, GPUs and (Cloud) TPUs for different tasks related with deep learning. The use of multiple cloud TPUs to scale training tasks has also been deeply studied [11,13].…”
Section: Related Work (mentioning)
Confidence: 99%
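For context on how training is typically scaled onto Cloud TPUs as the excerpt notes, here is a minimal sketch using tf.distribute.TPUStrategy to distribute a Keras model across TPU cores. The empty TPU address, model, and data are placeholders (the empty string works in Colab-style environments); none of this is drawn from the surveyed papers.

```python
# Minimal sketch: connecting to a Cloud TPU and distributing Keras training
# with TPUStrategy, so each TPU core runs one replica of the model.
import tensorflow as tf

resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")  # placeholder address
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
# model.fit(dataset, epochs=...) would then train one replica per TPU core.
```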