2019
DOI: 10.1186/s40708-019-0097-2

How Amdahl’s Law limits the performance of large artificial neural networks

Abstract: As we learn more and more details about how neurons and complex neural networks work, and as the demand for building large, high-performance artificial networks becomes serious, growing effort is devoted to building hardware and/or software simulators and supercomputers targeting artificial intelligence applications, which demand an exponentially increasing amount of computing capacity. However, the inherently parallel operation of neural networks is mostly simulated by deploying inherently sequential (or in the b…
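The abstract's argument is the classic Amdahl's Law saturation: once any non-parallelizable fraction remains (sequential simulation steps, synchronization, data delivery), adding processors yields diminishing returns. Below is a minimal sketch, not taken from the paper, that computes the Amdahl speedup for a few illustrative sequential fractions; all numbers are assumed values chosen only to show how the payload performance flattens out.

# Minimal sketch (illustrative only, not from the paper): classic Amdahl's Law.
# The sequential fractions and processor counts below are assumed values.

def amdahl_speedup(seq_fraction: float, n_processors: int) -> float:
    """Speedup S(N) = 1 / (s + (1 - s)/N) for sequential fraction s."""
    return 1.0 / (seq_fraction + (1.0 - seq_fraction) / n_processors)

if __name__ == "__main__":
    for s in (1e-3, 1e-5, 1e-7):                # assumed sequential fractions
        for n in (1_000, 100_000, 10_000_000):  # assumed processor counts
            print(f"s={s:.0e}  N={n:>10,}  speedup={amdahl_speedup(s, n):>12,.0f}")

Even with a sequential fraction as small as 1e-5, the speedup saturates near 100,000 regardless of how many further processors are added, which is the limit the paper's title refers to.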


Citations: Cited by 25 publications (30 citation statements)
References: 29 publications
“…This step involves (mostly) losing phase information. Furthermore, as detailed in [11], it introduces severe payload performance limits for neuronal operations.…”
Section: Data Delivery Methods (mentioning)
confidence: 99%
“…It was not suspected that the physical implementations of the electronic components have a temporal behavior [9]. However, this turned out to be the ultimate reason for many of the issues experienced, from the payload performance limit of supercomputers [10] and brain simulation [11] to the weeks-long training times in deep learning [12,13].…”
Section: Introduction (mentioning)
confidence: 99%
“…Notice that neurons at different Minkowski-distances "see" the local frequency at different Minkowski-times, so they set their internal phase angle to the Minkowski-distance. That is, "at the same time" means different absolute times for different neurons, depending on their location: biology uses Minkowski four-coordinates (Minkowski-distance).…”
Section: Manifestations of the Role of Time in Biology (mentioning)
confidence: 99%
“…Thanks to the decreasing density and the increasing frequency, the speed of changing the electronic states in a computing system increasingly approached the limiting speed. It was not suspected that the physical implementations of the electronic components have a temporal behavior [9], although this was the ultimate reason for many of the issues experienced, from the payload performance limit of supercomputers [10] and brain simulation [11] to the weeks-long training times in deep learning [12,13].…”
Section: Introduction (mentioning)
confidence: 99%
“…As discussed in detail in [52], two of the mentioned effects can become dominant in those applications. Since the operating time scale of biological networks lies in the msec range, brain simulation applications commonly use an integration time of about 1 ms [53].…”
Section: What Factors Dominate the Performance Gain (mentioning)
confidence: 99%
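To connect the last quotation back to Amdahl's Law: if a brain simulation advances in roughly 1 ms integration steps, any serial per-step cost (synchronization, data delivery between nodes) acts like Amdahl's non-parallelizable fraction. The sketch below is a hypothetical illustration only; the per-step overheads, the node count, and the assumption that one node's compute per step is comparable to the 1 ms step are all assumed values, not figures from the cited works.

# Hypothetical illustration (all numbers assumed): Amdahl-style gain when each
# simulation step carries a fixed serial overhead next to its parallel work.

STEP_WORK_S = 1e-3  # assumed single-node compute per 1 ms integration step

def effective_gain(serial_overhead_s: float, n_nodes: int) -> float:
    """Speedup over one node when every step pays a serial overhead."""
    total = serial_overhead_s + STEP_WORK_S
    return total / (serial_overhead_s + STEP_WORK_S / n_nodes)

if __name__ == "__main__":
    for overhead in (1e-6, 1e-5, 1e-4):  # assumed per-step serial overheads (s)
        print(f"overhead={overhead:.0e} s  gain on 10,000 nodes: "
              f"{effective_gain(overhead, 10_000):,.0f}x")

Under these assumed numbers, a 0.1 ms per-step serial overhead already caps the gain at about 11x on 10,000 nodes, which is the kind of payload performance limit the quoted passages attribute to the temporal behavior of the components.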