SAE Technical Paper Series 2019
DOI: 10.4271/2019-01-0118
High Performance Processor Architecture for Automotive Large Scaled Integrated Systems within the European Processor Initiative Research Project

Cited by 12 publications (8 citation statements)
References 4 publications
“…On-board ML computing is feasible only if the computational complexity of the algorithm is not too high and performant hardware is adopted. Hence, on-board computing units for ML should be optimized in terms of the ratio between processing throughput and resources (memory, bandwidth, power consumption, ...) [7][8][9]. This is also the trend followed by big industrial players such as Google, Nvidia, or Intel, which are trying to enter the autonomous driving market, as well as the recently announced Full Self Driving (FSD) chip from Tesla.…”
Section: Introduction (mentioning, confidence: 99%)
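The "throughput per resource" criterion named in this excerpt can be made concrete with simple figures of merit such as GOPS per watt or GOPS per MB of on-chip memory. The sketch below is not from the cited papers; the device names and numbers are illustrative assumptions only.

```python
# Minimal sketch (hypothetical figures): ranking on-board ML compute units by
# throughput-per-resource ratios, as the excerpt suggests.

candidates = {
    # name: (throughput_gops, power_w, memory_mb, bandwidth_gbs)  -- assumed values
    "embedded_npu": (4000.0, 10.0, 8.0, 25.6),
    "automotive_gpu": (30000.0, 250.0, 16000.0, 900.0),
}

for name, (gops, watts, mem_mb, bw_gbs) in candidates.items():
    print(f"{name}: "
          f"{gops / watts:.1f} GOPS/W, "
          f"{gops / mem_mb:.1f} GOPS/MB, "
          f"{gops / bw_gbs:.1f} GOPS per GB/s")
```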
“…This is also the trend followed by big industrial players such as Google, Nvidia, or Intel, which are trying to enter the autonomous driving market, as well as the recently announced Full Self Driving (FSD) chip from Tesla. This topic is also the core of the automotive stream in the H2020 European Processor Initiative (embedded HPC for autonomous driving with BMW as main technology end-user [9,10]), which funds this work. To address the above issues, new computing arithmetic styles are appearing in research [11][12][13][14][15][16][17][18][19][20], overcoming the classic fixed-point (INT) vs. IEEE-754 floating-point duality for embedded DNN (Deep Neural Network) signal processing.…”
Section: Introduction (mentioning, confidence: 99%)
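The "fixed-point (INT) vs. IEEE-754 floating-point duality" mentioned here is the trade-off between cheap integer arithmetic and the dynamic range of floats. The sketch below is a generic illustration of that duality (symmetric INT8 quantization of a dot product), not the alternative arithmetic of refs. [11]-[20]; the data is a random placeholder.

```python
# Minimal sketch: INT8 fixed-point vs. float32 for a DNN-style dot product.
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(256).astype(np.float32)   # float32 weights
x = rng.standard_normal(256).astype(np.float32)   # float32 activations

def quantize_int8(v):
    """Symmetric linear quantization to int8 with a per-tensor scale."""
    scale = np.abs(v).max() / 127.0
    q = np.clip(np.round(v / scale), -127, 127).astype(np.int8)
    return q, scale

qw, sw = quantize_int8(w)
qx, sx = quantize_int8(x)

ref = float(np.dot(w, x))                                        # float32 reference
approx = float(np.dot(qw.astype(np.int32), qx.astype(np.int32))) * sw * sx

print(f"float32: {ref:.4f}  int8: {approx:.4f}  "
      f"rel. error: {abs(ref - approx) / abs(ref):.2%}  "
      f"(4x less storage per value)")
```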
“…Hence, on-board computing units for DNN should be optimized in terms of the ratio between signal processing throughput and resources (memory, bandwidth, power consumption, etc.) [15]-[17]. This is also the trend followed by big players such as Google, NVIDIA, or Intel, which are trying to enter the autonomous driving market, as well as the recently announced Full Self Driving (FSD) chip from Tesla.…”
Section: Introduction (mentioning, confidence: 99%)
“…This is also the trend followed by big players such as Google, NVIDIA, or Intel, which are trying to enter the autonomous driving market, as well as the recently announced Full Self Driving (FSD) chip from Tesla. This concept is also the core of the automotive stream in the H2020 European Processor Initiative (embedded HPC for autonomous driving with the BMW Group as main technology end user [17]), in which the article's authors are involved. To address the above issues, new computing arithmetic styles are appearing in the state of the art [18]-[26] to overcome the classic fixed-point (INT) vs. IEEE-754 floating-point duality for embedded DNN signal processing.…”
Section: Introduction (mentioning, confidence: 99%)
“…Future autonomous driving sub-tasks are very likely to be realized within a service-oriented computing architecture. Accordingly, the authors in [1] define different levels of service-oriented architectures. The lowest level is referred to as Software-as-a-Service (SaaS), which is defined by functionality and information on demand.…”
Section: Introduction (mentioning, confidence: 99%)
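The "functionality and information on demand" idea behind that SaaS level can be pictured as a small service registry that only produces a result when a client requests it. The sketch below is hypothetical and not the architecture defined in ref. [1]; the service name and payload are invented for illustration.

```python
# Minimal sketch (hypothetical): a service-oriented interface exposing
# functionality and information on demand, in the spirit of the SaaS level above.

class ServiceRegistry:
    """Maps service names to callables that produce results only when requested."""
    def __init__(self):
        self._services = {}

    def register(self, name, fn):
        self._services[name] = fn

    def request(self, name, **params):
        # Functionality/information is computed only when a client asks for it.
        return self._services[name](**params)

registry = ServiceRegistry()
registry.register("lane_estimate", lambda frame_id: {"frame": frame_id, "lanes": 2})

print(registry.request("lane_estimate", frame_id=42))
```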