Proceedings of the 29th International Symposium on High-Performance Parallel and Distributed Computing 2020
DOI: 10.1145/3369583.3392685
High Accuracy Matrix Computations on Neural Engines: A Study of QR Factorization and its Applications

Cited by 9 publications (2 citation statements)
References 19 publications
“…It would also be worth investigating the extension of this work to other matrix factorization algorithms, in particular QR factorization. Indeed, to our knowledge, existing mixed precision QR algorithms (Yang et al, 2021; Zhang et al, 2020) also assume the matrix to be stored in fp32 precision; the ideas proposed here could be extended so as to store them in fp16.…”
Section: Discussion
confidence: 99%
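The excerpt above proposes storing the matrix in fp16 while the QR factorization itself runs in a higher precision. A minimal NumPy sketch of that storage-precision idea (purely illustrative; this is not the algorithm of the cited works, and the matrix sizes are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((64, 32)).astype(np.float64)

# Store the matrix in low precision (fp16), as the excerpt suggests,
# but promote it and carry out the factorization in fp32.
A16 = A.astype(np.float16)      # low-precision storage
A32 = A16.astype(np.float32)    # promote before computing
Q, R = np.linalg.qr(A32)        # factorization runs in fp32

# Relative error of the mixed-precision factorization vs. the
# fp16 storage error alone: both are on the order of fp16 roundoff.
factor_err = np.linalg.norm(A - Q @ R) / np.linalg.norm(A)
storage_err = np.linalg.norm(A - A16.astype(np.float64)) / np.linalg.norm(A)
print(factor_err, storage_err)
```

The point of the sketch: once the data is rounded to fp16, the storage error already dominates, so factoring in fp32 loses essentially nothing further.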
“…The LR factorization can be used in parallel computation [5][6][7] and is applied in many engineering settings such as data movement analysis and circuit simulation [8][9][10]. Because of the special properties of the orthogonal factor produced by QR factorization, the computation is more efficient and numerically stable; it is often applied to least squares and optimization problems [11,12], and in practice to areas such as smart grids, ill-conditioned matrix problems, and harmonic compensation of inductors [13][14][15][16].…”
Section: Introduction
confidence: 99%
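The least-squares use of QR mentioned in the excerpt can be sketched in a few lines of NumPy (the data and dimensions here are invented for illustration): with A = QR, the normal equations reduce to the triangular system Rx = Qᵀb, which is the standard numerically stable route.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 3))
x_true = np.array([2.0, -1.0, 0.5])
b = A @ x_true + 1e-8 * rng.standard_normal(100)  # slightly noisy data

# Reduced QR: Q has orthonormal columns, R is upper triangular.
Q, R = np.linalg.qr(A)

# Solve R x = Q^T b instead of the normal equations A^T A x = A^T b,
# avoiding the squared condition number of A^T A.
x = np.linalg.solve(R, Q.T @ b)
print(x)
```

This stability advantage over forming AᵀA explicitly is why QR is the usual choice for the applications the excerpt lists.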