2018
DOI: 10.1155/2018/7012056
Efficient Parallel Implementation of Matrix Multiplication for Lattice-Based Cryptography on Modern ARM Processor

Abstract: Recently, various types of postquantum cryptography algorithms have been proposed for the National Institute of Standards and Technology's Post-Quantum Cryptography Standardization competition. Lattice-based cryptography based on Learning with Errors (LWE) relies on matrix multiplication. A large matrix multiplication requires a long execution time for key generation, encryption, and decryption. In this paper, we propose an efficient parallel implementation of matrix multiplication and vector additi…


Cited by 9 publications (4 citation statements)
References 18 publications
“…In [23], the authors present an efficient matrix multiplication and vector addition for lattice-based cryptography. Using ARM NEON intrinsic functions on a Raspberry Pi 3 Model B, they report that the fastest method multiplies two matrices (ℤ₂^(m×n) × ℤ₂^(n×l)) in 93.91 ms, where (m, n, l) := (1024, 536, 256).…”
Section: Discussion (mentioning)
confidence: 99%
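The operation cited above is the LWE-style computation B = A·S + E (mod q). The following is a minimal plain-Python sketch of that computation, not the authors' ARM NEON implementation; the modulus and the small test matrices are hypothetical placeholders chosen so the example runs instantly.

```python
# Sketch of the LWE-style operation B = (A @ S + E) mod q.
# Not the paper's NEON-vectorized code; q and the matrices below are
# illustrative placeholders only.

q = 2 ** 15  # hypothetical modulus


def mat_mul_add(A, S, E, q):
    """Compute (A @ S + E) mod q for row-major lists of lists."""
    m, n, l = len(A), len(S), len(S[0])
    B = [[0] * l for _ in range(m)]
    for i in range(m):
        for k in range(n):
            # i-k-j loop order keeps row accesses contiguous, mirroring the
            # cache-friendly schedules that vectorized implementations favor.
            a = A[i][k]
            for j in range(l):
                B[i][j] = (B[i][j] + a * S[k][j]) % q
        for j in range(l):
            B[i][j] = (B[i][j] + E[i][j]) % q  # add the error vector
    return B


A = [[1, 2], [3, 4]]
S = [[5, 6], [7, 8]]
E = [[1, 0], [0, 1]]
print(mat_mul_add(A, S, E, q))  # [[20, 22], [43, 51]]
```

A NEON implementation would process several coefficients per instruction instead of one per loop iteration, which is where the reported speedup comes from.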
“…When it comes to high-performance big data analytics, it is regarded as the finest option. Spark Core uses YARN and Hadoop to distribute resources [60,61]. Additionally, it can retrieve data from the Hadoop Distributed File System (HDFS), which is used to read and store data and partition it into Resilient Distributed Datasets (RDDs).…”
Section: Parallel Computing Using Apache Spark Framework (mentioning)
confidence: 99%
“…When it comes to high-performance big data analytics, it is regarded as the finest option. Spark Core uses YARN and Hadoop to distribute resources [52,53]. Additionally, it can retrieve data from the Hadoop Distributed File System (HDFS), which is used to read and store data, and partition it into Resilient Distributed Datasets (RDDs).…”
Section: Parallel Computing Using Apache Spark Framework (mentioning)
confidence: 99%
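The partition-then-aggregate pattern that RDDs provide can be illustrated without a Spark cluster. The following is a minimal plain-Python sketch of that pattern, assuming only the standard library; the function names and the sum-of-squares workload are illustrative, not Spark APIs.

```python
# Minimal illustration of the partition-then-aggregate pattern behind
# Spark RDDs: split the data into partitions, process each partition
# independently, then combine the partial results.
# Plain Python, not Spark; all names here are illustrative only.

from concurrent.futures import ThreadPoolExecutor


def partition(data, num_partitions):
    """Split data into roughly equal chunks, like RDD partitions."""
    k, r = divmod(len(data), num_partitions)
    chunks, start = [], 0
    for i in range(num_partitions):
        end = start + k + (1 if i < r else 0)
        chunks.append(data[start:end])
        start = end
    return chunks


def process_partition(chunk):
    """Per-partition work: here, a simple sum of squares."""
    return sum(x * x for x in chunk)


data = list(range(10))
parts = partition(data, 4)
with ThreadPoolExecutor() as pool:
    partials = list(pool.map(process_partition, parts))  # map step
total = sum(partials)                                    # reduce step
print(total)  # 285
```

In Spark the same shape appears as `rdd.map(...).reduce(...)`, with partitions distributed across executors rather than local threads.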