2018 IEEE International Conference on Big Data and Smart Computing (BigComp) 2018
DOI: 10.1109/bigcomp.2018.00091
Pseudo Random Number Generation Using LSTMs and Irrational Numbers

Cited by 22 publications (10 citation statements)
References 6 publications
“…Nearly 95% of NIST tests can be consistently passed by the trained generator, indicating that the adversarial procedure is extremely effective in training the network to act as a PRNG. Findings are comparable to those of Tirdad and Sadeghian [50] and Jeong et al. [21], outperforming a range of regular PRNGs. This proposed cluster synchronization has several advantages, which are as follows:…”
Section: Related Work (supporting)
confidence: 78%
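The NIST suite mentioned in this excerpt (SP 800-22) comprises many statistical tests; the simplest is the frequency (monobit) test, sketched below for a Python list of bits. This is a minimal illustration of one test, not the full suite:

```python
import math

def monobit_test(bits, alpha=0.01):
    """NIST SP 800-22 frequency (monobit) test.

    Maps bits to +/-1, sums them, and derives a p-value from the
    complementary error function; p >= alpha means the sequence is
    consistent with randomness under this one test.
    """
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)      # 1 -> +1, 0 -> -1
    s_obs = abs(s) / math.sqrt(n)
    p_value = math.erfc(s_obs / math.sqrt(2))
    return p_value, p_value >= alpha

# A balanced alternating sequence passes; an all-ones sequence fails.
p_ok, passed_ok = monobit_test([0, 1] * 500)
p_bad, passed_bad = monobit_test([1] * 1000)
```

Passing "nearly 95% of NIST tests", as the excerpt reports, means the generator's output clears many such per-test thresholds across the whole suite, of which this is only the first and weakest check.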
“…Several efforts have been made to produce PRNG sequences with neural networks, by Desai et al. [10], Desai et al. [9], Tirdad and Sadeghian [50], and Jeong et al. [21]. Tirdad and Sadeghian [50] and Jeong et al. [21] have described the most effective methods: the former used Hopfield neural networks to avoid convergence and promote dynamic behavior, whereas the latter used an LSTM trained on a random data sample to produce indices into the digits of pi.…”
Section: Related Work (mentioning)
confidence: 99%
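The index-into-pi approach attributed to Jeong et al. above can be sketched in a few lines. In the paper a trained LSTM emits the indices; here a tiny linear recurrence stands in for that model (a hypothetical placeholder, not the authors' network), and the digits of pi are hardcoded rather than computed:

```python
# First 50 decimal digits of pi -- the irrational-number digit source.
PI_DIGITS = "14159265358979323846264338327950288419716939937510"

def toy_index_generator(seed, count):
    """Hypothetical stand-in for the trained LSTM: a small linear
    recurrence that emits indices into PI_DIGITS."""
    x = seed
    for _ in range(count):
        x = (x * 21 + 7) % len(PI_DIGITS)
        yield x

def pi_prng(seed, count):
    """Emit pseudo-random decimal digits by sampling pi's digit
    expansion at the generated indices."""
    return [int(PI_DIGITS[i]) for i in toy_index_generator(seed, count)]

digits = pi_prng(seed=3, count=10)
```

The sketch only illustrates the structure of the scheme: the quality of the output rests entirely on how unpredictable the index sequence is, which is what the LSTM provides in the original design.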
“…Some of them are as follows: 1. PRNG: Pseudo-random number generators (PRNGs) are predefined procedures that can deliberately generate long sequences of numbers with excellent random properties, although the sequence may eventually repeat [5]. The random numbers generated by such algorithms are generally determined by a fixed number called a seed [6].…”
Section: Related Work (mentioning)
confidence: 99%
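Both properties named above — seed-determined output and an eventually repeating sequence — are visible in a minimal linear congruential generator (a standard textbook PRNG, used here purely as an illustration):

```python
def lcg(seed, count, a=1103515245, c=12345, m=2**31):
    """Minimal linear congruential generator. Each output is a fixed
    function of the previous state, so the whole sequence is determined
    by the seed, and the finite state space (size m) bounds the period."""
    x = seed
    out = []
    for _ in range(count):
        x = (a * x + c) % m
        out.append(x)
    return out

# Same seed -> identical sequence (the determinism described above).
run_a = lcg(42, 5)
run_b = lcg(42, 5)

# A tiny modulus makes the repetition visible: with m = 8 the state
# must revisit itself within 8 steps, so the sequence cycles.
small = lcg(1, 16, a=5, c=3, m=8)
```

With the tiny modulus the second half of `small` repeats the first half exactly, which is the "sequence may repeat" behavior the excerpt mentions, just on a deliberately small scale.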
“…Unfortunately, compared with a TRNG, a regular PRNG is quite vulnerable to machine learning (ML) attacks, since ML is able to model algorithms [3]. For example, long short-term memory (LSTM) [4] is an advanced ML technique that has been demonstrated to be efficient at cracking regular PRNGs. The primary reason is that a hardware-implemented PRNG is a sequential logic circuit, and LSTMs are efficient at modelling sequential logic circuits.…”
mentioning
confidence: 99%
“…Security robustness against regular LSTM attacks: since a hardware-implemented regular PRNG is a sequential logic circuit containing at least one feedback loop, the LSTM algorithm [4] can be explored to model the regular PRNG. Fig.…”
mentioning
confidence: 99%