2005 Annual IEEE India Conference - Indicon
DOI: 10.1109/indcon.2005.1590168
Cryptography Using Neural Network

Cited by 33 publications (16 citation statements)
References 3 publications
“…If the synaptic depth (L) is increased, the complexity of a successful attack grows exponentially, while the effort needed to generate a key grows only polynomially. TPM-based synchronization proceeds as follows [3,4,5,6,7].…”
Section: Tree Parity Machine (TPM)
confidence: 99%
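The synchronization the excerpt refers to can be sketched as follows. The construction is the standard tree parity machine: K hidden units with N inputs each, weights bounded by the synaptic depth L, a parity output, and a Hebbian update applied only when the two parties' public outputs agree. The parameter values below are illustrative, not taken from the cited papers.

```python
import random

K, N, L = 3, 4, 3                      # hidden units, inputs per unit, synaptic depth (illustrative)
rng = random.Random(42)

def new_weights():
    """Random integer weights in [-L, L] for each of the K hidden units."""
    return [[rng.randint(-L, L) for _ in range(N)] for _ in range(K)]

def tpm_output(w, x):
    """sigma_k = sign of each hidden unit's local field; tau = parity of the sigmas."""
    sigma = []
    for k in range(K):
        s = sum(w[k][i] * x[k][i] for i in range(N))
        sigma.append(1 if s > 0 else -1)   # map a zero field to -1 to break ties
    tau = 1
    for s in sigma:
        tau *= s
    return sigma, tau

def hebbian_update(w, x, sigma, tau):
    """Only hidden units that agreed with the overall output move their weights."""
    for k in range(K):
        if sigma[k] == tau:
            for i in range(N):
                w[k][i] = max(-L, min(L, w[k][i] + tau * x[k][i]))

wA, wB = new_weights(), new_weights()
steps = 0
while wA != wB and steps < 100000:
    # both parties see the same public random input
    x = [[rng.choice([-1, 1]) for _ in range(N)] for _ in range(K)]
    sA, tA = tpm_output(wA, x)
    sB, tB = tpm_output(wB, x)
    if tA == tB:                       # learn only when the public outputs agree
        hebbian_update(wA, x, sA, tA)
        hebbian_update(wB, x, sB, tB)
    steps += 1

synced = (wA == wB)                    # identical weight matrices become the shared secret
```

Increasing L enlarges the weight space each unit must traverse, which is where the exponential growth of the attacker's effort versus the polynomial growth of the partners' effort comes from.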
“…• DHLP uses two hidden layers instead of the single hidden layer of the TPM [3,4,5,6,7]. • Rather than increasing the number of neurons in a single hidden layer, DHLP introduces an additional (second) hidden layer, which raises the architectural complexity of the network and in turn makes it harder for an attacker to guess the internal representation of the DHLP.…”
Section: Double Hidden Layer Perceptron (DHLP)
confidence: 99%
“…In order to construct an authentication certificate, GSMLPSA assumes that the group holds a secret password which can be used to authenticate the exchange protocol. This password can be mapped to a multilayer-perceptron-guided cryptographic public parameter, which serves as the initial seed for a random number generator that encrypts the output bits in a fashion similar to the one proposed in [12,13]. Assume a random number generator (RNG), …”
Section: GSMLPSA Certificate Generation
confidence: 99%
“…In this case, the two partners A and B do not have to share a common secret beforehand but use their indistinguishable synchronized weights as the secret key needed for encryption. The fundamental concept of the neural network based key exchange protocol [3,4,5,6] rests on two key attributes of neural networks. First, two networks coupled over a public channel will synchronize even though each individual network exhibits disorganized behavior.…”
Section: Encryption and Decryption Time
confidence: 99%
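Once the networks have synchronized, their identical weight matrices are the shared secret. A common follow-on step, assumed here rather than prescribed by the cited papers, is to hash the flattened weights into a fixed-length symmetric key:

```python
import hashlib

def key_from_weights(weights):
    """Illustrative derivation: flatten the synchronized weight matrix and
    hash it into a 32-byte symmetric key. The delimiter keeps distinct
    weight vectors from colliding after flattening."""
    flat = ",".join(str(v) for row in weights for v in row)
    return hashlib.sha256(flat.encode()).digest()

# both partners hold the same weights after synchronization,
# so both derive the same key without ever transmitting it
key = key_from_weights([[3, -1, 0], [2, 2, -3]])
```

This is why no prior shared secret is needed: the key material emerges from the synchronization process itself, and only the public inputs and parity outputs ever cross the channel.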