2019
DOI: 10.1002/cta.2623

Using deep learning to combine static and dynamic power analyses of cryptographic circuits

Abstract: Side‐channel attacks have been shown to be efficient tools for breaking cryptographic hardware. Many conventional algorithms have been proposed to perform side‐channel attacks exploiting dynamic power leakage. In recent years, with the development of processing technology, static power has emerged as a new potential source of side‐channel leakage. Both types of power leakage have their advantages and disadvantages. In this work, we propose to use the deep neural network technique to combine the benefits…

Citations: cited by 9 publications (8 citation statements)
References: 46 publications (81 reference statements)
“…LSTM has three types of gates, called the forget gate ($\Gamma_f^t$), the update gate ($\Gamma_u^t$), and the output gate ($\Gamma_o^t$), which are calculated using Equations (1)–(3). The forget gate is used to clear memory that is no longer in use, the update gate is used to update the memory with new information, and the output gate decides which output is to be used.29,30 The output of the LSTM model is calculated using Equations (4)–(7).…”
Section: Methods (mentioning)
confidence: 99%
“…The forget gate is used to clear memory that is no longer in use, the update gate is used to update the memory with new information, and the output gate decides which output is to be used.29,30 The output of the LSTM model is calculated using Equations (4)–(7).…”
Section: Long Short-Term Memory (mentioning)
confidence: 99%
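The gate equations referenced above (Equations (1)–(7) in the citing work) follow the standard LSTM cell update. The sketch below is a minimal NumPy illustration of one time step, not code from the cited papers; the weight shapes, variable names, and the toy usage at the end are assumptions for illustration only.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, params):
    """One LSTM time step with forget (Gamma_f), update (Gamma_u), and output (Gamma_o) gates."""
    W_f, b_f, W_u, b_u, W_c, b_c, W_o, b_o = params
    z = np.concatenate([h_prev, x_t])             # stacked [h_{t-1}; x_t]
    gamma_f = sigmoid(W_f @ z + b_f)              # forget gate: which memory to clear
    gamma_u = sigmoid(W_u @ z + b_u)              # update gate: which memory to overwrite
    c_tilde = np.tanh(W_c @ z + b_c)              # candidate memory content
    c_t = gamma_f * c_prev + gamma_u * c_tilde    # updated cell state
    gamma_o = sigmoid(W_o @ z + b_o)              # output gate: which memory to expose
    h_t = gamma_o * np.tanh(c_t)                  # hidden state / output of the cell
    return h_t, c_t

# Toy usage (assumed sizes): 8 input features, 4 hidden units.
rng = np.random.default_rng(0)
n_x, n_h = 8, 4
params = tuple(p for _ in range(4)
               for p in (rng.normal(size=(n_h, n_h + n_x)) * 0.1, np.zeros(n_h)))
h, c = np.zeros(n_h), np.zeros(n_h)
h, c = lstm_step(rng.normal(size=n_x), h, c, params)
```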
“…It can be used to combine two different neural networks that are trained for the same task but on different datasets. Two fusion techniques are commonly used in existing works: one is called early fusion and the other late fusion [24], [25]. Early fusion merges the layers of the different neural networks at an early stage, while late fusion merges them at a late stage, close to the output.…”
Section: Construction of Multi-Input Model (mentioning)
confidence: 99%
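The early/late fusion distinction can be illustrated with two toy branch networks. The sketch below is a minimal NumPy illustration, not the multi-input model of the citing work; the use of static and dynamic power traces as the two input modalities, the layer sizes, and the random (untrained) weights are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, weights):
    """Tiny feed-forward network: linear layers with ReLU between them."""
    for i, W in enumerate(weights):
        x = x @ W
        if i < len(weights) - 1:
            x = np.maximum(x, 0.0)
    return x

# Two hypothetical modalities (e.g. static and dynamic power traces), 32 examples each.
static_traces  = rng.normal(size=(32, 200))
dynamic_traces = rng.normal(size=(32, 200))

# Early fusion: concatenate the inputs (or early-layer features) and use a single network.
early_in  = np.concatenate([static_traces, dynamic_traces], axis=1)         # (32, 400)
early_out = mlp(early_in, [rng.normal(size=(400, 64)), rng.normal(size=(64, 16))])

# Late fusion: run a separate branch per modality and merge only near the output layer.
static_feat  = mlp(static_traces,  [rng.normal(size=(200, 64)), rng.normal(size=(64, 16))])
dynamic_feat = mlp(dynamic_traces, [rng.normal(size=(200, 64)), rng.normal(size=(64, 16))])
late_out = mlp(np.concatenate([static_feat, dynamic_feat], axis=1),         # (32, 32)
               [rng.normal(size=(32, 16))])
```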
“…In addition, the LFDRNN evaluation from input to output is very fast.23 Moreover, the LFDRNN can better approximate complex nonlinear device input–output relationships than the conventional RNN structure.17,46 In this paper, we implement a boost converter in an EH system, and the LFDRNN is investigated and used to model its nonlinear behavior.…”
Section: Local Feedback Deep Recurrent Neural Network (mentioning)
confidence: 99%
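The cited statement does not reproduce the LFDRNN equations. The sketch below only illustrates the general idea of local feedback, where each hidden unit is fed back to itself through a per-neuron weight rather than a full hidden-to-hidden matrix as in a conventional RNN; the layer sizes, activation, and this diagonal-feedback formulation are assumptions, not taken from the cited papers.

```python
import numpy as np

def local_feedback_step(x_t, h_prev, W_in, w_fb, W_out, b_h, b_o):
    """One step of a recurrent layer with local (per-neuron) feedback.

    Each hidden unit is fed back only to itself via the vector w_fb
    (element-wise), instead of a full hidden-to-hidden weight matrix.
    """
    h_t = np.tanh(W_in @ x_t + w_fb * h_prev + b_h)   # element-wise self-feedback
    y_t = W_out @ h_t + b_o                           # modelled device output
    return y_t, h_t

# Toy usage (assumed sizes): scalar output from a 2-dimensional input sequence.
rng = np.random.default_rng(1)
n_x, n_h = 2, 8
W_in, w_fb = rng.normal(size=(n_h, n_x)) * 0.1, rng.normal(size=n_h) * 0.1
W_out, b_h, b_o = rng.normal(size=(1, n_h)) * 0.1, np.zeros(n_h), np.zeros(1)
h = np.zeros(n_h)
for x_t in rng.normal(size=(10, n_x)):                # 10 time steps
    y_t, h = local_feedback_step(x_t, h, W_in, w_fb, W_out, b_h, b_o)
```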