Proceedings of the 18th ACM International Conference on Computing Frontiers 2021
DOI: 10.1145/3457388.3458870
Fault injection attacks on SoftMax function in deep neural networks

Cited by 7 publications (3 citation statements)
References 4 publications
“…As described in Figure 9, the dot-product AM first computes the dot product of the query with all keys, divides each by the square root of dk to prevent the products from growing too large, and then feeds the results to the Softmax function to obtain the corresponding weights (Jap et al., 2021).…”
Section: Methods
confidence: 99%
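The scaled dot-product step the citing authors describe can be sketched in a few lines of plain Python. This is an illustrative sketch, not code from the cited paper; the `query`/`keys` values are made up for demonstration:

```python
import math

def softmax(scores):
    # Subtract the max before exponentiating for numerical stability.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_weights(query, keys):
    """Scaled dot-product attention weights for a single query.

    Computes softmax(q . k_i / sqrt(d_k)) over all keys: the dot
    products are divided by sqrt(d_k) so they do not grow too large,
    then normalised by Softmax into weights.
    """
    d_k = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d_k)
              for key in keys]
    return softmax(scores)

# Toy example: two keys, one aligned with the query.
query = [1.0, 0.0]
keys = [[1.0, 0.0], [0.0, 1.0]]
w = attention_weights(query, keys)
# The weights form a distribution; the aligned key receives more weight.
```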
“…The normalization process is highly sensitive to very large or very small inputs, which are easily mapped to 0 and 1. Because the normalization then assigns nearly the full probability to the label corresponding to the maximum value, the model is trained with a scaling factor k to prevent the gradient from vanishing during backpropagation [41].…”
Section: Calculate the Distribution of Attention
confidence: 99%
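The saturation effect described above is easy to demonstrate: unscaled large logits collapse the softmax output to a near one-hot vector, while dividing by a scaling factor keeps the outputs away from the 0/1 extremes. The logits and the factor k = 20 below are arbitrary values chosen only for illustration:

```python
import math

def softmax(scores):
    # Stable softmax: shift by the max so exp() cannot overflow.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Unscaled large logits: the normalisation saturates, mapping the
# maximum to ~1 and everything else to ~0 (vanishing gradients).
raw = [40.0, 10.0, 5.0]
saturated = softmax(raw)

# Dividing by a scaling factor k (hypothetical choice k = 20 here)
# softens the distribution and keeps gradients alive.
k = 20.0
scaled = softmax([s / k for s in raw])
```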
“…The SoftMax function [10–12] has a wide range of applications in machine learning and deep learning, mainly in multi-class classification problems. In a multi-class problem, we usually want the output to be the probability of each category.…”
Section: Anti-domain Adaptation of Knowledge Mapping
confidence: 99%
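The multi-class use the citing authors mention can be sketched as converting raw classifier scores (logits) into a probability distribution and taking the most likely class. The three-class logits below are hypothetical, chosen only to show the mechanics:

```python
import math

def softmax(logits):
    # Stable softmax: shift by the max so exp() cannot overflow.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits from a 3-class classifier head.
logits = [2.0, 1.0, 0.1]
probs = softmax(logits)

# probs is a valid distribution (non-negative, sums to 1);
# the predicted class is the one with the highest logit.
predicted = max(range(len(probs)), key=probs.__getitem__)
```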