2023
DOI: 10.1049/cit2.12263

Neural dynamics for improving optimiser in deep learning with noise considered

Abstract: As deep learning evolves, neural network structures become increasingly sophisticated, bringing a series of new optimisation challenges. For example, deep neural networks (DNNs) are vulnerable to a variety of attacks. Training neural networks under privacy constraints can alleviate privacy leakage, and one way to do this is to add noise to the gradient. However, existing optimisers suffer from weak convergence in the presence of increased noise during training, which leads to a low robustness of…

Cited by 7 publications (1 citation statement)
References 39 publications
“…where the superscripts T and −1 represent transpose and inversion operations of a matrix, respectively (Su et al., 2023b; Liao et al., 2024a).…”
Section: Preliminaries
Confidence: 99%