2022
DOI: 10.1155/2022/3277730

A Frobenius Norm Regularization Method for Convolutional Kernel Tensors in Neural Networks

Abstract: The convolutional neural network is an important model in deep learning. Keeping the singular values of each layer's Jacobian bounded around 1 during training helps avoid the exploding/vanishing gradient problem and improves the generalizability of the network. We propose a new Frobenius norm penalty function for a convolutional kernel tensor that keeps the singular values of the corresponding transformation matrix bounded around 1, and we show how to carry out gradient-type methods. This…
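The abstract does not give the exact form of the proposed penalty, so the following is only a minimal NumPy sketch of the general idea: it builds the transformation matrix A of a small 2D circular convolution explicitly and evaluates a Frobenius-norm orthogonality penalty of the common form ||A Aᵀ − I||²_F, which is zero exactly when all singular values of A equal 1. The function names, the circular boundary condition, and this specific penalty form are illustrative assumptions, not the paper's stated method.

```python
import numpy as np

def circular_conv2d_matrix(kernel, n):
    """Assumed helper (not from the paper): build the (n*n)-by-(n*n) matrix A
    of 2D circular convolution with `kernel` on an n-by-n single-channel image,
    by applying the convolution to each standard-basis image."""
    k = kernel.shape[0]
    A = np.zeros((n * n, n * n))
    for idx in range(n * n):
        e = np.zeros((n, n))
        e[idx // n, idx % n] = 1.0
        out = np.zeros((n, n))
        # circular (wrap-around) convolution of the basis image with the kernel
        for i in range(n):
            for j in range(n):
                s = 0.0
                for p in range(k):
                    for q in range(k):
                        s += kernel[p, q] * e[(i - p) % n, (j - q) % n]
                out[i, j] = s
        A[:, idx] = out.ravel()
    return A

def frobenius_orthogonality_penalty(A):
    """Penalty ||A A^T - I||_F^2: zero exactly when every singular value of A
    equals 1, and it grows as the singular values drift away from 1."""
    G = A @ A.T - np.eye(A.shape[0])
    return np.sum(G * G)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    kernel = 0.1 * rng.standard_normal((3, 3))
    A = circular_conv2d_matrix(kernel, n=8)
    sv = np.linalg.svd(A, compute_uv=False)
    print("singular values in [%.3f, %.3f]" % (sv.min(), sv.max()))
    print("penalty:", frobenius_orthogonality_penalty(A))
```

In a training loop this penalty (or the paper's variant of it) would be added to the loss with a weighting coefficient and minimized by the gradient-type methods the abstract refers to, nudging the convolution's singular values toward 1.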

Cited by 0 publications. References 15 publications (17 reference statements).