2023
DOI: 10.1109/access.2023.3315308

PolyLU: A Simple and Robust Polynomial-Based Linear Unit Activation Function for Deep Learning

Han-Shen Feng, Cheng-Hsiung Yang

Abstract: The activation function has a critical influence on whether a convolutional neural network in deep learning can converge; a proper activation function not only makes the network converge faster but can also reduce the complexity of the network architecture while achieving the same or better performance. Many activation functions have been proposed, but each has its own advantages, defects, and suitable network architectures. A new activation functio…
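The abstract cuts off before PolyLU itself is defined. As a hedged illustration of what a polynomial-based linear unit of this kind can look like, the sketch below implements an assumed piecewise form: identity for non-negative inputs and a bounded rational curve for negative ones. The negative-branch formula x / (1 - x) and the names polylu and polylu_grad are assumptions made for illustration, not quotations from the paper.

```python
import numpy as np

def polylu(x):
    """Sketch of a PolyLU-style activation (assumed form, not quoted from
    the paper): identity for x >= 0 and the bounded rational branch
    x / (1 - x) for x < 0, which tends to -1 as x -> -inf, much like ELU."""
    x = np.asarray(x, dtype=float)
    # The denominator is 1 for x >= 0 (identity) and 1 - x for x < 0,
    # so one expression covers both branches without a divide-by-zero.
    return x / (1.0 - np.minimum(x, 0.0))

def polylu_grad(x):
    """Derivative of the sketch: 1 for x >= 0, 1 / (1 - x)**2 for x < 0.
    Both branches equal 1 at x = 0, so the gradient is continuous there."""
    x = np.asarray(x, dtype=float)
    return 1.0 / (1.0 - np.minimum(x, 0.0)) ** 2

if __name__ == "__main__":
    xs = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
    print(polylu(xs))       # negative inputs saturate softly toward -1
    print(polylu_grad(xs))  # gradient stays nonzero for negative inputs
```

Under this assumed form the activation is continuous, has a continuous first derivative at zero, and keeps a nonzero gradient for negative inputs: properties commonly associated with the faster convergence the abstract describes, in contrast to ReLU's hard zero cutoff.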

Cited by 1 publication
References: 68 publications