Average biased ReLU based CNN descriptor for improved face retrieval
2021 | DOI: 10.1007/s11042-020-10269-x
Cited by 31 publications (14 citation statements) | References 42 publications
“…The role of the activation function is to introduce nonlinear factors that enhance the feature expression ability of the model. At present, the Rectified Linear Unit (ReLU) is the most versatile and effective activation function (Dubey & Chakraborty, 2021); it improves network sparsity and reduces over-fitting. The expression of ReLU is: …”
Section: Theoretical Methods
confidence: 99%
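The excerpt above truncates the formula it introduces. The standard ReLU definition it refers to is supplied here for completeness (this is the widely used textbook form, not text recovered from the citing paper):

$$\mathrm{ReLU}(x) = \max(0, x)$$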
“…In [25], the authors proposed the average biased ReLU (AB-ReLU). Popular CNNs, such as AlexNet and VGGNet, have been used as discriminative feature descriptors in computer vision.…”
Section: Related Work
confidence: 99%
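A minimal sketch of the AB-ReLU idea referenced above, assuming (as the paper's title suggests) that the activation threshold is shifted by the average of the input feature map rather than fixed at zero; the function name and the exact sign/bias convention are illustrative assumptions, not taken from [25]:

```python
import numpy as np

def ab_relu(x):
    """Average Biased ReLU (illustrative sketch).

    Assumption: instead of thresholding at zero like standard ReLU,
    the activation is biased by the mean of the input feature map,
    so values are kept only where they exceed the average. The exact
    convention in [25] may differ.
    """
    beta = x.mean()                 # average of the input volume
    return np.maximum(x - beta, 0.0)

# Usage: entries below the feature map's mean (1.25 here) are zeroed.
feature_map = np.array([[-1.0, 0.5], [2.0, 3.5]])
print(ab_relu(feature_map))         # [[0. 0.] [0.75 2.25]]
```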
“…In comparison to the typical deconvolution layer, this subpixel convolution can decompose low-resolution data into high-resolution space without additional computation. Since ReLU has good nonlinear fitting performance [41], it is used as the model's activation function.…”
Section: A. Data Collection and Preprocessing
confidence: 99%
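A minimal sketch of the subpixel-convolution pattern this excerpt describes, using PyTorch's nn.PixelShuffle; the channel count, kernel size, and upscale factor are illustrative assumptions, not values from the cited model:

```python
import torch
import torch.nn as nn

# Subpixel convolution: a regular convolution produces C * r^2 channels,
# then PixelShuffle rearranges them into an r-times larger spatial grid,
# avoiding the checkerboard cost of a learned deconvolution layer.
r = 2  # upscale factor (assumed for illustration)
block = nn.Sequential(
    nn.Conv2d(in_channels=16, out_channels=16 * r * r, kernel_size=3, padding=1),
    nn.PixelShuffle(r),   # (N, 16*r^2, H, W) -> (N, 16, H*r, W*r)
    nn.ReLU(),            # ReLU as the activation, as in the excerpt
)

x = torch.randn(1, 16, 8, 8)   # low-resolution feature map
y = block(x)
print(y.shape)                  # torch.Size([1, 16, 16, 16])
```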