2015
DOI: 10.1007/978-3-319-26532-2_6
Max-Pooling Dropout for Regularization of Convolutional Neural Networks

Abstract: Recently, dropout has seen increasing use in deep learning. For deep convolutional neural networks, dropout is known to work well in fully-connected layers. However, its effect in pooling layers is still not clear. This paper demonstrates that max-pooling dropout is equivalent to randomly picking activation based on a multinomial distribution at training time. In light of this insight, we advocate employing our proposed probabilistic weighted pooling, instead of commonly used max-pooling, to act as mo…
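To make the abstract's claim concrete, here is a minimal NumPy sketch of max-pooling dropout and the proposed probabilistic weighted pooling. The function names are illustrative, the code assumes nonnegative (e.g. ReLU) activations so that a fully dropped region pools to 0, and the test-time weighting is a reconstruction of the scheme the abstract describes, not the authors' reference implementation:

```python
import numpy as np

def max_pooling_dropout(region, p=0.5, rng=None):
    """Max-pooling dropout over one pooling region (training time).

    Each unit is retained with probability 1 - p; the output is the
    max over retained units (0 if every unit is dropped). This is
    equivalent to sampling the output from a multinomial distribution
    over the region's activations, as the paper observes.
    """
    region = np.asarray(region, dtype=float)
    rng = rng or np.random.default_rng()
    keep = rng.random(region.shape) >= p   # retain with prob 1 - p
    kept = np.where(keep, region, -np.inf)
    out = kept.max()
    return float(out) if np.isfinite(out) else 0.0

def prob_weighted_pooling(region, p=0.5):
    """Probabilistic weighted pooling (test time): weight each
    activation by the probability that it would be the surviving
    max under max-pooling dropout with drop probability p."""
    a = np.sort(np.asarray(region, dtype=float).ravel())  # ascending
    n = a.size
    q = 1.0 - p
    # a[i] is the output iff it is retained and all larger units drop:
    # probability q * p**(number of larger units)
    weights = q * p ** (n - 1 - np.arange(n))
    return float((weights * a).sum())
```

With p = 0 (no dropout), probabilistic weighted pooling reduces to ordinary max-pooling, which is a quick sanity check on the weighting.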

Cited by 144 publications (106 citation statements)
References: 1 publication
“…In [31], a comprehensive theoretical analysis of max pooling and average pooling is given, whereas in [32] it is shown that max pooling can lead to faster convergence, with the network selecting the highest-ranking features in the image and thereby improving generalization. The pooling layer also has other variants, such as stochastic pooling [33], spatial pyramid pooling [34], and def-pooling [35], each serving a distinct purpose.…”
Section: Appl Sci 2019, 9, x, For Peer Review, 3 of 22 (mentioning)
confidence: 99%
“…The CNN architecture, presented in Figure 1, consists of three convolutional layers with a max-pooling layer, a dropout function [33] and four fully connected layers. Using this structure, five different CNNs were developed as presented in Table 1.…”
Section: Convolutional Neural Network Architecture (mentioning)
confidence: 99%
“…The Adam optimizer is utilized to update the weights and biases of the entire network. Wu and Gu [18] showed that a dropout layer improves the neural network and can effectively prevent overfitting, so we added a dropout layer after each of the two fully connected layers at the end of the network. These partial improvements to the basic LeNet-5 are effective for classifying flutter signals (Figure 1).…”
Section: Convolutional Neural Network (mentioning)
confidence: 99%
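The dropout layer the quoted passage adds after the fully connected layers is typically implemented as inverted dropout. A minimal sketch, assuming the standard formulation (the function name and NumPy framing are illustrative, not taken from the cited work):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: at training time, zero each unit with
    probability p and scale survivors by 1 / (1 - p), so the
    expected activation is preserved and test time needs no
    rescaling."""
    if not training or p == 0.0:
        return x  # identity at inference time
    x = np.asarray(x, dtype=float)
    rng = rng or np.random.default_rng()
    mask = (rng.random(x.shape) >= p).astype(float) / (1.0 - p)
    return x * mask
```

In a LeNet-5-style network like the one described, this would be applied to the output of each of the two fully connected layers during training only.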