2018
DOI: 10.1016/j.neunet.2018.05.015
A sparsity-based stochastic pooling mechanism for deep convolutional neural networks

Cited by 29 publications (22 citation statements)
References 16 publications
“…To a certain extent, it will alleviate the lack of medical service capacity in remote areas and the unbalanced development of medical standards [29]. Medical breakthroughs across geographical limits are conducive to the development of clinical research and to improving the diagnosis and treatment level of small and medium-sized physicians.…”
Section: Medical Internet of Things
confidence: 99%
“…The objective of the pooling layer is to achieve robustness to illumination changes and position variations with invariance to feature transformations [25]. In practice, most early CNNs have employed popular pooling methods, such as average pooling and maximum pooling [26]. Average pooling considers all the elements in the pooling region to prevent variance increases while retaining the background information [26,27], whereas maximum pooling only captures the foreground texture information of the strongest activation as a representative feature of a region of an image, as shown in Figure 2 [24]. However, Boureau et al. [28] noted that there are some drawbacks, as maximum and average pooling may lose information representing the background and foreground, respectively [26].…”
Section: Hybrid Pooling
confidence: 99%
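
To make the pooling behaviors described in the excerpt above concrete, the following is a minimal NumPy sketch, not the cited paper's implementation, comparing average pooling, maximum pooling, and a simple activation-weighted stochastic pooling over non-overlapping 2×2 regions. The function names, the example feature map, and the probability-proportional sampling rule are illustrative assumptions.

```python
import numpy as np

def pool_regions(feature_map, size=2):
    """Yield non-overlapping size x size regions of a 2-D feature map."""
    h, w = feature_map.shape
    for i in range(0, h - h % size, size):
        for j in range(0, w - w % size, size):
            yield feature_map[i:i + size, j:j + size]

def average_pool(region):
    # Average pooling keeps a contribution from every element,
    # so background information in the region is retained.
    return region.mean()

def max_pool(region):
    # Max pooling keeps only the strongest activation,
    # i.e. the foreground texture of the region.
    return region.max()

def stochastic_pool(region, rng):
    # A simple stochastic-pooling variant (assumption, not the paper's rule):
    # sample one activation with probability proportional to its value.
    acts = region.ravel()
    total = acts.sum()
    probs = acts / total if total > 0 else np.full(acts.size, 1.0 / acts.size)
    return rng.choice(acts, p=probs)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fmap = np.array([[0.1, 0.9, 0.0, 0.2],
                     [0.3, 0.5, 0.4, 0.0],
                     [0.0, 0.0, 0.7, 0.1],
                     [0.2, 0.6, 0.3, 0.3]])
    for region in pool_regions(fmap):
        print(average_pool(region), max_pool(region), stochastic_pool(region, rng))
```

With non-negative activations, the stochastic variant tends to pick strong activations (like max pooling) while still occasionally keeping weaker, background-like responses (like average pooling), which is the trade-off the excerpt describes.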
“…Therefore, the learning network used in this paper is also a guided (supervised) learning network. A convolutional network is essentially an input-to-output mapping: it can learn a large number of mappings between input and output without any precise mathematical expression relating them, as long as known input-output pairs are available [32]. Training the convolutional network in this mode gives the network the ability to map between input and output pairs.…”
Section: B. Convolutional Network Training Process
confidence: 99%
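
As a rough illustration of the supervised input-to-output training described in that excerpt, here is a minimal PyTorch sketch of a small convolutional network fitted to known input-output pairs. The architecture, hyperparameters, and synthetic data are assumptions chosen only to show the training mode, not the network used in the citing paper.

```python
import torch
import torch.nn as nn

# A minimal guided (supervised) training loop: the network learns the
# input-to-output mapping purely from known (input, label) pairs, without
# any explicit mathematical expression relating inputs to outputs.
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),                      # pooling layer, as discussed above
    nn.Flatten(),
    nn.Linear(8 * 14 * 14, 10),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Synthetic stand-in for a labelled dataset of 28x28 single-channel images.
inputs = torch.randn(64, 1, 28, 28)
labels = torch.randint(0, 10, (64,))

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), labels)  # error between prediction and label
    loss.backward()                        # gradients flow back to the weights
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```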