2021
DOI: 10.1155/2021/8820116
Automatic Impervious Surface Area Detection Using Image Texture Analysis and Neural Computing Models with Advanced Optimizers

Abstract: Up-to-date information regarding impervious surface is valuable for urban planning and management. The objective of this study is to develop neural computing models used for automatic impervious surface area detection at a regional scale. To achieve this task, advanced optimizers of adaptive moment estimation (Adam), a variation of Adam called Adamax, Nesterov-accelerated adaptive moment estimation (Nadam), Adam with decoupled weight decay (AdamW), and a new exponential moving average variant (AMSGrad) are use…
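The abstract names several variants that all modify the same baseline update. As a hedged illustration (not code from the paper), the sketch below shows one Adam step in NumPy, with optional switches for two of the listed variants: AMSGrad (a non-decreasing second-moment estimate) and AdamW (decoupled weight decay). The function name and parameters are illustrative, not taken from the study.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, vhat_max=None, lr=0.001,
              beta1=0.9, beta2=0.999, eps=1e-8, weight_decay=0.0):
    """One optimizer step covering three of the variants above: plain Adam,
    AMSGrad (pass a running vhat_max array), and AdamW (weight_decay > 0)."""
    m = beta1 * m + (1 - beta1) * grad           # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2      # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                 # bias corrections
    v_hat = v / (1 - beta2 ** t)
    if vhat_max is not None:                     # AMSGrad: never let v_hat shrink
        vhat_max = np.maximum(vhat_max, v_hat)
        v_hat = vhat_max
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    if weight_decay > 0.0:                       # AdamW: decay decoupled from the gradient
        theta = theta - lr * weight_decay * theta
    return theta, m, v, vhat_max
```

A toy run on f(x) = x² drives x from its starting point toward the minimum at 0 within a few hundred steps.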

Cited by 12 publications (6 citation statements)
References 60 publications (69 reference statements)
“…The Nadam optimizer, described in Dozat (2016), integrates Nesterov-accelerated adaptive moment estimation into Adam. The main benefit of this combined model is that the adaptive moment estimation it uses takes highly accurate steps in the gradient direction by updating the model parameters with the momentum step before the gradient is computed (Hoang, 2021). The update rule of Nadam is expressed as:…”
Section: Data Classification Model
confidence: 99%
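The quoted update rule is truncated in this excerpt. As a hedged sketch (not the cited paper's code), the NumPy function below implements one Nadam step following Dozat (2016): an Adam step whose bias-corrected momentum is blended with the current gradient, giving the Nesterov-style look-ahead the statement describes. Names and the toy usage are illustrative.

```python
import numpy as np

def nadam_step(theta, grad, m, v, t, lr=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Nadam update: Adam with a Nesterov look-ahead on the first moment."""
    m = beta1 * m + (1 - beta1) * grad           # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2      # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                 # bias corrections
    v_hat = v / (1 - beta2 ** t)
    # Nesterov look-ahead: blend corrected momentum with the current gradient
    m_nesterov = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)
    theta = theta - lr * m_nesterov / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 5
theta, m, v = np.array([5.0]), np.zeros(1), np.zeros(1)
for t in range(1, 501):
    theta, m, v = nadam_step(theta, 2 * theta, m, v, t, lr=0.05)
```

Because the momentum step is applied before the next gradient evaluation, each parameter update anticipates where the momentum is carrying the iterate, which is the "highly accurate steps" property the citing papers emphasize.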
“…Finally, the hyperparameters of the DRNN are selected through the Nadam optimizer. Nadam combines Nesterov-accelerated adaptive moment estimation with Adam (Hoang, 2021). The benefit of this combined technique is that the adaptive moment estimation it employs takes highly accurate steps in the gradient direction.…”
Section: Proposed Intrusion Detection Scheme
confidence: 99%
“…The Adamax optimization method is a variation of the Adam optimization method whose update rule scales the model weights in proportion to the L_p norm of the current and past gradients [29]. Equation (21) shows the weight update according to the Adamax optimization method.…”
Section: G Adamax
confidence: 99%
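Equation (21) itself is not reproduced in this excerpt. As an illustrative sketch (not the cited paper's code), the step below follows the standard Adamax formulation, where the L_p norm is taken to its p → ∞ limit: the second moment is replaced by an exponentially weighted infinity norm of past gradients, which needs no bias correction. Names and defaults are illustrative.

```python
import numpy as np

def adamax_step(theta, grad, m, u, t, lr=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adamax update: Adam with the second moment replaced by an
    exponentially weighted infinity norm of the gradients."""
    m = beta1 * m + (1 - beta1) * grad        # first moment, as in Adam
    u = np.maximum(beta2 * u, np.abs(grad))   # infinity-norm term; no bias correction
    theta = theta - (lr / (1 - beta1 ** t)) * m / (u + eps)
    return theta, m, u
```

The max-based accumulator makes the effective per-parameter step size insensitive to rare large gradients, which is why Adamax is often described as more stable than Adam on noisy gradients.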
“…Nesterov-Accelerated Adaptive Moment Estimation (Nadam) is obtained by integrating Nesterov momentum into the Adam optimization method's update formula [29]. It ensures that the applied gradient step achieves high accuracy.…”
Section: H Nadam
confidence: 99%