2019
DOI: 10.1109/access.2019.2959662
Coastline Extraction Method Based on Convolutional Neural Networks—A Case Study of Jiaozhou Bay in Qingdao, China

Abstract: The traditional edge detection-based shoreline extraction method is severely disturbed by noise, and it is difficult to obtain a continuous coastline. In response to the above problems, we propose a coastline extraction method based on convolutional neural networks. Firstly, we replace the standard convolution with the Mini-Inception structure in the backbone network to extract multi-scale features of the object, and all the multi-scale features are concatenated. Then, we use the leaky-ReLU activation function…
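The abstract only names the building blocks (a Mini-Inception multi-scale convolution and leaky-ReLU activations); the exact branch widths and kernel sizes are not given in this record. The PyTorch sketch below is therefore only an illustration of the described idea, with assumed channel counts, kernel sizes, and input shape, not the paper's actual network:

import torch
import torch.nn as nn

class MiniInception(nn.Module):
    """Illustrative multi-scale block: parallel convolutions at several
    receptive fields, leaky-ReLU activations, channel-wise concatenation.
    Branch widths and kernel sizes are assumptions, not the paper's values."""

    def __init__(self, in_ch, branch_ch=32, negative_slope=0.01):
        super().__init__()
        self.branch1 = nn.Conv2d(in_ch, branch_ch, kernel_size=1)
        self.branch3 = nn.Conv2d(in_ch, branch_ch, kernel_size=3, padding=1)
        self.branch5 = nn.Conv2d(in_ch, branch_ch, kernel_size=5, padding=2)
        self.act = nn.LeakyReLU(negative_slope)

    def forward(self, x):
        # Multi-scale features are concatenated along the channel dimension.
        feats = [self.act(b(x)) for b in (self.branch1, self.branch3, self.branch5)]
        return torch.cat(feats, dim=1)

if __name__ == "__main__":
    # Hypothetical 4-band remote-sensing patch of 128x128 pixels.
    block = MiniInception(in_ch=4)
    out = block(torch.randn(1, 4, 128, 128))
    print(out.shape)  # torch.Size([1, 96, 128, 128])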

Cited by 27 publications (17 citation statements)
References: 24 publications
“…The literature presents different activation functions, such as rigid, linear limit, log-sigmoid (sigmoid), tangent hyperbolic sigmoid (tanh), positive linear (poslin) [1], [21], also called rectified linear unit (ReLU), Leaky-ReLU (LReLU), among others [1], [3], [20]. The sigmoid is one of the most used activation functions until recently [3].…”
Section: A. Artificial Neural Network Overview
confidence: 99%
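For reference, the standard textbook forms of the activation functions named in this statement (the notation below is mine, not necessarily that of the citing work) are:

\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad
\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}, \qquad
\mathrm{LReLU}(x) = \begin{cases} x, & x \ge 0 \\ \alpha x, & x < 0 \end{cases}

where \alpha > 0 is a small fixed slope (commonly 0.01).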
“…ReLU is very similar to the identity function, making the neural net learning process, based on this activation function, faster than sigmoid [20]. The ReLU function and its derivative are expressed as:…”
Section: A. Artificial Neural Network Overview
confidence: 99%
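The quoted passage truncates before the formulas it announces; the standard definitions of ReLU and its derivative (not necessarily the exact notation of the citing paper) are:

\mathrm{ReLU}(x) = \max(0, x), \qquad
\frac{d}{dx}\,\mathrm{ReLU}(x) = \begin{cases} 1, & x > 0 \\ 0, & x < 0 \end{cases}

(the derivative at x = 0 is undefined and is conventionally taken as 0 in implementations).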