2022
DOI: 10.3390/rs14092034

A Novel Anchor-Free Method Based on FCOS + ATSS for Ship Detection in SAR Images

Abstract: Ship detection in synthetic aperture radar (SAR) images has been widely applied in maritime management and surveillance. However, some issues still exist in SAR ship detection due to the complex surroundings, scattering interferences, and diversity of the scales. To address these issues, an improved anchor-free method based on FCOS + ATSS is proposed for ship detection in SAR images. First, FCOS + ATSS is applied as the baseline to detect ships pixel by pixel, which can eliminate the effect of anchors and avoi…

Cited by 14 publications (5 citation statements) · References 42 publications
“…Currently, anchor-based methods still have some drawbacks, such as the difficulty of assigning anchor boxes to each target when targets overlap. To address this issue, Zhu et al. [27] introduced an enhanced anchor-free detector based on the FCOS + ATSS network. The paper embeds an improved residual module (IRM) and deformable convolutions (Dconv) into the feature extraction network (FEN) to enhance accuracy.…”
Section: CNN-Based SAR Ship Detector
confidence: 99%
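For context on the ATSS component named above: ATSS selects positive training samples adaptively, using the mean plus the standard deviation of the candidate anchors' IoUs with a ground-truth box as a per-object threshold. A minimal numpy sketch of that statistical threshold (function names here are illustrative, not from the cited paper):

```python
import numpy as np

def atss_threshold(candidate_ious):
    """ATSS-style adaptive IoU threshold for one ground-truth box:
    mean + standard deviation of the candidate anchors' IoUs."""
    ious = np.asarray(candidate_ious, dtype=float)
    return ious.mean() + ious.std()

def select_positives(candidate_ious):
    """Indices of candidate anchors whose IoU meets the adaptive threshold."""
    ious = np.asarray(candidate_ious, dtype=float)
    return np.flatnonzero(ious >= atss_threshold(ious))

# Example: nine candidate IoUs for one ground-truth box; only the
# clearly-overlapping candidates pass the mean + std cut.
ious = [0.1, 0.2, 0.15, 0.6, 0.7, 0.65, 0.1, 0.05, 0.2]
print(select_positives(ious))  # → [3 4 5]
```

Because the threshold adapts to each object's IoU statistics, well-covered objects demand higher overlap while poorly-covered ones still receive positives, which is what removes the hand-tuned IoU threshold of anchor-based assignment.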
“…Humayun et al. [23] formulated anchor boxes matched to the targets based on the distribution of target sizes in the dataset, enhancing the network's sensitivity to targets of specific sizes and improving training efficiency and detection accuracy; however, this approach easily overlooks targets outside the chosen size interval. Anchor-free detection has gained widespread adoption for accurately delineating ship boundaries in SAR image target detection [24][25][26][27][28][29][30]. Sun et al. [31] performed pixel-by-pixel prediction to reduce false positives and misses based on the anchor-free mechanism and optimized target localization using the CP module.…”
Section: Introduction
confidence: 99%
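The pixel-by-pixel prediction mentioned above is the FCOS formulation: each location inside a ground-truth box regresses the four distances to the box sides, plus a "centerness" score that down-weights off-center locations at inference. A minimal numpy sketch of those per-location targets (the function name is illustrative):

```python
import numpy as np

def fcos_targets(px, py, box):
    """Per-pixel regression targets (l, t, r, b) and centerness for a
    location (px, py) inside a ground-truth box (x1, y1, x2, y2)."""
    x1, y1, x2, y2 = box
    l, t = px - x1, py - y1   # distances to the left / top sides
    r, b = x2 - px, y2 - py   # distances to the right / bottom sides
    # Centerness: 1.0 at the box center, decaying towards the edges.
    centerness = np.sqrt((min(l, r) / max(l, r)) * (min(t, b) / max(t, b)))
    return (l, t, r, b), centerness

# A location at the exact center of a 100x100 box has centerness 1.0.
targets, c = fcos_targets(50, 50, (0, 0, 100, 100))
print(targets, round(c, 3))  # → (50, 50, 50, 50) 1.0
```

Multiplying the classification score by this centerness suppresses the low-quality boxes predicted far from object centers, which is one way an anchor-free detector reduces false positives without any anchor matching.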
“…Current research often emphasizes deepening feature fusion without fundamentally addressing semantic ambiguity. Current SAR ship detection methods predominantly rely on horizontal anchor-box algorithms [43][44][45], oriented anchor-box algorithms [46][47][48], and anchor-free detection algorithms [24][25][26][27][28][29][30]. However, anchor boxes can consume substantial computational resources owing to the discontinuous boundaries of SAR image instances, varying instance density, and scattering noise.…”
Section: Introduction
confidence: 99%
“…The acquisition methods of optical and synthetic aperture radar (SAR) images in remote sensing have been gradually diversified and improved in quality with the development of Earth observation technology. The interpretation of optical and SAR images can help relevant departments to effectively obtain valuable information, which can be beneficial for many tasks, including environmental monitoring [1,2], marine management [3,4], resource planning [5], and natural disaster damage analysis [6]. Therefore, research on optical and SAR image interpretation is of great practical importance.…”
Section: Introduction
confidence: 99%