2020
DOI: 10.3390/rs12183096

Unsupervised Parameterization for Optimal Segmentation of Agricultural Parcels from Satellite Images in Different Agricultural Landscapes

Abstract: Image segmentation is a cost-effective way to obtain information about the sizes and structural composition of agricultural parcels in an area. To accurately obtain such information, the parameters of the segmentation algorithm ought to be optimized using supervised or unsupervised methods. The difficulty in obtaining reference data makes unsupervised methods indispensable. In this study, we evaluated an existing unsupervised evaluation metric that minimizes a global score (GS), which is computed by summing up…
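
The truncated abstract does not show which terms the global score sums. A widely used unsupervised formulation combines an area-weighted intra-segment variance (homogeneity) with a global Moran's I of segment means (heterogeneity), each normalized across the candidate segmentations, and selects the parameter set with the lowest sum. The sketch below illustrates that common formulation only; the function names, the adjacency input, and the min-max normalization are assumptions for illustration, not necessarily the paper's exact GS.

```python
import numpy as np

def weighted_variance(image, labels):
    """Area-weighted intra-segment variance (homogeneity term)."""
    seg_ids = np.unique(labels)
    areas = np.array([np.sum(labels == s) for s in seg_ids], dtype=float)
    variances = np.array([image[labels == s].var() for s in seg_ids])
    return float(np.sum(areas * variances) / np.sum(areas))

def morans_i(image, labels, adjacency):
    """Global Moran's I of segment mean values (heterogeneity term).
    adjacency: non-empty dict mapping segment id -> neighbouring segment ids,
    with each neighbour pair listed in both directions (symmetric weights)."""
    seg_ids = np.unique(labels)
    index = {s: i for i, s in enumerate(seg_ids)}
    means = np.array([image[labels == s].mean() for s in seg_ids])
    y = means - means.mean()
    num = sum(y[index[s]] * y[index[t]]
              for s, nbrs in adjacency.items() for t in nbrs)
    n_links = sum(len(nbrs) for nbrs in adjacency.values())
    return (len(seg_ids) / n_links) * num / (float(np.sum(y ** 2)) + 1e-12)

def global_scores(candidates):
    """Sum min-max normalized variance and Moran's I across a set of
    candidate segmentations; the lowest score marks the preferred parameters.
    candidates: list of (weighted_variance, morans_i) tuples."""
    wv = np.array([c[0] for c in candidates])
    mi = np.array([c[1] for c in candidates])
    norm = lambda x: (x - x.min()) / (x.max() - x.min() + 1e-12)
    return norm(wv) + norm(mi)
```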

Cited by 12 publications (4 citation statements)
References 52 publications

“…In computer vision tasks, the intersection over union (IoU) is a commonly employed evaluation index. Furthermore, the F measure can also be utilized for assessing the segmentation results of deep learning methods [45]. Given that SAM only segments some geographical objects in the image, it is unreasonable to apply evaluation indicators other than F and IoU to assess its segmentation results.…”
Section: B. Performance Evaluations (mentioning)
confidence: 99%
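
The statement above evaluates SAM's outputs with IoU and the F measure. Below is a minimal pixel-wise sketch of both metrics on binary masks; the function names and the beta parameter are illustrative, and the cited work may compute the metrics per object rather than per image.

```python
import numpy as np

def iou(pred, ref):
    """Intersection over union of two boolean masks."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    inter = np.logical_and(pred, ref).sum()
    union = np.logical_or(pred, ref).sum()
    return inter / union if union else 0.0

def f_measure(pred, ref, beta=1.0):
    """F measure (harmonic mean of precision and recall when beta = 1)."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    tp = np.logical_and(pred, ref).sum()
    precision = tp / pred.sum() if pred.sum() else 0.0
    recall = tp / ref.sum() if ref.sum() else 0.0
    if precision + recall == 0:
        return 0.0
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)
```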
“…The second category of segmentation methods is supervised segmentation, which requires the use of labeled training data to guide the algorithm for learning, such as Graph Cut [41], Conditional Random Field [42], and deep learning methods [43], [44], [45]. In recent years, the persistent exploration and refinement of deep learning have been motivated by the remarkable achievements of convolutional neural networks and the explosion of novel architectures.…”
Section: Introduction (mentioning)
confidence: 99%
“…This layer was used to create a mask to remove non-agricultural areas from the MMC images before segmenting the agricultural fields. This approach has also been used in other studies [31], [36], [37].…”
Section: Agricultural Land-cover (mentioning)
confidence: 99%
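
The masking step described in this statement amounts to flagging pixels outside the agricultural land-cover layer as no-data before segmentation. A minimal sketch follows, assuming the land-cover layer is already rasterized and co-registered with the image grid; array and parameter names are illustrative.

```python
import numpy as np

def mask_non_agricultural(image, landcover, agri_classes, nodata=np.nan):
    """Set pixels outside the agricultural classes to a no-data value
    so that the subsequent segmentation only operates on farmland.

    image        : (bands, rows, cols) array of imagery.
    landcover    : (rows, cols) array of land-cover class codes on the same grid.
    agri_classes : iterable of class codes treated as agricultural.
    """
    agri_mask = np.isin(landcover, list(agri_classes))  # True on farmland
    masked = image.astype(float).copy()
    masked[:, ~agri_mask] = nodata                       # drop everything else
    return masked
```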
“…Building on the experiences of the Landsat and SPOT missions, Sentinel-2 (S2) was designed within the framework of the European Copernicus program for land surface and agriculture monitoring [28] at a temporal resolution of 5 days and a spatial resolution of 10 m. As opposed to optical sensors, which are inhibited by clouds, Sentinel-1 (S1), which is also part of the Copernicus program, enables the continuous monitoring of the earth's surface in all weather conditions at a temporal resolution of 6 days and a spatial resolution of 20 m. Various researchers have used S1 [29], [30], and predominantly S2 [31]-[40] for segmenting agricultural fields. In using the S1 or S2 images, most of those authors used existing segmentation algorithms (e.g., [29], [30], [32]-[34], [36]-[40]), some proposed new segmentation algorithms (e.g., [31], [35]), and others proposed new segmentation parameter optimization approaches (e.g., [36], [37]). One area that is yet to be comprehensively explored is the determination of the optimal feature set from S1 and S2 images for segmenting agricultural fields, given that both sensors come with different bands and additional features like band indices can be calculated as well.…”
Section: Introduction (mentioning)
confidence: 99%
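
The statement above notes that band indices can be derived from the S1 and S2 bands as additional segmentation features. As one common example (not necessarily a feature used in the cited works), NDVI can be computed from the Sentinel-2 red (B4) and near-infrared (B8) 10 m reflectance bands:

```python
import numpy as np

def ndvi(b8_nir, b4_red, eps=1e-12):
    """Normalized difference vegetation index from Sentinel-2 B8 (NIR)
    and B4 (red) reflectance arrays: (NIR - red) / (NIR + red)."""
    nir = b8_nir.astype(float)
    red = b4_red.astype(float)
    return (nir - red) / (nir + red + eps)
```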