2019
DOI: 10.18494/sam.2019.2472
Comparison between Object-based Method and Deep Learning Method for Extracting Road Features Using Submeter-grade High-resolution Satellite Imagery

Cited by 1 publication (1 citation statement)
References 26 publications
“…Meanwhile, land use classification and geospatial feature extraction technology, change detection and time-series monitoring technology, and digital surface model (DSM) and digital terrain model (DTM) extraction technology are currently under development for the purposes of monitoring land. (1) In particular, land use classification and geospatial feature extraction can provide excellent indicators of the environment (2,3) and environmental changes. (4) For land use classification and geospatial feature extraction from high-resolution satellite images, an object-based classification method, which classifies a group of pixels recognized together as an object, is commonly used.…”
Section: Introduction
confidence: 99%
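
The object-based classification mentioned in the excerpt first groups pixels into objects and then assigns a class to each object rather than to individual pixels. The excerpt does not name specific tools, so the sketch below is only illustrative: it assumes SLIC segmentation from scikit-image, a scikit-learn random forest, and simple per-object spectral statistics (band means and standard deviations) as features; all function names and parameter values are assumptions, not the paper's method.

```python
# Minimal sketch of object-based classification of a satellite image tile.
# Assumes: image is an (H, W, 3) float array, train_mask marks pixels with
# known classes, train_labels holds integer class ids at those pixels.
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

def classify_objects(image, train_mask, train_labels):
    # 1. Group pixels into objects (superpixels) by spectral/spatial similarity.
    segments = slic(image, n_segments=500, compactness=10, start_label=0)
    n_seg = segments.max() + 1

    # 2. Summarize each object with simple spectral statistics (assumed features).
    feats = np.zeros((n_seg, 6))
    for s in range(n_seg):
        px = image[segments == s]
        feats[s, :3] = px.mean(axis=0)
        feats[s, 3:] = px.std(axis=0)

    # 3. Label objects that overlap training pixels by majority vote.
    y = np.full(n_seg, -1)
    for s in range(n_seg):
        m = train_mask & (segments == s)
        if m.any():
            y[s] = np.bincount(train_labels[m]).argmax()

    # 4. Fit a classifier on the labeled objects and predict all objects.
    labeled = y >= 0
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(feats[labeled], y[labeled])
    pred = clf.predict(feats)

    # 5. Map object-level predictions back to a per-pixel classification map.
    return pred[segments]
```

The sketch only shows the general object-based workflow (segment, describe objects, classify objects); the compared deep learning approach in the paper would instead learn features directly from the imagery.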