2018
DOI: 10.1016/j.jag.2018.05.003
The potential of Unmanned Aerial Systems: A tool towards precision classification of hard-to-distinguish vegetation types?


Cited by 53 publications (38 citation statements)
References 46 publications
“…More recently, researchers have also turned to modified off-the-shelf RGB and near-infrared (NIR) cameras in pursuit of more accurate vegetation mapping [34][35][36][37][38][39]. This provides an advantage for vegetation mapping, increasing classification accuracy by up to 15% [40].…”
Section: Platform and Sensor Choice (mentioning)
confidence: 99%
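For context, the NIR band from such a modified camera is usually folded into classification through a vegetation index such as NDVI, stacked alongside the reflectance bands as an extra feature. The sketch below assumes two co-registered single-band GeoTIFF orthomosaics with hypothetical file names; it illustrates the general approach, not the processing chain used in the cited studies.

```python
import numpy as np
import rasterio  # assumes the orthomosaics are available as GeoTIFFs

# Hypothetical single-band orthomosaics: red from the standard camera,
# near-infrared (NIR) from the modified camera, co-registered on the same grid.
with rasterio.open("ortho_red.tif") as red_src, rasterio.open("ortho_nir.tif") as nir_src:
    red = red_src.read(1).astype("float32")
    nir = nir_src.read(1).astype("float32")

# NDVI = (NIR - Red) / (NIR + Red), with a guard against division by zero.
ndvi = np.zeros_like(red)
valid = (nir + red) > 0
ndvi[valid] = (nir[valid] - red[valid]) / (nir[valid] + red[valid])

# Stack the reflectance bands and NDVI into one feature cube for a classifier.
features = np.dstack([red, nir, ndvi])
print(features.shape)  # (rows, cols, 3)
```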
“…The capability to combine multiple sensors to better inform analyses has been a particularly useful benefit of UAS. For example, exploring the impact these sensors may have on the accuracy of vegetation mapping, Komárek et al. [40] showed that identification accuracy for aquatic vegetation increased from 60-68% (visible imagery only) to 63-71% (visible + thermal). Replacing the visible imagery with multispectral data further improved the accuracy range to 74-81%.…”
Section: Platform and Sensor Choice (mentioning)
confidence: 99%
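The kind of sensor-combination comparison reported above can be illustrated in outline by training the same classifier on different feature stacks and scoring each with cross-validation. The sketch below uses random placeholder arrays in place of per-pixel band values sampled from co-registered visible, thermal, and multispectral orthomosaics; the class labels, band counts, and classifier settings are assumptions for illustration, not Komárek et al.'s actual workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_pixels = 2000
labels = rng.integers(0, 4, n_pixels)        # four hypothetical vegetation classes

# Placeholder per-pixel band values; real values would be sampled from imagery.
visible = rng.random((n_pixels, 3))          # R, G, B
thermal = rng.random((n_pixels, 1))          # surface-temperature band
multispec = rng.random((n_pixels, 5))        # e.g. five narrow multispectral bands

stacks = {
    "visible only": visible,
    "visible + thermal": np.hstack([visible, thermal]),
    "multispectral + thermal": np.hstack([multispec, thermal]),
}

# Same classifier, same labels, different sensor combinations.
for name, X in stacks.items():
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    acc = cross_val_score(clf, X, labels, cv=5).mean()
    print(f"{name}: mean cross-validated accuracy = {acc:.2f}")
```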
“…Dense macroalgae exhibit textural differences that generally distinguish them from dense eelgrass, whereas sparse eelgrass can easily be mistaken for sparse macroalgae, especially in areas where eelgrass is mixed with non-eelgrass SAV. The omission of sparse eelgrass is a common issue in the classification of optical imagery (e.g., Barrell and Grant 2015), and it may be possible to reduce this issue by obtaining finer-resolution imagery at lower flight altitudes; using pixel-based instead of object-based classifications in cases where there is minimal non-eelgrass SAV mixing (Duffy et al. 2018); or using multispectral sensors mounted on UAS to exploit the spectral differences between eelgrass and macroalgae (O'Neill et al. 2011; Komárek et al. 2018). Cloud cover was not amongst the top predictors of mapping confidence, contrary to our hypothesis.…”
Section: Additional Influences on Mapping Outcome (mentioning)
confidence: 99%
“…The method is particularly useful for cases where individual objects, such as trees, are composed of multiple pixels. This characteristic has made GEOBIA a common choice for UAS image analysis of vegetation [36]. A variety of classification methods can be implemented in a GEOBIA workflow.…”
Section: Geographic Object-based Image Analysis (mentioning)
confidence: 99%
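To make the object-based pattern concrete, the sketch below segments an image into objects with SLIC superpixels, summarises each object by its mean band values, and classifies the objects with a random forest. The input image and training labels are random placeholders, and SLIC plus a random forest is only one of many segmenter/classifier pairings a GEOBIA workflow might use.

```python
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
image = rng.random((200, 200, 3))            # stand-in for an RGB orthomosaic

# 1. Segmentation: group pixels into image objects (superpixels).
segments = slic(image, n_segments=300, compactness=10, start_label=0)
n_objects = segments.max() + 1

# 2. Object features: mean value of each band within each object.
features = np.array([image[segments == i].mean(axis=0) for i in range(n_objects)])

# 3. Object-level classification (placeholder labels; real ones would come
#    from field samples or photo interpretation).
train_labels = rng.integers(0, 3, n_objects)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features, train_labels)

# 4. Map each object's predicted class back onto the pixel grid.
classified = clf.predict(features)[segments]
print(classified.shape)                      # (200, 200)
```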