2021
DOI: 10.1016/j.rse.2021.112731

Integration of multi-scale remote sensing data for reindeer lichen fractional cover mapping in Eastern Canada

Cited by 20 publications (14 citation statements)
References 68 publications
“…On the other hand, the finer spatial resolution of PlanetScope images allowed us to model the variability of FCover values registered by the very high resolution UAV-based map, thus offering good support for identifying A. saligna patch edges. Similarly, this complementarity between coarse- and fine-scale images has been observed in previous studies predicting the fractional cover of tundra vegetation and taiga lichen using UAV and satellite images with different spatial resolutions (e.g., PlanetScope, Sentinel-2, and Landsat; Riihimäki et al., 2019; He et al., 2021). The analysis of multi-temporal spectral variables derived from visible (blue, green, and red) and NIR bands acquired by Sentinel-2 and PlanetScope platforms effectively depicted the phenological behavior of specific IAS.…”
Section: Figure (supporting)
confidence: 78%
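The complementarity described above, where fine-scale UAV-derived fractional cover serves as reference for a model driven by coarser satellite bands, can be illustrated with a minimal sketch in Python. This is not the cited authors' code: the synthetic band values, the assumed cover relationship, and the choice of a random forest regressor are placeholders for illustration only.

```python
# Minimal sketch (not the cited workflow): predicting lichen fractional cover
# from coarse satellite bands, using fine-scale UAV-derived cover as reference.
# All band values and cover fractions below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)

# Assume each row is one satellite pixel: blue, green, red, NIR reflectance,
# paired with the mean lichen fractional cover of the UAV map inside that pixel.
n_pixels = 1000
bands = rng.uniform(0.02, 0.45, size=(n_pixels, 4))        # synthetic reflectance
fcover = np.clip(0.5 * bands[:, 3] - 0.3 * bands[:, 2]     # synthetic relationship
                 + rng.normal(0, 0.05, n_pixels) + 0.3, 0, 1)

X_train, X_test, y_train, y_test = train_test_split(
    bands, fcover, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out pixels:", round(r2_score(y_test, model.predict(X_test)), 3))
```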
“…Other scale issues that might influence the IAS model accuracy are the low number of pixels registering "pure" A. saligna patches in Sentinel-2 images and the relatively limited number of total pixels in the prediction area. In fact, the fuzzy shape of A. saligna patches as derived by Sentinel-2 models corresponds to better model performance values (Riihimäki et al., 2019; He et al., 2021). On the other hand, the finer spatial resolution of PlanetScope images allowed us to model the variability of FCover values registered by the very high resolution UAV-based map, thus offering good support for identifying A. saligna patch edges.…”
Section: Figure (mentioning)
confidence: 93%
“…Only the near-infrared band from the Sentera camera, resampled to 2 cm, was used. Lichen maps were then derived using support vector machine classification, which yielded an accuracy greater than 93% (He et al. 2021).…”
Section: Pixels Sampled, Training Sample Size (mentioning)
confidence: 99%
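The statement above mentions support vector machine classification of UAV imagery to derive lichen maps. A minimal sketch of that general approach follows; it is an assumption-based illustration with synthetic features and labels, not the classifier, bands, or data used in He et al. (2021).

```python
# Minimal sketch (assumption, not the cited workflow): SVM classification of
# UAV pixels into lichen vs. non-lichen from band reflectance values.
# Feature values and labels are synthetic placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic features: green, red, NIR reflectance; "lichen" pixels are brighter.
n = 800
X = rng.uniform(0.02, 0.6, size=(n, 3))
y = (X.mean(axis=1) > 0.35).astype(int)           # 1 = lichen, 0 = other cover

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
clf = SVC(kernel="rbf", C=10, gamma="scale")      # RBF-kernel SVM classifier
clf.fit(X_tr, y_tr)
print("Overall accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 3))
```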
“…cover or volume, underpinned by field plot reference data, is becoming a common method for quantifying caribou lichen forage over large spatial extents (Theau and Duguay 2004a, b; Nelson et al. 2013; Falldorf et al. 2014; Kennedy et al. 2020; Macander et al. 2020). High-resolution drone mapping has recently been used to provide more comprehensive reference data for training satellite-based lichen retrieval models, assuming that it provides more accurate sampling of satellite pixels than conventional field plots (Macander et al. 2020; Fraser et al. 2021; He et al. 2021).…”
Section: Introduction (mentioning)
confidence: 99%
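The quoted passage describes using high-resolution drone maps as reference data for satellite-based lichen retrieval. A minimal sketch of the aggregation step, turning a fine binary lichen map into per-satellite-pixel fractional-cover labels, is shown below; the 1 m drone map, the 30 m pixel size, and the block-averaging approach are illustrative assumptions, not details from the cited studies.

```python
# Minimal sketch (assumption): aggregating a binary drone lichen map to the
# footprint of coarser satellite pixels to obtain fractional-cover labels.
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 1 m binary drone lichen map (1 = lichen) covering 300 m x 300 m.
drone_map = (rng.random((300, 300)) > 0.7).astype(float)

scale = 30                                        # e.g., hypothetical 30 m pixels
h, w = drone_map.shape
# Block average: mean lichen presence in each 30 x 30 block = fractional cover.
fcover = drone_map.reshape(h // scale, scale, w // scale, scale).mean(axis=(1, 3))

print("Satellite-pixel label grid:", fcover.shape)   # (10, 10) fractional covers
print("Example fractional cover:", round(float(fcover[0, 0]), 3))
```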
“…Techniques ranging from spectral mixture analysis (Théau et al. 2005) to spectral indices (Falldorf et al. 2014) combined with environmental variables (Nelson et al. 2013; Silva et al. 2019) and machine learning methods (Kennedy et al. 2020) have been used. Several recent studies have utilized data from multiple remote sensing sensors, for example, measurements from sensors mounted on unmanned aerial vehicles (UAVs) as training data for models employing optical satellite data (Macander et al. 2020; He et al. 2021), or estimates of forest structure from light detection and ranging (LiDAR) data combined with optical data from satellite imagery (Hillman and Nielsen 2020). LiDAR data alone have also been utilized to separate lichens from other ground cover types (Korpela 2008; Moeslund et al. 2019).…”
Section: Introduction (mentioning)
confidence: 99%
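Among the techniques listed above, linear spectral mixture analysis can be sketched compactly. The endmember spectra, band count, and mixing fractions below are synthetic placeholders, not values from any of the cited studies.

```python
# Minimal sketch (assumption): linear spectral mixture analysis, one of the
# techniques listed above. Endmember spectra and the mixed pixel are synthetic.
import numpy as np
from scipy.optimize import nnls

# Hypothetical endmember reflectance spectra (4 bands): lichen, conifer, soil.
endmembers = np.array([
    [0.18, 0.22, 0.25, 0.45],   # lichen (bright, high NIR)
    [0.03, 0.06, 0.04, 0.35],   # conifer
    [0.10, 0.14, 0.18, 0.25],   # bare soil
]).T                             # shape: (bands, endmembers)

# Mixed pixel = 50% lichen, 30% conifer, 20% soil, plus a small offset as noise.
pixel = endmembers @ np.array([0.5, 0.3, 0.2]) + 0.005

# Non-negative least squares gives abundance estimates; normalize to sum to 1.
abundances, _ = nnls(endmembers, pixel)
abundances /= abundances.sum()
print("Estimated fractions (lichen, conifer, soil):", np.round(abundances, 2))
```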