2021
DOI: 10.3390/rs13061144

Assessing the Effect of Drought on Winter Wheat Growth Using Unmanned Aerial System (UAS)-Based Phenotyping

Abstract: Drought significantly limits wheat productivity across temporal and spatial domains. Unmanned Aerial Systems (UAS) have become an indispensable tool for collecting imagery with fine spatial and high temporal resolution. A 2-year field study was conducted in 2018 and 2019 to determine the temporal effects of drought on canopy growth of winter wheat. Weekly UAS data were collected using red, green, and blue (RGB) and multispectral (MS) sensors over a yield trial consisting of 22 winter wheat cultivars in both …

Cited by 24 publications (18 citation statements)
References 82 publications (104 reference statements)
“…Based on previous experience and research on UAS data collection for breeding programs (Shi et al., 2016; Yeom et al., 2019), we developed flight specifications for image overlap, flight altitude, and flight pattern to design UAS missions. For example, the RGB platform DJI Phantom 4 Pro (SZ DJI Technology Co., Ltd., Shenzhen, China), equipped with a 2.54-cm (1-inch) 20 MP (megapixel) CMOS (complementary metal oxide semiconductor) sensor, was flown at 20–30 m altitude with 80–85% forward and side overlap following a grid pattern (available in Pix4D capture) to obtain subcentimeter (0.5–1 cm/pixel) ground sampling distance orthomosaics (Bhandari et al., 2021; Yeom et al., 2018). As the multispectral camera has a narrower field of view and needs more time to cover the same area than the RGB sensor, the multispectral platform was flown over the study area at a relatively higher altitude with lower overlap (70–75%) than the RGB platform.…”
Section: Basic Protocols and Procedures for UAS Data Collection (mentioning)
confidence: 99%
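The altitude-to-resolution relationship described in the statement above follows from simple pinhole-camera geometry. A minimal sketch, using published Phantom 4 Pro sensor figures (13.2 mm sensor width, 8.8 mm focal length, 5472 px image width) as an illustrative check rather than the authors' actual pipeline:

```python
# Ground sampling distance (GSD) for a nadir-pointing camera:
#   GSD [m/px] = (sensor_width_m * altitude_m) / (focal_length_m * image_width_px)
def gsd_cm_per_px(sensor_width_mm: float, focal_length_mm: float,
                  image_width_px: int, altitude_m: float) -> float:
    """Return ground sampling distance in cm per pixel."""
    gsd_m = (sensor_width_mm / 1000 * altitude_m) / (focal_length_mm / 1000 * image_width_px)
    return gsd_m * 100

# DJI Phantom 4 Pro: 1-inch CMOS, 13.2 mm sensor width, 8.8 mm focal length, 5472 px across
for alt in (20, 25, 30):
    print(f"{alt} m -> {gsd_cm_per_px(13.2, 8.8, 5472, alt):.2f} cm/px")
```

At 20–30 m altitude this yields roughly 0.55–0.82 cm/px, consistent with the subcentimeter (0.5–1 cm/pixel) range quoted above.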
“…Recently, the potential of NDVI metrics has been shown to estimate yield variation in field-grown wheat in Western Australia (Shen & Evans, 2021). In addition, NDVI has also been found to be an effective indicator of vegetation response under terminal drought stress (Bhandari et al., 2021; Condorelli et al., 2018; Naser et al., 2020). The present study found a highly significant correlation of NDVI_AH with GY. Traits such as GFD and TGW were less affected by the environment under TS conditions, whereas under LS conditions the environment's contribution to the expression of such traits increased.…”
Section: Discussion (mentioning)
confidence: 99%
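NDVI itself is a simple band ratio of near-infrared and red reflectance. A minimal sketch (the reflectance values are illustrative, not data from the study):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)  # eps guards against zero denominators

# Healthy canopy reflects strongly in NIR and absorbs red light,
# so drought-stressed plots show lower NDVI.
nir = np.array([0.50, 0.40, 0.30])  # illustrative reflectance values
red = np.array([0.05, 0.10, 0.15])
print(ndvi(nir, red))  # values closer to 1 indicate denser green canopy
```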
“…Figure S2 illustrates a representative near-infrared (NIR) reflectance orthophoto image from 2019_NYH2 taken on August 15, 2019, with the plot polygons overlaid. Normalized difference vegetation index (NDVI) HTP data were extracted from ImageBreed, derived from the mean pixel value of each plot image (Gitelson et al. 2002; Hunt et al. 2013; Patrignani and Ochsner 2015; Bhandari et al. 2021). The image, field-experiment, phenotypic, and genotypic data within ImageBreed are FAIR and queryable through openly described APIs (Selby et al. 2019; Wilkinson et al. 2016).…”
Section: Methods (mentioning)
confidence: 99%
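The plot-level extraction described above (mean pixel value within each plot polygon) can be sketched generically; this is not the ImageBreed API, and the rectangular pixel footprint stands in for a real plot-polygon rasterization:

```python
import numpy as np

def plot_mean(index_raster: np.ndarray, rows: slice, cols: slice) -> float:
    """Mean of an index raster (e.g., NDVI) within one plot's pixel footprint,
    ignoring NaN pixels (e.g., masked soil or shadow)."""
    return float(np.nanmean(index_raster[rows, cols]))

# Hypothetical 10x10 NDVI raster with one 4x4 vegetated plot footprint
raster = np.full((10, 10), 0.2)  # background (bare soil)
raster[2:6, 3:7] = 0.8           # vegetated plot
print(plot_mean(raster, slice(2, 6), slice(3, 7)))  # plot-level mean NDVI
```

In a real pipeline the slices would come from rasterizing each plot polygon against the orthomosaic's geotransform.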