2020
DOI: 10.1016/j.compag.2020.105557

Improving model robustness for soybean iron deficiency chlorosis rating by unsupervised pre-training on unmanned aircraft system derived images

Cited by 10 publications (4 citation statements). References 37 publications.
“…The identification accuracy is 98.58%. Li et al. [16] proposed a nutrient rating model for detecting different nutrient growth stages of soybeans. The model combined an unsupervised pre-trained convolutional autoencoder (CAE) with a convolutional neural network trained with supervision on soybean images.…”
Section: Related Work (mentioning)
confidence: 99%
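The two-stage design this excerpt describes (unsupervised CAE pre-training followed by a supervised CNN rater that reuses the encoder) can be illustrated with a minimal PyTorch sketch. The layer widths, 64x64 input size, and five-level rating scale below are illustrative assumptions, not details taken from the cited paper.

import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: RGB canopy patches -> compact feature maps
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder mirrors the encoder and reconstructs the input
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1, output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 3, stride=2, padding=1, output_padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

class DeficiencyRater(nn.Module):
    # Supervised classifier that reuses the pre-trained encoder weights
    def __init__(self, encoder, num_classes=5):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(32, num_classes))

    def forward(self, x):
        return self.head(self.encoder(x))

# Stage 1: unsupervised pre-training on unlabeled images (reconstruction loss)
cae = ConvAutoencoder()
unlabeled = torch.rand(8, 3, 64, 64)          # stand-in for unlabeled UAS patches
nn.MSELoss()(cae(unlabeled), unlabeled).backward()

# Stage 2: supervised training of the rater, initialized with the CAE encoder
rater = DeficiencyRater(cae.encoder)
images, ratings = torch.rand(4, 3, 64, 64), torch.randint(0, 5, (4,))
nn.CrossEntropyLoss()(rater(images), ratings).backward()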
“…For identification of disease and nutrient deficiency from image data, CNN-based DL models are the most widely used. Pre-trained convolutional encoders performed better than a recurrent attention neural network (RAN)-CNN in detecting iron deficiency of soybean [20]. The transfer learning (TL) model ResNet-50 showed an accuracy of 65.44% in detecting seven nutrient deficiencies on a dataset of 4088 images of black gram [21].…”
Section: Related Work (mentioning)
confidence: 99%
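A hedged sketch of the transfer-learning baseline mentioned in this excerpt: an ImageNet pre-trained ResNet-50 from torchvision with its classification head replaced for seven deficiency classes. The frozen backbone, learning rate, and dummy data are assumptions; only the seven-class setup follows the black gram study [21].

import torch
import torch.nn as nn
from torchvision import models

# Load an ImageNet pre-trained ResNet-50 (recent torchvision API assumed)
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)

# Freeze the pre-trained backbone so only the new head is updated
for param in model.parameters():
    param.requires_grad = False

# Replace the 1000-class ImageNet head with a 7-class deficiency head
model.fc = nn.Linear(model.fc.in_features, 7)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on dummy 224x224 RGB crops
images, labels = torch.rand(4, 3, 224, 224), torch.randint(0, 7, (4,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()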
“…Various phenotypes can be sensed at the canopy level by UAVs, including plant height, canopy cover, and spectral reflectance (Galli et al., 2020; Kamilaris & Prenafeta‐Boldú, 2018; Sankaran et al., 2015; Volpato et al., 2021). For example, canopy reflectance is sensed using multispectral or hyperspectral sensors to estimate leaf chlorophyll content, evapotranspiration rate, yield, and productivity (Buchaillot et al., 2019; Chivasa et al., 2020; Galli et al., 2020; Li et al., 2020). Plant height and leaf rotation angle are derived from high‐resolution red‐green‐blue (RGB) images (Holman et al., 2016; Kawamura et al., 2020; Xu et al., 2021).…”
Section: Introduction (mentioning)
confidence: 99%
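The excerpt's point about estimating chlorophyll-related traits from multispectral canopy reflectance can be made concrete with a small sketch computing NDVI from red and near-infrared bands; the band layout, array shapes, and plot-level averaging here are hypothetical and not taken from any of the cited studies.

import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    # Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Example: mean NDVI over a small synthetic reflectance tile (values in [0, 1])
red_band = np.random.rand(128, 128)   # stand-in for the red band of a UAV orthomosaic
nir_band = np.random.rand(128, 128)   # stand-in for the near-infrared band
print(f"mean NDVI: {ndvi(red_band, nir_band).mean():.3f}")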