2019 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2019.8851976
AdaBoost with Neural Networks for Yield and Protein Prediction in Precision Agriculture

Cited by 14 publications (13 citation statements) | References 18 publications
“…classifiers, e.g., k-nearest neighbor (KNN) [61], support vector machines (SVM) [62], random forest (RF) [63], gradient boosting decision tree (GBDT) [64], Naïve Bayes classifier (NB) [65], logistic regression (LR) [66], light gradient boosting machine (LightGBM) [67], extreme gradient boosting (XGBoost) [54], and adaptive boosting (AdaBoost) [68]. Step…”
Section: Stacked Ensemble Classifier (mentioning, confidence: 99%)
“…The results showed improvements over other shallow neural networks and simple linear and non-linear regression models. Peerlinck et al 14 also proposed to use an Approximation AdaBoost algorithm with feedforward neural networks (FNNs) as weak learners. This method approximates the loss function by using a threshold approach that discards small errors during the weight update of the weak learners.…”
Section: Related Work (mentioning, confidence: 99%)
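The thresholded weight update described in this statement can be illustrated with a short sketch. This is a minimal AdaBoost.R2-style regression booster with small feed-forward networks as weak learners, where normalised errors below a threshold tau are zeroed out before the sample-weight update; tau, the number of rounds, and the network size are illustrative assumptions, not the authors' implementation.

import numpy as np
from sklearn.neural_network import MLPRegressor

def approx_adaboost_fit(X, y, n_rounds=10, tau=0.05, seed=0):
    """Boost FNN weak learners, ignoring errors below `tau` in the weight update."""
    rng = np.random.default_rng(seed)
    n = len(X)
    w = np.full(n, 1.0 / n)                      # sample weights
    learners, betas = [], []
    for _ in range(n_rounds):
        # Resample by weight (MLPRegressor.fit has no sample_weight argument)
        idx = rng.choice(n, size=n, replace=True, p=w)
        fnn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000)
        fnn.fit(X[idx], y[idx])
        err = np.abs(fnn.predict(X) - y)
        loss = err / (err.max() + 1e-12)         # normalised loss in [0, 1]
        loss[loss < tau] = 0.0                   # discard small errors (threshold step)
        avg_loss = float(np.dot(w, loss))
        if avg_loss >= 0.5:                      # weak learner no better than chance
            break
        beta = max(avg_loss, 1e-12) / (1.0 - avg_loss)
        w = w * beta ** (1.0 - loss)             # shrink weights of well-predicted points
        w /= w.sum()
        learners.append(fnn)
        betas.append(beta)
    return learners, betas

def approx_adaboost_predict(learners, betas, X):
    """Combine weak learners with the weighted median, as in AdaBoost.R2."""
    preds = np.array([m.predict(X) for m in learners])   # shape (T, n)
    coefs = np.log(1.0 / np.array(betas))                 # learner confidences
    order = np.argsort(preds, axis=0)
    cum = np.cumsum(coefs[order], axis=0)
    median_idx = np.argmax(cum >= 0.5 * coefs.sum(), axis=0)
    cols = np.arange(preds.shape[1])
    return preds[order[median_idx, cols], cols]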
“…Some of them rely only on remotely sensed data, such as Moderate Resolution Imaging Spectroradiometer (MODIS) or Sentinel satellite imagery, 11,12 while others incorporate on-ground data, such as soil electroconductivity or nitrogen rate. 13,14 The common goal is to train a regression model to estimate the crop yield in terms of bushels per acre (bu/ac) as accurately as possible given some input information. All previous works have focused on predicting the yield values of single georeferenced points of the field, which represent small regions of the field (e.g.…”
Section: Introduction (mentioning, confidence: 99%)
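The point-wise setting this statement describes can be sketched as a plain per-point regression: map features observed at each georeferenced point to a yield value in bu/ac. The feature names (nitrogen rate, NDVI, soil electroconductivity), the synthetic data, and the random-forest model below are illustrative assumptions only.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(0, 150, n),     # applied nitrogen rate (lbs/ac), hypothetical
    rng.uniform(0.2, 0.9, n),   # NDVI from satellite imagery, hypothetical
    rng.uniform(10, 60, n),     # soil electroconductivity, hypothetical
])
y = 20 + 0.2 * X[:, 0] + 40 * X[:, 1] + rng.normal(0, 5, n)   # synthetic yield (bu/ac)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
print(f"RMSE on held-out points: {rmse:.2f} bu/ac")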
“…classifiers, e.g., k-nearest neighbor (KNN) [61], support vector machines (SVM) [62], random forest (RF) [63], gradient boosting decision tree (GBDT) [64], Naïve Bayes classifier (NB) [65], logistic regression (LR) [66], light gradient boosting machine (LightGBM) [67], extreme gradient boosting (XGBoost) [54], and adaptive boosting (AdaBoost) [68]. Output: ensemble classifier H…”
Section: Stacked Ensemble Classifier (mentioning, confidence: 99%)
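The stacked ensemble classifier H that these statements refer to can be illustrated with a generic stacking setup: several of the named base classifiers feed their predictions to a meta-learner. This is a sketch of the general stacking idea using a subset of the listed learners and a logistic-regression combiner; the citing paper's exact stack, features, and hyperparameters are not reproduced here.

from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, RandomForestClassifier, AdaBoostClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic data stands in for the citing paper's features
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

base_learners = [
    ("knn", KNeighborsClassifier(n_neighbors=5)),
    ("svm", SVC(probability=True, random_state=0)),
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("ada", AdaBoostClassifier(random_state=0)),
]
# Out-of-fold base predictions train the meta-learner (cv=5)
H = StackingClassifier(estimators=base_learners,
                       final_estimator=LogisticRegression(max_iter=1000),
                       cv=5)
H.fit(X_tr, y_tr)
print("Held-out accuracy:", H.score(X_te, y_te))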