2019
DOI: 10.1016/j.compag.2018.09.003

Neglecting spatial autocorrelation causes underestimation of the error of sugarcane yield models

Cited by 15 publications (7 citation statements) | References 21 publications
“…As a whole, KNN is the method with the most significant effect. KNN is one of the simplest algorithms, classifying by the values of the nearest neighbours [71], even though KNN is considered to perform poorly on the IRIS dataset [72].…”
Section: Effect of Synthetic Data on ML Classifier Performance Using ... | mentioning | confidence: 99%
“…Additional preprocessing referred to avoiding the effect of spatial autocorrelation, described by Ferraciolli et al (2018). This effect means that fields that are close to each other share characteristics, which could compromise the independence of training and test sets in the cases in which close fields are present in both sets.…”
Section: Pre-processing | mentioning | confidence: 99%
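The blocking idea this statement describes can be sketched in a few lines. The snippet below is a minimal illustration, not the citing paper's preprocessing code: the field centroids, yields, and the block size `block_km` are all assumed stand-ins. Whole spatial blocks are assigned to either the training or the test set, so fields that are close to each other never straddle the split.

```python
# Minimal sketch of a spatially blocked train/test split.
# All data below are synthetic stand-ins for real field records.
import numpy as np
from sklearn.model_selection import GroupShuffleSplit

rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(500, 2))   # hypothetical field centroids (km)
y = rng.normal(80, 10, size=500)          # hypothetical yields (t/ha)

block_km = 20.0  # assumed block size, chosen wider than the autocorrelation range
# One group label per field: the coarse grid cell its centroid falls in.
groups = (xy[:, 0] // block_km).astype(int) * 1000 + (xy[:, 1] // block_km).astype(int)

splitter = GroupShuffleSplit(n_splits=1, test_size=0.25, random_state=0)
train_idx, test_idx = next(splitter.split(xy, y, groups=groups))
# No block, and hence no cluster of nearby fields, appears in both sets.
assert not set(groups[train_idx]) & set(groups[test_idx])
```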
“…In most of these studies, ML reduced prediction errors. The presence of spatial autocorrelation (SAC) in explanatory or predictive variables, however, has recently been suggested to induce overfitting or underestimation of the error of ML models (Rocha et al, 2018; Ferraciolli, Bocca, and Rodrigues, 2019; Sinha et al, 2019). This highlights that spatial dependency can still influence even ML models (Nikparvar and Thill, 2021).…”
Section: Introduction | mentioning | confidence: 99%
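The underestimation mechanism these studies point to can be reproduced on synthetic data. The sketch below is illustrative only (the grid, model, and numbers are assumptions, not results from any of the cited papers): it scores the same model with random K-fold CV and with spatially blocked CV; because the target is spatially smooth, the random split lets the model interpolate between neighbours and typically reports a lower, over-optimistic RMSE.

```python
# Illustration of error underestimation under random CV with
# spatially autocorrelated data (synthetic example).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, GroupKFold, cross_val_score

rng = np.random.default_rng(1)
xy = rng.uniform(0, 100, size=(600, 2))
# Spatially smooth signal plus noise: nearby fields are alike.
y = np.sin(xy[:, 0] / 15) + np.cos(xy[:, 1] / 15) + rng.normal(0, 0.1, 600)
groups = (xy[:, 0] // 25).astype(int) * 10 + (xy[:, 1] // 25).astype(int)

model = RandomForestRegressor(n_estimators=100, random_state=0)
rmse_random = -cross_val_score(
    model, xy, y, cv=KFold(5, shuffle=True, random_state=0),
    scoring="neg_root_mean_squared_error").mean()
rmse_blocked = -cross_val_score(
    model, xy, y, cv=GroupKFold(5), groups=groups,
    scoring="neg_root_mean_squared_error").mean()
print(f"random CV RMSE:  {rmse_random:.3f}")   # optimistic estimate
print(f"blocked CV RMSE: {rmse_blocked:.3f}")  # higher, closer to error on unseen locations
```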