2007
DOI: 10.1002/hyp.6837
The potential of different ANN techniques in evapotranspiration modelling

Abstract: The potential of three different artificial neural network (ANN) techniques, the multi-layer perceptrons (MLPs), radial basis neural networks (RBNNs) and generalized regression neural networks (GRNNs), in modelling of reference evapotranspiration (ET0) is investigated in this paper. Various daily climatic data, that is, solar radiation, air temperature, relative humidity and wind speed from two stations, Pomona and Santa Monica, in Los Angeles, USA, are used as inputs to the ANN techniques so as to …

Cited by 136 publications (48 citation statements)
References 34 publications
“…However, if the hidden layer neuron number is extremely high, the model will undergo overtraining and memorize rather than analyse the data. Thus, trial and error should be employed to determine the number of hidden layers (Bilhan et al., 2010; Gholami et al., 2015; Kisi, 2008b; Kisi & Cigizoglu, 2007). As stated earlier, the hidden and output layers require weighted summations.…”
Section: Multi-Layer Perceptron (MLP) Neural Network
Mentioning (confidence: 99%)
“…However, there is no definitive rule to determine the number of hidden layer neurons. In this study, trial and error is used to determine the number of hidden layer neurons in the MLP models employed (Bilhan et al., 2010; Breiman et al., 1993; Kisi, 2008).…”
Section: Multi-Layer Perceptron Neural Network
Mentioning (confidence: 99%)
“…However, if the number of hidden layer neurons is excessively high, the model will undergo overtraining and memorize the data instead of analyzing them. The number of neurons within the hidden layer(s) is commonly determined through trial and error (Cobaner, Unal, & Kisi, 2009; Kalteh, 2008; Kisi, 2008; Zaji & Bonakdari, 2015). Therefore, the trial and error method was utilized and different models were tested with various numbers of neurons considered within the hidden layer.…”
Section: ANN Structure
Mentioning (confidence: 99%)
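The trial-and-error procedure these citing papers describe, training the same MLP with several hidden-neuron counts and keeping the one with the lowest validation error, can be sketched as below. This is a minimal NumPy illustration, not the authors' implementation: the four-input synthetic data standing in for daily climate variables (radiation, temperature, humidity, wind) and the candidate hidden sizes are assumptions for demonstration only.

```python
import numpy as np

def train_mlp(X, y, n_hidden, lr=0.1, epochs=1000, seed=0):
    """Train a one-hidden-layer MLP (tanh hidden units, linear output)
    by full-batch gradient descent on MSE; return a prediction function."""
    rng = np.random.default_rng(seed)
    n, n_in = X.shape
    W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
    b2 = np.zeros(1)
    y = y.reshape(-1, 1)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)      # hidden-layer activations
        out = H @ W2 + b2             # weighted summation at the output
        err = out - y
        # Backpropagate the mean-squared-error gradient
        gW2 = H.T @ err / n
        gb2 = err.mean(0)
        dH = (err @ W2.T) * (1.0 - H**2)
        gW1 = X.T @ dH / n
        gb1 = dH.mean(0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    return lambda Xq: (np.tanh(Xq @ W1 + b1) @ W2 + b2).ravel()

# Synthetic stand-in for daily climate inputs and an ET0-like target
# (purely illustrative; not data from the paper).
rng = np.random.default_rng(42)
X = rng.uniform(0.0, 1.0, (200, 4))
y = 0.8 * X[:, 0] + 0.5 * X[:, 1] - 0.3 * X[:, 2] + 0.2 * X[:, 3]
X_tr, y_tr = X[:150], y[:150]
X_va, y_va = X[150:], y[150:]

# Trial and error: fit each candidate hidden-neuron count and keep the
# one with the lowest validation RMSE. An oversized hidden layer that
# memorizes the training data shows up here as worse validation error.
results = {}
for n_hidden in (2, 4, 8, 16):
    predict = train_mlp(X_tr, y_tr, n_hidden)
    results[n_hidden] = np.sqrt(np.mean((predict(X_va) - y_va) ** 2))

best = min(results, key=results.get)
print("validation RMSE by hidden size:", results)
print("selected hidden neurons:", best)
```

Validating on held-out data, rather than training error, is what makes the search meaningful: training error alone keeps decreasing as neurons are added, which is exactly the memorization the quoted papers warn about.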