2007
DOI: 10.1016/j.advwatres.2007.02.009

A deterministic linearized recurrent neural network for recognizing the transition of rainfall–runoff processes

Cited by 15 publications (12 citation statements)
References 38 publications
“…Tsoi and Back (1997) classified ANNs into two categories, feedforward neural networks (FNNs) and recurrent neural networks (RNNs). Although Pan et al. (2007) showed that RNNs perform better for dynamical systems, FNNs remain one of the most popular forms because they simplify calculation and enhance adaptability. Based on the learning algorithms of ANNs, unsupervised ANNs are applied to classification or clustering, while supervised ANNs are adopted as function approximators.…”
Section: Methods
confidence: 99%
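
The FNN/RNN distinction drawn in this excerpt can be illustrated with a short, hedged sketch. The layer sizes, the tanh activation, and the function names below are illustrative assumptions, not taken from Pan et al. (2007) or the citing paper; the point is only that a feedforward pass maps the current input alone, whereas a recurrent pass also feeds back the previous hidden state, which is why RNNs suit dynamical systems such as rainfall–runoff.

```python
import numpy as np

def fnn_step(x, W_in, W_out):
    """Feedforward pass: the output depends only on the current input x."""
    h = np.tanh(W_in @ x)        # hidden layer
    return W_out @ h             # output layer

def rnn_step(x, h_prev, W_in, W_rec, W_out):
    """Recurrent pass: the previous hidden state h_prev is fed back,
    so the output depends on the whole input history."""
    h = np.tanh(W_in @ x + W_rec @ h_prev)
    return W_out @ h, h          # return the new hidden state for the next step
```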
“…The connections among neurons determine the dynamics of a network, such as the flow of data. Because ANNs are trained by data-driven learning algorithms, which develop full connections between neurons in different layers (Pan et al., 2007), the dynamics of a network can be represented by the strengths of those connections, i.e., the weights in Fig. 2. The net inputs of neurons in the hidden and output layers are therefore described by Eqs. (3) and (4), respectively.…”
Section: Methods
confidence: 99%
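
The "net input" of a fully connected neuron mentioned here is conventionally the weighted sum of the previous layer's outputs; Eqs. (3) and (4) of the citing paper are not reproduced in this excerpt, so the sketch below assumes the standard weighted-sum-plus-bias form with a sigmoid transfer function (both assumptions).

```python
import numpy as np

def net_input(prev_outputs, weights, bias=0.0):
    """Net input of one neuron: weighted sum of the outputs of the
    previous layer, where the weights are the trained connection strengths."""
    return np.dot(weights, prev_outputs) + bias

def transfer(net):
    """Sigmoid transfer function (an assumed, common choice)."""
    return 1.0 / (1.0 + np.exp(-net))
```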
“…The averages of the absolute criteria of the DLRNN and the FNNs for simulating the remaining 28 events (Pan et al., 2007). Table 4 shows the averages of the absolute criteria of the DLRNN and the FNNs, among which FNN(1-8-1) is the only neural network without any feedback connection.…”
Section: Input Layer
confidence: 99%
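
The comparison reported here averages "absolute criteria" over 28 validation events. As a hedged sketch, the snippet below assumes the criterion is the mean absolute error between simulated and observed runoff for each event; the exact criteria are defined in Pan et al. (2007) and are not reproduced in this excerpt.

```python
import numpy as np

def event_mae(simulated, observed):
    """Mean absolute error for one event's simulated vs. observed hydrograph."""
    return float(np.mean(np.abs(np.asarray(simulated) - np.asarray(observed))))

def average_criterion(events):
    """Average the per-event criterion over all validation events.
    `events` is a list of (simulated, observed) pairs."""
    return float(np.mean([event_mae(sim, obs) for sim, obs in events]))
```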
“…According to the manner in which synaptic weights are adjusted by various data-driven learning algorithms, ANNs are classified into supervised and unsupervised neural networks. Based on the structure of the connections between neurons, ANNs are grouped into feedforward and recursive neural networks (Pan et al., 2007). As shown in Fig.…”
Section: Hybrid Neural Network
confidence: 99%
“…Pan et al: Hybrid neural networks in inundation forecasting Artificial neural networks (ANNs) have become an attractive inductive approach in hydrological forecasting because of their flexibility and data-driven learning in building models, as well as their tolerance of inputs with error and time-saving calculation in real-time models (Thirumalaiah and Deo, 1998;Kisi and Kerem Cigizoglu, 2007). Although many studies have applied different ANNs to achieve the prediction and forecasting of various water resource aspects (Maier and Dandy, 2000;Toth et al, 2000;Bodria andČermák, 2000;Kim and Barros, 2001;Wei et al, 2002;Pan and Wang, 2004;Kerh and Lee, 2006;Dawson et al, 2006;Kisi and Kerem Cigizoglu, 2007;Chau, 2007;Chen and Yu, 2007;Goswami and O'Connor, 2007;Pan et al, 2008), few investigations have utilized ANNs to achieve rainfall-inundation forecasting, which is essential to providing real-time flood warning information in emergency responses, as stated previously. An algorithm must be developed to perform realtime calculations for inundation forecasting as fast as it receives the observed rainfall records.…”
Section: Introductionmentioning
confidence: 99%