2012 IEEE PES Innovative Smart Grid Technologies (ISGT)
DOI: 10.1109/isgt.2012.6175757
Solar radiation prediction based on recurrent neural networks trained by Levenberg-Marquardt backpropagation learning algorithm

Cited by 42 publications (17 citation statements). References 19 publications.
“…Derivative-based methods are widely used for this step. Although second-derivative methods, such as Hessian optimization [28] and the Levenberg-Marquardt algorithm [29], can locate the optimum more accurately, first-derivative methods are often more efficient in practice because they balance per-iteration computational cost against the progress each iteration makes.…”
Section: Gradient Problem
confidence: 99%
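To make the quoted trade-off concrete, here is a minimal sketch (not code from the paper or the citing work) contrasting a single Levenberg-Marquardt step with a plain first-order gradient step on a sum-of-squares error; all function and variable names are illustrative.

```python
# Minimal sketch, assuming a least-squares cost E(w) = 0.5 * ||e(w)||^2 with
# residual e = target - output and Jacobian J = d(output)/d(weights).
import numpy as np

def gradient_step(J, e, w, lr):
    """First-derivative update: w <- w + lr * J^T e (steepest descent)."""
    return w + lr * (J.T @ e)

def lm_step(J, e, w, mu):
    """Levenberg-Marquardt update: w <- w + (J^T J + mu*I)^-1 J^T e."""
    A = J.T @ J + mu * np.eye(J.shape[1])  # damped Gauss-Newton approximation of the Hessian
    return w + np.linalg.solve(A, J.T @ e)

# Tiny demo on a linear model y = w0 + w1*x, where J is just the design matrix.
x = np.linspace(0.0, 1.0, 20)
y = 1.5 + 2.0 * x
J = np.column_stack([np.ones_like(x), x])
w = np.zeros(2)
for _ in range(5):
    w = lm_step(J, y - J @ w, w, mu=1e-3)
print(w)  # approaches [1.5, 2.0] within a few iterations
```

The balance described in the quote is visible in `lm_step`: solving the n_weights-by-n_weights system gives an accurate step but costs far more per iteration than the single matrix-vector product in `gradient_step`.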
“…It is a network training function that updates weight and bias values according to Levenberg-Marquardt optimization. It is often the fastest backpropagation algorithm for training moderate-sized feedforward neural networks (up to several hundred weights), although it does require more memory than other algorithms [13].…”
Section: Learning Algorithm
confidence: 99%
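As a rough illustration of how such a trainer behaves (a sketch under stated assumptions, not MATLAB's trainlm implementation), the damping factor mu is typically increased when a trial step fails to reduce the sum-squared error and decreased when it succeeds; `compute_error` and `compute_jacobian` below are hypothetical callbacks.

```python
# Hedged sketch of an adaptive-damping Levenberg-Marquardt training loop.
# compute_error(w) returns the residual vector (targets - outputs);
# compute_jacobian(w) returns the Jacobian of the outputs w.r.t. the weights.
import numpy as np

def lm_train(w, compute_error, compute_jacobian,
             mu=1e-3, mu_inc=10.0, mu_dec=0.1, mu_max=1e10, epochs=100):
    e = compute_error(w)
    for _ in range(epochs):
        J = compute_jacobian(w)                    # shape (n_samples, n_weights)
        while mu <= mu_max:
            A = J.T @ J + mu * np.eye(J.shape[1])  # this dense matrix is the memory cost
            w_trial = w + np.linalg.solve(A, J.T @ e)
            e_trial = compute_error(w_trial)
            if e_trial @ e_trial < e @ e:          # step reduced the SSE: accept it
                w, e, mu = w_trial, e_trial, mu * mu_dec
                break
            mu *= mu_inc                           # otherwise damp harder and retry
        else:
            break                                  # damping exhausted: stop training
    return w
```

The extra memory mentioned in the quote comes from storing J and the n_weights-by-n_weights matrix `A`, which is why the method is recommended for networks with up to a few hundred weights.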
“…The error is the difference between the output and the target. Validation vectors are used to stop training early if the network performance on the validation vectors fails to improve or remains substantially the same [13]; the best validation performance is 2.221e-07 at epoch 31, as shown in Figure 6.…”
Section: Mean Squared Error
confidence: 99%
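A minimal sketch of that kind of validation-based stopping rule (assumed behavior, not the authors' code; `net`, `train_step`, and `val_mse` are hypothetical placeholders) is:

```python
# Stop when the validation MSE has not improved for `patience` consecutive
# epochs, and restore the weights from the best validation epoch seen so far.
def train_with_early_stopping(net, train_step, val_mse, max_epochs=1000, patience=6):
    best_mse, best_weights, fails = float("inf"), net.get_weights(), 0
    for epoch in range(max_epochs):
        train_step(net)                    # one training epoch on the training vectors
        mse = val_mse(net)                 # mean squared error on the validation vectors
        if mse < best_mse:
            best_mse, best_weights, fails = mse, net.get_weights(), 0
        else:
            fails += 1                     # validation performance failed to improve
            if fails >= patience:
                break
    net.set_weights(best_weights)          # keep the weights from the best epoch
    return best_mse
```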
“…Several papers use statistical time-series methods such as auto-regressive (AR) models [4] and auto-regressive moving-average (ARMA) models [5]. Others use artificial intelligence [6] or neural networks (NN) [7], [8]. In [9], wavelet recurrent neural networks are used.…”
Section: Introduction
confidence: 99%
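For context on the statistical baselines mentioned in that passage, the following is a small illustrative sketch (not taken from the cited papers) of an AR(p) model fitted by ordinary least squares and used for one-step-ahead forecasting.

```python
# Illustrative AR(p) baseline: y[t] = c + a1*y[t-1] + ... + ap*y[t-p] + noise.
import numpy as np

def fit_ar(series, p):
    """Estimate the intercept and AR coefficients by ordinary least squares."""
    y = np.asarray(series, dtype=float)
    X = np.column_stack([y[p - k - 1 : len(y) - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(len(X)), X])        # prepend the intercept column
    coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return coef

def forecast_ar(series, coef):
    """One-step-ahead prediction from the most recent p observations."""
    p = len(coef) - 1
    lags = np.asarray(series[-p:], dtype=float)[::-1]  # y[t-1], y[t-2], ..., y[t-p]
    return coef[0] + coef[1:] @ lags
```

An ARMA model additionally regresses on past forecast errors, while the neural-network approaches in [7]-[9] replace the linear combination of lagged values with a learned nonlinear mapping.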