2019
DOI: 10.3390/a12040085
Forecasting Economy-Related Data Utilizing Weight-Constrained Recurrent Neural Networks

Abstract: During the last few decades, machine learning has constituted a significant tool for extracting useful knowledge from economic data to assist decision-making. In this work, we evaluate the performance of weight-constrained recurrent neural networks in forecasting economic classification problems. These networks are efficiently trained with a recently proposed training algorithm, which has two major advantages. Firstly, it exploits the numerical efficiency and very low memory requirements of the limited memo…

Cited by 13 publications (10 citation statements)
References 28 publications
“…A recurrent neural network (RNN) [36] is a type of artificial neural network that can learn from previous time steps. RNNs are extended forms of typical feed-forward neural networks (FNNs) [37], but, in contrast with FNNs, they use their internal state while processing sequential data.…”
Section: Recurrent Neural Network (RNN)
confidence: 99%
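The distinction drawn in the statement above — an RNN carries an internal state across time steps, while an FNN sees only the current input — can be sketched in a few lines. This is a minimal illustration, not the paper's architecture; the weight shapes and `tanh` activation are assumptions.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_x, W_h, b):
    # One recurrent step: the new hidden state depends on both the
    # current input x_t and the previous hidden state h_prev.
    return np.tanh(x_t @ W_x + h_prev @ W_h + b)

def fnn_step(x_t, W_x, b):
    # A feed-forward layer, by contrast, sees only the current input.
    return np.tanh(x_t @ W_x + b)

rng = np.random.default_rng(0)
W_x = rng.normal(size=(3, 4))
W_h = rng.normal(size=(4, 4))
b = np.zeros(4)

h = np.zeros(4)                        # initial internal state
for x_t in rng.normal(size=(5, 3)):    # a length-5 input sequence
    h = rnn_step(x_t, h, W_x, W_h, b)  # state carries information forward
```

After the loop, `h` summarizes the whole sequence — exactly the "internal state" that an FNN lacks.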
“…In this learning method, the weights change their values according to the learning records until one epoch (an entire learning dataset) has been processed. The method aims to minimize the error function given in Eq. (11) [14,78,79], where t = 1, . .…”
Section: Artificial Neural Network
confidence: 99%
“…Taking the above consideration into account, we recall that WCNNs eliminate the possibility that the weights "blow up" to unrealistically large values by placing box-constraints on the weights. Furthermore, these new prediction models have been empirically shown, through numerical experiments on a variety of real-world problems [19][20][21], to present better generalization ability than classical ANNs.…”
Section: Weight-Constrained Neural Network
confidence: 99%
“…We compared the prediction performance of WCNNs against classical ANNs. For this purpose, we utilized the performance profiles proposed by Dolan and Moré [26], which present perhaps the most complete information in terms of solution quality and efficiency [19,20]. Notice that a performance profile plots the fraction P of simulations for which a given model is within a factor τ of the best forecasting model.…”
Section: Performance Evaluation of WCNNs Against ANNs
confidence: 99%
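The performance-profile construction described above can be computed directly: for each problem, divide every model's cost by the best model's cost on that problem, then report, for a given τ, the fraction of problems on which each model is within that factor of the best. A minimal sketch, with a made-up 3-problem × 2-model cost table:

```python
import numpy as np

def performance_profile(cost, tau):
    # cost: (n_problems, n_models) array of, e.g., forecast errors.
    # r[p, m] = cost[p, m] / min_m cost[p, m]  (Dolan-Moré ratio)
    ratios = cost / cost.min(axis=1, keepdims=True)
    # P_m(tau): fraction of problems where model m is within factor tau
    # of the best model on that problem.
    return (ratios <= tau).mean(axis=0)

cost = np.array([[1.0, 2.0],
                 [3.0, 1.5],
                 [2.0, 2.0]])        # illustrative costs only
P = performance_profile(cost, tau=1.0)   # at tau = 1: fraction of "wins"
```

At τ = 1 the profile gives each model's win rate (ties count for both); as τ grows, P_m(τ) rises toward 1, and the model whose curve sits highest dominates in the sense quoted above.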