2013
DOI: 10.1016/j.eswa.2013.04.013
A Bayesian regularized artificial neural network for stock market forecasting

Cited by 441 publications (173 citation statements)
References 28 publications
“…In their study, they found that the BRANN gave slightly better performance, but not significantly so. In many studies [8,31-33], the BR training algorithm has given either moderate or the best performance in comparison with other training algorithms. BRANNs have some important advantages, such as choice and robustness of model, choice of validation set, size of validation effort, and optimization of network architecture [13].…”
Section: Results (mentioning)
confidence: 99%
“…In fact, even with a large sample of data, a rule cannot necessarily be found, and even if a statistical rule exists, it may not be typical. The other approach comprises artificial intelligence techniques such as artificial neural networks (ANN) [3-5], genetic algorithms (GA) [6,7], and many hybrid intelligent algorithms [8-11]. Hybrid intelligent algorithms offer more flexibility for solving complex models, so more and more researchers tend to use them for forecasting problems.…”
Section: Introduction (mentioning)
confidence: 99%
“…[37-39]. In this paper, we choose Bayesian regularization mainly because the data in our study are scarce, and the Bayesian method can improve the performance of neural networks (by reducing the training iterations) [36]. One direction for future work is to use other methods, such as GA and PSO, to optimize the weights in neural networks.…”
Section: Training Parameter Selection on the BP Neural Network (mentioning)
confidence: 99%
“…In this paper, we use regularization to limit the scale of the network weights so that we can improve the generalization of the neural network. In particular, we consider Bayesian regularization, a method that estimates the regularization parameters within a Bayesian framework [36]. It is worth mentioning that there are many methods to improve or optimize the weights of neural networks, including Genetic Algorithms (GA), Particle Swarm Optimization (PSO), etc.…”
Section: Training Parameter Selection on the BP Neural Network (mentioning)
confidence: 99%
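The excerpt above only names Bayesian regularization; as a rough illustration of the mechanics, the sketch below minimizes F(w) = beta*E_D + alpha*E_W for a tiny one-hidden-layer network and periodically re-estimates alpha and beta from the effective number of parameters, in the spirit of the MacKay-style evidence updates commonly used to implement Bayesian regularization. The toy data, network size, damping constant, and update schedule are illustrative assumptions, not the cited papers' actual setup or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy, deliberately scarce regression data (assumption; not the authors' data).
X = rng.uniform(-1.0, 1.0, size=(30, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.normal(size=30)
n = X.shape[0]

# One small tanh hidden layer; all weights and biases packed into one vector w.
n_hidden = 5
sizes = [(n_hidden, 1), (n_hidden,), (1, n_hidden), (1,)]
k = int(sum(np.prod(s) for s in sizes))      # total number of parameters

def unpack(w):
    parts, i = [], 0
    for s in sizes:
        m = int(np.prod(s))
        parts.append(w[i:i + m].reshape(s))
        i += m
    return parts

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1.T + b1)
    return (h @ W2.T + b2).ravel()

def errors(w):
    r = forward(w, X) - y
    return 0.5 * np.sum(r ** 2), 0.5 * np.sum(w ** 2)   # E_D (misfit), E_W (weight size)

def jacobian(w, eps=1e-6):
    # Finite-difference Jacobian of the outputs w.r.t. the weights (fine for a toy net).
    J = np.zeros((n, w.size))
    for i in range(w.size):
        d = np.zeros_like(w); d[i] = eps
        J[:, i] = (forward(w + d, X) - forward(w - d, X)) / (2 * eps)
    return J

w = 0.1 * rng.normal(size=k)
alpha, beta, mu = 0.01, 1.0, 0.1             # initial hyperparameters and damping (assumptions)

for it in range(100):
    r = forward(w, X) - y
    J = jacobian(w)
    # Gauss-Newton approximation of the Hessian of F(w) = beta*E_D + alpha*E_W.
    H = beta * (J.T @ J) + alpha * np.eye(k)
    g = beta * (J.T @ r) + alpha * w                 # gradient of F
    w = w - np.linalg.solve(H + mu * np.eye(k), g)   # damped Newton step

    # Evidence-based re-estimation of alpha and beta (MacKay update),
    # reusing H from before the step as a simplification.
    E_D, E_W = errors(w)
    gamma = k - alpha * np.trace(np.linalg.inv(H))   # effective number of parameters
    alpha = gamma / (2.0 * E_W + 1e-12)
    beta = (n - gamma) / (2.0 * E_D + 1e-12)

E_D, E_W = errors(w)
print(f"E_D={E_D:.3f}  E_W={E_W:.3f}  alpha={alpha:.4f}  beta={beta:.4f}  gamma={gamma:.2f}")
```

The point of the re-estimation step is the one the excerpt alludes to: alpha and beta are not tuned on a validation set but inferred from the data, which is why Bayesian regularization is attractive when samples are scarce.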