2018
DOI: 10.1016/j.apenergy.2017.10.058

A bat optimized neural network and wavelet transform approach for short-term price forecasting

Cited by 103 publications (62 citation statements: 0 supporting, 62 mentioning, 0 contrasting)
References 45 publications
“…Therefore, in the case of continuous distribution, the probability that different retailers have the same quote is zero, that is, $P(P_i = P_j) = 0$. Suppose the BNE is $[P_i, P_j]$ ($i \neq j$) when the profit of the retailer is maximized, that is, for each $C_i \in T_i$, $P_i(C_i)$ satisfies Equation (14).…”
Section: Optimal Bidding Strategies Under Different Probability Distr… (mentioning)
confidence: 99%
“…Therefore, in the case of continuous distribution, the probability that different retailers have the same quote is zero, so the objective function is still Equation (14). However, the distribution density function becomes more complicated in this case.…”
Section: … of 19 (mentioning)
confidence: 99%
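The zero-tie-probability argument in the two excerpts above can be checked numerically. The sketch below is only an illustration, not the cited paper's model: the quote distributions (uniform and normal) are placeholders chosen for the example, and it simply shows that exact ties between continuously distributed quotes essentially never occur, whereas discretized (rounded) quotes produce ties readily.

```python
# Illustration only: ties between continuously distributed quotes have probability zero.
# The distributions below are placeholders, not the ones used in the cited paper.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Continuous quotes for two retailers (arbitrary example distributions).
p_i = rng.uniform(30.0, 60.0, n)    # retailer i's quote, $/MWh
p_j = rng.normal(45.0, 5.0, n)      # retailer j's quote, $/MWh
print("ties (continuous):", np.mean(p_i == p_j))  # ~0.0

# Rounding the quotes to a discrete grid makes ties common, which is why the
# continuous-distribution assumption matters for the BNE argument above.
print("ties (rounded):   ", np.mean(np.round(p_i) == np.round(p_j)))
```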
“…However, the BPNN model still has some intrinsic defects, for example, slow convergence and a tendency to over-fit [40][41][42][43]. Fortunately, a large collection of optimization algorithms has been developed to optimize the BPNN model, such as GA [44,45], MEA [46], particle swarm optimization (PSO) [47,48], simulated annealing (SA) [49], the bat algorithm (BA) [50,51], etc. Among them, evolutionary algorithms such as GA and MEA have recently been widely used to search for the optimal weights and thresholds of neural networks.…”
Section: Introduction (mentioning)
confidence: 99%
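The excerpt above describes the general idea of metaheuristic weight search for a BPNN. Below is a minimal sketch of that idea, assuming a tiny one-hidden-layer network trained on a synthetic series with a simplified bat algorithm; the network size, BA hyperparameters, and data are assumptions made for the example, not the cited paper's setup.

```python
# A minimal sketch (not the authors' exact implementation) of using the bat
# algorithm (BA) to choose the weights of a small one-hidden-layer network.
import numpy as np

rng = np.random.default_rng(1)

# Toy data: predict the next value of a noisy series from the last 3 lags.
series = np.sin(np.arange(300) * 0.1) + 0.1 * rng.standard_normal(300)
X = np.stack([series[i:i + 3] for i in range(len(series) - 3)])
y = series[3:]

N_IN, N_HID = 3, 5
DIM = N_IN * N_HID + N_HID + N_HID + 1          # all weights and biases, flattened

def mse(w):
    """Mean squared error of the network encoded by flat parameter vector w."""
    W1 = w[:N_IN * N_HID].reshape(N_IN, N_HID)
    b1 = w[N_IN * N_HID:N_IN * N_HID + N_HID]
    W2 = w[-N_HID - 1:-1]
    b2 = w[-1]
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return np.mean((pred - y) ** 2)

# Bat algorithm in its standard form, with simplified loudness/pulse handling.
POP, ITERS = 30, 200
F_MIN, F_MAX, ALPHA, GAMMA = 0.0, 2.0, 0.9, 0.9

pos = rng.uniform(-1, 1, (POP, DIM))            # bat positions = candidate weight vectors
vel = np.zeros((POP, DIM))
loud = np.ones(POP)                             # loudness A_i
pulse = np.zeros(POP)                           # pulse emission rate r_i
fit = np.array([mse(p) for p in pos])
best = pos[fit.argmin()].copy()

for t in range(1, ITERS + 1):
    for i in range(POP):
        freq = F_MIN + (F_MAX - F_MIN) * rng.random()
        vel[i] += (pos[i] - best) * freq
        cand = pos[i] + vel[i]
        if rng.random() > pulse[i]:             # local random walk around the best bat
            cand = best + 0.01 * loud.mean() * rng.standard_normal(DIM)
        f_cand = mse(cand)
        if f_cand < fit[i] and rng.random() < loud[i]:
            pos[i], fit[i] = cand, f_cand
            loud[i] *= ALPHA                    # quieter, more exploitative search
            pulse[i] = 1 - np.exp(-GAMMA * t)   # emit pulses more often over time
    best = pos[fit.argmin()].copy()

print(f"best training MSE after BA search: {fit.min():.4f}")
```

In practice such a metaheuristic search is often used only to initialize or refine the weights, with conventional backpropagation handling the fine-tuning.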
“…10. Table 2 shows the results of the proposed model compared with the simple NN and the combined NN and bat algorithm of [18] for each cluster, in terms of simple and improved MAE.…”
Section: Cluster (mentioning)
confidence: 99%
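As context for that comparison, the snippet below sketches how per-cluster MAE can be tabulated for two competing forecasters. The prices, cluster labels, and model predictions are synthetic placeholders, and the "improved MAE" variant used in the cited paper is not reproduced here.

```python
# Sketch: per-cluster MAE comparison of two price forecasters (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
n = 500
actual = 40 + 10 * rng.random(n)              # actual prices, $/MWh
clusters = rng.integers(0, 3, n)              # placeholder cluster assignments
pred_a = actual + rng.normal(0, 1.5, n)       # e.g. plain NN forecast
pred_b = actual + rng.normal(0, 0.8, n)       # e.g. NN + bat algorithm forecast

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

print(f"{'cluster':>7} {'MAE model A':>12} {'MAE model B':>12}")
for c in np.unique(clusters):
    m = clusters == c
    print(f"{c:>7} {mae(actual[m], pred_a[m]):>12.3f} {mae(actual[m], pred_b[m]):>12.3f}")
```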