2024
DOI: 10.3390/en17071625

Data-Driven Techniques for Short-Term Electricity Price Forecasting through Novel Deep Learning Approaches with Attention Mechanisms

Vasileios Laitsos,
Georgios Vontzos,
Dimitrios Bargiotas
et al.

Abstract: The electricity market is constantly evolving, driven by factors such as market liberalization, the increasing use of renewable energy sources (RESs), and various economic and political influences. These dynamics make wholesale electricity prices difficult to predict. Accurate short-term forecasting is crucial for maintaining system balance and addressing anomalies such as negative prices and deviations from predictions. This paper investigates short-term electricity price forecasting using historical…

Cited by 1 publication (1 citation statement)
References: 32 publications (32 reference statements)

“…Lastly, neural network models such as the Multi-Layer Perceptron (MLP) and the Long Short-Term Memory network (LSTM) are black-box approaches that enable adaptive and accurate approximation of complex non-linear functions and exhibit improved resilience to missing values and outliers [23-28]. It is worth noting that advances in neural network research have led to the integration of attention mechanisms such as self-attention and multi-head attention [29]. These attention mechanisms have enhanced the interpretability of complex neural network architectures and contributed to the development of models that focus on the most important parts of the data in order to capture non-linear relationships.…”
Section: Introduction
Mentioning confidence: 99%
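
As a rough illustration of the kind of model the citation statement describes, the sketch below combines an LSTM encoder with multi-head self-attention for short-term price forecasting. It is a minimal, hypothetical example: the layer sizes, the 24-hour input window, and the one-step horizon are assumptions for demonstration and do not reproduce the paper's actual architecture.

```python
# Hypothetical sketch (not the paper's architecture): an LSTM encoder followed
# by multi-head self-attention, so the model can weight the most informative
# time steps of the historical price window before making a forecast.
import torch
import torch.nn as nn


class LSTMAttentionForecaster(nn.Module):
    def __init__(self, n_features: int = 1, hidden_size: int = 64,
                 num_heads: int = 4, horizon: int = 1):
        super().__init__()
        # LSTM encodes the historical window into a sequence of hidden states.
        self.lstm = nn.LSTM(input_size=n_features, hidden_size=hidden_size,
                            batch_first=True)
        # Multi-head self-attention re-weights those hidden states.
        self.attn = nn.MultiheadAttention(embed_dim=hidden_size,
                                          num_heads=num_heads,
                                          batch_first=True)
        self.head = nn.Linear(hidden_size, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window_length, n_features)
        h, _ = self.lstm(x)              # (batch, window_length, hidden_size)
        ctx, _ = self.attn(h, h, h)      # self-attention over the time steps
        return self.head(ctx[:, -1, :])  # forecast from the last attended step


# Usage example: predict the next hourly price from a 24-hour window.
model = LSTMAttentionForecaster(n_features=1, horizon=1)
window = torch.randn(8, 24, 1)   # dummy batch of 8 price windows
print(model(window).shape)       # torch.Size([8, 1])
```

In this illustrative setup the attention weights over the hidden states can also be inspected, which is the interpretability benefit the statement attributes to attention mechanisms.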