Attention-LSTM architecture combined with Bayesian hyperparameter optimization for indoor temperature prediction
Year: 2022
DOI: 10.1016/j.buildenv.2022.109536

Cited by 34 publications (8 citation statements); References 36 publications.
“…To assess the efficacy of the proposed method for forecasting COVID-19 in time-series data, three crucial metrics (Root Mean Square Error, Mean Absolute Error, and Mean Absolute Percentage Error) were calculated. The equations for computing SMAPE, MAPE, and RMSE were obtained from Equations 8–10, correspondingly explained by Jiang et al. (2022) and Kistenev et al. (2022). The model's effectiveness was assessed through out-of-sample predictions from the test set (TS).…”
Section: Prediction Methods Steps
confidence: 99%
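The metrics named in this excerpt follow their standard textbook definitions; the sketch below is a minimal NumPy illustration of RMSE, MAE, and MAPE, not the cited authors' code, and the sample values are made up.

```python
# Minimal sketch of the standard error metrics named above (RMSE, MAE, MAPE).
# Textbook definitions only; not the cited authors' implementation.
import numpy as np

def rmse(y_true, y_pred):
    """Root Mean Square Error."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def mae(y_true, y_pred):
    """Mean Absolute Error."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.mean(np.abs(y_true - y_pred))

def mape(y_true, y_pred):
    """Mean Absolute Percentage Error (assumes no zero targets)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0

# Out-of-sample evaluation on a held-out test set (illustrative values)
y_true = [21.3, 21.8, 22.1, 22.4]
y_pred = [21.1, 21.9, 22.3, 22.2]
print(rmse(y_true, y_pred), mae(y_true, y_pred), mape(y_true, y_pred))
```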
“…Zarzycki and Ławryńczuk (2022) explain that the architecture relies entirely on the network to use an LSTM layer. Jiang et al. (2022) proposed the incorporation of hourly weather measurements as data references in the LSTM architecture. In their 2022 study, Kistenev et al. used unbalanced microcomputed tomography (Micro-CT) imaging, supervised deep learning, and adaptive self-degradation.…”
Section: Introduction
confidence: 99%
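As a hedged illustration of the idea attributed to Jiang et al. (2022), here is a hypothetical Keras sketch of an LSTM whose input window combines hourly weather measurements with past indoor temperature; the feature set, window length, and layer sizes are assumptions, not the paper's reported configuration.

```python
# Hypothetical sketch: an LSTM fed a look-back window of hourly records
# (e.g., indoor temp, outdoor temp, humidity, solar radiation).
# Feature names, window length, and layer sizes are assumptions.
import tensorflow as tf

N_HOURS = 24      # assumed look-back window of hourly measurements
N_FEATURES = 4    # assumed features per hour

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(N_HOURS, N_FEATURES)),  # summarize the hourly history
    tf.keras.layers.Dense(1),                                     # next-step indoor temperature
])
model.compile(optimizer="adam", loss="mse")
# model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=50)
```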
“…This modeling technique has become increasingly popular in recent years. Researchers in related fields have also used recurrent neural networks such as long short-term memory (LSTM) to predict temporal changes in temperature, achieving good results [11,28–30]. LSTM is a special type of recurrent neural network (RNN) capable of learning long-term dependencies, an advantage that is especially evident in the prediction of temporal sequences.…”
Section: Ref Time Granularity
confidence: 99%
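To make the long-term-dependency point concrete, the following sketch (not taken from any of the cited works) shows one common way to build sliding-window input/target pairs from a temperature series before feeding them to an LSTM; the window length and synthetic series are arbitrary.

```python
# Illustrative sketch: sliding-window input/target pairs so an LSTM can learn
# long-range temporal structure in a temperature series. Not from the cited works.
import numpy as np

def make_windows(series, window=48, horizon=1):
    """Return (X, y): each X[i] holds `window` past values, y[i] the value `horizon` steps ahead."""
    series = np.asarray(series, dtype=float)
    X, y = [], []
    for start in range(len(series) - window - horizon + 1):
        X.append(series[start:start + window])
        y.append(series[start + window + horizon - 1])
    return np.stack(X)[..., None], np.array(y)   # X shape: (samples, window, 1)

temps = np.sin(np.linspace(0, 20, 500)) * 3 + 22   # synthetic temperature trace
X, y = make_windows(temps, window=48, horizon=1)
print(X.shape, y.shape)   # (452, 48, 1) (452,)
```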
“…In machine learning, attention mechanisms are used to weight the contribution of inputs to outputs, which makes them applicable to a variety of disciplines. For example, Jiang proposed an indoor temperature prediction model combining an LSTM, a transformer, and an attention mechanism, which achieves accurate and efficient prediction of room temperature trends [17]. Zhang integrated a transformer model with multiple attention mechanisms to develop an attention network framework based on the Transformer Encoder, which effectively achieved accurate prediction of stock trends [18].…”
Section: Introduction
confidence: 99%
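For reference, the attention idea described in this excerpt (scoring each timestep and pooling the LSTM states into a weighted context vector) can be sketched in Keras as below; the layer sizes and the 24×4 input window are assumptions, not the cited models' exact architectures.

```python
# Minimal sketch of additive-style attention over LSTM hidden states:
# score each timestep, softmax the scores into weights, pool into a context
# vector, and predict the temperature from it. Sizes are assumptions.
import tensorflow as tf

N_STEPS, N_FEATURES = 24, 4   # assumed hourly window and feature count

inputs = tf.keras.Input(shape=(N_STEPS, N_FEATURES))
h = tf.keras.layers.LSTM(64, return_sequences=True)(inputs)     # per-timestep hidden states
scores = tf.keras.layers.Dense(1)(h)                            # one relevance score per timestep
weights = tf.keras.layers.Softmax(axis=1)(scores)               # contribution of each timestep
context = tf.keras.layers.Lambda(
    lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([weights, h])  # weighted sum over time
output = tf.keras.layers.Dense(1)(context)                      # predicted indoor temperature

model = tf.keras.Model(inputs, output)
model.compile(optimizer="adam", loss="mse")
```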