2023
DOI: 10.1016/j.compag.2023.108261

Multistep ahead prediction of temperature and humidity in solar greenhouse based on FAM-LSTM model

Yongxia Yang,
Pan Gao,
Zhangtong Sun
et al.

Cited by 9 publications (4 citation statements)
References 50 publications
“…This mechanism dynamically adjusts the weight distribution, prioritizing the most significant information and thereby enhancing the model's capacity to process time series data. In contrast to the method proposed by Yang et al. [40], where a feedforward attention mechanism is added directly to an LSTM, our approach, which combines an encoder-decoder architecture with residual learning, not only strengthens the model's ability to mine complex data but also improves the accuracy of time series prediction. This indicates that, through carefully designed feature fusion and attention mechanisms, the LSTM model can be further optimized to meet the increasingly complex and variable demands of time series analysis.…”
Section: Discussion
confidence: 99%
“…Attention mechanisms can capture the data most important to the current task in complex datasets [45] and have achieved great success in computer vision tasks [46]. In large-scale data processing, feedforward attention mechanisms can outperform conventional attention mechanisms on medium- to long-term time series problems [40].…”
Section: Feedforward Attention Mechanism
confidence: 99%
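The feedforward (additive) attention idea described in this excerpt can be sketched as follows: each LSTM hidden state is scored by a small feedforward layer, the scores are normalized with a softmax, and the states are combined into a single context vector. The shapes and parameter names (`W`, `b`, `v`) are illustrative assumptions, not the exact formulation of the cited FAM-LSTM model:

```python
import numpy as np

def feedforward_attention(H, W, b, v):
    """Feedforward (additive) attention over a sequence of hidden states.

    H : (T, d) array of hidden states, one row per time step.
    W : (d, d) projection matrix; b : (d,) bias; v : (d,) scoring vector.
    Returns the attention weights (T,) and the weighted context vector (d,).
    """
    scores = np.tanh(H @ W + b) @ v          # one scalar score per time step
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    context = weights @ H                    # convex combination of the states
    return weights, context

# Toy usage: 5 time steps, hidden size 4 (random stand-ins for LSTM outputs)
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 4))
W = rng.standard_normal((4, 4)) * 0.1
b = np.zeros(4)
v = rng.standard_normal(4)
w, c = feedforward_attention(H, W, b, v)
```

Because the weights sum to one, the context vector is a convex combination of the hidden states, which is what lets the model emphasize the time steps most relevant to the prediction.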
“…With the development of sensor technology, the amount of data available about the greenhouse environment is increasing, and the construction of data-driven time series prediction models is gaining attention [12].…”
Section: Introduction
confidence: 99%