Attention‑Enhanced LSTM Models for Long‑Horizon Time Series Forecasting in Renewable Energy Systems

Authors

  • Asmaa Ghali Sabea, Law Faculty, Sumer University, Al-Rifai, Iraq
  • Maryam Jawad Kadhim, Computer Science and IT Faculty, Wasit University, Al-Kut, Iraq
  • Ali Fahem Neamah, Computer Science and IT Faculty, Wasit University, Al-Kut, Iraq
  • Mohammed Ibrahim Mahdi, Computer Science and IT Faculty, Wasit University, Al-Kut, Iraq

DOI:

https://doi.org/10.64229/pa24jx23

Keywords:

Long-Horizon Forecasting, Renewable Energy, LSTM, Attention Mechanism, Sequence-to-Sequence, Interpretability, Solar Power, Wind Power

Abstract

Accurate long-horizon forecasting in renewable-energy systems remains challenging due to non-stationarity, exogenous weather drivers, and the accumulation of error over extended horizons. This paper presents an attention-enhanced LSTM encoder-decoder tailored to multi-step forecasting of solar and wind power. The encoder summarizes historical dynamics, while a dot-product attention mechanism forms a time-varying context vector for each prediction step, mitigating the information bottleneck typical of plain sequence-to-sequence models. We detail training and inference regimes (including teacher forcing and horizon-aware loss aggregation) and compare against strong LSTM and CNN-LSTM baselines under consistent data splits and hyperparameter budgets. Results across 24- to 168-hour horizons show consistent error reductions and more stable performance at longer horizons, with attention maps highlighting diurnal patterns and ramp events that align with domain intuition. Ablation studies further isolate the contributions of attention and input-window length, and a brief interpretability analysis illustrates how attention emphasizes the weather-driven transitions that matter most for extended forecasts. Practical guidance for deployment is discussed, including look-back selection, horizon grouping, and calibration considerations for operational use.
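To make the attention step described in the abstract concrete, the sketch below (plain Python with hypothetical variable names; it is not the paper's implementation) computes dot-product alignment scores between one decoder hidden state and the encoder hidden states, normalizes them with a softmax, and forms the time-varying context vector as the attention-weighted sum of encoder states:

```python
import math

def dot_product_attention(decoder_state, encoder_states):
    """Return (context, weights) for a single decoder step.

    decoder_state:  list[float], the current decoder hidden state
    encoder_states: list[list[float]], one hidden state per input time step
    """
    # Alignment scores: dot product of the decoder state with each encoder state.
    scores = [sum(d * e for d, e in zip(decoder_state, h)) for h in encoder_states]
    # Softmax over input time steps (max-subtraction for numerical stability).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: attention-weighted sum of encoder hidden states.
    dim = len(decoder_state)
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(dim)]
    return context, weights

# Toy example: three input time steps, hidden size two.
enc = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
dec = [1.0, 1.0]
context, weights = dot_product_attention(dec, enc)
```

Because a fresh context vector is computed at every decoder step, the model can attend to different parts of the input window (e.g. the most recent diurnal cycle versus a ramp event) at different forecast horizons, rather than compressing the whole history into a single fixed encoder summary.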

Published

2025-10-29

Section

Articles