Transformers-Based Encoder Model for Forecasting Hourly Power Output of Transparent Photovoltaic Module Systems
Abstract
1. Introduction
- Meteorological methods: Indirect methods based on numerical weather prediction and satellite-image processing first predict the intensity of solar radiation and then convert it into the output power of a photovoltaic system.
- Statistical methods: These methods use statistical approaches such as the autoregressive moving average (ARMA) model [7], the autoregressive integrated moving average (ARIMA) model [8], and exponential smoothing (ES) [9]. Such models can predict the power output of photovoltaic arrays directly, without first predicting solar radiation.
- Machine learning methods: These approaches use machine learning algorithms, such as k-nearest neighbors, neural networks (NNs) [10], support vector regression (SVR) [11], and pattern sequence-based forecasting (PSF) [12], to predict the power output of photovoltaic arrays directly (the sketch after this list contrasts this with a statistical baseline). There are generally two ways to apply machine learning methods: building a single forecasting model, or combining several forecasting models into an ensemble.
- Hybrid methods: These methods combine models or components from the previous three categories. Unlike ensembles, which combine machine learning models, hybrid methods typically combine machine learning, meteorological, and statistical models or their components. Machine learning methods such as RNNs [13] and SVR, and statistical methods such as ARIMA and ES, are widely used to build models that predict solar-plant power output. However, most of these methods rely on a single general prediction model for all meteorological conditions and their corresponding daily photovoltaic characteristics.
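To make the taxonomy above concrete, the following minimal sketch contrasts a statistical baseline (ARIMA) with a direct machine-learning regressor (SVR) on the same hourly power series. The file name, column names, model order, and 24-hour horizon are illustrative assumptions, not details taken from this paper.

```python
# Minimal sketch, assuming an hourly dataset with a "power" column and
# weather features; all names and hyperparameters here are placeholders.
import pandas as pd
from sklearn.svm import SVR
from statsmodels.tsa.arima.model import ARIMA

df = pd.read_csv("pv_hourly.csv", parse_dates=["timestamp"], index_col="timestamp")

# Statistical: ARIMA forecasts power directly from its own past values.
arima = ARIMA(df["power"], order=(2, 1, 2)).fit()  # order chosen for illustration
arima_forecast = arima.forecast(steps=24)          # next 24 hours

# Machine learning: SVR maps exogenous weather features to power directly.
features = ["irradiance", "temperature", "humidity"]  # assumed columns
svr = SVR(kernel="rbf", C=10.0).fit(df[features][:-24], df["power"][:-24])
svr_forecast = svr.predict(df[features][-24:])        # same 24-hour window
```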
2. Experimental and Methods
2.1. Dataset
2.2. Attention-Based Encoder
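The core model is an attention-based (encoder-only) Transformer [14]. As a rough illustration of that architecture's shape, here is a minimal PyTorch sketch; the hidden size, head count, layer count, learned positional embedding, and 24-hour input window are all placeholder choices, not the paper's reported configuration.

```python
import torch
import torch.nn as nn

class PVEncoder(nn.Module):
    """Illustrative encoder-only Transformer for hourly PV power forecasting."""

    def __init__(self, n_features, d_model=64, n_heads=4, n_layers=2, window=24):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)                # embed features per hour
        self.pos_embed = nn.Parameter(torch.zeros(1, window, d_model))  # learned positions
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)                               # next-hour power

    def forward(self, x):                        # x: (batch, window, n_features)
        h = self.input_proj(x) + self.pos_embed
        h = self.encoder(h)                      # self-attention across the window
        return self.head(h[:, -1]).squeeze(-1)   # read out at the last time step

model = PVEncoder(n_features=5)
y_hat = model(torch.randn(8, 24, 5))             # one forecast per batch element
```

An encoder-only stack suits this task because the entire input window is observed at prediction time, so no autoregressive decoder is required.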
3. Results and Discussions
3.1. Performance Evaluation
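The results tables report MSE, MAE, and RMSE for each model. The snippet below shows how these metrics relate (RMSE is simply the square root of MSE, expressed in the units of the target); the arrays are synthetic placeholders, not the paper's data.

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(0)
y_true = rng.uniform(0, 400, size=24)          # measured hourly power, W (placeholder)
y_pred = y_true + rng.normal(0, 10, size=24)   # model forecasts (placeholder)

mse = mean_squared_error(y_true, y_pred)   # penalizes large errors quadratically
mae = mean_absolute_error(y_true, y_pred)  # average absolute deviation
rmse = np.sqrt(mse)                        # back in the units of the target
```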
3.2. Visualization Results
3.3. Feature Importance
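The feature-importance analysis cites SHAP [24]. The sketch below shows minimal SHAP usage on a random-forest stand-in rather than the paper's Transformer (explaining a deep sequence model would require a model-agnostic explainer), and the three weather features are assumptions for illustration.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in: three assumed weather features -> hourly power.
rng = np.random.default_rng(0)
X = rng.random((200, 3))                     # irradiance, temperature, humidity
y = 300 * X[:, 0] + 20 * X[:, 1] + rng.normal(0, 5, size=200)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
explainer = shap.Explainer(model)            # dispatches to TreeExplainer for forests
shap_values = explainer(X)                   # per-sample, per-feature attributions
importance = np.abs(shap_values.values).mean(axis=0)  # global ranking by mean |SHAP|
```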
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Halkos, G.E.; Gkampoura, E.C. Reviewing usage, potentials, and limitations of renewable energy sources. Energies 2020, 13, 2906.
- Reindl, T.; Walsh, W.; Yanqin, Z.; Bieri, M. Energy meteorology for accurate forecasting of PV power output on different time horizons. Energy Procedia 2017, 130, 130–138.
- Hayat, M.B.; Ali, D.; Monyake, K.C.; Alagha, L.; Ahmed, N. Solar energy—A look into power generation, challenges, and a solar-powered future. Int. J. Energy Res. 2019, 43, 1049–1067.
- Sampaio, P.G.V.; González, M.O.A. Photovoltaic solar energy: Conceptual framework. Renew. Sustain. Energy Rev. 2017, 74, 590–601.
- Sağlam, Ş. Meteorological parameters effects on solar energy power generation. WSEAS Trans. Circuits Syst. 2010, 9, 637–649.
- Kabir, E.; Kumar, P.; Kumar, S.; Adelodun, A.A.; Kim, K.H. Solar energy: Potential and future prospects. Renew. Sustain. Energy Rev. 2018, 82, 894–900.
- Choi, B. ARMA Model Identification; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012.
- Shumway, R.H.; Stoffer, D.S. Time series regression and ARIMA models. In Time Series Analysis and Its Applications; Springer: New York, NY, USA, 2000; pp. 89–212.
- Gardner, E.S., Jr. Exponential smoothing: The state of the art. J. Forecast. 1985, 4, 1–28. https://doi.org/10.1002/for.3980040103
- Gurney, K. An Introduction to Neural Networks; CRC Press: Boca Raton, FL, USA, 2018.
- Drucker, H.; Burges, C.J.; Kaufman, L.; Smola, A.; Vapnik, V. Support vector regression machines. In Advances in Neural Information Processing Systems 9; MIT Press: Cambridge, MA, USA, 1996.
- Martínez-Álvarez, F.; Troncoso, A.; Riquelme, J.C.; Aguilar-Ruiz, J.S. LBF: A labeled-based forecasting algorithm and its application to electricity price time series. In Proceedings of the 2008 Eighth IEEE International Conference on Data Mining, Pisa, Italy, 15–19 December 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 453–461.
- Sherstinsky, A. Fundamentals of recurrent neural network (RNN) and long short-term memory (LSTM) network. Physica D Nonlinear Phenom. 2020, 404, 132306.
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. In Proceedings of the 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, 4–9 December 2017; Volume 30.
- Wu, N.; Green, B.; Ben, X.; O'Banion, S. Deep transformer models for time series forecasting: The influenza prevalence case. arXiv 2020, arXiv:2001.08317.
- Wu, S.; Xiao, X.; Ding, Q.; Zhao, P.; Wei, Y.; Huang, J. Adversarial sparse transformer for time series forecasting. Adv. Neural Inf. Process. Syst. 2020, 33, 17105–17115.
- Wolf, T.; Debut, L.; Sanh, V.; Chaumond, J.; Delangue, C.; Moi, A.; Cistac, P.; Rault, T.; Louf, R.; Funtowicz, M.; et al. Transformers: State-of-the-art natural language processing. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, Online, 16–20 November 2020; Association for Computational Linguistics: Cedarville, OH, USA, 2020; pp. 38–45.
- Devlin, J.; Chang, M.W.; Lee, K.; Toutanova, K. BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv 2018, arXiv:1810.04805.
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780.
- Chung, J.; Gulcehre, C.; Cho, K.; Bengio, Y. Empirical evaluation of gated recurrent neural networks on sequence modeling. arXiv 2014, arXiv:1412.3555.
- Culurciello, E. The Fall of RNN/LSTM. Towards Data Science. Available online: https://towardsdatascience.com/the-fall-of-rnn-lstm-2d1594c74ce0 (accessed on 25 September 2022).
- Grigsby, J.; Wang, Z.; Qi, Y. Long-range transformers for dynamic spatiotemporal forecasting. arXiv 2021, arXiv:2109.12218.
- Zhou, H.; Zhang, S.; Peng, J.; Zhang, S.; Li, J.; Xiong, H.; Zhang, W. Informer: Beyond efficient transformer for long sequence time-series forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, Online, 2–9 February 2021; Curran Associates, Inc.: Red Hook, NY, USA, 2021; Volume 35, pp. 11106–11115.
- Lundberg, S.M.; Lee, S.I. A unified approach to interpreting model predictions. Adv. Neural Inf. Process. Syst. 2017, 30.
| Type | Non-Transparent | Transparent |
| --- | --- | --- |
| Module output | 400 W | 405 W |
| Module size | 2039 × 1001 × 40 mm | 2039 × 1001 × 40 mm |
| Module weight | 22.2 kg | 22 kg |
Non-Transparent Module

| Model | MSE | MAE | RMSE |
| --- | --- | --- | --- |
| LSTM | 0.070 | 0.217 | 0.27 |
| GRU | 0.0657 | 0.210 | 0.26 |
| Transformers | 0.057 | 0.187 | 0.24 |

Transparent Module

| Model | MSE | MAE | RMSE |
| --- | --- | --- | --- |
| LSTM | 0.057 | 0.19 | 0.23 |
| GRU | 0.052 | 0.18 | 0.22 |
| Transformers | 0.04 | 0.17 | 0.21 |