An Adaptive, Data-Driven Stacking Ensemble Learning Framework for the Short-Term Forecasting of Renewable Energy Generation
Abstract
1. Introduction
- (1)
- A novel, data-driven, adaptive stacking ensemble learning framework is developed for forecasting the output power of renewable energy sources. The stacking structure and the diverse base-models deeply explore the information hidden in the raw data, thereby boosting the regression ability on multi-dimensional heterogeneous datasets.
- (2)
- Twelve independent candidate regression models, including bagging, boosting, linear, K-nearest-neighbor and SVR methods, are comprehensively compared. The five best-performing models are then selected adaptively to form the stacking ensemble structure. The diversity among the base-models ensures the stability and generalization performance of the stacking model.
- (3)
- A meta-model is constructed using the linear regression method. The weights of the base-models are determined by minimizing the cross-validation risk of the base-model estimators.
- (4)
- The hyperparameters of the base-models and the meta-model are tuned using Bayesian global optimization, which further enhances the forecasting accuracy of the proposed model.
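The adaptive base-model selection described in contribution (2) can be sketched as follows: evaluate a pool of candidate regressors by cross-validated R2 and keep the top five. This is illustrative only; synthetic data stands in for the wind/PV datasets, and the candidate pool is a subset of the twelve models compared in the paper.

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import (RandomForestRegressor, GradientBoostingRegressor,
                              AdaBoostRegressor, BaggingRegressor,
                              ExtraTreesRegressor)

# Synthetic stand-in for a renewable power dataset.
X, y = make_regression(n_samples=300, n_features=8, noise=10.0, random_state=0)

# A subset of the candidate pool (the paper compares twelve models).
candidates = {
    "LR": LinearRegression(),
    "SVR": SVR(),
    "KNN": KNeighborsRegressor(),
    "DT": DecisionTreeRegressor(random_state=0),
    "RF": RandomForestRegressor(n_estimators=50, random_state=0),
    "GBRT": GradientBoostingRegressor(random_state=0),
    "ADA": AdaBoostRegressor(random_state=0),
    "Bagging": BaggingRegressor(random_state=0),
    "ET": ExtraTreesRegressor(n_estimators=50, random_state=0),
}

# Mean 5-fold cross-validated R2 for each candidate.
scores = {name: cross_val_score(m, X, y, cv=5, scoring="r2").mean()
          for name, m in candidates.items()}

# Adaptively keep the five best-scoring models as base-models.
base_models = sorted(scores, key=scores.get, reverse=True)[:5]
print(base_models)
```

Because the selection is driven entirely by the cross-validated score, the same code adapts the base-model set to whatever dataset is supplied.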
2. Adaptive Ensemble Learning Framework for Renewable Energy Forecast
- (1)
- Twelve candidate models are trained and tested, and five base-models are selected by evaluating the R2 index. For each base-model:
- Select a 5-fold split of the training dataset;
- Evaluate using 5-fold cross-validation;
- Tune hyperparameters using the Bayesian optimization method;
- Store all out-of-fold predictions.
- (2)
- Fit a meta-model on the out-of-fold predictions by linear regression.
- (3)
- Evaluate the model on a holdout test dataset.
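The framework steps above can be sketched end to end: generate out-of-fold predictions from each base-model with 5-fold cross-validation, fit a linear meta-model on those predictions, then evaluate the stack on a holdout set. The data and the four base-models here are illustrative stand-ins, not the paper's tuned configuration.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split, cross_val_predict
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.metrics import r2_score

X, y = make_regression(n_samples=400, n_features=8, noise=5.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

base_models = [RandomForestRegressor(n_estimators=50, random_state=0),
               GradientBoostingRegressor(random_state=0),
               KNeighborsRegressor(),
               SVR()]

# Step (1): out-of-fold predictions from each base-model (5-fold CV),
# stacked column-wise into the meta-model's training matrix.
oof = np.column_stack([cross_val_predict(m, X_train, y_train, cv=5)
                       for m in base_models])

# Step (2): linear meta-model fitted on the out-of-fold predictions.
meta = LinearRegression().fit(oof, y_train)

# Refit each base-model on the full training set, then stack its
# holdout predictions the same way.
test_preds = np.column_stack([m.fit(X_train, y_train).predict(X_test)
                              for m in base_models])

# Step (3): evaluate the stacked model on the holdout set.
print("holdout R2:", r2_score(y_test, meta.predict(test_preds)))
```

Training the meta-model on out-of-fold predictions, rather than in-sample fits, is what keeps the linear weights from rewarding base-models that merely overfit the training data.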
3. Methodology
3.1. Regression Method Based on Boosting Learning
Algorithm 1. LGBM Regression
3.2. Regression Method Based on Bagging Learning
3.3. Other Regression Models
3.4. Stacking Ensemble
3.5. Bayesian Hyperparameters Optimization
4. Results and Discussions
4.1. Data
4.2. Data Standardization and Evaluation Indices
4.3. Model Selection and Hyperparameter Optimization
4.4. Wind Power Forecasting and Results Analysis
4.5. PV Power Forecasting and Results Analysis
5. Conclusions
- (1)
- Models based on different algorithmic principles can mine the spatial and structural characteristics of multi-dimensional heterogeneous datasets from multiple perspectives, so the algorithms complement one another. The proposed stacking ensemble learning framework can track dynamic changes within the data, combining multiple base-models to improve forecasting accuracy as well as generalization ability and adaptability.
- (2)
- Cross-validation and Bayesian hyperparameter optimization are used during model training, which effectively improves the model’s prediction accuracy.
- (3)
- A linear model is employed as the meta-model to integrate the base-models. The weight of each base-model is determined by the minimum cross-validation error principle, which further improves prediction accuracy without increasing model complexity or computational cost.
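One concrete realization of conclusion (3), determining base-model weights by the minimum cross-validation error principle, is an ordinary least-squares fit of the targets on the out-of-fold prediction matrix. The toy matrix below is illustrative, not data from the paper.

```python
import numpy as np

# Toy out-of-fold prediction matrix: rows = training samples,
# columns = base-models with increasing noise levels.
rng = np.random.default_rng(0)
y = rng.normal(size=200)                                 # targets
P = np.column_stack([y + rng.normal(scale=s, size=200)
                     for s in (0.1, 0.3, 0.5)])          # three base-models

# Minimum cross-validation error principle, realized as least squares:
# choose weights w minimizing ||P w - y||^2 over the out-of-fold data.
w, *_ = np.linalg.lstsq(P, y, rcond=None)

# The weighted combination cannot do worse (in-sample) than the best
# single base-model, since weight vectors like (1, 0, 0) are candidates.
mse_combined = np.mean((P @ w - y) ** 2)
mse_best_single = min(np.mean((P[:, j] - y) ** 2) for j in range(P.shape[1]))
print(w, mse_combined <= mse_best_single)
```

Because the weights come from a closed-form linear solve, this step adds essentially no training cost on top of the base-models, consistent with the conclusion above.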
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Method | Indices | Spring Dataset | Summer Dataset | Autumn Dataset | Winter Dataset |
|---|---|---|---|---|---|
| LR | RMSE | 0.189 | 0.135 | 0.205 | 0.237 |
| | MAE | 0.166 | 0.114 | 0.123 | 0.208 |
| | R2 | 0.579 | 0.52 | 0.49 | 0.315 |
| ELAN | RMSE | 0.294 | 0.209 | 0.295 | 0.287 |
| | MAE | 0.269 | 0.188 | 0.273 | 0.259 |
| | R2 | −0.016 | −0.151 | −0.063 | 0 |
| SVR | RMSE | 0.169 | 0.109 | 0.111 | 0.174 |
| | MAE | 0.129 | 0.079 | 0.08 | 0.134 |
| | R2 | 0.662 | 0.689 | 0.851 | 0.634 |
| DT | RMSE | 0.21 | 0.159 | 0.136 | 0.236 |
| | MAE | 0.139 | 0.116 | 0.085 | 0.167 |
| | R2 | 0.48 | 0.335 | 0.775 | 0.322 |
| KNN | RMSE | 0.163 | 0.128 | 0.11 | 0.188 |
| | MAE | 0.112 | 0.095 | 0.068 | 0.136 |
| | R2 | 0.686 | 0.569 | 0.853 | 0.573 |
| ADA | RMSE | 0.159 | 0.123 | 0.136 | 0.174 |
| | MAE | 0.132 | 0.103 | 0.104 | 0.142 |
| | R2 | 0.701 | 0.604 | 0.776 | 0.633 |
| Bagging | RMSE | 0.166 | 0.126 | 0.113 | 0.197 |
| | MAE | 0.115 | 0.094 | 0.071 | 0.142 |
| | R2 | 0.677 | 0.581 | 0.845 | 0.526 |
| RF | RMSE | 0.16 | 0.125 | 0.109 | 0.19 |
| | MAE | 0.111 | 0.092 | 0.068 | 0.138 |
| | R2 | 0.698 | 0.591 | 0.854 | 0.562 |
| ET | RMSE | 0.173 | 0.133 | 0.117 | 0.202 |
| | MAE | 0.117 | 0.098 | 0.073 | 0.146 |
| | R2 | 0.649 | 0.538 | 0.834 | 0.505 |
| GBRT | RMSE | 0.147 | 0.113 | 0.105 | 0.165 |
| | MAE | 0.106 | 0.08 | 0.065 | 0.124 |
| | R2 | 0.746 | 0.667 | 0.867 | 0.667 |
| XGB | RMSE | 0.151 | 0.116 | 0.104 | 0.175 |
| | MAE | 0.105 | 0.084 | 0.062 | 0.128 |
| | R2 | 0.731 | 0.648 | 0.869 | 0.628 |
| LGBM | RMSE | 0.145 | 0.112 | 0.104 | 0.167 |
| | MAE | 0.102 | 0.08 | 0.063 | 0.125 |
| | R2 | 0.754 | 0.673 | 0.868 | 0.662 |
| Method | Indices | Spring Dataset | Summer Dataset | Autumn Dataset | Winter Dataset |
|---|---|---|---|---|---|
| LR | RMSE | 0.157 | 0.12 | 0.167 | 0.162 |
| | MAE | 0.117 | 0.095 | 0.121 | 0.128 |
| | R2 | 0.711 | 0.858 | 0.381 | 0.759 |
| ELAN | RMSE | 0.302 | 0.334 | 0.226 | 0.331 |
| | MAE | 0.255 | 0.303 | 0.187 | 0.299 |
| | R2 | −0.07 | −0.096 | −0.137 | −0.004 |
| SVR | RMSE | 0.147 | 0.111 | 0.148 | 0.118 |
| | MAE | 0.095 | 0.076 | 0.107 | 0.097 |
| | R2 | 0.746 | 0.879 | 0.515 | 0.872 |
| DT | RMSE | 0.154 | 0.153 | 0.154 | 0.128 |
| | MAE | 0.112 | 0.103 | 0.105 | 0.073 |
| | R2 | 0.723 | 0.771 | 0.474 | 0.849 |
| KNN | RMSE | 0.187 | 0.116 | 0.132 | 0.101 |
| | MAE | 0.111 | 0.08 | 0.092 | 0.062 |
| | R2 | 0.589 | 0.868 | 0.611 | 0.906 |
| ADA | RMSE | 0.148 | 0.118 | 0.142 | 0.121 |
| | MAE | 0.095 | 0.096 | 0.099 | 0.094 |
| | R2 | 0.743 | 0.862 | 0.555 | 0.865 |
| Bagging | RMSE | 0.134 | 0.107 | 0.137 | 0.111 |
| | MAE | 0.084 | 0.082 | 0.094 | 0.067 |
| | R2 | 0.791 | 0.887 | 0.581 | 0.888 |
| RF | RMSE | 0.144 | 0.105 | 0.132 | 0.107 |
| | MAE | 0.092 | 0.073 | 0.091 | 0.064 |
| | R2 | 0.758 | 0.892 | 0.615 | 0.896 |
| ET | RMSE | 0.178 | 0.116 | 0.138 | 0.113 |
| | MAE | 0.106 | 0.078 | 0.096 | 0.068 |
| | R2 | 0.627 | 0.868 | 0.576 | 0.883 |
| GBRT | RMSE | 0.163 | 0.108 | 0.132 | 0.1 |
| | MAE | 0.106 | 0.076 | 0.089 | 0.062 |
| | R2 | 0.688 | 0.886 | 0.613 | 0.908 |
| XGB | RMSE | 0.168 | 0.107 | 0.134 | 0.102 |
| | MAE | 0.111 | 0.078 | 0.091 | 0.065 |
| | R2 | 0.669 | 0.888 | 0.604 | 0.904 |
| LGBM | RMSE | 0.142 | 0.106 | 0.137 | 0.107 |
| | MAE | 0.092 | 0.074 | 0.094 | 0.064 |
| | R2 | 0.762 | 0.889 | 0.58 | 0.894 |
| Model | Indices | Spring Dataset | Summer Dataset | Autumn Dataset | Winter Dataset |
|---|---|---|---|---|---|
| LR | RMSE | 0.137 | 0.104 | 0.098 | 0.158 |
| | MAE | 0.095 | 0.072 | 0.06 | 0.115 |
| | R2 | 0.759 | 0.681 | 0.878 | 0.678 |
| RF | RMSE | 0.156 | 0.117 | 0.108 | 0.181 |
| | MAE | 0.111 | 0.086 | 0.066 | 0.136 |
| | R2 | 0.715 | 0.639 | 0.857 | 0.604 |
| GBRT | RMSE | 0.161 | 0.122 | 0.113 | 0.184 |
| | MAE | 0.112 | 0.089 | 0.071 | 0.136 |
| | R2 | 0.697 | 0.61 | 0.843 | 0.587 |
| XGB | RMSE | 0.149 | 0.113 | 0.104 | 0.174 |
| | MAE | 0.104 | 0.081 | 0.063 | 0.128 |
| | R2 | 0.74 | 0.662 | 0.868 | 0.632 |
| LGBM | RMSE | 0.152 | 0.117 | 0.106 | 0.175 |
| | MAE | 0.107 | 0.085 | 0.065 | 0.129 |
| | R2 | 0.729 | 0.638 | 0.863 | 0.627 |
| Model | Hyperparameters | Range | Model | Hyperparameters | Range |
|---|---|---|---|---|---|
| KNN | n_neighbors | 1–20 | SVR | svr_c | 0.1–100 |
| | weights | uniform | | svr_gamma | 0.01–1.0 |
| RF | n_estimators | 10–200 | GBRT | n_estimators | 10–200 |
| | max_depth | 10–200 | | subsample | 0.1–1.0 |
| | min_samples_split | 1–10 | | min_samples_split | 1–20 |
| | min_samples_leaf | 1–10 | | min_samples_leaf | 1–20 |
| LGBM | n_estimators | 10–200 | XGB | n_estimators | 10–200 |
| | max_depth | 1–10 | | max_depth | 10–200 |
| | num_leaves | 1–20 | | min_child_weight | 1–10 |
| | learning_rate | 1–20 | | subsample | 0.1–1.0 |
| | subsample | 0.1–1.0 | | learning_rate | 0.1–1.0 |
| ADA | n_estimators | 10–200 | Bagging | n_estimators | 10–200 |
| | learning_rate | 0.1–1.0 | | max_samples | 1–10 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Huang, H.; Zhu, Q.; Zhu, X.; Zhang, J. An Adaptive, Data-Driven Stacking Ensemble Learning Framework for the Short-Term Forecasting of Renewable Energy Generation. Energies 2023, 16, 1963. https://doi.org/10.3390/en16041963