Enhancing Short-Term Load Forecasting Accuracy in High-Volatility Regions Using LSTM-SCN Hybrid Models
Abstract
1. Introduction
2. Methodologies
2.1. Stochastic Configuration Network (SCN)
2.1.1. Related Work
2.1.2. Network Model Construction
2.1.3. Universal Approximation Theorem
2.2. Long Short-Term Memory (LSTM)
2.3. LSTM-SCN
3. Experimental Confirmation of SCN’s Universal Approximation Property
3.1. Evaluation Metric
3.2. Parameter Settings
3.3. Experimental Results
4. Experiments
4.1. Experimental Setup
4.2. Dataset
4.3. Data Preprocessing
4.3.1. Data Cleaning
4.3.2. Time Series Feature Construction
4.3.3. Data Normalization
4.4. Experimental Results
- (1) Adaptive Dynamic Network Structure: The LSTM-SCN model relies on a supervisory mechanism to build its network, whereas the other models depend on backpropagation through fully connected layers. Backpropagation requires gradient optimization over a fixed network structure, while the supervisory mechanism grows the network dynamically according to the data and task requirements. This adaptability allows the model to construct a suitable architecture during learning, avoiding overfitting and unnecessary complexity.
- (2) Robustness to Gradient Issues: Backpropagation in deep networks often suffers from vanishing or exploding gradients, which hinder convergence and learning efficiency. In contrast, the supervisory mechanism does not rely on end-to-end gradient descent but adjusts the network through a feedback-driven process, making the model more robust to these issues.
- (3) Enhanced Interpretability: The supervisory mechanism offers higher interpretability. Its dynamic structural adjustments reveal the relationship between data features and model behavior, yielding a more transparent and efficient learning process.
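To make point (1) concrete, the SCN-style supervisory mechanism can be sketched as follows: hidden nodes are added one at a time, each candidate's randomly sampled weights must satisfy a supervisory inequality before being accepted, and the output weights are then re-solved by least squares. This is a minimal illustrative sketch, not the paper's exact algorithm; the function name, parameter values, tanh activation, and the simplified inequality (the contraction sequence is collapsed into a constant r) are all assumptions.

```python
import numpy as np

def scn_fit(X, y, L_max=50, T_max=30, lambdas=(0.5, 1.0, 5.0), r=0.999, tol=1e-2, seed=0):
    """Illustrative sketch of incremental SCN construction for regression.

    Nodes are added only if a candidate passes the supervisory inequality;
    output weights are refit by least squares after each accepted node.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    H = np.empty((n, 0))            # hidden-layer output matrix
    beta = np.zeros(0)              # output weights
    e = y.astype(float).copy()      # current residual (single-output case)
    for _ in range(L_max):
        if np.linalg.norm(e) < tol:
            break
        best_g, best_xi = None, -np.inf
        for lam in lambdas:          # widen the random sampling range if needed
            for _ in range(T_max):
                w = rng.uniform(-lam, lam, size=d)
                b = rng.uniform(-lam, lam)
                g = np.tanh(X @ w + b)           # candidate node output
                # simplified supervisory inequality: xi must be positive
                xi = (e @ g) ** 2 / (g @ g) - (1 - r) * (e @ e)
                if xi > best_xi:
                    best_xi, best_g = xi, g
            if best_xi > 0:
                break
        if best_xi <= 0:             # no admissible candidate: stop growing
            break
        H = np.column_stack([H, best_g])
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # refit output weights
        e = y - H @ beta
    return H, beta

# toy usage: fit a 1-D nonlinear function
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y = np.sin(3 * X[:, 0])
H, beta = scn_fit(X, y)
```

Because acceptance is gated by the inequality rather than by gradient descent, the network stops growing as soon as no candidate node can further reduce the residual, which is the adaptivity contrasted with backpropagation above.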
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
1. Zeng, P.; Jin, M.; Elahe, M.d.F. Short-Term Power Load Forecasting Based on Cross Multi-Model and Second Decision Mechanism. IEEE Access 2020, 8, 184061–184072.
2. Wang, Y.; Chen, Q.; Zhang, N.; Wang, Y. Conditional Residual Modeling for Probabilistic Load Forecasting. IEEE Trans. Power Syst. 2018, 33, 7327–7330.
3. Mei, T.; Si, Z.; Yan, J.; Lu, L. Short-Term Power Load Forecasting Study Based on IWOA Optimized CNN-BiLSTM. In Proceedings of the International Conference on Intelligent Computing, Tianjin, China, 5–8 August 2024.
4. Li, K.; Huang, W.; Hu, G.; Li, J. Ultra-Short Term Power Load Forecasting Based on CEEMDAN-SE and LSTM Neural Network. Energy Build. 2023, 279, 112666.
5. Wan, A.; Chang, Q.; AL-Bukhaiti, K.; He, J. Short-Term Power Load Forecasting for Combined Heat and Power Using CNN-LSTM Enhanced by Attention Mechanism. Energy 2023, 282, 128274.
6. Wan, C.; Zhao, J.; Song, Y.; Xu, Z.; Hu, Z. Photovoltaic and Solar Power Forecasting for Smart Grid Energy Management. CSEE J. Power Energy Syst. 2015, 1, 38–46.
7. Krstonijević, S. Adaptive Load Forecasting Methodology Based on Generalized Additive Model with Automatic Variable Selection. Sensors 2022, 22, 7247.
8. Shi, T.; Mei, F.; Lu, J.; Lu, J.; Pan, Y.; Zhou, C.; Wu, J.; Zheng, J. Phase Space Reconstruction Algorithm and Deep Learning-Based Very Short-Term Bus Load Forecasting. Energies 2019, 12, 4349.
9. Barta, G.; Nagy, G.; Papp, G.; Simon, G. Forecasting Framework for Open Access Time Series in Energy. In Proceedings of the 2016 IEEE International Energy Conference (ENERGYCON), Leuven, Belgium, 4–8 April 2016; pp. 1–6.
10. Wijaya, T.K. Forecasting Uncertainty in Electricity Demand. In Proceedings of the AAAI-15 Workshop on Computational Sustainability, Austin, TX, USA, 26 January 2015.
11. Farrag, T.A.; Elattar, E.E. Optimized Deep Stacked Long Short-Term Memory Network for Long-Term Load Forecasting. IEEE Access 2021, 9, 68511–68522.
12. Wen, Z.; Xie, L.; Fan, Q.; Feng, H. Long Term Electric Load Forecasting Based on TS-Type Recurrent Fuzzy Neural Network Model. Electr. Power Syst. Res. 2020, 179, 106106.
13. Guan, Y.; Li, D.; Xue, S.; Xi, Y. Feature-Fusion-Kernel-Based Gaussian Process Model for Probabilistic Long-Term Load Forecasting. Neurocomputing 2021, 426, 174–184.
14. Kazemzadeh, M.-R.; Amjadian, A.; Amraee, T. A Hybrid Data Mining Driven Algorithm for Long Term Electric Peak Load and Energy Demand Forecasting. Energy 2020, 204, 117948.
15. Andriopoulos, N.; Magklaras, A.; Birbas, A.; Papalexopoulos, A.; Valouxis, C.; Daskalaki, S.; Birbas, M.; Housos, E.; Papaioannou, G. Short Term Electric Load Forecasting Based on Data Transformation and Statistical Machine Learning. Appl. Sci. 2020, 11, 158.
16. Qinwei, D.; Xiangzhen, H.; Zhu, C.; Xuchen, T.; Zugang, L. Short-Term Power Load Forecasting Based on Sparrow Search Algorithm-Variational Mode Decomposition and Attention-Long Short-Term Memory. Int. J. Low-Carbon Technol. 2024, 19, 1089–1097.
17. Pavlatos, C.; Makris, E.; Fotis, G.; Vita, V.; Mladenov, V. Enhancing Electrical Load Prediction Using a Bidirectional LSTM Neural Network. Electronics 2023, 12, 4652.
18. Shi, H.; Xu, M.; Li, R. Deep Learning for Household Load Forecasting—A Novel Pooling Deep RNN. IEEE Trans. Smart Grid 2018, 9, 5271–5280.
19. Chen, K.; Chen, K.; Wang, Q.; He, Z.; Hu, J.; He, J. Short-Term Load Forecasting With Deep Residual Networks. IEEE Trans. Smart Grid 2019, 10, 3943–3952.
20. Qiu, X.; Zhao, Q.; Wang, Y.; Tian, J.; Ding, H.; Zhang, J.; Zhao, H. Load Transfer Analysis of Regional Power Grid Based on Expert System Theory. In Proceedings of the 2021 6th Asia Conference on Power and Electrical Engineering (ACPEE), Chongqing, China, 8–11 April 2021; pp. 557–561.
21. Kandil, M.S.; El-Debeiky, S.M.; Hasanien, N.E. Long-Term Load Forecasting for Fast Developing Utility Using a Knowledge-Based Expert System. IEEE Power Eng. Rev. 2002, 22, 78.
22. Arora, S.; Taylor, J.W. Short-Term Forecasting of Anomalous Load Using Rule-Based Triple Seasonal Methods. IEEE Trans. Power Syst. 2013, 28, 3235–3242.
23. Song, X.; Wang, Z.; Wang, H. Short-Term Load Prediction with LSTM and FCNN Models Based on Attention Mechanisms. J. Phys. Conf. Ser. 2024, 2741, 012026.
24. Liyun, P.; Wenjun, Z.; Sining, W.; Lu, H. Short-Term Load Forecasting Based on DenseNet-LSTM Fusion Model. In Proceedings of the 2021 IEEE International Conference on Energy Internet (ICEI), Southampton, UK, 27–29 September 2021; pp. 84–89.
25. Ma, L.; Wang, L.; Zeng, S.; Zhao, Y.; Liu, C.; Zhang, H.; Wu, Q.; Ren, H. Short-Term Household Load Forecasting Based on Attention Mechanism and CNN-ICPSO-LSTM. Energy Eng. 2024, 121, 1473–1493.
26. Goh, H.H.; He, B.; Liu, H.; Zhang, D.; Dai, W.; Kurniawan, T.A.; Goh, K.C. Multi-Convolution Feature Extraction and Recurrent Neural Network Dependent Model for Short-Term Load Forecasting. IEEE Access 2021, 9, 118528–118540.
27. Chen, X.; Chen, W.; Dinavahi, V.; Liu, Y.; Feng, J. Short-Term Load Forecasting and Associated Weather Variables Prediction Using ResNet-LSTM Based Deep Learning. IEEE Access 2023, 11, 5393–5405.
28. Zhou, R.; Zhang, X. Short-Term Power Load Forecasting Based on ARIMA-LSTM. J. Phys. Conf. Ser. 2024, 2803, 012002.
29. Shin, S.M.; Rasheed, A.; Kil-Heum, P.; Veluvolu, K.C. Fast and Accurate Short-Term Load Forecasting with a Hybrid Model. Electronics 2024, 13, 1079.
30. Wang, D.; Li, M. Stochastic Configuration Networks: Fundamentals and Algorithms. IEEE Trans. Cybern. 2017, 47, 3466–3479.
31. Broomhead, D.S.; Lowe, D. Multivariable Functional Interpolation and Adaptive Networks. Complex Syst. 1988, 2, 321–355.
32. Pao, Y.-H.; Takefuji, Y. Functional-Link Net Computing: Theory, System Architecture, and Functionalities. Computer 1992, 25, 76–79.
33. Dai, W.; Li, D.; Zhou, P.; Chai, T. Stochastic Configuration Networks with Block Increments for Data Modeling in Process Industries. Inf. Sci. 2019, 484, 367–386.
34. Li, J.; Wang, D. 2D Convolutional Stochastic Configuration Networks. Knowl.-Based Syst. 2024, 300, 112249.
35. Wang, D. Editorial: Randomized Algorithms for Training Neural Networks. Inf. Sci. 2016, 100, 126–128.
36. Wang, D.; Cui, C. Stochastic Configuration Networks Ensemble with Heterogeneous Features for Large-Scale Data Analytics. Inf. Sci. 2017, 417, 55–71.
37. Wang, D.; Li, M. Robust Stochastic Configuration Networks with Kernel Density Estimation for Uncertain Data Regression. Inf. Sci. 2017, 412–413, 210–222.
38. Li, M.; Wang, D. Insights into Randomized Algorithms for Neural Networks: Practical Issues and Common Pitfalls. Inf. Sci. 2017, 382–383, 170–178.
39. Gorban, A.N.; Tyukin, I.Y.; Prokhorov, D.V.; Sofeikov, K.I. Approximation with Random Bases: Pro et Contra. Inf. Sci. 2016, 364–365, 129–145.
40. Wang, D.; Dang, G. Recurrent Stochastic Configuration Networks for Temporal Data Analytics. arXiv 2024, arXiv:2406.16959.
41. Fan, G.F.; Peng, L.L.; Hong, W.C.; Sun, F. Electric Load Forecasting by the SVR Model with Differential Empirical Mode Decomposition and Auto Regression. Neurocomputing 2016, 173, 958–970.
42. Alghamdi, M.A.; AL-Malaise AL-Ghamdi, A.S.; Ragab, M. Predicting Energy Consumption Using Stacked LSTM Snapshot Ensemble. Big Data Min. Anal. 2024, 7, 247–270.
43. Wu, J.; Tang, X.; Zhou, D.; Deng, W.; Cai, Q. Application of Improved DBN and GRU Based on Intelligent Optimization Algorithm in Power Load Identification and Prediction. Energy Inform. 2024, 7, 36.
| Symbol/Abbreviation | Definition |
|---|---|
| w_j | Input weight of the j-th node in the hidden layer |
| b_j | Bias of the j-th node in the hidden layer |
| e_{L−1} | Residual error of the network with L − 1 hidden nodes |
| g(·) | Activation function |
| f_{L−1} | Output of the network with L − 1 hidden nodes |
| ε | Training tolerance |
| L_max | The maximum number of nodes in the hidden layer |
| T_max | The number of candidates in the hidden layer |
| SVR | Support Vector Regression |
| RVFL | Random Vector Functional Link Networks |
| FC | Fully Connected Layer |
| LR | Linear Regression |
| XGBoost | Extreme Gradient Boosting |
| GBRT | Gradient Boosting Regression Trees |
| SCN | Stochastic Configuration Network |
| LSTM | Long Short-Term Memory |
| BiLSTM | Bidirectional Long Short-Term Memory |
| GRU | Gated Recurrent Unit |
| Dataset | Model | RMSE | MAE | MAPE |
|---|---|---|---|---|
| Stock | SVR | 0.641 | 0.486 | 1.042% |
| | RVFL | 32.291 | 30.521 | 65.303% |
| | FC | 1.645 | 1.645 | 2.759% |
| | LR | 2.335 | 1.795 | 3.857% |
| | XGBoost | 0.783 | 0.585 | 1.256% |
| | GBRT | 0.900 | 0.685 | 1.461% |
| | SCN (ours) | 0.540 | 0.540 | 0.876% |
| Concrete | SVR | 9.610 | 6.245 | 22.595% |
| | RVFL | 10.793 | 8.345 | 30.080% |
| | FC | 6.612 | 4.925 | 16.943% |
| | LR | 2.615 | 2.032 | 4.411% |
| | XGBoost | 0.893 | 0.655 | 1.412% |
| | GBRT | 0.963 | 0.742 | 1.599% |
| | SCN (ours) | 1.698 | 0.194 | 0.568% |
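The tables in this section report RMSE, MAE, and MAPE. As a reference for how these metrics are conventionally computed (a sketch of the standard definitions, not code from the paper; MAPE assumes the true values contain no zeros):

```python
import numpy as np

def rmse(y_true, y_pred):
    # Root Mean Squared Error: penalizes large deviations quadratically
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mae(y_true, y_pred):
    # Mean Absolute Error: average magnitude of the errors
    return float(np.mean(np.abs(y_true - y_pred)))

def mape(y_true, y_pred):
    # Mean Absolute Percentage Error, in percent (y_true must be nonzero)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)

# toy example with made-up values
y_true = np.array([100.0, 200.0, 300.0])
y_pred = np.array([110.0, 190.0, 330.0])
# rmse ≈ 19.15, mae ≈ 16.67, mape ≈ 8.33%
```

Because MAPE is scale-free, it is the most comparable metric across regions with different load magnitudes, which is why the per-region tables below report it alongside RMSE and MAE.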
| Experimental Environment | Configuration |
|---|---|
| OS | Ubuntu 22.04 |
| Development Environment | VS Code |
| CPU | Intel(R) Xeon(R) Silver 4116 @ 2.10 GHz |
| Graphics Card Model | NVIDIA GeForce RTX 3090 (24 GB) |
| Programming Language | Python 3.10 |
| Deep Learning Framework | PyTorch |
| Feature Name | Unit | Description |
|---|---|---|
| Power Load | MW | The rate of electrical energy consumption of electrical devices |
| Dry Bulb Temperature | °C | The air temperature measured by a conventional thermometer |
| Dew Point Temperature | °C | The temperature at which water vapor in the air condenses into dew |
| Wet Bulb Temperature | °C | A temperature that reflects humidity and evaporative cooling potential |
| Humidity | % | The amount of water vapor in the air |
| Price | $/MWh | The cost per unit of electricity consumed per hour |
| Model | RMSE | MAE | MAPE |
|---|---|---|---|
| LSTM [42] | 93.797 | 72.175 | 0.819% |
| BiLSTM [17] | 124.220 | 100.635 | 1.179% |
| GRU [43] | 69.620 | 51.879 | 0.593% |
| SCN | 97.626 | 74.071 | 0.847% |
| CNN-LSTM [5] | 62.986 | 57.829 | 0.545% |
| LSTM-RVFL [29] | 187.695 | 167.114 | 1.866% |
| LSTM-SCN (ours) | 56.970 | 43.033 | 0.492% |
| Region Name | Region 1 | Region 2 | Region 3 | Region 4 | Region 5 |
|---|---|---|---|---|---|
| Data Range | [200, 240) | [540, 580) | [875, 915) | [1165, 1205) | [1310, 1350) |
| Region Name | Model | RMSE | MAE | MAPE |
|---|---|---|---|---|
| Region 1 | LSTM | 69.503 | 56.357 | 0.723% |
| | BiLSTM | 43.366 | 34.632 | 0.453% |
| | GRU | 49.503 | 40.009 | 0.511% |
| | SCN | 82.894 | 69.422 | 0.894% |
| | CNN-LSTM | 48.532 | 36.188 | 0.454% |
| | LSTM-RVFL | 120.833 | 111.067 | 1.391% |
| | LSTM-SCN (ours) | 5.465 | 4.851 | 0.062% |
| Region 2 | LSTM | 77.431 | 55.698 | 0.666% |
| | BiLSTM | 63.474 | 46.654 | 0.566% |
| | GRU | 47.208 | 36.597 | 0.451% |
| | SCN | 58.148 | 47.428 | 0.591% |
| | CNN-LSTM | 52.001 | 41.473 | 0.517% |
| | LSTM-RVFL | 158.725 | 149.760 | 1.853% |
| | LSTM-SCN (ours) | 7.285 | 5.095 | 0.062% |
| Region 3 | LSTM | 66.255 | 48.467 | 0.632% |
| | BiLSTM | 48.305 | 36.094 | 0.477% |
| | GRU | 53.502 | 43.131 | 0.573% |
| | SCN | 86.465 | 81.428 | 0.794% |
| | CNN-LSTM | 53.183 | 41.249 | 0.542% |
| | LSTM-RVFL | 154.871 | 146.592 | 1.947% |
| | LSTM-SCN (ours) | 6.104 | 6.104 | 0.057% |
| Region 4 | LSTM | 109.402 | 93.140 | 1.262% |
| | BiLSTM | 56.048 | 37.506 | 0.506% |
| | GRU | 71.450 | 54.467 | 0.733% |
| | SCN | 93.136 | 82.856 | 1.492% |
| | CNN-LSTM | 42.403 | 30.666 | 0.412% |
| | LSTM-RVFL | 156.803 | 148.207 | 2.012% |
| | LSTM-SCN (ours) | 7.452 | 5.932 | 0.080% |
| Region 5 | LSTM | 95.331 | 84.480 | 1.166% |
| | BiLSTM | 41.727 | 30.050 | 0.406% |
| | GRU | 60.292 | 46.030 | 0.625% |
| | SCN | 79.458 | 69.475 | 0.916% |
| | CNN-LSTM | 58.118 | 45.542 | 0.617% |
| | LSTM-RVFL | 112.343 | 104.097 | 1.418% |
| | LSTM-SCN (ours) | 7.175 | 5.976 | 0.081% |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Tang, B.; Hu, J.; Yang, M.; Zhang, C.; Bai, Q. Enhancing Short-Term Load Forecasting Accuracy in High-Volatility Regions Using LSTM-SCN Hybrid Models. Appl. Sci. 2024, 14, 11606. https://doi.org/10.3390/app142411606