Deep Learning for Short-Term Load Forecasting—Industrial Consumer Case Study
Abstract
1. Introduction
2. Literature Review
3. Materials and Methods
- Installations that serve the cutting and exhaust equipment;
- Installations that serve the cooling system, providing the cold needed to keep the substances used in the foaming process in optimal condition;
- Installations that serve the processing and cutting of sponges;
- Installations that serve the different subsections involved in mattress making;
- Equipment used for making upholstery and assembling all subassemblies;
- Interior lighting installations located within the physical perimeter of all production halls;
- Robots for packing finished products;
- Conveyors for transporting products in the logistics warehouse;
- Specific facilities for food preparation in the canteen;
- Other installations specific to the general processes that take place within the company.
3.1. Deep Learning (DL)
3.2. Gated Recurrent Units (GRU)
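A GRU cell combines a reset gate and an update gate to control how much of the past hidden state is kept at each time step. As an illustration, one step of the standard cell equations can be sketched in plain NumPy (dimensions and weight values are illustrative, not the paper's trained parameters; the update convention shown is one of the two common, equivalent formulations):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh):
    """One GRU step: input x and previous hidden state h -> new hidden state."""
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate state
    return (1.0 - z) * h + z * h_tilde              # blend old and candidate

# Illustrative dimensions: 11 input features, 100 hidden units.
rng = np.random.default_rng(0)
n_in, n_hid = 11, 100
params = [rng.standard_normal(s) * 0.1
          for s in [(n_hid, n_in), (n_hid, n_hid), (n_hid,)] * 3]
h = gru_step(rng.standard_normal(n_in), np.zeros(n_hid), *params)
```

With all weights zero, both gates sit at 0.5 and the candidate state is zero, so each step simply halves the previous hidden state, which makes the gating mechanics easy to verify by hand.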
3.3. Proposed Methodology
4. Results
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Market Entity | Advantages | Disadvantages/Obstacles |
|---|---|---|
| DSO | Optimized power flow; RES integration; lower grid losses | Big data; high and inconsistent errors from similar models and methods (depending on the time period, load category, and forecasting window); additional resources required (time, qualified employees, software, training) |
| TSO | Lower transmission losses; better planning | Big-data processing; high errors; inconsistent forecasting |
| Energy supplier | Better trading; lower costs for portfolio balancing | Client commitment; communication about activity planning; high errors or inconsistent load forecasting |
| End-user | Lower bills; increased energy efficiency | Low interest, as core activities matter more than electricity usage; lack of transparency |
| Term | Coeff. | Std. Error | t Stat | p-Value | Lower 95% | Upper 95% | Lower 95.0% | Upper 95.0% |
|---|---|---|---|---|---|---|---|---|
| Intercept | 0.41 | 0.039 | 6.823 | 0.0000000 | 0.157 | 0.31 | 0.17 | 0.401 |
| T-1 | 0.311 | 0.0121 | 31.86 | 0.0000000 | 0.46 | 0.402 | 0.3 | 0.32 |
| T-2 | −0.016 | 0.012 | −1.277 | 0.2015198 | −0.040 | 0.008 | −0.040 | 0.008 |
| T-3 | 0.039 | 0.012 | 3.220 | 0.0012885 | 0.015 | 0.063 | 0.015 | 0.063 |
| T-4 | 0.071 | 0.012 | 5.791 | 0.0000000 | 0.047 | 0.095 | 0.047 | 0.095 |
| T-5 | −0.008 | 0.012 | −0.653 | 0.5135898 | −0.032 | 0.016 | −0.032 | 0.016 |
| T-6 | 0.025 | 0.012 | 2.092 | 0.0364606 | 0.002 | 0.048 | 0.002 | 0.048 |
| T-7 | 0.576 | 0.011 | 51.288 | 0.0000000 | 0.554 | 0.598 | 0.554 | 0.598 |
| T-8 | −0.295 | 0.012 | −25.119 | 0.0000000 | −0.318 | −0.272 | −0.318 | −0.272 |
| T-9 | −0.013 | 0.012 | −1.087 | 0.2769461 | −0.037 | 0.011 | −0.037 | 0.011 |
| T-10 | −0.040 | 0.012 | −3.244 | 0.0011830 | −0.064 | −0.016 | −0.064 | −0.016 |
| T-11 | −0.071 | 0.012 | −5.792 | 0.0000000 | −0.095 | −0.047 | −0.095 | −0.047 |
| T-12 | 0.006 | 0.012 | 0.497 | 0.6189813 | −0.018 | 0.030 | −0.018 | 0.030 |
| T-13 | −0.023 | 0.012 | −1.920 | 0.0548434 | −0.046 | 0.000 | −0.046 | 0.000 |
| T-14 | 0.303 | 0.011 | 28.200 | 0.0000000 | 0.282 | 0.324 | 0.282 | 0.324 |
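The fitted autoregressive model predicts each hour as a linear combination of the same hour in the 14 preceding days. A minimal sketch of the prediction step follows; the coefficient values are copied from the table above, and it is assumed that the table's unlabeled first row is the intercept:

```python
import numpy as np

# Intercept followed by lag coefficients T-1 ... T-14, as reported in Table 2.
intercept = 0.41
coeffs = np.array([0.311, -0.016, 0.039, 0.071, -0.008, 0.025, 0.576,
                   -0.295, -0.013, -0.040, -0.071, 0.006, -0.023, 0.303])

def ar_predict(history):
    """Predict the next value from the 14 most recent observations.

    history[-1] corresponds to lag T-1 (most recent), history[-14] to T-14.
    """
    lags = np.asarray(history, dtype=float)[-14:][::-1]  # order: T-1 ... T-14
    return intercept + float(coeffs @ lags)

# Example: a constant history of 1.0 yields intercept + sum of coefficients.
print(round(ar_predict([1.0] * 14), 3))
```

In the study each hour of the day has its own such regression, so 24 coefficient sets of this shape would be applied in practice.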
| Mean Absolute Error | Root Mean Square Error | Mean Absolute Percentage Error |
|---|---|---|
| $\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\lvert y_i - \hat{y}_i\rvert$ | $\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2}$ | $\mathrm{MAPE} = \frac{100}{n}\sum_{i=1}^{n}\left\lvert\frac{y_i - \hat{y}_i}{y_i}\right\rvert$ |
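The three error metrics can be computed directly from a pair of actual and forecast series; a minimal NumPy sketch (function and variable names are illustrative):

```python
import numpy as np

def forecast_errors(y_true, y_pred):
    """Return (MAE, RMSE, MAPE in percent) for actual vs. forecast loads."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    mape = 100.0 * np.mean(np.abs(err / y_true))  # assumes no zero loads
    return mae, rmse, mape

mae, rmse, mape = forecast_errors([2.0, 4.0, 5.0], [2.5, 4.0, 4.0])
```

Note that MAPE is undefined where the actual load is zero, which rarely matters for an industrial consumer with a continuous base load but would need handling for intermittent series.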
| Method | Parameters Considered |
|---|---|
| AR | Autoregressive: the prediction for each hour is based on the same hour in the past 14 days. Coefficients are presented in Table 2 |
| MLP | Multi-layer perceptron. Input matrix [24,11]. Input variables: past 14 days, day of week, working/non-working days and hours, special days, temperature. 2 × hidden layers (300, 200). Output layer: Dense; Activation: ReLU; Optimizer: Adam; Loss: MSE; Epochs: 100 |
| Simple RNN | Recurrent neural network. Input matrix [24,11]. Input variables: past 14 days, day of week, working/non-working days and hours, special days, temperature, humidity, dew point. 3 × hidden layers (100, 100, 96). Output layer: Dense; Activation: Tanh, Sigmoid; Optimizer: Adam; Loss: MSE; Epochs: 100 |
| LSTM | Long short-term memory. Input matrix [24,11]. Input variables: past 14 days, day of week, working/non-working days and hours, special days, temperature, humidity, dew point. 3 × hidden layers (100, 100, 168). Output layer: Dense; Activation: Tanh, Sigmoid; Optimizer: Adam; Loss: MSE; Epochs: 100 |
| LSTM encoder-decoder | Input matrix [24,11]. Input variables: past 14 days, day of week, working/non-working days and hours, special days, temperature, humidity, dew point. 3 × hidden layers (100, 100, 100), 1 × RepeatVector, 1 × TimeDistributed layer (96). Activation: Tanh, Sigmoid; Optimizer: Adam; Loss: MSE; Epochs: 100 |
| GRU | Gated recurrent unit. Input matrix [24,11]. Input variables: past 14 days, day of week, working/non-working days and hours, special days, temperature, humidity, dew point. 3 × hidden layers (100, 100, 48). Output layer: Dense; Activation: Tanh, Sigmoid; Optimizer: Adam; Loss: MSE; Epochs: 100 |
| GRU-LSTM | Combination of LSTM and GRU layers. Input matrix [24,11]. Input variables: past 14 days, AR(9), day of week, working/non-working days and hours, special days, temperature, humidity, dew point. 3 × hidden layers (100, 100, 48). Output layer: Dense; Activation: Tanh, Sigmoid; Optimizer: Adam; Loss: MSE; Epochs: 100 |
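As an illustration, the GRU configuration from the table can be assembled with the Keras API roughly as follows. This is a sketch, not the authors' code: the forecast horizon of 24 hourly values is an assumption, and only the layer sizes, optimizer, and loss follow the table.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_gru_model(n_steps=24, n_features=11, horizon=24):
    """Stacked GRU forecaster: input window [24, 11] -> one value per hour."""
    model = models.Sequential([
        layers.Input(shape=(n_steps, n_features)),
        layers.GRU(100, return_sequences=True),  # hidden layer 1
        layers.GRU(100, return_sequences=True),  # hidden layer 2
        layers.GRU(48),                          # hidden layer 3
        layers.Dense(horizon),                   # dense output layer
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

model = build_gru_model()
pred = model.predict(tf.zeros((2, 24, 11)), verbose=0)  # two dummy windows
```

Training would then be `model.fit(X, y, epochs=100)` on windows of shape `(samples, 24, 11)`, matching the 100 epochs reported in the table.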
| Metric | AR(9) | LSTM enc-dec | LSTM-GRU | LSTM | MLP | Simple RNN | GRU |
|---|---|---|---|---|---|---|---|
| MAPE | 5.53% | 6.28% | 5.14% | 5.43% | 5.71% | 6.63% | 4.82% |
| RMSE | 0.146 | 0.193 | 0.138 | 0.141 | 0.165 | 0.181 | 0.131 |
| MAE | 0.112 | 0.145 | 0.1001 | 0.104 | 0.129 | 0.140 | 0.0998 |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Ungureanu, S.; Topa, V.; Cziker, A.C. Deep Learning for Short-Term Load Forecasting—Industrial Consumer Case Study. Appl. Sci. 2021, 11, 10126. https://doi.org/10.3390/app112110126