A Hybrid Neural Network Model for Power Demand Forecasting
Abstract
1. Introduction
2. Related Work
2.1. Power Demand Forecasting Using Deep Learning
2.2. Approaches Based on a Hybrid Network Model
3. Data Processing and Deep Learning Models
3.1. Data Processing
3.1.1. Vertical Partitioning and <Key, Context[1, c]> Pairing
3.1.2. Overlapped Window and Dataset
3.2. (c, l)-LSTM+CNN Hybrid Forecasting Model
4. Experiments and Results
4.1. Experiment Environment and Determination of the Number of Layers l
4.2. Case 1: With Holidays
4.3. Case 2: Without Holidays
4.4. Forecasting an n-Day Profile
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Yuan, C.; Liu, S.; Fang, Z. Comparison of China’s primary energy consumption forecasting by using ARIMA (the autoregressive integrated moving average) model and GM (1, 1) model. Energy 2016, 100, 384–390.
- Taylor, J.W. Short-term electricity demand forecasting using double seasonal exponential smoothing. J. Oper. Res. Soc. 2003, 54, 799–805.
- Bair, E.; Hastie, T.; Paul, D.; Tibshirani, R. Prediction by supervised principal components. J. Am. Stat. Assoc. 2006, 101, 119–137.
- Wen, T.H.; Gasic, M.; Mrksic, N.; Su, P.H.; Vandyke, D.; Young, S. Semantically conditioned LSTM-based natural language generation for spoken dialogue systems. arXiv 2015, arXiv:1508.01745.
- Sundermeyer, M.; Ney, H.; Schlüter, R. From feedforward to recurrent LSTM neural networks for language modeling. IEEE/ACM Trans. Audio Speech Lang. Process. 2015, 23, 517–529.
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1106–1114.
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015.
- Chen, L.C.; Papandreou, G.; Kokkinos, I.; Murphy, K.; Yuille, A.L. DeepLab: Semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected CRFs. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 40, 834–848.
- Qiu, X.; Zhang, L.; Ren, Y.; Suganthan, P.N.; Amaratunga, G. Ensemble deep learning for regression and time series forecasting. In Proceedings of the 2014 IEEE Symposium on Computational Intelligence in Ensemble Learning (CIEL), Orlando, FL, USA, 9–12 December 2014.
- Zheng, Y.; Liu, Q.; Chen, E.; Ge, Y.; Zhao, J.L. Time series classification using multi-channels deep convolutional neural networks. In International Conference on Web-Age Information Management; Springer: Cham, Switzerland, 2014.
- Yang, J.; Nguyen, M.N.; San, P.P.; Li, X.L.; Krishnaswamy, S. Deep Convolutional Neural Networks on Multichannel Time Series for Human Activity Recognition. In Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence (IJCAI 2015), Buenos Aires, Argentina, 25–31 July 2015.
- Khosravani, H.; Castilla, M.; Berenguel, M.; Ruano, A.; Ferreira, P. A comparison of energy consumption prediction models based on neural networks of a bioclimatic building. Energies 2016, 9, 57.
- Ryu, S.; Noh, J.; Kim, H. Deep neural network based demand side short term load forecasting. Energies 2016, 10, 3.
- Zheng, J.; Xu, C.; Zhang, Z.; Li, X. Electric load forecasting in smart grids using long-short-term-memory based recurrent neural network. In Proceedings of the 2017 51st Annual Conference on Information Sciences and Systems (CISS), Baltimore, MD, USA, 22–24 March 2017.
- Kong, W.; Dong, Z.Y.; Jia, Y.; Hill, D.J.; Xu, Y.; Zhang, Y. Short-term residential load forecasting based on LSTM recurrent neural network. IEEE Trans. Smart Grid 2019, 10, 841–851.
- Hsu, D. Time Series Forecasting Based on Augmented Long Short-Term Memory. arXiv 2017, arXiv:1707.00666.
- Gensler, A.; Henze, J.; Sick, B.; Raabe, N. Deep Learning for solar power forecasting—An approach using AutoEncoder and LSTM Neural Networks. In Proceedings of the 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Budapest, Hungary, 9–12 October 2016.
- Marino, D.L.; Amarasinghe, K.; Manic, M. Building energy load forecasting using deep neural networks. In Proceedings of the 42nd Annual Conference of the IEEE Industrial Electronics Society (IECON 2016), Florence, Italy, 23–26 October 2016.
- Shi, H.; Xu, M.; Li, R. Deep learning for household load forecasting—A novel pooling deep RNN. IEEE Trans. Smart Grid 2018, 9, 5271–5280.
- Amarasinghe, K.; Marino, D.L.; Manic, M. Deep neural networks for energy load forecasting. In Proceedings of the 2017 IEEE 26th International Symposium on Industrial Electronics (ISIE), Edinburgh, UK, 19–21 June 2017.
- Dong, X.; Qian, L.; Huang, L. A CNN based bagging learning approach to short-term load forecasting in smart grid. In Proceedings of the 2017 IEEE SmartWorld, Ubiquitous Intelligence & Computing, Advanced & Trusted Computed, Scalable Computing & Communications, Cloud & Big Data Computing, Internet of People and Smart City Innovation (SmartWorld/SCALCOM/UIC/ATC/CBDCom/IOP/SCI), San Francisco, CA, USA, 4–8 August 2017.
- Kuo, P.-H.; Huang, C.-J. A high precision artificial neural networks model for short-term energy load forecasting. Energies 2018, 11, 213.
- Mikolov, T.; Zweig, G. Context dependent recurrent neural network language model. In Proceedings of the 2012 IEEE Spoken Language Technology Workshop (SLT), Miami, FL, USA, 2–5 December 2012; pp. 234–239.
- Liu, Q.; Wu, S.; Wang, L.; Tan, T. Predicting the Next Location: A Recurrent Model with Spatial and Temporal Contexts. In Proceedings of the Thirtieth AAAI Conference (AAAI-16), Phoenix, AZ, USA, 12–17 February 2016.
- Sainath, T.N.; Vinyals, O.; Senior, A.; Sak, H. Convolutional, long short-term memory, fully connected deep neural networks. In Proceedings of the 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Brisbane, QLD, Australia, 19–24 April 2015.
- Yan, K.; Wang, X.; Du, Y.; Jin, N.; Huang, H.; Zhou, H. Multi-Step Short-Term Power Consumption Forecasting with a Hybrid Deep Learning Strategy. Energies 2018, 11, 3089.
- Srivastava, N.; Hinton, G.; Krizhevsky, A.; Sutskever, I.; Salakhutdinov, R. Dropout: A simple way to prevent neural networks from overfitting. J. Mach. Learn. Res. 2014, 15, 1929–1958.
- Kollia, I.; Kollias, S. A Deep Learning Approach for Load Demand Forecasting of Power Systems. In Proceedings of the 2018 IEEE Symposium Series on Computational Intelligence (SSCI), Bangalore, India, 18–21 November 2018.
- Tian, C.; Ma, J.; Zhang, C.; Zhan, P. A Deep Neural Network Model for Short-Term Load Forecast Based on Long Short-Term Memory Network and Convolutional Neural Network. Energies 2018, 11, 3493.
- Khotanzad, A.; Afkhami-Rohani, R.; Maratukulam, D. ANNSTLF-artificial neural network short-term load forecaster-generation three. IEEE Trans. Power Syst. 1998, 13, 1413–1422.
- Korea’s Daily Power Demand Data. 2017. Available online: https://www.kpx.or.kr/www/contents.do?key=15 (accessed on 10 April 2018).
- Deng, L.; Platt, J.C. Ensemble deep learning for speech recognition. In Proceedings of the Fifteenth Annual Conference of the International Speech Communication Association, Singapore, 14–18 September 2014.
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-learn: Machine learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830.
- Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980.
- Abadi, M.; Barham, P.; Chen, J.; Chen, Z.; Davis, A.; Dean, J.; Devin, M.; Ghemawat, S.; Irving, G.; Isard, M.; et al. TensorFlow: A system for large-scale machine learning. In Proceedings of the 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI 16), Savannah, GA, USA, 2–4 November 2016.
- Chollet, F. Keras. 2015. Available online: https://keras.io (accessed on 7 August 2018).
- Abu-Shikhah, N.; Elkarmi, F.; Aloquili, O.M. Medium-term electric load forecasting using multivariable linear and non-linear regression. Smart Grid Renew. Energy 2011, 2, 126.
- Saxena, H. Forecasting Strategies for Predicting Peak Electric Load Days. Master’s Thesis, Rochester Institute of Technology, New York, NY, USA, 2017.
- Bartoš, S. Prediction of Energy Load Profiles. Master’s Thesis, Charles University, Prague, Czech Republic, 2017.
Term | Notation
---|---
Power | Key
Context Information | Context
Number of Domains | c
Data Pair Set | <Key, Context[1, c]>
Number of LSTM Layers | l
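The <Key, Context[1, c]> pairing of Section 3.1.1 splits each record into the power value (Key) and its c context-domain values. A minimal sketch follows; the field names and the choice of c = 3 context domains are illustrative assumptions, not the paper's exact schema.

```python
# Hypothetical sketch of vertical partitioning into <Key, Context[1, c]> pairs.
# Field names ("power", "weekday", "season", "holiday") are assumptions.

def make_pairs(records, context_fields):
    """Split each record into a (Key, Context[1, c]) pair."""
    pairs = []
    for rec in records:
        key = rec["power"]                          # demand value to forecast
        context = [rec[f] for f in context_fields]  # the c context-domain values
        pairs.append((key, context))
    return pairs

records = [
    {"power": 61.3, "weekday": 0, "season": 3, "holiday": 0},
    {"power": 58.7, "weekday": 1, "season": 3, "holiday": 0},
]
pairs = make_pairs(records, ["weekday", "season", "holiday"])  # c = 3
```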
Holidays Status | Dataset Type | Notation | Size of Training Set | Window Size
---|---|---|---|---
With holidays | All-day dataset | d1 | 24,948 | 14
With holidays | Seasonal dataset | d2 | 5105 | 5
With holidays | Dataset by day | d3 | 2950 | 14
Without holidays | Weekday dataset | d4 | 14,225 | 10
Without holidays | Seasonal dataset | d5 | 2680–2780 | 5
Without holidays | Dataset by day | d6 | 2240–2280 | 10
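The overlapped-window construction of Section 3.1.2 amounts to sliding a fixed-length window over the demand series with stride 1, so consecutive windows share all but one element. A minimal sketch, with hypothetical demand values and the window size 5 used for the seasonal datasets:

```python
def overlapped_windows(series, window_size):
    """Slide a window of length `window_size` over the series with stride 1;
    consecutive windows overlap in all but one element."""
    return [series[i:i + window_size]
            for i in range(len(series) - window_size + 1)]

daily_demand = [55, 58, 61, 60, 57, 59, 62, 64]   # hypothetical daily values
windows = overlapped_windows(daily_demand, 5)      # window size 5, as for d2/d5
```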
Hyperparameter | Candidate Values
---|---
Filter Size | 1, 8, 16, 32, 64, 128
Kernel Size | 1, 3, 5
Batch Size | 7, 14, 21, 28, 35, 42
Epochs | 10, 30, 50, 70, 80, 100, 120, 140
Optimizer | SGD, RMSprop, Adagrad, Adadelta, Adam, Adamax, Nadam
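The candidate values above define a hyperparameter grid. How the paper actually searched this space is not specified here; the sketch below simply enumerates the full Cartesian product (the model-evaluation step is omitted):

```python
# Sketch: enumerating the hyperparameter grid implied by the table above.
from itertools import product

filter_sizes = [1, 8, 16, 32, 64, 128]
kernel_sizes = [1, 3, 5]
batch_sizes  = [7, 14, 21, 28, 35, 42]
epochs_list  = [10, 30, 50, 70, 80, 100, 120, 140]
optimizers   = ["SGD", "RMSprop", "Adagrad", "Adadelta",
                "Adam", "Adamax", "Nadam"]

grid = list(product(filter_sizes, kernel_sizes, batch_sizes,
                    epochs_list, optimizers))
# 6 * 3 * 6 * 8 * 7 = 6048 candidate configurations
```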
Number of Layers l | MAPE (%) | RRMSE (%)
---|---|---
l = 1 | 3.60 | 4.47
l = 2 | 1.45 | 1.83
l = 3 | 1.85 | 2.08
l = 4 | 2.02 | 2.36
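The MAPE and RRMSE figures reported in the tables can be computed as below. Note the RRMSE here normalizes the RMSE by the mean of the actual values, which is one common convention; the paper's exact normalization is an assumption on our part.

```python
import math

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(a - f) / a
                       for a, f in zip(actual, forecast)) / len(actual)

def rrmse(actual, forecast):
    """Relative RMSE: RMSE divided by the mean of the actuals, in percent.
    (The normalizing convention is an assumption, not taken from the paper.)"""
    mse = sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)
    return 100.0 * math.sqrt(mse) / (sum(actual) / len(actual))

actual   = [100.0, 200.0, 300.0]   # hypothetical demand values
forecast = [110.0, 190.0, 300.0]
```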
Dataset | Season | ARIMA MAPE (%) | (c, l)-LSTM MAPE (%) | S2S LSTM MAPE (%) | Proposed MAPE (%) | ARIMA RRMSE (%) | (c, l)-LSTM RRMSE (%) | S2S LSTM RRMSE (%) | Proposed RRMSE (%)
---|---|---|---|---|---|---|---|---|---
d1 | - | 4.78 | 3.42 | 2.66 | 1.45 | 5.70 | 3.93 | 2.86 | 1.83
d2 | Spring | 4.07 | 2.62 | 3.09 | 2.30 | 4.87 | 2.98 | 3.83 | 2.54
d2 | Summer | 4.32 | 3.13 | 3.51 | 2.35 | 5.09 | 3.78 | 3.64 | 2.57
d2 | Autumn | 5.45 | 2.67 | 2.88 | 2.09 | 6.52 | 3.15 | 3.77 | 2.75
d2 | Winter | 5.17 | 3.20 | 3.96 | 2.77 | 6.04 | 3.69 | 4.52 | 3.19
d3 | - | 3.17 | 3.35 | 1.64 | 0.81 | 3.85 | 3.84 | 2.02 | 1.17
Dataset | Season | ARIMA MAPE (%) | (c, l)-LSTM MAPE (%) | S2S LSTM MAPE (%) | Proposed MAPE (%) | ARIMA RRMSE (%) | (c, l)-LSTM RRMSE (%) | S2S LSTM RRMSE (%) | Proposed RRMSE (%)
---|---|---|---|---|---|---|---|---|---
d4 | - | 2.20 | 2.53 | 1.20 | 1.26 | 2.24 | 2.59 | 1.80 | 2.04
d5 | Spring | 2.21 | 1.59 | 1.38 | 1.00 | 2.25 | 1.91 | 2.19 | 1.15
d5 | Summer | 2.08 | 2.99 | 1.27 | 2.08 | 2.13 | 3.12 | 1.43 | 2.27
d5 | Autumn | 2.37 | 1.76 | 3.10 | 1.27 | 2.43 | 1.92 | 3.54 | 1.64
d5 | Winter | 2.18 | 2.57 | 2.28 | 2.70 | 2.23 | 2.92 | 2.32 | 2.99
d6 | - | 3.61 | 2.34 | 0.84 | 0.82 | 3.85 | 2.44 | 1.40 | 0.90
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Kim, M.; Choi, W.; Jeon, Y.; Liu, L. A Hybrid Neural Network Model for Power Demand Forecasting. Energies 2019, 12, 931. https://doi.org/10.3390/en12050931