A Spiking Neural Network Based Wind Power Forecasting Model for Neuromorphic Devices
Abstract
1. Introduction
2. Methodology
- Build the non-spiking neural model as usual. The network must be designed with the specific requirements of its implementation on Loihi hardware in mind, such as the communication with the chip (the model definition and its conversion are illustrated in the first sketch after this list).
- Train the equivalent rate-based network with the methodology described by Hunsberger and Eliasmith [42], the default method implemented in NengoDL to train SNNs.
- Replace the activation functions with their spiking counterparts. We used spiking Rectified Linear Unit (ReLU) activations for inference. The activation profile of this function is constrained by the discretization required by the Loihi chip [43], leading to discrepancies with respect to the theoretical spiking ReLU activation (Figure 1), and these discrepancies grow at higher firing rates. Furthermore, the Loihi chip can fire at most one spike per timestep, limiting its firing rate to a maximum of 1000 Hz. Off-chip, this constraint does not apply, and multiple spikes could in theory be fired simultaneously, exceeding that value [44].
- Run the network using the NengoDL framework, setting parameters such as the number of timesteps for which each input is presented to the spiking model, giving the network time to settle and spike within that window, and the firing rate scale, which lets the network spike at higher rates (see the simulation sketch after this list). These preliminary results help us monitor the neural activity and tune the parameters of the SNN.
- Once an acceptable model performance is reached, we configure the additional parameters needed to set up the SNN for Loihi and run it either on Loihi hardware or on the emulator [45], which replicates the chip’s behavior. This is achieved with the extra functionality provided by the NengoLoihi library (see the NengoLoihi sketch after this list).
- Collect and evaluate the results. One-step-ahead point predictions are calculated, and the normalized mean absolute error (NMAE) [46] is used to measure the accuracy of these forecasts (its definition is recalled after this list).
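The first three items above can be summarized with a short sketch using the NengoDL Converter. This is a minimal illustration under assumptions, not the exact configuration of this work: the input width, layer sizes, firing rate scale, and synapse value below are placeholders.

```python
import nengo
import nengo_dl
import tensorflow as tf

N_LAGS = 24  # assumed number of lagged inputs per sample

# Step 1: build the non-spiking Keras model as usual (layer sizes illustrative).
inp = tf.keras.Input(shape=(N_LAGS, 1))
conv = tf.keras.layers.Conv1D(32, kernel_size=3, activation=tf.nn.relu)(inp)
dense = tf.keras.layers.Dense(64, activation=tf.nn.relu)(
    tf.keras.layers.Flatten()(conv)
)
out = tf.keras.layers.Dense(1)(dense)
model = tf.keras.Model(inputs=inp, outputs=out)

# Step 2: the rate-based equivalent is trained with NengoDL's default
# training method [42]; trained weights are assumed to be available here.

# Step 3: convert to a Nengo network, swapping ReLU for its spiking
# counterpart and scaling firing rates to reduce discretization error.
converter = nengo_dl.Converter(
    model,
    swap_activations={tf.nn.relu: nengo.SpikingRectifiedLinear()},
    scale_firing_rates=100,  # illustrative firing rate scale
    synapse=0.005,           # low-pass filter that smooths the output spikes
)
net = converter.net
```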
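Running the spiking network in NengoDL then amounts to tiling each input over a number of timesteps and reading the filtered output at the end of the presentation window. The number of timesteps, the minibatch size, and the test data below are placeholder assumptions.

```python
import numpy as np

N_STEPS = 50  # assumed number of timesteps each input is presented for

# Placeholder test inputs, flattened to (batch, features) and repeated in time.
x_test = np.random.rand(16, N_LAGS).astype(np.float32)
tiled = np.tile(x_test[:, None, :], (1, N_STEPS, 1))

with nengo_dl.Simulator(net, minibatch_size=16) as sim:
    # sim.load_params("./trained_params")  # weights from the rate-based training
    outputs = sim.predict({converter.inputs[inp]: tiled})
    # One-step-ahead prediction read at the last presentation timestep.
    y_pred = outputs[converter.outputs[out]][:, -1, 0]
```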
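For the NengoLoihi step, the library provides a simulator that targets either the emulator or the chip itself; the sketch below follows the structure of the Keras-to-Loihi example [44]. Keeping the first layer off-chip and the chosen presentation time are assumptions made for illustration, reusing the hypothetical model above.

```python
import nengo_loihi

PRES_TIME = N_STEPS * 0.001  # seconds each sample is presented (dt = 1 ms)

with net:
    nengo_loihi.add_params(net)  # exposes the per-ensemble on_chip option
    # Keep the first layer off-chip; it handles the communication with the chip.
    net.config[converter.layers[conv].ensemble].on_chip = False
    # Present the test samples one after another.
    converter.inputs[inp].output = nengo.processes.PresentInput(
        x_test, presentation_time=PRES_TIME
    )

# target="sim" uses the emulator [45]; target="loihi" runs on the hardware.
with nengo_loihi.Simulator(net, dt=0.001, target="sim") as loihi_sim:
    loihi_sim.run(PRES_TIME * len(x_test))
    loihi_out = loihi_sim.data[converter.outputs[out]]
```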
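Finally, the NMAE summarizes the one-step-ahead forecast errors. A common convention for wind power data is to normalize the mean absolute error by the installed capacity [46]; assuming that convention, the metric reads

$$ \mathrm{NMAE} = \frac{100}{N\,P_{\mathrm{inst}}} \sum_{t=1}^{N} \left| p_t - \hat{p}_t \right| \; [\%], $$

where $p_t$ is the observed wind power, $\hat{p}_t$ the one-step-ahead forecast, $P_{\mathrm{inst}}$ the installed capacity, and $N$ the number of test samples.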
2.1. ANN-to-SNN Conversion
2.2. Spiking Model Architecture
3. Results
3.1. Synthetic Signal Forecasting
3.2. Load Forecasting
3.3. Wind Power Forecasting
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
ANN | Artificial Neural Network |
CNN | Convolutional Neural Network |
DL | Deep Learning |
ENTSO-E | European Network of Transmission System Operators for Electricity |
FFNN | Feedforward Neural Network |
GRU | Gated Recurrent Unit |
LIF | Leaky Integrate-and-Fire |
LSTM | Long Short-Term Memory |
ML | Machine Learning |
NEF | Neural Engineering Framework |
NMAE | Normalized Mean Absolute Error |
ReLU | Rectified Linear Unit |
SLAYER | Spike Layer Error Reassignment |
SNN | Spiking Neural Network |
VMD | Variational Mode Decomposition |
WPF | Wind Power Forecasting |
References
- Lim, B.; Zohren, S. Time-series forecasting with deep learning: A survey. Philos. Trans. R. Soc. A 2021, 379, 20200209.
- Fawaz, H.I.; Forestier, G.; Weber, J.; Idoumghar, L.; Muller, P.A. Deep learning for time series classification: A review. Data Min. Knowl. Discov. 2019, 33, 917–963.
- Ma, Q.; Zheng, J.; Li, S.; Cottrell, G.W. Learning representations for time series clustering. Adv. Neural Inf. Process. Syst. 2019, 32, 3781–3791.
- Wang, Y.; Zou, R.; Liu, F.; Zhang, L.; Liu, Q. A review of wind speed and wind power forecasting with deep neural networks. Appl. Energy 2021, 304, 117766.
- Marugán, A.P.; Márquez, F.P.G.; Perez, J.M.P.; Ruiz-Hernández, D. A survey of artificial neural network in wind energy systems. Appl. Energy 2018, 228, 1822–1836.
- Rumelhart, D.E.; Durbin, R.; Golden, R.; Chauvin, Y. Backpropagation: The basic theory. In Backpropagation: Theory, Architectures and Applications; Psychology Press: London, UK, 1995; pp. 1–34.
- Wang, R.; Li, C.; Fu, W.; Tang, G. Deep learning method based on gated recurrent unit and variational mode decomposition for short-term wind power interval prediction. IEEE Trans. Neural Netw. Learn. Syst. 2019, 31, 3814–3827.
- Li, C.; Tang, G.; Xue, X.; Chen, X.; Wang, R.; Zhang, C. The short-term interval prediction of wind power using the deep learning model with gradient descend optimization. Renew. Energy 2020, 155, 197–211.
- Yildiz, C.; Acikgoz, H.; Korkmaz, D.; Budak, U. An improved residual-based convolutional neural network for very short-term wind power forecasting. Energy Convers. Manag. 2021, 228, 113731.
- He, Y.; Li, H.; Wang, S.; Yao, X. Uncertainty analysis of wind power probability density forecasting based on cubic spline interpolation and support vector quantile regression. Neurocomputing 2021, 430, 121–137.
- Lahouar, A.; Slama, J.B.H. Hour-ahead wind power forecast based on random forests. Renew. Energy 2017, 109, 529–541.
- Landry, M.; Erlinger, T.P.; Patschke, D.; Varrichio, C. Probabilistic gradient boosting machines for GEFCom2014 wind forecasting. Int. J. Forecast. 2016, 32, 1061–1066.
- Mohammadzaheri, M.; Mirsepahi, A.; Asef-afshar, O.; Koohi, H. Neuro-fuzzy modeling of superheating system of a steam power plant. Appl. Math. Sci. 2007, 1, 2091–2099.
- Mohammadzaheri, M.; AlQallaf, A.; Ghodsi, M.; Ziaiefar, H. Development of a fuzzy model to estimate the head of gaseous petroleum fluids driven by electrical submersible pumps. Fuzzy Inf. Eng. 2018, 10, 99–106.
- Bengio, Y. Learning Deep Architectures for AI; Now Publishers Inc.: Hannover, MA, USA, 2009.
- Wang, K.; Qi, X.; Liu, H.; Song, J. Deep belief network based k-means cluster approach for short-term wind power forecasting. Energy 2018, 165, 840–852.
- Hong, Y.Y.; Rioflorido, C.L.P.P. A hybrid deep learning-based neural network for 24-h ahead wind power forecasting. Appl. Energy 2019, 250, 530–539.
- Putz, D.; Gumhalter, M.; Auer, H. A novel approach to multi-horizon wind power forecasting based on deep neural architecture. Renew. Energy 2021, 178, 494–505.
- Munawar, U.; Wang, Z. A framework of using machine learning approaches for short-term solar power forecasting. J. Electr. Eng. Technol. 2020, 15, 561–569.
- Nam, K.; Hwangbo, S.; Yoo, C. A deep learning-based forecasting model for renewable energy scenarios to guide sustainable energy policy: A case study of Korea. Renew. Sustain. Energy Rev. 2020, 122, 109725.
- González Sopeña, J.; Pakrashi, V.; Ghosh, B. Can we improve short-term wind power forecasts using turbine-level data? A case study in Ireland. In Proceedings of the 2021 IEEE Madrid PowerTech, Madrid, Spain, 28 June–2 July 2021; pp. 1–6.
- González Sopeña, J.; Maury, C.; Pakrashi, V.; Ghosh, B. Turbine-Level Clustering for Improved Short-Term Wind Power Forecasting. In Journal of Physics: Conference Series; IOP Publishing: Bristol, UK, 2022; Volume 2265, p. 022052.
- Wang, X.; Han, Y.; Leung, V.C.; Niyato, D.; Yan, X.; Chen, X. Convergence of edge computing and deep learning: A comprehensive survey. IEEE Commun. Surv. Tutor. 2020, 22, 869–904.
- Li, W.; Yang, T.; Delicato, F.C.; Pires, P.F.; Tari, Z.; Khan, S.U.; Zomaya, A.Y. On enabling sustainable edge computing with renewable energy resources. IEEE Commun. Mag. 2018, 56, 94–101.
- Justus, D.; Brennan, J.; Bonner, S.; McGough, A.S. Predicting the computational cost of deep learning models. In Proceedings of the 2018 IEEE International Conference on Big Data (Big Data), Seattle, WA, USA, 10–13 December 2018; pp. 3873–3882.
- Lin, C.K.; Wild, A.; Chinya, G.N.; Cao, Y.; Davies, M.; Lavery, D.M.; Wang, H. Programming spiking neural networks on Intel’s Loihi. Computer 2018, 51, 52–61.
- Maass, W. Networks of spiking neurons: The third generation of neural network models. Neural Netw. 1997, 10, 1659–1671.
- Davies, M.; Wild, A.; Orchard, G.; Sandamirskaya, Y.; Guerra, G.A.F.; Joshi, P.; Plank, P.; Risbud, S.R. Advancing neuromorphic computing with Loihi: A survey of results and outlook. Proc. IEEE 2021, 109, 911–934.
- Stewart, K.; Orchard, G.; Shrestha, S.B.; Neftci, E. On-chip few-shot learning with surrogate gradient descent on a neuromorphic processor. In Proceedings of the 2020 2nd IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS), Genova, Italy, 31 August–2 September 2020; pp. 223–227.
- Tavanaei, A.; Maida, A. BP-STDP: Approximating backpropagation using spike timing dependent plasticity. Neurocomputing 2019, 330, 39–47.
- Bellec, G.; Scherr, F.; Subramoney, A.; Hajek, E.; Salaj, D.; Legenstein, R.; Maass, W. A solution to the learning dilemma for recurrent networks of spiking neurons. Nat. Commun. 2020, 11, 3625.
- Kasabov, N.; Scott, N.M.; Tu, E.; Marks, S.; Sengupta, N.; Capecci, E.; Othman, M.; Doborjeh, M.G.; Murli, N.; Hartono, R.; et al. Evolving spatio-temporal data machines based on the NeuCube neuromorphic framework: Design methodology and selected applications. Neural Netw. 2016, 78, 1–14.
- Diehl, P.U.; Neil, D.; Binas, J.; Cook, M.; Liu, S.C.; Pfeiffer, M. Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing. In Proceedings of the 2015 International Joint Conference on Neural Networks (IJCNN), Killarney, Ireland, 12–17 July 2015; pp. 1–8.
- Taherkhani, A.; Belatreche, A.; Li, Y.; Maguire, L.P. DL-ReSuMe: A delay learning-based remote supervised method for spiking neurons. IEEE Trans. Neural Netw. Learn. Syst. 2015, 26, 3137–3149.
- Bekolay, T.; Bergstra, J.; Hunsberger, E.; DeWolf, T.; Stewart, T.C.; Rasmussen, D.; Choo, X.; Voelker, A.; Eliasmith, C. Nengo: A Python tool for building large-scale functional brain models. Front. Neuroinform. 2014, 7, 48.
- Eliasmith, C.; Anderson, C.H. Neural Engineering: Computation, Representation, and Dynamics in Neurobiological Systems; MIT Press: Cambridge, MA, USA, 2003.
- Rasmussen, D. NengoDL: Combining deep learning and neuromorphic modelling methods. Neuroinformatics 2019, 17, 611–628.
- Abadi, M.; Agarwal, A.; Barham, P.; Brevdo, E.; Chen, Z.; Citro, C.; Corrado, G.S.; Davis, A.; Dean, J.; Devin, M.; et al. TensorFlow: Large-Scale Machine Learning on Heterogeneous Systems. 2015. Available online: http://tensorflow.org (accessed on 27 September 2022).
- Shrestha, S.B.; Orchard, G. SLAYER: Spike layer error reassignment in time. In Proceedings of the Advances in Neural Information Processing Systems, Montreal, QC, Canada, 3–8 December 2018; pp. 1412–1421.
- Intel’s Neuromorphic Computing Lab. Lava: A Software Framework for Neuromorphic Computing. 2021. Available online: https://github.com/lava-nc/lava (accessed on 25 March 2022).
- Davies, M.; Srinivasa, N.; Lin, T.H.; Chinya, G.; Cao, Y.; Choday, S.H.; Dimou, G.; Joshi, P.; Imam, N.; Jain, S.; et al. Loihi: A neuromorphic manycore processor with on-chip learning. IEEE Micro 2018, 38, 82–99.
- Hunsberger, E.; Eliasmith, C. Training spiking deep networks for neuromorphic hardware. arXiv 2016, arXiv:1611.05141.
- DeWolf, T.; Jaworski, P.; Eliasmith, C. Nengo and low-power AI hardware for robust, embedded neurorobotics. Front. Neurorobot. 2020, 14, 568359.
- Applied Brain Research. Converting a Keras Model to an SNN on Loihi. 2021. Available online: https://www.nengo.ai/nengo-loihi/v1.0.0/examples/keras-to-loihi.html (accessed on 20 April 2022).
- Voelker, A.R.; Eliasmith, C. Programming neuromorphics using the Neural Engineering Framework. In Handbook of Neuroengineering; Springer Nature: Singapore, 2020; pp. 1–43.
- González Sopeña, J.; Pakrashi, V.; Ghosh, B. An overview of performance evaluation metrics for short-term statistical wind power forecasting. Renew. Sustain. Energy Rev. 2021, 138, 110515.
- Tavanaei, A.; Ghodrati, M.; Kheradpisheh, S.R.; Masquelier, T.; Maida, A. Deep learning in spiking neural networks. Neural Netw. 2019, 111, 47–63.
- Cao, Y.; Chen, Y.; Khosla, D. Spiking deep convolutional neural networks for energy-efficient object recognition. Int. J. Comput. Vis. 2015, 113, 54–66.
- Schmidt-Hieber, J. Nonparametric regression using deep neural networks with ReLU activation function. Ann. Stat. 2020, 48, 1875–1897.
- González Sopeña, J.M.; Pakrashi, V.; Ghosh, B. Decomposition-based hybrid models for very short-term wind power forecasting. Eng. Proc. 2021, 5, 39.
- Dragomiretskiy, K.; Zosso, D. Variational mode decomposition. IEEE Trans. Signal Process. 2013, 62, 531–544.
- Tang, L.; Lv, H.; Yang, F.; Yu, L. Complexity testing techniques for time series data: A comprehensive literature review. Chaos Solitons Fractals 2015, 81, 117–135.
- Ren, Y.; Suganthan, P.; Srikanth, N. A comparative study of empirical mode decomposition-based short-term wind speed forecasting methods. IEEE Trans. Sustain. Energy 2014, 6, 236–244.
- Hong, T.; Fan, S. Probabilistic electric load forecasting: A tutorial review. Int. J. Forecast. 2016, 32, 914–938.
- Quan, H.; Srinivasan, D.; Khosravi, A. Short-term load and wind power forecasting using neural network-based prediction intervals. IEEE Trans. Neural Netw. Learn. Syst. 2013, 25, 303–315.
- Sadaei, H.J.; e Silva, P.C.d.L.; Guimarães, F.G.; Lee, M.H. Short-term load forecasting by using a combined method of convolutional neural networks and fuzzy time series. Energy 2019, 175, 365–377.
- ENTSO-E. Hourly Load Demand Data. 2021. Available online: https://www.entsoe.eu/data/power-stats/ (accessed on 20 April 2022).
- Chollet, F. Keras. 2015. Available online: https://github.com/fchollet/keras (accessed on 20 April 2022).
- Patel, K.; Hunsberger, E.; Batir, S.; Eliasmith, C. A spiking neural network for image segmentation. arXiv 2021, arXiv:2106.08921.
- Tan, C.; Šarlija, M.; Kasabov, N. Spiking neural networks: Background, recent development and the NeuCube architecture. Neural Process. Lett. 2020, 52, 1675–1701.
- Kasabov, N.K. Time-Space, Spiking Neural Networks and Brain-Inspired Artificial Intelligence; Springer: Berlin, Germany, 2019.
- Davies, M. Benchmarks for progress in neuromorphic computing. Nat. Mach. Intell. 2019, 1, 386–388.
- Orchard, G.; Frady, E.P.; Rubin, D.B.D.; Sanborn, S.; Shrestha, S.B.; Sommer, F.T.; Davies, M. Efficient Neuromorphic Signal Processing with Loihi 2. In Proceedings of the 2021 IEEE Workshop on Signal Processing Systems (SiPS), Coimbra, Portugal, 19–21 October 2021; pp. 254–259.
Mode | Neuron Type | Spiking Amplitude | Firing Rate Scale |
---|---|---|---|
Mode 1 | Spiking ReLU | 0.1 | 25 |
Mode 2 | Spiking ReLU | 0.05 | 40 |
Mode 3 | Spiking ReLU | 0.01 | 70 |
Mode 4 | Spiking ReLU | 0.1 | 90 |
Mode 5 | Spiking ReLU | 0.3 | 200 |
Mode 6 | Spiking ReLU | 0.3 | 200 |
Mode 7 | Spiking ReLU | 0.5 | 400 |
Mode 8 | Spiking ReLU | 1.5 | 500 |
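The per-mode spiking amplitude and firing rate scale above map naturally onto the conversion settings, assuming the table's "spiking amplitude" corresponds to the amplitude parameter of the spiking neuron type in NengoDL. A minimal sketch for Mode 1, reusing the hypothetical model and synapse value from the earlier sketches, would be:

```python
# Mode 1 settings taken from the table above; "model" and the synapse value
# are the assumed placeholders introduced in the earlier sketches.
mode1_converter = nengo_dl.Converter(
    model,
    swap_activations={
        tf.nn.relu: nengo.SpikingRectifiedLinear(amplitude=0.1)  # spiking amplitude
    },
    scale_firing_rates=25,  # firing rate scale for Mode 1
    synapse=0.005,
)
```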
Mode | Off-Chip Layer | Conv Layer | Dense Layer |
---|---|---|---|
Mode 1 | 8.1 | 8.3 | 12.0 |
Mode 2 | 1.9 | 1.8 | 2.1 |
Mode 3 | 7.2 | 4.9 | 2.5 |
Mode 4 | 1.6 | 1.2 | 1.0 |
Mode 5 | 1.2 | 1.3 | 1.0 |
Mode 6 | 1.1 | 1.0 | 1.0 |
Mode 7 | 1.4 | 1.1 | 1.0 |
Mode 8 | 3.7 | 9.6 | 11.5 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).