Performance Comparison of Deep Learning Approaches in Predicting EV Charging Demand
Abstract
1. Introduction
2. Literature Review
Deep Learning Models
3. Materials and Methods
3.1. Charging Demand Prediction Framework
3.1.1. Recurrent Neural Networks (RNNs)
3.1.2. Long Short-Term Memory (LSTM)
3.1.3. Bidirectional LSTM (Bi-LSTM)
3.1.4. Gated Recurrent Unit (GRU)
3.1.5. Convolutional Neural Networks (CNNs)
3.1.6. Transformers
3.2. Training Objective
3.3. Data
3.4. Data Preparation
4. Discussion
Model Performance
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
Model | Layers | Hidden Dimensions | Heads | Epochs | Daily RMSE | Daily MAE | Weekly RMSE | Weekly MAE | Monthly RMSE | Monthly MAE |
---|---|---|---|---|---|---|---|---|---|---
Transformer | 6 | 64 | 8 | 100 | 0.191 | 0.133 | 0.132 | 0.109 | 0.292 | 0.244 |
Transformer | 3 | 128 | 8 | 100 | 0.101 | 0.080 | 0.210 | 0.167 | 0.243 | 0.181 |
Transformer | 6 | 64 | 1 | 50 | 0.073 | 0.062 | 0.151 | 0.125 | 0.284 | 0.262 |
Transformer | 6 | 64 | 8 | 50 | 0.174 | 0.121 | 0.212 | 0.164 | 0.485 | 0.375 |
Transformer | 3 | 32 | 8 | 200 | 0.112 | 0.083 | 0.117 | 0.091 | 0.283 | 0.181 |
Transformer | 3 | 64 | 1 | 200 | 0.130 | 0.123 | 0.087 | 0.061 | 0.072 | 0.061 |
Transformer | 3 | 128 | 8 | 50 | 0.240 | 0.143 | 0.334 | 0.271 | 0.485 | 0.364 |
Transformer | 6 | 64 | 8 | 10 | 0.280 | 0.187 | 0.425 | 0.323 | 0.971 | 0.813 |
LSTM | 6 | 64 | 8 | 100 | 0.301 | 0.225 | 0.322 | 0.229 | 0.336 | 0.256 |
LSTM | 3 | 128 | 8 | 100 | 0.303 | 0.236 | 0.343 | 0.231 | 0.297 | 0.228 |
LSTM | 6 | 64 | 1 | 50 | 0.324 | 0.248 | 0.328 | 0.241 | 0.421 | 0.325 |
LSTM | 6 | 64 | 8 | 50 | 0.321 | 0.246 | 0.324 | 0.249 | 0.386 | 0.290 |
LSTM | 3 | 32 | 8 | 200 | 0.308 | 0.226 | 0.326 | 0.226 | 0.349 | 0.253 |
LSTM | 3 | 64 | 1 | 200 | 0.314 | 0.234 | 0.313 | 0.230 | 0.271 | 0.206 |
LSTM | 3 | 128 | 8 | 50 | 0.314 | 0.236 | 0.341 | 0.224 | 0.289 | 0.210 |
LSTM | 6 | 64 | 8 | 10 | 0.300 | 0.222 | 0.519 | 0.408 | 0.690 | 0.546 |
RNN | 6 | 64 | 8 | 100 | 0.383 | 0.210 | 0.390 | 0.298 | 0.221 | 0.163 |
RNN | 3 | 128 | 8 | 100 | 0.413 | 0.257 | 0.351 | 0.268 | 0.200 | 0.150 |
RNN | 6 | 64 | 1 | 50 | 0.407 | 0.248 | 0.414 | 0.322 | 0.287 | 0.211 |
RNN | 6 | 64 | 8 | 50 | 0.466 | 0.248 | 0.427 | 0.328 | 0.315 | 0.231 |
RNN | 3 | 32 | 8 | 200 | 0.467 | 0.320 | 0.394 | 0.291 | 0.240 | 0.186 |
RNN | 3 | 64 | 1 | 200 | 0.384 | 0.213 | 0.372 | 0.277 | 0.209 | 0.157 |
RNN | 3 | 128 | 8 | 50 | 0.574 | 0.420 | 0.381 | 0.290 | 0.244 | 0.197 |
RNN | 6 | 64 | 8 | 10 | 0.441 | 0.288 | 0.609 | 0.467 | 0.670 | 0.579 |
Bi-LSTM | 3 | 128 | 8 | 100 | 0.244 | 0.163 | 0.167 | 0.122 | 0.200 | 0.162 |
Bi-LSTM | 6 | 64 | 1 | 50 | 0.349 | 0.248 | 0.213 | 0.161 | 0.221 | 0.176 |
Bi-LSTM | 6 | 64 | 8 | 50 | 0.341 | 0.241 | 0.211 | 0.160 | 0.223 | 0.175 |
Bi-LSTM | 3 | 32 | 8 | 200 | 0.384 | 0.286 | 0.229 | 0.171 | 0.242 | 0.197 |
Bi-LSTM | 3 | 64 | 1 | 200 | 0.401 | 0.298 | 0.234 | 0.174 | 0.251 | 0.196 |
Bi-LSTM | 3 | 128 | 8 | 50 | 0.226 | 0.150 | 0.161 | 0.120 | 0.193 | 0.160 |
Bi-LSTM | 6 | 64 | 8 | 10 | 0.472 | 0.359 | 0.329 | 0.254 | 0.405 | 0.340 |
Bi-LSTM | 6 | 128 | 8 | 100 | 0.313 | 0.220 | 0.207 | 0.158 | 0.222 | 0.173 |
GRU | 3 | 128 | 8 | 100 | 0.313 | 0.228 | 0.305 | 0.227 | 0.189 | 0.156 |
GRU | 6 | 64 | 1 | 50 | 0.390 | 0.285 | 0.317 | 0.237 | 0.209 | 0.170 |
GRU | 6 | 64 | 8 | 50 | 0.392 | 0.289 | 0.321 | 0.243 | 0.251 | 0.175 |
GRU | 3 | 32 | 8 | 200 | 0.437 | 0.326 | 0.345 | 0.260 | 0.247 | 0.198 |
GRU | 3 | 64 | 1 | 200 | 0.434 | 0.324 | 0.340 | 0.258 | 0.294 | 0.230 |
GRU | 3 | 128 | 8 | 50 | 0.317 | 0.231 | 0.308 | 0.257 | 0.200 | 0.164 |
GRU | 6 | 64 | 8 | 10 | 0.513 | 0.389 | 0.563 | 0.443 | 0.473 | 0.388 |
GRU | 6 | 128 | 8 | 100 | 0.391 | 0.289 | 0.312 | 0.233 | 0.206 | 0.170 |
CNN | 3 | 128 | 8 | 100 | 0.664 | 0.468 | 0.669 | 0.519 | 0.451 | 0.337 |
CNN | 6 | 64 | 1 | 50 | 0.684 | 0.484 | 0.727 | 0.572 | 0.469 | 0.352 |
CNN | 6 | 64 | 8 | 50 | 0.638 | 0.447 | 0.687 | 0.539 | 0.445 | 0.354 |
CNN | 3 | 32 | 8 | 200 | 0.655 | 0.465 | 0.687 | 0.544 | 0.482 | 0.367 |
CNN | 3 | 64 | 1 | 200 | 0.644 | 0.453 | 0.669 | 0.529 | 0.458 | 0.345 |
CNN | 3 | 128 | 8 | 50 | 0.684 | 0.484 | 0.678 | 0.531 | 0.445 | 0.334 |
CNN | 6 | 64 | 8 | 10 | 0.648 | 0.486 | 0.683 | 0.545 | 0.468 | 0.358 |
CNN | 6 | 128 | 8 | 100 | 0.609 | 0.410 | 0.670 | 0.528 | 0.475 | 0.377 |
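For reference, the RMSE and MAE values reported in the table above (and in the per-horizon result tables) follow the standard definitions, where \(y_i\) is the observed charging demand, \(\hat{y}_i\) the predicted value, and \(n\) the number of test samples:

$$\mathrm{RMSE}=\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i-\hat{y}_i\right)^{2}},\qquad \mathrm{MAE}=\frac{1}{n}\sum_{i=1}^{n}\left|y_i-\hat{y}_i\right|$$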
References
- Tribioli, L. Energy-based design of powertrain for a re-engineered post-transmission hybrid electric vehicle. Energies 2017, 10, 918. [Google Scholar] [CrossRef] [Green Version]
- Habib, S.; Khan, M.M.; Abbas, F.; Sang, L.; Shahid, M.U.; Tang, H. A comprehensive study of implemented international standards, technical challenges, impacts and prospects for electric vehicles. IEEE Access 2018, 6, 13866–13890. [Google Scholar] [CrossRef]
- Daina, N.; Sivakumar, A.; Polak, J.W. Modelling electric vehicles use: A survey on the methods. Renew. Sustain. Energy Rev. 2017, 68, 447–460. [Google Scholar] [CrossRef] [Green Version]
- Wu, X.; Hu, X.; Yin, X.; Moura, S.J. Stochastic optimal energy management of smart home with PEV energy storage. IEEE Trans. Smart Grid 2016, 9, 2065–2075. [Google Scholar] [CrossRef]
- Toquica, D.; De Oliveira-De Jesus, P.M.; Cadena, A.I. Power market equilibrium considering an ev storage aggregator exposed to marginal prices-a bilevel optimization approach. J. Energy Storage 2020, 28, 101267. [Google Scholar] [CrossRef]
- Al Mamun, A.; Sohel, M.; Mohammad, N.; Sunny, S.H.; Dipta, D.R.; Hossain, E. A comprehensive review of the load forecasting techniques using single and hybrid predictive models. IEEE Access 2020, 8, 134911–134939. [Google Scholar] [CrossRef]
- Speidel, S.; Bräunl, T. Driving and charging patterns of electric vehicles for energy usage. Renew. Sustain. Energy Rev. 2014, 40, 97–110. [Google Scholar] [CrossRef]
- Xu, M.; Meng, Q.; Liu, K.; Yamamoto, T. Joint charging mode and location choice model for battery electric vehicle users. Transp. Res. Part B Methodol. 2017, 103, 68–86. [Google Scholar] [CrossRef]
- Frades, M. A Guide to the Lessons Learned from the Clean Cities Community Electric Vehicle Readiness Projects. 2014. Available online: https://afdc.energy.gov/files/u/publication/guide_ev_projects.pdf (accessed on 4 February 2023).
- He, F.; Wu, D.; Yin, Y.; Guan, Y. Optimal deployment of public charging stations for plug-in hybrid electric vehicles. Transp. Res. Part B Methodol. 2013, 47, 87–101. [Google Scholar] [CrossRef]
- Caliwag, A.C.; Lim, W. Hybrid VARMA and LSTM method for lithium-ion battery state-of-charge and output voltage forecasting in electric motorcycle applications. IEEE Access 2019, 7, 59680–59689. [Google Scholar] [CrossRef]
- Hinton, G.E.; Osindero, S.; Teh, Y.-W. A fast learning algorithm for deep belief nets. Neural Comput. 2006, 18, 1527–1554. [Google Scholar] [CrossRef]
- Kumar, K.N.; Cheah, P.H.; Sivaneasan, B.; So, P.L.; Wang, D.Z.W. Electric vehicle charging profile prediction for efficient energy management in buildings. In Proceedings of the 10th International Power & Energy Conference (IPEC), Ho Chi Minh City, Vietnam, 12–14 December 2012; pp. 480–485. [Google Scholar]
- Medsker, L.; Jain, L.C. Recurrent Neural Networks: Design and Applications; CRC Press: Boca Raton, FL, USA, 1999. [Google Scholar]
- Vermaak, J.; Botha, E. Recurrent neural networks for short-term load forecasting. IEEE Trans. Power Syst. 1998, 13, 126–132. [Google Scholar] [CrossRef]
- Marino, D.L.; Amarasinghe, K.; Manic, M. Building energy load forecasting using deep neural networks. In Proceedings of the IECON 2016-42nd Annual Conference of the IEEE Industrial Electronics Society, IEEE, Florence, Italy, 23–26 October 2016. [Google Scholar]
- Kong, W.; Dong, Z.Y.; Jia, Y.; Hill, D.J.; Xu, Y.; Zhang, Y. Short-term residential load forecasting based on LSTM recurrent neural network. IEEE Trans. Smart Grid 2017, 10, 841–851. [Google Scholar] [CrossRef]
- Lu, F.; Lv, J.; Zhang, Y.; Liu, H.; Zheng, S.; Li, Y.; Hong, M. Ultra-Short-Term Prediction of EV Aggregator’s Demand Response Flexibility Using ARIMA, Gaussian-ARIMA, LSTM and Gaussian-LSTM. In Proceedings of the 2021 3rd International Academic Exchange Conference on Science and Technology Innovation (IAECST), IEEE, Guangzhou, China, 10–12 December 2021. [Google Scholar]
- Zhu, J.; Yang, Z.; Chang, Y.; Guo, Y.; Zhu, K.; Zhang, J. A novel LSTM based deep learning approach for multi-time scale electric vehicles charging load prediction. In Proceedings of the 2019 IEEE Innovative Smart Grid Technologies-Asia (ISGT Asia), Chengdu, China, 21–24 May 2019. [Google Scholar]
- Gao, Q.; Zhu, T.; Zhou, W.; Wang, G.; Zhang, T.; Zhang, Z.; Waseem, M.; Liu, S.; Han, C.; Lin, Z. Charging load forecasting of electric vehicle based on Monte Carlo and deep learning. In Proceedings of the 2019 IEEE Sustainable Power and Energy Conference (iSPEC), IEEE, Beijing, China, 21–23 November 2019. [Google Scholar]
- Zheng, J.; Xu, C.; Zhang, Z.; Li, X. Electric load forecasting in smart grids using long-short-term-memory based recurrent neural network. In Proceedings of the 2017 51st Annual Conference on Information Sciences and Systems (CISS), IEEE, Baltimore, MD, USA, 22–24 March 2017. [Google Scholar]
- Yan, K.; Wang, X.; Du, Y.; Jin, N.; Huang, H.; Zhou, H. Multi-step short-term power consumption forecasting with a hybrid deep learning strategy. Energies 2018, 11, 3089. [Google Scholar] [CrossRef] [Green Version]
- Chang, M.; Bae, S.; Cha, G.; Yoo, J. Aggregated electric vehicle fast-charging power demand analysis and forecast based on LSTM neural network. Sustainability 2021, 13, 13783. [Google Scholar] [CrossRef]
- Zhu, J.; Yang, Z.; Guo, Y.; Zhang, J.; Yang, H. Short-term load forecasting for electric vehicle charging stations based on deep learning approaches. Appl. Sci. 2019, 9, 1723. [Google Scholar] [CrossRef] [Green Version]
- Bengio, Y.; Simard, P.; Frasconi, P. Learning long-term dependencies with gradient descent is difficult. IEEE Trans. Neural Netw. 1994, 5, 157–166. [Google Scholar] [CrossRef]
- Mohsenimanesh, A.; Entchev, E.; Lapouchnian, A.; Ribberink, H. A Comparative Study of Deep Learning Approaches for Day-Ahead Load Forecasting of an Electric Car Fleet. In International Conference on Database and Expert Systems Applications; Springer: Berlin/Heidelberg, Germany, 2021. [Google Scholar]
- Di Persio, L.; Honchar, O. Analysis of recurrent neural networks for short-term energy load forecasting. In AIP Conference Proceedings; AIP Publishing LLC: Melville, NY, USA, 2017. [Google Scholar]
- Sadaei, H.J.; Silva, P.C.D.L.E.; Guimarães, F.G.; Lee, M.H. Short-term load forecasting by using a combined method of convolutional neural networks and fuzzy time series. Energy 2019, 175, 365–377. [Google Scholar] [CrossRef]
- Li, Y.; Huang, Y.; Zhang, M. Short-term load forecasting for electric vehicle charging station based on niche immunity lion algorithm and convolutional neural network. Energies 2018, 11, 1253. [Google Scholar] [CrossRef] [Green Version]
- Ahmed, S.; Nielsen, I.E.; Tripathi, A.; Siddiqui, S.; Rasool, G.; Ramachandran, R.P. Transformers in Time-series Analysis: A Tutorial. arXiv 2022, arXiv:2205.01138. [Google Scholar]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. arXiv 2017, arXiv:1706.03762v5. [Google Scholar]
- Devlin, J.; Chang, M.W.; Lee, K.; Toutanova, K. Bert: Pre-training of deep bidirectional transformers for language understanding. arXiv 2018, arXiv:1810.04805. [Google Scholar]
- Dong, L.; Xu, S.; Xu, B. Speech-transformer: A no-recurrence sequence-to-sequence model for speech recognition. In Proceedings of the 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), IEEE, Calgary, AB, Canada, 15–20 April 2018. [Google Scholar]
- Liu, Y.; Zhang, J.; Fang, L.; Jiang, Q.; Zhou, B. Multimodal motion prediction with stacked transformers. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021. [Google Scholar]
- Koohfar, S.; Woldemariam, W.; Kumar, A. Prediction of Electric Vehicles Charging Demand: A Transformer-Based Deep Learning Approach. Sustainability 2023, 15, 2105. [Google Scholar] [CrossRef]
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef]
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
- Schuster, M.; Paliwal, K.K. Bidirectional recurrent neural networks. IEEE Trans. Signal Process. 1997, 45, 2673–2681. [Google Scholar] [CrossRef] [Green Version]
- Cho, K.; Van Merriënboer, B.; Bahdanau, D.; Bengio, Y. On the properties of neural machine translation: Encoder-decoder approaches. arXiv 2014, arXiv:1409.1259. [Google Scholar]
- Gated Recurrent Unit Networks. 23 May 2022. Available online: https://www.geeksforgeeks.org/gated-recurrent-unit-networks/ (accessed on 23 January 2023).
- Abdel-Hamid, O.; Mohamed, A.R.; Jiang, H.; Deng, L.; Penn, G.; Yu, D. Convolutional neural networks for speech recognition. IEEE/ACM Trans. Audio Speech Lang. Process. 2014, 22, 1533–1545. [Google Scholar] [CrossRef] [Green Version]
- Bahdanau, D.; Cho, K.; Bengio, Y. Neural machine translation by jointly learning to align and translate. arXiv 2014, arXiv:1409.0473. [Google Scholar]
- City of Boulder Open Data. Datasets. Available online: https://open-data.bouldercolorado.gov/datasets/ (accessed on 20 December 2022).
- Ketkar, N. Introduction to Keras. In Deep Learning with Python: A Hands-On Introduction; Springer: Berlin/Heidelberg, Germany, 2017; pp. 97–111. [Google Scholar]
- Abadi, M. TensorFlow: Learning functions at scale. In Proceedings of the 21st ACM SIGPLAN International Conference on Functional Programming, Nara, Japan, 18–22 September 2016. [Google Scholar]
Level | Voltage | Power Limit (kW) | Charging Speed | Charging Time (h) |
---|---|---|---|---|
Level 1 AC | 110/120 VAC | ≤3.3 | Slow | 4–36 |
Level 2 AC | 208/240 VAC | 3.3–20 | Medium | 1–4 |
Level 3 (DC fast charging) | 440 or 480 VAC | >50 | Fast | 0.2–1 |
Model | Key Characteristics |
---|---|
RNN | Processes sequential data by passing a hidden state from one time step to the next, allowing dependencies between steps to be captured [36]. |
LSTM | An extension of the RNN that adds memory cells and gates to control information flow, mitigating the vanishing gradient problem and capturing long-term dependencies [37]. |
Bi-LSTM | A bidirectional variant of the LSTM that processes the sequence in both the forward and backward directions and combines the results, capturing dependencies from both past and future context [38]. |
GRU | A variant of the RNN that uses two gates (update and reset) to control information flow, typically converging faster than a traditional RNN [39]. |
CNN | Designed to process grid-structured data (e.g., images) by extracting local features from the input with convolutional layers [41]. |
Transformer | Uses self-attention to process sequential data, parallelizes efficiently, and handles much longer sequences than RNNs [31]. |
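To make these architectural differences concrete, the sketch below shows how the recurrent variants in the table could be assembled in Keras. This is not the authors' implementation: the window length, layer sizes, and training settings are illustrative placeholders, and the helper name `build_recurrent_model` is hypothetical.

```python
# Minimal sketch of the recurrent forecasters compared above (illustrative only).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_recurrent_model(kind: str, window: int = 30, n_features: int = 1,
                          hidden_dim: int = 64, n_layers: int = 3) -> tf.keras.Model:
    """Stack n_layers recurrent layers of the given kind over a demand window."""
    base = {"rnn": layers.SimpleRNN, "lstm": layers.LSTM,
            "bilstm": layers.LSTM, "gru": layers.GRU}[kind]
    inputs = layers.Input(shape=(window, n_features))
    x = inputs
    for i in range(n_layers):
        rnn = base(hidden_dim, return_sequences=(i < n_layers - 1))
        # Bi-LSTM reads the window in both directions and concatenates the results.
        x = layers.Bidirectional(rnn)(x) if kind == "bilstm" else rnn(x)
    outputs = layers.Dense(1)(x)  # next-step charging demand
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

# Example: a 3-layer LSTM forecaster over 30-step windows of scaled demand.
lstm_model = build_recurrent_model("lstm")
```

A Transformer counterpart would replace the recurrent stack with positional encodings and `layers.MultiHeadAttention` blocks, which is what lets it attend to the whole input window in parallel rather than step by step [31].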
Statistical Parameters | EV Charging Load (kWh) |
---|---|
Mean | 8.12 |
Std | 8.03 |
Min | 0.001 |
25% | 2.92 |
50% | 6.23 |
75% | 11.03 |
Max | 85.5 |
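The per-session statistics above can be reproduced from the published Boulder charging records in a few lines; the file and column names below are assumptions about the exported CSV, not the authors' exact code.

```python
# Sketch: per-session charging-energy summary statistics (file/column names assumed).
import pandas as pd

sessions = pd.read_csv("boulder_ev_charging.csv")  # hypothetical export of the open dataset
print(sessions["Energy_kWh"].describe(percentiles=[0.25, 0.50, 0.75]))  # mean, std, min, quartiles, max
```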
Parameters | Type | Description |
---|---|---|
Charging demand | Numeric | The energy consumed during charging sessions, in kWh/day |
Weekday | Categorical | Mon, Tue, Wed, Thu, Fri, Sat, and Sun |
Month | Categorical | Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, and Dec |
Min temperature | Numeric | Minimum temperature in Fahrenheit (°F) |
Max temperature | Numeric | Maximum temperature in Fahrenheit (°F) |
Snow | Numeric | In millimeters/day |
Precipitation | Numeric | In millimeters/day |
Parameters | Type | Description |
---|---|---|
Charging demand | Numeric | Weekly average energy consumed, in kWh/week |
Month | Categorical | Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, and Dec |
Min temperature | Numeric | Weekly average of minimum temperature in Fahrenheit (°F) |
Max temperature | Numeric | Weekly average of maximum temperature in Fahrenheit (°F) |
Snow | Numeric | Weekly average in millimeters/week |
Precipitation | Numeric | Weekly average in millimeters per week |
Parameters | Type | Description |
---|---|---|
Charging demand | Numeric | Monthly average energy consumed, in kWh/month |
Month | Categorical | Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, and Dec |
Min temperature | Numeric | Monthly average of minimum temperature in Fahrenheit (°F) |
Max temperature | Numeric | Monthly average of maximum temperature in Fahrenheit (°F) |
Snow | Numeric | Monthly average in millimeters/month |
Precipitation | Numeric | Monthly average in millimeters per month |
Aggregated Data | Number of Records | Min Energy Usage | Max Energy Usage |
---|---|---|---
Daily EV charging | 1668 | 0.74 kWh/day | 729.0 kWh/day |
Weekly EV charging | 239 | 144.75 kWh/week | 4107.52 kWh/week |
Monthly EV charging | 55 | 1015.72 kWh/month | 15,493.44 kWh/month |
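A plausible way to produce the three aggregation levels and the calendar features listed above is to resample the session records on their start timestamps. The sketch below uses hypothetical file and column names and sums energy per period; since the weekly/monthly descriptions above say "average", taking `.mean()` over the daily series is the alternative reading.

```python
# Sketch: aggregate session energy to daily / weekly / monthly demand series.
# File and column names are assumptions, not the authors' code.
import pandas as pd

sessions = pd.read_csv("boulder_ev_charging.csv", parse_dates=["Start_Date"])
sessions = sessions.set_index("Start_Date")

daily = sessions["Energy_kWh"].resample("D").sum()      # kWh/day
weekly = sessions["Energy_kWh"].resample("W").sum()     # kWh/week
monthly = sessions["Energy_kWh"].resample("MS").sum()   # kWh/month

# Calendar covariates (weekday applies only at the daily level).
features = daily.to_frame("demand_kwh")
features["weekday"] = features.index.day_name()
features["month"] = features.index.month_name()

# Min/max temperature, snow, and precipitation would be merged on the date index
# from a separate weather source, then everything min-max scaled before training.
```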
Hyperparameters | Range |
---|---|
Hidden dimension | 32, 64, 128 |
Number of epochs | 10, 50, 100, 200 |
Number of layers | 1, 3, 6 |
Number of heads | 1, 8 |
Activation function | ReLU |
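The search over this grid can be written as a plain exhaustive loop. The sketch below reuses the illustrative `build_recurrent_model` helper from the earlier example, assumes pre-windowed arrays `X_train`, `y_train`, `X_test`, and `y_test` already exist, and omits the number-of-heads dimension, which applies only to the Transformer.

```python
# Sketch: exhaustive search over the hyperparameter grid above, scored by RMSE/MAE
# on a held-out test window. Data arrays and the model builder are assumed.
import itertools
import numpy as np

grid = {"hidden_dim": [32, 64, 128],
        "epochs": [10, 50, 100, 200],
        "n_layers": [1, 3, 6]}

results = []
for hidden_dim, epochs, n_layers in itertools.product(*grid.values()):
    model = build_recurrent_model("lstm", hidden_dim=hidden_dim, n_layers=n_layers)
    model.fit(X_train, y_train, epochs=epochs, verbose=0)
    pred = model.predict(X_test, verbose=0).ravel()
    results.append({
        "hidden_dim": hidden_dim, "layers": n_layers, "epochs": epochs,
        "rmse": float(np.sqrt(np.mean((y_test - pred) ** 2))),
        "mae": float(np.mean(np.abs(y_test - pred))),
    })

best = min(results, key=lambda r: r["rmse"])  # lowest-RMSE configuration
```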
Models | No. of Heads | No. of Hidden Dimensions | No. of Layers | Number of Epochs | RMSE | MAE |
---|---|---|---|---|---|---
Transformer | 1 | 64 | 6 | 50 | 0.07 | 0.06 |
LSTM | – | 128 | 6 | 100 | 0.382 | 0.211 |
RNN | – | 64 | 3 | 200 | 0.363 | 0.265 |
Bi-LSTM | – | 64 | 3 | 200 | 0.366 | 0.268 |
CNN | – | 32 | 1 | 200 | 0.609 | 0.410 |
GRU | – | 64 | 3 | 200 | 0.313 | 0.228 |
Models | No. of Heads | No. of Hidden Dimensions | No. of Layers | Number of Epochs | RMSE | MAE |
---|---|---|---|---|---|---
Transformer | 1 | 64 | 3 | 200 | 0.08 | 0.06 |
LSTM | – | 128 | 3 | 50 | 0.30 | 0.22 |
RNN | – | 128 | 3 | 100 | 0.35 | 0.26 |
Bi-LSTM | – | 64 | 3 | 200 | 0.161 | 0.120 |
CNN | – | 64 | 3 | 50 | 0.669 | 0.527 |
GRU | – | 64 | 3 | 200 | 0.305 | 0.227 |
Models | No. of Heads | No. of Hidden Dimensions | No. of Layers | Number of Epochs | RMSE | MAE |
---|---|---|---|---|---|---
Transformer | 1 | 64 | 3 | 200 | 0.07 | 0.06 |
LSTM | – | 128 | 6 | 100 | 0.19 | 0.15 |
RNN | – | 64 | 3 | 200 | 0.31 | 0.20 |
Bi-LSTM | – | 64 | 3 | 200 | 0.193 | 0.160 |
CNN | – | 64 | 3 | 200 | 0.445 | 0.334 |
GRU | – | 64 | 3 | 200 | 0.189 | 0.156 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).