Generative Adversarial Network for Synthesizing Multivariate Time-Series Data in Electric Vehicle Driving Scenarios
Abstract
1. Introduction
- Data augmentation framework for EV driving scenarios: A GAN-based framework for synthesizing multivariate time-series data specifically for EV driving scenarios is presented. This framework addresses the limited availability of real-world data by enabling the generation of larger and more diverse datasets suitable for training and evaluating SOC estimation models.
- TS-p2pGAN model architecture: The TS-p2pGAN model, incorporating an integrated transformation network and a multiscale discriminator, is designed to handle high-dimensional, extended time sequences. This architecture aims to capture complex temporal dependencies within the data and generate synthetic time series that preserve these dependencies.
- Evaluation protocol using quantitative and qualitative metrics: An evaluation protocol employing both quantitative and qualitative metrics is implemented to assess the quality and characteristics of the generated synthetic data. This protocol provides insights into the model’s performance and facilitates further development.
- Validation with real-world driving data: The model’s performance is evaluated using data from 70 real-world driving trips, demonstrating its ability to generalize to real-world conditions and its potential for practical application in EV SOC estimation.
2. Materials and Methods
2.1. Dataset
- Vehicle dynamics: speed, altitude, throttle position, motor torque, and longitudinal acceleration;
- Battery metrics: power battery voltage, current, temperature, actual SOC, and displayed SOC;
- Climate control: heater wattage demand, air conditioner power consumption, heater voltage, and current;
- Environmental conditions: ambient temperature and related parameters.
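The channels listed above form one multivariate time series per trip. A minimal preprocessing sketch is shown below; the channel names, the per-trip min-max scaling to [-1, 1] (matching a Tanh output layer), and the non-overlapping window step are illustrative assumptions, not the paper's exact pipeline, though the window length of 256 matches the architecture tables later in the paper.

```python
import numpy as np

# Hypothetical channel names for the signal groups listed above;
# the actual column names in the dataset files may differ.
CHANNELS = [
    "speed", "altitude", "throttle", "motor_torque", "long_accel",          # vehicle dynamics
    "batt_voltage", "batt_current", "batt_temp", "soc_actual", "soc_disp",  # battery metrics
    "heater_demand", "ac_power", "heater_voltage", "heater_current",        # climate control
    "ambient_temp",                                                         # environment
]

def to_windows(trip, length=256, step=256):
    """Min-max scale each channel to [-1, 1] and slice the trip into
    fixed-length windows, returning an array of shape (N, channels, length)."""
    trip = np.asarray(trip, dtype=np.float64)            # (channels, T)
    lo = trip.min(axis=1, keepdims=True)
    hi = trip.max(axis=1, keepdims=True)
    scaled = 2.0 * (trip - lo) / np.maximum(hi - lo, 1e-12) - 1.0
    starts = range(0, trip.shape[1] - length + 1, step)
    return np.stack([scaled[:, s:s + length] for s in starts])

rng = np.random.default_rng(0)
trip = rng.normal(size=(len(CHANNELS), 1024))            # fake trip, 1024 samples
windows = to_windows(trip)
print(windows.shape)  # (4, 15, 256)
```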
2.2. Time-Series Synthesis with the pix2pix GAN
2.2.1. Architecture of the Transformation Net Model in the Generator Framework
2.2.2. Multiscale Discriminators
3. Experiments and Results
3.1. Data Preprocessing
3.2. Quantitative and Qualitative Performance Metrics
3.3. Real and Synthetic Data
4. Conclusions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Parameter | Specification |
---|---|
Power | 170 horsepower (125 kW) |
Torque | 184 lb-ft (250 Nm) |
Acceleration (0–60 mph) | 7.3 s |
Top speed | 93 mph (150 km/h) |
Battery type | Lithium-ion |
Battery capacity | 60 Ah, 33 kWh (usable) |
Real-world range | Approximately 115 km (71 miles) |
Length | 3999 mm (157.4 inches) |
Width | 1775 mm (69.9 inches) |
Height | 1578 mm (62.1 inches) |
Wheelbase | 2570 mm (101.2 inches) |
Curb weight | 1195 kg (2635 lbs) |
AC charging capacity | 11 kW (optional 7.4 kW) |
DC fast charging | 50 kW (optional 11 kW) |
Charging time (0–100%) | Approximately 3 h (using a 7.4 kW charger) |
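As a quick sanity check, the paired metric/imperial figures in the table are mutually consistent under standard conversion factors, and the usable capacity together with the stated real-world range implies a consumption of roughly 287 Wh/km:

```python
# Cross-check the paired units in the specification table.
# Standard factors: 1 hp ≈ 745.7 W, 1 lb-ft ≈ 1.3558 Nm,
# 1 mile ≈ 1.609 km, 1 kg ≈ 2.2046 lb.
def close(a, b, tol=0.02):
    return abs(a - b) / b < tol

assert close(170 * 0.7457, 125)              # 170 hp ≈ 126.8 kW (table rounds to 125)
assert close(250 / 1.3558, 184, tol=0.01)    # 250 Nm ≈ 184.4 lb-ft
assert close(150 / 1.609, 93, tol=0.01)      # 150 km/h ≈ 93.2 mph
assert close(115 / 1.609, 71, tol=0.01)      # 115 km ≈ 71.5 miles
assert close(1195 * 2.2046, 2635, tol=0.01)  # 1195 kg ≈ 2634.5 lb

# Implied consumption from usable capacity and real-world range:
print(round(33_000 / 115))  # 287 Wh/km
```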
Name | Layer | (k, s, p) | f | d | Module |
---|---|---|---|---|---|
Input | | | 10 | 256 | Front-end downsampling operations |
Reflect_1 | Pad1d | (-, -, 3) | 10 | 262 | |
Conv_1 | Conv1dN+R | (7, 1, 0) | 64 | 256 | |
Conv_2 | Conv1dN+R | (3, 2, 1) | 128 | 128 | |
Conv_3 | Conv1dN+R | (3, 2, 1) | 256 | 64 | |
Conv_4 | Conv1dN+R | (3, 2, 1) | 512 | 32 | |
Conv_5 | Conv1dN+R | (3, 2, 1) | 1024 | 16 | |
ResnetBlock_1 | Pad1d Conv1dN+R Pad1d Conv1dN+R | (-, -, 1) (3, 1, 1) (-, -, 1) (3, 1, 1) | 1024 1024 1024 1024 | 16 18 16 16 | Residual blocks (×9) |
ResnetBlock_2 | Pad1d Conv1dN+R Pad1d Conv1dN+R | (-, -, 1) (3, 1, 1) (-, -, 1) (3, 1, 1) | 1024 1024 1024 1024 | 16 18 16 16 | |
ResnetBlock_3 | Pad1d Conv1dN+R Pad1d Conv1dN+R | (-, -, 1) (3, 1, 1) (-, -, 1) (3, 1, 1) | 1024 1024 1024 1024 | 16 18 16 16 | |
ResnetBlock_4 | Pad1d Conv1dN+R Pad1d Conv1dN+R | (-, -, 1) (3, 1, 1) (-, -, 1) (3, 1, 1) | 1024 1024 1024 1024 | 16 18 16 16 | |
ResnetBlock_5 | Pad1d Conv1dN+R Pad1d Conv1dN+R | (-, -, 1) (3, 1, 1) (-, -, 1) (3, 1, 1) | 1024 1024 1024 1024 | 16 18 16 16 | |
ResnetBlock_6 | Pad1d Conv1dN+R Pad1d Conv1dN+R | (-, -, 1) (3, 1, 1) (-, -, 1) (3, 1, 1) | 1024 1024 1024 1024 | 16 18 16 16 | |
ResnetBlock_7 | Pad1d Conv1dN+R Pad1d Conv1dN+R | (-, -, 1) (3, 1, 1) (-, -, 1) (3, 1, 1) | 1024 1024 1024 1024 | 16 18 16 16 | |
ResnetBlock_8 | Pad1d Conv1dN+R Pad1d Conv1dN+R | (-, -, 1) (3, 1, 1) (-, -, 1) (3, 1, 1) | 1024 1024 1024 1024 | 16 18 16 16 | |
ResnetBlock_9 | Pad1d Conv1dN+R Pad1d Conv1dN+R | (-, -, 1) (3, 1, 1) (-, -, 1) (3, 1, 1) | 1024 1024 1024 1024 | 16 18 16 16 | |
ConvTran_1 | ConvTranspose1dN+R | (3, 2, 1) | 512 | 32 | Back-end upsampling operations |
ConvTran_2 | ConvTranspose1dN+R | (3, 2, 1) | 256 | 64 | |
ConvTran_3 | ConvTranspose1dN+R | (3, 2, 1) | 128 | 128 | |
ConvTran_4 | ConvTranspose1dN+R | (3, 2, 1) | 64 | 256 | |
Reflect_2 | Pad1d | (-, -, 3) | 64 | 262 |
Conv_6 | Conv1d | (7, 1, 0) | 4 | 256 |
Tanh_1 | Tanh | | 4 | 256 |
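The d (length) column of the generator table can be reproduced from the standard 1-D convolution length formulas. Matching the doubled lengths in the upsampling stage requires an output padding of 1 in each transposed convolution; that value is inferred from the table rather than stated explicitly:

```python
# Standard 1-D length arithmetic:
#   convolution:            L_out = floor((L_in + 2p - k) / s) + 1
#   transposed convolution: L_out = (L_in - 1)*s - 2p + k + output_padding
def conv_out(L, k, s, p):
    return (L + 2 * p - k) // s + 1

def convtrans_out(L, k, s, p, out_pad=1):
    return (L - 1) * s - 2 * p + k + out_pad

L = 256
L += 2 * 3                         # Reflect_1: pad 3 on each side -> 262
L = conv_out(L, 7, 1, 0)           # Conv_1 -> 256
for _ in range(4):                 # Conv_2..Conv_5 halve the length each time
    L = conv_out(L, 3, 2, 1)       # 128, 64, 32, 16
assert L == 16                     # the residual blocks preserve this length
for _ in range(4):                 # ConvTran_1..ConvTran_4 double it back
    L = convtrans_out(L, 3, 2, 1)  # 32, 64, 128, 256
L += 2 * 3                         # Reflect_2 -> 262
L = conv_out(L, 7, 1, 0)           # Conv_6 -> 256
print(L)  # 256
```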
Name | Layer | (k, s, p) | f | d | Scale |
---|---|---|---|---|---|
Input | | | 14 | 256 | scale 1 |
Conv_1 | Conv1dR | (4, 2, 2) | 64 | 129 | |
Conv_2 | Conv1dN+R | (4, 2, 2) | 128 | 65 | |
Conv_3 | Conv1dN+R | (4, 2, 2) | 256 | 33 | |
Conv_4 | Conv1dN+R | (4, 1, 2) | 512 | 34 | |
Conv_5 | Conv1d | (4, 1, 2) | 1 | 35 | |
Sigmoid_1 | Sigmoid | | 1 | 35 | |
Input | | | 14 | 256 | scale 2 |
Pool1D_1 | AvgPool1D | (3, 2, 1) | 14 | 128 | |
Conv_6 | Conv1dR | (4, 2, 2) | 64 | 65 | |
Conv_7 | Conv1dN+R | (4, 2, 2) | 128 | 33 | |
Conv_8 | Conv1dN+R | (4, 2, 2) | 256 | 17 | |
Conv_9 | Conv1dN+R | (4, 1, 2) | 512 | 18 | |
Conv_10 | Conv1d | (4, 1, 2) | 1 | 19 | |
Sigmoid_2 | Sigmoid | | 1 | 19 | |
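The same length arithmetic reproduces the d column for both discriminator scales; the 14 input channels are consistent with a pix2pix-style concatenation of the 10 conditioning channels and the 4 generated channels, though that pairing is an inference from the tables. Note that the computation yields a final length of 19 at scale 2:

```python
# 1-D convolution length formula; also valid for AvgPool1d.
def conv_out(L, k, s, p):
    return (L + 2 * p - k) // s + 1

STACK = [(4, 2, 2), (4, 2, 2), (4, 2, 2), (4, 1, 2), (4, 1, 2)]

# Scale 1: full-resolution input, 14 channels x 256 samples.
L, lens = 256, []
for (k, s, p) in STACK:
    L = conv_out(L, k, s, p)
    lens.append(L)
print(lens)   # [129, 65, 33, 34, 35] -- matches Conv_1..Conv_5

# Scale 2: the same stack after AvgPool1d(3, 2, 1) halves the input.
L, lens2 = conv_out(256, 3, 2, 1), []   # pooled length: 128
for (k, s, p) in STACK:
    L = conv_out(L, k, s, p)
    lens2.append(L)
print(lens2)  # [65, 33, 17, 18, 19] -- matches Conv_6..Conv_10
```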
Trip No | TS-p2pGAN | TTS-GAN | TimeGAN | Trip No | TS-p2pGAN | TTS-GAN | ||||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
RMSE (%) | MAE (%) | DTW (%) | RMSE (%) | MAE (%) | DTW (%) | RMSE (%) | MAE (%) | DTW (%) | RMSE (%) | MAE (%) | DTW (%) | RMSE (%) | MAE (%) | DTW (%) | |
1 | 1.96 | 0.97 | 1.19 | 18.57 (3.87) | 11.61 (2.62) | 13.20 (2.70) | 3.67 | 2.60 | 2.84 | 36 | 1.75 | 0.86 | 1.00 | 11.70 | 8.47 | 8.50 |
2 | 2.02 | 1.01 | 1.12 | 13.25 | 9.94 | 10.68 | 3.94 | 2.81 | 2.93 | 37 | 1.89 | 0.93 | 1.13 | 35.53 | 24.78 | 31.31 |
3 | 1.91 | 1.02 | 1.21 | 31.09 | 22.51 | 27.46 | 5.35 | 3.83 | 3.45 | 38 | 2.30 | 1.13 | 1.36 | 3.62 | 2.21 | 2.56 |
4 | 1.79 | 0.77 | 0.84 | 46.13 | 31.66 | 30.31 | 31.48 | 18.56 | 17.2 | 39 | 2.17 | 1.18 | 1.40 | 42.53 | 33.33 | 25.59 |
5 | 1.91 | 0.92 | 1.09 | 24.69 | 17.64 | 21.47 | 7.45 | 5.34 | 5.50 | 40 | 2.45 | 1.33 | 1.67 | 7.54 | 5.36 | 6.39 |
6 | 1.62 | 0.84 | 1.03 | 12.75 | 9.33 | 10.69 | 7.94 | 5.80 | 6.26 | 41 | 2.40 | 1.19 | 1.48 | 16.69 | 12.24 | 15.32 |
7 | 1.56 | 0.84 | 1.01 | 31.13 | 20.87 | 24.75 | 4.79 | 3.44 | 3.33 | 42 | 2.71 | 1.26 | 1.40 | 27.03 | 18.74 | 26.44 |
8 | 1.51 | 0.78 | 0.92 | 35.81 | 23.64 | 28.84 | 5.81 | 4.10 | 3.71 | 43 | 2.33 | 1.22 | 1.48 | 14.79 | 9.07 | 9.58 |
9 | 1.73 | 0.85 | 0.95 | 49.62 | 32.90 | 32.61 | 25.78 | 15.63 | 9.57 | 44 | 1.47 | 0.77 | 0.90 | 50.96 | 33.71 | 39.99 |
10 | 2.03 | 1.06 | 1.26 | 32.81 | 19.02 | 18.78 | 6.89 | 5.18 | 4.52 | 45 | 1.58 | 0.79 | 0.93 | 6.56 | 4.31 | 5.19 |
11 | 2.08 | 1.01 | 1.15 | 62.69 | 43.97 | 59.21 | 7.94 | 6.12 | 5.60 | 46 | 2.41 | 1.20 | 1.39 | 16.99 | 11.57 | 16.62 |
12 | 1.26 | 0.66 | 0.72 | 31.30 | 25.48 | 21.52 | 11.27 | 8.18 | 5.94 | 47 | 2.28 | 1.08 | 1.30 | 24.72 | 16.80 | 22.56 |
13 | 2.69 | 1.46 | 1.56 | 37.27 | 27.90 | 36.33 | 6.36 | 3.89 | 3.06 | 48 | 1.72 | 0.85 | 1.00 | 8.40 | 5.53 | 6.30 |
14 | 1.48 | 0.80 | 0.91 | 37.71 | 22.43 | 21.33 | 6.96 | 4.95 | 4.46 | 49 | 2.58 | 1.24 | 1.45 | 19.53 | 13.56 | 17.77 |
15 | 1.83 | 0.96 | 1.17 | 9.80 | 6.53 | 6.95 | 7.30 | 5.01 | 4.67 | 50 | 3.29 | 1.46 | 1.75 | 2.58 | 1.46 | 1.67 |
16 | 1.92 | 1.02 | 1.21 | 28.90 | 21.91 | 25.54 | 6.76 | 4.98 | 4.80 | 51 | 2.19 | 1.15 | 1.39 | 5.22 | 3.98 | 4.81 |
17 | 2.11 | 1.07 | 1.28 | 49.46 | 35.62 | 44.59 | 52 | 2.21 | 1.09 | 1.30 | 22.12 | 14.56 | 18.67 | |||
18 | 1.66 | 0.87 | 1.01 | 12.46 | 8.00 | 9.99 | 53 | 1.58 | 0.79 | 0.94 | 38.20 | 25.29 | 33.19 | |||
19 | 1.80 | 0.92 | 1.09 | 8.55 | 5.85 | 6.19 | 54 | 1.45 | 0.73 | 0.87 | 5.08 | 3.72 | 3.96 | |||
20 | 1.91 | 0.96 | 1.11 | 21.95 | 15.79 | 19.62 | 55 | 2.31 | 1.14 | 1.27 | 14.24 | 9.94 | 12.81 | |||
21 | 1.56 | 0.80 | 0.97 | 44.54 | 29.58 | 39.98 | 56 | 1.48 | 0.84 | 0.99 | 4.89 | 3.07 | 3.52 | |||
22 | 2.05 | 0.97 | 1.20 | 10.73 | 7.62 | 8.64 | 57 | 2.54 | 1.24 | 1.45 | 16.64 | 11.41 | 15.79 | |||
23 | 2.65 | 1.31 | 1.52 | 30.09 | 21.17 | 25.40 | 58 | 1.75 | 0.94 | 1.15 | 44.50 | 29.56 | 39.91 | |||
24 | 2.36 | 1.25 | 1.42 | 41.96 | 29.39 | 39.62 | 59 | 1.55 | 0.79 | 0.97 | 5.83 | 3.35 | 4.01 | |||
25 | 2.23 | 1.09 | 1.26 | 58.61 | 40.63 | 54.61 | 60 | 2.05 | 1.00 | 1.19 | 3.29 | 2.26 | 2.50 | |||
26 | 2.75 | 1.30 | 1.51 | 74.71 | 51.57 | 69.51 | 61 | 2.05 | 1.03 | 1.27 | 9.19 | 7.27 | 7.98 | |||
27 | 2.04 | 1.11 | 1.30 | 49.42 | 32.70 | 44.03 | 62 | 2.78 | 1.27 | 1.43 | 20.50 | 14.73 | 18.37 | |||
28 | 2.21 | 1.23 | 1.40 | 70.08 | 46.91 | 64.95 | 63 | 1.87 | 0.99 | 1.21 | 4.19 | 2.67 | 3.24 | |||
29 | 2.90 | 1.67 | 1.77 | 82.92 | 56.46 | 76.35 | 64 | 2.04 | 1.06 | 1.35 | 16.53 | 11.77 | 15.13 | |||
30 | 2.25 | 1.02 | 1.22 | 8.66 | 5.65 | 6.45 | 65 | 2.52 | 1.16 | 1.45 | 11.00 | 7.59 | 10.04 | |||
31 | 1.86 | 0.87 | 1.06 | 25.35 | 17.74 | 21.54 | 66 | 1.63 | 0.87 | 1.09 | 21.60 | 14.68 | 20.88 | |||
32 | 2.70 | 1.30 | 1.52 | 48.52 | 32.72 | 44.29 | 67 | 2.74 | 1.37 | 1.69 | 3.35 | 2.23 | 2.52 | |||
33 | 2.45 | 1.24 | 1.60 | 11.11 | 8.21 | 10.31 | 68 | 2.16 | 1.10 | 1.35 | 11.02 | 7.69 | 9.05 | |||
34 | 2.32 | 1.12 | 1.32 | 12.58 | 8.33 | 12.32 | 69 | 1.88 | 0.96 | 1.22 | 19.09 | 11.41 | 12.81 | |||
35 | 1.95 | 0.93 | 1.08 | 10.59 | 7.37 | 8.39 | 70 | 2.37 | 1.13 | 1.34 | 11.70 | 8.47 | 8.50 |
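The table's three metrics can be sketched as follows. This is a generic implementation using a classic O(nm) dynamic-time-warping recursion; the paper's percentage normalization of RMSE, MAE, and DTW is not reproduced here.

```python
import numpy as np

def rmse(a, b):
    """Root-mean-square error between two aligned series."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

def mae(a, b):
    """Mean absolute error between two aligned series."""
    return float(np.mean(np.abs(a - b)))

def dtw(a, b):
    """Dynamic-time-warping distance with absolute-difference local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

real = np.array([0.0, 1.0, 2.0, 3.0])
shifted = np.array([1.0, 2.0, 3.0, 4.0])
assert rmse(real, real) == mae(real, real) == dtw(real, real) == 0.0
# DTW is below RMSE/MAE on a unit shift because warping re-aligns the series.
print(rmse(real, shifted), mae(real, shifted), dtw(real, shifted))  # 1.0 1.0 2.0
```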
© 2025 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Jeng, S.-L. Generative Adversarial Network for Synthesizing Multivariate Time-Series Data in Electric Vehicle Driving Scenarios. Sensors 2025, 25, 749. https://doi.org/10.3390/s25030749