Comparative Study of Physics-Based Modeling and Neural Network Approach to Predict Cooling in Vehicle Integrated Thermal Management System
Abstract
1. Introduction
2. Physics-Based Modeling
2.1. Model Structure
2.1.1. Engine Oil Temperature
2.1.2. Coolant Temperature
2.2. Experimental Parameter Study
2.3. Simulation Implementation of Coolant and Oil Temperature
- Pre-determined parameters: fixed values obtained from the experimental parameter study that do not change in real time. They consist of the surface area (A), specific heat (Cp), and cylinder-wall thickness (L) of the various heat exchangers.
- Real-time measured input data: input data that change at every moment while the actual vehicle is driving. They are obtained by a data acquisition device and consist of engine speed, vehicle speed, ambient temperature, etc.
- Cooling fan and ECV control model: a model that calculates the control target values from the real-time input data, either to determine what effect a change in the control logic would have or to verify that the existing control is working as intended.
- Correlation parameter model: a model that calculates the parameters that change with the real-time input data, such as the heat transfer coefficient (h), effectiveness (ε), and power generation.
- Overall heat transfer model: a model that integrates the amount of heat transferred by each unit in order to calculate the temperatures of the engine oil and coolant, which are the prediction targets.
- Calculation of coolant temperature and engine oil temperature: a model that predicts the temperature at the next time step using the heat quantities calculated in the overall heat transfer model.
- Next step inputs: the final calculated outputs are fed back as inputs to the next time step. A minimal sketch of this update loop is shown below.
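To make the flow above concrete, the sketch below advances a lumped-capacitance coolant/oil model by one time step and feeds the result back as the next-step input. The correlations, parameter names, and numerical values (e.g. `h_radiator`, `eta_heat`, `c_friction`, the thermal masses) are illustrative assumptions for this sketch, not the calibrated sub-models of the paper.

```python
"""Minimal per-time-step sketch of the physics-based coolant / engine-oil
temperature update loop described above. All correlations and parameter
values are illustrative placeholders, not the calibrated ones from the paper."""


def simulate_step(T_coolant, T_oil, T_amb, inputs, p, dt):
    # Correlation parameter model: parameters that vary with the real-time inputs.
    h_rad = p["h_radiator"] * (1.0 + 0.05 * inputs["vehicle_speed"])  # placeholder air-side correlation
    Q_comb = p["eta_heat"] * inputs["fuel_power"]        # heat rejected from combustion to the coolant
    Q_fric = p["c_friction"] * inputs["engine_speed"]    # placeholder friction heat into the oil

    # Overall heat transfer model.
    Q_radiator = h_rad * p["A_radiator"] * (T_coolant - T_amb)        # convection to ambient air
    Q_oil_cool = (p["k_oil_coolant"] * p["A_oil_coolant"] / p["L_oil_coolant"]
                  * (T_oil - T_coolant))                              # conduction oil -> coolant (oil cooler wall)

    # Calculation of next-step temperatures (explicit Euler on lumped thermal masses).
    T_coolant += dt * (Q_comb + Q_oil_cool - Q_radiator) / (p["m_coolant"] * p["cp_coolant"])
    T_oil += dt * (Q_fric - Q_oil_cool) / (p["m_oil"] * p["cp_oil"])
    return T_coolant, T_oil   # fed back as the next-step inputs


if __name__ == "__main__":
    p = {"h_radiator": 60.0, "A_radiator": 0.5, "eta_heat": 0.3, "c_friction": 0.5,
         "k_oil_coolant": 50.0, "A_oil_coolant": 0.05, "L_oil_coolant": 0.003,
         "m_coolant": 6.0, "cp_coolant": 3600.0, "m_oil": 3.5, "cp_oil": 2000.0}
    T_c, T_o = 20.0, 20.0
    for _ in range(600):  # 10 min of driving at a 1 s sampling rate
        T_c, T_o = simulate_step(T_c, T_o, 20.0,
                                 {"vehicle_speed": 15.0, "engine_speed": 2000.0, "fuel_power": 15000.0},
                                 p, dt=1.0)
    print(f"coolant {T_c:.1f} °C, oil {T_o:.1f} °C")
```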
3. Neural Network-Based Modeling
3.1. Model Framework Determination
3.2. Convolutional LSTM
3.3. Temporal Convolutional Network
4. Comparison of Predicted Results and Actual Measurements
5. Results
5.1. Comparison Between Prediction and Actual Measurement
5.2. Comparative Study
6. Discussion
7. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Page, R.W.; Hnatczuk, W. Thermal Management for the 21st Century—Improved Thermal Control & Fuel Economy in an Army Medium Tactical Vehicle; SAE Technical Paper 2005-01-2068; SAE International: Washington, DC, USA, 2005.
- Mahmoud, K.G.; Loibner, E.; Wiesler, B.; Samhaber, C.; Kußmann, C. Simulation-based vehicle thermal management system—Concept and methodology. In Proceedings of the SAE World Congress, Detroit, MI, USA, 3–6 March 2003.
- Osborne, S.; Kopinsky, J.; Norton, S.; Sutherland, A.; Lancaster, D.; Nielsen, E.; Isenstadt, A.; German, J. Automotive Thermal Management Technology; The International Council on Clean Transportation (ICCT): Washington, DC, USA, 2016.
- Allen, D.; Lasecki, M. Thermal Management Evolution and Controlled Coolant Flow; SAE Technical Paper 2001-01-1732; SAE International: Washington, DC, USA, 2001.
- Curran, A.R.; Johnson, K.R.; Marttila, E.A.; Dudley, S.P. Automated Radiation Modeling for Vehicle Thermal Management; SAE Technical Paper 950615; SAE International: Washington, DC, USA, 1995.
- Kumar, V.; Shendge, S.A.; Baskar, S. Underhood Thermal Simulation of a Small Passenger Vehicle with Rear Engine Compartment to Evaluate and Enhance Radiator Performance; SAE Technical Paper 2010-01-0801; SAE International: Washington, DC, USA, 2010.
- Cipollone, R.; Villante, C. Vehicle thermal management: A model-based approach. In Proceedings of the ASME Internal Combustion Engine Division Fall Technical Conference, Long Beach, CA, USA, 24–27 October 2004.
- Berry, A.; Blissett, M.; Steiber, J.; Tobin, A.T.; McBroom, S.T. A New Approach to Improving Fuel Economy and Performance Prediction through Coupled Thermal Systems Simulation; SAE Technical Paper 2002-01-1208; SAE International: Washington, DC, USA, 2002.
- Zanini, F.; Atienza, D.; Jones, C.N.; De Micheli, G. Temperature sensor placement in thermal management systems for MPSoCs. In Proceedings of the 2010 IEEE International Symposium on Circuits and Systems, Paris, France, 30 May–2 June 2010.
- Zerfowski, D.; Buttle, D. Paradigmenwechsel im Automotive-Software-Markt. ATZ Automob. Z. 2019, 121, 28–35.
- Samad, T.; Stewart, G. Systems Engineering and Innovation in Control—An Industry Perspective and an Application to Automotive Powertrains; University of Maryland Model-Based Systems Engineering Colloquia Series: Washington, DC, USA, 2013.
- Müller, M.T. Interview by Alfred Vollmer, Die neue Architektur der Fahrzeuge. Automob. Elektron. 2019, 3, 16ff.
- Proff, H.; Pottebaum, T.; Wolf, P. Software is transforming the automotive world—Four strategic options for pure-play software companies merging into the automotive lane. Deloitte Insights 2020, 1, 10–11.
- Patrini, G.; Rozza, A.; Menon, A.K.; Nock, R.; Qu, L. Making deep neural networks robust to label noise: A loss correction approach. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 1944–1952.
- Yang, X.; Ho, D.W.C. Synchronization of delayed memristive neural networks: Robust analysis approach. IEEE Trans. Cybern. 2016, 46, 3377–3387.
- Maas, A.; Le, Q.V.; O’Neil, T.M.; Vinyals, O.; Nguyen, P.; Ng, A.Y. Recurrent neural networks for noise reduction in robust ASR. Interspeech 2012. Available online: http://ai.stanford.edu/~amaas/papers/drnn_intrspch2012_final.pdf (accessed on 2 September 2020).
- Seltzer, M.L.; Yu, D.; Wang, Y. An Investigation of Deep Neural Networks for Noise Robust Speech Recognition. Proc. ICASSP 2013, 7398–7402.
- Parveen, S.; Green, P. Speech recognition with missing data using recurrent neural nets. Adv. Neural Inf. Process. Syst. 2001, 14, 1189–1195.
- Deng, L.; Yu, D. Deep learning: Methods and applications. Found. Trends Signal Process. 2014, 7, 197–387.
- Gregor, K.; Danihelka, I.; Mnih, A.; Blundell, C. Deep AutoRegressive Networks. In Proceedings of the 31st International Conference on Machine Learning, Beijing, China, 21–26 June 2014.
- Cao, J.; Li, P.; Wang, W. Synchronization in arrays of delayed neural networks with constant and delayed coupling. Phys. Lett. A 2006, 353, 318–325.
- Gooijer, J.G.D.; Hyndman, R.J. 25 years of time series forecasting. Int. J. Forecast. 2006, 22, 443–473.
- Gamboa, J.C.B. Deep learning for time-series analysis. arXiv 2017, arXiv:1701.01887.
- Dorffner, G. Neural networks for time series processing. Neural Network World 1996, 6, 447–466.
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–445.
- LeCun, Y.; Bengio, Y. Convolutional networks for images, speech, and time-series. In The Handbook of Brain Theory and Neural Networks; Arbib, M.A., Ed.; MIT Press: Cambridge, MA, USA, 1995.
- Brownlee, J. Promise of deep learning for time series forecasting. In Deep Learning for Time Series Forecasting, 1st ed.; Machine Learning Mastery: Vermont, Victoria, Australia, 2018; pp. 3–7.
- Chollet, F. Deep learning for text and sequence. In Deep Learning with Python, 1st ed.; Manning Publications Company: Shelter Island, NY, USA, 2017; pp. 305–309.
- Aggarwal, C.C. Recurrent neural networks. In Neural Networks and Deep Learning, 1st ed.; Springer: Berlin, Germany, 2018; pp. 271–313.
- Bai, S.; Kolter, J.Z.; Koltun, V. An empirical evaluation of generic convolutional and recurrent networks for sequence modeling. arXiv 2018, arXiv:1803.01271.
- Zhao, W.; Gao, Y.; Ji, T.; Wan, X.; Ye, F.; Bai, G. Deep temporal convolutional networks for short-term traffic flow forecasting. IEEE Access 2019, 7, 114496–114507.
- Lea, C.; Flynn, M.D.; Vidal, R.; Reiter, A. Temporal Convolutional Networks for Action Segmentation and Detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017.
- Oord, A.; Dieleman, S.; Zen, H.; Simonyan, K.; Vinyals, O.; Graves, A.; Kalchbrenner, N.; Senior, A.; Kavukcuoglu, K. WaveNet: A generative model for raw audio. arXiv 2016, arXiv:1609.03499.
- Briggs, I.; Murtagh, M.; Kee, R.; McCullough, G.; Douglas, R. Sustainable non-automotive vehicles: The simulation challenges. Renew. Sustain. Energy Rev. 2017, 68, 840–851.
- Hussein, A.A. Capacity fade estimation in electric vehicle Li-ion batteries using artificial neural networks. IEEE Trans. Ind. Appl. 2015, 51, 2321–2330.
- Zahid, T.; Xu, K.; Li, W.; Li, C.; Li, H. State of charge estimation for electric vehicle power battery using advanced machine learning algorithm under diversified drive cycles. Energy 2018, 162, 871–882.
- Park, J.; Kim, Y. Supervised-learning-based optimal thermal management in an electric vehicle. IEEE Access 2020, 8, 1290–1302.
- Heywood, J.B. Engine heat transfer. In Internal Combustion Engine Fundamentals, 1st ed.; McGraw-Hill Book Company: New York, NY, USA, 1988; pp. 673–674.
- Ryu, T.Y.; Shin, S.Y.; Lee, E.H.; Choi, J.K. A study on the heat rejection to coolant in a gasoline engine. Trans. Korean Soc. Auto. Eng. 1997, 5, 77–88.
- Kuo, P.-H.; Huang, C.-J. A high precision artificial neural networks model for short-term energy load forecasting. Energies 2018, 11, 213.
- Pascanu, R.; Mikolov, T.; Bengio, Y. On the difficulty of training recurrent neural networks. In Proceedings of the 30th International Conference on Machine Learning (ICML), Atlanta, GA, USA, 16–21 June 2013; Volume 28, pp. III-1310–III-1318.
- Jozefowicz, R.; Zaremba, W.; Sutskever, I. An empirical exploration of recurrent network architectures. In Proceedings of the 32nd International Conference on Machine Learning, Lille, France, 6–11 July 2015; Volume 37, pp. 2342–2350.
- Culurciello, E. The Fall of RNN/LSTM. Available online: https://towardsdatascience.com/the-fall-of-rnn-lstm-2d1594c74ce0 (accessed on 2 September 2020).
- Sherstinsky, A. Fundamentals of Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) network. Phys. D Nonlinear Phenom. 2020, 404, 132306.
- Shi, X.; Chen, Z.; Wang, H.; Yeung, D.-Y. Convolutional LSTM network: A machine learning approach for precipitation nowcasting. Adv. Neural Inf. Process. Syst. 2015, 28, 3–4.
- Hinton, G.; Deng, L.; Yu, D.; Dahl, G.; Mohamed, A.; Jaitly, N.; Senior, A.; Vanhoucke, V.; Nguyen, P.; Sainath, T.N.; et al. Deep neural networks for acoustic modeling in speech recognition. IEEE Signal Process. Mag. 2012, 29, 82–97.
- Zhang, Y.; Chan, W.; Jaitly, N. Very deep convolutional networks for end-to-end speech recognition. In Proceedings of the 42nd International Conference on Acoustics, Speech and Signal Processing (ICASSP), New Orleans, LA, USA, 5–9 March 2017; pp. 4845–4849.
- Lara-Benítez, P.; Carranza-García, M.; Luna-Romera, J.M.; Riquelme, J.C. Temporal convolutional networks applied to energy-related time series forecasting. Appl. Sci. 2020, 10, 2322.
- Lea, C.; Vidal, R.; Reiter, A.; Hager, G.D. Temporal convolutional networks: A unified approach to action segmentation. In Proceedings of the 14th European Conference on Computer Vision (ECCV), Amsterdam, The Netherlands, 8–10 and 15–16 October 2016; pp. 47–54.
- Lara-Benítez, P.; Carranza-García, M.; García-Gutiérrez, J.; Riquelme, J. Asynchronous dual-pipeline deep learning framework for online data stream classification. Integr. Comput. Aided Eng. 2020, 27, 1–19.
- Keskar, N.S.; Mudigere, D.; Nocedal, J.; Smelyanskiy, M.; Tang, P.T.P. On large-batch training for deep learning: Generalization gap and sharp minima. arXiv 2016, arXiv:1609.04836.
- Elsayed, N.; Maida, A.S.; Bayoumi, M. Empirical activation function effects on unsupervised convolutional LSTM learning. In Proceedings of the 30th International Conference on Tools with Artificial Intelligence (ICTAI), Volos, Greece, 5–7 November 2018.
- US Environmental Protection Agency. Fuel economy labeling of motor vehicles: Revisions to improve calculation of Fuel Economy Estimates; 40 CFR Part 86 and 600. Fed. Regist. 2006, 71, 77872–77969.
- United Nations. Global Technical Regulation No. 15: Worldwide Harmonized Light Vehicles Test Procedure. UN ECE Trans. 2014, 180, Add.15.
Category | Subcategory | Item | Specification
---|---|---|---
Powertrain | Engine | Displacement (cc) | 998
 | | Max Power (PS/RPM) | 77/6500
 | | Max Torque (kg·m/RPM) | 9.7/4800
 | | Cooling System | Thermostat ECV
 | | Injector | Solenoid
 | | Water Pump Type | Mechanical
 | | Engine Oil | 0W-20
 | Transmission | Number of Gear Stages | 5
 | | FGR × TOP Gear Ratio | 3.3074
 | | ATF Warmer/Cooler | X
Weight | | ETW (kg) | 1215
No. | Parameter | Description
---|---|---
#1 | h_radiator | Heat transfer coefficient of radiator
#2 | h_oil pan | Heat transfer coefficient of oil pan
#3 | h_engine block | Heat transfer coefficient of engine block
#4 | k_engine block | Thermal conductivity of engine block
#5 | k_oil-coolant | Thermal conductivity of the wall surface where heat exchange occurs between engine oil and coolant
#6 | C_p,engine oil | Specific heat capacity of engine oil
#7 | C_p,coolant | Specific heat capacity of coolant
#8 | C_p,engine crankshaft | Specific heat capacity of engine crankshaft
#9 | C_p,engine block | Specific heat capacity of engine block
#10 | C_p,radiator | Specific heat capacity of radiator
#11 | A_radiator | Surface area where convective heat transfer of radiator occurs
#12 | A_oil pan | Surface area where convective heat transfer of oil pan occurs
#13 | A_engine block | Surface area where convective heat transfer of engine block occurs
#14 | L_block | Wall thickness where heat conduction occurs between coolant and engine block
#15 | L_oil-coolant | Wall thickness where heat conduction occurs between coolant and engine oil
#16 | ε_radiator | Effectiveness for predicting radiator outlet temperature
#17 | ε_EOC | Effectiveness of the engine oil cooler
#18 | η_th | Efficiency with which fuel energy is converted to indicated power when the internal combustion engine is operated
#19 | η_engine heat transfer | Ratio of the heat transferred to the cooling system to the total heat loss
#20 | A_oil-coolant | Area of heat conduction between oil and coolant at the oil cooler
#21 | A_engine block-coolant | Area of heat conduction between engine block and coolant
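These parameters enter the model through the standard lumped heat-transfer relations. The forms below are a sketch of how such terms are typically written (the fuel lower heating value LHV and the exact grouping of terms are assumptions, not the paper's equations):

```latex
\dot{Q}_{\mathrm{conv}} = h\,A\,\bigl(T_{\mathrm{surface}} - T_{\mathrm{air}}\bigr), \qquad
\dot{Q}_{\mathrm{cond}} = \frac{k\,A}{L}\,\bigl(T_{1} - T_{2}\bigr)

\dot{Q}_{\mathrm{HX}} = \varepsilon\,\bigl(\dot{m}\,c_{p}\bigr)_{\min}\,\bigl(T_{\mathrm{hot,in}} - T_{\mathrm{cold,in}}\bigr)

\dot{Q}_{\mathrm{coolant}} = \eta_{\mathrm{engine\ heat\ transfer}}\,\bigl(1 - \eta_{\mathrm{th}}\bigr)\,\dot{m}_{\mathrm{fuel}}\,\mathrm{LHV}
```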
No. | Parameter | Description | I/O Classification
---|---|---|---
#1 | Vehicle speed | Vehicle speed for calculating wind speed | Input
#2 | Engine speed | Engine rotating speed | Input
#3 | Ambient temperature | Air temperature outside the vehicle | Input
#4 | Indicated torque | Ratio of indicated torque to maximum torque | Input
#5 | ECV current angle | Angle at which the ECV is currently operating | Input
#6 | Radiator cooling fan current | Electric current for cooling fan operation (A) | Input
#7 | Sampling rate | Data acquisition interval time | Input
#8 | Coolant temperature | Predicted coolant temperature | Output
#9 | Engine oil temperature | Predicted engine oil temperature | Output
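As a brief illustration of how the logged channels in this table can be turned into supervised learning samples, the sketch below builds sliding input windows from the seven measured inputs and pairs each window with the next coolant/oil temperature values. The window length and array layout are assumptions, not the paper's preprocessing.

```python
"""Minimal sketch of windowing the logged input/output channels listed above
into (input window, next temperature) training pairs. Window length and data
layout are illustrative assumptions."""
import numpy as np


def make_windows(signals: np.ndarray, targets: np.ndarray, window: int):
    """signals: (T, 7) real-time inputs; targets: (T, 2) coolant/oil temperature."""
    X, Y = [], []
    for t in range(window, len(signals)):
        X.append(signals[t - window:t])   # the last `window` samples of inputs #1–#7
        Y.append(targets[t])              # coolant / oil temperature at the next step
    return np.asarray(X, dtype="float32"), np.asarray(Y, dtype="float32")


# Example with random placeholder data logged at a 1 s sampling rate.
T = 1000
X, Y = make_windows(np.random.rand(T, 7), np.random.rand(T, 2), window=120)
print(X.shape, Y.shape)   # (880, 120, 7) (880, 2)
```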
Model | Filters | Kernel | Strides | Trainable Parameters | Activation | MAE | MSE
---|---|---|---|---|---|---|---
#1 | 256 | 20 | 5 | 22,194,737 | SoftPlus | 3.2889 | 20.0674
#2 | 256 | 3 | 5 | 17,581,617 | ReLU | 6.1641 | 61.4168
#3 | 256 | 1 | 5 | 17,038,807 | ReLU | 15.5647 | 355.0674
#4 | 128 | 5 | 5 | 8,925,489 | ReLU | 4.3266 | 34.8466
#5 | 128 | 10 | 1 | 42,044,209 | SoftPlus | 4.4020 | 37.5245
#6 | 128 | 20 | 3 | 15,507,249 | SoftPlus | 3.3207 | 19.8283
#7 | 128 | 40 | 5 | 16,910,129 | ReLU | 2.7086 | 15.2777
#8 | 64 | 20 | 3 | 7,616,945 | SoftPlus | 3.4380 | 25.7581
#9 | 64 | 20 | 3 | 7,616,945 | ReLU | 2.6124 | 13.3402
#10 | 32 | 20 | 3 | 3,917,553 | ReLU | 2.7728 | 15.7787
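For orientation, the sketch below shows one way to instantiate a convolutional LSTM configuration in the spirit of model #9 above (64 filters, kernel 20, strides 3, ReLU) with Keras (`ConvLSTM1D` requires TensorFlow ≥ 2.6). The data layout (an outer sequence of inner time windows), the layer stack, and the training settings are assumptions; the paper's exact architecture is not reproduced here.

```python
"""Minimal ConvLSTM sketch roughly following model #9 in the table above.
Input layout, depth, and training settings are illustrative assumptions."""
import numpy as np
import tensorflow as tf

OUTER_STEPS = 10    # assumed outer sequence length processed by the LSTM recurrence
INNER_WINDOW = 60   # assumed inner window the convolution slides over
N_FEATURES = 7      # the seven real-time inputs listed earlier
N_OUTPUTS = 2       # coolant and engine-oil temperature

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(OUTER_STEPS, INNER_WINDOW, N_FEATURES)),
    # Configuration in the spirit of model #9: 64 filters, kernel 20, strides 3, ReLU.
    tf.keras.layers.ConvLSTM1D(filters=64, kernel_size=20, strides=3,
                               padding="same", activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(N_OUTPUTS),   # predicted coolant / oil temperature
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Smoke test on random placeholder data.
x = np.random.rand(8, OUTER_STEPS, INNER_WINDOW, N_FEATURES).astype("float32")
y = np.random.rand(8, N_OUTPUTS).astype("float32")
model.fit(x, y, epochs=1, batch_size=4, verbose=0)
print(model.output_shape)   # (None, 2)
```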
Model | Filters | Kernel | Dilations | Trainable Parameters | MAE | MSE
---|---|---|---|---|---|---
#1 | 128 | 2 | [1, 2, 4, 8, 16, 32] | 365,826 | 3.7536 | 23.8921
#2 | 64 | 2 | [1, 2, 4, 8, 16, 32] | 92,802 | 0.9148 | 2.8306
#3 | 32 | 2 | [1, 2, 4, 8, 16, 32] | 23,874 | 0.2269 | 0.7441
#4 | 32 | 4 | [1, 2, 4, 8, 16, 32] | 46,978 | 0.3325 | 0.9843
#5 | 32 | 3 | [1, 2, 4, 8, 16, 32] | 23,874 | 0.2885 | 0.9369
#6 | 32 | 2 | [1, 2, 4, 8, 16] | 19,714 | 0.3225 | 1.0207
#7 | 64 | 4 | [1, 2, 4, 8, 16, 32] | 184,066 | 1.0216 | 3.0262
#8 | 64 | 2 | [1, 2, 4, 8, 16, 32, 64] | 109,506 | 1.1830 | 3.7137
#9 | 81 | 3 | [1, 3, 9, 27, 81] | 181,118 | 2.0541 | 14.3222
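The sketch below shows a generic TCN-style stack of dilated causal convolutions, roughly following the best configuration above (#3: 32 filters, kernel 2, dilations 1–32). The residual blocks, window length, and output head are illustrative assumptions, not the exact network of the paper.

```python
"""Minimal TCN-style sketch (dilated causal convolutions with residual
connections), roughly following configuration #3 in the table above."""
import numpy as np
import tensorflow as tf

WINDOW = 120     # assumed input window (time steps)
N_FEATURES = 7   # the seven real-time inputs listed earlier
N_OUTPUTS = 2    # coolant and engine-oil temperature


def residual_block(x, filters, kernel_size, dilation):
    """One dilated causal convolution block with a residual (skip) connection."""
    y = tf.keras.layers.Conv1D(filters, kernel_size, padding="causal",
                               dilation_rate=dilation, activation="relu")(x)
    y = tf.keras.layers.Conv1D(filters, kernel_size, padding="causal",
                               dilation_rate=dilation, activation="relu")(y)
    if x.shape[-1] != filters:                      # match channels on the skip path
        x = tf.keras.layers.Conv1D(filters, 1)(x)
    return tf.keras.layers.Add()([x, y])


inputs = tf.keras.Input(shape=(WINDOW, N_FEATURES))
x = inputs
for d in [1, 2, 4, 8, 16, 32]:                      # dilations as in configuration #3
    x = residual_block(x, filters=32, kernel_size=2, dilation=d)
x = tf.keras.layers.Lambda(lambda t: t[:, -1, :])(x)   # keep only the last time step
outputs = tf.keras.layers.Dense(N_OUTPUTS)(x)           # predicted coolant / oil temperature

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Smoke test on random placeholder data.
x_train = np.random.rand(8, WINDOW, N_FEATURES).astype("float32")
y_train = np.random.rand(8, N_OUTPUTS).astype("float32")
model.fit(x_train, y_train, epochs=1, batch_size=4, verbose=0)
print(model.output_shape)   # (None, 2)
```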
Model | MAE | MSE | Experimental Parameter Study | Validation Test | Model Tuning | Total Working Period
---|---|---|---|---|---|---
Physics-based | 1.9697 | 6.6763 | 8 months | 1 month | 3 months | 12 months
NN-based | 0.2269 | 0.7441 | N/A | 2 months | 1 month | 3 months
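For reference, the MAE and MSE values compared above are the standard error measures over the N predicted temperature samples (a restatement of the usual definitions, not taken from the paper's text):

```latex
\mathrm{MAE} = \frac{1}{N}\sum_{i=1}^{N}\bigl|\,T_{i}^{\mathrm{pred}} - T_{i}^{\mathrm{meas}}\bigr|, \qquad
\mathrm{MSE} = \frac{1}{N}\sum_{i=1}^{N}\bigl(T_{i}^{\mathrm{pred}} - T_{i}^{\mathrm{meas}}\bigr)^{2}
```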
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).