Remaining Useful Life Prediction for Turbofan Engine Using SAE-TCN Model
Abstract
1. Introduction
2. Feature Extraction Based on Stacked Autoencoder Neural Network
2.1. Sparse Autoencoder
2.1.1. Autoencoder
2.1.2. Sparse Autoencoder
2.1.3. Stacked Sparse Autoencoder
2.2. Temporal Convolutional Network
2.2.1. Causal Convolutions
2.2.2. Dilated Convolutions
2.2.3. Residual Connections
2.3. SAE-TCN Prediction Model
- Data preparation: obtain the monitoring data set of each aero-engine sensor parameter, perform data preprocessing and feature extraction, and use the resulting preprocessed training sets for iterative training of the temporal convolutional network model.
- Building the engine life prediction model: preliminarily construct a temporal convolutional network model with initial hyperparameters, including the dimension of the input matrix, the convolution kernel size, the number of filters, the number of time steps, the dropout rate, the number of epochs, and the batch size.
- Engine remaining life prediction: train the initially established TCN model on the prepared data set and evaluate it on the test set. The mean squared error (MSE) serves as the loss function, measuring the error between the true remaining life and the predicted value, and the Adam optimizer is imported for model training and parameter updates. A minimal sketch of these model-building and training steps is given after this list.
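The following is a minimal sketch, not the authors' implementation, of the second and third steps: a small stack of causal, dilated 1-D convolutions stands in for the TCN blocks, and the feature dimension, residual-block structure, and variable names are illustrative assumptions. The hyperparameter values mirror those reported in Section 3.3 (time steps = 10, kernel size = 2, 64 filters, dropout rate = 0.4, 10 epochs, batch size = 32).

```python
# Sketch only: a simplified TCN-style regressor for RUL prediction.
# The residual-block structure and feature dimension are assumptions.
from tensorflow.keras import layers, models

TIME_STEPS = 10     # length of the sliding window over engine cycles
N_FEATURES = 14     # assumed number of SAE-extracted features per cycle

def residual_block(x, dilation_rate, filters=64, kernel_size=2, dropout=0.4):
    """One TCN residual block: two causal dilated convolutions plus a skip path."""
    skip = x
    for _ in range(2):
        x = layers.Conv1D(filters, kernel_size, padding="causal",
                          dilation_rate=dilation_rate, activation="relu")(x)
        x = layers.Dropout(dropout)(x)
    # Match channel dimensions on the skip path with a 1x1 convolution if needed.
    if skip.shape[-1] != filters:
        skip = layers.Conv1D(filters, 1, padding="same")(skip)
    return layers.Add()([x, skip])

inputs = layers.Input(shape=(TIME_STEPS, N_FEATURES))
x = inputs
for d in (1, 2, 4):                               # exponentially increasing dilations
    x = residual_block(x, dilation_rate=d)
x = layers.Lambda(lambda t: t[:, -1, :])(x)       # keep only the last time step
outputs = layers.Dense(1)(x)                      # predicted RUL in cycles

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")       # Adam optimizer, MSE loss

# Training on prepared sliding-window samples (X_train, y_train assumed to be
# built from the preprocessed C-MAPSS data):
# model.fit(X_train, y_train, epochs=10, batch_size=32, validation_split=0.1)
```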
3. Experiments and Results
3.1. Benchmark Dataset
3.2. Performance Measures
3.3. Experimental Environment and Parameter Configuration
- Dropout: With dropout, the model is retrained under a new structure in which some hidden-layer neurons are randomly deleted. For stochastic gradient descent algorithms, this allows a different network to be trained on each batch of data; these models are finally integrated to obtain an averaged prediction result. This practice is intended to reduce the dependence between neurons and improve the generalization ability.
- Early stopping: When a model has been trained for a long time, its performance on the validation set may deteriorate [38]. This is a common sign of over-fitting, which can be avoided by early stopping: the training algorithm terminates when the performance of the model on the validation set begins to decline. This strategy is expected to balance training time against model generalization ability; a minimal sketch of both strategies follows this list.
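As a concrete illustration of the two strategies above, the sketch below shows how dropout and early stopping are commonly attached to a Keras training loop; the monitored quantity and patience value are assumptions for illustration rather than the paper's exact configuration (dropout itself is already built into the model layers, as in the TCN sketch above).

```python
# Sketch only: early stopping attached as a training callback; dropout is part
# of the model definition itself (see the Dropout layers in the TCN sketch).
from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(
    monitor="val_loss",         # watch the validation error
    patience=3,                 # assumed patience: stop after 3 stagnant epochs
    restore_best_weights=True   # roll back to the best validation checkpoint
)

# history = model.fit(X_train, y_train,
#                     validation_split=0.1,
#                     epochs=10, batch_size=32,
#                     callbacks=[early_stop])
```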
3.4. Prediction Case
3.5. Performance Comparison
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Richter, H. Advanced Control of Turbofan Engines; Springer: London, UK, 2012. [Google Scholar]
- Salunkhe, T.; Jamadar, N.; Kivade, S. Prediction of Remaining Useful Life of mechanical components-a Review. Int. J. Eng. Sci. Innov. Technol. (IJESIT) 2014, 3, 125–135. [Google Scholar]
- Wang, X.; Li, Y.; Xu, Y.; Liu, X.; Zheng, T.; Zheng, B. Remaining useful life prediction for aero-engines using a time-enhanced multi-head self-attention model. Aerospace 2023, 10, 80. [Google Scholar] [CrossRef]
- Wang, H.; Li, D.; Li, D.; Liu, C.; Yang, X.; Zhu, G. Remaining Useful Life Prediction of Aircraft Turbofan Engine Based on Random Forest Feature Selection and Multi-Layer Perceptron. Appl. Sci. 2023, 13, 7186. [Google Scholar] [CrossRef]
- Huang, Y.; Tao, J.; Sun, G.; Zhang, H.; Hu, Y. A prognostic and health management framework for aero-engines based on a dynamic probability model and LSTM network. Aerospace 2022, 9, 316. [Google Scholar] [CrossRef]
- Chen, Z.; Cao, S.; Mao, Z. Remaining useful life estimation of aircraft engines using a modified similarity and supporting vector machine (SVM) approach. Energies 2017, 11, 28. [Google Scholar] [CrossRef]
- Rohan, A. Deep Scattering Spectrum Germaneness for Fault Detection and Diagnosis for Component-Level Prognostics and Health Management (PHM). Sensors 2022, 22, 9064. [Google Scholar] [CrossRef]
- Chui, K.T.; Gupta, B.B.; Vasant, P. A genetic algorithm optimized RNN-LSTM model for remaining useful life prediction of turbofan engine. Electronics 2021, 10, 285. [Google Scholar] [CrossRef]
- Muneer, A.; Taib, S.; Naseer, S.; Ali, R.; Aziz, A. Data-Driven deep learning-based attention mechanism for remaining useful life prediction: Case study application to turbofan engine analysis. Electronics 2021, 10, 2453. [Google Scholar] [CrossRef]
- Xie, Z.; Du, S.; Deng, Y.; Jia, S. A hybrid prognostics deep learning model for remaining useful life prediction. Electronics 2020, 10, 39. [Google Scholar] [CrossRef]
- Kang, Z.; Catal, C.; Tekinerdogan, B. Remaining useful life (RUL) prediction of equipment in production lines using artificial neural networks. Sensors 2021, 21, 932. [Google Scholar] [CrossRef]
- Zhao, C.; Huang, X.; Li, Y.; Yousaf Iqbal, M. A double-channel hybrid deep neural network based on CNN and BiLSTM for remaining useful life prediction. Sensors 2020, 20, 7109. [Google Scholar] [CrossRef]
- Elsheikh, A.; Yacout, S.; Ouali, M.S. Bidirectional handshaking LSTM for remaining useful life prediction. Neurocomputing 2019, 323, 148–156. [Google Scholar] [CrossRef]
- Orsagh, R.F.; Sheldon, J.; Klenke, C.J. Prognostics/Diagnostics for Gas Turbine Engine Bearings; American Society of Mechanical Engineers (ASME): New York, NY, USA, 2003; Volume 36843. [Google Scholar]
- Chelidze, D.; Cusumano, J.P. A dynamical systems approach to failure prognosis. J. Vib. Acoust. 2004, 126, 2–8. [Google Scholar] [CrossRef]
- Giantomassi, A.; Ferracuti, F.; Benini, A.; Ippoliti, G.; Longhi, S.; Petrucci, A. Hidden Markov model for health estimation and prognosis of turbofan engines. In Proceedings of the International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Washington, DC, USA, 28–31 August 2011; Volume 54808, pp. 681–689. [Google Scholar]
- Wu, Q.; Ding, K.; Huang, B. Approach for fault prognosis using recurrent neural network. J. Intell. Manuf. 2018, 31, 1621–1633. [Google Scholar] [CrossRef]
- Sikorska, J.; Hodkiewicz, M.; Ma, L. Prognostic modelling options for remaining useful life estimation by industry. Mech. Syst. Signal Process. 2011, 25, 1803–1836. [Google Scholar] [CrossRef]
- Wu, Y.; Yuan, M.; Dong, S.; Lin, L.; Liu, Y. Remaining useful life estimation of engineered systems using vanilla LSTM neural networks. Neurocomputing 2018, 275, 167–179. [Google Scholar] [CrossRef]
- Chen, X.; Jin, G.; Qiu, S.; Lu, M.; Yu, D. Direct remaining useful life estimation based on random forest regression. In Proceedings of the 2020 Global Reliability and Prognostics and Health Management (PHM-Shanghai), Shanghai, China, 16–18 October 2020; pp. 1–7. [Google Scholar]
- Peng, C.; Chen, Y.; Chen, Q.; Tang, Z.; Li, L.; Gui, W. A remaining useful life prognosis of turbofan engine using temporal and spatial feature fusion. Sensors 2020, 21, 418. [Google Scholar] [CrossRef]
- Zhang, X.; Xiao, P.; Cheng, Y.; Chen, B.; Gao, D.; Liu, W.; Huang, Z. Remaining useful life estimation using CNN-XGB with extended time window. IEEE Access 2019, 7, 154386–154397. [Google Scholar] [CrossRef]
- Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning Internal Representations by Error Propagation; Technical Report; Institute for Cognitive Science, University of California, San Diego: La Jolla, CA, USA, 1985. [Google Scholar]
- Ranzato, M.; Poultney, C.; Chopra, S.; LeCun, Y. Efficient learning of sparse representations with an energy-based model. Adv. Neural Inf. Process. Syst. 2007, 19, 1137. [Google Scholar]
- Meng, L.; Ding, S.; Xue, Y. Research on denoising sparse autoencoder. Int. J. Mach. Learn. Cybern. 2017, 8, 1719–1729. [Google Scholar] [CrossRef]
- Lea, C.; Vidal, R.; Reiter, A.; Hager, G.D. Temporal convolutional networks: A unified approach to action segmentation. In Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands, 8–10 October 2016; pp. 47–54. [Google Scholar]
- Lea, C.; Flynn, M.D.; Vidal, R.; Reiter, A.; Hager, G.D. Temporal convolutional networks for action segmentation and detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 156–165. [Google Scholar]
- Bai, S.; Kolter, J.Z.; Koltun, V. An empirical evaluation of generic convolutional and recurrent networks for sequence modeling. arXiv 2018, arXiv:1803.01271. [Google Scholar]
- Wan, R.; Mei, S.; Wang, J.; Liu, M.; Yang, F. Multivariate temporal convolutional network: A deep neural networks approach for multivariate time series forecasting. Electronics 2019, 8, 876. [Google Scholar] [CrossRef]
- Chen, Y.; Kang, Y.; Chen, Y.; Wang, Z. Probabilistic forecasting with temporal convolutional neural network. Neurocomputing 2020, 399, 491–501. [Google Scholar] [CrossRef]
- Yan, J.; Mu, L.; Wang, L.; Ranjan, R.; Zomaya, A.Y. Temporal convolutional networks for the advance prediction of ENSO. Sci. Rep. 2020, 10, 8055. [Google Scholar] [CrossRef]
- Li, Y.; Yu, R.; Shahabi, C.; Liu, Y. Diffusion convolutional recurrent neural network: Data-driven traffic forecasting. arXiv 2017, arXiv:1707.01926. [Google Scholar]
- Oord, A.v.d.; Dieleman, S.; Zen, H.; Simonyan, K.; Vinyals, O.; Graves, A.; Kalchbrenner, N.; Senior, A.; Kavukcuoglu, K. Wavenet: A generative model for raw audio. arXiv 2016, arXiv:1609.03499. [Google Scholar]
- Yu, F.; Koltun, V. Multi-scale context aggregation by dilated convolutions. arXiv 2015, arXiv:1511.07122. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar]
- Saxena, A.; Goebel, K.; Simon, D.; Eklund, N. Damage propagation modeling for aircraft engine run-to-failure simulation. In Proceedings of the 2008 International Conference on Prognostics and Health Management, Denver, CO, USA, 6–9 October 2008; pp. 1–9. [Google Scholar]
- Zheng, S.; Ristovski, K.; Farahat, A.; Gupta, C. Long short-term memory network for remaining useful life estimation. In Proceedings of the 2017 IEEE International Conference on Prognostics and Health Management (ICPHM), Dallas, TX, USA, 19–21 June 2017; pp. 88–95. [Google Scholar]
- Goodfellow, I.; Bengio, Y.; Courville, A.; Bengio, Y. Deep Learning; MIT Press: Cambridge, MA, USA, 2016; Volume 1. [Google Scholar]
- Muneer, A.; Taib, S.; Fati, S.; Alhussian, H. Deep-learning based prognosis approach for remaining useful life prediction of turbofan engine. Symmetry 2021, 13, 1861. [Google Scholar] [CrossRef]
- Yuan, N.; Yang, H.; Fang, H. Aero-engine prognostic method based on convolutional neural network. Comput. Meas. Control 2019, 27, 74–78. [Google Scholar]
- Sun, W.; Zhao, R.; Yan, R. Convolutional discriminative feature learning for induction motor fault diagnosis. IEEE Trans. Ind. Inform. 2017, 13, 1350–1359. [Google Scholar] [CrossRef]
No. | Parameter Description | Units |
---|---|---|
1 | Total temperature at fan inlet | °R |
2 | Total temperature at LPC outlet | °R |
3 | Total temperature at HPC outlet | °R |
4 | Total temperature at LPT outlet | °R |
5 | Pressure at fan inlet | psia |
6 | Total pressure in bypass inlet | psia |
7 | Total pressure at HPC outlet | psia |
8 | Physical fan speed | rpm |
9 | Physical core speed | rpm |
10 | Engine pressure ratio | - |
11 | Static pressure at HPC outlet | psia |
12 | Ratio of fuel flow to static pressure at HPC outlet | pps/psi |
13 | Corrected fan speed | rpm |
14 | Corrected core speed | rpm |
15 | Bypass ratio | - |
16 | Burner fuel–air ratio | - |
17 | Bleed enthalpy | - |
18 | Demanded fan speed | rpm |
19 | Demanded corrected fan speed | rpm |
20 | HPT coolant bleed | lbm/s |
21 | LPT coolant bleed | lbm/s |
22 | Altitude | ft |
23 | Mach | - |
24 | Throttle resolver angle | deg |
Name | Number of Engines in Training Set | Number of Engines in Test Set | Operating Conditions | Failure Modes | Training Sample | Test Sample | Sensors | Operating Parameters |
---|---|---|---|---|---|---|---|---|
FD001 | 100 | 100 | Sea Level | HPC Degradation | 20,630 | 13,095 | 21 | 3 |
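For reference, the sketch below shows one common way to load the FD001 subset described in the table above and to attach remaining-useful-life labels. The file name, column layout (unit id, cycle, three operating settings, 21 sensors), and the piecewise-linear RUL cap of 125 cycles are assumptions drawn from the standard NASA C-MAPSS distribution and the wider literature, not necessarily the authors' exact preprocessing.

```python
# Sketch only: loading the C-MAPSS FD001 training subset and attaching RUL labels.
import pandas as pd

cols = (["unit", "cycle"]
        + [f"op_setting_{i}" for i in range(1, 4)]
        + [f"sensor_{i}" for i in range(1, 22)])

train = pd.read_csv("train_FD001.txt", sep=r"\s+", header=None, names=cols)

# RUL label for each row: cycles remaining until that unit's last recorded cycle,
# clipped to an assumed piecewise-linear cap of 125 cycles.
max_cycle = train.groupby("unit")["cycle"].transform("max")
train["RUL"] = (max_cycle - train["cycle"]).clip(upper=125)
```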
Parameter | Value |
---|---|
Time Steps | 10 |
Dropout Rate | 0.4 |
Kernel Size | 2 |
Number of Filters | 64 |
Epochs | 10 |
Batch Size | 32 |
Model | RMSE | Score |
---|---|---|
DSAE-TCN | 18.01 | 161 |
TCN | 18.74 | 289 |
DLSTM | 19.53 | 327 |
Bi-LSTM | 19.94 | 435 |
GRU | 20.60 | 885 |
RNN | 22.59 | 780 |
CNN | 22.93 | 1207 |
SVR | 23.75 | 989 |
MLP | 25.93 | 7890 |
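The RMSE and Score columns in the comparison above follow the standard C-MAPSS evaluation metrics; the sketch below assumes the asymmetric exponential scoring function introduced for the PHM08 challenge (Saxena et al.), which penalizes late predictions more heavily than early ones.

```python
# Sketch only: the two evaluation metrics used in the comparison table,
# assuming the standard PHM08 scoring function.
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error between true and predicted RUL."""
    d = np.asarray(y_pred) - np.asarray(y_true)
    return float(np.sqrt(np.mean(d ** 2)))

def phm_score(y_true, y_pred):
    """Asymmetric score: late predictions (d > 0) are penalized more than early ones."""
    d = np.asarray(y_pred) - np.asarray(y_true)
    return float(np.sum(np.where(d < 0, np.exp(-d / 13.0) - 1.0, np.exp(d / 10.0) - 1.0)))
```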