Turbofan Engine Health Prediction Model Based on ESO-BP Neural Network
Abstract
1. Introduction
2. Elite Snake Optimizer
2.1. Snake Optimizer
2.2. Elite-Guided Strategy
2.3. Reverse Learning Mechanism
2.4. Benchmark Test Results
3. Model Construction
3.1. Health Index Degradation
3.2. Introduction to Datasets
3.3. Feature Selection
3.4. Data Normalization and Correlation Analysis
3.5. BP Neural Network
3.6. Parameters of ESO
3.7. ESO Optimization Process
1. Employ grid search to determine the optimal number of neurons in the hidden layer and establish the optimal structure of the BP neural network model.
2. Initialize the parameters of ESO: define the dimensions and value ranges of the search space based on the optimal model structure, and specify the population size, maximum iteration count, and fitness function using Equations (10)–(12).
3. Randomly generate the initial population, compute the fitness value of each individual, and identify the elite sets for both the male and female populations.
4. In the early iterations, the populations enter the exploration phase because the food quantity is below 0.25. In this phase, individuals in both the male and female populations randomly select an elite guider and move to positions near it. The exploration phase is presented in Equation (13) [16].
5. In the later iterations, the populations enter the exploitation phase once the food quantity exceeds 0.25. In this phase, the behavior of the populations is governed by the temperature. If the temperature is above the threshold, both male and female individuals move toward the population's optimal position (the food). If the temperature is below the threshold, the male and female populations fight with a probability of 40% and mate with a probability of 60%. The fight stage is defined in Equations (14) and (15), the mating stage in Equations (16) and (17), and the update of the worst individual in Equation (18).
6. Generate reverse populations using Equation (4). If a reverse individual has better fitness, ESO replaces the original agent with it.
7. Continuously adjust the positions of the snake agents until the maximum iteration count is reached, obtaining the optimal spatial position.
8. Based on the coordinates of the optimal solution, set the weights from the input layer to the hidden layer, the biases of the hidden layer, the weights from the hidden layer to the output layer, and the biases of the output layer of the BP neural network (a sketch of this mapping follows the list).
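To make steps 2 and 8 concrete, the sketch below shows one plausible way to map a flat ESO solution vector onto the weights and biases of a single-hidden-layer BP network and to score a candidate by RMSE. The vector layout, the tanh activation, and the function names are illustrative assumptions, not the authors' exact implementation; the actual fitness is defined by Equations (10)–(12), which are not reproduced in this excerpt.

```python
import numpy as np

def unpack(vec, n_in, n_hid, n_out):
    """Split a flat solution vector into BP weights and biases.
    Assumed layout: [W1 | b1 | W2 | b2], so the ESO search-space
    dimension is d = n_in*n_hid + n_hid + n_hid*n_out + n_out."""
    i = 0
    W1 = vec[i:i + n_in * n_hid].reshape(n_hid, n_in); i += n_in * n_hid
    b1 = vec[i:i + n_hid].reshape(n_hid, 1);           i += n_hid
    W2 = vec[i:i + n_hid * n_out].reshape(n_out, n_hid); i += n_hid * n_out
    b2 = vec[i:i + n_out].reshape(n_out, 1)
    return W1, b1, W2, b2

def rmse_fitness(vec, X, y, n_hid):
    """Fitness of a candidate: RMSE of the network built from vec.
    X has shape (n_in, m); y has shape (m,). tanh hidden units and a
    linear output are assumptions, not the authors' stated choices."""
    W1, b1, W2, b2 = unpack(vec, X.shape[0], n_hid, 1)
    H = np.tanh(W1 @ X + b1)        # hidden-layer activations
    y_hat = (W2 @ H + b2).ravel()   # linear output layer
    return np.sqrt(np.mean((y_hat - y) ** 2))
```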
Algorithm 1 The pseudo-code of ESO

    Stage 1. Initialization
        Initialize the problem setting (dimension d, search bounds, population size n, maximum iterations T).
        Generate the ESO population.
        Calculate the fitness of each individual of the population.
        Divide the ESO population into two equal groups. Generate the elite-guided set.
    Stage 2. ESO Iteration
        for t = 1 to T do
            Calculate Temp using Equation (1).
            Calculate Q using Equation (2). if Q > 1 then Q = 1 end if
            Calculate Elite 4 using Equation (3). Select Elite 1, Elite 2, and Elite 3 according to the fitness.
            if Q < 0.25 then
                Perform exploration using Equation (13).
            else if Temp > threshold then
                Perform exploitation: all individuals move toward the food.
            else if rand < 0.4 then
                Male population fights with the female population using Equations (14) and (15).
            else
                Snakes mate using Equations (16) and (17). Update the worst individual using Equation (18).
            end if
            Generate reverse populations using Equation (4). Calculate the fitness of the reverse individuals.
            if a reverse individual has better fitness then replace the original individual with it end if
        end for
        Return the optimal solution.
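The condensed NumPy sketch below mirrors the control flow of Algorithm 1. Since Equations (1)–(4) and (13)–(18) are not reproduced in this excerpt, the update rules are simplified stand-ins: Temp and Q follow the definitions of the original SO paper [16], the 0.6 temperature threshold likewise comes from SO, and the exploration, fight, mating, worst-individual, and reverse-learning steps are schematic approximations of the cited equations.

```python
import numpy as np

rng = np.random.default_rng(0)

def eso(fitness, d, lb, ub, n=10, T=100):
    """Simplified ESO loop following Algorithm 1 (a sketch, not the
    authors' exact update equations)."""
    X = rng.uniform(lb, ub, (n, d))               # initial population
    fit = np.apply_along_axis(fitness, 1, X)
    half = n // 2                                 # males X[:half], females X[half:]
    for t in range(1, T + 1):
        temp = np.exp(-t / T)                     # temperature, Eq. (1) in SO
        q = min(0.5 * np.exp((t - T) / T), 1.0)   # food quantity, Eq. (2), capped at 1
        elites = X[np.argsort(fit)[:4]].copy()    # elite-guided set (Elites 1-4)
        food = X[np.argmin(fit)].copy()
        for i in range(n):
            if q < 0.25:                          # exploration near a random elite guider
                guide = elites[rng.integers(4)]
                X[i] = guide + 0.05 * (ub - lb) * rng.standard_normal(d)
            elif temp > 0.6:                      # exploitation: move toward the food
                X[i] += rng.random() * (food - X[i])
            elif rng.random() < 0.4:              # fight (40% chance when it is cold)
                rivals = slice(half, n) if i < half else slice(0, half)
                X[i] += rng.random() * (X[rivals][np.argmin(fit[rivals])] - X[i])
            else:                                 # mate (60% chance when it is cold)
                j = rng.integers(half, n) if i < half else rng.integers(0, half)
                X[i] += rng.random() * (X[j] - X[i])
        X = np.clip(X, lb, ub)
        fit = np.apply_along_axis(fitness, 1, X)
        worst = np.argmax(fit)                    # resample the worst individual
        X[worst] = rng.uniform(lb, ub, d)
        fit[worst] = fitness(X[worst])
        X_rev = lb + ub - X                       # reverse (opposition-based) population
        fit_rev = np.apply_along_axis(fitness, 1, X_rev)
        better = fit_rev < fit                    # keep reverse individuals that improve
        X[better], fit[better] = X_rev[better], fit_rev[better]
    best = np.argmin(fit)
    return X[best], fit[best]
```

With the settings reported in the ESO parameters table (population size 10, 100 iterations, solution space [−2, 2]), a hypothetical call would be `eso(lambda v: rmse_fitness(v, X_train, y_train, n_hid), d, -2.0, 2.0)`, after which the returned vector seeds the BP network as in step 8.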
4. Simulation Experiments and Result Analysis
4.1. Description to Hyperparameters
4.2. Model Evaluation Metrics
4.3. Optimization Results
4.4. Prediction Results
4.5. Comparative Experiments
4.6. Ablation Experiment
5. Discussion
1. Proper initialization of neural network weights has been demonstrated to accelerate convergence, prevent entrapment in local optima, simplify network architecture, identify feature importance, and improve prediction performance [45]. In this study, the initial weights and biases were adjusted before training to improve the convergence accuracy and stability of the BP neural networks. ESO exhibited superior convergence compared with SO on the benchmark functions, and likewise, ESO-BP outperformed SO-BP in the ablation experiment. This suggests that better initial weights and biases yield higher accuracy and stability in BP neural networks. ESO determined the initial weights and biases that minimized the RMSE over its iteration process, preventing the BP neural network from becoming stuck in local optima or suffering stability issues caused by random parameter selection.
2. In contrast to RUL prediction, Jiang et al. [46] developed a health index prediction model for turbofan engines in 2023. They combined several methods to create multiple hybrid models, including empirical mode decomposition (EMD), variational mode decomposition (VMD), a scale-adaptive attention mechanism (SAA), a dynamic step size-based fruit fly optimization algorithm (DSSFOA), a bidirectional long short-term memory network (BiLSTM), support vector regression (SVR), and LSTM. According to their study, the average accuracy over 10 trials of the hybrid methods was significantly lower than that of single methods. The average accuracies on the FD002 sub-dataset are compared in Table 7.
3. The Levenberg–Marquardt (LM) algorithm is particularly effective for training BP neural networks on regression problems [47]. It provides numerical solutions for nonlinear minimization, combining the strengths of the Gauss–Newton algorithm and the gradient descent method by dynamically adjusting a damping parameter during execution to address the limitations of both. When the error is decreasing rapidly, the LM algorithm behaves more like the Gauss–Newton algorithm; otherwise, it behaves more like gradient descent [43] (a minimal sketch of this mechanism follows the list).
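As a minimal illustration of the damping mechanism described in point 3, the sketch below implements a textbook LM step for a generic least-squares problem; it is not the specific LM variant used in the experiments.

```python
import numpy as np

def lm_step(residual, jacobian, theta, mu):
    """One Levenberg-Marquardt update. Small mu makes the step
    Gauss-Newton-like; large mu makes it gradient-descent-like."""
    r = residual(theta)                       # residuals, shape (m,)
    J = jacobian(theta)                       # Jacobian, shape (m, p)
    A = J.T @ J + mu * np.eye(J.shape[1])     # damped normal equations
    theta_new = theta - np.linalg.solve(A, J.T @ r)
    # Accept the step and relax damping if the squared error decreased;
    # otherwise reject it and increase damping.
    if np.sum(residual(theta_new) ** 2) < np.sum(r ** 2):
        return theta_new, mu / 10
    return theta, mu * 10

# Toy usage: fit y = exp(a * x) to data generated with a = -1.5.
x = np.linspace(0.0, 1.0, 20)
y = np.exp(-1.5 * x)
res = lambda th: np.exp(th[0] * x) - y
jac = lambda th: (x * np.exp(th[0] * x)).reshape(-1, 1)
theta, mu = np.array([0.0]), 1e-2
for _ in range(50):
    theta, mu = lm_step(res, jac, theta, mu)
print(theta)  # converges toward [-1.5]
```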
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Zhou, H.; Farsi, M.; Harrison, A.; Parlikad, A.K.; Brintrup, A. Civil aircraft engine operation life resilient monitoring via usage trajectory mapping on the reliability contour. Reliab. Eng. Syst. Saf. 2023, 230, 108878.
- Kordestani, M.; Saif, M.; Orchard, M.E.; Razavi-Far, R.; Khorasani, K. Failure prognosis and applications—A survey of recent literature. IEEE Trans. Reliab. 2019, 70, 728–748.
- Zio, E. Prognostics and health management (PHM): Where are we and where do we (need to) go in theory and practice. Reliab. Eng. Syst. Saf. 2022, 218, 108119.
- Najera-Flores, D.A.; Hu, Z.; Chadha, M.; Todd, M.D. A physics-constrained Bayesian neural network for battery remaining useful life prediction. Appl. Math. Modell. 2023, 122, 42–59.
- Li, X.; Shao, H.; Jiang, H.; Xiang, J. Modified Gaussian convolutional deep belief network and infrared thermal imaging for intelligent fault diagnosis of rotor-bearing system under time-varying speeds. Struct. Health Monit. 2022, 21, 339–353.
- Muneer, A.; Taib, S.M.; Naseer, S.; Ali, R.F.; Aziz, I.A. Data-driven deep learning-based attention mechanism for remaining useful life prediction: Case study application to turbofan engine analysis. Electronics 2021, 10, 2453.
- Ren, L.; Qin, H.; Xie, Z.; Li, B.; Xu, K. Aero-engine remaining useful life estimation based on multi-head networks. IEEE Trans. Instrum. Meas. 2022, 71, 3505810.
- Chen, X.; Zeng, M. Convolution-graph attention network with sensor embeddings for remaining useful life prediction of turbofan engines. IEEE Sens. J. 2023, 23, 15786–15794.
- Li, J.; Jia, Y.; Niu, M.; Zhu, W.; Meng, F. Remaining useful life prediction of turbofan engines using CNN-LSTM-SAM approach. IEEE Sens. J. 2023, 23, 10241–10251.
- Li, W.; Zhang, L.-C.; Wu, C.-H.; Wang, Y.; Cui, Z.-X.; Niu, C. A data-driven approach to RUL prediction of tools. Adv. Manuf. 2024, 12, 6–18.
- Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning representations by back-propagating errors. Nature 1986, 323, 533–536.
- Ji, C.; Ding, H. Optimizing back-propagation neural network to retrieve sea surface temperature based on improved sparrow search algorithm. Remote Sens. 2023, 15, 5722.
- Lv, D.; Liu, G.; Ou, J.; Wang, S.; Gao, M. Prediction of GPS satellite clock offset based on an improved particle swarm algorithm optimized BP neural network. Remote Sens. 2022, 14, 2407.
- Yu, L.; Xie, L.; Liu, C.; Yu, S.; Guo, Y.; Yang, K. Optimization of BP neural network model by chaotic krill herd algorithm. Alex. Eng. J. 2022, 61, 9769–9777.
- Lai, X.; Tu, Y.; Yan, B.; Wu, L.; Liu, X. A method for predicting ground pressure in Meihuajing coal mine based on improved BP neural network by immune algorithm-particle swarm optimization. Processes 2024, 12, 147.
- Hashim, F.A.; Hussien, A.G. Snake optimizer: A novel meta-heuristic optimization algorithm. Knowl.-Based Syst. 2022, 242, 108320.
- Deng, L.; Liu, S. Snow ablation optimizer: A novel metaheuristic technique for numerical optimization and engineering design. Expert Syst. Appl. 2023, 225, 120069.
- Rahman, M.A.; Sokkalingam, R.; Othman, M.; Biswas, K.; Abdullah, L.; Abdul Kadir, E. Nature-inspired metaheuristic techniques for combinatorial optimization problems: Overview and recent advances. Mathematics 2021, 9, 2633.
- Li, H.; Xu, G.; Chen, B.; Huang, S.; Xia, Y.; Chai, S. Dual-mutation mechanism-driven snake optimizer for scheduling multiple budget constrained workflows in the cloud. Appl. Soft Comput. 2023, 149, 110966.
- Xue, J.; Shen, B. A novel swarm intelligence optimization approach: Sparrow search algorithm. Syst. Sci. Control Eng. 2020, 8, 22–34.
- Zhong, C.; Li, G.; Meng, Z. Beluga whale optimization: A novel nature-inspired metaheuristic algorithm. Knowl.-Based Syst. 2022, 251, 109215.
- Xue, J.; Shen, B. Dung beetle optimizer: A new meta-heuristic algorithm for global optimization. J. Supercomput. 2023, 79, 7305–7336.
- Abduljabbar, D.A.; Hashim, S.Z.M.; Sallehuddin, R. Nature-inspired optimization algorithms for community detection in complex networks: A review and future trends. Telecommun. Syst. 2020, 74, 225–252.
- Miao, K.; Mao, X.; Li, C. Individualism of particles in particle swarm optimization. Appl. Soft Comput. 2019, 83, 105619.
- Talbi, E.-G. Machine learning into metaheuristics: A survey and taxonomy. ACM Comput. Surv. 2021, 54, 129.
- Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
- Cai, Y.; Wu, D.; Liu, S.; Fu, S.; Liu, P. Enhancing differential evolution on continuous optimization problems by detecting promising leaders. IEEE Access 2020, 8, 226557–226578.
- Yu, F.; Guan, J.; Wu, H.; Chen, Y.; Xia, X. Lens imaging opposition-based learning for differential evolution with Cauchy perturbation. Appl. Soft Comput. 2024, 152, 111211.
- Sahoo, S.K.; Saha, A.K.; Nama, S.; Masdari, M. An improved moth flame optimization algorithm based on modified dynamic opposite learning strategy. Artif. Intell. Rev. 2023, 56, 2811–2869.
- Wu, G.; Mallipeddi, R.; Suganthan, P. Problem Definitions and Evaluation Criteria for the CEC 2017 Competition and Special Session on Constrained Single Objective Real-Parameter Optimization; Technical Report; Nanyang Technological University: Singapore, 2016; pp. 1–18.
- Storn, R.; Price, K. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. J. Global Optim. 1997, 11, 341–359.
- Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
- Meng, X.; Liu, Y.; Gao, X.; Zhang, H. A new bio-inspired algorithm: Chicken swarm optimization. In Proceedings of the Advances in Swarm Intelligence: 5th International Conference, ICSI 2014, Hefei, China, 17–20 October 2014; Part I, pp. 86–94.
- Alomari, Y.; Ando, M.; Baptista, M.L. Advancing aircraft engine RUL predictions: An interpretable integrated approach of feature engineering and aggregated feature importance. Sci. Rep. 2023, 13, 13466.
- Hu, K.; Cheng, Y.; Wu, J.; Zhu, H.; Shao, X. Deep bidirectional recurrent neural networks ensemble for remaining useful life prediction of aircraft engine. IEEE Trans. Cybern. 2023, 53, 2531–2543.
- Hou, M.; Pi, D.; Li, B. Similarity-based deep learning approach for remaining useful life prediction. Measurement 2020, 159, 107788.
- Wang, C.; Zhu, Z.; Lu, N.; Cheng, Y.; Jiang, B. A data-driven degradation prognostic strategy for aero-engine under various operational conditions. Neurocomputing 2021, 462, 195–207.
- Saxena, A.; Goebel, K.; Simon, D.; Eklund, N. Damage propagation modeling for aircraft engine run-to-failure simulation. In Proceedings of the 2008 International Conference on Prognostics and Health Management, Denver, CO, USA, 6–9 October 2008; pp. 1–9.
- Sun, X.; Chai, J. Random forest feature selection for partial label learning. Neurocomputing 2023, 561, 126870.
- Qin, L.; Yang, G.; Sun, Q. Maximum correlation Pearson correlation coefficient deconvolution and its application in fault diagnosis of rolling bearings. Measurement 2022, 205, 112162.
- Reyad, M.; Sarhan, A.M.; Arafa, M. A modified Adam algorithm for deep neural network optimization. Neural Comput. Appl. 2023, 35, 17095–17112.
- Andrei, N. Scaled conjugate gradient algorithms for unconstrained optimization. Comput. Optim. Appl. 2007, 38, 401–416.
- Ariizumi, S.; Yamakawa, Y.; Yamashita, N. Convergence properties of Levenberg–Marquardt methods with generalized regularization terms. Appl. Math. Comput. 2024, 463, 128365.
- Zhang, J.; Hu, J.; Liu, J. Neural network with multiple connection weights. Pattern Recognit. 2020, 107, 107481.
- Narkhede, M.V.; Bartakke, P.P.; Sutaone, M.S. A review on weight initialization strategies for neural networks. Artif. Intell. Rev. 2022, 55, 291–322.
- Jiang, W.; Xu, Y.; Chen, Z.; Zhang, N.; Xue, X.; Liu, J.; Zhou, J. A feature-level degradation measurement method for composite health index construction and trend prediction modeling. Measurement 2023, 206, 112324.
- Smith, J.S.; Wu, B.; Wilamowski, B.M. Neural network training with Levenberg–Marquardt and adaptable weight compression. IEEE Trans. Neural Networks Learn. Syst. 2018, 30, 580–587.
Sub-Dataset | FD001 Training Set | FD001 Test Set | FD002 Training Set | FD002 Test Set | FD003 Training Set | FD003 Test Set | FD004 Training Set | FD004 Test Set
---|---|---|---|---|---|---|---|---
Operating conditions | 1 | 1 | 6 | 6 | 1 | 1 | 6 | 6
Fault modes | 1 | 1 | 1 | 1 | 2 | 2 | 2 | 2
Number of engines | 100 | 100 | 260 | 259 | 100 | 100 | 249 | 248
Number of operating cycles | 20,631 | 13,096 | 53,759 | 33,991 | 24,720 | 16,596 | 61,249 | 41,214
Mean of engine operating cycles | 206.31 | 130.96 | 206.77 | 131.24 | 247.20 | 165.96 | 245.98 | 166.19
Standard error of engine operating cycles | 0.32 | 0.47 | 0.20 | 0.34 | 0.55 | 0.67 | 0.30 | 0.45
Maximum of engine operating cycles | 362 | 303 | 378 | 367 | 525 | 475 | 543 | 486
Minimum of engine operating cycles | 128 | 31 | 128 | 21 | 145 | 38 | 128 | 19
Feature Description | Units | Symbol | FD001 | FD002 | FD003 | FD004
---|---|---|---|---|---|---
Number of operating cycles | - | CYCLE | 5.574 | 10.141 | 5.386 | 9.868 |
Height | km | H | 0.090 | 0.867 | 0.047 | 0.231 |
Mach | - | MA | 0.086 | 0.343 | 0.002 | 0.493 |
Throttle resolver angle | ° | TRA | 0.000 | 0.141 | 0.000 | 0.146 |
Total temperature at fan inlet | °R | T2 | 0.000 | 0.193 | 0.000 | 0.224 |
Total temperature at LPC outlet | °R | T24 | 1.625 | 0.710 | 0.800 | 1.855 |
Total temperature at HPC outlet | °R | T30 | 1.134 | 1.525 | 1.191 | 0.301 |
Total temperature at LPT outlet | °R | T50 | 1.285 | 0.866 | 0.891 | 1.297 |
Pressure at fan inlet | psia | P2 | 0.000 | 0.146 | 0.000 | 0.118 |
Total pressure in bypass-duct | psia | P15 | 0.017 | 0.291 | 1.973 | 0.655 |
Total pressure at HPC outlet | psia | P30 | 1.334 | 0.698 | 0.966 | 0.812 |
Physical fan speed | rpm | NF | 1.649 | 0.766 | 0.825 | 1.085 |
Physical core speed | rpm | NC | 1.137 | 0.645 | 1.011 | 0.928 |
Engine pressure ratio (P50/P2) | - | EPR | 0.000 | 0.149 | 0.264 | 0.518 |
Static pressure at HPC outlet | psia | PS30 | 0.971 | 1.569 | 1.004 | 1.026 |
Ratio of fuel flow to Ps30 | pps/psi | PHI | 1.357 | 0.497 | 1.586 | 0.886 |
Corrected fan speed | rpm | NRF | 1.261 | 1.378 | 0.813 | 2.271 |
Corrected core speed | rpm | NRC | 0.799 | 0.959 | 1.290 | 1.124 |
Bypass ratio | - | BPR | 1.371 | 1.084 | 1.158 | 1.835 |
Burner fuel-air ratio | - | FARB | 0.000 | 0.677 | 0.000 | 0.355 |
Bleed enthalpy | - | HT_BLEED | 0.829 | 0.925 | 0.317 | 0.594 |
Demanded fan speed | rpm | NF_DMD | 0.000 | 0.259 | 0.000 | 0.180 |
Demanded corrected fan speed | rpm | PCNFR_DMD | 0.000 | 0.025 | 0.000 | 0.101 |
HPT coolant bleed | lbm/s | W31 | 0.752 | 0.554 | 1.652 | 0.382 |
LPT coolant bleed | lbm/s | W32 | 1.636 | 0.816 | 1.548 | 1.009 |
Feature | Pearson Correlation Coefficient | Feature | Pearson Correlation Coefficient |
---|---|---|---|
CYCLE | 1 | EPR | 0 |
H | −0.037 | PS30 | 0.840 |
MA | 0.013 | PHI | −0.824 |
TRA | 0 | NRF | 0.789 |
T2 | 0 | NRC | −0.057 |
T24 | 0.692 | BPR | 0.739 |
T30 | 0.672 | FARB | 0 |
T50 | 0.809 | HT_BLEED | 0.682 |
P2 | 0 | NF_DMD | 0 |
P15 | 0.121 | PCNFR_DMD | 0 |
P30 | −0.787 | W31 | −0.743 |
NF | 0.780 | W32 | −0.714 |
NC | 0.459 | |
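The table above can be read as a screening rule: features with a coefficient of 0 carry no degradation information, while those with large |r| track the health index closely. A sketch of such a screening step follows; the function name and the 0.3 cutoff are assumptions for illustration, since the excerpt does not state the threshold actually used.

```python
import numpy as np

def pearson_select(X, health_index, names, threshold=0.3):
    """Keep features whose |Pearson r| against the health index
    exceeds a cutoff; constant sensors (r undefined) are dropped."""
    kept = []
    for j, name in enumerate(names):
        col = X[:, j]
        if np.std(col) == 0:                       # e.g., TRA, T2, P2 in FD001
            continue
        r = np.corrcoef(col, health_index)[0, 1]   # Pearson correlation coefficient
        if abs(r) >= threshold:
            kept.append((name, r))
    return sorted(kept, key=lambda t: -abs(t[1]))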
Parameter Name | Value |
---|---|
Limit of the solution space | [−2, 2] |
Population size | 10 |
Iterations | 100 |
Dataset | Model | RMSE Avg | RMSE Std | R² Avg | R² Std | Score Avg | Score Std
---|---|---|---|---|---|---|---
FD001 | CNN | 6.641 × 10^−2 | 4.346 × 10^−3 | 9.163 × 10^−1 | 1.158 × 10^−2 | 5.728 × 10^1 | 4.813 × 10^0
 | RNN | 6.773 × 10^−2 | 2.841 × 10^−3 | 9.132 × 10^−1 | 7.560 × 10^−3 | 5.764 × 10^1 | 3.445 × 10^0
 | LSTM | 6.253 × 10^−2 | 1.722 × 10^−3 | 9.261 × 10^−1 | 4.219 × 10^−3 | 5.152 × 10^1 | 2.342 × 10^0
 | BP(SCG) | 6.337 × 10^−2 | 1.346 × 10^−3 | 9.241 × 10^−1 | 3.249 × 10^−3 | 5.175 × 10^1 | 1.812 × 10^0
 | ESO-BP(SCG) | 6.223 × 10^−2 | 7.070 × 10^−4 | 9.268 × 10^−1 | 1.664 × 10^−3 | 5.021 × 10^1 | 8.930 × 10^−1
 | BP(LM) | 6.070 × 10^−2 | 4.840 × 10^−4 | 9.304 × 10^−1 | 1.115 × 10^−3 | 4.891 × 10^1 | 5.164 × 10^−1
 | ESO-BP(LM) | 6.040 × 10^−2 | 4.160 × 10^−4 | 9.311 × 10^−1 | 9.580 × 10^−4 | 4.859 × 10^1 | 4.111 × 10^−1
FD002 | CNN | 8.233 × 10^−2 | 6.792 × 10^−3 | 8.839 × 10^−1 | 2.008 × 10^−2 | 1.854 × 10^2 | 1.569 × 10^1
 | RNN | 8.645 × 10^−2 | 7.401 × 10^−3 | 8.719 × 10^−1 | 2.294 × 10^−2 | 1.976 × 10^2 | 1.629 × 10^1
 | LSTM | 7.543 × 10^−2 | 6.836 × 10^−3 | 9.024 × 10^−1 | 1.938 × 10^−2 | 1.675 × 10^2 | 1.461 × 10^1
 | BP(SCG) | 7.427 × 10^−2 | 4.604 × 10^−3 | 9.058 × 10^−1 | 1.183 × 10^−2 | 1.656 × 10^2 | 1.389 × 10^1
 | ESO-BP(SCG) | 7.195 × 10^−2 | 3.950 × 10^−3 | 9.116 × 10^−1 | 9.829 × 10^−3 | 1.583 × 10^2 | 1.283 × 10^1
 | BP(LM) | 6.528 × 10^−2 | 2.730 × 10^−4 | 9.275 × 10^−1 | 6.080 × 10^−4 | 1.380 × 10^2 | 7.994 × 10^−1
 | ESO-BP(LM) | 6.515 × 10^−2 | 1.880 × 10^−4 | 9.278 × 10^−1 | 4.160 × 10^−4 | 1.374 × 10^2 | 4.245 × 10^−1
FD003 | CNN | 7.155 × 10^−2 | 4.230 × 10^−3 | 9.098 × 10^−1 | 1.164 × 10^−2 | 7.954 × 10^1 | 6.028 × 10^0
 | RNN | 7.795 × 10^−2 | 3.068 × 10^−3 | 8.932 × 10^−1 | 8.565 × 10^−3 | 8.724 × 10^1 | 3.304 × 10^0
 | LSTM | 7.133 × 10^−2 | 1.807 × 10^−3 | 9.106 × 10^−1 | 4.604 × 10^−3 | 7.732 × 10^1 | 1.962 × 10^0
 | BP(SCG) | 6.892 × 10^−2 | 1.986 × 10^−3 | 9.165 × 10^−1 | 4.976 × 10^−3 | 7.478 × 10^1 | 3.088 × 10^0
 | ESO-BP(SCG) | 6.729 × 10^−2 | 7.630 × 10^−4 | 9.205 × 10^−1 | 1.812 × 10^−3 | 7.269 × 10^1 | 1.384 × 10^0
 | BP(LM) | 6.779 × 10^−2 | 6.860 × 10^−4 | 9.193 × 10^−1 | 1.640 × 10^−3 | 7.187 × 10^1 | 7.749 × 10^−1
 | ESO-BP(LM) | 6.744 × 10^−2 | 4.090 × 10^−4 | 9.202 × 10^−1 | 9.690 × 10^−4 | 7.146 × 10^1 | 5.216 × 10^−1
FD004 | CNN | 9.433 × 10^−2 | 5.466 × 10^−3 | 8.447 × 10^−1 | 1.851 × 10^−2 | 2.600 × 10^2 | 1.525 × 10^1
 | RNN | 9.982 × 10^−2 | 8.187 × 10^−3 | 8.256 × 10^−1 | 2.961 × 10^−2 | 2.756 × 10^2 | 1.847 × 10^1
 | LSTM | 9.214 × 10^−2 | 4.700 × 10^−3 | 8.520 × 10^−1 | 1.556 × 10^−2 | 2.518 × 10^2 | 1.156 × 10^1
 | BP(SCG) | 8.889 × 10^−2 | 3.833 × 10^−3 | 8.623 × 10^−1 | 1.210 × 10^−2 | 2.462 × 10^2 | 1.314 × 10^1
 | ESO-BP(SCG) | 8.615 × 10^−2 | 2.283 × 10^−3 | 8.708 × 10^−1 | 6.897 × 10^−3 | 2.365 × 10^2 | 7.276 × 10^0
 | BP(LM) | 7.546 × 10^−2 | 6.580 × 10^−4 | 9.010 × 10^−1 | 1.744 × 10^−3 | 1.973 × 10^2 | 1.875 × 10^0
 | ESO-BP(LM) | 7.529 × 10^−2 | 4.020 × 10^−4 | 9.014 × 10^−1 | 1.054 × 10^−3 | 1.970 × 10^2 | 1.345 × 10^0
Dataset | Model | RMSE | R² | Score
---|---|---|---|---
FD001 | BP | 6.070 × 10^−2 | 9.304 × 10^−1 | 4.891 × 10^1
 | SO-BP | 6.065 × 10^−2 | 9.305 × 10^−1 | 4.859 × 10^1
 | ESO-BP | 6.040 × 10^−2 | 9.311 × 10^−1 | 4.859 × 10^1
FD002 | BP | 6.528 × 10^−2 | 9.275 × 10^−1 | 1.380 × 10^2
 | SO-BP | 6.524 × 10^−2 | 9.276 × 10^−1 | 1.380 × 10^2
 | ESO-BP | 6.515 × 10^−2 | 9.278 × 10^−1 | 1.374 × 10^2
FD003 | BP | 6.779 × 10^−2 | 9.193 × 10^−1 | 7.187 × 10^1
 | SO-BP | 6.761 × 10^−2 | 9.197 × 10^−1 | 7.182 × 10^1
 | ESO-BP | 6.744 × 10^−2 | 9.202 × 10^−1 | 7.146 × 10^1
FD004 | BP | 7.546 × 10^−2 | 9.010 × 10^−1 | 1.973 × 10^2
 | SO-BP | 7.610 × 10^−2 | 8.993 × 10^−1 | 1.994 × 10^2
 | ESO-BP | 7.529 × 10^−2 | 9.014 × 10^−1 | 1.970 × 10^2
Model | Number of Trials | RMSE
---|---|---|
EMD-SAA-BiLSTM | 10 | 0.101 |
VMD-SAA-BiLSTM | 10 | 0.082 |
VMD-DSSFOA-SVR | 10 | 0.156 |
VMD-DSSFOA-LSTM | 10 | 0.093 |
VMD-DSSFOA-SAA-LSTM | 10 | 0.086 |
ESO-BP | 100 | 0.065 |