Top-Oil Temperature Prediction of Power Transformer Based on Long Short-Term Memory Neural Network with Self-Attention Mechanism Optimized by Improved Whale Optimization Algorithm
Abstract
1. Introduction
2. Power Transformer and Top-Oil Temperature
3. The Proposed IWOA-LSTM-SA Method for Top-Oil Temperature Prediction
3.1. Framework
3.2. LSTM Integrated by SA
3.3. Hyper-Parameters Optimization by IWOA
3.3.1. Encircling Prey
3.3.2. Bubble-Net Attacking Method
- (1)
- Shrinking encircling mechanism: as the control parameter a decreases, A takes a random value in the range [−1, 1]. The new position of a search agent is then defined between its original position and the position of the current best-so-far whale. The calculation is as follows:
- (2)
- Spiral updating position: the WOA uses a logarithmic spiral path to model the bubble-net attack on prey, and the spiral hunting equation is as follows:
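The two bubble-net behaviors above can be sketched as a single position-update step. This is a minimal illustration of the standard WOA update rules, not the authors' exact implementation; the function name, the NumPy realization, and the default parameter values are assumptions.

```python
import numpy as np

def woa_position_update(X, X_best, a, b=1.0, p_threshold=0.5, rng=None):
    """One WOA position update for a single whale (illustrative sketch).

    X           -- current position vector of the whale
    X_best      -- position of the best-so-far whale
    a           -- control parameter, decreased from 2 to 0 over iterations
    b           -- constant defining the shape of the logarithmic spiral
    p_threshold -- probability of choosing encircling over the spiral
                   (0.5 in the original WOA; adaptive in the IWOA)
    """
    rng = rng or np.random.default_rng()
    if rng.random() < p_threshold:
        # Shrinking encircling: A is random in [-a, a], C is random in [0, 2]
        A = 2 * a * rng.random(X.shape) - a
        C = 2 * rng.random(X.shape)
        D = np.abs(C * X_best - X)          # distance to the best whale
        return X_best - A * D
    else:
        # Spiral update: l is a random value in [-1, 1]
        l = rng.uniform(-1, 1)
        D = np.abs(X_best - X)
        return D * np.exp(b * l) * np.cos(2 * np.pi * l) + X_best
```

In a full optimizer this step runs once per whale per iteration, with a (and, in the IWOA, the threshold and b) updated between iterations.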
3.3.3. Search for Prey
3.3.4. Improved Whale Optimization Algorithm
- (1)
- Latin Hypercube Sampling (LHS) initialization of the population: as stated in [25], population initialization plays a crucial role in swarm intelligence optimization algorithms. In the WOA, the population is initialized randomly, which can lead to an uneven population distribution and overlapping individuals [26]. Therefore, it is necessary to optimize the population initialization. The IWOA incorporates LHS to increase the diversity of the initial population, initializing the population more uniformly and efficiently.
- (2)
- Adaptive selection threshold: in the WOA, each whale chooses either the encircling activity or the spiral movement with 50% probability. This fixed split prevents the population from choosing the movement best suited to its current search state [27,28]. In this paper, an adaptive selection threshold replaces the fixed one; the threshold is adjusted automatically throughout the search process according to the problem's characteristics. The calculation is given by the following formula:
- (3)
- Adaptive parameter: in the traditional method, the parameter a decreases linearly from 2 to 0. To enhance local search ability, this study uses a nonlinear strategy to adjust b, which influences the shape of the logarithmic spiral. This significantly improves the effectiveness of local search and the speed of global search, thereby enhancing overall accuracy [29]. At the same time, a relationship between b and the iteration count t is established to achieve adaptive adjustment; Equation (10) is accordingly updated to Equation (16).
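The LHS initialization in improvement (1) can be sketched as follows. This is a generic Latin Hypercube sampler under the usual stratify-then-shuffle scheme; the function name and NumPy realization are illustrative, not the paper's code.

```python
import numpy as np

def lhs_init(pop_size, dim, lower, upper, rng=None):
    """Latin Hypercube Sampling initialization of a population (sketch).

    Each dimension of [lower, upper] is divided into pop_size equal
    intervals; one sample is drawn per interval, and the intervals are
    shuffled independently per dimension. Every stratum of every
    dimension is therefore covered exactly once, giving a more uniform
    spread than plain uniform random initialization.
    """
    rng = rng or np.random.default_rng()
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    # One stratified sample per interval of [0, 1) in each dimension
    u = (np.arange(pop_size)[:, None] + rng.random((pop_size, dim))) / pop_size
    # Shuffle the strata independently along each dimension
    for j in range(dim):
        rng.shuffle(u[:, j])
    return lower + u * (upper - lower)
```

For hyper-parameter optimization, each row of the returned array would encode one candidate setting (e.g., learning rate, hidden units, window length) within its search bounds.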
4. Case Studies and Results Analysis
4.1. Data Source
4.2. Comparison of Algorithm Optimization Results
4.3. One-Step Prediction
4.4. Ablation Experiment
4.5. Multi-Step Forecasting
5. Conclusions
- (1)
- To verify the efficacy of the IWOA, this paper conducts tests with eight test functions. The findings demonstrate that the IWOA outperforms GA, PSO, and WOA in terms of convergence speed and accuracy.
- (2)
- To verify the effectiveness of the proposed model, extensive experiments were conducted using actual operating data. The experimental results indicate that the proposed approach outperforms current state-of-the-art methods. On Dataset 1, the model achieved reductions in RMSE of 15.31%, 12.64%, 7.41%, 11.94%, 6.44%, and 1.98% compared to the BP, CNN, GRU, LSTM, LSTM-SA, and WOA-LSTM-SA methods, respectively. Similarly, on Dataset 2, the model demonstrated significant improvements, with RMSE reductions of 18.85%, 9.09%, 1.19%, 14.29%, 7.42%, and 1.06% compared to the same benchmark methods.
- (3)
- The proposed model performs effectively across various prediction steps compared to benchmark models. Specifically, for the 3-step prediction, the RMSE of the proposed model is 1.537 and 1.015 for Dataset 1 and Dataset 2, respectively, reflecting reductions of 12.83% and 38.65% compared to the BP model, 6.98% and 20.89% compared to the CNN model, 3.75% and 13.62% compared to the GRU model, 4.24% and 27.16% compared to the LSTM model, 1.60% and 17.93% compared to the LSTM-SA model, and 1.16% and 4.34% compared to the WOA-LSTM-SA model. For the 5-step prediction, the RMSE of the proposed model is 1.714 and 1.634, representing reductions of 12.60% and 11.11% compared to the BP model, 7.61% and 15.89% compared to the CNN model, 6.49% and 17.30% compared to the GRU model, 5.19% and 14.14% compared to the LSTM model, 4.56% and 12.82% compared to the LSTM-SA model, and 3.06% and 1.80% compared to the WOA-LSTM-SA model.
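The percentage reductions quoted above follow the usual relative-improvement formula. As a check, a one-line helper (illustrative, not from the paper) reproduces the Dataset 1 one-step figures from Table entries:

```python
def rmse_reduction(baseline_rmse, proposed_rmse):
    """Percentage RMSE reduction of the proposed model vs. a baseline."""
    return 100.0 * (baseline_rmse - proposed_rmse) / baseline_rmse

# Dataset 1, one-step: BP RMSE 1.698 vs. IWOA-LSTM-SA RMSE 1.438
print(round(rmse_reduction(1.698, 1.438), 2))  # → 15.31, matching the text
```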
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A
References
- Xu, X.; He, Y.; Li, X.; Peng, F.; Xu, Y. Overload Capacity for Distribution Transformers with Natural-Ester Immersed High-Temperature Resistant Insulating Paper. Power Sys. Technol. 2018, 42, 1001–1006. [Google Scholar]
- Wang, S.; Gao, M.; Zhuo, R. Research on high efficient order reduction algorithm for temperature coupling simulation model of transformer. High Volt. Appar. 2023, 59, 115–126. [Google Scholar]
- Liu, X.; Xie, J.; Luo, Y. A novel power transformer fault diagnosis method based on data augmentation for KPCA and deep residual network. Energy Rep. 2023, 9, 620–627. [Google Scholar] [CrossRef]
- Chen, T.; Chen, Y.; Li, X. Prediction for dissolved gas concentration in power transformer oil based on CEEMDAN-SG-BiLSTM. High Volt. Appar. 2023, 59, 168–175. [Google Scholar]
- Zang, C.; Zeng, J.; Li, P. Intelligent diagnosis model of mechanical fault for power transformer based on SVM algorithm. High Volt. Appar. 2023, 59, 216–222. [Google Scholar]
- Ji, H.; Wu, X.; Wang, H. A New Prediction Method of Transformer Oil Temperature Based on C-Prophet. Adv. Power Syst. Hyd. Eng. 2023, 39, 48–55. [Google Scholar]
- Tan, F.; Xu, G.; Zhang, P. Research on Top Oil Temperature Prediction Method of Similar Day Transformer Based on Topsis and Entropy Method. Elect. Power Sci. Eng. 2021, 37, 62–69. [Google Scholar]
- Amoda, O.A.; Tylavsky, D.J.; McCulla, G.A.; Knuth, W.A. Acceptability of three transformer hottest-spot temperature models. IEEE Trans. Power Deliv. 2011, 27, 13–22. [Google Scholar] [CrossRef]
- Zhou, L.; Wang, J.; Wang, L.; Yuan, S.; Huang, L.; Wang, D.; Guo, L. A Method for Hot-Spot Temperature Prediction and Thermal Capacity Estimation for Traction Transformers in High-Speed Railway Based on Genetic Programming. IEEE Trans. Transp. Electrif. 2019, 5, 1319–1328. [Google Scholar] [CrossRef]
- Deng, Y.; Ruan, J.; Quan, Y.; Gong, R.; Huang, D.; Duan, C.; Xie, Y. A Method for Hot Spot Temperature Prediction of a 10 kV Oil-Immersed Transformer. IEEE Access 2019, 7, 107380. [Google Scholar] [CrossRef]
- Zhao, B.; Zhang, X. Parameter Identification of Transformer Top Oil Temperature Model and Prediction of Top Oil Temperature. High Volt. Eng. 2004, 30, 9–10. [Google Scholar]
- Wang, H.; Su, P.; Wang, X. Prediction of Surface Temperatures of Large Oil-Immersed Power Transformers. J. Tsinghua Univ. Sci. Technol. 2005, 45, 569–572. [Google Scholar]
- Tan, M.; Hu, C.; Chen, J.; Wang, L.; Li, Z. Multi-node load forecasting based on multi-task learning with modal feature extraction. Eng. Appl. Artif. Intell. 2022, 112, 104856. [Google Scholar] [CrossRef]
- Shang, Y.; Li, S. FedPT-V2G: Security enhanced federated transformer learning for real-time V2G dispatch with non-IID data. Appl. Energy 2024, 358, 122626. [Google Scholar] [CrossRef]
- Bai, M.; Yao, P.; Dong, H.; Fang, Z.; Jin, W.; Yang, X.; Liu, J.; Yu, D. Spatial-temporal characteristics analysis of solar irradiance forecast errors in Europe and North America. Energy 2024, 297, 131187. [Google Scholar] [CrossRef]
- Qing, H.; Jennie, S.; Daniel, J. Prediction of top-oil temperature for transformers using neural network. IEEE Trans. Power Deliv. 2000, 15, 1205–1211. [Google Scholar]
- Tan, F.; Chen, H.; He, J. Top oil temperature forecasting of UHV transformer based on path analysis and similar time. Elect. Power Autom. Equip. 2021, 41, 217–224. [Google Scholar]
- Li, S.; Xue, J.; Wu, M.; Xie, R.; Jin, B.; Zhang, H.; Li, Q. Prediction of Transformer Top-oil Temperature with the Improved Weighted Support Vector Regression Based on Particle Swarm Optimization. High Volt. Appar. 2021, 57, 103–109. [Google Scholar]
- Tan, F.L.; Xu, G.; Li, Y.F.; Chen, H.; He, J.H. A method of transformer top oil temperature forecasting based on similar day and similar hour. Elect. Power Eng. Tech. 2022, 41, 193–200. [Google Scholar]
- Yi, Y. Research on Prediction Method of Transformer Top-Oil Temperature Based on Assisting Dispatchers in Decision-Making. Master’s Thesis, Southwest Jiaotong University, Chengdu, China, 2017. [Google Scholar]
- Gharehchopogh, F.S.; Gholizadeh, H. A comprehensive survey: Whale Optimization Algorithm and its applications. Swarm Evol. Comput. 2019, 48, 1–24. [Google Scholar] [CrossRef]
- Brodzicki, A.; Piekarski, M.; Jaworek-Korjakowska, J. The whale optimization algorithm approach for deep neural networks. Sensors 2021, 21, 8003. [Google Scholar] [CrossRef] [PubMed]
- Mostafa Bozorgi, S.; Yazdani, S. IWOA: An improved whale optimization algorithm for optimization problems. J. Comput. Des. Eng. 2019, 6, 243–259. [Google Scholar] [CrossRef]
- Naderi, E.; Azizivahed, A.; Asrari, A. A step toward cleaner energy production: A water saving-based optimization approach for economic dispatch in modern power systems. Electr. Power Syst. Res. 2022, 204, 107689. [Google Scholar] [CrossRef]
- Gao, W.; Liu, S.; Huang, L. Inspired artificial bee colony algorithm for global optimization problems. Acta Electron. Sin. 2012, 40, 2396. [Google Scholar]
- Shi, X.; Li, M.; Wei, Q. Application of Quadratic Interpolation Whale Optimization Algorithm in Cylindricity Error evaluation. Metrol. Meas. Tech. 2019, 46, 58–60. [Google Scholar]
- He, Q.; Wei, K.; Xu, Q. Mixed strategy based improved whale optimization algorithm. Appl. Res. Comput. 2019, 36, 3647–3651. [Google Scholar]
- Qiu, X.; Wang, R.; Zhang, W.; Zhang, Z.; Zhang, Q. Improved Whale Optimizer Algorithm Based on Hybrid Strategy. Comput. Eng. Appl. 2022, 58, 70–78. [Google Scholar]
- Chen, Y.; Han, B.; Xu, G.; Kan, Y.; Zhao, Z. Spatial Straightness Error Evaluation with Improved Whale Optimization Algorithm. Mech. Sci. Technol. Aero. Eng. 2022, 41, 1102–1111. [Google Scholar]
- Xu, J.; Yan, F. The Application of Improved Whale Optimization Algorithm in Power Load Dispatching. Oper. Res. Manag. Sci. 2020, 29, 149–159. [Google Scholar]
- Naderi, E.; Mirzaei, L.; Pourakbari-Kasmaei, M.; Cerna, F.V.; Lehtonen, M. Optimization of active power dispatch considering unified power flow controller: Application of evolutionary algorithms in a fuzzy framework. Evol. Intell. 2024, 17, 1357–1387. [Google Scholar] [CrossRef]
| | AI | BI | CI | P | Q | AU | BU | CU | T |
|---|---|---|---|---|---|---|---|---|---|
AI | 1.000 | 0.999 | 0.999 | 0.999 | 0.925 | −0.862 | −0.866 | −0.835 | 0.371 |
BI | 0.999 | 1.000 | 0.999 | 0.999 | 0.924 | −0.863 | −0.866 | −0.835 | 0.371 |
CI | 0.999 | 0.999 | 1.000 | 0.999 | 0.925 | −0.862 | −0.866 | −0.835 | 0.371 |
P | 0.999 | 0.999 | 0.999 | 1.000 | 0.925 | −0.857 | −0.859 | −0.828 | 0.369 |
Q | 0.925 | 0.924 | 0.925 | 0.925 | 1.000 | −0.842 | −0.844 | −0.823 | 0.372 |
AU | −0.862 | −0.863 | −0.862 | −0.857 | −0.842 | 1.000 | 0.979 | 0.964 | −0.346 |
BU | −0.866 | −0.866 | −0.866 | −0.859 | −0.844 | 0.979 | 1.000 | 0.981 | −0.342 |
CU | −0.835 | −0.835 | −0.835 | −0.828 | −0.823 | 0.964 | 0.981 | 1.000 | −0.339 |
T | 0.371 | 0.371 | 0.371 | 0.369 | 0.372 | −0.346 | −0.342 | −0.339 | 1.000 |
Function | Evaluation Index | GA | PSO | WOA | IWOA
---|---|---|---|---|---
 | Mean | 3602.311 | 0.035 | 7.21 × 10^−10 | 1.46 × 10^−19
 | Best | 1454.955 | 0.001 | 3.32 × 10^−13 | 1.17 × 10^−24
 | Mean | 21.197 | 32.013 | 5.16 × 10^−9 | 1.73 × 10^−13
 | Best | 13.936 | 0.081 | 5.12 × 10^−9 | 2.24 × 10^−15
 | Mean | 3477.958 | 0.047 | 8.98 × 10^−10 | 4.16 × 10^−20
 | Best | 1771.241 | 0.001 | 1.68 × 10^−12 | 1.42 × 10^−22
 | Mean | 1.432 | 5.176 | 0.015 | 0.00075
 | Best | 0.413 | 0.065 | 0.003 | 0.00014
 | Mean | 28.474 | 51.152 | 0 | 0
 | Best | 5.522 | 0 | 0 | 0
 | Mean | 91.831 | 127.257 | 0.462 | 1.78 × 10^−16
 | Best | 64.795 | 69.170 | 6.78 × 10^−11 | 0
 | Mean | 11.337 | 2.028 | 3.936 | 1.49 × 10^−11
 | Best | 9.197 | 0.023 | 8.06 × 10^−7 | 1.35 × 10^−12
 | Mean | 77.000 | 551.976 | 0.988 | 0
 | Best | 35.494 | 185.625 | 0 | 0
 | Mean | 75.910 | 727.867 | −0.898 | −0.829
 | Best | 28.593 | 479.302 | −0.967 | −0.986
 | Mean | 73.449 | 596.665 | −0.890 | −0.796
 | Best | 26.910 | 332.989 | −0.980 | −0.899
Dataset | Model | RMSE | MAE | MAPE (%) | R² | Time (s)
---|---|---|---|---|---|---
Dataset 1 | BP | 1.698 | 1.228 | 2.581 | 0.825 | 13.287
 | CNN | 1.646 | 1.170 | 2.462 | 0.836 | 32.317
 | GRU | 1.553 | 1.011 | 2.144 | 0.854 | 96.109
 | LSTM | 1.633 | 1.022 | 2.175 | 0.838 | 129.666
 | LSTM-SA | 1.537 | 1.031 | 2.253 | 0.861 | 174.497
 | WOA-LSTM-SA | 1.462 | 0.998 | 2.103 | 0.870 | 11,058.906
 | IWOA-LSTM-SA | 1.438 | 0.989 | 2.089 | 0.873 | 10,083.375
Dataset 2 | BP | 0.923 | 0.715 | 2.428 | 0.974 | 38.216
 | CNN | 0.824 | 0.596 | 1.929 | 0.979 | 80.746
 | GRU | 0.758 | 0.544 | 1.772 | 0.982 | 165.984
 | LSTM | 0.874 | 0.643 | 2.129 | 0.977 | 234.946
 | LSTM-SA | 0.809 | 0.576 | 1.890 | 0.980 | 383.995
 | WOA-LSTM-SA | 0.757 | 0.535 | 1.739 | 0.982 | 13,016.477
 | IWOA-LSTM-SA | 0.749 | 0.524 | 1.703 | 0.983 | 11,075.689
| | | LSTM | LSTM-SA | WOA-LSTM | IWOA-LSTM | WOA-LSTM-SA | IWOA-LSTM-SA |
|---|---|---|---|---|---|---|---|
Dataset 1 | RMSE | 1.633 | 1.537 | 1.596 | 1.517 | 1.462 | 1.438
 | MAPE (%) | 2.175 | 2.253 | 2.141 | 2.106 | 2.103 | 2.089
Dataset 2 | RMSE | 0.874 | 0.809 | 0.837 | 0.782 | 0.757 | 0.749
 | MAPE (%) | 2.129 | 1.890 | 2.042 | 1.814 | 1.739 | 1.703
Dataset | Step | Model | RMSE | MAE | MAPE (%) | Time (s)
---|---|---|---|---|---|---
Dataset 1 | 1 (30 min) | BP | 1.698 | 1.228 | 2.581 | 13.287
 | | CNN | 1.646 | 1.170 | 2.462 | 32.317
 | | GRU | 1.553 | 1.011 | 2.144 | 96.109
 | | LSTM | 1.633 | 1.022 | 2.175 | 129.666
 | | LSTM-SA | 1.537 | 1.031 | 2.253 | 174.497
 | | WOA-LSTM-SA | 1.462 | 0.998 | 2.103 | 11,058.906
 | | IWOA-LSTM-SA | 1.438 | 0.989 | 2.089 | 10,083.375
 | 3 (90 min) | BP | 1.763 | 1.382 | 2.873 | 14.082
 | | CNN | 1.652 | 1.221 | 2.557 | 22.572
 | | GRU | 1.597 | 1.133 | 2.409 | 95.775
 | | LSTM | 1.605 | 1.164 | 2.453 | 179.898
 | | LSTM-SA | 1.562 | 1.162 | 2.448 | 229.012
 | | WOA-LSTM-SA | 1.555 | 1.102 | 2.311 | 11,746.135
 | | IWOA-LSTM-SA | 1.537 | 1.088 | 2.308 | 10,149.217
 | 5 (150 min) | BP | 1.961 | 1.611 | 3.351 | 13.617
 | | CNN | 1.855 | 1.411 | 2.973 | 21.579
 | | GRU | 1.833 | 1.387 | 2.943 | 98.763
 | | LSTM | 1.808 | 1.367 | 2.878 | 197.507
 | | LSTM-SA | 1.796 | 1.345 | 2.832 | 240.519
 | | WOA-LSTM-SA | 1.768 | 1.352 | 2.859 | 12,212.086
 | | IWOA-LSTM-SA | 1.714 | 1.294 | 2.702 | 10,778.976
Dataset 2 | 1 (30 min) | BP | 0.923 | 0.715 | 2.428 | 38.216
 | | CNN | 0.824 | 0.596 | 1.929 | 80.746
 | | GRU | 0.758 | 0.544 | 1.772 | 165.984
 | | LSTM | 0.874 | 0.643 | 2.129 | 234.946
 | | LSTM-SA | 0.809 | 0.576 | 1.890 | 383.995
 | | WOA-LSTM-SA | 0.757 | 0.535 | 1.739 | 13,016.477
 | | IWOA-LSTM-SA | 0.749 | 0.524 | 1.703 | 11,075.689
 | 3 (90 min) | BP | 1.654 | 1.124 | 4.225 | 37.313
 | | CNN | 1.283 | 1.012 | 3.166 | 79.190
 | | GRU | 1.175 | 0.831 | 2.821 | 229.788
 | | LSTM | 1.394 | 1.080 | 3.674 | 320.336
 | | LSTM-SA | 1.237 | 0.923 | 3.111 | 433.645
 | | WOA-LSTM-SA | 1.061 | 0.833 | 2.746 | 13,623.563
 | | IWOA-LSTM-SA | 1.015 | 0.750 | 2.537 | 11,284.158
 | 5 (150 min) | BP | 1.838 | 1.568 | 4.854 | 37.081
 | | CNN | 1.943 | 1.403 | 4.933 | 77.883
 | | GRU | 1.976 | 1.387 | 4.801 | 264.860
 | | LSTM | 1.903 | 1.414 | 4.765 | 171.239
 | | LSTM-SA | 1.874 | 1.365 | 4.810 | 414.213
 | | WOA-LSTM-SA | 1.664 | 1.249 | 4.298 | 12,823.645
 | | IWOA-LSTM-SA | 1.634 | 1.229 | 4.162 | 10,984.776
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Zou, D.; Xu, H.; Quan, H.; Yin, J.; Peng, Q.; Wang, S.; Dai, W.; Hong, Z. Top-Oil Temperature Prediction of Power Transformer Based on Long Short-Term Memory Neural Network with Self-Attention Mechanism Optimized by Improved Whale Optimization Algorithm. Symmetry 2024, 16, 1382. https://doi.org/10.3390/sym16101382