A Time Series Prediction Model for Wind Power Based on the Empirical Mode Decomposition–Convolutional Neural Network–Three-Dimensional Gated Neural Network
Abstract
1. Introduction
2. Data and Evaluation Method
2.1. Wind Farm Data
2.2. Data
2.3. Evaluation Method
2.3.1. Core Metrics
- MAE (Mean Absolute Error) reflects the average deviation between predicted and actual values and is relatively robust to outliers: $\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\left|y_i - \hat{y}_i\right|$.
- MSE (Mean Squared Error) is the average of the squared differences between observed and predicted values, emphasizing larger errors: $\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2$.
- RMSE (Root Mean Squared Error) provides a standard measure of error magnitude in the same units as the data: $\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}$.
- R² (Coefficient of Determination) indicates the proportion of variance in the dependent variable predictable from the independent variables: $R^2 = 1 - \frac{\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2}$.
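For concreteness, the four metrics can be computed with a short NumPy routine. This is an illustrative sketch (function and variable names are not from the paper), equivalent to the standard definitions above:

```python
import numpy as np

def evaluate(y_true, y_pred):
    """MAE, MSE, RMSE and R² for one prediction series, following the definitions above."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mae = np.mean(np.abs(err))                      # mean absolute error
    mse = np.mean(err ** 2)                         # mean squared error
    rmse = np.sqrt(mse)                             # root mean squared error
    ss_res = np.sum(err ** 2)                       # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)  # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "R2": r2}
```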
2.3.2. Performance Improvement Metric
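The improvement percentages reported in Section 4 are consistent with the usual relative-change convention: error metrics (MAE, MSE, RMSE) improve when they decrease relative to a baseline, while R² improves when it increases. A minimal sketch under that assumption (names are illustrative, not the paper's code):

```python
def improvement(proposed, baseline):
    """Percentage improvement of the proposed model over a baseline.
    Error metrics (MAE, MSE, RMSE) improve when they decrease; R² improves when it increases."""
    out = {k: 100.0 * (baseline[k] - proposed[k]) / baseline[k] for k in ("MAE", "MSE", "RMSE")}
    out["R2"] = 100.0 * (proposed["R2"] - baseline["R2"]) / baseline["R2"]
    return out

# Check against the reported turbine A1 values (EMD–CNN–TGNN vs. EMD–CNN–LSTM):
tgnn = {"MAE": 93.46, "MSE": 35482.13, "RMSE": 188.37, "R2": 0.94}
lstm = {"MAE": 101.45, "MSE": 48006.73, "RMSE": 219.10, "R2": 0.91}
print(improvement(tgnn, lstm))  # ≈ 7.88% MAE, 26.09% MSE, 14.03% RMSE, 3.30% R²
```

The printed values match the A1 row of the improvement table in Section 4, which supports this reading of the metric.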
3. Methods
3.1. Construction of EMD–CNN–TGNN
3.1.1. EMD
- Extraction of Local Extrema: Identify all local maxima and minima in the given signal $x(t)$.
- Envelope Extraction: For each pair of adjacent local maxima (and, separately, each pair of adjacent local minima), perform linear interpolation to derive the upper envelope $e_{\max}(t)$ and the lower envelope $e_{\min}(t)$.
- Extraction of IMFs: Subtract the mean envelope $m(t) = \frac{1}{2}\left(e_{\max}(t) + e_{\min}(t)\right)$ from the signal to obtain the first candidate component $h_1(t) = x(t) - m(t)$. Repeat this sifting process on the candidate until it meets the stopping criteria and becomes the first IMF $c_1(t)$. Continue similarly on the residual until all IMFs are extracted.
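A minimal sketch of this sifting procedure, using the linear-interpolation envelopes described above and a standard Cauchy-type stopping criterion; the function names, thresholds, and endpoint handling are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def sift_once(x, t):
    """One sifting pass: linear envelopes through local extrema, then subtract the mean envelope."""
    maxima = np.where((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))[0] + 1   # interior local maxima
    minima = np.where((x[1:-1] < x[:-2]) & (x[1:-1] < x[2:]))[0] + 1   # interior local minima
    if len(maxima) < 2 or len(minima) < 2:
        return None                                   # too few extrema: treat as residual
    upper = np.interp(t, t[maxima], x[maxima])        # upper envelope (held constant past the ends)
    lower = np.interp(t, t[minima], x[minima])        # lower envelope
    return x - (upper + lower) / 2.0                  # h(t) = x(t) - m(t)

def emd(x, t, max_imfs=8, sd_threshold=0.3, max_sift=50):
    """Decompose x(t) into a list of IMFs plus a residual."""
    imfs, residual = [], np.asarray(x, dtype=float).copy()
    for _ in range(max_imfs):
        h = residual.copy()
        for _ in range(max_sift):
            h_new = sift_once(h, t)
            if h_new is None:
                return imfs, residual                 # residual has too few extrema: stop
            sd = np.sum((h - h_new) ** 2) / (np.sum(h ** 2) + 1e-12)
            h = h_new
            if sd < sd_threshold:                     # stopping criterion for this IMF
                break
        imfs.append(h)
        residual = residual - h
    return imfs, residual
```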
3.1.2. Construction of CNN
3.1.3. Construction of the Three-Dimensional Gated Neural (TGN) Unit
3.2. Implementation Steps
3.3. Comparison Models
4. Results and Analysis
- In Wind Farm A, the TGNN model (EMD–CNN–TGNN) demonstrated higher predictive accuracy for turbines A1, A2, and A3, tracking rapid fluctuations in the data more closely than EMD–CNN–LSTM and EMD–CNN–GRU, which exhibited varying degrees of deviation; ARIMA and SVR performed the poorest. This underscores the TGNN model’s superior ability to capture complex patterns and trends in the data.
- Against EMD–CNN–LSTM on turbine A1, the TGNN model showed improvements of 7.88% in MAE, 26.09% in MSE, and 14.03% in RMSE, together with a 3.30% increase in R². Compared to EMD–CNN–GRU and particularly SVR, the differences were even more pronounced, with up to a 56.67% improvement in R² over SVR, highlighting the TGNN model’s significant advantages in prediction accuracy and data fitting.
- On turbines A2 and A3, the TGNN model continued to deliver larger performance gains, notably against ARIMA on turbine A2, where the improvement in R² reached 66.67%, further proving the TGNN model’s excellence in complex data settings.
- In Wind Farm B, the performance of the five models varied across turbines B1, B2, and B3. Notably, on turbine B1, the EMD–CNN–LSTM predictions tracked the actual data more closely than in other scenarios. However, the EMD–CNN–TGNN maintained consistency and stability overall, especially on turbine B2, where its predicted trajectory aligned closely with the actual wind power outputs. The TGNN model also exhibited superior performance, with a 59.68% improvement in R² over SVR on turbine B2 and a 3.26% increase in R² over the well-performing EMD–CNN–LSTM on turbine B1, indicating the TGNN model’s enhanced accuracy and reliability across different turbine data scenarios.
- In Wind Farm C, the EMD–CNN–TGNN maintained good performance in capturing rapid changes, while the EMD–CNN–GRU was weaker on sudden shifts, and ARIMA and SVR lagged significantly behind the other three models. The performance of EMD–CNN–TGNN in Wind Farm C emphasizes its robustness against disturbances during abrupt events. The gap was largest against SVR on turbine C2, where the increase in R² reached 61.67%, demonstrating its strong capability in capturing data trends and fluctuations. On turbine C1, the TGNN model’s R² increased by 45.00% compared to ARIMA, further highlighting its accuracy across different wind power data scenarios.
5. Conclusions and Discussions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Wind Farm | Terrain | Elevation (m) | Time Span |
---|---|---|---|
Wind Farm A | Plain | 26–30.1 | 6 March 2021–5 March 2024 |
Wind Farm B | Plain | 18.5 | 31 March 2021–29 February 2024 |
Wind Farm C | Plain | 28–36 | 1 January 2022–31 December 2023 |
Turbine | Max (kW) | Min (kW) | Median (kW) | Mean (kW) | Variance (kW²) | Standard Deviation (kW) | Skewness | Kurtosis |
---|---|---|---|---|---|---|---|---|
A1 | 2336.60 | 0.00 | 382.28 | 702.70 | 584,459.00 | 764.50 | 0.97 | 0.44 |
A2 | 2335.97 | 0.00 | 405.64 | 734.90 | 622,922.48 | 789.25 | 0.90 | 0.61 |
A3 | 2336.70 | 0.00 | 411.91 | 740.92 | 626,814.16 | 791.72 | 0.89 | 0.63 |
Turbine | Max (kW) | Min (kW) | Median (kW) | Mean (kW) | Variance (kW²) | Standard Deviation (kW) | Skewness | Kurtosis |
---|---|---|---|---|---|---|---|---|
B1 | 2200.10 | 0.00 | 321.90 | 634.51 | 513,606.25 | 716.66 | 0.99 | 0.44 |
B2 | 2200.10 | 0.00 | 289.00 | 609.82 | 507,952.22 | 712.71 | 1.06 | 0.30 |
B3 | 2200.10 | 0.00 | 277.60 | 594.64 | 495,569.65 | 703.97 | 1.10 | 0.20 |
Turbine | Max (kW) | Min (kW) | Median (kW) | Mean (kW) | Variance (kW²) | Standard Deviation (kW) | Skewness | Kurtosis |
---|---|---|---|---|---|---|---|---|
C1 | 2248.59 | 0.00 | 419.49 | 752.98 | 676,617.35 | 822.57 | 0.76 | 0.96 |
C2 | 2249.91 | 0.00 | 387.54 | 728.84 | 677,555.42 | 823.14 | 0.83 | 0.84 |
C3 | 2244.22 | 0.00 | 341.04 | 687.23 | 640,244.73 | 800.15 | 0.91 | 0.66 |
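For reference, the descriptive statistics in the three tables above can be reproduced from a turbine's raw power series with pandas. This is an illustrative sketch; whether the paper uses sample variance and excess kurtosis is an assumption here:

```python
import pandas as pd

def describe_power(power_kw: pd.Series) -> dict:
    """Summary statistics matching the table columns for one turbine's power output."""
    return {
        "Max (kW)": power_kw.max(),
        "Min (kW)": power_kw.min(),
        "Median (kW)": power_kw.median(),
        "Mean (kW)": power_kw.mean(),
        "Variance (kW²)": power_kw.var(),   # sample variance (ddof=1)
        "Std (kW)": power_kw.std(),
        "Skewness": power_kw.skew(),        # dimensionless
        "Kurtosis": power_kw.kurt(),        # excess kurtosis, dimensionless
    }
```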
Layer Type | Output Size | Kernel Size | Number of Filters | Activation Function | Remark |
---|---|---|---|---|---|
Input layer | 2048 × 7 × 1 | - | - | - | Input shape is 2048 × 7, 1 channel |
Convolutional layer 1 | 2044 × 7 × 32 | 5 × 1 | 32 | ReLU | Kernel stride is 1 |
Pooling layer 1 | 1022 × 7 × 32 | 2 × 1 | - | - | Max pooling |
Convolutional layer 2 | 1018 × 7 × 64 | 5 × 1 | 64 | ReLU | Kernel stride is 1 |
Pooling layer 2 | 509 × 7 × 64 | 2 × 1 | - | - | Max pooling |
Flatten | 227136 × 1 | - | - | - | - |
Fully connected layer | 128 × 1 | - | - | ReLU | - |
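The layer table above maps directly onto a small convolutional stack. The sketch below uses tf.keras purely for illustration (the paper's framework and training hyperparameters are not given here); the kernel sizes, filter counts, pooling, activations, and valid padding follow the table:

```python
import tensorflow as tf

def build_cnn(input_shape=(2048, 7, 1)):
    """CNN feature extractor: two Conv(5x1)+MaxPool(2x1) blocks, then Flatten and Dense(128)."""
    return tf.keras.Sequential([
        tf.keras.Input(shape=input_shape),
        tf.keras.layers.Conv2D(32, kernel_size=(5, 1), strides=1, padding="valid", activation="relu"),
        tf.keras.layers.MaxPooling2D(pool_size=(2, 1)),   # 2044x7x32 -> 1022x7x32
        tf.keras.layers.Conv2D(64, kernel_size=(5, 1), strides=1, padding="valid", activation="relu"),
        tf.keras.layers.MaxPooling2D(pool_size=(2, 1)),   # 1018x7x64 -> 509x7x64
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
    ])

# build_cnn().summary()  # convolution and pooling output shapes match the layer table
```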
Scenario | Model | MAE (kW) | MSE (kW²) | RMSE (kW) | R² |
---|---|---|---|---|---|
A1 | EMD–CNN–TGNN | 93.46 | 35,482.13 | 188.37 | 0.94 |
A1 | EMD–CNN–LSTM | 101.45 | 48,006.73 | 219.10 | 0.91 |
A1 | EMD–CNN–GRU | 108.21 | 66,943.16 | 258.73 | 0.88 |
A1 | ARIMA | 304.21 | 137,700.8 | 371.08 | 0.75 |
A1 | SVR | 384.97 | 220,236 | 469.29 | 0.60 |
A2 | EMD–CNN–TGNN | 111.52 | 21,929.61 | 148.09 | 0.95 |
A2 | EMD–CNN–LSTM | 139.13 | 30,815.16 | 175.54 | 0.93 |
A2 | EMD–CNN–GRU | 133.44 | 62,025.05 | 249.05 | 0.86 |
A2 | ARIMA | 375.60 | 191,603 | 437.72 | 0.57 |
A2 | SVR | 347.53 | 169,374.6 | 411.55 | 0.62 |
A3 | EMD–CNN–TGNN | 90.64 | 45,111.26 | 212.39 | 0.91 |
A3 | EMD–CNN–LSTM | 162.25 | 57,743.46 | 240.30 | 0.89 |
A3 | EMD–CNN–GRU | 156.17 | 72,656.62 | 269.55 | 0.86 |
A3 | ARIMA | 338.55 | 191,558.9 | 437.67 | 0.63 |
A3 | SVR | 365.31 | 190,006.8 | 435.90 | 0.63 |
B1 | EMD–CNN–TGNN | 65.31 | 8332.26 | 91.28 | 0.95 |
B1 | EMD–CNN–LSTM | 94.70 | 13,425.87 | 115.87 | 0.92 |
B1 | EMD–CNN–GRU | 105.93 | 23,210.08 | 152.35 | 0.86 |
B1 | ARIMA | 201.43 | 62,710.59 | 250.42 | 0.61 |
B1 | SVR | 205.71 | 55,748.36 | 236.11 | 0.65 |
B2 | EMD–CNN–TGNN | 44.36 | 4159.19 | 64.49 | 0.99 |
B2 | EMD–CNN–LSTM | 96.85 | 14,164.19 | 119.01 | 0.96 |
B2 | EMD–CNN–GRU | 80.30 | 22,863.93 | 151.21 | 0.93 |
B2 | ARIMA | 213.81 | 106,353 | 326.12 | 0.67 |
B2 | SVR | 308.83 | 122,435.1 | 349.91 | 0.62 |
B3 | EMD–CNN–TGNN | 132.74 | 37,778.82 | 194.37 | 0.95 |
B3 | EMD–CNN–LSTM | 215.70 | 62,981.02 | 250.96 | 0.91 |
B3 | EMD–CNN–GRU | 203.78 | 81,130.46 | 284.83 | 0.88 |
B3 | ARIMA | 395.75 | 264,475.7 | 514.27 | 0.62 |
B3 | SVR | 442.85 | 251,979.5 | 501.98 | 0.64 |
C1 | EMD–CNN–TGNN | 25.88 | 2696.61 | 51.93 | 0.87 |
C1 | EMD–CNN–LSTM | 28.04 | 3167.82 | 56.28 | 0.84 |
C1 | EMD–CNN–GRU | 30.54 | 3597.08 | 59.98 | 0.82 |
C1 | ARIMA | 70.96 | 7938.54 | 89.10 | 0.60 |
C1 | SVR | 65.60 | 6375.56 | 79.85 | 0.68 |
C2 | EMD–CNN–TGNN | 75.57 | 16,574.71 | 128.74 | 0.97 |
C2 | EMD–CNN–LSTM | 76.92 | 22,429.54 | 149.76 | 0.96 |
C2 | EMD–CNN–GRU | 103.14 | 47,321.60 | 217.54 | 0.91 |
C2 | ARIMA | 373.86 | 197,084.5 | 443.94 | 0.63 |
C2 | SVR | 364.17 | 212,188.1 | 460.64 | 0.60 |
C3 | EMD–CNN–TGNN | 61.86 | 7675.15 | 87.61 | 0.91 |
C3 | EMD–CNN–LSTM | 67.43 | 9047.17 | 95.12 | 0.89 |
C3 | EMD–CNN–GRU | 69.26 | 9455.52 | 97.24 | 0.89 |
C3 | ARIMA | 153.35 | 28,552.06 | 168.97 | 0.66 |
C3 | SVR | 160.29 | 31,230.06 | 176.72 | 0.63 |
Scenario | Model Comparison | MAE (%) | MSE (%) | RMSE (%) | R² (%) |
---|---|---|---|---|---|
A1 | EMD–CNN–TGNN vs. EMD–CNN–LSTM | 7.88 | 26.09 | 14.03 | 3.30 |
A1 | EMD–CNN–TGNN vs. EMD–CNN–GRU | 13.63 | 47.00 | 27.19 | 6.82 |
A1 | EMD–CNN–TGNN vs. ARIMA | 69.28 | 74.23 | 49.24 | 25.33 |
A1 | EMD–CNN–TGNN vs. SVR | 75.72 | 83.89 | 59.86 | 56.67 |
A2 | EMD–CNN–TGNN vs. EMD–CNN–LSTM | 19.84 | 28.83 | 15.64 | 2.15 |
A2 | EMD–CNN–TGNN vs. EMD–CNN–GRU | 16.43 | 64.64 | 40.54 | 10.47 |
A2 | EMD–CNN–TGNN vs. ARIMA | 70.31 | 88.55 | 66.17 | 66.67 |
A2 | EMD–CNN–TGNN vs. SVR | 67.91 | 87.05 | 64.02 | 53.23 |
A3 | EMD–CNN–TGNN vs. EMD–CNN–LSTM | 44.14 | 21.88 | 11.61 | 2.25 |
A3 | EMD–CNN–TGNN vs. EMD–CNN–GRU | 41.96 | 37.91 | 21.21 | 5.81 |
A3 | EMD–CNN–TGNN vs. ARIMA | 73.23 | 76.45 | 51.47 | 41.27 |
A3 | EMD–CNN–TGNN vs. SVR | 75.19 | 76.26 | 51.28 | 44.44 |
B1 | EMD–CNN–TGNN vs. EMD–CNN–LSTM | 31.03 | 37.94 | 21.22 | 3.26 |
B1 | EMD–CNN–TGNN vs. EMD–CNN–GRU | 38.35 | 64.10 | 40.09 | 10.47 |
B1 | EMD–CNN–TGNN vs. ARIMA | 67.58 | 86.71 | 63.55 | 55.74 |
B1 | EMD–CNN–TGNN vs. SVR | 68.25 | 85.05 | 61.34 | 46.15 |
B2 | EMD–CNN–TGNN vs. EMD–CNN–LSTM | 54.20 | 70.64 | 45.81 | 3.13 |
B2 | EMD–CNN–TGNN vs. EMD–CNN–GRU | 44.76 | 81.81 | 57.35 | 6.45 |
B2 | EMD–CNN–TGNN vs. ARIMA | 79.25 | 96.09 | 80.23 | 47.76 |
B2 | EMD–CNN–TGNN vs. SVR | 85.64 | 96.60 | 81.57 | 59.68 |
B3 | EMD–CNN–TGNN vs. EMD–CNN–LSTM | 38.46 | 40.02 | 22.55 | 4.40 |
B3 | EMD–CNN–TGNN vs. EMD–CNN–GRU | 34.86 | 53.43 | 31.76 | 7.95 |
B3 | EMD–CNN–TGNN vs. ARIMA | 66.46 | 85.72 | 62.20 | 53.23 |
B3 | EMD–CNN–TGNN vs. SVR | 70.03 | 85.01 | 61.28 | 48.44 |
C1 | EMD–CNN–TGNN vs. EMD–CNN–LSTM | 7.70 | 14.87 | 7.73 | 3.57 |
C1 | EMD–CNN–TGNN vs. EMD–CNN–GRU | 15.26 | 25.03 | 13.42 | 6.10 |
C1 | EMD–CNN–TGNN vs. ARIMA | 63.53 | 66.03 | 41.72 | 45.00 |
C1 | EMD–CNN–TGNN vs. SVR | 60.55 | 57.70 | 34.97 | 27.94 |
C2 | EMD–CNN–TGNN vs. EMD–CNN–LSTM | 1.76 | 26.10 | 14.04 | 1.04 |
C2 | EMD–CNN–TGNN vs. EMD–CNN–GRU | 26.73 | 64.97 | 40.82 | 6.59 |
C2 | EMD–CNN–TGNN vs. ARIMA | 79.79 | 91.59 | 71.00 | 53.97 |
C2 | EMD–CNN–TGNN vs. SVR | 79.25 | 92.19 | 72.05 | 61.67 |
C3 | EMD–CNN–TGNN vs. EMD–CNN–LSTM | 8.26 | 15.17 | 7.90 | 2.25 |
C3 | EMD–CNN–TGNN vs. EMD–CNN–GRU | 10.68 | 18.83 | 9.90 | 2.25 |
C3 | EMD–CNN–TGNN vs. ARIMA | 59.66 | 73.12 | 48.15 | 37.88 |
C3 | EMD–CNN–TGNN vs. SVR | 61.41 | 75.42 | 50.42 | 44.44 |
Model | MAE (kW) | MSE (kW²) | RMSE (kW) | R² |
---|---|---|---|---|
EMD–CNN–TGNN | 98.54 | 34,174.33 | 182.95 | 0.93 |
EMD–CNN–LSTM | 134.28 | 45,521.78 | 211.65 | 0.91 |
EMD–CNN–GRU | 132.61 | 67,208.28 | 259.11 | 0.87 |
ARIMA | 339.45 | 173,620.88 | 415.49 | 0.65 |
SVR | 365.94 | 193,205.82 | 438.91 | 0.62 |
Model | MAE (kW) | MSE (kW²) | RMSE (kW) | R² |
---|---|---|---|---|
EMD–CNN–TGNN | 80.80 | 16,756.76 | 116.71 | 0.96 |
EMD–CNN–LSTM | 135.75 | 30,190.36 | 161.95 | 0.93 |
EMD–CNN–GRU | 130.00 | 42,401.49 | 196.13 | 0.89 |
ARIMA | 270.33 | 144,513.07 | 363.60 | 0.63 |
SVR | 319.13 | 143,387.66 | 362.67 | 0.64 |
Model | MAE (kW) | MSE (kW²) | RMSE (kW) | R² |
---|---|---|---|---|
EMD–CNN–TGNN | 54.44 | 8982.16 | 89.43 | 0.92 |
EMD–CNN–LSTM | 57.46 | 11,548.18 | 100.39 | 0.90 |
EMD–CNN–GRU | 67.65 | 20,124.73 | 124.92 | 0.87 |
ARIMA | 199.39 | 77,858.37 | 234.00 | 0.63 |
SVR | 196.69 | 83,264.56 | 239.07 | 0.64 |
Comparison | MAE (%) | MSE (%) | RMSE (%) | R² (%) |
---|---|---|---|---|
EMD–CNN–TGNN vs. EMD–CNN–LSTM | 23.95 | 25.60 | 13.76 | 2.56 |
EMD–CNN–TGNN vs. EMD–CNN–GRU | 24.01 | 49.85 | 29.65 | 7.70 |
EMD–CNN–TGNN vs. ARIMA | 70.94 | 79.75 | 55.63 | 44.42 |
EMD–CNN–TGNN vs. SVR | 72.94 | 82.40 | 58.38 | 51.45 |
Comparison | MAE (%) | MSE (%) | RMSE (%) | R² (%) |
---|---|---|---|---|
EMD–CNN–TGNN vs. EMD–CNN–LSTM | 41.23 | 49.53 | 29.86 | 3.59 |
EMD–CNN–TGNN vs. EMD–CNN–GRU | 39.32 | 66.45 | 43.07 | 8.29 |
EMD–CNN–TGNN vs. ARIMA | 71.10 | 89.51 | 68.66 | 52.24 |
EMD–CNN–TGNN vs. SVR | 74.64 | 88.89 | 68.06 | 51.42 |
Comparison | MAE (%) | MSE (%) | RMSE (%) | R² (%) |
---|---|---|---|---|
EMD–CNN–TGNN vs. EMD–CNN–LSTM | 5.91 | 18.71 | 9.89 | 2.29 |
EMD–CNN–TGNN vs. EMD–CNN–GRU | 17.56 | 36.28 | 21.38 | 4.98 |
EMD–CNN–TGNN vs. ARIMA | 67.66 | 76.91 | 53.62 | 45.62 |
EMD–CNN–TGNN vs. SVR | 67.07 | 75.11 | 52.48 | 44.68 |
Model | MAE (kW) | MSE (kW²) | RMSE (kW) | R² |
---|---|---|---|---|
EMD–CNN–TGNN | 77.93 | 19,971.08 | 129.70 | 0.94 |
EMD–CNN–LSTM | 109.16 | 29,086.77 | 157.99 | 0.91 |
EMD–CNN–GRU | 110.09 | 43,244.83 | 193.39 | 0.88 |
ARIMA | 269.72 | 131,997.44 | 337.70 | 0.64 |
SVR | 293.92 | 139,952.68 | 346.88 | 0.63 |
Comparison (Model 1 vs. Model 2) | MAE (%) | MSE (%) | RMSE (%) | R² (%) |
---|---|---|---|---|
EMD–CNN–TGNN vs. EMD–CNN–LSTM | 23.70 | 31.28 | 17.84 | 2.82 |
EMD–CNN–TGNN vs. EMD–CNN–GRU | 26.96 | 50.86 | 31.36 | 6.99 |
EMD–CNN–TGNN vs. ARIMA | 69.90 | 82.06 | 59.30 | 47.43 |
EMD–CNN–TGNN vs. SVR | 71.55 | 82.13 | 59.64 | 49.18 |