Dynamic Gust Detection and Conditional Sequence Modeling for Ultra-Short-Term Wind Speed Prediction
Abstract
1. Introduction
- (1) The Conditional LSTM framework incorporates gust detection, adapting predictions to gust and non-gust conditions, which significantly improves forecasting accuracy during gust events.
- (2) A novel gust detection approach based on dynamic window adjustment is proposed; by adjusting the window size according to the rate of change of the wind speed, it captures rapid fluctuations more accurately.
- (3) Systematic experiments and comparisons provide an in-depth analysis of the performance of different models under gust and non-gust conditions, yielding a comprehensive evaluation of the various modeling approaches.
2. Methodology
2.1. Task Definition
2.2. Gust Detection
- Suddenness: A gust is a sharp increase in wind speed that rises to a peak over a short period of time.
- Localized Extremes: Gusts correspond to localized maxima of wind speed rather than minima, i.e., the wind speed peaks at some point within the interval.
- High-Frequency Fluctuations: Gusts are characterized by high-frequency variations relative to the overall trend of the wind speed (a minimal detection sketch based on these characteristics follows this list).
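To make the dynamic-window idea concrete, the sketch below illustrates one possible detection rule: the window shrinks when the local rate of change of the wind speed is large and grows when the series is calm, and a point is flagged as a gust when it is a local maximum that rises sufficiently above the local window mean. The function name detect_gusts, the window bounds, and the thresholds are assumptions for illustration, not the exact rule or parameters used in this work.

```python
import numpy as np

def detect_gusts(speed, dt=1.0, min_win=3, max_win=30, rise_thresh=2.0):
    """Flag gusts in a 1-D wind speed series using a change-rate-adaptive window.

    speed           : array of wind speeds (m/s), sampled every `dt` time units
    min_win/max_win : bounds on the adaptive window half-length (samples)
    rise_thresh     : required rise (m/s) above the local window mean
    """
    speed = np.asarray(speed, dtype=float)
    n = len(speed)
    labels = np.zeros(n, dtype=int)

    # Local rate of change, normalised to [0, 1], drives the window size.
    rate = np.abs(np.gradient(speed, dt))
    rate_norm = rate / (rate.max() + 1e-8)

    for t in range(1, n - 1):
        # Faster changes -> shorter window, so rapid fluctuations are not smoothed away.
        win = int(round(max_win - (max_win - min_win) * rate_norm[t]))
        lo, hi = max(0, t - win), min(n, t + win + 1)
        local = speed[lo:hi]

        is_local_max = speed[t] >= local.max()                  # localized extreme (maximum)
        sharp_rise = (speed[t] - local.mean()) > rise_thresh    # sudden, sharp increase

        if is_local_max and sharp_rise:
            labels[t] = 1  # gust
    return labels
```

For example, `detect_gusts(wind_series)` returns a 0/1 label for every sample of `wind_series`, which can then serve as the gust state input of the conditional model.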
2.3. Conditional Sequence Modeling
- Gust Embedding Layer: This layer embeds the gust labels (binary: gust or non-gust) into a continuous vector space. The gust state, represented as 0 (non-gust) or 1 (gust), is embedded into a 4-dimensional vector.
- LSTM Layer: The input wind speed sequence, combined with the gust embeddings, is passed into the LSTM network. The LSTM processes the sequence of wind speed data while considering the gust state at each time step.
- Attention Mechanism: After the LSTM layer, a multi-head attention mechanism is applied. This allows the model to assign different importance to each time step, especially focusing on periods when gusts are detected.
- Conditional Gating Mechanism: The gust embedding is used in a gating mechanism to control the flow of information between the attention output and the final prediction, dynamically adjusting the model’s sensitivity depending on whether a gust is detected (a minimal sketch of the full architecture follows this list).
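The following is a minimal PyTorch-style sketch of this architecture, assuming a univariate wind speed input; the class name ConditionalLSTM, the four attention heads, and the exact layer sizes are illustrative choices rather than the configuration reported in the experiments.

```python
import torch
import torch.nn as nn

class ConditionalLSTM(nn.Module):
    """Sketch of the conditional sequence model: gust embedding + LSTM
    + multi-head self-attention + conditional gating + two output heads."""

    def __init__(self, hidden_dim=64, embed_dim=4, num_heads=4):
        super().__init__()
        # Gust state (0 = non-gust, 1 = gust) embedded into a small vector.
        self.gust_embedding = nn.Embedding(2, embed_dim)
        # Wind speed (1 feature) concatenated with the gust embedding at each step.
        self.lstm = nn.LSTM(1 + embed_dim, hidden_dim, batch_first=True)
        self.attention = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        # Gating weights derived from the gust embedding.
        self.gating_layer = nn.Linear(embed_dim, hidden_dim)
        self.wind_head = nn.Linear(hidden_dim, 1)   # wind speed regression head
        self.gust_head = nn.Linear(hidden_dim, 1)   # gust event classification head

    def forward(self, x, gust_labels):
        # x: (batch, seq_len, 1) wind speeds; gust_labels: (batch,) integer 0/1 gust state
        gust_embed = self.gust_embedding(gust_labels)                  # (batch, embed_dim)
        gust_expanded = gust_embed.unsqueeze(1).expand(-1, x.size(1), -1)
        combined = torch.cat([x, gust_expanded], dim=-1)               # (batch, seq, 1+embed)

        lstm_out, _ = self.lstm(combined)                              # (batch, seq, hidden)
        attn_out, _ = self.attention(lstm_out, lstm_out, lstm_out)     # self-attention
        attn_avg = attn_out.mean(dim=1)                                # average over time steps

        gate = torch.sigmoid(self.gating_layer(gust_embed))            # (batch, hidden)
        gated = attn_avg * gate                                        # conditional gating

        wind_pred = self.wind_head(gated)                              # predicted wind speed
        gust_prob = torch.sigmoid(self.gust_head(gated))               # predicted gust probability
        return wind_pred, gust_prob
```

For example, calling `ConditionalLSTM()(torch.randn(8, 30, 1), torch.randint(0, 2, (8,)))` returns a wind speed prediction and a gust probability for each of the eight 30-step sequences.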
2.3.1. Gust Embeddings
2.3.2. Long Short-Term Memory
2.3.3. Attention Mechanism
2.3.4. Conditional Gating Mechanism
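In compact form, the gating step described above can be summarized as follows; the symbols are introduced here for illustration and may differ from the paper’s notation:

```latex
g = \sigma\!\left(W_g e + b_g\right), \qquad
\tilde{h} = g \odot \bar{h}_{\mathrm{attn}}
```

where $e$ is the gust embedding, $\bar{h}_{\mathrm{attn}}$ the time-averaged attention output, $\sigma$ the sigmoid function, and $\odot$ elementwise multiplication; the gated representation $\tilde{h}$ feeds the prediction heads.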
2.4. Loss Function Design
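Consistent with Algorithm 1 below, the training objective combines a wind speed regression loss with a gust classification loss; the notation here paraphrases lines 15–18 of the algorithm, with $y_i$ the target wind speed, $\hat{y}_i$ its prediction, $g_i$ the gust label, $\hat{g}_i$ the predicted gust probability, and $\lambda$ the balancing hyperparameter:

```latex
\ell_{\mathrm{wind}} = \frac{1}{N}\sum_{i=1}^{N}\bigl(y_i - \hat{y}_i\bigr)^2, \qquad
\ell_{\mathrm{gust}} = -\frac{1}{N}\sum_{i=1}^{N}\bigl[g_i \log \hat{g}_i + (1-g_i)\log(1-\hat{g}_i)\bigr], \qquad
\ell_{\mathrm{total}} = \ell_{\mathrm{wind}} + \lambda\,\ell_{\mathrm{gust}}
```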
Algorithm 1: Training the Conditional LSTM Model
Require: Training dataset D, gust state labels G
Ensure: Optimal model parameters for the LSTM, embedding, attention, and gating modules
1.  θ_LSTM, θ_gust_embed, θ_attention, θ_gating ← initialize network parameters
2.  repeat
3.    x, y ← random mini-batch from D                         // wind speed input sequences and target wind speeds
4.    gust_labels ← corresponding mini-batch from G            // gust labels
5.    gust_embed ← gust_embedding(gust_labels)                 // gust state embeddings
6.    gust_embed_expanded ← expand(gust_embed)                 // expand embeddings over time steps
7.    combined_input ← concatenate(x, gust_embed_expanded)     // concatenate input and gust embeddings
8.    lstm_output, (h_n, c_n) ← LSTM(combined_input)           // pass through LSTM
9.    attn_output, _ ← MultiheadAttention(lstm_output, lstm_output, lstm_output)   // apply self-attention
10.   attn_output_avg ← mean(attn_output, dim = 1)             // average over the time dimension
11.   gating_weights ← Sigmoid(gating_layer(gust_embed))       // gating mechanism
12.   gated_output ← attn_output_avg ⊙ gating_weights          // apply gating to attention output
13.   wind_output ← FullyConnected(gated_output)               // predict wind speed
14.   gust_output ← Sigmoid(FullyConnected(gated_output))      // predict gust event (binary classification)
15.   ℓ_wind ← ||y − wind_output||²                            // wind speed loss (MSE)
16.   ℓ_gust ← binary_cross_entropy(gust_output, gust_labels)  // gust event loss (binary cross-entropy)
17.   // Total loss for optimization
18.   ℓ_total ← ℓ_wind + λ × ℓ_gust                            // combine losses, where λ balances the two terms
19.   // Update parameters by gradient descent with learning rate η
20.   θ_LSTM ← θ_LSTM − η ∇_{θ_LSTM} ℓ_total
21.   θ_gust_embed ← θ_gust_embed − η ∇_{θ_gust_embed} ℓ_total
22.   θ_attention ← θ_attention − η ∇_{θ_attention} ℓ_total
23.   θ_gating ← θ_gating − η ∇_{θ_gating} ℓ_total
24. until convergence or the maximum number of iterations is reached
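As a concrete illustration of lines 15–18 of Algorithm 1, the following minimal, self-contained PyTorch snippet combines the two losses; the tensors are random placeholders and λ = 0.5 is an assumed value, not the setting used in the experiments.

```python
import torch
import torch.nn.functional as F

lam = 0.5  # assumed weighting between the two losses (λ in Algorithm 1)

# Placeholder predictions and targets for a mini-batch of 32 samples.
wind_pred = torch.randn(32, 1, requires_grad=True)   # predicted wind speed
wind_true = torch.randn(32, 1)                       # target wind speed
gust_prob = torch.rand(32, 1, requires_grad=True)    # predicted gust probability (post-sigmoid)
gust_true = torch.randint(0, 2, (32, 1)).float()     # 0/1 gust labels

loss_wind = F.mse_loss(wind_pred, wind_true)                  # ℓ_wind (line 15)
loss_gust = F.binary_cross_entropy(gust_prob, gust_true)      # ℓ_gust (line 16)
loss_total = loss_wind + lam * loss_gust                      # ℓ_total (line 18)
loss_total.backward()  # gradients flow to every parameter that produced the predictions
```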
3. Experiment Setup
3.1. Dataset Description
3.2. Experiment Settings
3.2.1. Dataset Split
3.2.2. Hyperparameter Settings
3.2.3. Training Details
3.2.4. Window Size Settings
3.3. Baseline Models
- AutoRegressive (AR): This statistical model predicts future values based on past values, capturing the linear relationships in the wind speed data.
- AutoRegressive Moving Average (ARMA): Combining AR and moving average components, ARMA models the time series by considering both past values and past errors, allowing for more nuanced predictions.
- Backpropagation Neural Network (BPNN): A feedforward neural network that uses backpropagation for training. This model is capable of capturing complex nonlinear relationships in the data.
- Long Short-Term Memory (LSTM): A recurrent neural network designed to capture long-range dependencies in time series data, LSTM is well-suited for sequential prediction tasks such as wind speed forecasting.
- Support Vector Machine (SVM): A robust regression technique that identifies the optimal hyperplane for prediction, SVM is effective in handling high-dimensional data and nonlinear relationships.
- Empirical Mode Decomposition-Support Vector Machine (EMD-SVM): This model first decomposes the wind speed signal into intrinsic modes using EMD, followed by SVM regression to predict wind speed, allowing for a detailed analysis of the underlying dynamics.
- Extreme Learning Machine (ELM): A single-hidden-layer feedforward neural network that offers rapid training and good generalization, included for its efficiency in handling large datasets (a brief fitting sketch for the statistical and SVM baselines follows this list).
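As an indication of how the statistical and kernel baselines might be fit in practice, the sketch below uses statsmodels and scikit-learn with the orders and kernel parameters listed in the settings table (p = 5, q = 1 for AR/ARMA; C = 13.88, gamma = 0.02 for the RBF SVM); the synthetic series and the windowing helper make_windows are illustrative assumptions, not the study’s data pipeline.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.svm import SVR

rng = np.random.default_rng(0)
speed = 6 + np.cumsum(rng.normal(0, 0.3, 500))   # synthetic wind speed series (stand-in)

# AR(5) and ARMA(5, 1) fitted as ARIMA(p, 0, q); one-step-ahead forecasts.
ar_fit = ARIMA(speed, order=(5, 0, 0)).fit()
arma_fit = ARIMA(speed, order=(5, 0, 1)).fit()
ar_next = ar_fit.forecast(steps=1)
arma_next = arma_fit.forecast(steps=1)

# RBF-kernel SVM regression on sliding windows of the last `lag` observations.
def make_windows(series, lag=5):
    X = np.stack([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

X, y = make_windows(speed)
svm = SVR(kernel="rbf", C=13.88, gamma=0.02).fit(X, y)
svm_next = svm.predict(speed[-5:].reshape(1, -1))
```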
3.4. Evaluation Metrics
1. Mean Squared Error (MSE)
2. Root Mean Squared Error (RMSE)
3. Mean Absolute Error (MAE)
4. Squared Error-Relevance Area (SERA)
5. Residual Analysis
6. Statistical Significance Test
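For reference, the standard definitions of the point-error metrics are recalled below, with $y_i$ the observed and $\hat{y}_i$ the predicted wind speed over $N$ test samples; SERA additionally weights squared errors by a relevance function that emphasizes extreme (gust) values, and its exact form is not reproduced here.

```latex
\mathrm{MSE}  = \frac{1}{N}\sum_{i=1}^{N}\bigl(y_i - \hat{y}_i\bigr)^2, \qquad
\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\bigl(y_i - \hat{y}_i\bigr)^2}, \qquad
\mathrm{MAE}  = \frac{1}{N}\sum_{i=1}^{N}\bigl|y_i - \hat{y}_i\bigr|
```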
4. Results and Discussion
4.1. Overall Performance Comparison
4.2. Performance Comparison During Gust Events
4.3. Reasons for Performance Variations
- CLSTM’s architecture, utilizing conditional embeddings and long short-term memory (LSTM) units, allows it to adaptively respond to both general wind patterns and rapid gust fluctuations, maintaining high accuracy across scenarios.
- In non-gust scenarios, models such as BPNN and LSTM, while demonstrating overall good performance, may still be prone to overfitting to the training data. This overfitting can hinder their ability to generalize effectively, particularly when faced with gust conditions. Their limitations in capturing the rapid changes inherent in gusts can result in poorer performance in high-variability situations.
- The SVM model performs reasonably well in both non-gust and gust conditions, but its mean residuals indicate a tendency to underpredict wind speeds during gust events. Its residual standard deviations are moderate, indicating reasonably stable predictions; however, while it generalizes well, it lacks the responsiveness to extreme fluctuations shown by more specialized models such as CLSTM.
- EMD-SVM’s method may inherently emphasize sudden changes, which could explain its relatively better performance during gusts despite its overall shortcomings. This model might rely on capturing local patterns effectively, making it responsive in specific scenarios even when its broader applicability is limited.
- The ELM model shows a non-gust mean residual close to zero but slightly positive, indicating that its predictions are, on average, only marginally lower than the actual values. However, its comparatively high standard deviation suggests instability in its predictions, reflecting difficulties in adapting to nuanced changes in wind speed even during stable conditions.
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Table: Structures, parameters, and hyperparameter settings of the compared models.
Models | Structure (e.g., Layers/Nodes) | Parameters | Hyper-Parameters
---|---|---|---
AR | Lag order: p | p: 5 | - |
ARMA | AR order: p, MA order: q | p: 5, q: 1 | - |
BPNN | Input: n, Hidden: x, Output: y | Activation: ReLU | Learning Rate: 0.01, Batch Size: 16 |
LSTM | Layers: 3, Units: 100 | - | Learning Rate: 0.003, Batch Size: 32 |
SVM | Kernel: rbf | C: 13.88, Gamma: 0.02 | - |
EMD-SVM | EMD + Kernel: rbf | C: 13.88, Gamma: 0.02 | - |
ELM | Input: n, Hidden: x, Output: y | Activation: Sigmoid | Hidden Dim: 123 |
CLSTM (Ours) | LSTM Layers: 1, Units: 64, Gust Embedding: 16 | - | Learning Rate: 0.001, Batch Size: 32 |
Table: Overall forecasting performance on the test set; p-values are from the statistical significance test with CLSTM as the reference ("base").
Models | MSE | RMSE | MAE | SERA | p-Value
---|---|---|---|---|---
AR | 14.667 | 3.8298 | 3.2496 | 10.1960 | 0.0 |
ARMA | 14.9524 | 3.8668 | 3.2636 | 10.4607 | 0.0 |
BPNN | 0.6469 | 0.8043 | 0.6245 | 0.8219 | 0.403 |
LSTM | 0.5394 | 0.7344 | 0.5686 | 0.9320 | 0.0109 |
SVM | 0.5627 | 0.7501 | 0.582 | 0.9343 | 0.0702 |
EMD-SVM | 1.0838 | 1.041 | 0.8809 | 0.8602 | 3.41 × 10^-123
ELM | 0.6815 | 0.8255 | 0.6502 | 1.8556 | 1.22 × 10^-51
CLSTM (Ours) | 0.4893 | 0.6995 | 0.542 | 0.3613 | base |
Table: Forecasting performance of all models across wind speed ranges; p-values are from the statistical significance test with CLSTM as the reference ("base").
Wind Speed Condition | Models | MSE | MAE | p-Value
---|---|---|---|---
Very Low (0–3 m/s) | AR | 41.392288 | 6.420835 | 2.26 × 10^-36
Very Low (0–3 m/s) | ARMA | 43.489571 | 6.578476 | 2.26 × 10^-36
Very Low (0–3 m/s) | BPNN | 1.679833 | 1.18879 | 5.34 × 10^-2
Very Low (0–3 m/s) | LSTM | 1.674148 | 1.192171 | 4.33 × 10^-3
Very Low (0–3 m/s) | SVM | 1.879408 | 1.282047 | 6.04 × 10^-3
Very Low (0–3 m/s) | EMD-SVM | 1.26687 | 1.00098 | 3.38 × 10^-8
Very Low (0–3 m/s) | ELM | 1.638098 | 1.179153 | 5.09 × 10^-1
Very Low (0–3 m/s) | CLSTM (Ours) | 1.728294 | 1.213017 | base
Low (3–6 m/s) | AR | 21.136846 | 4.524484 | 1.62 × 10^-248
Low (3–6 m/s) | ARMA | 21.453719 | 4.544553 | 1.62 × 10^-248
Low (3–6 m/s) | BPNN | 0.421482 | 0.50075 | 1.88 × 10^-1
Low (3–6 m/s) | LSTM | 0.381913 | 0.478559 | 3.72 × 10^-56
Low (3–6 m/s) | SVM | 0.405006 | 0.495406 | 5.75 × 10^-99
Low (3–6 m/s) | EMD-SVM | 1.087931 | 0.911125 | 1.41 × 10^-180
Low (3–6 m/s) | ELM | 0.476769 | 0.549381 | 1.15 × 10^-139
Low (3–6 m/s) | CLSTM (Ours) | 0.37987 | 0.480563 | base
Moderate (6–9 m/s) | AR | 1.976805 | 1.145495 | 6.60 × 10^-96
Moderate (6–9 m/s) | ARMA | 1.753842 | 1.061309 | 2.99 × 10^-85
Moderate (6–9 m/s) | BPNN | 0.491729 | 0.545699 | 1.15 × 10^-42
Moderate (6–9 m/s) | LSTM | 0.524757 | 0.561436 | 4.62 × 10^-1
Moderate (6–9 m/s) | SVM | 0.53245 | 0.565359 | 4.91 × 10^-4
Moderate (6–9 m/s) | EMD-SVM | 1.203638 | 0.894072 | 2.05 × 10^-75
Moderate (6–9 m/s) | ELM | 1.214961 | 0.891385 | 6.97 × 10^-48
Moderate (6–9 m/s) | CLSTM (Ours) | 0.486924 | 0.541944 | base
High (9–12 m/s) | AR | 1.23193 | 0.923843 | 3.59 × 10^-55
High (9–12 m/s) | ARMA | 1.443489 | 1.031548 | 6.58 × 10^-15
High (9–12 m/s) | BPNN | 0.515516 | 0.563487 | 4.44 × 10^-1
High (9–12 m/s) | LSTM | 0.533526 | 0.579675 | 4.33 × 10^-4
High (9–12 m/s) | SVM | 0.523497 | 0.572579 | 3.71 × 10^-47
High (9–12 m/s) | EMD-SVM | 0.73916 | 0.679174 | 1.60 × 10^-16
High (9–12 m/s) | ELM | 0.846833 | 0.71868 | 3.59 × 10^-55
High (9–12 m/s) | CLSTM (Ours) | 0.509791 | 0.562863 | base
Table: Mean and standard deviation of prediction residuals under non-gust and gust conditions.
Models | Non-Gust Mean Residual | Non-Gust Standard Deviation | Gust Mean Residual | Gust Standard Deviation
---|---|---|---|---
AR | −3.0137 | 2.391 | −1.3771 | 1.9225 |
ARMA | −2.9926 | 2.4759 | −1.338 | 2.0113 |
BPNN | −0.2035 | 0.7036 | 0.9696 | 0.409 |
LSTM | −0.155 | 0.7066 | 1.0853 | 0.4035 |
SVM | −0.1303 | 0.7312 | 1.0702 | 0.4141 |
EMD-SVM | −0.8405 | 0.6197 | −0.4469 | 0.656 |
ELM | 0.0605 | 0.8815 | 1.5718 | 0.6782 |
CLSTM (Ours) | −0.1784 | 0.7053 | −0.2361 | 0.3709 |