Analysis of a Predictive Mathematical Model of Weather Changes Based on Neural Networks
Abstract
1. Introduction
- Study of weather phenomena at a given location using physical laws and the Weather Research & Forecasting (WRF) Model, available at https://www.mmm.ucar.edu/models/wrf (accessed on 1 December 2023).
- Investigation of weather conditions by means of mathematical transformations of data received from probes.
- The use of radar and satellite observations to obtain meteorological information. Radars and satellites can provide data on wind speed, temperature, atmospheric pressure, humidity, and other parameters using a variety of sensors and instruments.
- Application of meteorological balloons and aerological probes that are released into the atmosphere and equipped with meteorological instruments to measure vertical profiles of temperature, humidity, pressure, and wind speed. The obtained data help in weather forecasting and analysing weather conditions.
- The use of a network of meteorological stations located at various locations. These stations have meteorological instruments that continuously monitor and record data on weather conditions, such as temperature, humidity, pressure, and precipitation. The information from these stations helps in forecasting and analysing the weather at the location.
- Utilization of computer weather prediction models that are used to analyse and forecast meteorological data. These models take into account the physical laws of the atmosphere, data from meteorological observations, and other parameters to create forecasts of future weather conditions.
- Usage of remote sensing, such as LIDAR (laser-based sensing) and infrared sensing, to measure atmospheric parameters and obtain weather data. These technologies are based on the analysis of reflected or emitted radiation and can measure temperature, humidity, cloud cover, and other weather characteristics at the desired location.
2. Analysing the Application of Artificial Neural Networks
2.1. Stages of Neural Network Construction
- Defining goals and objectives;
- Data collection;
- Data preprocessing;
- Selection of the neural network architecture;
- Determination of hyperparameters;
- Training the network;
- Model evaluation and tuning;
- Model development and optimisation.
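The stages above can be sketched end to end on a toy forecasting task. The example below is a minimal illustration with synthetic data and a small numpy MLP, not the model used in the paper; all names and settings are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-2. Goals and data: predict the next value of a synthetic "temperature" series.
t = np.arange(400, dtype=float)
series = 10.0 * np.sin(2 * np.pi * t / 50) + rng.normal(0.0, 0.5, t.size)

# 3. Preprocessing: sliding windows of n past values -> next value, scaled to [0, 1].
n = 8
lo, hi = series.min(), series.max()
s = (series - lo) / (hi - lo)
X = np.stack([s[i:i + n] for i in range(s.size - n)])
y = s[n:]

# 4-5. Architecture and hyperparameters: one hidden tanh layer, fixed learning rate.
hidden, lr, epochs = 16, 0.1, 3000
W1 = rng.normal(0.0, 0.3, (n, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0.0, 0.3, (hidden, 1)); b2 = np.zeros(1)

# 6. Training: full-batch gradient descent on the mean squared error.
for _ in range(epochs):
    h = np.tanh(X @ W1 + b1)
    pred = (h @ W2 + b2).ravel()
    err = (pred - y) / len(y)                      # scaled residual
    gW2 = h.T @ err[:, None]; gb2 = err.sum(keepdims=True)
    dh = (err[:, None] @ W2.T) * (1.0 - h ** 2)    # backprop through tanh
    gW1 = X.T @ dh; gb1 = dh.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# 7. Evaluation: in-sample mean squared error (a held-out split would go here).
mse = float(np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2).ravel() - y) ** 2))
print(f"training MSE: {mse:.4f}")
```

In practice, the evaluation stage would use a validation split rather than the training windows, and the tuning stage would iterate on the hyperparameters above.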
2.2. Improving the Prediction Accuracy of Neural Networks
- Increasing the training sample size: A large amount of diverse data can help a neural network learn a wider range of patterns and improve generalisability. This may involve collecting additional data, augmenting existing data, or using generative modelling techniques to create new samples [29,30,31].
- Improving data quality: Data quality is equally important. Preprocessing should remove noise, outliers, missing values, and other defects; in addition, methods can be applied to correct class or sampling imbalance.
- Fitting the network architecture: Choosing the right neural network architecture plays an important role in achieving high prediction accuracy. Techniques such as hyperparameter search or automated machine learning can be used to explore different network configurations, including the number of layers, the number of neurons in each layer, and the activation functions.
- Regularising the model: Regularisation techniques help to control overfitting and generalise the model better to new data. These include L1 or L2 regularisation, dropout of neurons or layers, and dimensionality reduction techniques such as principal component analysis (PCA) and t-distributed stochastic neighbour embedding (t-SNE).
- Using pretraining: Pretraining is useful especially when few labelled data are available. Models can be pretrained on large unlabelled datasets (e.g., with autoencoders or generative models) and then fine-tuned on smaller labelled datasets for the specific task.
- Combining models: Combining multiple models, or using ensembles of models, can significantly improve forecast accuracy. This can be done by merging the forecasts of several models or by using stacking or bagging techniques.
- Hyperparameter tuning: The hyperparameters of a model also affect its prediction accuracy. Optimisation techniques, such as grid search or random search, can be used to find optimal hyperparameter values.
- Improving the learning process: Applying optimisation techniques such as stochastic gradient descent with momentum, the Adam optimiser, and early stopping can make training the neural network more efficient and stable.
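Three of these techniques — L2 regularisation, gradient descent with momentum, and early stopping on a validation set — can be combined in a short sketch. The data and model here are synthetic stand-ins, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic regression task with a noisy linear target (hypothetical data).
X = rng.normal(size=(300, 5))
true_w = np.array([1.5, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w + rng.normal(0.0, 0.3, 300)
Xtr, ytr, Xva, yva = X[:200], y[:200], X[200:], y[200:]

w = np.zeros(5)
v = np.zeros(5)                   # momentum buffer
lr, mu, lam = 0.02, 0.9, 1e-3     # learning rate, momentum, L2 strength

best_w, best_va, patience = w.copy(), np.inf, 0
for epoch in range(500):
    # Gradient of the MSE plus the L2 penalty lam * ||w||^2.
    grad = 2.0 * Xtr.T @ (Xtr @ w - ytr) / len(ytr) + 2.0 * lam * w
    v = mu * v - lr * grad        # gradient descent with momentum (full batch)
    w = w + v
    va = float(np.mean((Xva @ w - yva) ** 2))
    if va < best_va - 1e-6:       # validation improved: keep this model
        best_va, best_w, patience = va, w.copy(), 0
    else:                         # no improvement: count towards early stop
        patience += 1
        if patience >= 30:
            break

print(f"best validation MSE: {best_va:.3f}")
```

The early-stopping rule keeps the weights with the best validation error, which is the generalisation safeguard this bullet list describes.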
- A certain value is given to the input;
- The value that should appear at the output is known in advance;
- If the output value differs from the desired value, the network is adjusted until the difference is minimised [37].
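These three conditions describe the standard supervised loop; a minimal sketch with a single linear neuron trained by the delta rule (a hypothetical task, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(2)

# A value is given to the input; the desired output is known in advance.
X = rng.uniform(-1.0, 1.0, (100, 2))
target = 0.7 * X[:, 0] - 0.3 * X[:, 1]      # desired output values

w = np.zeros(2)
for _ in range(500):
    out = X @ w                              # current network output
    # Where the output differs from the desired value, adjust the weights
    # to shrink the difference (delta rule: a gradient step on the MSE).
    w -= 0.1 * X.T @ (out - target) / len(target)

diff = float(np.mean((X @ w - target) ** 2))
print(f"mean squared difference: {diff:.2e}")
```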
3. Mathematical Models of the Used Neural Networks
4. Description of the Mathematical Model of Neural Network Grouping
- Bagging: This method trains a set of independent neural networks on subsets of the training data obtained by bootstrap sampling. The predictions of the individual networks are then combined, for example, by majority voting or averaging.
- Boosting: Unlike bagging, boosting builds a sequence of neural networks, each of which learns to correct the errors of the previous ones. At each iteration, the sample weights are adjusted to focus on the examples that the previous networks misclassified; hence, subsequent networks concentrate on complex and poorly classified examples [48].
- Stacking: Here, the predictions of several neural networks become inputs to another model (a meta-model) that produces the final prediction. The meta-model is thus trained to combine and exploit the predictions of the different networks to produce the best overall result.
- Parameter values are random numbers from a given range;
- The synaptic weights of a new neuron are determined by splitting one of the old neurons.
- The bias of the ensemble-averaged function F̄(x) of a group of networks is exactly the same as the bias of the function F(x) of a single neural network;
- The variance of F̄(x) is smaller than the variance of F(x).
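Both properties — unchanged bias and reduced variance for the group-averaged prediction — can be checked empirically with a toy bagging setup, using a high-variance polynomial regressor as a stand-in for a neural network (all data and settings hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)

# One fixed training set drawn from a noisy sine target.
x = rng.uniform(-1.0, 1.0, 60)
y = np.sin(3.0 * x) + rng.normal(0.0, 0.4, 60)
x0 = 0.5                            # point at which predictions are compared

def fit_predict(xs, ys, x0, deg=5):
    # A deliberately high-variance base learner standing in for a network.
    return np.polyval(np.polyfit(xs, ys, deg), x0)

# 500 bootstrap models F_i; record each one's prediction at x0.
preds = []
for _ in range(500):
    i = rng.integers(0, len(x), len(x))   # bootstrap resample indices
    preds.append(fit_predict(x[i], y[i], x0))
preds = np.array(preds)

# Group them into 20 bagged ensembles of 25 models and average each group.
ens = preds.reshape(20, 25).mean(axis=1)

print(f"mean: single {preds.mean():.3f} vs bagged {ens.mean():.3f}")
print(f"var:  single {preds.var():.4f} vs bagged {ens.var():.4f}")
```

The two means coincide (averaging does not change the bias), while the variance of the bagged predictions is far smaller than that of the individual models.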
5. Review of the Results of the Considered Mathematical Models
6. Conclusions
References
- Baldauf, M.; Seifert, A.; Forstner, J.; Majewski, D.; Raschendorfer, M.; Reinhardt, T. Operational convective-scale numerical weather prediction with the COSMO model: Description and sensitivities. Mon. Weather Rev. 2011, 139, 3887–3905.
- Brisson, E.; Blahak, U.; Lucas-Picher, P.; Purr, C.; Ahrens, B. Contrasting lightning projection using the lightning potential index adapted in a convection-permitting regional climate model. Clim. Dyn. 2021, 57, 2037–2051.
- Voitovich, E.V.; Kononenko, R.V.; Konyukhov, V.Y.; Tynchenko, V.; Kukartsev, V.A.; Tynchenko, Y.A. Designing the Optimal Configuration of a Small Power System for Autonomous Power Supply of Weather Station Equipment. Energies 2023, 16, 5046.
- Kleyko, D.; Rosato, A.; Frady, E.P.; Panella, M.; Sommer, F.T. Perceptron Theory Can Predict the Accuracy of Neural Networks. IEEE Trans. Neural Netw. Learn. Syst. 2023, 1–15.
- Stephan, K.; Schraff, C. Improvements of the operational latent heat nudging scheme used in COSMO-DE at DWD. COSMO Newsl. 2008, 9, 7–11.
- Steppeler, J.; Doms, G.; Schaettler, U.; Bitzer, H.-W.; Gassmann, A.; Damrath, U.; Gregoric, G. Meso-gamma scale forecasts using the nonhydrostatic model LM. Meteorol. Atmos. Phys. 2003, 82, 75–96.
- Armenta, M.; Jodoin, P.-M. The Representation Theory of Neural Networks. Mathematics 2021, 9, 3216.
- Bengio, Y.; Goodfellow, I.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2015.
- Meng, L.; Zhang, J. IsoNN: Isomorphic Neural Network for Graph Representation Learning and Classification. arXiv 2019, arXiv:1907.09495.
- Sorokova, S.N.; Efremenkov, E.A.; Qi, M. Mathematical Modeling the Performance of an Electric Vehicle Considering Various Driving Cycles. Mathematics 2023, 11, 2586.
- Dozat, T. Incorporating Nesterov momentum into Adam. In Proceedings of the ICLR Workshop, San Juan, Puerto Rico, 2–4 May 2016; Volume 1, pp. 2013–2016.
- Xie, S.; Kirillov, A.; Girshick, R.; He, K. Exploring Randomly Wired Neural Networks for Image Recognition. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Seoul, Republic of Korea, 27 October–2 November 2019.
- He, K.; Zhang, X.; Ren, S.; Sun, J. Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification. arXiv 2015, arXiv:1502.01852. Available online: https://arxiv.org/pdf/1502.01852.pdf (accessed on 1 December 2023).
- Kukartsev, V.V.; Martyushev, N.V.; Kondratiev, V.V.; Klyuev, R.V.; Karlina, A.I. Improvement of Hybrid Electrode Material Synthesis for Energy Accumulators Based on Carbon Nanotubes and Porous Structures. Micromachines 2023, 14, 1288.
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778.
- Kalnay, E. Atmospheric Modelling, Data Assimilation and Predictability; Cambridge University Press: Cambridge, UK, 2003.
- Kazakova, E.; Rozinkina, I.; Chumakov, M. Verification of results of the working technology SNOWE for snow water equivalent and snow density fields determination as initial data for COSMO model. COSMO Newsl. 2016, 16, 25–36.
- Chernykh, N.; Mikhalev, A.; Dmitriev, V.; Tynchenko, V.; Shutkina, E. Comparative Analysis of Existing Measures to Reduce Road Accidents in Western Europe. In Proceedings of the 2023 22nd International Symposium INFOTEH-JAHORINA, INFOTEH, Sarajevo, Bosnia and Herzegovina, 15–17 March 2023.
- Krasnopolsky, V.M.; Lin, Y. A neural network nonlinear multimodel ensemble to improve precipitation forecasts over continental US. Adv. Meteorol. 2012, 2012, 649450.
- Marzban, C. A neural network for post-processing model output: ARPS. Mon. Weather Rev. 2003, 131, 1103–1111.
- Warner, T.T. Numerical Weather and Climate Prediction; Cambridge University Press: Cambridge, UK, 2010.
- Ye, C.; Zhao, C.; Yang, Y.; Fermuller, C.; Aloimonos, Y. LightNet: A Versatile, Standalone Matlab-based Environment for Deep Learning. arXiv 2016, arXiv:1605.02766. Available online: https://arxiv.org/pdf/1605.02766.pdf (accessed on 1 December 2023).
- Zurada, J.M. Introduction to Artificial Neural Systems; PWS: New York, NY, USA, 1992.
- Goyal, M.; Goyal, R.; Lall, B. Learning Activation Functions: A New Paradigm of Understanding Neural Networks. arXiv 2019, arXiv:1906.09529.
- Burgers, G.; Van Leeuwen, P.J.; Evensen, G. Analysis scheme in the ensemble Kalman filter. Mon. Weather Rev. 1998, 126, 1719–1724.
- Dey, R.; Chakraborty, S. Convex-hull & DBSCAN clustering to predict future weather. In Proceedings of the 2015 International Conference and Workshop on Computing and Communication (IEMCON), Vancouver, BC, Canada, 15–17 October 2015; pp. 1–8.
- Saima, H.; Jaafar, J.; Belhaouari, S.; Jillani, T.A. Intelligent methods for weather forecasting: A review. In Proceedings of the 2011 National Postgraduate Conference, Perak, Malaysia, 19–20 September 2011; pp. 1–6.
- Salman, A.G.; Kanigoro, B.; Heryadi, Y. Weather forecasting using deep learning techniques. In Proceedings of the International Conference on Advanced Computer Science and Information Systems (ICACSIS), Depok, Indonesia, 10–11 October 2015; pp. 281–285.
- Singh, N.; Chaturvedi, S.; Akhter, S. Weather forecasting using machine learning algorithm. In Proceedings of the 2019 International Conference on Signal Processing and Communication (ICSC), Noida, India, 7–9 March 2019; pp. 171–174.
- Sela, J.G. The Implementation of the Sigma Pressure Hybrid Coordinate into the GFS. Office Note (National Centers for Environmental Prediction (U.S.)). 2009; p. 461. Available online: https://repository.library.noaa.gov/view/noaa/11401 (accessed on 1 December 2023).
- Weather Research & Forecasting Model (WRF). Mesoscale & Microscale Meteorology Laboratory, NCAR. Available online: https://www.mmm.ucar.edu/models/wrf (accessed on 1 December 2023).
- Yonekura, K.; Hattori, H.; Suzuki, T. Short-term local weather forecast using dense weather station by deep neural network. In Proceedings of the IEEE International Conference on Big Data (Big Data), Seattle, WA, USA, 10–13 December 2018; pp. 1683–1690.
- Bauer, P.; Thorpe, A.; Brunet, G. The quiet revolution of numerical weather prediction. Nature 2015, 525, 47–55.
- Buschow, S.; Friederichs, P. Local dimension and recurrent circulation patterns in long-term climate simulations. arXiv 2018, arXiv:1803.11255.
- C3S: ERA5: Fifth Generation of ECMWF Atmospheric Reanalyses of the Global Climate. Copernicus Climate Change Service Climate Data Store (CDS). 2017. Available online: https://cds.climate.copernicus.eu/cdsapp#!/home (accessed on 7 June 2019).
- Compo, G.P.; Whitaker, J.S.; Sardeshmukh, P.D.; Matsui, N.; Allan, R.J.; Yin, X.; Gleason, B.E.; Vose, R.S.; Rutledge, G.; Bessemoulin, P.; et al. The twentieth century reanalysis project. Q. J. R. Meteorol. Soc. 2011, 137, 1–28.
- Coors, B.; Paul Condurache, A.; Geiger, A. SphereNet: Learning spherical representations for detection and classification in omnidirectional images. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 518–533.
- Dee, D.P.; Uppala, S.M.; Simmons, A.; Berrisford, P.; Poli, P.; Kobayashi, S.; Andrae, U.; Balmaseda, M.; Balsamo, G.; Bauer, P.; et al. The ERA-Interim reanalysis: Configuration and performance of the data assimilation system. Q. J. R. Meteorol. Soc. 2011, 137, 553–597.
- Dueben, P.D.; Bauer, P. Challenges and design choices for global weather and climate models based on machine learning. Geosci. Model Dev. 2018, 11, 3999–4009.
- Faranda, D.; Messori, G.; Yiou, P. Dynamical proxies of North Atlantic predictability and extremes. Sci. Rep. 2017, 7, 41278.
- Faranda, D.; Messori, G.; Vannitsem, S. Attractor dimension of time-averaged climate observables: Insights from a low-order ocean-atmosphere model. Tellus A 2019, 71, 1–11.
- Fraedrich, K.; Jansen, H.; Kirk, E.; Luksch, U.; Lunkeit, F. The Planet Simulator: Towards a user-friendly model. Meteorol. Z. 2005, 14, 299–304.
- Freitas, A.C.M.; Freitas, J.M.; Todd, M. Hitting time statistics and extreme value theory. Probab. Theory Rel. 2010, 147, 675–710.
- Tynchenko, V.; Kurashkin, S.; Murygin, A.; Bocharov, A.; Seregin, Y. Software for optimization of beam output during electron beam welding of thin-walled structures. Procedia Comput. Sci. 2022, 200, 843–851.
- Kukartsev, V.V.; Tynchenko, V.S.; Bukhtoyarov, V.V.; Wu, X.; Tyncheko, Y.A.; Kukartsev, V.A. Overview of Methods for Enhanced Oil Recovery from Conventional and Unconventional Reservoirs. Energies 2023, 16, 4907.
- Krasnopolsky, V.M.; Fox-Rabinovitz, M.S. Complex hybrid models combining deterministic and machine learning components for numerical climate modelling and weather prediction. Neural Netw. 2006, 19, 122–134.
- Filina, O.A.; Sorokova, S.N.; Efremenkov, E.A.; Valuev, D.V.; Qi, M. Stochastic Models and Processing Probabilistic Data for Solving the Problem of Improving the Electric Freight Transport Reliability. Mathematics 2023, 11, 4836.
- Krasnopolsky, V.M.; Fox-Rabinovitz, M.S.; Belochitski, A.A. Using ensemble of neural networks to learn stochastic convection parameterisations for climate and numerical weather prediction models from data simulated by a cloud resolving model. Adv. Artif. Neural Syst. 2013, 2013, 485913.
- Lorenz, E.N. Deterministic nonperiodic flow. J. Atmos. Sci. 1963, 20, 130–141.
- Das, P.; Manikandan, S.; Raghavendra, N. Holomorphic aspects of moduli of representations of quivers. Indian J. Pure Appl. Math. 2019, 50, 549–595.
- Konyukhov, V.Y.; Oparina, T.A.; Zagorodnii, N.A.; Efremenkov, E.A.; Qi, M. Mathematical Analysis of the Reliability of Modern Trolleybuses and Electric Buses. Mathematics 2023, 11, 3260.
- Filina, O.A.; Tynchenko, V.S.; Kukartsev, V.A.; Bashmur, K.A.; Pavlov, P.P.; Panfilova, T.A. Increasing the Efficiency of Diagnostics in the Brush-Commutator Assembly of a Direct Current Electric Motor. Energies 2024, 17, 17.
- Frankle, J.; Carbin, M. The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks. In Proceedings of the International Conference on Learning Representations, New Orleans, LA, USA, 6–9 May 2019.
- Nooteboom, P.D.; Feng, Q.Y.; López, C.; Hernández-García, E.; Dijkstra, H.A. Using network theory and machine learning to predict El Niño. Earth Syst. Dynam. 2018, 9, 969–983.
- O’Gorman, P.A.; Dwyer, J.G. Using Machine Learning to Parameterize Moist Convection: Potential for Modeling of Climate, Climate Change, and Extreme Events. J. Adv. Model. Earth Syst. 2018, 10, 2548–2563.
- Golik, V.I.; Brigida, V.; Kukartsev, V.V.; Tynchenko, Y.A.; Boyko, A.A.; Tynchenko, S.V. Substantiation of Drilling Parameters for Undermined Drainage Boreholes for Increasing Methane Production from Unconventional Coal-Gas Collectors. Energies 2023, 16, 4276.
- Volneikina, E.; Kukartseva, O.; Menshenin, A.; Tynchenko, V.; Degtyareva, K. Simulation-Dynamic Modeling of Supply Chains Based on Big Data. In Proceedings of the 2023 22nd International Symposium INFOTEH-JAHORINA, INFOTEH 2023, Sarajevo, Bosnia and Herzegovina, 15–17 March 2023.
- Poli, P.; Hersbach, H.; Dee, D.P.; Berrisford, P.; Simmons, A.J.; Vitart, F.; Laloyaux, P.; Tan, D.G.H.; Peubey, C.; Thépaut, J.-N.; et al. ERA-20C: An Atmospheric Reanalysis of the Twentieth Century. J. Clim. 2016, 29, 4083–4097.
- Semenova, E.; Tynchenko, V.; Chashchina, S.; Suetin, V.; Stashkevich, A. Using UML to Describe the Development of Software Products Using an Object Approach. In Proceedings of the 2022 IEEE International IOT, Electronics and Mechatronics Conference (IEMTRONICS), Toronto, ON, Canada, 1–4 June 2022.
- Scher, S. Toward Data-Driven Weather and Climate Forecasting: Approximating a Simple General Circulation Model with Deep Learning. Geophys. Res. Lett. 2018, 45, 12616–12622.
- Martyushev, N.V.; Malozyomov, B.V.; Sorokova, S.N.; Efremenkov, E.A.; Valuev, D.V.; Qi, M. Review Models and Methods for Determining and Predicting the Reliability of Technical Systems and Transport. Mathematics 2023, 11, 3317.
- Scher, S. Videos for Weather and climate forecasting with neural networks: Using GCMs with different complexity as studyground. Zenodo 2019.
- Scher, S. Code and data for Weather and climate forecasting with neural networks: Using GCMs with different complexity as study-ground. Zenodo 2019.
- Martyushev, N.V.; Malozyomov, B.V.; Kukartsev, V.V.; Gozbenko, V.E.; Konyukhov, V.Y.; Mikhalev, A.S.; Kukartsev, V.A.; Tynchenko, Y.A. Determination of the Reliability of Urban Electric Transport Running Autonomously through Diagnostic Parameters. World Electr. Veh. J. 2023, 14, 334.
- Scher, S.; Messori, G. Predicting weather forecast uncertainty with machine learning. Q. J. R. Meteorol. Soc. 2018, 144, 2830–2841.
- Schneider, T.; Lan, S.; Stuart, A.; Teixeira, J. Earth System Modeling 2.0: A Blueprint for Models That Learn from Observations and Targeted High-Resolution Simulations. Geophys. Res. Lett. 2017, 44, 12396–12417.
Principle of Model Construction | Mathematical Description | Input Information | Output Information |
---|---|---|---|
D-PNN | Approximation of functions described by differential equations that capture the relationships between the input parameters of the system | x(1), x(2), x(3), x(4) … x(n) | x(n + 1) |
RBFN | Approximation of the unknown solution by functions of a special kind (radial basis functions) whose arguments are distances | x(1), x(2), x(3), x(4) … x(n) | x(n + 1) |
MLPN | Approximation of the unknown solution using nonlinear functions | x(1), x(2), x(3), x(4) … x(n) | x(n + 1) |
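As a concrete instance of the table's input/output scheme (a window x(1)…x(n) mapped to x(n+1)), here is a minimal RBFN sketch: Gaussian functions of the distance to fixed centres, with output weights fitted by least squares. Data, centre count, and width are synthetic and illustrative, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic series: windows x(1)..x(n) predict x(n+1), as in the table.
s = np.sin(np.arange(300) * 0.2) + 0.05 * rng.normal(size=300)
n = 6
X = np.stack([s[i:i + n] for i in range(len(s) - n)])
y = s[n:]

# Hidden layer: Gaussian radial basis functions of the distance to centres
# chosen as random training windows.
centres = X[rng.choice(len(X), 30, replace=False)]
width = 1.0

def rbf(Z):
    d2 = ((Z[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

# Output weights by linear least squares on the RBF features.
Phi = rbf(X)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

mse = float(np.mean((Phi @ w - y) ** 2))
print(f"one-step-ahead training MSE: {mse:.5f}")
```

The "arguments are distances" entry in the table corresponds to the `d2` term: each hidden unit responds to how far the input window lies from its centre.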
Temp | City | Daytime | Day_of_Year | Year | Humidity, % | Wind Speed, kph | Pressure, mBar | Weather Conditions |
---|---|---|---|---|---|---|---|---|
13.2 | 1 | 3 | 1 | 2022 | 18 | 6 | 1300 | 3 |
14.6 | 1 | 4 | 1 | 2022 | 19 | 7 | 1150 | 4 |
… | … | … | … | … | … | … | … | … |
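Records like those in the table are typically min-max scaled before training, as noted in the preprocessing discussion above. A sketch with hypothetical rows shaped like the table's columns (the third row is invented for illustration):

```python
import numpy as np

# Hypothetical rows shaped like the table: temp, city id, daytime, day of
# year, year, humidity %, wind speed, pressure, weather-condition code.
rows = np.array([
    [13.2, 1, 3, 1, 2022, 18,  6, 1300, 3],
    [14.6, 1, 4, 1, 2022, 19,  7, 1150, 4],
    [ 9.8, 2, 2, 2, 2022, 25, 11, 1210, 2],
])

# Min-max scale each column to [0, 1], guarding constant columns (year here).
lo, hi = rows.min(axis=0), rows.max(axis=0)
span = np.where(hi > lo, hi - lo, 1.0)
scaled = (rows - lo) / span
print(scaled[0])
```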
Time of Year | Ensemble | D-PNN | RBFN | MLPN |
---|---|---|---|---|
Winter | 12% | 18% | 15% | 16% |
Spring | 10% | 16% | 14% | 13% |
Summer | 8% | 14% | 16% | 11% |
Autumn | 9% | 13% | 16% | 12% |
№ | Country | Ensemble (2021/2022, %) | D-PNN (2021/2022, %) | RBFN (2021/2022, %) | MLPN (2021/2022, %) |
---|---|---|---|---|---|
1 | USA | 89.1/91.4 | 75.65/77.3 | 79.21/80.99 | 77.4/79.2 |
2 | Canada | 87.3/90.1 | 73.9/76.5 | 77.43/80.1 | 75.6/78.3 |
3 | Europe | 88.7/91.6 | 74.8/77.35 | 78.32/80.99 | 76.5/79.2 |
4 | UK | 89.4/90.0 | 75.65/76.5 | 79.21/80.1 | 77.4/78.3 |
5 | Australia | 87.2/90.3 | 73.95/76.5 | 77.43/80.1 | 75.6/78.3 |
6 | Germany | 88.8/90.7 | 74.8/76.5 | 78.32/80.1 | 76.5/78.3 |
7 | Japan | 86.2/88.0 | 73.1/74.8 | 76.54/78.4 | 74.8/76.6 |
8 | China | 87.9/88.2 | 73.95/74.8 | 77.43/78.3 | 75.6/76.6 |
9 | Korea | 88.3/90.8 | 74.8/76.5 | 78.32/80.1 | 76.5/78.3 |
10 | Russia | 86.5/90.2 | 73.1/76.5 | 76.54/80.1 | 74.8/78.3 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Malozyomov, B.V.; Martyushev, N.V.; Sorokova, S.N.; Efremenkov, E.A.; Valuev, D.V.; Qi, M. Analysis of a Predictive Mathematical Model of Weather Changes Based on Neural Networks. Mathematics 2024, 12, 480. https://doi.org/10.3390/math12030480