Article

Short-Term Solar Insolation Forecasting in Isolated Hybrid Power Systems Using Neural Networks

Pavel Matrenin, Vadim Manusov, Muso Nazarov, Murodbek Safaraliev, Sergey Kokin, Inga Zicmane and Svetlana Beryozkina
1 Ural Power Engineering Institute, Ural Federal University, 19 Mira Str., 620002 Yekaterinburg, Russia
2 Department of Power Supply Systems, Novosibirsk State Technical University, 20 K. Marx Ave., 630073 Novosibirsk, Russia
3 Faculty of Electrical and Environmental Engineering, Riga Technical University, 12/1 Azenes Str., 1048 Riga, Latvia
4 College of Engineering and Technology, American University of the Middle East, Egaila 54200, Kuwait
* Author to whom correspondence should be addressed.
Inventions 2023, 8(5), 106; https://doi.org/10.3390/inventions8050106
Submission received: 11 July 2023 / Revised: 19 August 2023 / Accepted: 22 August 2023 / Published: 23 August 2023

Abstract

Solar energy is an unlimited and sustainable energy source that holds great importance during the global shift towards environmentally friendly energy production. However, integrating solar power into electrical grids is challenging due to significant fluctuations in its generation. This research aims to develop a model for predicting solar radiation levels for a hybrid power system in the Gorno-Badakhshan Autonomous Oblast of Tajikistan. This study determined the optimal hyperparameters of a multilayer perceptron neural network to enhance the accuracy of solar radiation forecasting. These hyperparameters included the number of neurons, learning algorithm, learning rate, and activation functions. Since there are numerous combinations of hyperparameters, the neural network training process needed to be repeated multiple times. Therefore, a control algorithm of the learning process was proposed to identify stagnation or the emergence of erroneous correlations during model training. The results reveal that different seasons require different hyperparameter values, emphasizing the need for the meticulous tuning of machine learning models and the creation of multiple models for varying conditions. The achieved mean absolute percentage error for one-hour-ahead forecasting ranges from 0.6% to 1.7%, indicating a high accuracy compared to the current state of the art in this field. The error for one-day-ahead forecasting is between 2.6% and 7.2%.

1. Introduction

Nowadays, the practice of forecasting is increasingly prevalent across various sectors, including the national economy, business domains, and applied research [1]. For example, it is employed in predicting electricity consumption, financial market prices, technical device failures, weather conditions, and many other factors [2]. The concept of forecasting is derived from the Greek word “πρόγνωση”, meaning “foresight” or “prediction”, and refers to a scientifically informed assessment of the future state of a studied system, considering the present moment and the combined impact of internal and external factors. Statistical data and historical models are examined using software tools and data analysis systems capable of handling extensive datasets gathered over an extended period in order to generate a forecast. The primary objective is to mitigate the risks associated with financial, marketing, and operational decision making by anticipating future demand and trends [3,4]. The accuracy (or reliability) of a forecast serves as a measure of the quality of the analysis. It shows the degree to which the predicted outcomes align with actual values; thus, the accuracy is directly influenced by the forecasting system and the methods employed [5]. Enhancing reliability, reducing risks, and minimizing potential losses are achieved by employing a variety of resources in the forecasting process. For instance, methods such as scenarios, the construction of a “goal tree”, and morphological analysis have gained widespread usage when predicting outcomes in fundamental research, system analysis, and synthesis. Conversely, statistical approaches are frequently constrained by the absence or scarcity of initial data and by the challenges in determining the nature of the predicted phenomenon [6].
The task of forecasting electricity consumption patterns in order to manage load profiles in the electrical grid and effectively oversee modern energy power systems (EPS) holds tremendous importance from an energy perspective [7,8]. The planning and management of energy power systems have gained particular significance in light of the increasing integration of renewable energy sources (RES) and the inherently stochastic nature of their energy generation. Accurately predicting electricity generation from wind and solar power plants (i.e., providing estimates of expected electricity output) has become crucial for grid operators. It is essential for maintaining a balance between energy supply and demand, facilitating the successful integration of distributed renewable energy sources (DRES), optimizing the component sizes in autonomous hybrid systems, and enhancing the reliability, efficiency, and safety of isolated systems [9]. Furthermore, the global trend of expanding network capacity through the integration of hybrid distributed systems that utilize renewable energy is emerging as an option for a sustainable energy supply for the foreseeable future [10].
To address the aforementioned challenges, there is a need for high-precision forecasting systems for renewable energy generation, enabling the effective planning of electricity production for the upcoming day [11]. Typically, solar or wind power facilities cannot guarantee a precise output of electrical power at a given time, making the forecast of generated energy and its supply to the grid highly relevant [12,13].
Solar energy represents a dynamic sector of the energy industry characterized by significant regional disparities in its development [14]. The task of forecasting solar power output presents greater intricacies and complexities compared to other energy sources due to the influence of meteorological factors on the performance of photovoltaic arrays. These factors exhibit stochastic behavior, making it challenging to accurately predict changes in power generation. Consequently, the absorbed solar radiation can be modeled as a correlation between the maximum power output of the photovoltaic modules and solar radiation levels. The intensity of solar radiation directly depends on the geographical location, time, and orientation of the solar panel in relation to both the sun and the sky [15]. In such cases, quite complex mathematical models are used, for instance, deep neural networks [16] and recurrent neural networks [17].
The primary objective of this study is to develop a model for predicting solar insolation by employing adaptive neural network techniques. This model is designed to forecast hourly solar radiation patterns. In the proposed model, the hidden-layer neurons utilize the rectified linear unit (ReLU) and sigmoid activation functions. A comparative analysis is conducted between two learning methods: conventional stochastic gradient descent (SGD) and adaptive moment estimation (Adam). To evaluate the performance of this model, testing is conducted for an isolated hybrid power system located in the Gorno-Badakhshan Autonomous Oblast (GBAO), considering data for a winter month. In this region, the unpredictability of the energy generation from renewable and alternative sources (including small mountain river-based hydropower plants (HPPs), solar power plants (SPPs), and wind power plants (WPPs) combined with energy storage devices) poses a significant challenge to maintaining a stable electric energy balance. Therefore, the problem addressed in this study holds substantial relevance [18,19]. The problem of controlling the autonomous power system of the GBAO is described in [18], and the problem of the short-term forecasting of wind speed, which is necessary for forecasting generation at WPPs, is considered in [19].
Recently, the global solar energy industry has begun to predict the output of renewable energy over the short-term horizon, which ranges from an hour to a week. It is important to predict solar insolation for different horizons, since this allows better planning and management of renewable energy systems. For example, if the forecast predicts low insolation for the next few hours, energy storage systems can be charged in advance to ensure a continuous power supply. Similarly, if high insolation is predicted, excess energy can be sold back to the grid or stored for later use. Additionally, accurate short-term forecasting can help grid operators balance the supply and demand of electricity in real time, leading to a more reliable and efficient power system. However, the forecasting methods currently in use lack free, practical software frameworks that small energy companies could adopt; therefore, such companies are forced to build their own predictive models.
Over the past few years, numerous methods have been proposed to forecast solar radiation values. All existing methods for predicting the generation of electric energy by solar power plants (SPPs) can be categorized into three main groups [20]:
Physical models: These models rely on the relationship between weather patterns and solar radiation. They utilize numerical weather forecasting and electricity generation data from stations. Various factors are taken into account, for instance, location, historical orientation data, meteorological variables, characteristics of the photovoltaic installation, and predicted weather variables, such as global horizontal irradiance. However, the accuracy of forecasts based on physical models largely depends on changes in meteorological elements [21]. An example of a popular physical model is the numerical weather prediction (NWP) model, which uses a set of mathematical equations to describe the physical state and motion of the atmosphere. Studies like [22] propose methodologies to assess the influence of clouds, ambient temperature, and other factors on the hourly output power of photovoltaic systems. Additionally, research in [21] proposes an approach to forecasting solar and photovoltaic energy through the post-processing of a global environmental multiscale numerical weather-forecasting model. Descriptions of various methods for numerical weather prediction, including satellite-based techniques, are reviewed in [23].
Statistical models: These models describe the relationship between the solar radiation flux density, obtained using numerical weather forecasting, and the generation of electrical energy at an SPP through the statistical analysis of time series. They can utilize the following techniques: autoregressive (AR), autoregressive moving average (ARMA), autoregressive integrated moving average (ARIMA), autoregressive integrated moving average with exogenous inputs (ARIMAX), or other forecast models based on artificial intelligence (AI) [24]. The accuracy of these forecasts depends on the quantity and quality of the historical input data [25]. For instance, the study in [26] presents a two-stage method that involves the statistical normalization of solar energy using a clear-sky model in the first stage before forecasting the normalized solar energy using adaptive linear time-series models. The authors of [27] apply the ARMA model to the short-term forecasting of solar potential. Seasonal ARIMA (SARIMA) models are proposed in [28], based on insolation data from NASA’s POWER (Prediction of Worldwide Energy Resources) data archive.
Machine learning models: These models employ artificial intelligence systems to establish the relationship between forecasted weather conditions and the output power of a power plant. Artificial neural networks (ANNs) are particularly powerful tools for solving problems that require difficult-to-obtain knowledge, and they have been extensively used [29,30]. ANNs have been proposed and successfully applied for solar radiation forecasting. They provide accurate simulation results by leveraging the self-learning and adaptive capabilities of neural architectures, thus reducing the need for human intervention [31]. For example, ref. [32] proposes models using radial basis functions (RBF) and a multilayer perceptron for estimating solar radiation based on a preliminary estimation of the clearness index; meanwhile, ref. [33] investigates the use of backpropagation artificial neural networks (BAN) for estimating global solar radiation.
Deep neural network models have been introduced to model and predict solar radiation data using meteorological and geographical parameters without requiring knowledge of the data generation process [34]. Developing a reliable model for estimating or forecasting solar radiation data typically requires a significant amount of long-term data [35]. In addition to neural networks, other machine learning methods are applied to solar insolation forecasting; among them, ensemble methods such as XGBoost [36], gradient boosting, and random forest [37] show high accuracy. A minimal illustrative sketch of such an ensemble baseline is given below.
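The following sketch is purely illustrative (it is not code from the cited studies): it trains a scikit-learn random forest regressor on lagged hourly insolation values to produce a one-hour-ahead estimate. The lag depth, estimator count, and the synthetic placeholder series are arbitrary assumptions.

```python
# Illustrative ensemble baseline: a random forest on lagged insolation values.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def make_lagged(series: np.ndarray, n_lags: int = 24):
    """Build (samples, n_lags) inputs and next-hour targets from a 1-D series."""
    x = np.stack([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return x, y

# Synthetic placeholder standing in for measured hourly insolation
series = np.abs(np.sin(np.linspace(0.0, 60.0, 24 * 60)))
x, y = make_lagged(series)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(x, y)
one_hour_ahead = model.predict(x[-1:])   # estimate for the next hour
```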
When using adaptive models for forecasting, it is crucial to consider their properties and operational principles. One significant distinction of adaptive models is their ability to track the properties of a time series and continuously account for the evolving characteristics of the studied processes. The primary objective of adaptive methods is to construct self-correcting models that can capture the changing conditions over time, consider the informational significance of all potential elements in the time series, and provide reliable estimates for the future elements of the given series. Consequently, these models directly serve the purpose of forecasting [38].
One approach in adaptive methods is the piecewise linear approximation method, which entails a reduction in the model’s “memory”. This leads to the “forgetting” of past data and the construction of a regression line based on a limited amount of information. An important advantage of this method is its rapid adaptation when the parameters of the process are changed. However, a drawback of the adaptive method is its susceptibility to interference, small fluctuations, or deviations that can distort the results.
In many instances, hybrid models are formed by combining physical and statistical models. These models aim to reduce the computational complexity and time required for online forecasting applications [39]. For example, the authors of [40] propose a hybrid technique that merges stochastic learning, earth image processing using remote sensing, and ground telemetry. They claim that such hybrid models offer significant benefits in terms of precision and reliability, leading to highly accurate predictions of solar activity.
This study presents the findings of an investigation into the impact of the hyperparameters of feed-forward neural networks on the accuracy of the short-term and medium-term forecasting of solar insolation. The contribution can be formulated in two points. Firstly, it is shown that careful tuning of the hyperparameters is required for different seasons of the year and different forecasting horizons, even for compact models. Secondly, short-term solar insolation forecasting, together with an assessment of the achievable accuracy, is carried out for the first time for the Gorno-Badakhshan Autonomous Oblast of the Republic of Tajikistan, which is important for the sustainable development of the energy sector in this region and for improving the quality of people’s lives.
This paper is structured as follows: Section 2 describes the power system under study, located in the Gorno-Badakhshan Autonomous Oblast, and details the development of the model employed for forecasting hourly solar radiation; Section 3 presents and discusses the obtained results; finally, Section 4 summarizes the outcomes.

2. Methodology

2.1. Power System under Study

The GBAO has an isolated hybrid power system that consists of HPPs and one SPP operating independently from the main electrical power system of the country. Currently, Tajikistan does not utilize solar energy for industrial purposes, despite the favorable climatic conditions in this region [41].
Solar energy in Tajikistan could fulfill a quarter of the country’s electricity consumption if extensively harnessed. The annual solar energy potential is estimated to be 25 billion kWh, but a significant portion of this potential remains untapped. Nevertheless, Tajikistan is exploring options to utilize some of its solar resources for water heating. The territory’s climatic conditions are conducive to solar energy utilization, with total solar insolation ranging from 800 to 900 W/m2 (or from 8500 to 9000 MJ/m2) and direct solar radiation intensity varying from 1.30 to 1.7 cal/cm2/min. These values are even higher in mountainous areas, particularly in the Eastern Pamirs, where hydroresources are limited [42]. Some mountainous areas experience fewer hours of sunshine due to prevalent cloudy weather throughout the year and the presence of rugged terrain (e.g., Dekhauz—2097 h, Fedchenko glacier—2116 h). The longest duration of sunshine (over 3000 h per year) is observed in the southern part of the country (Pyanj—3029 h) and in the Eastern Pamirs (Lake Karakul—3166 h).
The current situation reveals a significant electricity shortage in the GBAO’s electrical power system during winter due to reduced electricity production from the HPPs caused by a significant drop in river water levels [18]. Consequently, using solar energy can serve as an alternative solution to address this problem as solar insolation forecasts can help mitigate the challenges associated with the intermittent nature of solar energy [43].
The GBAO’s climatic conditions are exceptionally favorable for the utilization of solar energy (Table 1). For instance, with clear skies, the total solar radiation ranges from 700 to 800 W/m2 (or from 7500 to 8000 MJ/m2) and direct solar radiation intensity ranges from 1.30 to 1.7 cal/cm2/min [42]. The solar energy potential indicators for the GBAO are depicted in Figure 1.
The proximity of energy consumers allows for the installation of solar panels without the need for extensive power line infrastructure, presenting a significant advantage for solar energy in the GBAO [19].

2.2. Forecasting Model and Method

Complex models, such as deep recurrent networks or networks with convolutional layers, are unnecessary in the proposed model since it solely relies on the historical data of solar insolation. Instead, this study selected a multilayer perceptron with a single hidden layer after conducting preliminary experiments. The simplicity of this model safeguards against overfitting and enables a vast number of computational experiments to be conducted within a limited timeframe to investigate hyperparameters.
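To make this architecture concrete, the following minimal sketch (assuming Keras/TensorFlow, which the paper does not prescribe) builds such a single-hidden-layer perceptron; the number of lagged inputs and hidden neurons shown are placeholders rather than the tuned values reported later.

```python
# A single-hidden-layer perceptron for insolation forecasting (illustrative only).
import tensorflow as tf

def build_mlp(n_inputs=30, n_hidden=9, activation="sigmoid", optimizer="adam"):
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(n_inputs,)),
        tf.keras.layers.Dense(n_hidden, activation=activation),  # single hidden layer
        tf.keras.layers.Dense(1, activation="linear"),           # insolation value
    ])
    model.compile(optimizer=optimizer, loss="mse",
                  metrics=["mean_absolute_percentage_error"])
    return model
```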
There are numerous methods for training neural networks; however, backpropagation is the most widely utilized approach. In this scenario, the error backpropagation technique can be combined with various gradient-based optimization algorithms [44,45], such as:
  • Stochastic gradient descent;
  • Momentum;
  • RMSProp;
  • Adagrad;
  • AdaMax;
  • Adam;
  • Etc.
In this solution, the neural network uses a multilayer perceptron with one hidden layer as its fundamental architecture [19]. Two different model variants are considered:
The first model exclusively takes input data from previous hours that align with the forecast hour within the same month. This allows for predicting solar insolation values one hour and one day ahead. For instance, to predict the solar insolation value at 3:00 p.m. on 31 January, the model utilizes solar insolation values from 3:00 p.m. on 1 January, 2 January, and so on, until 30 January.
The second model incorporates input data from all previous hours during the week, excluding nighttime hours as they are not representative. This model is capable of predicting solar insolation values one hour ahead only. As an example, to predict the solar insolation value at 3:00 p.m. on 31 January, the model considers solar insolation values from 3:00 p.m. on 24 January, 4:00 p.m. on 24 January, 5:00 p.m. on 24 January, and continues until 2:00 p.m. on 31 January.
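As an illustration, the two input layouts could be assembled as follows, assuming an hourly insolation series indexed by timestamp; the 06:00–18:00 daytime window and the helper names are assumptions, since the paper does not specify them.

```python
# Assembling the two input variants from an hourly insolation series (a sketch).
import pandas as pd

def variant_one_inputs(series: pd.Series, target_time: pd.Timestamp, n_days: int = 30):
    """Variant 1: the same clock hour on the preceding days of the month."""
    lags = [target_time - pd.Timedelta(days=d) for d in range(n_days, 0, -1)]
    return series.reindex(lags).to_numpy()

def variant_two_inputs(series: pd.Series, target_time: pd.Timestamp, n_days: int = 7):
    """Variant 2: all daytime hours of the preceding week up to the forecast hour."""
    window = series.loc[target_time - pd.Timedelta(days=n_days):
                        target_time - pd.Timedelta(hours=1)]
    return window.between_time("06:00", "18:00").to_numpy()  # nighttime hours dropped
```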
The forecasting pipeline is depicted in Figure 2.
The learning process was conducted until the occurrence of overfitting. Each learning stage consisted of 200 epochs (mini-batches of 32 samples were used), and after each stage a decision was made regarding the continuation of the learning process. If stagnation in the decrease of the objective function was observed over the last 200 epochs, the learning process was terminated. Given the extensive exploration of numerous options during this study, it was important to minimize the training time. The ability to stop the learning process early significantly reduced the overall computational workload.
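For reference, one such learning stage could look as follows in Keras; the model is assumed to be compiled with the MAPE metric (as in the earlier sketch), and the data array names are placeholders.

```python
# One learning stage: 200 epochs with mini-batches of 32 samples (a sketch).
def run_stage(model, x_train, y_train, x_val, y_val):
    history = model.fit(x_train, y_train,
                        validation_data=(x_val, y_val),
                        epochs=200, batch_size=32, verbose=0)
    # Validation MAPE at the end of the stage, used by the control algorithm below
    return history.history["val_mean_absolute_percentage_error"][-1]
```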
Table 2 contains the model’s hyperparameters.
The control algorithm of the learning process (Algorithm 1) can be formalized as follows:
Algorithm 1. Learning process
Input: n, f, b, a (values of hyperparameters), training data set.
  • Begin
  • Initialize a neural network model with n neurons in the hidden layer and activation function f
  • Execute the first stage of learning
  • If MAPE1 > 100 then End
  • Execute the second stage of learning
  • If MAPE1 − MAPE2 < 5 then End
  • Execute the third stage of learning
  • If MAPE1 − MAPE3 < 15 then End
  • i = 4
  • While MAPEi < MAPEi−1 + 0.1
    • Execute the ith stage of learning
    • i = i + 1
  • End
Output: weights of the neural network.
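A minimal Python sketch of this control logic is given below; train_stage is a hypothetical callable (for example, a closure around the run_stage sketch above) that runs one 200-epoch stage and returns the resulting MAPE, and the stage cap is a safety guard that is not part of Algorithm 1.

```python
# A sketch of Algorithm 1: staged training with MAPE-based stopping rules.
def controlled_training(model, train_stage, max_stages=20):
    mape = [train_stage(model)]                # stage 1
    if mape[0] > 100:                          # divergence: stop immediately
        return mape
    mape.append(train_stage(model))            # stage 2
    if mape[0] - mape[1] < 5:                  # too little improvement
        return mape
    mape.append(train_stage(model))            # stage 3
    if mape[0] - mape[2] < 15:
        return mape
    while len(mape) < max_stages:              # stages 4, 5, ... (max_stages is a safety cap)
        nxt = train_stage(model)
        mape.append(nxt)
        if nxt >= mape[-2] + 0.1:              # stagnation or degradation
            break
    return mape
```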

3. Results and Discussion

The metric of accuracy employed in this study is the mean absolute percentage error (MAPE). The dataset was divided into an 80% training set and a 20% validation/test set. Figure 3, Figure 4, Figure 5 and Figure 6 illustrate examples of the learning process. It can be seen that at the learning rate of 10−4, the process is too smooth and slow.
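The metric and the split can be written compactly as follows; the chronological ordering of the 80/20 split is an assumption, since only the proportion is stated, and the metric assumes nonzero (daytime) insolation values.

```python
# MAPE metric and 80/20 data split (a sketch; array names are placeholders).
import numpy as np

def mape(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean absolute percentage error, in percent (assumes nonzero y_true)."""
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)

def split_80_20(x: np.ndarray, y: np.ndarray):
    """Split samples into 80% training and 20% validation/test parts."""
    cut = int(0.8 * len(x))
    return (x[:cut], y[:cut]), (x[cut:], y[cut:])
```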
The outcomes of the computational experiments for the first variant (employing previous hours that align with the forecast hour within the same month) are presented in Tables 3–10. A dash in Tables 3–10 indicates that the learning process was terminated before the specified number of learning stages was reached. Tables 4, 6, 8 and 10 illustrate the process of model training for the best option, selected from the experiments for choosing the best hyperparameters given in Tables 3, 5, 7 and 9. These experiments were conducted for all seasons of the year.
Tables 11–14 showcase the best results achieved for the combinations Z + S, Z + A, R + S, and R + A across all seasons when utilizing the first variant. The results demonstrate that the neural network’s hyperparameters greatly influence one another. Certain factors can be determined beforehand, such as the lower learning rate required for SGD compared to Adam (10−4 vs. 10−3, respectively); however, the optimal number of neurons needs to be adjusted accordingly. Furthermore, the ideal model configuration and forecast accuracy depend strongly on the season. The sigmoidal activation function in conjunction with Adam resulted in the longest training duration for three out of the four seasons while also producing the best results. This highlights the importance of an experimental analysis when selecting the activation function, since ReLU is used as the default choice in many popular machine learning frameworks, such as Keras and PyTorch. Moreover, the obtained results indicate that seasonal meteorological conditions exert a significant influence on the performance of even a simple neural network. For each combination of learning method and hidden-layer activation function, different hyperparameter values (such as the number of neurons and the learning rate) yield the best results in different seasons.
Considering the deviation between the test and validation sets (a measure of overfitting), it can be concluded that a compact network with no more than 15 neurons should be chosen for this problem.
To provide a clearer understanding of how the hyperparameter values presented in Table 11, Table 12, Table 13 and Table 14 were determined, the experimental results in Table 15 and Table 16 are provided below. These experiments were conducted across all activation functions, training methods, numbers of neurons, and seasons. The total number of tested parameter combinations for each season amounted to 84, calculated as 2 (Adam, SGD) × 2 (ReLU, Sigmoid) × 7 (3, 6, … 21 neurons) × 3 (10−2, 10−3, 10−4 learning rate values).
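The grid of combinations can be enumerated directly, as in the sketch below; the evaluate call stands in for training and scoring one configuration and is hypothetical.

```python
# Enumerating the 84 per-season hyperparameter combinations (2 x 2 x 7 x 3).
from itertools import product

optimizers = ["adam", "sgd"]            # A, S
activations = ["sigmoid", "relu"]       # Z, R
neuron_counts = list(range(3, 22, 3))   # 3, 6, ..., 21
learning_rates = [1e-2, 1e-3, 1e-4]

grid = list(product(optimizers, activations, neuron_counts, learning_rates))
assert len(grid) == 84

# for opt, act, n, lr in grid:
#     score = evaluate(opt, act, n, lr)  # hypothetical training-and-scoring step
```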
For the second variant (employing all previous hours during the week, excluding nighttime hours), the best option was determined to be Z + A with a learning rate of 10−3. Table 16 illustrates the influence of the count of hidden-layer neurons. The forecasting results, both for one hour ahead (Table 17) and one day ahead (Table 18), are highly dependent on the season and necessitate the selection of the optimal number of neurons. Figure 7 and Figure 8 depict the learning process.
One intriguing finding is that the accuracy of one-hour-ahead forecasts during the summer period is significantly higher compared to the other seasons. However, when predicting insolation 24 h in advance, the accuracy diminishes. This suggests that the summer weather in this specific region is relatively stable over the next hour but becomes less predictable a day ahead.
Figures 9–12 depict the alignment of the forecast with the actual solar insolation patterns. The forecast solar radiation curve closely follows the actual shape for the majority of the days throughout the year. This can be attributed to two key factors. Firstly, the solar radiation data were collected from a highly reliable and precise weather station, ensuring the accuracy of the measurements. Secondly, the mountainous areas of the Pamirs region (the high-altitude area) experience minimal cloud cover and a scarcity of cloudy days. This notion is substantiated by statistical records indicating that the region enjoys over three hundred sunny days annually.
Research findings indicate that the accuracy of short-term forecasting is significantly influenced by the determination of the optimal combination of these factors. Despite the compact nature of the model, the Adam training method demonstrated a considerable improvement over the traditional SGD in terms of performance. While the ReLU activation function is commonly employed as the default, it exhibited lower effectiveness compared to the sigmoid function when paired with the Adam method, but greater effectiveness when used in conjunction with SGD. In general, for the problem of predicting solar insolation for the examined region, the best results were obtained using the sigmoid + Adam (Z + A) combination.
The decline in accuracy during winter can be attributed to a higher frequency of cloudy days (the proportions of sunny days: winter—49%, spring—66%, summer—97%, autumn—80%) and reduced solar insolation levels (the average value of solar insolation in winter is 1.85 times lower than in the rest of the year).

4. Conclusions

To enhance the integration of solar power plants into the conventional energy grid, it is crucial to address vulnerabilities that arise in the network due to the unpredictable nature of these resources. Fluctuations and sudden spikes in power output pose significant challenges for system operators, leading to an impact on system balancing, reserve management, planning, and the availability of generating plants. Consequently, researchers have been focused on developing advanced forecasting methods to predict solar radiation across various timeframes and locations.
Presently, the research in this field primarily revolves around complex neural network models. However, even the simplest models can achieve significantly improved accuracy through appropriate adjustments. Conducted experiments for tuning the hyperparameters of neural networks have underscored the importance of careful selection, even for compact models with minimal layers. Additionally, this study confirms the necessity of developing distinct models for different seasons and weather conditions. The research focuses on shallow neural network models that consist of only one hidden layer to demonstrate the impact of hyperparameters and operational condition choices, even for the most basic machine learning models. Notably, even these simple models exhibit relatively low error rates after tuning.
For the short-term forecasting of solar insolation, the mean absolute percentage error for predicting one hour ahead ranges from 0.6% to 1.7% based on the validation set. This indicates a high level of forecast accuracy, considering the collective research experience on this issue. For one-day-ahead forecasts, the error ranges from 2.6% to 7.2%. The simplicity of these models makes them highly accessible for implementation, including on the microcontrollers of individual solar panels, enabling generation prediction based on the forecast insolation.
The obtained results of this study have the potential to significantly enhance the efficiency of the planning and operational management of solar power plants in the Republic of Tajikistan due to more precise generation forecasting.

Author Contributions

All authors have made valuable contributions to this paper. Conceptualization, P.M., V.M., M.N., I.Z., S.B., M.S. and S.K.; methodology, P.M., V.M., M.N., I.Z., S.B. and M.S.; software, P.M., M.N., S.K. and M.S.; validation, V.M., P.M., M.N., S.B. and I.Z.; formal analysis, P.M., V.M., M.N., I.Z., S.B., M.S. and S.K.; investigation, V.M., P.M., M.N. and M.S.; writing—original draft preparation, V.M., P.M., M.N., S.B., M.S., I.Z. and S.K.; writing—review and editing, V.M., S.B. and I.Z.; supervision, V.M. All authors have read and agreed to the published version of the manuscript.

Funding

The research funding from the Ministry of Science and Higher Education of the Russian Federation (Ural Federal University Program of Development within the Priority-2030 Program) is gratefully acknowledged.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No supporting data information.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Shah, I.; Iftikhar, H.; Ali, S. Modeling and Forecasting Electricity Demand and Prices: A Comparison of Alternative Approaches. J. Math. 2022, 2022, 3581037. [Google Scholar] [CrossRef]
  2. Sulandari, W.; Suhartono, S.; Rodrigues, P.C. Exponential smoothing on modeling and forecasting multiple seasonal time series: An overview. Fluct. Noise Lett. 2021, 20, 2130003. [Google Scholar] [CrossRef]
  3. Leite Coelho da Silva, F.; da Costa, K.; Canas Rodrigues, P.; Salas, R.; López-Gonzales, J.L. Statistical and Artificial Neural Networks Models for Electricity Consumption Forecasting in the Brazilian Industrial Sector. Energies 2022, 15, 588. [Google Scholar] [CrossRef]
  4. Lisi, F.; Shah, I. Forecasting next-day electricity demand and prices based on functional models. Energy Syst. 2020, 11, 947–979. [Google Scholar] [CrossRef]
  5. Diebold, F.; Mariano, R. Comparing predictive accuracy. J. Bus. Econ. Stat. 1995, 13, 253–263. [Google Scholar]
  6. Sulandari, W.; Subanar, S.; Lee, M.H.; Rodrigues, P.C. Time series forecasting using singular spectrum analysis, fuzzy systems and neural networks. MethodsX 2020, 7, 101015. [Google Scholar] [CrossRef]
  7. Sulandari, W.; Lee, M.H.; Rodrigues, P.C. Indonesian electricity load forecasting using singular spectrum analysis, fuzzy systems and neural networks. Energy 2020, 190, 116408. [Google Scholar] [CrossRef]
  8. Alamaniotis, M.; Bargiotas, D.; Tsoukalas, L.H. Towards smart energy systems: Application of kernel machine regression for medium term electricity load forecasting. SpringerPlus 2016, 5, 58. [Google Scholar] [CrossRef]
  9. Dolara, A.; Leva, S.; Manzolini, G. Comparison of different physical models for PV power output prediction. Sol. Energy 2015, 119, 83–89. [Google Scholar] [CrossRef]
  10. Iweh, C.D.; Gyamfi, S.; Tanyi, E.; Effah-Donyina, E. Distributed Generation and Renewable Energy Integration into the Grid: Prerequisites, Push Factors, Practical Options, Issues and Merits. Energies 2021, 14, 5375. [Google Scholar] [CrossRef]
  11. Zafar, R.; Vu, B.H.; Husein, M.; Chung, I.-Y. Day-Ahead Solar Irradiance Forecasting Using Hybrid Recurrent Neural Network with Weather Classification for Power System Scheduling. Appl. Sci. 2021, 11, 6738. [Google Scholar] [CrossRef]
  12. Khasanzoda, N.; Zicmane, I.; Beryozkina, S.; Safaraliev, M.; Sultonov, S.; Kirgizov, A. Regression model for predicting the speed of wind flows for energy needs based on fuzzy logic. Renew. Energy 2022, 191, 723–731. [Google Scholar] [CrossRef]
  13. Behera, M.K.; Majumder, I.; Nayak, N. Solar photovoltaic power forecasting using optimized modified extreme learning machine technique. Eng. Sci. Technol. Int. J. 2018, 21, 428–438. [Google Scholar] [CrossRef]
  14. Wang, K.; Qi, X.; Liu, H. A comparison of day-ahead photovoltaic power forecasting models based on deep learning neural network. Appl. Energy 2019, 251, 113315. [Google Scholar] [CrossRef]
  15. Sun, M.; Feng, C.; Zhang, J. Probabilistic solar power forecasting based on weather scenario generation. Appl. Energy 2020, 266, 114823. [Google Scholar] [CrossRef]
  16. Li, P.; Zhou, K.; Lu, X.; Yang, S. A hybrid deep learning model for short-term PV power forecasting. Appl. Energy 2020, 259, 114216. [Google Scholar] [CrossRef]
  17. Qing, X.; Niu, Y. Hourly day-ahead solar irradiance prediction using weather forecasts by LSTM. Energy 2018, 148, 461–468. [Google Scholar] [CrossRef]
  18. Manusov, V.; Beryozkina, S.; Nazarov, M.; Safaraliev, M.; Zicmane, I.; Matrenin, P.; Ghulomzoda, A. Optimal Management of Energy Consumption in an Autonomous Power System Considering Alternative Energy Sources. Mathematics 2022, 10, 525. [Google Scholar] [CrossRef]
  19. Manusov, V.; Matrenin, P.; Nazarov, M.; Beryozkina, S.; Safaraliev, M.; Zicmane, I.; Ghulomzoda, A. Short-Term Prediction of the Wind Speed Based on a Learning Process Control Algorithm in Isolated Power Systems. Sustainability 2023, 15, 1730. [Google Scholar] [CrossRef]
  20. Tyunkov, D.; Sapilova, A.; Gritsay, A.; Alekseenko, D.; Khamitov, R. Short-Term Forecast Methods of Electricity Generation by Solar Power Plants and their Classification. Elektrotekhnicheskie Sist. Kompleks. 2020, 3, 4. [Google Scholar] [CrossRef]
  21. Raza, M.; Nadarajah, M.; Ekanayake, C. On recent advances in PV output power forecast. Sol. Energy 2016, 136, 125–144. [Google Scholar] [CrossRef]
  22. Gandoman, F.H.; Abdel, A.; Shady, H.E.; Omar, N.; Ahmadi, A.; Alenezi, F.Q. Short-term solar power forecasting considering cloud coverage and ambient temperature variation effects. Renew. Energy 2018, 123, 793–805. [Google Scholar] [CrossRef]
  23. Miller, S.D.; Rogers, M.A.; Haynes, J.M.; Sengupta, M.; Heidinger, A.K. Short-term solar irradiance forecasting via satellite/model coupling. Sol. Energy 2018, 168, 102–117. [Google Scholar] [CrossRef]
  24. Kamalov, F.; Gurrib, I.; Thabtah, F. Autoregressive and neural network models: A comparative study with linearly lagged series. In Proceedings of the 2021 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT), Zallaq, Bahrain, 29–30 September 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 175–180. [Google Scholar]
  25. Hoyos-Gómez, L.S.; Ruiz-Muñoz, J.F.; Ruiz-Mendoza, B.J. Short-term forecasting of global solar irradiance in tropical environments with incomplete data. Appl. Energy 2022, 307, 118192. [Google Scholar] [CrossRef]
  26. Bacher, P.; Madsen, H.; Nielsen, H.A. Online short-term solar power forecasting. Sol. Energy 2009, 83, 1772–1783. [Google Scholar] [CrossRef]
  27. Mbaye, A.; Ndiaye, M.; Ndione, D.M.; Diaw, M.; Traoré, V.; Ndiaye, A.; Diaw, V.; Traoré, A.; Ndiaye, P. ARMA model for short-term forecasting of solar potential: Application to a horizontal surface on Dakar site. Mater. Devices Collab. Acad. Int. Press 2019, 4, 1103. [Google Scholar]
  28. Shadab, A.; Ahmad, S.; Said, S. Spatial forecasting of solar radiation using ARIMA model. Remote Sens. Appl. Soc. Environ. 2020, 20, 100427. [Google Scholar] [CrossRef]
  29. Kamalov, F.; Rajab, K.; Cherukuri, A.; Elnagar, A.; Safaraliev, M. Deep Learning for Covid-19 Forecasting: State-of-the-art review. Neurocomputing 2022, 511, 142–154. [Google Scholar] [CrossRef]
  30. Voyant, C.; Notton, G.; Kalogirou, S.; Nivet, M.-L.; Paoli, C.; Motte, F.; Fouilloy, A. Machine learning methods for solar radiation forecasting: A review. Renew. Energy 2017, 105, 569–582. [Google Scholar] [CrossRef]
  31. Zou, L.; Wang, L.; Xia, L.; Lin, A.; Hu, B.; Zhu, H. Prediction and comparison of solar radiation using improved empirical models and Adaptive Neuro-Fuzzy Inference Systems. Renew. Energy 2017, 106, 343–353. [Google Scholar] [CrossRef]
  32. Dorvlo, A.S.S.; Jervase, J.A.; Al-Lawati, A. Solar radiation estimation using artificial neural networks. Appl. Energy 2002, 71, 307–319. [Google Scholar] [CrossRef]
  33. Naser, N.; Abdelbari, A. Estimation of Global Solar Radiation using Back Propagation Neural Network: A case study Tripoli, Libya. In Proceedings of the 2020 International Conference on Electrical, Communication, and Computer Engineering (ICECCE), Istanbul, Turkey, 12–13 June 2020; pp. 1–5. [Google Scholar]
  34. Kumari, P.; Toshniwal, D. Deep learning models for solar irradiance forecasting: A comprehensive review. J. Clean. Prod. 2021, 318, 128566. [Google Scholar] [CrossRef]
  35. Kumari, P.; Toshniwal, D. Long short term memory–convolutional neural network based deep hybrid approach for solar irradiance forecasting. Appl. Energy 2021, 295, 117061. [Google Scholar] [CrossRef]
  36. Solano, E.S.; Dehghanian, P.; Affonso, C.M. Solar Radiation Forecasting Using Machine Learning and Ensemble Feature Selection. Energies 2022, 15, 7049. [Google Scholar] [CrossRef]
  37. Alam, M.S.; Al-Ismail, F.S.; Hossain, M.S.; Rahman, S.M. Ensemble Machine-Learning Models for Accurate Prediction of Solar Irradiation in Bangladesh. Processes 2023, 11, 908. [Google Scholar] [CrossRef]
  38. Safaraliev, M.; Kiryanova, N.; Matrenin, P.; Dmitriev, S.; Kokin, S.; Kamalov, F. Medium-term forecasting of power generation by hydropower plants in isolated power systems under climate change. Energy Rep. 2022, 8, 765–774. [Google Scholar] [CrossRef]
  39. Monjoly, S.; André, M.; Calif, R.; Soubdhan, T. Hourly forecasting of global solar radiation based on multiscale decomposition methods: A hybrid approach. Energy 2017, 119, 288–298. [Google Scholar] [CrossRef]
  40. Marquez, R.; Pedro, H.T.C.; Coimbra, C.F.M. Hybrid solar forecasting method uses satellite imaging and ground telemetry as inputs to ANNs. Sol. Energy 2013, 92, 176–188. [Google Scholar] [CrossRef]
  41. Ghulomzoda, A.; Safaraliev, M.; Matrenin, P.; Beryozkina, S.; Zicmane, I.; Gubin, P.; Gulyamov, K.; Saidov, N. A Novel Approach of Synchronization of Microgrid with a Power System of Limited Capacity. Sustainability 2021, 13, 13975. [Google Scholar] [CrossRef]
  42. Safaraliev, M.K.; Odinaev, I.N.; Ahyoev, J.S.; Rasulzoda, K.N.; Otashbekov, R.A. Energy Potential Estimation of the Region’s Solar Radiation Using a Solar Tracker. Appl. Sol. Energy 2020, 56, 270–275. [Google Scholar] [CrossRef]
  43. Asanov, M.S.; Safaraliev, M.K.; Zhabudaev, T.Z.; Asanova, S.M.; Kokin, S.E.; Dmitriev, S.A.; Obozov, A.J.; Ghulomzoda, A.H. Algorithm for calculation and selection of micro hydropower plant taking into account hydrological parameters of small watercourses mountain rivers of Central Asia. Int. J. Hydrogen Energy 2021, 46, 37109–37119. [Google Scholar] [CrossRef]
  44. Dogo, E.M.; Afolabi, O.J.; Nwulu, N.I.; Twala, B.; Aigbavboa, C.O. A Comparative Analysis of Gradient Descent-Based Optimization Algorithms on Convolutional Neural Networks. In Proceedings of the 2018 International Conference on Computational Techniques, Electronics and Mechanical Systems (CTEMS), Belgaum, India, 21–22 December 2018; pp. 92–99. [Google Scholar]
  45. Kamalov, F.; Nazir, A.; Safaraliev, M.; Cherukuri, A.K.; Zgheib, R. Comparative analysis of activation functions in neural networks. In Proceedings of the 2021 28th IEEE International Conference on Electronics, Circuits, and Systems (ICECS), Dubai, United Arab Emirates, 28 November–1 December 2021; IEEE: Piscataway, NJ, USA; pp. 1–6. [Google Scholar]
Figure 1. The solar energy potential of the GBAO.
Figure 2. The forecasting pipeline.
Figure 3. The decrease of forecasting error, train set, winter, Z + A, and learning rate 10−3, with three, six, and nine neurons of the hidden layer.
Figure 4. The decrease of forecasting error, validation set, winter, Z + A, and learning rate 10−3, with three, six, and nine neurons of the hidden layer.
Figure 5. The decrease of forecasting error, winter, train set, Z + S, and learning rate 10−4, with three, six, and nine neurons of the hidden layer.
Figure 6. The decrease of forecasting error, winter, test set, Z + S, and learning rate 10−4, with three, six, and nine neurons of the hidden layer.
Figure 7. The decrease of forecasting error, winter, training set, R + A, and learning rate 10−3, with nine, twelve, and fifteen neurons of the hidden layer.
Figure 8. The decrease of forecasting error, winter, validation set, R + A, and learning rate 10−3, with nine, twelve, and fifteen neurons of the hidden layer.
Figure 9. Forecast of solar insolation: an example of a winter day based on the second variant.
Figure 10. Forecast of solar insolation: an example of a spring day based on the second variant.
Figure 11. Forecast of solar insolation: an example of a summer day based on the second variant.
Figure 12. Forecast of solar insolation: an example of an autumn day based on the second variant.
Table 1. Number of days without sun in the GBAO.
Station Name | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | Per Year
Lednik Fedchenko | 2 | 2 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 2 | 9
Karakul | 12 | 12 | 11 | 6 | 3 | 1 | 1 | 0 | 0 | 5 | 10 | 12 | 73
Haburabad | 6 | 5 | 5 | 3 | 2 | 0 | 0 | 0 | 0 | 1 | 6 | 8 | 36
Murgab | 4 | 3 | 2 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 2 | 3 | 16
Khorog | 7 | 4 | 3 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 2 | 6 | 25
Table 2. Hyperparameters.
Hyperparameter | Designation | Range/Options | Short Notation
Hidden-layer neurons count | n | 3–21 | -
Activation function | f | Sigmoidal | Z
 | | ReLU | R
Backpropagation learning algorithm | b | SGD | S
 | | Adam | A
Learning rate | a | 0.0001 | 10−4
 | | 0.001 | 10−3
 | | 0.01 | 10−2
Table 3. Forecasting accuracy in winter (R + A), learning rate 10−4.
Learning Stage | MAPE | n = 3 | n = 6 | n = 9 | n = 12 | n = 15 | n = 18 | n = 21
1 | training, % | 12.2 | 11.2 | 11.3 | 11.0 | 11.5 | 6.9 | 7.1
  | validation, % | 11.2 | 11.3 | 11.4 | 11.8 | 11.5 | 5.6 | 5.8
2 | training, % | 11.3 | 11.1 | 11.2 | 11.6 | 11.1 | 6.5 | 6.6
  | validation, % | 11.2 | 11.1 | 11.4 | 11.3 | 11.0 | 5.5 | 5.6
Table 4. Forecasting accuracy in winter (R + A), n = 18.
Learning Rate | MAPE | Stage 1 | Stage 2 | Stage 3 | Stage 4 | Stage 5 | Stage 6
10−2 | training, % | 11.3 | 11.9 | - | - | - | -
     | validation, % | 11.4 | 10.7 | - | - | - | -
10−3 | training, % | 6.9 | 6.5 | - | - | - | -
     | validation, % | 5.5 | 5.6 | - | - | - | -
10−4 | training, % | 52.0 | 16.6 | 11.7 | 11.0 | 11.1 | -
     | validation, % | 54.0 | 18.0 | 12.1 | 11.1 | 11.3 | -
Table 5. Forecasting accuracy in winter (R + S), learning rate 10−4.
Learning Stage | MAPE | n = 3 | n = 6 | n = 9 | n = 12 | n = 15 | n = 18 | n = 21
1 | training, % | 15.6 | 16.2 | 15.7 | 14.1 | 13.8 | 11.8 | 15.9
  | validation, % | 17.8 | 16.6 | 12.0 | 11.0 | 15.5 | 13.6 | 16.4
2 | training, % | 13.1 | 13.8 | 10.7 | 10.5 | 11.3 | 10.1 | 15.3
  | validation, % | 14.4 | 15.9 | 10.6 | 11.0 | 10.8 | 8.4 | 10.7
3 | training, % | 11.5 | 11.7 | - | 10.6 | - | - | -
  | validation, % | 10.8 | 13.1 | - | 10.4 | - | - | -
Table 6. Forecasting accuracy in winter (R + S), n = 18.
Learning Rate | MAPE | Stage 1 | Stage 2 | Stage 3 | Stage 4 | Stage 5 | Stage 6
5 × 10−4 | training, % | 119.3 | - | - | - | - | -
         | validation, % | 108.1 | - | - | - | - | -
10−4 | training, % | 11.8 | 10.1 | - | - | - | -
     | validation, % | 13.6 | 8.4 | - | - | - | -
Table 7. Forecasting accuracy in winter (Z + A), learning rate 10−2.
Learning Stage | MAPE | n = 3 | n = 6 | n = 9 | n = 12 | n = 15 | n = 18 | n = 21
1 | training, % | 17.0 | 14.1 | 13.2 | 12.6 | 12.4 | 12.0 | 11.9
  | validation, % | 17.2 | 13.6 | 13.1 | 12.5 | 12.1 | 12.1 | 11.8
2 | training, % | 10.9 | 11.6 | 11.5 | 10.8 | 10.8 | 12.0 | 11.4
  | validation, % | 10.3 | 11.1 | 10.9 | 10.7 | 10.7 | 11.9 | 11.3
3 | training, % | 5.4 | 6.4 | 7.4 | 7.4 | 8.2 | 8.3 | 8.5
  | validation, % | 5.1 | 6.0 | 6.2 | 6.8 | 7.5 | 7.6 | 8.4
4 | training, % | 4.8 | 5.7 | 6.5 | 6.8 | 7.4 | 7.4 | 7.9
  | validation, % | 4.9 | 4.9 | 5.6 | 6.2 | 6.7 | 6.7 | 7.0
5 | training, % | 4.5 | 5.0 | 6.0 | 6.6 | 7.0 | 7.3 | 7.3
  | validation, % | 4.2 | 4.5 | 5.6 | 5.7 | 6.4 | 6.5 | 6.9
6 | training, % | 4.3 | 4.8 | 5.7 | 6.3 | 6.6 | 6.8 | 7.2
  | validation, % | 3.9 | 4.6 | 4.9 | 5.9 | 6.2 | 6.5 | 6.5
Table 8. Forecasting accuracy in winter (Z + A), n = 3.
Learning Rate | MAPE | Stage 1 | Stage 2 | Stage 3 | Stage 4 | Stage 5 | Stage 6
10−2 | training, % | 5.5 | 4.5 | - | - | - | -
     | validation, % | 4.8 | 4.2 | - | - | - | -
10−3 | training, % | 17.0 | 10.9 | 5.4 | 4.8 | 4.5 | 4.3
     | validation, % | 17.226 | 10.328 | 5.174 | 4.923 | 4.219 | 3.998
10−4 | training, % | 70.8 | 52.2 | 35.0 | 22.7 | 17.6 | 14.9
     | validation, % | 72.0 | 51.5 | 32.6 | 20.3 | 16.8 | 14.7
Table 9. Forecasting accuracy in winter (Z + S), learning rate 10−4.
Learning Stage | MAPE | n = 3 | n = 6 | n = 9 | n = 12 | n = 15 | n = 18 | n = 21
1 | training, % | 13.9 | 13.4 | 12.9 | 12.2 | 12.6 | 12.3 | 13.2
  | validation, % | 14.3 | 13.9 | 12.8 | 12.6 | 12.8 | 12.6 | 14.0
2 | training, % | 12.6 | 12.1 | 12.0 | 11.7 | 11.6 | 11.6 | 11.4
  | validation, % | 12.2 | 11.5 | 11.7 | 11.9 | 11.5 | 11.8 | 11.9
3 | training, % | 11.4 | 11.0 | 11.6 | 11.7 | 11.4 | 11.7 | 11.3
  | validation, % | 11.1 | 10.8 | 10.9 | 10.9 | 11.5 | 11.8 | 12.1
Table 10. Forecasting accuracy in winter (Z + S), n = 6.
Learning Rate | MAPE | Stage 1 | Stage 2 | Stage 3 | Stage 4 | Stage 5 | Stage 6
5 × 10−4 | training, % | 20.7 | 20.0 | - | - | - | -
         | validation, % | 21.0 | 23.4 | - | - | - | -
10−4 | training, % | 23.4 | 12.1 | 11.0 | - | - | -
     | validation, % | 23.9 | 11.5 | 10.8 | - | - | -
Table 11. The winter one-day-ahead forecasting results.
b + f | n | a | Epochs | MAPE, Training | MAPE, Validation
R + A | 18 | 10−3 | 400 | 6.48 | 5.55
R + S | 18 | 10−4 | 400 | 10.13 | 8.45
Z + A | 3 | 10−3 | 1200 | 4.31 | 4.00
Z + S | 6 | 10−4 | 600 | 11.07 | 10.81
Table 12. The spring one-day-ahead forecasting results.
b + f | n | a | Epochs | MAPE, Training | MAPE, Validation
R + A | 12 | 10−3 | 600 | 7.58 | 7.82
R + S | 15 | 10−4 | 200 | 7.97 | 8.33
Z + A | 15 | 10−3 | 800 | 7.17 | 7.23
Z + S | 6 | 10−4 | 400 | 7.44 | 7.52
Table 13. The summer one-day-ahead forecasting results.
b + f | n | a | Epochs | MAPE, Training | MAPE, Validation
R + A | 15 | 10−3 | 400 | 8.27 | 8.16
R + S | 15 | 10−4 | 400 | 7.02 | 6.78
Z + A | 12 | 10−3 | 600 | 8.79 | 8.73
Z + S | 15 | 10−4 | 400 | 8.99 | 8.62
Table 14. The autumn one-day-ahead forecasting results.
b + f | n | a | Epochs | MAPE, Training | MAPE, Validation
R + A | 6 | 10−3 | 400 | 3.35 | 3.52
R + S | 6 | 10−4 | 400 | 9.61 | 9.59
Z + A | 6 | 10−3 | 600 | 2.36 | 2.64
Z + S | 12 | 10−4 | 400 | 7.21 | 7.01
Table 15. Results of Z + A, three neurons, the second variant.
a | Epochs | MAPE, Train Set | MAPE, Val. Set
10−2 | 200 | 5.57 | 4.88
10−2 | 400 | 4.59 | 4.21
10−3 | 200 | 17.03 | 17.23
10−3 | 400 | 10.94 | 10.33
10−3 | 1000 | 4.52 | 4.22
10−3 | 1200 | 4.22 | 4.00
10−4 | 200 | 70.82 | 72.02
10−4 | 400 | 52.26 | 51.57
10−4 | 1000 | 17.676 | 16.835
10−4 | 1200 | 14.961 | 14.797
Table 16. Results of Z + A, a = 10−3, 1200 epochs.
Neurons | MAPE, Train Set | MAPE, Val. Set
3 | 4.31 | 4.00
6 | 4.85 | 4.68
9 | 5.73 | 5.00
12 | 6.32 | 5.98
15 | 6.66 | 6.20
18 | 6.86 | 6.58
21 | 7.22 | 6.52
Table 17. The best-obtained results of one-hour-ahead forecasting (Z + A, a = 10−3).
Season | Neurons | MAPE, Train Set | MAPE, Val. Set
Winter | 12 | 1.74 | 1.67
Spring | 15 | 1.06 | 0.95
Summer | 15 | 0.61 | 0.56
Autumn | 18 | 0.89 | 0.84
Table 18. The best-obtained results of one-day-ahead forecasting.
Season | Neurons | MAPE, Train Set | MAPE, Val. Set
Winter | 3 | 4.31 | 4.00
Spring | 15 | 7.17 | 7.23
Summer | 15 | 7.02 | 6.78
Autumn | 6 | 2.36 | 2.64
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
