1. Introduction
A pumped storage power station is an important energy storage technology. It can effectively mitigate the impact of intermittent, fluctuating energy sources, such as wind and solar power, on the power system and improve the absorption capacity of clean energy, thereby helping to address the security and stability challenges posed by the large-scale integration of clean energy into the power grid [1]. Currently, the pumped storage power stations built or under construction in China are developing in an intelligent direction. The complex water diversion system of a pumped storage power station poses great challenges to unit frequency regulation and to power grid security and stability. Therefore, it is necessary to carry out parameter identification of the governing system of pumped storage units to ensure their safe and stable operation.
In modeling studies of governing systems, the diversity of unit and governor characteristics and operating conditions makes it difficult to derive accurate model parameters directly from basic operating principles [2,3]; consequently, it is also difficult to establish a complete system simulation model that can be used practically in power system simulation software or in the performance evaluation of governing systems. To solve this problem, system identification methods have been widely used [4].
System identification is the determination of a mathematical model that describes the behavior of a system from its input and output time functions. It consists of two basic parts: structure identification and parameter identification. The governing system is a complex dynamic system whose model structure can be determined by mechanism analysis, so the remaining task is to determine the model parameters through parameter identification [5]. In recent decades, the traditional methods for parameter identification have been the least squares method [6], the input response method [7], and the maximum likelihood estimation method [8], but these methods have many limitations; for example, the least squares method requires sufficiently rich system inputs, and the maximum likelihood estimation method can easily fall into local optima. In recent years, identification methods based on metaheuristic algorithms have been developed that treat parameter identification as an optimization problem. Because metaheuristic algorithms are global optimization methods, they formulate an objective function for parameter identification by minimizing the output error between the original system and the identified model. Compared with traditional identification methods, metaheuristic algorithms are more suitable for the parameter identification of complex systems, and their identification performance depends on their optimization capability.
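To make this output-error formulation concrete, the following Python sketch (with hypothetical names; `simulate_governing_system` stands in for the user's own governing-system simulator and is not defined in the original paper) evaluates a candidate parameter vector by the mean squared error between the measured response and the simulated response, which is the quantity a metaheuristic optimizer would minimize.

```python
import numpy as np

def output_error_objective(theta, u, y_measured, simulate_governing_system):
    """Objective function for metaheuristic parameter identification.

    theta                     : candidate parameter vector of the model
    u                         : recorded input sequence (e.g., frequency disturbance)
    y_measured                : measured output of the real system
    simulate_governing_system : user-supplied simulator (hypothetical placeholder)
    """
    # Simulate the identified model with the candidate parameters.
    y_model = simulate_governing_system(theta, u)
    # Mean squared error between measured and simulated outputs;
    # the optimizer searches for the theta that minimizes this value.
    return np.mean((y_measured - y_model) ** 2)
```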
Many metaheuristic algorithms have been presented and successfully applied in different areas: the particle swarm optimization (PSO) algorithm simulates the foraging behavior of bird flocks [9,10]; the chimpanzee optimization algorithm (ChOA) simulates the cooperative hunting behaviors of chimpanzees, namely attacking, driving, intercepting, and chasing [11]; the artificial rabbit optimization (ARO) algorithm is inspired by the survival strategies of rabbits in nature, including detour foraging and random hiding [12]; the genetic algorithm (GA) simulates biological evolutionary mechanisms in natural environments [13]; the artificial ecosystem optimization (AEO) algorithm simulates the energy flow process in the Earth's ecosystem [14]; the gravitational search algorithm (GSA) optimizes populations based on the law of gravity and Newton's second law [15]; the ant colony optimization (ACO) algorithm simulates the way ants find paths in nature [16]; the black widow optimization (BWO) algorithm simulates the entire life cycle of the black widow spider [17]; the atom search optimization (ASO) algorithm simulates the displacement of atoms in a molecular system caused by their mutual forces and system constraints [18]; the gray wolf optimization (GWO) algorithm simulates the predation activities of gray wolves [19]; the artificial fish swarm algorithm (AFSA) simulates the foraging, clustering, and tail-chasing behaviors of fish [20]; the ant lion optimization (ALO) algorithm simulates the mechanism by which ant lions hunt ants [21]; the whale optimization algorithm (WOA) is based on the whale behaviors of encircling prey, bubble-net attacking, and searching for prey [22]; and the manta ray foraging optimization (MRFO) algorithm simulates the foraging process of manta rays in the ocean [23]. These optimization algorithms are characterized by strong robustness, adaptability, and randomness. Although such metaheuristics outperform conventional numerical approaches in handling challenging engineering problems, they still have drawbacks, and there remains considerable room for improving their optimization performance.
Some authors have improved metaheuristic algorithms. For example, Zhongqiang Wu et al. [24] proposed an improved ant lion optimization algorithm that adds chaotic sequences to identify the parameters of a solar cell model. M. Ali et al. [25] used the gray wolf optimization algorithm to identify the parameters of a polymer electrolyte membrane fuel cell model. Xiao Zhang et al. [26] applied an elite opposition-based learning particle swarm algorithm to identify PV cell parameters. Although metaheuristic algorithms have been successfully applied to parameter identification problems in a variety of fields, they still suffer from limitations such as local optima and premature convergence.
In 2022, Weiguo Zhao et al. proposed the artificial hummingbird algorithm (AHA) [27], which is inspired by the special flight abilities and intelligent foraging strategies of hummingbirds. The method has few parameters, converges quickly, and performs well in solving optimization problems. This paper proposes an improved metaheuristic algorithm, named the improved artificial hummingbird algorithm (IAHA). To make the initialization more uniform and diverse, IAHA uses a Chebyshev chaotic map to initialize the hummingbird population, and it incorporates Levy flight into the guided foraging phase to improve search efficiency.
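As a hedged sketch of these two building blocks (not the authors' exact implementation), the Python snippet below shows a Chebyshev-chaotic-map population initializer and a Levy-flight step generated with Mantegna's algorithm. The map order, chaotic seed, and Levy exponent beta = 1.5 are assumed settings; in IAHA, the Levy step would scale the position update of the guided foraging phase according to the paper's update equations, which are not reproduced here.

```python
import numpy as np
from math import gamma, pi, sin

def chebyshev_init(pop_size, dim, lb, ub, order=4, x0=0.7):
    """Population initialization with a Chebyshev chaotic map (order and seed are assumed)."""
    pop = np.empty((pop_size, dim))
    x = x0
    for i in range(pop_size):
        for j in range(dim):
            x = np.cos(order * np.arccos(x))              # Chebyshev map, values stay in [-1, 1]
            pop[i, j] = lb + (x + 1.0) / 2.0 * (ub - lb)  # rescale to the search bounds
    return pop

def levy_step(dim, beta=1.5):
    """Levy-flight step via Mantegna's algorithm (beta = 1.5 is an assumed setting)."""
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma_u, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)
```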
3. Performance Testing and Analysis
To verify the effectiveness of the IAHA algorithm, we compared it with four well-established optimizers: the artificial hummingbird algorithm (AHA), the particle swarm optimization algorithm (PSO), the ant lion optimization algorithm (ALO), and the gravitational search algorithm (GSA). For all test functions, each of the five algorithms was run with 500 iterations and a population size of 30; the remaining parameters are shown in Table 1.
Metaheuristic algorithms are stochastic search algorithms: for the same optimization problem, repeated runs of the same algorithm usually do not yield identical results. To avoid excessive errors arising from the randomness of a single run, each algorithm was run 20 times, and the statistical results are shown in Table 2, Table 3 and Table 4. The tables report the mean (Mean), standard deviation (Std), best (Best), and worst (Worst) values over the 23 test functions (see Table A1 in Appendix A); bold entries mark the best value of each statistic among the five algorithms for each test function.
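As an illustration of this repeated-run protocol (a minimal sketch with hypothetical names, not the authors' test harness), the snippet below runs an optimizer several times on one benchmark function and summarizes the best fitness values with the four statistics reported in the tables.

```python
import numpy as np

def summarize_runs(optimizer, objective, runs=20, iterations=500, pop_size=30):
    """Repeat a stochastic optimizer and summarize its best fitness values.

    `optimizer` is assumed to be a callable returning the best fitness of a single run.
    """
    results = np.array([optimizer(objective, iterations, pop_size) for _ in range(runs)])
    return {
        "Mean": results.mean(),
        "Std": results.std(),
        "Best": results.min(),    # minimization benchmarks: smaller is better
        "Worst": results.max(),
    }
```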
3.1. Unimodal Test Functions
F1–F7 are unimodal functions, which mainly test the exploitation capability of an algorithm. These functions have no local extrema, only a global optimum, and are relatively easy to optimize; for them, convergence speed can be more important than the final optimal value. The statistical results of the proposed IAHA algorithm compared with the other four algorithms are shown in Table 2 and Figure 2. Among the five intelligent optimization algorithms, IAHA has a significant advantage in terms of mean and standard deviation, except on F6, where both IAHA and AHA achieve the optimal values. Consequently, IAHA is superior in exploitation ability, convergence, and stability.
3.2. High-Dimensional Multi-Peak Test Functions
F8–F23 are multi-peak functions, which mainly test the exploration capability of an algorithm. Among them, F8–F13 are high-dimensional multi-peak functions and F14–F23 are low-dimensional multi-peak functions. Multi-peak test functions have multiple local extrema in addition to the global optimum and are more difficult to optimize than unimodal functions; accordingly, they place greater emphasis on an algorithm's global optimization capability. The statistical results of each algorithm on the high-dimensional multi-peak functions are shown in Table 3 and Figure 3. On F8, the mean and standard deviation of IAHA rank second, behind only AHA. On F9–F11, the mean and standard deviation of IAHA and AHA are identical. On F12 and F13, the mean of IAHA is the best; although its standard deviation on F13 is not the smallest, its best and worst values are optimal compared with the other algorithms. It can therefore be seen that IAHA has an advantage in global search.
3.3. Low-Dimensional Multi-Peak Test Functions
F14–F23 are low-dimensional multi-peak functions, which have fewer local extrema than the high-dimensional multi-peak functions and are relatively easier to optimize. Table 4 and Figure 4 show the statistical results of each algorithm on the low-dimensional multi-peak test functions. On F16–F19, the mean values obtained by IAHA and the other algorithms are the same and reach the optimal values. In addition, the standard deviation of IAHA on F16 and F19 is second only to that of GSA. On F20, the result of IAHA is slightly better than those of the other algorithms, and IAHA has the smallest standard deviation on F15, F18, F19, and F21–F23. However, the mean values of AHA and IAHA are the same on eight functions, so IAHA does not show a clear advantage there. Owing to the particular characteristics of these functions, the results obtained by the proposed IAHA algorithm and the other algorithms are all close to the theoretical global optimum.
5. Conclusions
In this paper, an improved AHA (IAHA) is proposed. Compared with AHA, it incorporates two improvement strategies. First, population initialization based on a Chebyshev chaotic map is used to expand the search range and improve accuracy. Second, Levy flight is added to the guided foraging phase, which gives the algorithm better convergence and stability. To verify the optimization performance of IAHA, 23 standard benchmark functions, including unimodal and multimodal functions, were evaluated; the statistical analysis covered the mean, standard deviation, best, and worst values, and IAHA performed best on most functions. As a practical application, the algorithm was used to identify the parameters of the governing system of pumped storage units, with results averaged over 20 independent runs. The results show that the errors of IAHA are only 2.04%, 1.82%, and 1.28% under 5%, 10%, and 15% frequency disturbance conditions, respectively. Compared with PSO, GSA, ALO, and AHA, the identification accuracy for the governing system is improved.