Article

A Chaotic Hybrid Butterfly Optimization Algorithm with Particle Swarm Optimization for High-Dimensional Optimization Problems

1 Electrical Engineering College, Guizhou University, Guiyang 550025, China
2 Power China Guizhou Engineering Co., Ltd., Guiyang 550001, China
3 Guizhou Provincial Key Laboratory of Internet + Intelligent Manufacturing, Guiyang 550025, China
* Author to whom correspondence should be addressed.
Symmetry 2020, 12(11), 1800; https://doi.org/10.3390/sym12111800
Submission received: 23 September 2020 / Revised: 22 October 2020 / Accepted: 27 October 2020 / Published: 30 October 2020
(This article belongs to the Special Issue Computational Intelligence and Soft Computing: Recent Applications)

Abstract

In order to solve the problem that the butterfly optimization algorithm (BOA) is prone to low accuracy and slow convergence, this paper follows the trend of hybridizing two or more algorithms to obtain a superior solution. A novel hybrid algorithm, HPSOBOA, is proposed, and three methods are introduced to improve the basic BOA: initialization using a one-dimensional cubic map, a nonlinear parameter control strategy, and hybridization of the particle swarm optimization (PSO) algorithm with BOA for global optimization. Two experiments (covering 26 well-known benchmark functions) were conducted to verify the effectiveness of the proposed algorithm. The comparison results show that the hybrid HPSOBOA converges quickly and has better stability in high-dimensional numerical optimization problems than PSO, BOA, and other well-known swarm optimization algorithms.

1. Introduction

The butterfly optimization algorithm (BOA) was proposed by Arora and Singh in 2018 [1]. The method and concept of this algorithm were first presented [2] at the 2015 International Conference on Signal Processing, Computing and Control (2015 ISPCC). After the algorithm was proposed, the authors performed many studies on BOA. Arora and Singh [3] proposed an improved butterfly optimization algorithm with ten chaotic maps for solving three engineering optimization problems. Arora and Singh [4] proposed a new hybrid optimization algorithm which combines the standard BOA with the Artificial Bee Colony (ABC) algorithm. Arora and Singh [5] used the BOA to solve node localization in wireless sensor networks and compared the results with the particle swarm optimization (PSO) algorithm and the firefly algorithm (FA). Arora et al. [6] proposed a modified butterfly optimization algorithm for solving mechanical design optimization problems. Singh and Anand [7] proposed a novel adaptive butterfly optimization algorithm, which introduces a novel scheme for changing the sensory modality of the basic BOA. Sharma and Saha [8] proposed a novel hybrid algorithm (m-MBOA) to enhance the exploitation ability of BOA with the help of the mutualism phase of symbiosis organisms search (SOS). Yuan et al. [9] proposed an improved butterfly optimization algorithm, which is employed for optimizing system performance analyzed in terms of annual cost, exergy and energy efficiencies, and pollutant emission reduction. Li et al. [10] proposed an improved BOA for engineering design problems using the cross-entropy method. A hybrid intelligent predicting model derived from BOA was proposed for exploring household CO2 emission mitigation strategies [11]. Tan et al. [12] proposed an improved BOA to train wavelet neural networks for solving elliptic partial differential equations. Malisetti and Pamula [13] proposed a novel BOA based on quasi opposition for the problem of cluster head selection in wireless sensor networks (WSNs). Sharma et al. [14] proposed a bidirectional butterfly optimization algorithm for solving engineering optimization problems. Among the above studies of BOA, which are either improvement research or applied research, there is only one paper on a hybrid algorithm combining ABC and BOA.
In addition, the optimization algorithms proposed to date are mainly divided into three categories according to their principles. The well-known meta-heuristics include evolutionary algorithms: Genetic Algorithm (GA) [15,16] and Differential Evolution (DE) [17]; swarm intelligence algorithms: Particle Swarm Optimization (PSO) [18], Ant Colony Optimization (ACO) [19], and the Artificial Bee Colony (ABC) algorithm [20]; and physics-based algorithms: Gravitational Search Algorithm (GSA) [21], Sine Cosine Algorithm (SCA) [22], and the Henry Gas Solubility Optimization (HGSO) algorithm [23]. In the past ten years, scholars have proposed many new swarm intelligence optimization algorithms, which are based on the behavior of animals in nature and are also named nature-inspired heuristic algorithms, such as the Bat-Inspired Algorithm (BA) [24], Krill Herd (KH) [25], Fruit Fly Optimization Algorithm (FOA) [26], Grey Wolf Optimizer (GWO) [27], Moth-Flame Optimization (MFO) algorithm [28], Whale Optimization Algorithm (WOA) [29], Salp Swarm Algorithm (SSA) [30], Grasshopper Optimization Algorithm (GOA) [31], and Marine Predators Algorithm (MPA) [32]. For more details, the reader can refer to the papers [33,34,35], where the recent and popular algorithms are well reviewed.
The research status of hybrids between the PSO algorithm and other intelligent optimization algorithms is introduced next. Zhen et al. [36] proposed a new memetic algorithm called shuffled particle swarm optimization (SPSO), which combines PSO with the shuffled frog leaping algorithm (SFLA). Niu and Li [37] proposed a new hybrid global optimization algorithm, PSODE, combining PSO with DE. Lai and Zhang [38] proposed a novel hybrid algorithm which combines PSO and GA, together with an experiment on 23 benchmark problems. Mirjalili and Hashim [39] proposed a new hybrid PSOGSA algorithm for function optimization. Wang et al. [40] proposed a hybrid algorithm based on krill herd and quantum-behaved particle swarm optimization (QPSO) for benchmark and engineering optimization. Trivedi et al. [41] proposed a novel hybrid PSO-DA algorithm, which combines the PSO algorithm with the dragonfly algorithm (DA), for global numerical optimization. Trivedi et al. [42] proposed a novel PSOWOA for global numerical optimization problems. Following these studies of hybrids between PSO and other meta-heuristic algorithms, Laskar et al. [43] proposed a new hybrid HWPSO algorithm for electronic design optimization problems. In addition, the structures of the PSO algorithm and BOA have certain similarities, so a novel hybrid algorithm of PSO with BOA is worth studying.
For the research on chaotic theory and chaotic attractors of nonlinear control systems, a general polynomial function for controlling Hopf bifurcations using nonlinear state feedback was derived by Yu and Chen [44]. Xu et al. [45] analyzed the n-scroll chaotic attractors of the modified Chua's circuit and proved the chaos of the Chua system. Yu and Lü [46] studied the bifurcation control of three-dimensional chaotic systems in detail. In addition, Xu et al. [47] used the inverse trigonometric function, tan−1(x), to obtain one-, two-, and three-directional multiscroll integer and fractional order chaotic attractors, and analyzed the stabilization of the chaotic system. The application of chaos theory to the improvement of swarm intelligence optimization algorithms [48,49] has been recognized by scholars in the field.
In order to improve the ability of the proposed algorithm on high-dimensional optimization problems, the improved method hybridizes two meta-heuristic algorithms, combining the basic PSO and BOA, and also makes use of chaotic theory. In addition, the control parameter a (the power exponent) in BOA is analyzed in detail, and a nonlinear control strategy is proposed to balance the global search and local search capabilities of the improved algorithm.
The rest of this paper is organized as follows: Section 2 presents the basic BOA model. The basic PSO model is presented in Section 3. In Section 4, the novel HPSOBOA algorithm is proposed, and the three improvement strategies are introduced in detail. Section 5 presents the experimental results on 26 high-dimensional optimization problems and discusses the comparison results of the two experiments. Finally, conclusions and future studies are summarized in Section 6.

2. The Basic Butterfly Optimization Algorithm (BOA)

BOA [1,2] is a nature-inspired meta-heuristic algorithm that simulates the foraging and mating behavior of butterflies. One of the main characteristics distinguishing BOA from other meta-heuristics is that each butterfly has its own unique scent. The fragrance can be formulated as follows:
$$f_i = c I^a \quad (1)$$
where $f_i$ is the perceived magnitude of the fragrance, $c$ represents the sensory modality, $I$ is the stimulus intensity, and $a$ represents the power exponent based on the degree of fragrance absorption.
Theoretically, any value of the sensory modality coefficient $c$ in the range [0, ∞) can be taken; however, its value is determined by the particularities of the optimization problem during the iterative process of BOA. The sensory modality $c$ in the optimal search phase of the algorithm can be formulated as follows:
$$c_{t+1} = c_t + \frac{0.025}{c_t \cdot T_{max}} \quad (2)$$
where $T_{max}$ is the maximum number of iterations of the algorithm, and the initial value of the parameter $c$ is set to 0.01.
In addition, there are two key steps in the algorithm: the global search phase and the local search phase. The mathematical model of the butterflies' global search movements can be formulated as follows:
$$x_i^{t+1} = x_i^t + \left(r^2 \times g_{best} - x_i^t\right) \times f_i \quad (3)$$
where $x_i^t$ denotes the solution vector $x_i$ of the ith butterfly at iteration $t$, and $r$ is a random number in [0,1]. Here, $g_{best}$ is the current best solution found among all the solutions at the current stage, and $f_i$ represents the fragrance of the ith butterfly. The local search phase can be formulated as follows:
$$x_i^{t+1} = x_i^t + \left(r^2 \times x_k^t - x_j^t\right) \times f_i \quad (4)$$
where $x_j^t$ and $x_k^t$ are the jth and kth butterflies chosen randomly from the solution space. If $x_j^t$ and $x_k^t$ belong to the same iteration, the move becomes a local random walk; if not, this kind of random movement diversifies the solution.
In nature, a butterfly can search for food and a mating partner both globally and locally. Therefore, a switch probability p is set to switch between the normal global search and the intensive local search. In each iteration, BOA randomly generates a number in [0,1], which is compared with the switch probability p to decide whether to conduct a global search or a local search.
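To make the search mechanism concrete, the following is a minimal sketch of one BOA iteration implementing Equations (1), (3), and (4) together with the switch probability p; the names (boa_step, fitness), the use of NumPy, and taking the fitness magnitude as the stimulus intensity I are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def boa_step(X, fitness, c=0.01, a=0.1, p=0.6):
    """One BOA iteration over a population X of shape (n, dim)."""
    n, _ = X.shape
    fvals = np.apply_along_axis(fitness, 1, X)
    g_best = X[np.argmin(fvals)]          # current best solution
    f = c * np.abs(fvals) ** a            # fragrance, Eq. (1) (I = |fitness|, an assumption)
    X_new = X.copy()
    for i in range(n):
        r = np.random.rand()              # compared with switch probability p
        if r < p:                         # global search phase, Eq. (3)
            X_new[i] = X[i] + (r**2 * g_best - X[i]) * f[i]
        else:                             # local search phase, Eq. (4)
            j, k = np.random.randint(n, size=2)
            X_new[i] = X[i] + (r**2 * X[k] - X[j]) * f[i]
    return X_new
```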

3. The Basic Particle Swarm Optimization (PSO) Model

The PSO algorithm [18] is based on a swarm of birds moving in search of food in a multidimensional search space. Position and velocity are the important characteristics of PSO, which are used to find the optimal value.
Each individual is called a particle, and each particle is first initialized with a random position and velocity within the search space. The velocity and position of each particle are then updated towards the personal and global best particles as follows:
$$v_i^{t+1} = w \cdot v_i^t + c_1 \cdot rand_1 \times \left(p_{best} - x_i^t\right) + c_2 \cdot rand_2 \times \left(g_{best} - x_i^t\right) \quad (5)$$
$$x_i^{t+1} = x_i^t + v_i^{t+1} \quad (6)$$
where $v_i^t$ and $v_i^{t+1}$ represent the velocity of the ith particle at iterations $t$ and $t+1$. Usually, $c_1 = c_2 = 2$, and $rand_1$ and $rand_2$ are random numbers in (0, 1). The inertia weight $w$ can be calculated as:
$$w(t) = w_{max} - \left(w_{max} - w_{min}\right) \cdot \frac{t}{T_{max}} \quad (7)$$
where $w_{max} = 0.9$, $w_{min} = 0.2$, and $T_{max}$ represents the maximum number of iterations.
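For comparison with the BOA sketch above, here is a minimal sketch of one PSO update implementing Equations (5)-(7); the vectorized form and names are illustrative assumptions.

```python
import numpy as np

def pso_step(X, V, p_best, g_best, t, T_max,
             c1=2.0, c2=2.0, w_max=0.9, w_min=0.2):
    """One PSO update for positions X and velocities V of shape (n, dim)."""
    w = w_max - (w_max - w_min) * t / T_max          # inertia weight, Eq. (7)
    r1, r2 = np.random.rand(*X.shape), np.random.rand(*X.shape)
    V = w * V + c1 * r1 * (p_best - X) + c2 * r2 * (g_best - X)  # Eq. (5)
    return X + V, V                                  # position update, Eq. (6)
```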

4. The Proposed Algorithm

In this section, a novel hybrid algorithm is proposed, and the initialization of BOA by a cubic one-dimensional map is introduced, and a nonlinear parameter control strategy is also performed. In addition, the PSO algorithm is hybridized with BOA in order to improve the basic BOA for global optimization.

4.1. Cubic Map

Chaos is a relatively common phenomenon in nonlinear systems. The basic cubic map [50] can be calculated as follows:
$$z_{n+1} = \alpha z_n^3 - \beta z_n \quad (8)$$
where $\alpha$ and $\beta$ represent the chaos factors; when $\beta$ is in (2.3, 3), the cubic map is chaotic. When $\alpha = 1$, the map stays in the interval (−2, 2), and the sequence stays in (−1, 1) when $\alpha = 4$. The cubic map can also be written as:
$$z_{n+1} = \rho z_n \left(1 - z_n^2\right) \quad (9)$$
where $\rho$ is the control parameter. In Equation (9), the sequence of the cubic map lies in (0, 1), and when $\rho = 2.595$, the chaotic variable $z_n$ has better ergodicity. A graphical presentation of the cubic map for 1000 iterations is shown in Figure 1.
In Figure 1, it can be seen that the chaotic map distributes the population of butterflies over random values in the interval (0, 1) during the search phase.
We use the cubic map to initialize the positions of the population, and in order to ensure that the initialized sequence lies in (0, 1), z(0) of the cubic map is set to 0.315 in the proposed algorithm.
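A minimal sketch of this initialization step is given below; scaling the chaotic sequence from (0, 1) onto a search range [lb, ub] is an assumption made explicit here, since the text only states that the sequence itself lies in (0, 1).

```python
import numpy as np

def cubic_map_init(n, dim, lb, ub, z0=0.315, rho=2.595):
    """Initialize an (n, dim) population with the cubic map of Eq. (9)."""
    z = np.empty(n * dim)
    z[0] = z0
    for i in range(1, n * dim):
        z[i] = rho * z[i - 1] * (1.0 - z[i - 1] ** 2)  # chaotic value in (0, 1)
    return lb + (ub - lb) * z.reshape(n, dim)          # scale onto [lb, ub] (assumption)
```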

4.2. Nonlinear Parameter Control Strategy

From Equations (1), (3), and (4), we can see that the power exponent a plays an important role in BOA's ability to find the best solution. When a = 1, no scent is absorbed; that is, the scent emitted by a specific butterfly is fully perceived by the other butterflies, which means that the search range is narrowed and the local search ability of the algorithm is improved. When a = 0, the fragrance emitted by any butterfly cannot be perceived by the other butterflies, so the group expands the search range; that is, the global exploration ability of the algorithm is improved. However, a = 0.1 in the basic BOA, and taking a as a fixed value cannot effectively balance the global and local search capabilities. Therefore, we propose a nonlinear parameter control strategy as:
$$a(t) = a_{first} - \left(a_{first} - a_{final}\right) \cdot \sin\!\left(\frac{\pi}{\mu}\left(\frac{t}{T_{max}}\right)^{2}\right) \quad (10)$$
where $a_{first}$ and $a_{final}$ represent the initial and final values of the parameter $a$, $\mu$ is a tuning parameter, and $T_{max}$ represents the maximum number of iterations. In this paper, $\mu = 2$, $T_{max} = 500$, $a_{first} = 0.1$, and $a_{final} = 0.3$.
It can be seen from Figure 2a that, for the intensity indicator coefficient a, the nonlinear control strategy based on the sine function proposed in this paper has a larger slope in the early stage, which speeds up the algorithm's global search. The mid-term slope is reduced, which eases the transition into local search, and the later slope is gentle, allowing the algorithm to settle on the optimal solution. Therefore, it can effectively balance the global search and local search capabilities of the algorithm.
It can be seen from Figure 2b that the convergence curve of the improved BOA with the nonlinear parameter control strategy is better than that of the basic BOA on the Schwefel 1.2 test function. The curve has many turning points, indicating that the improved algorithm has the ability to jump out of local optima.
The results for different values of the main controlling parameter μ of the parameter a are shown in Figure 2c. As the value of μ increases, the effect of the improvement strategy gradually worsens. It can be seen from Figure 2c that the convergence curve of the improved BOA with μ = 2 is the best of the seven convergence curves. When μ ≥ 4, the convergence curve is worse than that of the original BOA.
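The schedule itself is one line of code; the sketch below follows the reconstruction of Equation (10) given above, so the exact analytic form should be treated as an assumption.

```python
import numpy as np

def power_exponent(t, T_max=500, a_first=0.1, a_final=0.3, mu=2.0):
    """Nonlinear schedule for the power exponent a, Eq. (10) (reconstructed form)."""
    return a_first - (a_first - a_final) * np.sin((np.pi / mu) * (t / T_max) ** 2)
```

With the default settings, a(t) rises from 0.1 towards 0.3 over the run, shifting the algorithm from global exploration towards local exploitation.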

4.3. Hybrid BOA with PSO

In this section, a novel hybrid PSOBOA is proposed, which is a combination of the separate PSO and BOA. The major difference between PSO and BOA is how new individuals are generated. The drawback of the PSO algorithm is that it covers only a limited region of the search space when solving high-dimensional optimization problems.
In order to combine the advantages of the two algorithms, we merge the functionality of both rather than running one algorithm after the other. In other words, the hybrid is heterogeneous because of the way the final results of the two algorithms are produced. The hybrid is proposed as follows:
$$V_i^{t+1} = w \cdot V_i^t + C_1 \cdot r_1 \times \left(p_{best} - X_i^t\right) + C_2 \cdot r_2 \times \left(g_{best} - X_i^t\right) \quad (11)$$
where $C_1 = C_2 = 0.5$, $w$ can also be calculated by Equation (7), and $r_1$ and $r_2$ are random numbers in (0, 1).
$$X_i^{t+1} = X_i^t + V_i^{t+1} \quad (12)$$
In addition, the global search phase and local search phase of the basic BOA are calculated by Equations (3) and (4). The global search phase of the hybrid PSOBOA, however, can be formulated as follows:
$$X_i^{t+1} = w \cdot X_i^t + \left(r^2 \times g_{best} - w \cdot X_i^t\right) \times f_i \quad (13)$$
The local search phase of the hybrid PSOBOA can be formulated as follows:
$$X_i^{t+1} = w \cdot X_i^t + \left(r^2 \times X_k^t - w \cdot X_j^t\right) \times f_i \quad (14)$$
where $X_j^t$ and $X_k^t$ are the jth and kth butterflies chosen randomly from the solution space, respectively.
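A minimal sketch of the two hybrid search phases (Equations (13) and (14)) follows; here w is the inertia weight from Equation (7) and f_i the fragrance from Equation (1), and the function name is an illustrative assumption.

```python
import numpy as np

def hybrid_move(X, i, g_best, f_i, w, p=0.6):
    """Move butterfly i by Eq. (13) (global) or Eq. (14) (local)."""
    n = X.shape[0]
    r = np.random.rand()                  # compared with switch probability p
    if r < p:                             # global search phase, Eq. (13)
        return w * X[i] + (r**2 * g_best - w * X[i]) * f_i
    j, k = np.random.randint(n, size=2)   # local search phase, Eq. (14)
    return w * X[i] + (r**2 * X[k] - w * X[j]) * f_i
```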
The pseudo-code of hybrid PSOBOA is shown in Algorithm 1.
Algorithm 1. Pseudo-code of hybrid PSO with BOA (PSOBOA)
1. Generate the initial population of butterflies Xi (i = 1, 2, …, n) randomly
2. Initialize the parameters r1, r2, C1, and C2
3. Define the sensory modality c, power exponent a, and switch probability p
4. Calculate the fitness value of each butterfly
5. While t = 1: the max iterations
6.   For each search agent
7.     Update the fragrance of the current search agent by Equation (1)
8.   End for
9.   Find the best f
10.   For each search agent
11.     Set a random number r in [0,1]
12.     If r < p then
13.       Move towards the best position by Equation (13)
14.     Else
15.       Move randomly by Equation (14)
16.     End if
17.   End for
18.   Update the velocity using Equation (11)
19.   Calculate the new fitness value of each butterfly
20.   If f_new < best f
21.     Update the position of best f using Equation (12)
22.   End if
23.   Update the value of the power exponent a
24.   t = t + 1
25. End while
26. Return the best solution and its fitness value

4.4. The Proposed HPSOBOA

In order to combine the advantages of the three improvement strategies proposed in this paper, a novel hybrid HPSOBOA is proposed in this section, which combines the cubic map for the initial population, the nonlinear parameter control strategy for the power exponent a, the PSO algorithm, and BOA.
The pseudo-code of novel HPSOBOA is shown in Algorithm 2.
Algorithm 2. Pseudo-code of novel HPSOBOA
1. Generate the initial population of butterflies Xi (i = 1, 2, …, n) using the cubic map
2. Initialize the parameters r1, r2, C1, and C2 and the switch probability p
3. Define the sensory modality c and the initial value of the power exponent a
4. Calculate the fitness value of each butterfly
5. While t = 1: the max iterations
6.   For each search agent
7.     Update the fragrance of the current search agent by Equation (1)
8.   End for
9.   Find the best f
10.   For each search agent
11.     Set a random number r in [0,1]
12.     If r < p then
13.       Move towards the best position by Equation (13)
14.     Else
15.       Move randomly by Equation (14)
16.     End if
17.   End for
18.   Update the velocity using Equation (11)
19.   Calculate the new fitness value of each butterfly
20.   If f_new < best f
21.     Update the position of best f using Equation (12)
22.   End if
23.   Update the value of the power exponent a using Equation (10)
24.   t = t + 1
25. End while
26. Output the best solution
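Putting the pieces together, the following is a minimal end-to-end sketch of HPSOBOA following Algorithm 2; it reuses the helpers sketched earlier (cubic_map_init, power_exponent, hybrid_move) and is an illustrative reading of the pseudo-code under stated assumptions (scalar bounds, clipping to the search range), not the authors' MATLAB code.

```python
import numpy as np

def hpsoboa(fitness, dim, lb, ub, n=30, T_max=500,
            c=0.01, p=0.6, C1=0.5, C2=0.5):
    X = cubic_map_init(n, dim, lb, ub)                 # step 1: cubic-map init
    V = np.zeros((n, dim))
    fvals = np.apply_along_axis(fitness, 1, X)
    p_best, pbest_f = X.copy(), fvals.copy()
    g_best, best_f = X[np.argmin(fvals)].copy(), fvals.min()
    for t in range(1, T_max + 1):
        a = power_exponent(t, T_max)                   # step 23, Eq. (10)
        w = 0.9 - (0.9 - 0.2) * t / T_max              # inertia weight, Eq. (7)
        f = c * np.abs(fvals) ** a                     # fragrance, Eq. (1)
        for i in range(n):                             # steps 10-17
            X[i] = hybrid_move(X, i, g_best, f[i], w, p)
        r1, r2 = np.random.rand(n, dim), np.random.rand(n, dim)
        V = w * V + C1 * r1 * (p_best - X) + C2 * r2 * (g_best - X)  # Eq. (11)
        X = np.clip(X + V, lb, ub)                     # Eq. (12), kept in bounds
        fvals = np.apply_along_axis(fitness, 1, X)     # step 19
        better = fvals < pbest_f
        p_best[better], pbest_f[better] = X[better], fvals[better]
        if fvals.min() < best_f:                       # steps 20-22
            best_f, g_best = fvals.min(), X[np.argmin(fvals)].copy()
        c = c + 0.025 / (c * T_max)                    # sensory modality, Eq. (2)
    return g_best, best_f

# Example: 30-dimensional Sphere function (F1 in Table 1)
# x_best, f_best = hpsoboa(lambda x: np.sum(x**2), dim=30, lb=-100, ub=100)
```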

5. Experiments and Comparison Results

In this section, we choose 26 high-dimensional test functions from the CEC benchmark functions; the name, range, type, and theoretical optimal value of each test function are shown in Table 1. Two experiments are then performed with ten algorithms, including the improved BOAs of this paper and other swarm or natural-science-based algorithms. In experiment 1, the performance was compared across six algorithms on six benchmark functions in dimensions 100 and 300, respectively. In experiment 2, the performance was compared across the ten algorithms on the 26 high-dimensional test functions with Dim = 30. Finally, statistical methods were applied, and the boxplots of the 30-run fitness values of the 26 test functions were also compared.

5.1. Numerical Optimization Functions and Experiments

The experiments were carried out on the same experimental platform. The results of all the algorithms were obtained using MATLAB 2018a installed on Windows 10 (64 bit) with an Intel(R) Core(TM) i5-10210U CPU @ 2.11 GHz and 16.0 GB of RAM.

5.1.1. The 26 Test Functions

The properties of unimodal and multimodal benchmark functions for numerical optimization, which are also high-dimensional test functions, are listed in Table 1, where Dim indicates the dimension of the function, and Range is the boundary of the function’s search space. These functions are used to test the performance of the algorithms.
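To make the benchmark definitions concrete, two of the functions in Table 1 are shown below as Python functions; these are standard textbook forms and serve only as examples of how the test suite can be coded.

```python
import numpy as np

def sphere(x):                     # F1, unimodal, fmin = 0 at x = 0
    return np.sum(x ** 2)

def rastrigin(x):                  # F15, multimodal, fmin = 0 at x = 0
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)
```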

5.1.2. Experiment 1: Comparison with BOA, CBOA, PSOBOA, HPSOBOA, LBOA, and IBOA

In order to analyze the effectiveness of the improvement strategies proposed in this paper, a comparison of BOA [1], CBOA, PSOBOA, HPSOBOA, LBOA [5], and IBOA [9] was designed for six high-dimensional functions from Table 1 with Dim = 100 and Dim = 300 as experiment 1. There are three unimodal problems and three multimodal problems. The CBOA combines the basic BOA with the cubic map and the nonlinear control strategy for the power exponent a; the hybrid PSOBOA just combines the basic BOA with the PSO algorithm; and the novel HPSOBOA is a combination of the three improvement strategies in Section 4. In addition, two improved BOAs are also compared in this experiment: LBOA [5] was proposed by Arora and Singh in 2017 to solve node localization in wireless sensor networks, and IBOA [9] was proposed by Yuan et al. in 2019 for optimizing system performance analyzed in terms of annual cost, exergy and energy efficiencies, and pollutant emission reduction.

5.1.3. Experiment 2: Comparison with Other Swarm Algorithms

In order to show that the novel hybrid algorithm is superior to other swarm algorithms, experiment 2 was designed for the 26 benchmark functions with Dim = 30. There are ten algorithms in this experiment; besides the six algorithms of experiment 1, we chose four further swarm intelligence optimization algorithms. The four algorithms, PSO [18], GWO [27], SCA [22], and MPA [32], were proposed in different years, and their principles are also different. The PSO and GWO algorithms simulate the behavior of animals in nature. The SCA is a physics-based algorithm, which moves towards the best solution using a mathematical model based on the sine and cosine functions. The MPA is based on the widespread foraging strategy, namely Lévy and Brownian movements, in ocean predators, along with the optimal encounter rate policy in the biological interaction between predator and prey.

5.1.4. Performance Measures

In order to analyze the performance of the algorithms, three criteria are considered for the different swarm algorithms: the Mean (Avg), the Standard deviation (Std), and the Success Rate (SR). The Mean is defined as:
$$Avg = \frac{1}{m}\sum_{i=1}^{m} F_i \quad (15)$$
where $m$ is the number of optimization test runs, and $F_i$ is the best fitness value.
The Standard deviation (Std) is defined as follows:
$$Std = \sqrt{\frac{1}{m}\sum_{i=1}^{m}\left(F_i - Avg\right)^2} \quad (16)$$
The Success Rate (SR) is defined as follows:
$$SR = \frac{m_{su}}{m_{all}} \times 100\% \quad (17)$$
where $m_{all}$ is the total number of optimization test runs, and $m_{su}$ is the number of runs in which the algorithm successfully reached the specified value, defined as $\varepsilon < 10^{-15}$.
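For reference, the three measures can be computed directly from the array of per-run best fitness values; the sketch below is an assumed implementation of Equations (15)-(17).

```python
import numpy as np

def performance(F, eps=1e-15):
    """F: best fitness value of each of the m runs; returns (Avg, Std, SR%)."""
    avg = F.mean()                                 # Mean, Eq. (15)
    std = np.sqrt(((F - avg) ** 2).mean())         # Standard deviation, Eq. (16)
    sr = 100.0 * (F < eps).sum() / F.size          # Success Rate, Eq. (17)
    return avg, std, sr
```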

5.2. Comparison of the Parameter Settings of Ten Algorithms

In the experiments, ten comparison algorithms were selected, namely, BOA [1], CBOA, PSOBOA, HPSOBOA, LBOA [5], IBOA [9], PSO [18], GWO [27], SCA [22], and MPA [32]. The parameter settings of the ten algorithms are shown in Table 2. In addition, the population size of each algorithm is set to 30, and the max iteration is set to 500. Each algorithm is run 30 times, and the Mean (Avg), Standard deviation (Std), Success Rate (SR), and Friedman rank [51] of the results are reported for the two experiments.

5.3. Results of Experiment 1

For experiment 1, in order to analyze the robustness of the hybrid algorithm with the three improved control strategies against the other swarm intelligence algorithms, the convergence curves for the six benchmark functions (Dim = 100) are plotted in Figure 3.
It can be verified from the convergence curves in Figure 3 that the proposed HPSOBOA converges faster than the other algorithms. The results show that the improved algorithm based on the three improvement strategies in this paper can effectively improve the convergence of the basic BOA when Dim = 100. From Figure 3 and Figure 4a–f, it can be seen that the proposed HPSOBOA converges better than the original BOA on these functions, except for the Schwefel 1.2 function when Dim = 300.
In order to analyze the robustness of the hybrid HPSOBOA with the three improved control strategies against the other five algorithms, the dimension of the six optimization problems is set to 300, and the convergence curves for the six benchmark functions are plotted in Figure 4.
Figure 5 shows the box plots of the optimization results of the six high-dimensional problems for the six algorithms. From Figure 3, Figure 4 and Figure 5, the optimization results of the hybrid HPSOBOA are better than those of the other algorithms.
In addition, statistical tests are essential to check for significant improvements of newly proposed algorithms over others. The Friedman rank test [51] was applied to the mean solutions; we used this method to compare the algorithms improved by the different control strategies. The Avg-rank and overall rank are shown in Table 3. According to the Friedman rank, the HPSOBOA outperforms all the comparison algorithms on the six numerical optimization problems (Schwefel 1.2, Sumsquare, Zakharov, Rastrigin, Ackley, and Alpine), and the order of the six algorithms with Dim = 100 is HPSOBOA > PSOBOA > IBOA > LBOA > CBOA > BOA. When Dim = 300, however, the order of the six algorithms is HPSOBOA > PSOBOA > IBOA > CBOA > LBOA > BOA.
From the results of this analysis, we can see that although HPSOBOA ranks better than the others, the IBOA, which uses chaotic theory to improve the control parameters, also performed well. Thus, different one-dimensional chaotic maps can also perform well in improving the basic BOA.

5.4. Results of Experiment 2

In experiment 2, the performance of the proposed algorithm was compared with the other optimization algorithms using the 26 test functions with Dim = 30. The statistical results include the Mean (Avg), the Standard deviation (Std), the Success Rate (SR), the Friedman rank test [51], and the Wilcoxon rank-sum test [52], because statistical tests are a principled way to analyze the significance of the improvements; these comparison results are presented in Table 4, Table 5, Table 6, Table 7 and Table 8.
The alpha is set to 0.05 in the Wilcoxon rank-sum (WRS) test and the Friedman rank test, and there are two hypotheses, called the null and the alternative. The null hypothesis is that there is no significant difference between the proposed algorithm and the others; it is accepted if the p-value is greater than alpha, and otherwise the alternative is accepted. Note that the last row in Table 4, Table 5, and Table 6 represents the rank of each algorithm by the number of best solutions. The p-values and the Friedman ranks show that the supremacy of the proposed algorithm is statistically significant.
The comparison results in Table 4 show that HPSOBOA yields the best results on the 26 test functions with Dim = 30 except for F6, F7, F10, F12, F13, F14, F17, F20, F21, F23, and F25. For functions F6, F7, F10, F12, and F23, the hybrid HPSOBOA obtains optimal fitness values that are close to those of the other algorithms but slightly worse. For F13, F14, F17, F20, F21, and F25, the best solutions are found by other algorithms, such as GWO, PSO, MPA, and IBOA; MPA obtains the best solution twice, and the IBOA, which is improved by the logistic map for the control parameters, also obtains the best solution twice. Combining the comparison results in Table 5 and Table 6, we can see that the IBOA is better than the others in the SR rank (with the specified value set to $\varepsilon < 10^{-15}$), and the order of the ten algorithms is IBOA > HPSOBOA > MPA > GWO > CBOA = LBOA > PSOBOA > PSO = SCA > BOA. The orders of HPSOBOA and IBOA differ only once, on function F17, where the SR of HPSOBOA is 93.33% but the SR of IBOA is 100% for reaching the global optimum within $\varepsilon < 10^{-15}$. Therefore, the performance of the proposed algorithm still needs to be improved in future work.
In addition, the comparison results of the Friedman rank test are shown in Table 6; from the Avg-rank, the final order of the rank means for the ten swarm algorithms is HPSOBOA > MPA > IBOA > PSOBOA > CBOA > GWO > LBOA > PSO > BOA > SCA. The WRS test values for the 26 high-dimensional test functions, HPSOBOA vs. the others, are given in Table 7 and Table 8, where N/A in Table 7 means not applicable. It can be seen from these tables that there is a significant difference between the proposed hybrid HPSOBOA and the other algorithms on the 26 test functions with Dim = 30. In Table 8, H = 1 indicates rejection of the null hypothesis at the 5% significance level, and H = 0 indicates a failure to reject the null hypothesis at the 5% significance level. In addition, Table 9 shows the comparison results of the t-test on the 26 benchmark functions for the proposed HPSOBOA against the other algorithms.
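The two significance tests can be reproduced with standard library routines; the SciPy calls below are an assumed equivalent of the tests reported in Table 6, Table 7 and Table 8 (the original study used MATLAB), with placeholder data standing in for the 30 per-run best fitness values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a, b = rng.random(30), rng.random(30)          # placeholder runs: two algorithms
stat, p_value = stats.ranksums(a, b)           # Wilcoxon rank-sum (WRS) test
h = int(p_value < 0.05)                        # H = 1 rejects the null hypothesis

results = rng.random((30, 10))                 # placeholder: 30 runs x 10 algorithms
chi2, p_fried = stats.friedmanchisquare(*results.T)  # Friedman rank test
```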
Figure 6 shows the box plots of the optimization results of the 26 high-dimensional problems for the ten algorithms, where each algorithm is run 30 times on each of the 26 test functions. It is clear from Figure 6 that the averages of the fitness function are not normally distributed. The values of SCA are relatively poor among the ten algorithms.

6. Conclusions and Future Work

In this paper, we proposed three improvement strategies: (1) initialization of BOA by the cubic map; (2) a nonlinear parameter control strategy for the power exponent a; and (3) hybridization of the PSO algorithm with BOA. These strategies all aim to improve the global optimization ability of the basic BOA.
In order to analyze the effectiveness of the improvement strategies, the novel hybrid algorithm was compared with other swarm algorithms in two experiments. To deal with the 26 high-dimensional optimization problems, the cubic map was employed for the initial population of HPSOBOA, and the experimental results show that the initial fitness values are superior to those of BOA and the other algorithms. In addition, the experimental results show that other one-dimensional chaotic maps may also perform well in improving the basic BOA. The MPA, proposed in 2020, is also expected to be applied in more fields.
In future work, the performance of the proposed algorithm needs to be further improved, including adjusting its control parameters to optimize algorithm performance. Two-dimensional and three-dimensional chaotic systems can, in theory, also improve BOA or other swarm intelligence algorithms. The improved algorithm can also be applied to real-world problems, such as engineering problems, wireless sensor network (WSN) deployment problems, proportional-integral-derivative (PID) control problems, and the analysis of regional economic activity [53].

Author Contributions

M.Z.: Conceptualization, Methodology, Software, Writing, and Language modification; J.Y.: Data curation, Writing—original draft preparation, and Programming calculation; D.L.: Data curation; T.Q.: Example analysis. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the National Natural Science Foundation of China (under Grant No. 61640014 and 61861007), Industrial Project of Guizhou province (Grant No. Qiankehe Zhicheng (2019) 2152), Science and Technology Fund of Guizhou Province under Grant Qiankehe (No. (2020)1Y266), Science and Technology Plan of Guizhou province Qiankehepingtai (No. (2017) 5788), Guizhou Graduate Innovation Fund (No. YJSCXJH (2019) 005), the platform of IOT personnel form Guiyang hi tech Development Zone under Grant 2015, postgraduate case library (under Grant No. KCALK201708), and Innovation group 2020 of Guizhou Provincial Education Department.

Acknowledgments

The authors are grateful for the support provided by the Guizhou Provincial Key Laboratory of Internet + Intelligent Manufacturing, Guiyang 550025, China.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Arora, S.; Singh, S. Butterfly optimization algorithm: A novel approach for global optimization. Soft Comput. 2019, 23, 715–734. [Google Scholar] [CrossRef]
  2. Arora, S.; Singh, S. Butterfly Algorithm with Lévy Flights for Global Optimization. In Proceedings of the 2015 International Conference on Signal Processing, Computing and Control (2015 ISPCC), Waknaghat, India, 24–26 September 2015; pp. 220–224. [Google Scholar]
  3. Arora, S.; Singh, S. An improved butterfly optimization algorithm with chaos. J. Intell. Fuzzy Syst. 2017, 32, 1079–1088. [Google Scholar] [CrossRef]
  4. Arora, S.; Singh, S. An Effective Hybrid Butterfly Optimization Algorithm with Artificial Bee Colony for Numerical Optimization. Int. J. Interact. Multimed. Artif. Intell. 2017, 4, 14–21. [Google Scholar] [CrossRef] [Green Version]
  5. Arora, S.; Singh, S. Node Localization in Wireless Sensor Networks Using Butterfly Optimization Algorithm. Arab. J. Sci. Eng. 2017, 42, 3325–3335. [Google Scholar] [CrossRef]
  6. Arora, S.; Singh, S.; Yetilmezsoy, K. A modified butterfly optimization algorithm for mechanical design optimization problems. J. Braz. Soc. Mech. Sci. 2018, 40, 1–17. [Google Scholar] [CrossRef]
  7. Singh, B.; Anand, P. A novel adaptive butterfly optimization algorithm. Int. J. Comput. Mater. Sci. Eng. 2019, 7, 1850026. [Google Scholar] [CrossRef]
  8. Sharma, S.; Saha, A.K. m-MBOA: A novel butterfly optimization algorithm enhanced with mutualism scheme. Soft Comput. 2019, 24, 4809–4827. [Google Scholar] [CrossRef]
  9. Yuan, Z.; Wang, W.; Wang, H.; Khodaei, H. Improved Butterfly Optimization Algorithm for CCHP Driven by PEMFC. Appl. Therm. Eng. 2019, 173, 114766. [Google Scholar]
  10. Li, G.; Shuang, F.; Zhao, P.; Le, C. An Improved Butterfly Optimization Algorithm for Engineering Design Problems Using the Cross-Entropy Method. Symmetry 2019, 11, 1049. [Google Scholar] [CrossRef] [Green Version]
  11. Wen, L.; Cao, Y. A hybrid intelligent predicting model for exploring household CO2 emissions mitigation strategies derived from butterfly optimization algorithm. Sci. Total Environ. 2020, 727, 138572. [Google Scholar] [CrossRef]
  12. Tan, L.S.; Zainuddin, Z.; Ong, P. Wavelet neural networks based solutions for elliptic partial differential equations with improved butterfly optimization algorithm training. Appl. Soft Comput. 2020, 95, 106518. [Google Scholar] [CrossRef]
  13. Malisetti, N.R.; Pamula, V.K. Performance of Quasi Oppositional Butterfly Optimization Algorithm for Cluster Head Selection in WSNs. Procedia Comput. Sci. 2020, 171, 1953–1960. [Google Scholar] [CrossRef]
  14. Sharma, T.K.; Kumar Sahoo, A.; Goyal, P. Bidirectional butterfly optimization algorithm and engineering applications. Mater. Today Proc. 2020, in press. [Google Scholar] [CrossRef]
  15. Holland, J.H. Genetic Algorithms. Sci. Am. 1992, 267, 66–72. [Google Scholar] [CrossRef]
  16. McCall, J. Genetic algorithms for modelling and optimisation. J. Comput. Appl. Math. 2005, 184, 205–222. [Google Scholar] [CrossRef]
  17. Storn, R.; Price, K. Differential Evolution—A Simple and Efficient Heuristic for global Optimization over Continuous Spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  18. Kennedy, J.; Eberhart, R. Particle Swarm Optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995. [Google Scholar]
  19. Dorigo, M.; Di Car, G. Ant Colony Optimization: A New Meta-Heuristic. In Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406), Washington, DC, USA, 6–9 July 1999; pp. 1470–1477. [Google Scholar]
  20. Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. J. Glob. Optim. 2007, 39, 459–471. [Google Scholar] [CrossRef]
  21. Rashedi, E.; Nezamabadi-pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  22. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl. Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  23. Hashim, F.A.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W.; Mirjalili, S. Henry gas solubility optimization: A novel physics-based algorithm. Future Gener. Comput. Syst. 2019, 101, 646–667. [Google Scholar] [CrossRef]
  24. Yang, X. A New Metaheuristic Bat-Inspired Algorithm. In Nature Inspired Cooperative Strategies for Optimization; Springer: Berlin/Heidelberg, Germany, 2010; pp. 65–74. [Google Scholar] [CrossRef] [Green Version]
  25. Gandomi, A.H.; Alavi, A.H. Krill herd: A new bio-inspired optimization algorithm. Commun. Nonlinear Sci. 2012, 17, 4831–4845. [Google Scholar] [CrossRef]
  26. Pan, W. A new Fruit Fly Optimization Algorithm: Taking the financial distress model as an example. Knowl. Based Syst. 2012, 26, 69–74. [Google Scholar] [CrossRef]
  27. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef] [Green Version]
  28. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl. Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  29. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  30. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
  31. Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper Optimisation Algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47. [Google Scholar] [CrossRef]
  32. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine Predators Algorithm: A nature-inspired metaheuristic. Expert Syst. Appl. 2020, 152, 113377. [Google Scholar] [CrossRef]
  33. Li, M.D.; Zhao, H.; Weng, X.W.; Han, T. A novel nature-inspired algorithm for optimization: Virus colony search. Adv. Eng. Softw. 2016, 92, 65–88. [Google Scholar] [CrossRef]
  34. Mirjalili, S.; Gandomi, A.H. Chaotic gravitational constants for the gravitational search algorithm. Appl. Soft Comput. 2017, 53, 407–419. [Google Scholar] [CrossRef]
  35. Faramarzi, A.; Heidarinejad, M.; Stephens, B.; Mirjalili, S. Equilibrium optimizer: A novel optimization algorithm. Knowl. Based Syst. 2020, 191, 105190. [Google Scholar] [CrossRef]
  36. Zhen, Z.; Wang, Z.; Gu, Z.; Liu, Y. A Novel Memetic Algorithm for Global Optimization Based on PSO and SFLA. Lecture Notes Comput. Sci. 2007, 4683, 127–136. [Google Scholar]
  37. Niu, B.; Li, L. A Novel PSO-DE-Based Hybrid Algorithm for Global Optimization. Lecture Notes Comput. Sci. 2008, 5227, 156–163. [Google Scholar]
  38. Lai, X.; Zhang, M. An efficient ensemble of GA and PSO for real function optimization. In Proceedings of the 2009 2nd IEEE International Conference on Computer Science and Information Technology, Beijing, China, 8–11 August 2009; pp. 651–655. [Google Scholar]
  39. Mirjalili, S.; Hashim, S.Z.M. A new hybrid PSOGSA algorithm for function optimization. In Proceedings of the International Conference on Computer and Information Application (ICCIA 2010), Tianjin, China, 3–5 December 2010; pp. 374–377. [Google Scholar]
  40. Wang, G.; Gandomi, A.H.; Alavi, A.H.; Deb, S. A hybrid method based on krill herd and quantum-behaved particle swarm optimization. Neural Comput. Appl. 2016, 27, 989–1006. [Google Scholar] [CrossRef]
  41. Trivedi, I.N.; Jangir, P.; Kumar, A.; Jangir, N.; Totlani, R.H.; Totlani, R. A Novel Hybrid PSO-DA Algorithm for Global Numerical Optimization. In Networking Communication and Data Knowledge Engineering; Springer: Singapore, 2017; pp. 287–298. [Google Scholar] [CrossRef]
  42. Trivedi, I.N.; Jangir, P.; Kumar, A.; Jangir, N.; Totlani, A.R. A Novel Hybrid PSO–WOA Algorithm for Global Numerical Functions Optimization. In Advances in Computer and Computational Sciences; Springer: Singapore, 2017; pp. 53–60. [Google Scholar] [CrossRef]
  43. Laskar, N.M.; Guha, K.; Chatterjee, I.; Chanda, S.; Baishnab, K.L.; Paul, P.K. HWPSO: A new hybrid whale-particle swarm optimization algorithm and its application in electronic design optimization problems. Appl. Intell. 2019, 49, 265–291. [Google Scholar] [CrossRef]
  44. Yu, P.; Chen, G. Hopf bifurcation control using nonlinear feedback with polynomial functions. Int. J. Bifurcat. Chaos 2004, 14, 1683–1704. [Google Scholar] [CrossRef]
  45. Xu, F.; Yu, P.; Liao, X. Global analysis on n-scroll chaotic attractors of modified Chua’s circuit. Int. J. Bifurcat. Chaos 2009, 19, 135–157. [Google Scholar] [CrossRef]
  46. Yu, P.; Lü, J. Bifurcation control for a class of Lorenz-like systems. Int. J. Bifurcat. Chaos 2011, 21, 2647–2664. [Google Scholar] [CrossRef]
  47. Xu, F.; Yu, P.; Liao, X. Synchronization and stabilization of multi-scroll integer and fractional order chaotic attractors generated using trigonometric functions. Int. J. Bifurcat. Chaos 2013, 23, 1350145. [Google Scholar] [CrossRef]
  48. Wang, G.; Guo, L.; Gandomi, A.H.; Hao, G.; Wang, H. Chaotic Krill Herd algorithm. Inf. Sci. 2014, 274, 17–34. [Google Scholar] [CrossRef]
  49. Yousri, D.; AbdelAty, A.M.; Said, L.A.; Elwakil, A.S.; Maundy, B.; Radwan, A.G. Chaotic Flower Pollination and Grey Wolf Algorithms for parameter extraction of bio-impedance models. Appl. Soft Comput. 2019, 75, 750–774. [Google Scholar] [CrossRef]
  50. Palacios, A. Cycling chaos in one-dimensional coupled iterated maps. Int. J. Bifurcat. Chaos 2002, 12, 1859–1868. [Google Scholar] [CrossRef] [Green Version]
  51. Meddis, R. Unified analysis of variance by ranks. Br. J. Math. Stat. Psychol. 1980, 33, 84–98. [Google Scholar] [CrossRef]
  52. Wilcoxon, F. Individual Comparisons by Ranking Methods. Biometr. Bull. 1945, 1, 80–83. [Google Scholar] [CrossRef]
  53. Caruso, G.; Di Battista, T.; Gattone, S.A. A Micro-level Analysis of Regional Economic Activity through a PCA Approach. Adv. Intell. Syst. Comput. 2020, 1009, 227–234. [Google Scholar]
Figure 1. Visualization of the implemented cubic map with ρ in (1.5, 3) and ρ = 2.595, respectively.
Figure 2. Variation curve of different intensity coefficients and convergence curve of the test function. (a) Two control parameter strategies, (b) Convergence curve of Schwefel 1.2, (c) Convergence curve of Schwefel 1.2 with Dim = 100 for different parameter value settings.
Figure 3. Convergence curve for six algorithms with Dim = 100; the six test functions' names are Schwefel 1.2, Sumsquare, Zakharov, Rastrigin, Ackley, and Alpine, respectively.
Figure 4. Convergence curve for six algorithms with Dim = 300; the six test functions' names are Schwefel 1.2, Sumsquare, Zakharov, Rastrigin, Ackley, and Alpine, respectively. (a) Schwefel 1.2, (b) Sumsquare, (c) Zakharov, (d) Rastrigin, (e) Ackley, (f) Alpine.
Figure 5. Boxplot for the 30-run fitness of six test functions with Dim = 100 and Dim = 300. (a) Schwefel 1.2, Sumsquare, and Zakharov with Dim = 100; (b) Rastrigin, Ackley, and Alpine with Dim = 100; (c) Schwefel 1.2, Sumsquare, and Zakharov with Dim = 300; (d) Rastrigin, Ackley, and Alpine with Dim = 300.
Figure 6. Boxplot for the algorithms run 30 times for the fitness of 26 test functions with Dim = 30.
Table 1. High-dimensional test functions.

| Name | Formula of Functions | Dim | Range | Type | fmin |
| --- | --- | --- | --- | --- | --- |
| Sphere | $F_1(x)=\sum_{i=1}^{Dim} x_i^2$ | 30 | [−100, 100] | U | 0 |
| Schwefel 2.22 | $F_2(x)=\sum_{i=1}^{Dim}\lvert x_i\rvert+\prod_{i=1}^{Dim}\lvert x_i\rvert$ | 30 | [−10, 10] | U | 0 |
| Schwefel 1.2 | $F_3(x)=\sum_{i=1}^{Dim}\left(\sum_{j=1}^{i} x_j\right)^2$ | 30 | [−100, 100] | U | 0 |
| Schwefel 2.21 | $F_4(x)=\max\left\{\lvert x_i\rvert,\ 1\le i\le Dim\right\}$ | 30 | [−10, 10] | U | 0 |
| Step | $F_5(x)=\sum_{i=1}^{Dim}\left(x_i+0.5\right)^2$ | 30 | [−10, 10] | U | 0 |
| Quartic | $F_6(x)=\sum_{i=1}^{Dim} i\cdot x_i^4+rand(0,1)$ | 30 | [−1.28, 1.28] | U | 0 |
| Exponential | $F_7(x)=\exp\left(0.5\sum_{i=1}^{Dim} x_i\right)$ | 30 | [−10, 10] | U | 0 |
| Sum power | $F_8(x)=\sum_{i=1}^{Dim}\lvert x_i\rvert^{(i+1)}$ | 30 | [−1, 1] | U | 0 |
| Sum square | $F_9(x)=\sum_{i=1}^{Dim} i\cdot x_i^2$ | 30 | [−10, 10] | U | 0 |
| Rosenbrock | $F_{10}(x)=\sum_{i=1}^{Dim-1}\left(100\left(x_{i+1}-x_i^2\right)^2+\left(x_i-1\right)^2\right)$ | 30 | [−5, 10] | U | 0 |
| Zakharov | $F_{11}(x)=\sum_{i=1}^{Dim} x_i^2+\left(\sum_{i=1}^{Dim} 0.5ix_i\right)^2+\left(\sum_{i=1}^{Dim} 0.5ix_i\right)^4$ | 30 | [−5, 10] | U | 0 |
| Trid | $F_{12}(x)=\left(x_1-1\right)^2+\sum_{i=2}^{Dim} i\cdot\left(2x_i^2-x_{i-1}\right)^2$ | 30 | [−10, 10] | U | 0 |
| Elliptic | $F_{13}(x)=\sum_{i=1}^{Dim}\left(10^6\right)^{(i-1)/(Dim-1)}\cdot x_i^2$ | 30 | [−100, 100] | U | 0 |
| Cigar | $F_{14}(x)=x_1^2+10^6\sum_{i=2}^{Dim} x_i^2$ | 30 | [−100, 100] | U | 0 |
| Rastrigin | $F_{15}(x)=\sum_{i=1}^{Dim}\left[x_i^2-10\cos\left(2\pi x_i\right)+10\right]$ | 30 | [−5.12, 5.12] | M | 0 |
| NCRastrigin | $F_{16}(x)=\sum_{i=1}^{Dim}\left[y_i^2-10\cos\left(2\pi y_i\right)+10\right]$, $y_i=\begin{cases} x_i, & \lvert x_i\rvert<0.5 \\ round\left(2x_i\right)/2, & \lvert x_i\rvert>0.5\end{cases}$ | 30 | [−5.12, 5.12] | M | 0 |
| Ackley | $F_{17}(x)=-20\exp\left(-0.2\sqrt{\frac{1}{Dim}\sum_{i=1}^{Dim} x_i^2}\right)-\exp\left(\frac{1}{Dim}\sum_{i=1}^{Dim}\cos\left(2\pi x_i\right)\right)+20+\exp(1)$ | 30 | [−50, 50] | M | 0 |
| Griewank | $F_{18}(x)=\frac{1}{4000}\sum_{i=1}^{Dim} x_i^2-\prod_{i=1}^{Dim}\cos\left(\frac{x_i}{\sqrt{i}}\right)+1$ | 30 | [−600, 600] | M | 0 |
| Alpine | $F_{19}(x)=\sum_{i=1}^{Dim}\lvert x_i\cdot\sin\left(x_i\right)+0.1x_i\rvert$ | 30 | [−10, 10] | M | 0 |
| Penalized 1 | $F_{20}(x)=\frac{\pi}{Dim}\left\{10\sin^2\left(\pi y_1\right)+\sum_{i=1}^{Dim-1}\left(y_i-1\right)^2\left[1+10\sin^2\left(\pi y_{i+1}\right)\right]+\left(y_{Dim}-1\right)^2\right\}+\sum_{i=1}^{Dim} u\left(x_i,10,100,4\right)$, with $y_i=1+\frac{x_i+1}{4}$ and $u\left(x_i,a,k,m\right)=\begin{cases} k\left(x_i-a\right)^m, & x_i>a \\ 0, & -a\le x_i\le a \\ k\left(-x_i-a\right)^m, & x_i<-a\end{cases}$ | 30 | [−100, 100] | M | 0 |
| Penalized 2 | $F_{21}(x)=0.1\left\{\sin^2\left(\pi x_1\right)+\sum_{i=1}^{Dim-1}\left(x_i-1\right)^2\left[1+\sin^2\left(3\pi x_{i+1}\right)\right]+\left(x_{Dim}-1\right)^2\left[1+\sin^2\left(2\pi x_{Dim}\right)\right]\right\}+\sum_{i=1}^{Dim} u\left(x_i,5,100,4\right)$ | 30 | [−100, 100] | M | 0 |
| Schwefel | $F_{22}(x)=\sum_{i=1}^{Dim}\lvert x_i\cdot\sin\left(\sqrt{\lvert x_i\rvert}\right)\rvert$ | 30 | [−100, 100] | M | 0 |
| Levy | $F_{23}(x)=\sin^2\left(3\pi x_1\right)+\sum_{i=1}^{Dim-1}\left(x_i-1\right)^2\left[1+\sin^2\left(3\pi x_{i+1}\right)\right]+\lvert x_{Dim}-1\rvert\cdot\left[1+\sin^2\left(2\pi x_{Dim}\right)\right]$ | 30 | [−10, 10] | M | 0 |
| Weierstrass | $F_{24}(x)=\sum_{i=1}^{Dim}\left(\sum_{k=0}^{k_{max}}\left[a^k\cos\left(2\pi b^k\left(x_i+0.5\right)\right)\right]\right)-Dim\cdot\sum_{k=0}^{k_{max}}\left[a^k\cos\left(2\pi b^k\cdot 0.5\right)\right]$, $a=0.5$, $b=3$, $k_{max}=20$ | 30 | [−1, 1] | M | 0 |
| Solomon | $F_{25}(x)=1-\cos\left(2\pi\sqrt{\sum_{i=1}^{Dim} x_i^2}\right)+0.1\sqrt{\sum_{i=1}^{Dim} x_i^2}$ | 30 | [−100, 100] | M | 0 |
| Bohachevsky | $F_{26}(x)=\sum_{i=1}^{Dim-1}\left[x_i^2+2x_{i+1}^2-0.3\cdot\cos\left(3\pi x_i\right)\right]$ | 30 | [−10, 10] | M | 0 |

where U represents unimodal and M represents multimodal.
Table 2. Parameter settings for algorithms.

| No. | Algorithms | Population Size | Parameter Settings |
| --- | --- | --- | --- |
| 1 | Butterfly Optimization Algorithm (BOA) | 30 | a = 0.1, c(0) = 0.01, p = 0.6 |
| 2 | Butterfly Optimization Algorithm with Cubic map (CBOA) | 30 | afirst = 0.1, afinal = 0.3, c(0) = 0.01, p = 0.6, x(0) = 0.315, ρ = 2.595 |
| 3 | PSOBOA | 30 | a = 0.1, c(0) = 0.01, p = 0.6, c1 = c2 = 0.5 |
| 4 | Hybrid PSO with BOA and Cubic map (HPSOBOA) | 30 | afirst = 0.1, afinal = 0.3, c(0) = 0.01, p = 0.6, x(0) = 0.315, ρ = 2.595, c1 = c2 = 0.5 |
| 5 | Butterfly Optimization Algorithm with Lévy flights (LBOA) | 30 | a = 0.1, c(0) = 0.01, p = 0.6, λ = 1.5 |
| 6 | Improved Butterfly Optimization Algorithm (IBOA) | 30 | a(0) = 0.1, c(0) = 0.01, p = 0.6, r(0) = 0.33, μ = 4 |
| 7 | Particle Swarm Optimization (PSO) | 30 | c1 = c2 = 2, Vmax = 1, Vmin = −1, ωmax = 0.9, ωmin = 0.2 |
| 8 | Grey Wolf Optimizer (GWO) | 30 | afirst = 2, afinal = 0 |
| 9 | Sine Cosine Algorithm (SCA) | 30 | a = 2, r1(0) = 2 |
| 10 | Marine Predators Algorithm (MPA) | 30 | a = 0.1, c(0) = 0.01, p = 0.6 |
Table 3. Optimization comparison results for 100-dimensional and 300-dimensional functions. In each row, the first six algorithm columns refer to Dim = 100 and the last six to Dim = 300.

| Functions | Metric | BOA | CBOA | PSOBOA | HPSOBOA | LBOA | IBOA | BOA | CBOA | PSOBOA | HPSOBOA | LBOA | IBOA |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Schwefel 1.2 | Worst | 8.23×10^−11 | 3.16×10^−18 | 6.40×10^−9 | 2.89×10^−207 | 1.67×10^−11 | 9.22×10^−29 | 9.21×10^−11 | 3.95×10^−27 | 4.06×10^−9 | 2.65×10^−76 | 2.56×10^−11 | 4.53×10^−29 |
| | Best | 5.68×10^−11 | 1.48×10^−30 | 4.03×10^−287 | 7.12×10^−218 | 6.56×10^−14 | 9.39×10^−34 | 6.14×10^−11 | 6.44×10^−41 | 4.93×10^−285 | 4.57×10^−274 | 3.27×10^−13 | 2.82×10^−32 |
| | Avg | 6.95×10^−11 | 1.12×10^−19 | 2.13×10^−10 | 2.32×10^−207 | 4.43×10^−12 | 4.34×10^−30 | 7.49×10^−11 | 1.32×10^−28 | 1.35×10^−10 | 8.85×10^−78 | 3.46×10^−12 | 2.73×10^−30 |
| | Std | 6.15×10^−12 | 5.76×10^−19 | 1.17×10^−9 | 0.00×10^0 | 4.29×10^−12 | 1.68×10^−29 | 7.44×10^−12 | 7.20×10^−28 | 7.41×10^−10 | 4.85×10^−77 | 4.86×10^−12 | 8.16×10^−30 |
| | rank | 5.97 | 3.97 | 1.97 | 1.17 | 4.97 | 2.97 | 5.97 | 2.80 | 1.87 | 1.63 | 4.97 | 3.77 |
| | SR/% | 0.00 | 100.00 | 96.67 | 100.00 | 0.00 | 100.00 | 0.00 | 100.00 | 96.67 | 100.00 | 0.00 | 100.00 |
| Sumsquare | Worst | 1.07×10^−10 | 2.33×10^−12 | 2.98×10^−9 | 5.82×10^−20 | 1.16×10^−11 | 1.35×10^−29 | 1.06×10^−10 | 2.50×10^−12 | 5.21×10^−9 | 8.92×10^−24 | 1.18×10^−11 | 4.98×10^−30 |
| | Best | 6.71×10^−11 | 4.45×10^−19 | 3.42×10^−294 | 3.47×10^−294 | 1.34×10^−14 | 9.55×10^−34 | 7.33×10^−11 | 1.00×10^−16 | 9.50×10^−272 | 1.35×10^−292 | 2.87×10^−15 | 5.72×10^−33 |
| | Avg | 8.63×10^−11 | 2.14×10^−13 | 1.01×10^−10 | 1.94×10^−21 | 3.20×10^−12 | 1.35×10^−30 | 8.95×10^−11 | 1.93×10^−13 | 1.98×10^−10 | 2.97×10^−25 | 3.03×10^−12 | 1.01×10^−30 |
| | Std | 8.78×10^−12 | 4.62×10^−13 | 5.43×10^−10 | 1.06×10^−20 | 2.80×10^−12 | 2.71×10^−30 | 8.84×10^−12 | 4.82×10^−13 | 9.56×10^−10 | 1.63×10^−24 | 3.64×10^−12 | 1.20×10^−30 |
| | rank | 5.97 | 3.93 | 1.90 | 1.43 | 4.93 | 2.83 | 5.93 | 3.90 | 1.70 | 1.77 | 4.90 | 2.80 |
| | SR/% | 0.00 | 43.33 | 93.33 | 100.00 | 0.00 | 100.00 | 0.00 | 46.67 | 90.00 | 100.00 | 3.33 | 100.00 |
| Zakharov | Worst | 1.11×10^−10 | 5.43×10^−12 | 2.38×10^−5 | 2.28×10^−71 | 1.95×10^−11 | 2.64×10^−29 | 1.03×10^−10 | 6.45×10^−13 | 2.45×10^−7 | 2.74×10^−72 | 2.04×10^−11 | 2.84×10^−29 |
| | Best | 5.70×10^−11 | 3.06×10^−17 | 2.14×10^−294 | 4.25×10^−289 | 7.01×10^−15 | 4.38×10^−33 | 6.88×10^−11 | 4.94×10^−16 | 7.09×10^−293 | 4.43×10^−287 | 4.80×10^−14 | 8.50×10^−33 |
| | Avg | 8.18×10^−11 | 5.01×10^−13 | 7.95×10^−7 | 7.59×10^−73 | 4.42×10^−12 | 1.57×10^−30 | 8.41×10^−11 | 9.96×10^−14 | 1.60×10^−8 | 9.12×10^−74 | 4.75×10^−12 | 3.43×10^−30 |
| | Std | 1.13×10^−11 | 1.21×10^−12 | 4.35×10^−6 | 4.16×10^−72 | 4.89×10^−12 | 4.85×10^−30 | 8.57×10^−12 | 1.61×10^−13 | 6.11×10^−8 | 5.00×10^−73 | 5.45×10^−12 | 6.09×10^−30 |
| | rank | 5.97 | 3.93 | 2.30 | 1.07 | 4.93 | 2.80 | 5.93 | 3.93 | 2.07 | 1.47 | 4.93 | 2.67 |
| | SR/% | 0.00 | 43.33 | 86.67 | 100.00 | 3.33 | 100.00 | 0.00 | 36.67 | 86.67 | 100.00 | 0.00 | 100.00 |
| Rastrigin | Worst | 4.44×10^−7 | 0.00×10^0 | 1.39×10^−9 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 | 2.36×10^−7 | 0.00×10^0 | 3.65×10^−9 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 |
| | Best | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 |
| | Avg | 1.48×10^−8 | 0.00×10^0 | 4.65×10^−11 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 | 7.88×10^−9 | 0.00×10^0 | 1.22×10^−10 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 |
| | Std | 8.11×10^−8 | 0.00×10^0 | 2.55×10^−10 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 | 4.32×10^−8 | 0.00×10^0 | 6.67×10^−10 | 0.00×10^0 | 0.00×10^0 | 0.00×10^0 |
| | rank | 3.92 | 3.40 | 3.48 | 3.40 | 3.40 | 3.40 | 3.58 | 3.47 | 3.55 | 3.47 | 3.47 | 3.47 |
| | SR/% | 83.33 | 100.00 | 96.67 | 100.00 | 100.00 | 100.00 | 96.67 | 100.00 | 96.67 | 100.00 | 100.00 | 100.00 |
| Ackley | Worst | 3.23×10^−8 | 1.62×10^−8 | 1.58×10^−6 | 1.85×10^−8 | 5.46×10^−10 | 8.88×10^−16 | 4.86×10^−8 | 6.44×10^−9 | 2.67×10^−9 | 2.51×10^−12 | 7.15×10^−9 | 8.88×10^−16 |
| | Best | 1.59×10^−9 | 3.02×10^−10 | 8.88×10^−16 | 8.88×10^−16 | 4.44×10^−15 | 8.88×10^−16 | 2.30×10^−8 | 1.80×10^−10 | 8.88×10^−16 | 8.88×10^−16 | 4.49×10^−13 | 8.88×10^−16 |
| | Avg | 1.34×10^−8 | 3.17×10^−9 | 5.26×10^−8 | 6.38×10^−10 | 4.02×10^−11 | 8.88×10^−16 | 3.38×10^−8 | 2.09×10^−9 | 1.75×10^−10 | 1.33×10^−13 | 5.19×10^−10 | 8.88×10^−16 |
| | Std | 8.14×10^−9 | 3.96×10^−9 | 2.88×10^−7 | 3.38×10^−9 | 1.20×10^−10 | 0.00×10^0 | 5.63×10^−9 | 1.97×10^−9 | 6.66×10^−10 | 5.21×10^−13 | 1.33×10^−9 | 0.00×10^0 |
| | rank | 5.97 | 4.93 | 2.10 | 2.20 | 3.90 | 1.90 | 6.00 | 4.97 | 2.08 | 2.05 | 4.00 | 1.90 |
| | SR/% | 0.00 | 0.00 | 96.67 | 86.67 | 13.33 | 100.00 | 0.00 | 0.00 | 90.00 | 93.33 | 0.00 | 100.00 |
| Alpine | Worst | 1.30×10^−10 | 6.56×10^−7 | 1.80×10^−12 | 9.13×10^−41 | 9.60×10^−12 | 4.41×10^−19 | 7.31×10^−10 | 1.80×10^−6 | 1.04×10^−11 | 8.54×10^−23 | 6.14×10^−11 | 4.97×10^−19 |
| | Best | 2.31×10^−11 | 3.36×10^−11 | 3.11×10^−146 | 9.85×10^−136 | 2.17×10^−17 | 3.81×10^−21 | 5.21×10^−11 | 5.68×10^−12 | 8.46×10^−147 | 3.37×10^−131 | 9.80×10^−18 | 4.96×10^−21 |
| | Avg | 6.94×10^−11 | 2.57×10^−8 | 6.01×10^−14 | 3.18×10^−42 | 9.82×10^−13 | 7.90×10^−20 | 2.67×10^−10 | 6.34×10^−8 | 3.46×10^−13 | 3.00×10^−24 | 7.53×10^−12 | 1.33×10^−19 |
| | Std | 3.01×10^−11 | 1.20×10^−7 | 3.29×10^−13 | 1.67×10^−41 | 2.21×10^−12 | 9.09×10^−20 | 1.76×10^−10 | 3.28×10^−7 | 1.89×10^−12 | 1.56×10^−23 | 1.22×10^−11 | 1.48×10^−19 |
| | rank | 5.00 | 6.00 | 1.73 | 1.37 | 4.00 | 2.90 | 5.77 | 5.23 | 1.43 | 1.73 | 4.00 | 2.83 |
| | SR/% | 0.00 | 0.00 | 96.67 | 100.00 | 43.33 | 100.00 | 0.00 | 0.00 | 96.67 | 100.00 | 10.00 | 100.00 |
| Avg.rank | | 5.464 | 4.361 | 2.247 | 1.772 | 4.356 | 2.800 | 5.531 | 4.050 | 2.117 | 2.019 | 4.378 | 2.906 |
| Final rank | | 6 | 5 | 2 | 1 | 4 | 3 | 6 | 4 | 2 | 1 | 5 | 3 |
Table 4. Comparison results for 26 test functions with Dim = 30 for ten algorithms.
Function | Metric | BOA | CABOA | PSOBOA | HBOAPSO | LBOA | IBOA | PSO | GWO | SCA | MPA
F1 | Avg | 7.78 × 10^−11 | 1.01 × 10^−13 | 1.68 × 10^−10 | 3.74 × 10^−104 | 3.92 × 10^−12 | 1.61 × 10^−30 | 1.11 × 10^−5 | 6.20 × 10^−28 | 1.39 × 10^1 | 4.93 × 10^−23
F1 | Std | 7.67 × 10^−12 | 2.11 × 10^−13 | 9.17 × 10^−10 | 2.05 × 10^−103 | 4.46 × 10^−12 | 3.90 × 10^−30 | 2.12 × 10^−5 | 7.68 × 10^−28 | 2.88 × 10^1 | 7.29 × 10^−23
F2 | Avg | 2.23 × 10^−8 | 1.25 × 10^−14 | 4.14 × 10^−10 | 2.63 × 10^−22 | 1.38 × 10^−9 | 5.11 × 10^−19 | 3.35 × 10^−3 | 1.04 × 10^−16 | 1.87 × 10^−2 | 2.99 × 10^−13
F2 | Std | 7.12 × 10^−9 | 2.15 × 10^−14 | 2.27 × 10^−9 | 1.44 × 10^−21 | 2.08 × 10^−9 | 1.73 × 10^−18 | 2.18 × 10^−3 | 8.66 × 10^−17 | 3.66 × 10^−2 | 2.56 × 10^−13
F3 | Avg | 6.34 × 10^−11 | 6.30 × 10^−13 | 8.05 × 10^−17 | 3.04 × 10^−71 | 2.74 × 10^−12 | 6.15 × 10^−31 | 1.23 × 10^2 | 7.24 × 10^−6 | 8.03 × 10^3 | 1.52 × 10^−4
F3 | Std | 5.70 × 10^−12 | 1.37 × 10^−12 | 4.41 × 10^−16 | 1.67 × 10^−70 | 2.44 × 10^−12 | 1.16 × 10^−30 | 5.98 × 10^2 | 1.51 × 10^−5 | 6.30 × 10^3 | 3.15 × 10^−4
F4 | Avg | 2.59 × 10^−8 | 2.77 × 10^−10 | 9.39 × 10^−8 | 3.61 × 10^−46 | 2.30 × 10^−9 | 1.36 × 10^−19 | 1.85 × 10^−1 | 8.57 × 10^−8 | 3.77 × 10^0 | 3.29 × 10^−10
F4 | Std | 2.58 × 10^−9 | 2.96 × 10^−10 | 5.14 × 10^−7 | 1.97 × 10^−45 | 2.36 × 10^−9 | 1.97 × 10^−19 | 4.62 × 10^−2 | 8.56 × 10^−8 | 1.30 × 10^0 | 2.23 × 10^−10
F5 | Avg | 5.17 × 10^0 | 8.50 × 10^−6 | 6.47 × 10^0 | 4.17 × 10^−2 | 3.52 × 10^0 | 4.44 × 10^0 | 3.69 × 10^−6 | 6.84 × 10^−1 | 4.85 × 10^0 | 1.25 × 10^−7
F5 | Std | 6.09 × 10^−1 | 1.06 × 10^−5 | 3.90 × 10^−1 | 6.41 × 10^−2 | 8.50 × 10^−1 | 8.70 × 10^−1 | 4.74 × 10^−6 | 4.38 × 10^−1 | 7.32 × 10^−1 | 4.78 × 10^−7
F6 | Avg | 2.03 × 10^−3 | 2.00 × 10^−3 | 2.53 × 10^−4 | 2.55 × 10^−4 | 2.10 × 10^−3 | 1.22 × 10^−4 | 7.98 × 10^−2 | 1.69 × 10^−3 | 1.19 × 10^−1 | 1.31 × 10^−3
F6 | Std | 8.70 × 10^−4 | 7.89 × 10^−4 | 3.21 × 10^−4 | 4.00 × 10^−4 | 9.63 × 10^−4 | 8.06 × 10^−5 | 3.14 × 10^−2 | 8.21 × 10^−4 | 1.04 × 10^−1 | 5.47 × 10^−4
F7 | Avg | 1.05 × 10^−11 | 1.48 × 10^−62 | 8.41 × 10^−11 | 1.48 × 10^−62 | 5.23 × 10^−20 | 1.19 × 10^−19 | 0.00 × 10^0 | 5.10 × 10^−58 | 1.38 × 10^−40 | 7.18 × 10^−66
F7 | Std | 4.21 × 10^−11 | 6.67 × 10^−63 | 2.94 × 10^−10 | 6.70 × 10^−63 | 1.41 × 10^−19 | 5.94 × 10^−19 | 0.00 × 10^0 | 1.71 × 10^−57 | 7.24 × 10^−40 | 7.74 × 10^−70
F8 | Avg | 6.33 × 10^−14 | 6.58 × 10^−15 | 1.42 × 10^−17 | 3.19 × 10^−118 | 7.51 × 10^−16 | 1.32 × 10^−36 | 1.37 × 10^−14 | 2.21 × 10^−95 | 7.27 × 10^−5 | 1.41 × 10^−60
F8 | Std | 3.60 × 10^−14 | 1.19 × 10^−14 | 7.78 × 10^−17 | 1.68 × 10^−117 | 9.49 × 10^−16 | 4.59 × 10^−36 | 4.69 × 10^−14 | 1.20 × 10^−94 | 2.25 × 10^−4 | 5.28 × 10^−60
F9 | Avg | 7.01 × 10^−11 | 2.91 × 10^−13 | 1.87 × 10^−16 | 2.72 × 10^−99 | 2.36 × 10^−12 | 5.60 × 10^−31 | 1.67 × 10^−4 | 1.50 × 10^−28 | 7.67 × 10^−1 | 1.07 × 10^−23
F9 | Std | 7.91 × 10^−12 | 7.08 × 10^−13 | 1.02 × 10^−15 | 1.31 × 10^−98 | 2.76 × 10^−12 | 1.87 × 10^−30 | 3.97 × 10^−4 | 1.96 × 10^−28 | 1.13 × 10^0 | 1.36 × 10^−23
F10 | Avg | 2.89 × 10^1 | 2.87 × 10^1 | 2.90 × 10^1 | 2.89 × 10^1 | 2.88 × 10^1 | 2.89 × 10^1 | 2.67 × 10^1 | 2.68 × 10^1 | 4.19 × 10^1 | 2.53 × 10^1
F10 | Std | 2.54 × 10^−2 | 1.39 × 10^−5 | 2.16 × 10^−2 | 8.18 × 10^−2 | 3.18 × 10^−2 | 3.40 × 10^−2 | 1.34 × 10^0 | 7.02 × 10^−1 | 4.32 × 10^1 | 3.86 × 10^−1
F11 | Avg | 6.72 × 10^−11 | 2.37 × 10^−14 | 1.32 × 10^−8 | 3.64 × 10^−78 | 2.78 × 10^−12 | 1.10 × 10^−30 | 9.02 × 10^−5 | 2.24 × 10^−28 | 8.79 × 10^0 | 1.09 × 10^−23
F11 | Std | 6.90 × 10^−12 | 4.24 × 10^−14 | 6.84 × 10^−8 | 1.99 × 10^−77 | 2.63 × 10^−12 | 2.90 × 10^−30 | 1.05 × 10^−4 | 3.10 × 10^−28 | 1.74 × 10^1 | 2.38 × 10^−23
F12 | Avg | 9.72 × 10^−1 | 4.77 × 10^−1 | 9.91 × 10^−1 | 9.75 × 10^−1 | 9.34 × 10^−1 | 9.71 × 10^−1 | 7.66 × 10^−1 | 6.67 × 10^−1 | 5.86 × 10^2 | 6.67 × 10^−1
F12 | Std | 1.14 × 10^−2 | 3.45 × 10^−1 | 5.24 × 10^−3 | 8.45 × 10^−2 | 2.11 × 10^−2 | 7.36 × 10^−3 | 3.39 × 10^−1 | 2.62 × 10^−6 | 2.29 × 10^3 | 5.38 × 10^−8
F13 | Avg | 1.16 × 10^−20 | 4.17 × 10^−30 | 6.44 × 10^−24 | 5.73 × 10^−92 | 1.20 × 10^−24 | 3.03 × 10^−35 | 7.70 × 10^−77 | 0.00 × 10^0 | 2.43 × 10^−96 | 1.97 × 10^−162
F13 | Std | 6.14 × 10^−20 | 2.18 × 10^−29 | 3.00 × 10^−23 | 3.14 × 10^−91 | 5.29 × 10^−24 | 6.42 × 10^−35 | 3.27 × 10^−76 | 0.00 × 10^0 | 1.31 × 10^−95 | 1.09 × 10^−161
F14 | Avg | 6.51 × 10^−17 | 2.24 × 10^−23 | 7.15 × 10^−15 | 1.28 × 10^−63 | 4.71 × 10^−18 | 8.45 × 10^−31 | 7.38 × 10^−61 | 8.59 × 10^−201 | 6.47 × 10^−67 | 1.07 × 10^−63
F14 | Std | 1.39 × 10^−16 | 7.51 × 10^−23 | 3.92 × 10^−14 | 7.04 × 10^−63 | 7.50 × 10^−18 | 2.52 × 10^−30 | 3.81 × 10^−60 | 0.00 × 10^0 | 3.37 × 10^−66 | 5.84 × 10^−63
F15 | Avg | 2.51 × 10^1 | 0.00 × 10^0 | 1.10 × 10^1 | 0.00 × 10^0 | 0.00 × 10^0 | 0.00 × 10^0 | 4.56 × 10^1 | 3.36 × 10^0 | 4.42 × 10^1 | 0.00 × 10^0
F15 | Std | 6.52 × 10^1 | 0.00 × 10^0 | 4.22 × 10^1 | 0.00 × 10^0 | 0.00 × 10^0 | 0.00 × 10^0 | 1.11 × 10^1 | 4.42 × 10^0 | 3.71 × 10^1 | 0.00 × 10^0
F16 | Avg | 9.36 × 10^1 | 0.00 × 10^0 | 2.00 × 10^1 | 0.00 × 10^0 | 0.00 × 10^0 | 0.00 × 10^0 | 4.48 × 10^1 | 8.20 × 10^0 | 7.08 × 10^1 | 1.01 × 10^−8
F16 | Std | 8.04 × 10^1 | 0.00 × 10^0 | 5.23 × 10^1 | 0.00 × 10^0 | 0.00 × 10^0 | 0.00 × 10^0 | 9.04 × 10^0 | 5.18 × 10^0 | 4.45 × 10^1 | 4.72 × 10^−8
F17 | Avg | 1.09 × 10^−9 | 1.84 × 10^−9 | 5.63 × 10^−8 | 8.96 × 10^−11 | 2.34 × 10^−12 | 8.88 × 10^−16 | 1.69 × 10^−3 | 2.79 × 10^0 | 2.03 × 10^1 | 1.06 × 10^−3
F17 | Std | 8.16 × 10^−10 | 1.76 × 10^−9 | 3.06 × 10^−7 | 4.73 × 10^−10 | 7.87 × 10^−12 | 0.00 × 10^0 | 1.32 × 10^−3 | 7.22 × 10^0 | 5.27 × 10^−2 | 5.83 × 10^−3
F18 | Avg | 7.64 × 10^−12 | 1.70 × 10^−14 | 2.61 × 10^−8 | 0.00 × 10^0 | 3.48 × 10^−13 | 0.00 × 10^0 | 5.33 × 10^−3 | 1.31 × 10^−3 | 2.17 × 10^−1 | 0.00 × 10^0
F18 | Std | 6.94 × 10^−12 | 1.82 × 10^−14 | 1.35 × 10^−7 | 0.00 × 10^0 | 8.78 × 10^−13 | 0.00 × 10^0 | 7.48 × 10^−3 | 4.99 × 10^−3 | 2.13 × 10^−1 | 0.00 × 10^0
F19 | Avg | 1.90 × 10^−10 | 6.76 × 10^−6 | 4.77 × 10^−7 | 2.54 × 10^−45 | 6.32 × 10^−14 | 8.93 × 10^−20 | 1.15 × 10^−3 | 5.15 × 10^−4 | 3.02 × 10^−1 | 2.12 × 10^−14
F19 | Std | 1.00 × 10^−10 | 3.13 × 10^−5 | 1.78 × 10^−6 | 1.39 × 10^−44 | 1.73 × 10^−13 | 1.19 × 10^−19 | 9.36 × 10^−4 | 7.29 × 10^−4 | 5.38 × 10^−1 | 1.52 × 10^−14
F20 | Avg | 5.56 × 10^−1 | 1.90 × 10^−4 | 8.75 × 10^−1 | 2.84 × 10^−3 | 3.06 × 10^−1 | 4.97 × 10^−1 | 4.44 × 10^0 | 4.74 × 10^−2 | 1.17 × 10^6 | 5.79 × 10^−5
F20 | Std | 1.40 × 10^−1 | 4.90 × 10^−4 | 2.11 × 10^−1 | 3.79 × 10^−3 | 9.96 × 10^−2 | 1.37 × 10^−1 | 2.62 × 10^0 | 2.27 × 10^−2 | 2.83 × 10^6 | 3.17 × 10^−4
F21 | Avg | 3.52 × 10^0 | 1.36 × 10^−2 | 4.42 × 10^0 | 3.93 × 10^−2 | 2.41 × 10^0 | 3.15 × 10^0 | 1.89 × 10^−6 | 9.27 × 10^−1 | 3.48 × 10^6 | 1.38 × 10^−2
F21 | Std | 5.92 × 10^−1 | 4.57 × 10^−2 | 7.32 × 10^−1 | 4.57 × 10^−2 | 5.15 × 10^−1 | 4.40 × 10^−1 | 3.64 × 10^−6 | 2.80 × 10^−1 | 6.37 × 10^6 | 3.69 × 10^−2
F22 | Avg | 9.76 × 10^0 | 5.52 × 10^−16 | 3.42 × 10^−7 | 8.38 × 10^−77 | 1.56 × 10^−3 | 3.56 × 10^−26 | 6.71 × 10^−4 | 5.34 × 10^−1 | 1.42 × 10^1 | 1.11 × 10^−1
F22 | Std | 7.84 × 10^0 | 4.56 × 10^−16 | 1.88 × 10^−6 | 4.30 × 10^−76 | 8.54 × 10^−3 | 7.84 × 10^−26 | 2.74 × 10^−3 | 4.25 × 10^−1 | 1.87 × 10^0 | 1.72 × 10^−1
F23 | Avg | 1.17 × 10^1 | 4.35 × 10^−4 | 1.75 × 10^1 | 7.28 × 10^−2 | 8.42 × 10^0 | 9.83 × 10^0 | 4.77 × 10^−2 | 1.42 × 10^0 | 1.74 × 10^1 | 1.35 × 10^−1
F23 | Std | 2.66 × 10^0 | 4.66 × 10^−4 | 3.79 × 10^0 | 1.87 × 10^−1 | 2.56 × 10^0 | 2.47 × 10^0 | 6.58 × 10^−2 | 1.13 × 10^0 | 3.58 × 10^0 | 1.13 × 10^−1
F24 | Avg | 6.21 × 10^−1 | 0.00 × 10^0 | 3.36 × 10^−11 | 0.00 × 10^0 | 0.00 × 10^0 | 0.00 × 10^0 | 9.15 × 10^−1 | 5.05 × 10^0 | 9.91 × 10^0 | 0.00 × 10^0
F24 | Std | 1.95 × 10^0 | 0.00 × 10^0 | 1.84 × 10^−10 | 0.00 × 10^0 | 0.00 × 10^0 | 0.00 × 10^0 | 1.40 × 10^0 | 2.35 × 10^0 | 2.00 × 10^0 | 0.00 × 10^0
F25 | Avg | 7.65 × 10^−1 | 7.30 × 10^−2 | 1.41 × 10^−1 | 2.53 × 10^−8 | 3.65 × 10^−2 | 2.25 × 10^−32 | 1.15 × 10^0 | 3.48 × 10^−1 | 1.66 × 10^0 | 9.95 × 10^−2
F25 | Std | 2.21 × 10^−1 | 4.47 × 10^−2 | 1.21 × 10^−1 | 1.38 × 10^−7 | 4.88 × 10^−2 | 5.88 × 10^−32 | 3.41 × 10^−1 | 1.13 × 10^−1 | 2.15 × 10^0 | 8.20 × 10^−17
F26 | Avg | 7.96 × 10^−11 | 6.54 × 10^−15 | 6.50 × 10^−12 | 0.00 × 10^0 | 3.22 × 10^−12 | 0.00 × 10^0 | 1.00 × 10^−1 | 0.00 × 10^0 | 7.60 × 10^−1 | 0.00 × 10^0
F26 | Std | 8.77 × 10^−12 | 1.52 × 10^−14 | 3.54 × 10^−11 | 0.00 × 10^0 | 2.68 × 10^−12 | 0.00 × 10^0 | 3.29 × 10^−1 | 0.00 × 10^0 | 1.27 × 10^0 | 0.00 × 10^0
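Each Avg/Std pair above summarizes repeated independent runs of one algorithm on one function; the success-rate granularity in Table 5 below (all values are multiples of 3.33% = 1/30) indicates 30 runs per cell. A minimal sketch of such a harness, assuming a user-supplied optimize callable (a hypothetical name, not part of the paper):

```python
import numpy as np

def summarize(optimize, runs=30):
    # optimize() performs one independent trial and returns the final best
    # objective value; 30 runs is inferred from Table 5's 3.33% steps
    finals = np.array([optimize() for _ in range(runs)])
    return {"Best": finals.min(), "Worst": finals.max(),
            "Avg": finals.mean(), "Std": finals.std(ddof=1)}  # sample Std assumed
```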
Table 5. The Success Rate for 26 benchmark functions.
Functions | BOA | CABOA | PSOBOA | HBOAPSO | LBOA | IBOA | PSO | GWO | SCA | MPA
F1 | 0.00 | 43.33 | 93.33 | 100.00 | 0.00 | 100.00 | 0.00 | 100.00 | 0.00 | 100.00
F2 | 0.00 | 76.67 | 86.67 | 100.00 | 0.00 | 100.00 | 0.00 | 100.00 | 0.00 | 0.00
F3 | 0.00 | 30.00 | 100.00 | 100.00 | 3.33 | 100.00 | 0.00 | 0.00 | 0.00 | 0.00
F4 | 0.00 | 0.00 | 90.00 | 100.00 | 0.00 | 100.00 | 0.00 | 0.00 | 0.00 | 0.00
F5 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
F6 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
F7 | 0.00 | 100.00 | 56.67 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00
F8 | 0.00 | 80.00 | 100.00 | 100.00 | 100.00 | 100.00 | 80.00 | 100.00 | 0.00 | 100.00
F9 | 0.00 | 56.67 | 100.00 | 100.00 | 3.33 | 100.00 | 0.00 | 100.00 | 0.00 | 100.00
F10 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
F11 | 0.00 | 63.33 | 90.00 | 100.00 | 0.00 | 100.00 | 0.00 | 100.00 | 0.00 | 100.00
F12 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
F13 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00
F14 | 100.00 | 100.00 | 96.67 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00
F15 | 50.00 | 100.00 | 86.67 | 100.00 | 100.00 | 100.00 | 0.00 | 0.00 | 0.00 | 100.00
F16 | 3.33 | 100.00 | 90.00 | 100.00 | 100.00 | 100.00 | 0.00 | 0.00 | 0.00 | 50.00
F17 | 0.00 | 0.00 | 83.33 | 93.33 | 30.00 | 100.00 | 0.00 | 0.00 | 0.00 | 0.00
F18 | 0.00 | 46.67 | 93.33 | 100.00 | 23.33 | 100.00 | 0.00 | 93.33 | 0.00 | 100.00
F19 | 0.00 | 0.00 | 86.67 | 100.00 | 56.67 | 100.00 | 0.00 | 33.33 | 0.00 | 23.33
F20 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
F21 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
F22 | 16.67 | 100.00 | 90.00 | 100.00 | 96.67 | 100.00 | 0.00 | 0.00 | 0.00 | 13.33
F23 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
F24 | 23.33 | 100.00 | 96.67 | 100.00 | 100.00 | 100.00 | 0.00 | 0.00 | 0.00 | 100.00
F25 | 0.00 | 0.00 | 10.00 | 100.00 | 30.00 | 100.00 | 0.00 | 0.00 | 0.00 | 0.00
F26 | 0.00 | 86.67 | 93.33 | 100.00 | 3.33 | 100.00 | 0.00 | 100.00 | 0.00 | 100.00
times | 2 | 7 | 4 | 18 | 7 | 19 | 3 | 9 | 3 | 11
SR rank | 8 | 5 | 6 | 2 | 5 | 1 | 7 | 4 | 7 | 3
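The SR/% entries count the runs in which an algorithm reaches a target accuracy; the "times" row counts the functions on which each algorithm achieves 100% success. The exact accuracy level is fixed earlier in the paper, so the sketch below uses a placeholder threshold and again assumes 30 runs per cell:

```python
import numpy as np

def success_rate(finals, threshold=1e-8):
    # threshold is a placeholder; the paper defines its own accuracy level
    finals = np.asarray(finals, dtype=float)
    return 100.0 * np.mean(finals <= threshold)

# e.g., 29 of 30 runs below the threshold -> 96.67, matching the table's granularity
print(round(success_rate([0.0] * 29 + [1.0]), 2))
```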
Table 6. The rank test for 26 benchmark functions.
Rank | BOA | CABOA | PSOBOA | HBOAPSO | LBOA | IBOA | PSO | GWO | SCA | MPA
F1 | 7.97 | 5.93 | 2.10 | 1.53 | 6.97 | 2.83 | 9.00 | 3.83 | 10.00 | 4.83
F2 | 8.00 | 4.87 | 1.93 | 1.63 | 6.97 | 2.83 | 9.03 | 3.87 | 9.97 | 5.90
F3 | 6.00 | 4.00 | 1.70 | 1.33 | 5.00 | 2.97 | 9.00 | 7.03 | 10.00 | 7.97
F4 | 7.17 | 4.07 | 1.83 | 1.47 | 5.97 | 2.87 | 9.00 | 7.77 | 10.00 | 4.87
F5 | 8.83 | 2.93 | 10.00 | 4.07 | 6.00 | 7.13 | 2.07 | 4.93 | 8.03 | 1.00
F6 | 6.87 | 6.67 | 2.43 | 1.77 | 7.07 | 1.80 | 9.47 | 5.27 | 9.53 | 4.13
F7 | 9.53 | 3.68 | 9.43 | 3.65 | 8.00 | 7.03 | 1.00 | 4.53 | 6.00 | 2.13
F8 | 8.97 | 6.97 | 1.90 | 1.60 | 6.20 | 4.93 | 7.87 | 2.73 | 10.00 | 3.83
F9 | 8.00 | 6.00 | 1.90 | 1.30 | 7.00 | 2.93 | 9.00 | 3.93 | 10.00 | 4.93
F10 | 7.70 | 3.90 | 9.13 | 6.97 | 5.30 | 6.30 | 2.97 | 2.63 | 9.07 | 1.03
F11 | 7.90 | 5.90 | 2.67 | 1.03 | 6.90 | 2.87 | 9.00 | 3.87 | 10.00 | 4.87
F12 | 6.23 | 2.97 | 7.83 | 8.60 | 4.70 | 6.17 | 4.00 | 2.77 | 10.00 | 1.73
F13 | 10.00 | 7.20 | 5.03 | 3.13 | 8.93 | 7.40 | 5.70 | 1.00 | 4.57 | 2.03
F14 | 9.97 | 7.57 | 3.83 | 2.07 | 8.97 | 7.30 | 5.67 | 1.50 | 4.33 | 3.80
F15 | 5.90 | 3.68 | 4.28 | 3.68 | 3.68 | 3.68 | 9.47 | 7.80 | 9.13 | 3.68
F16 | 8.90 | 3.18 | 4.02 | 3.18 | 3.18 | 3.18 | 8.53 | 7.13 | 9.07 | 4.62
F17 | 6.60 | 7.57 | 2.47 | 2.12 | 3.57 | 1.98 | 8.83 | 6.10 | 9.87 | 5.90
F18 | 7.87 | 5.82 | 3.30 | 2.97 | 6.75 | 2.97 | 9.00 | 3.37 | 10.00 | 2.97
F19 | 6.33 | 7.33 | 2.23 | 1.37 | 4.17 | 2.80 | 9.00 | 6.77 | 10.00 | 5.00
F20 | 7.03 | 2.00 | 8.03 | 3.00 | 5.03 | 6.03 | 8.87 | 4.00 | 10.00 | 1.00
F21 | 8.00 | 2.83 | 9.00 | 3.97 | 6.00 | 7.00 | 1.50 | 5.00 | 10.00 | 1.70
F22 | 8.40 | 4.93 | 2.30 | 1.03 | 4.03 | 2.87 | 6.53 | 8.33 | 9.73 | 6.83
F23 | 8.00 | 1.10 | 9.70 | 2.27 | 6.07 | 6.93 | 2.80 | 4.97 | 9.30 | 3.87
F24 | 6.43 | 3.58 | 3.68 | 3.58 | 3.58 | 3.58 | 7.97 | 9.00 | 10.00 | 3.58
F25 | 8.77 | 4.80 | 5.90 | 1.07 | 3.47 | 2.00 | 9.03 | 6.77 | 9.20 | 4.00
F26 | 7.97 | 4.60 | 3.53 | 3.23 | 6.97 | 3.23 | 9.00 | 3.23 | 10.00 | 3.23
Avg-rank | 7.82 | 4.77 | 4.62 | 2.75 | 5.79 | 4.29 | 7.05 | 4.93 | 9.15 | 3.83
Final rank | 9 | 5 | 4 | 1 | 7 | 3 | 8 | 6 | 10 | 2
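The fractional ranks in Table 6 (e.g., 3.68) are what one obtains by ranking the ten algorithms within each run with ties averaged and then averaging over runs; the Avg-rank and Final rank rows average and re-rank across all 26 functions. A sketch of that pipeline, assuming a results array of final objective values with shape (functions, runs, algorithms):

```python
import numpy as np
from scipy.stats import rankdata  # average-ties ranking, 1 = best (smallest)

def rank_table(results):
    per_run = np.apply_along_axis(rankdata, -1, results)  # rank within each run
    table6 = per_run.mean(axis=1)      # one row per function (Table 6 body)
    avg_rank = table6.mean(axis=0)     # the Avg-rank row
    final = rankdata(avg_rank)         # the Final rank row (1 = best overall)
    return table6, avg_rank, final
```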
Table 7. The p-value of Wilcoxon rank-sum (WRS) test for 26 benchmark functions.
Ranksum | BOA | CABOA | PSOBOA | LBOA | IBOA | PSO | GWO | SCA | MPA
F1 | 3.02 × 10^−11 | 3.02 × 10^−11 | 0.035137 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
F2 | 3.02 × 10^−11 | 3.02 × 10^−11 | 0.325527 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
F3 | 3.02 × 10^−11 | 3.02 × 10^−11 | 0.001597 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
F4 | 3.02 × 10^−11 | 3.02 × 10^−11 | 0.014412 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
F5 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 1.31 × 10^−8 | 3.02 × 10^−11 | 3.02 × 10^−11
F6 | 4.20 × 10^−10 | 2.15 × 10^−10 | 0.200949 | 1.33 × 10^−10 | 0.520145 | 3.02 × 10^−11 | 7.38 × 10^−10 | 3.02 × 10^−11 | 2.44 × 10^−9
F7 | 5.18 × 10^−12 | 1.00 × 10^0 | 5.18 × 10^−12 | 5.18 × 10^−12 | 5.16 × 10^−12 | 1.19 × 10^−13 | 0.009689 | 5.18 × 10^−12 | 9.85 × 10^−11
F8 | 3.02 × 10^−11 | 3.02 × 10^−11 | 0.122353 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
F9 | 3.02 × 10^−11 | 3.02 × 10^−11 | 0.001302 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
F10 | 0.340288 | 3.02 × 10^−11 | 3.16 × 10^−5 | 0.003671 | 0.200949 | 2.20 × 10^−7 | 4.50 × 10^−11 | 1.34 × 10^−5 | 3.02 × 10^−11
F11 | 0.340288 | 3.02 × 10^−11 | 3.16 × 10^−5 | 0.003671 | 0.200949 | 2.20 × 10^−7 | 4.50 × 10^−11 | 1.34 × 10^−5 | 3.02 × 10^−11
F12 | 8.48 × 10^−9 | 4.69 × 10^−8 | 4.12 × 10^−6 | 8.48 × 10^−9 | 8.48 × 10^−9 | 1.43 × 10^−8 | 5.57 × 10^−10 | 3.02 × 10^−11 | 5.57 × 10^−10
F13 | 3.02 × 10^−11 | 3.02 × 10^−11 | 0.00557 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 1.21 × 10^−12 | 1.29 × 10^−9 | 2.53 × 10^−4
F14 | 3.02 × 10^−11 | 3.02 × 10^−11 | 0.001953 | 3.02 × 10^−11 | 3.02 × 10^−11 | 4.62 × 10^−10 | 0.09049 | 7.69 × 10^−8 | 9.06 × 10^−8
F15 | 1.27 × 10^−5 | NaN | 0.041926 | NaN | NaN | 1.21 × 10^−12 | 1.19 × 10^−12 | 1.21 × 10^−12 | NaN
F16 | 1.21 × 10^−12 | NaN | 0.041926 | NaN | NaN | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.27 × 10^−5
F17 | 6.03 × 10^−11 | 2.89 × 10^−11 | 0.248673 | 5.93 × 10^−7 | 0.160802 | 2.37 × 10^−12 | 2.80 × 10^−10 | 2.37 × 10^−12 | 6.24 × 10^−10
F18 | 1.21 × 10^−12 | 4.57 × 10^−12 | 0.160802 | 4.57 × 10^−12 | NaN | 1.21 × 10^−12 | 0.160802 | 1.21 × 10^−12 | NaN
F19 | 3.02 × 10^−11 | 3.02 × 10^−11 | 0.003671 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
F20 | 3.02 × 10^−11 | 3.08 × 10^−8 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.34 × 10^−11 | 3.02 × 10^−11 | 1.09 × 10^−10
F21 | 3.02 × 10^−11 | 9.51 × 10^−6 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 4.12 × 10^−6
F22 | 3.02 × 10^−11 | 3.02 × 10^−11 | 1.58 × 10^−4 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
F23 | 3.02 × 10^−11 | 1.39 × 10^−6 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 1.11 × 10^−4 | 2.37 × 10^−10 | 3.02 × 10^−11 | 1.86 × 10^−6
F24 | 1.95 × 10^−9 | NaN | 0.333711 | NaN | NaN | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | NaN
F25 | 3.02 × 10^−11 | 5.49 × 10^−11 | 1.09 × 10^−10 | 8.89 × 10^−10 | 8.48 × 10^−9 | 1.90 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 1.55 × 10^−11
F26 | 1.21 × 10^−12 | 2.93 × 10^−5 | 0.160802 | 1.21 × 10^−12 | NaN | 1.21 × 10^−12 | NaN | 1.21 × 10^−12 | NaN
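Table 7 reports two-sided Wilcoxon rank-sum p-values, presumably pairing each listed algorithm against the proposed hybrid, whose column is absent from this table onward. The recurring 3.02 × 10^−11 is the familiar floor for two completely separated samples of 30 runs, and NaN appears where both samples are identical (e.g., both all zero), leaving the test undefined. A sketch using SciPy's analogous test on hypothetical data (MATLAB's ranksum, which these values resemble, additionally applies a continuity correction):

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
proposed = rng.normal(0.0, 1e-12, size=30)    # hypothetical final values, 30 runs
competitor = rng.normal(1e-6, 1e-7, size=30)  # hypothetical competitor results

stat, p = ranksums(proposed, competitor)
print(stat, p)  # fully separated samples give p on the order of 1e-11
```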
Table 8. The hypothesis (H) of WRS test for 26 benchmark functions.
H | BOA | CABOA | PSOBOA | LBOA | IBOA | PSO | GWO | SCA | MPA
F1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
F2 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1
F3 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
F4 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
F5 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1
F6 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1
F7 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
F8 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1
F9 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
F10 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1
F11 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1
F12 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
F13 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
F14 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1
F15 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0
F16 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1
F17 | 1 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 1
F18 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0
F19 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
F20 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
F21 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
F22 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
F23 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
F24 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0
F25 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
F26 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0
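The H values are, for the most part, consistent with thresholding the p-values of Table 7 at a significance level of α = 0.05, with NaN (undefined test) mapped to 0, as in this sketch:

```python
import numpy as np

def hypothesis(p, alpha=0.05):
    # H = 1: reject the null hypothesis (significant difference at level alpha);
    # NaN arises when both samples are identical, and is mapped to 0
    return 0 if np.isnan(p) else int(p < alpha)

print(hypothesis(3.02e-11))   # 1
print(hypothesis(0.160802))   # 0
print(hypothesis(float("nan")))  # 0
```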
Table 9. The t-test for 26 benchmark functions.
t-test | BOA | CABOA | PSOBOA | LBOA | IBOA | PSO | GWO | SCA | MPA
F1 | 6.6456 | 6.6456 | 2.1068 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456
F2 | 6.6456 | 6.6456 | 0.9832 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456
F3 | 6.6456 | 6.6456 | 3.1565 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456
F4 | 6.6456 | 6.6456 | 2.4468 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456
F5 | 6.6456 | −6.6456 | 6.6456 | 6.6456 | 6.6456 | −6.6456 | 5.6846 | 6.6456 | −6.6456
F6 | 6.2464 | 6.3499 | 1.2789 | 6.4238 | 0.6431 | 6.6456 | 6.1577 | 6.6456 | 5.9655
F7 | 6.9005 | 0.0000 | 6.9005 | 6.9005 | 6.9010 | −7.4180 | 2.5867 | 6.9005 | −6.4692
F8 | 6.6456 | 6.6456 | 1.5450 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456
F9 | 6.6456 | 6.6456 | 3.2156 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456
F10 | 0.9536 | −6.6456 | 4.1618 | −2.9051 | −1.2789 | −5.1819 | −6.5865 | 4.3540 | −6.6456
F11 | 6.6456 | 6.6456 | 3.7183 | 6.6456 | 6.6456 | 6.6456 | −6.5865 | 6.6456 | 6.6456
F12 | −5.7585 | −5.4628 | −4.6053 | −5.7585 | −5.7585 | −5.6698 | −6.2021 | 6.6456 | −6.2021
F13 | 6.6456 | 6.6456 | 2.7721 | 6.6456 | 6.6456 | 6.6456 | −7.1040 | 6.0690 | −3.6591
F14 | 6.6456 | 6.6456 | 3.0973 | 6.6456 | 6.6456 | 6.2316 | −1.6928 | 5.3741 | 5.3446
F15 | 4.3649 | NaN | 2.0343 | NaN | NaN | 7.1040 | 7.1063 | 7.1040 | NaN
F16 | 7.1040 | NaN | 2.0343 | NaN | NaN | 7.1040 | 7.1040 | 7.1040 | 4.3650
F17 | 6.5431 | 6.6523 | 1.1536 | 4.9936 | −1.4024 | 7.0110 | 6.3094 | 7.0110 | 6.1844
F18 | 7.1040 | 6.9183 | 1.4024 | 6.9182 | NaN | 7.1040 | 1.4024 | 7.1040 | NaN
F19 | 6.6456 | 6.6456 | 2.9051 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456
F20 | 6.6456 | −5.5368 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6308 | 6.6456 | −6.4534
F21 | 6.6456 | −4.4279 | 6.6456 | 6.6456 | 6.6456 | −6.6456 | 6.6456 | 6.6456 | −4.6053
F22 | 6.6456 | 6.6456 | 3.7774 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456 | 6.6456
F23 | 6.6456 | −4.8271 | 6.6456 | 6.6456 | 6.6456 | 3.8661 | 6.3351 | 6.6456 | 4.7680
F24 | 6.0023 | NaN | 0.9667 | NaN | NaN | 7.1040 | 7.1040 | 7.1040 | NaN
F25 | 6.6456 | 6.5569 | 6.4534 | 6.1281 | 5.7585 | 6.7136 | 6.6456 | 6.6456 | 6.7434
F26 | 7.1040 | 4.1785 | 1.4024 | 7.1040 | NaN | 7.1040 | NaN | 7.1040 | NaN
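Although Table 9 is labelled a t-test, the recurring ±6.6456 entries coincide with the continuity-corrected normal approximation of the rank-sum statistic for two completely separated samples of 30 runs, which corresponds exactly to Table 7's p-value floor of 3.02 × 10^−11. A quick check of that pair (an observation about the numbers, not something stated in the paper):

```python
from math import sqrt
from scipy.stats import norm

n1 = n2 = 30
w_min = n1 * (n1 + 1) / 2                     # 465: one sample ranks entirely below the other
mu = n1 * (n1 + n2 + 1) / 2                   # 915: expected rank sum under the null
sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)  # ~67.64
z = (w_min - mu + 0.5) / sigma                # continuity-corrected: ~ -6.6456
p = 2.0 * norm.sf(abs(z))                     # ~ 3.02e-11
print(round(z, 4), p)
```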