Article

Modified Flower Pollination Algorithm for Global Optimization

1 Department of Computer Science, Faculty of Computers and Informatics, Zagazig University, Zagazig 44519, Egypt
2 Department of Statistics and Operations Research, College of Science, King Saud University, Riyadh 11451, Saudi Arabia
3 Department of Mathematics, Faculty of Science, Mansoura University, Mansoura 35516, Egypt
4 Department of Computational Mathematics, Science, and Engineering (CMSE), College of Engineering, Michigan State University, East Lansing, MI 48824, USA
* Author to whom correspondence should be addressed.
Mathematics 2021, 9(14), 1661; https://doi.org/10.3390/math9141661
Submission received: 27 May 2021 / Revised: 28 June 2021 / Accepted: 30 June 2021 / Published: 15 July 2021

Abstract

In this paper, a modified flower pollination algorithm (MFPA) is proposed to improve the performance of the classical algorithm and to tackle the nonlinear equation systems widely used in engineering and science. In addition, differential evolution (DE) is integrated with MFPA to strengthen its exploration operator in a new variant called HFPA. The two algorithms were assessed using 23 well-known unimodal and multimodal mathematical test functions and 27 well-known nonlinear equation systems, and the obtained outcomes were extensively compared with those of eight well-known metaheuristic algorithms under various statistical analyses and convergence-curve comparisons. The experimental findings show that MFPA and HFPA are competitive with each other and, compared to the others, superior or competitive for most test cases.

1. Introduction

In recent decades, meta-heuristic optimization algorithms have been widely adopted in several fields to tackle numerous optimization problems, especially engineering problems, because they avoid stagnation in local optima and converge quickly toward near-optimal solutions [1]. Meta-heuristic algorithms are commonly classified into four categories according to their source of inspiration: evolutionary algorithms, physics-based algorithms, swarm-based algorithms, and human-based algorithms. The first category, evolution-based algorithms, mimics biological evolution, using reproduction, mutation, recombination, and selection to produce offspring stronger than their parents. The most popular evolutionary algorithms, which have been applied to a wide range of optimization problems, are genetic algorithms (GA) [2], evolution strategy (ES) [3], genetic programming (GP) [4], probability-based incremental learning (PBIL) [5], and biogeography-based optimization (BBO) [6].
Physics-based algorithms simulate the laws of physics, producing algorithms with various behaviors in the hope of achieving better outcomes; some of these algorithms are simulated annealing (SA) [7], Big-Bang Big-Crunch (BBBC) [8], the Gravitational Search Algorithm (GSA) [9], the Small-World Optimization Algorithm (SWOA) [10], Curved Space Optimization (CSO) [11], the Galaxy-based Search Algorithm (GbSA) [12], Charged System Search (CSS) [13], the Artificial Chemical Reaction Optimization Algorithm (ACROA) [14], Ray Optimization (RO) [15], the Equilibrium Optimizer (EO) [16], the Billiards-inspired Optimization Algorithm (BOA) [17], and Black Hole (BH) [18].
The third category, swarm-based algorithms, models the social behaviors of birds and animals; these algorithms include Particle Swarm Optimization (PSO) [19], the Whale Optimization Algorithm (WOA) [1], the Harris Hawks Algorithm (HHA) [20], the Marine Predators Algorithm (MPA) [21], the Slime Mould Algorithm (SMA) [22], Ant Colony Optimization (ACO) [23], the Grey Wolf Optimizer (GWO) [24], Cuckoo Search (CS) [25], the Bat Algorithm (BA) [26], the flower pollination algorithm (FPA) [27], and several others [28,29,30,31,32,33,34,35,36,37,38,39,40].
The last category, human-based algorithms, emulates human behaviors to propose algorithms with different methodologies; these include Teaching-Learning-Based Optimization (TLBO) [41], Harmony Search (HS) [42], the League Championship Algorithm (LCA) [43], Group Counseling Optimization (GCO) [44,45], the Mine Blast Algorithm (MBA) [46], the Seeker Optimization Algorithm (SOA) [47], the Soccer League Competition (SLC) algorithm [48,49], the Fireworks Algorithm [50], and many others [51].
The significant successes achieved by metaheuristic algorithms have made them the leading optimization models for tackling numerous optimization problems in a reasonable time [52]. One of the most popular problem classes tackled by these algorithms is nonlinear equation systems (NESs).
Nonlinear equation systems (NESs) arise frequently in engineering and science, and solving them has recently attracted the attention of several researchers seeking effective optimization methods [53,54,55]. The optimization methods proposed for tackling NESs fall into two categories: metaheuristic and classical. The metaheuristic techniques have won significant interest over the classical ones because they avert getting stuck in local minima, accelerate convergence, do not depend on the initial guess, and achieve better outcomes in a reasonable time, as discussed before. Several papers apply metaheuristic algorithms of all four types (human-, evolution-, swarm-, and physics-based) to tackle NESs, as discussed in the next section. The mathematical model of a NES is described as:
$$S(x) = \begin{cases} f_1(x_1, x_2, \ldots, x_D) = 0 \\ f_2(x_1, x_2, \ldots, x_D) = 0 \\ f_3(x_1, x_2, \ldots, x_D) = 0 \\ \quad\vdots \\ f_n(x_1, x_2, \ldots, x_D) = 0 \end{cases} \tag{1}$$
where D refers to the number of dimensions, n is the number of equations, and x is the solution to the NES. As shown in Equation (1), a NES consists of more than one objective function; hence, metaheuristic algorithms designed for single-objective problems cannot solve it directly. Therefore, the NES is converted into a single objective, solvable by metaheuristic algorithms, using the following formula:
$$f(x) = \sum_{i=1}^{n} f_i^2(x) \tag{2}$$
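To make Equation (2) concrete, the following is a minimal Python sketch (the paper's experiments are in MATLAB) of folding a NES into one scalar objective; the two-equation system shown is a hypothetical illustration, not one of the 27 benchmark NESs:

```python
import numpy as np

def nes_objective(equations):
    """Fold a system of residuals f_i(x) = 0 into the single
    objective f(x) = sum_i f_i(x)^2 of Equation (2)."""
    def f(x):
        return sum(fi(x) ** 2 for fi in equations)
    return f

# Hypothetical 2-D system: x0^2 + x1 - 3 = 0 and x0 + x1^2 - 5 = 0.
system = [
    lambda x: x[0] ** 2 + x[1] - 3.0,
    lambda x: x[0] + x[1] ** 2 - 5.0,
]
f = nes_objective(system)
print(f(np.array([1.0, 2.0])))  # 0.0 at this exact root, > 0 elsewhere
```

Any solution that drives f(x) to zero is a root of the original system, so minimizing f(x) with a single-objective metaheuristic solves the NES.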
The flower pollination algorithm (FPA), proposed for global optimization by mimicking the pollination process of flowers, has performed effectively on several optimization problems [27,56,57,58,59,60,61]. Unfortunately, its performance still suffers substantially from stagnation in local minima, because it cannot explore enough regions of the search space during the optimization process, and from low convergence speed, which makes the classical FPA consume several iterations searching for better solutions within unpromising regions. Broadly speaking, the classical FPA was evaluated on 10 mathematical test functions with a population size of 25 and a maximum iteration count reaching 10,840, a considerable budget for reaching the desired outcomes. Furthermore, the authors in [56] hybridized the classical FPA with the clonal selection algorithm (CSA) to solve 23 global test functions; this work improved the local search of the classical FPA to avoid getting stuck in local minima and to reach better outcomes. In our opinion, hybridizing two algorithms, mixing the advantages of each to overcome their respective disadvantages, is a good alternative, but it can be ineffective because of the difficulty of finding two or more algorithms that complement each other well. As an easier alternative, we argue that the structure of the classical algorithm should be redesigned with various updating schemes that explore several regions of the search space, to reach better outcomes without wasting iterations.
Therefore, a new modified FPA (MFPA) is proposed in this paper to overcome the aforementioned problems, by building an effective mathematical model that hybridizes various updating schemes, enabling the modified algorithm to adapt itself while searching for solutions to the optimization problem. This modified algorithm outperforms the standard one on all 23 well-known unimodal and multimodal test functions and all 27 NESs. Unfortunately, it still suffers from a weak exploration operator, which prevents it from reaching better outcomes than the competing algorithms on some test functions.
Differential evolution (DE) is a well-established evolutionary algorithm that has been successfully applied to several optimization problems, both continuous and discrete, and offers various updating schemes. One of these schemes with high exploration ability is "DE/rand/1", which explores the regions around a solution selected randomly from the population [62]. Several DE variants based on various updating schemes, or on hybridizations between DE and other effective techniques, have been extensively applied to global optimization [63,64,65,66,67,68,69,70,71,72,73]. Since our proposed MFPA suffers from a weak exploration operator, and the DE updating scheme "DE/rand/1" leans more toward exploration than exploitation, DE is integrated with MFPA to propose a new variant called HFPA, which balances exploration and exploitation, preserving the population diversity to avoid getting stuck in local minima while moving accurately toward the best-so-far solution to reduce time-consuming fitness function evaluations. After validation and comparison on 23 well-known unimodal and multimodal test functions, HFPA achieved superior outcomes compared to the rival algorithms, with success rates reaching 78% and 70% in the best and average cases, also outperforming MFPA, the second-best algorithm, with percentages of 61% and 52%.
Finally, our proposed algorithms, HFPA and MFPA, were further investigated on 27 NESs and compared with some recent and well-established optimization algorithms; the results show that the proposed algorithms are better, with outperformance rates reaching 100% and 81% in the best case and 67% and 37% in the average case. Based on the experimental findings, it is concluded that HFPA is better than all competing algorithms, including MFPA, for both global optimization and NESs; thus, it is a strong alternative for tackling these two types of optimization problems. Briefly, this paper presents the following contributions:
Proposes a modified variant of the classical FPA, namely MFPA, with various updating schemes to tackle both global optimization and NESs.
Improves the exploration operator of MFPA using the DE with the “DE/rand/1” scheme to propose a new hybrid variant, called HFPA, with strong attributes.
The experimental findings show that HFPA has superior performance for tackling global optimization and NESs compared to eight rival algorithms and MFPA.
The structure of this paper is depicted in Figure 1: Section 2 reviews prior work on tackling NESs. Section 3 describes the standard FPA and DE algorithms. Section 4 presents our proposed algorithms in detail. Section 5 reports various experiments and discussions. Finally, Section 6 presents our conclusions and future work.

2. Literature Review: NESs

As mentioned above, NESs are a hot research area that has attracted the attention of researchers aiming to propose an optimization model that can solve them optimally in a reasonable time. Researchers have therefore moved significantly towards metaheuristic algorithms as a strong alternative to the classical methods for tackling NESs. Some of the metaheuristic algorithms proposed for tackling NESs are reviewed next.
Ramadas and Fernandes [74] employed some variants of the harmony search algorithm to tackle NESs. Furthermore, the social emotion optimization algorithm (SEOA) was recently improved to develop a new variant, namely HSEOA, which avoids getting stuck in local optima in order to reach better outcomes [75]. In addition, a method based on a multi-crossover real-coded genetic algorithm was proposed to tackle NESs and compared with some evolutionary algorithms to show its superiority [76]. Furthermore, Grosan et al. [77] treated this problem as a multi-objective one, where each function represents an objective, and tried to find the non-dominated solution that minimizes all of the functions together.
In the same context, an efficient variant of the genetic algorithm (GA) was improved using symmetric and harmonious individuals and elitism to improve the population diversity and the convergence speed, respectively [78]. Particle swarm optimization was improved using a conjugate direction (CD) method to propose a new variant, namely CDPSO, for high-dimensional optimization problems [79]. Moreover, in [80], PSO was suggested as a technique for tackling NESs to overcome the disadvantages of classical methods, e.g., Newton's method. A technique based on a modified firefly algorithm was employed to deal with NESs with multiple roots [81]. In [82], another NES approach, named the parallel elite-subspace evolutionary algorithm (PESEA), was proposed to tackle NESs in a reasonable time.
The grasshopper optimization algorithm (GOA) and the genetic algorithm (GA) were effectively integrated to produce a hybrid variant, namely hybrid-GOA-GA, which could efficiently tackle NESs [83]. This hybrid variant was validated on eight benchmark problems with different applications, and its outcomes were compared with some state-of-the-art results in terms of computational cost, final results, and convergence speed; the experimental outcomes show its effectiveness on all of these criteria. Furthermore, differential evolution (DE) was improved with two methods, a new mutation strategy and a restart technique, to preserve the population diversity and avoid the local-minima stagnation suffered by the standard DE, yielding a new variant, namely DE-R, to accurately solve NESs. DE-R was validated on different real-world problems and compared with some recently proposed methods; in terms of convergence speed and accuracy, DE-R was better.
A new hybrid algorithm was recently employed for solving NESs [84]. This hybrid algorithm, called DEMBO, integrates the differential evolution (DE) algorithm into monarch butterfly optimization (MBO) to overcome MBO's defects, namely time-consuming fitness evaluations and falling into local minima. It was evaluated on nine unconstrained optimization problems and eight NESs and compared to some state-of-the-art algorithms; the experimental findings demonstrate its superiority over the competing ones. In [85], a framework based on both the grey wolf optimizer and multi-objective particle swarm optimization was proposed to tackle NESs; this framework was more effective than some classical and metaheuristic techniques. Differential evolution was also unified with the Powell conjugate direction method, to avert stagnation in local minima, yielding a system called DE-Powell for tackling NESs [86]; DE-Powell was more effective than several existing algorithms when solving nine NESs.
The cuckoo search algorithm and the niche strategy were combined to propose a strong variant called the niche cuckoo search algorithm (NCSA) for solving NESs [87]. NCSA was benchmarked on 20 well-known mathematical test functions and some NESs, and compared to three well-established metaheuristic algorithms, namely the chaos gray-coded genetic algorithm, the classical genetic algorithm, and the standard cuckoo search algorithm, showing that NCSA is more adaptable than the others for solving NESs. A hybrid algorithm incorporating cuckoo search (CS) with particle swarm optimization (PSO), to overcome the huge number of function evaluations required by CS and the local-minima defect of PSO, was proposed in a variant named CSPSO to tackle NESs [88]. CSPSO was benchmarked on some NESs and the 28 CEC2013 benchmark functions, and compared with some existing algorithms, to measure its efficiency.
The bat algorithm was improved with a differential operator and a Lévy flight strategy, to accelerate the convergence speed and avoid local minima, respectively; this improved variant was named DLBA [89]. Fourteen typical test functions and a NES were employed to benchmark the efficiency of the proposed algorithm against some other optimization algorithms; the experimental findings show that DLBA finds better solutions than all competing ones.
In [90], a comparative study among various variants of the genetic algorithm, in addition to the classical methods, was performed to see which is better for solving systems of equations; the experimental results showed that a modified GA variant was the best. The grey wolf optimizer was efficiently combined with DE to produce a new variant called GWO-DE, with strong characteristics such as avoiding local minima and accelerating the convergence speed, for solving NESs [91]. The experimental findings, as reported by the authors, proved the efficacy of GWO-DE for tackling most NESs compared to existing optimization techniques. Several other approaches have been proposed for tackling NESs [92,93,94,95].

3. Overview of Used Metaheuristic Techniques

3.1. Flower Pollination Algorithm (FPA)

Yang [27] proposed a nature-inspired metaheuristic optimization algorithm called the flower pollination algorithm (FPA), based on mimicking the pollination process of flowers. There are two kinds of pollination: self-pollination and cross-pollination. In self-pollination, fertilization occurs between flowers of the same type, where pollen from one flower fertilizes another similar one. Cross-pollination refers to the transfer of pollen over long distances between different plants by pollinators such as insects, birds, bees, and bats. It is worth mentioning that some pollinators tend to visit certain flowers while bypassing others, a phenomenon called flower constancy. Generally, the flower pollination process can be described by the following rules:
  • Biotic cross-pollination can be regarded as global pollination, used to explore the regions of the search space for finding the most promising ones; this stage is based on the Lévy distribution.
  • Abiotic self-pollination describes local pollination, utilized to exploit the regions around the current solution and accelerate the convergence speed.
  • The flower constancy property can be regarded as a reproduction ratio proportional to the degree of similarity between two flowers.
  • Local pollination has a slight advantage over global pollination due to physical proximity and wind. Specifically, local and global pollination are controlled by a switch probability P with a value between 0 and 1.
The mathematical model of global pollination and flower constancy involves the fittest pollinators, those that travel long distances, and is described as follows:
$$x_i^{t+1} = x_i^t + \gamma\, l\, (x_i^t - x^*) \tag{3}$$
where t indicates the current iteration, $x_i^t$ is the current position of the ith solution, $x^*$ is the best-so-far solution, $l$ is a step generated from the Lévy distribution, $\gamma$ is the step-size scaling factor, and $x_i^{t+1}$ expresses the next position. The mathematical model of local pollination is described as follows:
$$x_i^{t+1} = x_i^t + \epsilon\, (x_k^t - x_j^t) \tag{4}$$
where $\epsilon$ is a random value generated in the interval [0, 1] under the uniform distribution, and $x_k^t$ and $x_j^t$ are two solutions selected randomly from the current population.
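A compact sketch of one classical FPA generation, under Equations (3) and (4), may help fix the notation. Mantegna's algorithm is used here to draw Lévy steps, and the switch convention (r > p triggers global pollination) follows Algorithm 1 below; both are assumptions about implementation details the text leaves open:

```python
import numpy as np
from math import gamma as gamma_fn, pi, sin

def levy_step(dim, beta=1.5):
    # Mantegna's algorithm, one common sampler for Levy-distributed
    # steps (an implementation assumption; [27] may differ in detail).
    sigma = (gamma_fn(1 + beta) * sin(pi * beta / 2) /
             (gamma_fn((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def fpa_generation(pop, best, p=0.8, gamma=0.5):
    """One generation of the classical FPA (Equations (3) and (4))."""
    NP, D = pop.shape
    out = pop.copy()
    for i in range(NP):
        if np.random.rand() > p:              # global pollination, Eq. (3)
            out[i] = pop[i] + gamma * levy_step(D) * (pop[i] - best)
        else:                                 # local pollination, Eq. (4)
            k, j = np.random.choice(NP, 2, replace=False)
            out[i] = pop[i] + np.random.rand() * (pop[k] - pop[j])
    return out
```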

3.2. Differential Evolution

Storn and Price [96] proposed a population-based optimization algorithm named differential evolution (DE), similar to genetic algorithms in terms of its mutation, crossover, and selection operators. Before starting the optimization process, DE initializes a number of individuals $x_{i,j}^t$ ($i = 1, 2, \ldots, NP$; $j = 1, 2, \ldots, D$) within the search space of the optimization problem, where NP is the number of individuals, also called the population size, and D is the number of dimensions. Afterwards, the mutation and crossover operators are applied to explore the search space for better solutions, as described below.

3.2.1. Mutation Operator

This operator is employed by DE to generate a mutant vector $v_i^t$ for each individual $x_i^t$, called the target vector, in the population. The mutant vector is generated using the mutation strategy described below:
$$v_i^t = x_a^t + F \cdot (x_k^t - x_j^t) \tag{5}$$
where $x_a^t$ is a solution selected randomly from the population at generation t, and F is a positive scaling factor.

3.2.2. Crossover Operator

After generating the mutant vector $v_i^t$, the crossover operator is employed to generate a trial vector $u_i^t$ based on the current position of the ith individual and its corresponding mutant, according to a crossover probability (CR). This crossover operation is described as follows:
$$u_{i,j}^t = \begin{cases} v_{i,j}^t & \text{if } (r_1 \le CR) \text{ or } (j = j_r) \\ x_{i,j}^t & \text{otherwise} \end{cases} \tag{6}$$
where $j_r$ is a random integer generated between 1 and D, j indicates the current dimension, and CR is a predefined constant between 0 and 1 that determines the percentage of dimensions copied to the trial vector from the mutant one.

3.2.3. Selection Operator

Finally, the selection operator evaluates the trial vector $u_i^t$ against the current one $x_i^t$, and the fitter of the two is kept for the next generation. In general, the selection process for a minimization problem is expressed mathematically as:
$$x_i^{t+1} = \begin{cases} u_i^t & \text{if } f(u_i^t) < f(x_i^t) \\ x_i^t & \text{otherwise} \end{cases} \tag{7}$$
where f(·) indicates the objective function, often known as the fitness function.
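Putting the three operators together, the following is a minimal Python sketch of one DE/rand/1/bin generation built from Equations (5)–(7); the in-place greedy update is a standard convention assumed here:

```python
import numpy as np

def de_generation(pop, fit, f_obj, F=0.5, CR=0.9):
    """One DE generation: DE/rand/1 mutation (Eq. (5)), binomial
    crossover (Eq. (6)), and greedy selection (Eq. (7))."""
    NP, D = pop.shape
    for i in range(NP):
        others = [m for m in range(NP) if m != i]
        a, k, j = np.random.choice(others, 3, replace=False)
        v = pop[a] + F * (pop[k] - pop[j])     # mutant vector
        jr = np.random.randint(D)              # at least one gene from v
        mask = np.random.rand(D) <= CR
        mask[jr] = True
        u = np.where(mask, v, pop[i])          # trial vector
        fu = f_obj(u)
        if fu < fit[i]:                        # keep the fitter vector
            pop[i], fit[i] = u, fu
    return pop, fit
```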

4. Proposed Algorithm: Hybrid Modified FPA (HFPA)

The steps used to build the proposed algorithm for solving global optimization and NESs are described in this section: initialization, evaluation, modification, and the complete algorithm.

4.1. Initialization

Before beginning the optimization process, NP solutions are distributed between the lower-bound and upper-bound vectors of the optimization problem using the following formula:
$$\forall i \in \{1, 2, \ldots, NP\}: \quad x_i = L + r\, (U - L) \tag{8}$$
where U and L are the upper- and lower-bound vectors, and r is a vector of D cells with values generated randomly between 0 and 1. Afterward, the initialized solutions are evaluated using Equation (2) to find the best-so-far solution, which is used in the next generation to update the current population in the hope of finding a better one.
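As a sketch, Equation (8) amounts to one line of vectorized NumPy (the bounds below are illustrative, not a benchmark setting):

```python
import numpy as np

def initialize(NP, L, U):
    """Scatter NP solutions uniformly in [L, U], per Equation (8)."""
    r = np.random.rand(NP, L.shape[0])   # one random vector per solution
    return L + r * (U - L)

pop = initialize(30, np.full(10, -5.0), np.full(10, 5.0))  # 30 solutions, D = 10
```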

4.2. Modified Flower Pollination Algorithm (MFPA)

4.2.1. Global Pollination

The classical FPA models global pollination, the transfer of pollen among plants by pollinators, by updating the current position in the reverse direction of the best-so-far solution, $x^*$, to carry the pollen a long distance. However, this involves some defects, mentioned next, which might affect the performance of the FPA. Since the main goal of this stage is to carry the pollen a long distance to fertilize other plants, it is not essential to always move the current position in the reverse direction of the best-so-far solution; updating with various schemes, combined in an effective manner to carry the pollen to several regions involving various plants within the search space, might significantly benefit the optimization process. Therefore, three updating schemes, swapped effectively to carry the pollen to several regions during the optimization process, are mathematically described as follows. The first updating scheme relates the current position of each search agent to the current iteration, to help the algorithm gradually explore various regions around the current solution within the search space, even near the end of the iterations. In this case, the optimization process focuses on a local search around the current solution in the hope of finding a better one. Generally, this updating scheme is modeled as follows:
$$S_i^{t+1} = L\, (x_i^t - x^*) \tag{9}$$
$$L = \gamma \cdot l \cdot a \cdot \left( \frac{t_{max} - t}{t_{max}} \right) \tag{10}$$
$$x_i^{t+1} = \frac{t}{t_{max}}\, x_i^t + S_i^{t+1} \tag{11}$$
where $t_{max}$ indicates the maximum number of iterations and a is a distance control factor determining the distance around the current position to be explored. The second updating scheme searches around the best-so-far solution using two step sizes: the first takes the algorithm in the reverse direction of the best-so-far solution, while the other refines this direction to stay close to the best-so-far solution, to promote the exploitation operator, or to move further away, to strengthen the exploration operator. The mathematical model of this scheme is described as follows:
$$x_i^{t+1} = x^* + S_i^{t+1} + L \cdot (2\, r\, x_{r_1}^t - x_{r_2}^t) \tag{12}$$
where $x_{r_1}^t$ and $x_{r_2}^t$ are two solutions randomly selected from the population at iteration t, and r is a numerical value generated between 0 and 1 under the uniform distribution. Finally, the third updating scheme explores the regions between the current best-so-far position and its negation, based on the uniform distribution, to avoid getting stuck in local minima, as modeled mathematically below:
$$x_i^{t+1} = x^* \cdot v \tag{13}$$
$$v = U(-r_1, r_1) \tag{14}$$
where U indicates a uniform distribution that takes the lower endpoint $-r_1$ and upper endpoint $r_1$ as inputs and returns a vector of random values generated in between, with $r_1$ a value created randomly between 0 and 1. The swapping among these three updating schemes is achieved as described by the following equation, as an attempt to balance the exploration and exploitation capabilities:
$$x_i^{t+1} = \begin{cases} \frac{t}{t_{max}}\, x_i^t + S_i^{t+1} & r < 0.5 \\ x^* + S_i^{t+1} + L \cdot (2\, r\, x_{r_1}^t - x_{r_2}^t) & r \ge 0.5 \text{ and } r_1 < r_2 \\ x^* \cdot v & r \ge 0.5 \text{ and } r_1 \ge r_2 \end{cases} \tag{15}$$
where r, $r_1$, and $r_2$ are numerical values generated randomly between 0 and 1.
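A sketch of the three-scheme swap of Equation (15) in Python may clarify the control flow; levy_step() is the sampler sketched in Section 3.1, and treating l as a vector Lévy step is an implementation assumption:

```python
import numpy as np

def mfpa_global(x_i, best, pop, t, t_max, gamma=0.5, a=0.8):
    """Modified global pollination: one of the three updating
    schemes of Equations (11)-(13), swapped per Equation (15)."""
    D = x_i.shape[0]
    L = gamma * levy_step(D) * a * (t_max - t) / t_max   # Eq. (10)
    S = L * (x_i - best)                                  # Eq. (9)
    r, r1, r2 = np.random.rand(3)
    if r < 0.5:                                           # scheme 1, Eq. (11)
        return (t / t_max) * x_i + S
    if r1 < r2:                                           # scheme 2, Eq. (12)
        xa, xb = pop[np.random.choice(len(pop), 2, replace=False)]
        return best + S + L * (2.0 * r * xa - xb)
    v = np.random.uniform(-r1, r1, D)                     # Eq. (14)
    return best * v                                       # scheme 3, Eq. (13)
```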

4.2.2. Local Pollination

Regarding the modification of the mathematical model at this stage, our idea is to design one using two schemes exchanged with a probability of 0.5 to balance between them. The first searches around the current position, scaled according to the current iteration, to promote the search ability of the algorithm within the search space and avoid getting stuck in local minima. The second searches around the best-so-far solution, also scaled according to the current iteration, to improve the exploitation operator and accelerate the convergence speed toward the near-optimal solution.
$$x_i^{t+1} = \frac{t}{t_{max}}\, x_i^t + \epsilon \cdot (x_k^t - x_j^t) \tag{16}$$
$$x_i^{t+1} = \frac{t}{t_{max}}\, x^* + \epsilon \cdot (x_k^t - x_j^t) + \epsilon_1 \cdot (x_m^t - x_n^t) \tag{17}$$
$$x_i^{t+1} = \begin{cases} \text{apply Equation (16)} & r < 0.5 \\ \text{apply Equation (17)} & \text{otherwise} \end{cases} \tag{18}$$
where $x_m^t$ and $x_n^t$ are two solutions selected randomly from the population, and r is a random number generated between 0 and 1. Finally, Algorithm 1 shows the steps of the modified FPA (MFPA); the same steps are depicted in Figure 2.
Algorithm 1 The Steps of MFPA
1. Initialization step.
2. Evaluation.
3. while (t < t_max)
4.   For (i = 1: NP)
5.     r: create a random number between 0 and 1.
6.     if (r > p)
7.       Update x_i using Equation (15)
8.     Else
9.       Update x_i using Equation (18)
10.     End if
11.   End for
12.   Evaluation step.
13.   t = t + 1;
14. end while
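The following is a minimal Python rendering of Algorithm 1, reusing initialize() and mfpa_global() from above; the local-pollination helper implements Equations (16)–(18), while bound clipping and greedy replacement in the evaluation step are assumptions the pseudocode leaves unspecified:

```python
import numpy as np

def mfpa_local(x_i, best, pop, t, t_max):
    """Modified local pollination: Equations (16) and (17),
    exchanged with probability 0.5 per Equation (18)."""
    k, j, m, n = np.random.choice(len(pop), 4, replace=False)
    eps, eps1 = np.random.rand(2)
    if np.random.rand() < 0.5:                              # Eq. (16)
        return (t / t_max) * x_i + eps * (pop[k] - pop[j])
    return ((t / t_max) * best + eps * (pop[k] - pop[j])    # Eq. (17)
            + eps1 * (pop[m] - pop[n]))

def mfpa_generation(pop, fit, best, t, t_max, p, f_obj, L, U):
    """Lines 4-12 of Algorithm 1: update every solution, then re-evaluate."""
    for i in range(len(pop)):
        if np.random.rand() > p:
            cand = mfpa_global(pop[i], best, pop, t, t_max)
        else:
            cand = mfpa_local(pop[i], best, pop, t, t_max)
        cand = np.clip(cand, L, U)            # assumed bound handling
        fc = f_obj(cand)
        if fc < fit[i]:                       # assumed greedy replacement
            pop[i], fit[i] = cand, fc
    best = pop[fit.argmin()].copy()
    return pop, fit, best

def mfpa(f_obj, L, U, NP=30, t_max=500, p=0.4):
    """Full MFPA loop (Algorithm 1) with the settings of Section 5.1."""
    pop = initialize(NP, L, U)
    fit = np.array([f_obj(x) for x in pop])
    best = pop[fit.argmin()].copy()
    for t in range(1, t_max):
        pop, fit, best = mfpa_generation(pop, fit, best, t, t_max, p, f_obj, L, U)
    return best, fit.min()
```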

4.3. Hybridization of MFPA with DE(HFPA)

Unfortunately, MFPA still suffers from a lack of population diversity, which pulls the algorithm into local minima so that it cannot reach the near-optimal solution. Therefore, DE is integrated into MFPA with a probability $p_1$, picked experimentally as shown in the experiments section later, to take the algorithm into other regions and preserve the population diversity for achieving better outcomes. Finally, the steps of integrating MFPA with DE are listed in Algorithm 2, and its framework is described in Figure 3.
Algorithm 2 The Steps of HFPA
1. Initialization step.
2. Evaluation.
3. while (t < t_max)
4.   For (i = 1: NP)
5.     r: create a random number between 0 and 1.
6.     if (r > p)
7.       Update x_i using Equation (15)
8.     Else
9.       Update x_i using Equation (18)
10.     End if
11.   End for
12.   Evaluation step.
13.   t = t + 1;
14.   /// Applying differential evolution
15.   if (r < p_1)
16.     For (i = 1: NP)
17.       Creating a mutant vector v_i^t for x_i using Equation (5)
18.       Applying crossover operator.
19.       Applying selection operator.
20.     End for
21.     t = t + 1;
22.   End if
23. end while
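In the same sketch style, Algorithm 2 only adds a probability-gated DE pass after each MFPA generation, reusing mfpa_generation() and de_generation() from above; counting the DE pass as an extra iteration follows line 21 of the pseudocode:

```python
import numpy as np

def hfpa(f_obj, L, U, NP=30, t_max=500, p=0.4, p1=0.5):
    """Skeleton of Algorithm 2 (HFPA = MFPA + probability-gated DE)."""
    pop = initialize(NP, L, U)
    fit = np.array([f_obj(x) for x in pop])
    best = pop[fit.argmin()].copy()
    t = 1
    while t < t_max:
        pop, fit, best = mfpa_generation(pop, fit, best, t, t_max, p, f_obj, L, U)
        t += 1
        if np.random.rand() < p1:             # lines 15-22: DE/rand/1 pass
            pop, fit = de_generation(pop, fit, f_obj)
            best = pop[fit.argmin()].copy()
            t += 1                            # the DE pass counts as an iteration
    return best, fit.min()
```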

5. Outcomes and Discussion

This section assesses the performance of the proposed algorithms using two independent experiments: the first checks their ability to search for the near-optimal solution of 23 well-known mathematical test functions, and the second employs them to estimate the roots of 27 common NESs. Specifically, this section is organized as follows:
Section 5.1 shows the parameter settings and benchmark test functions.
Section 5.2 presents validation and comparison under 23 global optimization problems.
Section 5.3 presents validation and comparison under 27 NESs.

5.1. Parameter Settings

The proposed algorithms have been compared with eight well-known, recently published metaheuristic algorithms: the equilibrium optimizer (EO, 2020) [16], the marine predators algorithm (MPA, 2020) [21], particle swarm optimization (PSO) [19], differential evolution (DE) [96], the horse herd optimization algorithm (HOA, 2020) [97], the slime mould algorithm (SMA, 2020) [22], the Runge Kutta based optimizer (RUN, 2021) [98], and the classical FPA [27]. These algorithms were implemented in MATLAB with the same parameter values found in the cited papers, which are the original studies for those algorithms.
The parameters of each compared algorithm were assigned at implementation as reported in the published papers. The proposed algorithms, MFPA and HFPA, have three effective parameters that need to be picked optimally to maximize their performance: p, a, and $p_1$. After executing different experiments with different values for each parameter on different test functions, the best value for p is 0.4, as shown in Figure 4a,b, and the best values for a and $p_1$ are 0.8 and 0.5, as shown in Figure 4c,d. The parameter $\gamma$ of the proposed algorithms is set to 0.5 to increase the step size and strengthen the exploration operator, while the parameters CR and F are set to 0.9 and 0.5, respectively, as described in [99]. All algorithms were executed for 30 independent runs with a population size of 30 and a maximum of 500 iterations on the same machine to ensure a fair comparison.
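For reference, the tuned settings just described can be collected in one place (a convenience snippet, not code from the paper):

```python
# Settings reported in Section 5.1 for MFPA/HFPA and the shared budget.
SETTINGS = {
    "p": 0.4,         # global/local pollination switch
    "a": 0.8,         # distance control factor (Eq. (10))
    "p1": 0.5,        # probability of the DE pass (HFPA only)
    "gamma": 0.5,     # step-size scaling factor
    "CR": 0.9,        # DE crossover probability [99]
    "F": 0.5,         # DE scaling factor [99]
    "NP": 30,         # population size (all algorithms)
    "t_max": 500,     # maximum iterations (all algorithms)
    "runs": 30,       # independent runs
}
```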
The algorithms are compared here on two benchmarks: the first consists of eight well-known unimodal mathematical test functions and 15 multimodal ones, as described in Table 1, which has four columns: "Name" gives the function name, "Formula" shows the mathematical equation of each function, "D" carries the number of dimensions, and "R" gives the search range of each function. The second benchmark involves 27 widely used NESs, defined in Table 2. The landscapes of the unimodal and multimodal functions are depicted in Figure 5 to display the difference between the two.

5.2. Comparison of the Global Optimization

This section compares the performance of the standard FPA, MFPA, and HFPA with each other and with seven well-known swarm and evolutionary algorithms, to see how far our modifications to the standard FPA positively affect its performance on 23 well-known unimodal and multimodal functions. Due to the stochastic nature of these algorithms, they were executed for 30 independent runs, and the best, average (Avg), worst, and standard deviation (SD) of the fitness values obtained by each were calculated and reported in Table 3 and Table 4. Inspecting Table 3 and Table 4 shows that both MFPA and HFPA outperform the classical FPA on all test functions, affirming that our modifications help the standard algorithm reach regions unreachable by the classical one. Not only does HFPA outperform the standard algorithm, it is also superior or competitive to the rival algorithms and MFPA on 16 out of 23 test functions, a percentage of up to 70%, as shown in Figure 6, across all independent runs. Moreover, among the other seven test functions, it reaches a lower value in the best case on two, for a total percentage of 78%, as depicted in Figure 6, and its performance is very close on the rest. This superiority of HFPA is due to preserving the population diversity among the individuals throughout the optimization process, which helps it avoid the local-minima stagnation that prevents reaching better outcomes. Based on that, HFPA is a strong alternative metaheuristic algorithm to the existing ones for tackling optimization problems.
Furthermore, the convergence curve obtained by each algorithm, in log scale, is presented in Figure 7 for nine randomly selected test functions, to show whether any of them needs fewer iterations to reach the optimal solution. This figure shows that MFPA is superior for F2, F3, F6, F12, and F22, while MFPA and HFPA are competitive with each other and superior to the others for the remainder, except F7, on which RUN converges better. This figure, which elaborates the superiority of MFPA for most test functions, affirms that MFPA has a better exploitation operator than HFPA; this might not be effective for some optimization problems, which need higher exploration capabilities to cover as much of the search space as possible to reach the best solution.
Finally, to assess the speed of our proposed algorithms, Figure 8 shows the average computational cost consumed by each algorithm over all test functions within the independent runs. It affirms that HFPA and MFPA are almost competitive with PSO and superior to the others, except for EO and FPA, which need less computational cost but perform worse than the proposed algorithms.

5.3. Comparison of the NESs

As a case study, the NESs are herein solved by the proposed algorithms, MFPA and HFPA, and their outcomes are compared with those of eight well-established optimization algorithms to gauge their superiority for tackling these equations. The proposed algorithms, in addition to the others, were executed for 30 independent runs, and the analyzed outcomes are exhibited in Table 5 and Table 6. They show that HFPA has superior or equal performance on 18 out of 27 test functions, a percentage of 67%, as found in Figure 9, better than MFPA, which is superior or competitive on only 10 NESs, a success proportion of 37%, making it the second-best algorithm across all independent runs. In the best case, HFPA is competitive with or superior to the competing algorithms on all employed NESs, a proportion of 100%, as found in Figure 9. This confirms the efficiency of integrating DE with MFPA, which gives this hybrid variant a higher influence for exploration over exploitation, preserving the population diversity, preventing stagnation in local minima, and hence reaching better outcomes along the optimization process.
Furthermore, the convergence curves obtained by each algorithm, in log scale, are presented in Figure 10 to show whether any of them needs fewer iterations to reach the optimal solution. This figure shows that MFPA is superior for f1, f4, f13, f15, and f17, and HFPA for f2, f3, f7, f10, f11, and f12, while both are competitive with the others for the remaining functions depicted in Figure 10. Moreover, Figure 10 shows that MFPA has better exploitation capability, because it reaches the optimal solution in significantly fewer iterations than the other competing algorithms need. On f3, MFPA has a worse convergence speed than most of the others because of its weakened exploration operator, which would otherwise preserve the population diversity needed to explore several regions throughout the optimization process in the hope of locating the region that contains the near-optimal solution. On the contrary, HFPA on f3 is superior to all the others in terms of convergence speed, which indicates that HFPA balances the exploration and exploitation capabilities to avoid getting stuck in local minima and to accelerate the convergence toward the near-optimal solution.
In addition, Figure 11 affirms that the average computational cost consumed by the proposed algorithms is superior to HOA, DE, MPA, RUN, and SMA, and competitive with PSO; unfortunately, however, they consume almost twice the time of both EO and FPA, which remains our main future challenge.

5.4. Comparison between FPA Variants on NESs

In this section, the proposed algorithms, HFPA and MFPA, in addition to the classical FPA, are compared with each other by drawing boxplots of the outcomes obtained by each algorithm on various test functions, exposed in Figure 12. From this figure, it is obvious that both MFPA and HFPA are better than the classical FPA on all test functions, and that HFPA and MFPA have competitive performance with each other.

6. Conclusions and Future Work

In this paper, the classical FPA is modified to improve its global pollination, exploring more regions of the search space to avoid getting stuck in local minima. In addition, the local pollination is modified to enhance the exploitation capability, searching extensively around the best-so-far solution to accelerate the convergence speed toward the near-optimal solution. This modified variant is abbreviated MFPA. Furthermore, the differential evolution algorithm was effectively integrated with this modified variant, as an attempt to develop a new one, namely HFPA, with a strong exploration operator. The proposed algorithms, MFPA and HFPA, the classical FPA, and seven well-known metaheuristic algorithms were extensively assessed using 23 unimodal and multimodal mathematical test functions and 27 widely used nonlinear equation systems, and their outcomes were statistically analyzed and compared with each other. The experimental findings affirm that both MFPA and HFPA are significantly competitive with each other and dramatically superior to the standard FPA. Moreover, these findings show that MFPA and HFPA are superior or competitive to the well-known compared metaheuristic algorithms in terms of final accuracy, computational cost, and convergence speed. Our future work involves proposing binary variants of the two proposed algorithms for tackling the 0-1 knapsack problem, feature selection, and cryptanalysis of ciphertext, in addition to proposing a combinatorial algorithm to tackle the DNA fragment assembly problem. Moreover, in the future, we will search for a new strategy to balance the exploration and exploitation operators of this modified variant to fulfill better outcomes.

Author Contributions

Conceptualization, M.A.-B., R.M., S.S.A. and M.A.; methodology, M.A.-B., R.M., S.S., M.A.; software, M.A.-B., R.M., M.A.; validation, M.A.-B., S.S. and S.S.A.; formal analysis, M.A.-B., R.M. and M.A.; investigation, S.S.A., S.S. and M.A.; resources, M.A.-B., M.A. and R.M.; data curation, M.A.-B., S.S.A., R.M. and M.A.; writing—original draft preparation, M.A.-B., R.M. and M.A.; writing—review and editing, S.S.A., S.S. and M.A.; visualization, M.A.-B., M.A. and R.M.; supervision, M.A.-B., M.A. and S.S.A.; project administration, M.A.-B., S.S., R.M. and M.A.; funding acquisition, S.S.A. All authors have read and agreed to the published version of the manuscript.

Funding

This project is funded by King Saud University, Riyadh, Saudi Arabia.

Data Availability Statement

Not Applicable.

Acknowledgments

Research Supporting Project number (RSP-2021/167), King Saud University, Riyadh, Saudi Arabia.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  2. Holland, J.H. Genetic Algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
  3. Rechenberg, I. Evolutionsstrategien. In Simulationsmethoden in der Medizin und Biologie; Springer: Berlin/Heidelberg, Germany, 1978; pp. 83–114. [Google Scholar]
  4. Banzhaf, W.; Nordin, P.; Keller, R.E.; Francone, F.D. Genetic Programming: An Introduction; Morgan Kaufmann Publishers: San Francisco, CA, USA, 1998; Volume 1. [Google Scholar]
  5. Dasgupta, D.; Michalewicz, Z. Evolutionary Algorithms in Engineering Applications; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
  6. Simon, D. Biogeography-Based Optimization. IEEE Trans. Evol. Comput. 2008, 12, 702–713. [Google Scholar] [CrossRef] [Green Version]
  7. Van Laarhoven, P.J.M.; Aarts, E.H.L. Simulated Annealing: Theory and Applications; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 1987; pp. 7–15. [Google Scholar]
  8. Erol, O.K.; Eksin, I. A New Optimization Method: Big Bang–Big Crunch. Adv. Eng. Softw. 2006, 37, 106–111. [Google Scholar] [CrossRef]
  9. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  10. Du, H.; Wu, X.; Zhuang, J. Small-World Optimization Algorithm for Function Optimization. In Proceedings of the Transactions on Petri Nets and Other Models of Concurrency XV; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2006; pp. 264–273. [Google Scholar]
  11. Moghaddam, F.F.; Moghaddam, R.F.; Cheriet, M. Curved Space Optimization: A Random Search Based on General Relativity Theory. arXiv 2012, arXiv:1208.2214. [Google Scholar]
  12. Hosseini, H.S. Principal Components Analysis by the Galaxy-Based Search Algorithm: A Novel Metaheuristic for Continuous Optimisation. Int. J. Comput. Sci. Eng. 2011, 6, 132. [Google Scholar] [CrossRef]
  13. Kaveh, A.; Talatahari, S. A Novel Heuristic Optimization Method: Charged System Search. Acta Mech. 2010, 213, 267–289. [Google Scholar] [CrossRef]
  14. Van Tran, T.; Wang, Y. Artificial Chemical Reaction Optimization Algorithm and Neural Network Based Adaptive Control for Robot Manipulator. J. Control. Eng. Appl. Inform. 2017, 19, 61–70. [Google Scholar]
  15. Kaveh, A.; Khayatazad, M. A New Meta-Heuristic Method: Ray Optimization. Comput. Struct. 2012, 112-113, 283–294. [Google Scholar] [CrossRef]
  16. Faramarzi, A.; Heidarinejad, M.; Stephens, B.; Mirjalili, S. Equilibrium Optimizer: A Novel Optimization Algorithm. Knowl. Based Syst. 2020, 191, 105190. [Google Scholar] [CrossRef]
  17. Kaveh, A.; Khanzadi, M.; Moghaddam, M.R. Billiards-Inspired Optimization Algorithm; A New Meta-Heuristic Method. Structures 2020, 27, 1722–1739. [Google Scholar] [CrossRef]
  18. Hatamlou, A. Black Hole: A New Heuristic Optimization Approach for Data Clustering. Inf. Sci. 2013, 222, 175–184. [Google Scholar] [CrossRef]
  19. Kennedy, J.; Eberhart, R. Particle Swarm Optimization. In Proceedings of the ICNN’95 International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  20. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris Hawks Optimization: Algorithm and Applications. Futur. Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  21. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine Predators Algorithm: A Nature-Inspired Metaheuristic. Expert Syst. Appl. 2020, 152, 113377. [Google Scholar] [CrossRef]
  22. Li, S.; Chen, H.; Wang, M.; Heidari, A.A.; Mirjalili, S. Slime Mould Algorithm: A New Method for Stochastic Optimization. Future Gener. Comput. Syst. 2020, 111, 300–323. [Google Scholar] [CrossRef]
  23. Dorigo, M.; Birattari, M.; Stutzle, T. Ant Colony Optimization. IEEE Comput. Intell. Mag. 2006, 1, 28–39. [Google Scholar] [CrossRef] [Green Version]
  24. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef] [Green Version]
  25. Gandomi, A.H.; Yang, X.-S.; Alavi, A.H. Erratum to: Cuckoo Search Algorithm: A Metaheuristic Approach to Solve Structural Optimization Problems. Eng. Comput. 2013, 29, 245. [Google Scholar] [CrossRef] [Green Version]
  26. Yang, X.-S.; He, X. Bat Algorithm: Literature Review and Applications. Int. J. Bio-Inspired Comput. 2013, 5, 141–149. [Google Scholar] [CrossRef] [Green Version]
  27. Yang, X.-S. Flower Pollination Algorithm for Global Optimization. In Unconventional Computation and Natural Computation; Springer: Berlin/Heidelberg, Germany, 2012; pp. 240–249. [Google Scholar]
  28. Khishe, M.; Mosavi, M. Chimp Optimization Algorithm. Expert Syst. Appl. 2020, 149, 113338. [Google Scholar] [CrossRef]
  29. Chu, S.-C.; Tsai, P.-W.; Pan, J.-S. Cat Swarm Optimization. In Proceedings of the 9th Pacific Rim International Conference on Artificial Intelligence, Guilin, China, 7–11 August 2006. [Google Scholar]
  30. Meng, X.-B.; Liu, Y.; Gao, X.; Zhang, H. A New Bio-inspired Algorithm: Chicken Swarm Optimization. In Proceedings of the 5th International Conference (ICSI 2014), Hefei, China, 17–20 October 2014; pp. 86–94. [Google Scholar]
  31. Bansal, J.C.; Sharma, H.; Jadon, S.S.; Clerc, M. Spider Monkey Optimization Algorithm for Numerical Optimization. Memetic Comput. 2014, 6, 31–47. [Google Scholar] [CrossRef]
  32. Gandomi, A.H.; Alavi, A.H. Krill Herd: A New Bio-Inspired Optimization Algorithm. Commun. Nonlinear Sci. Numer. Simul. 2012, 17, 4831–4845. [Google Scholar] [CrossRef]
  33. Xing, B.; Gao, W.-J. Innovative Computational Intelligence: A Rough Guide to 134 Clever Algorithms; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2014; pp. 167–170. [Google Scholar]
  34. Kaveh, A.; Farhoudi, N. A New Optimization Method: Dolphin Echolocation. Adv. Eng. Softw. 2013, 59, 53–70. [Google Scholar] [CrossRef]
  35. Oftadeh, R.; Mahjoob, M.; Shariatpanahi, M. A Novel Meta-Heuristic Optimization Algorithm Inspired by Group Hunting of Animals: Hunting Search. Comput. Math. Appl. 2010, 60, 2087–2098. [Google Scholar] [CrossRef] [Green Version]
  36. Yang, X.-S. Firefly Algorithms for Multimodal Optimization. In International Symposium on Stochastic Algorithms; Springer: Berlin/Heidelberg, Germany, 2009; pp. 169–178. [Google Scholar]
  37. Shiqin, Y.; Jianjun, J.; Guangxing, Y. A Dolphin Partner Optimization. In Proceedings of the 2009 WRI Global Congress on Intelligent Systems, Xiamen, China, 19–21 May 2009; Volume 1, pp. 124–128. [Google Scholar] [CrossRef]
  38. Lu, X.; Zhou, Y. A Novel Global Convergence Algorithm: Bee Collecting Pollen Algorithm. In Proceedings of the 4th International Conference on Intelligent Computing, ICIC 2008, Shanghai, China, 15–18 September 2008; pp. 518–525. [Google Scholar]
  39. Wu, H.; Zhang, F.-M. Wolf Pack Algorithm for Unconstrained Global Optimization. Math. Probl. Eng. 2014, 2014, 465082. [Google Scholar] [CrossRef] [Green Version]
  40. Pilat, M.L. Wasp-Inspired Construction Algorithms; University of Calgary: Calgary, AB, Canada, 2006. [Google Scholar]
  41. Rao, R.V. Teaching-Learning-Based Optimization Algorithm. In Teaching Learning Based Optimization Algorithm; Springer: Berlin/Heidelberg, Germany, 2016; pp. 9–39. [Google Scholar]
  42. Geem, Z.W.; Kim, J.H.; Loganathan, G. A New Heuristic Optimization Algorithm: Harmony Search. Simulation 2001, 76, 60–68. [Google Scholar] [CrossRef]
  43. Kashan, A.H. League Championship Algorithm: A New Algorithm for Numerical Function Optimization. In Proceedings of the 2009 International Conference of Soft Computing and Pattern Recognition, Malacca, Malaysia, 4–7 December 2009; pp. 43–48. [Google Scholar]
  44. Eita, M.; Fahmy, M. Group Counseling Optimization. Appl. Soft Comput. 2014, 22, 585–604. [Google Scholar] [CrossRef]
  45. Eita, M.A.; Fahmy, M.M. Group Counseling Optimization: A Novel Approach. In Research and Development in Intelligent Systems XXVI; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2009; pp. 195–208. [Google Scholar]
  46. Sadollah, A.; Bahreininejad, A.; Eskandar, H.; Hamdi, M. Mine Blast Algorithm: A New Population Based Algorithm for Solving Constrained Engineering Optimization Problems. Appl. Soft Comput. 2013, 13, 2592–2612. [Google Scholar] [CrossRef]
  47. Dai, C.; Zhu, Y.; Chen, W. Seeker Optimization Algorithm. In International Conference on Computational and Information Science; Springer: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
  48. Moosavian, N.; Roodsari, B.K. Soccer League Competition Algorithm: A New Method for Solving Systems of Nonlinear Equations. Int. J. Intell. Sci. 2013, 4, 7. [Google Scholar] [CrossRef]
  49. Moosavian, N.; Roodsari, B.K. Soccer League Competition Algorithm: A Novel Meta-Heuristic Algorithm for Optimal Design of Water Distribution Networks. Swarm Evol. Comput. 2014, 17, 14–24. [Google Scholar] [CrossRef]
  50. Tan, Y.; Zhu, Y. Fireworks Algorithm for Optimization. In Proceedings of the Transactions on Petri Nets and Other Models of Concurrency XV; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2010; pp. 355–364. [Google Scholar]
  51. Moosavi, S.H.S.; Bardsiri, V.K. Poor and Rich Optimization Algorithm: A New Human-Based and Multi Populations Algorithm. Eng. Appl. Artif. Intell. 2019, 86, 165–181. [Google Scholar] [CrossRef]
  52. Liao, Z.; Gong, W.; Wang, L. Memetic Niching-Based Evolutionary Algorithms for Solving Nonlinear Equation System. Expert Syst. Appl. 2020, 149, 113261. [Google Scholar] [CrossRef]
  53. Wetweerapong, J.; Puphasuk, P. An Improved Differential Evolution Algorithm with a Restart Technique to Solve Systems of Nonlinear Equations. Int. J. Optim. Control. Theor. Appl. 2020, 10, 118–136. [Google Scholar] [CrossRef] [Green Version]
  54. Boussaïd, I.; Lepagnot, J.; Siarry, P. A Survey on Optimization Metaheuristics. Inf. Sci. 2013, 237, 82–117. [Google Scholar] [CrossRef]
  55. Siddique, N.H.; Adeli, H. Nature Inspired Computing: An Overview and Some Future Directions. Cogn. Comput. 2015, 7, 706–714. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  56. Nabil, E. A Modified Flower Pollination Algorithm for Global Optimization. Expert Syst. Appl. 2016, 57, 192–203. [Google Scholar] [CrossRef]
  57. Yang, X.-S.; Karamanoglu, M.; He, X. Flower Pollination Algorithm: A Novel Approach for Multiobjective Optimization. Eng. Optim. 2014, 46, 1222–1237. [Google Scholar] [CrossRef] [Green Version]
  58. Nigdeli, S.M.; Bekdaş, G.; Yang, X.-S. Application of the Flower Pollination Algorithm in Structural Engineering. In Internet of Things (IoT) in 5G Mobile Technologies; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2016; pp. 25–42. [Google Scholar]
  59. Rodrigues, D.; Yang, X.-S.; de Souza, A.N.; Papa, J.P. Binary Flower Pollination Algorithm and Its Application to Feature Selection. In Econometrics for Financial Applications; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2015; pp. 85–100. [Google Scholar]
  60. Salgotra, R.; Singh, U. Application of Mutation Operators to Flower Pollination Algorithm. Expert Syst. Appl. 2017, 79, 112–129. [Google Scholar] [CrossRef]
  61. Wang, R.; Zhou, Y.; Qiao, S.; Huang, K. Flower Pollination Algorithm with Bee Pollinator for Cluster Analysis. Inf. Process. Lett. 2016, 116, 1–14. [Google Scholar] [CrossRef] [Green Version]
  62. Storn, R. On the Usage of Differential Evolution for Function Optimization. In Proceedings of the North American Fuzzy Information Processing, Berkeley, CA, USA, 19–22 June 1996; pp. 519–523. [Google Scholar] [CrossRef]
  63. Mohamed, A.W.; Sabry, H.Z.; Khorshid, M. An Alternative Differential Evolution Algorithm for Global Optimization. J. Adv. Res. 2012, 3, 149–165. [Google Scholar] [CrossRef] [Green Version]
  64. Karaboğa, D.; Ökdem, S. A Simple and Global Optimization Algorithm for Engineering Problems: Differential Evolution Algorithm. Turk. J. Electr. Eng. Comput. Sci. 2004, 12, 53–60. [Google Scholar]
  65. Das, S.; Mandal, A.; Mukherjee, R. An Adaptive Differential Evolution Algorithm for Global Optimization in Dynamic Environments. IEEE Trans. Cybern. 2013, 44, 966–978. [Google Scholar] [CrossRef]
  66. Huang, Z.; Wang, C.-E.; Ma, M. A Robust Archived Differential Evolution Algorithm for Global Optimization Problems. J. Comput. 2009, 4, 160–167. [Google Scholar] [CrossRef]
  67. Pant, M.; Ali, M.; Singh, V.P. Parent-Centric Differential Evolution Algorithm for Global Optimization Problems. Opsearch 2009, 46, 153–168. [Google Scholar] [CrossRef]
  68. Choi, T.J.; Ahn, C.W.; An, J. An Adaptive Cauchy Differential Evolution Algorithm for Global Numerical Optimization. Sci. World J. 2013, 2013, 969734. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  69. Kumar, P.; Pant, M. A Self Adaptive Differential Evolution Algorithm for Global Optimization. In Proceedings of the Computer Vision; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2010; Volume 6466, pp. 103–110. [Google Scholar]
  70. Sun, J.; Zhang, Q.; Tsang, E.P.K. DE/EDA: A New Evolutionary Algorithm for Global Optimization. Inf. Sci. 2005, 169, 249–262. [Google Scholar] [CrossRef]
  71. Brest, J.; Zamuda, A.; Fister, I.; Maucec, M.S. Large Scale Global Optimization Using Self-Adaptive Differential Evolution Algorithm. In Proceedings of the IEEE Congress on Evolutionary Computation, Barcelona, Spain, 18–23 July 2010; pp. 1–8. [Google Scholar]
  72. Yi, W.; Gao, L.; Li, X.; Zhou, Y. A New Differential Evolution Algorithm with a Hybrid Mutation Operator and Self-Adapting Control Parameters for Global Optimization Problems. Appl. Intell. 2014, 42, 642–660. [Google Scholar] [CrossRef]
  73. Brest, J.; Zumer, V.; Maucec, M. Self-Adaptive Differential Evolution Algorithm in Constrained Real-Parameter Optimization. In Proceedings of the 2006 IEEE International Conference on Evolutionary Computation, Vancouver, BC, Canada, 16–21 July 2006. [Google Scholar]
  74. Ramadas, G.C.; Fernandes, E.M.d.G. Solving Systems of Nonlinear Equations by Harmony Search. In Proceedings of the 13th International Conference on Mathematical Methods in Science and Engineering, Almeria, Spain, 24–27 June 2013. [Google Scholar]
  75. Wu, J.; Cui, Z.; Liu, J. Using Hybrid Social Emotional Optimization Algorithm with Metropolis Rule to Solve Nonlinear Equations. In Proceedings of the IEEE 10th International Conference on Cognitive Informatics and Cognitive Computing (ICCI-CC’11), Banff, AB, Canada, 18–20 August 2011; pp. 405–411. [Google Scholar]
  76. Chang, W.-D. An Improved Real-Coded Genetic Algorithm for Parameters Estimation of Nonlinear Systems. Mech. Syst. Signal. Process. 2006, 20, 236–246. [Google Scholar] [CrossRef]
  77. Grosan, C.; Abraham, A. A New Approach for Solving Nonlinear Equations Systems. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2008, 38, 698–714. [Google Scholar] [CrossRef]
  78. Ren, H.; Wu, L.; Bi, W.; Argyros, I.K. Solving Nonlinear Equations System via an Efficient Genetic Algorithm with Symmetric and Harmonious Individuals. Appl. Math. Comput. 2013, 219, 10967–10973. [Google Scholar] [CrossRef]
  79. Mo, Y.; Liu, H.; Wang, Q. Conjugate Direction Particle Swarm Optimization Solving Systems of Nonlinear Equations. Comput. Math. Appl. 2009, 57, 1877–1882. [Google Scholar] [CrossRef] [Green Version]
  80. Jaberipour, M.; Khorram, E.; Karimi, B. Particle Swarm Algorithm for Solving Systems of Nonlinear Equations. Comput. Math. Appl. 2011, 62, 566–576. [Google Scholar] [CrossRef] [Green Version]
  81. Ariyaratne, M.; Fernando, T.; Weerakoon, S. Solving Systems of Nonlinear Equations Using a Modified Firefly Algorithm (MODFA). Swarm Evol. Comput. 2019, 48, 72–92. [Google Scholar] [CrossRef]
  82. Wu, Z.; Kang, L. A Fast and Elitist Parallel Evolutionary Algorithm for Solving Systems of Non-Linear Equations. In Proceedings of the 2003 Congress on Evolutionary Computation, CEC ’03, Canberra, Australia, 8–12 December 2003. [Google Scholar]
  83. El-Shorbagy, M.A.; El-Refaey, A.M. Hybridization of Grasshopper Optimization Algorithm With Genetic Algorithm for Solving System of Non-Linear Equations. IEEE Access 2020, 8, 220944–220961. [Google Scholar] [CrossRef]
  84. Ibrahim, A.M.; Tawhid, M.A. A Hybridization of Differential Evolution and Monarch Butterfly Optimization for Solving Systems of Nonlinear Equations. J. Comput. Des. Eng. 2018, 6, 354–367. [Google Scholar] [CrossRef]
  85. Pant, S.; Kumar, A.; Ram, M. Solution of Nonlinear Systems of Equations via Metaheuristics. Int. J. Math. Eng. Manag. Sci. 2019, 4, 1108–1126. [Google Scholar] [CrossRef]
  86. Ibrahim, A.M.; Tawhid, M.A. Conjugate Direction DE Algorithm for Solving Systems of Nonlinear Equations. Appl. Math. Inf. Sci. 2017, 11, 339–352. [Google Scholar] [CrossRef]
  87. Zhang, X.; Wan, Q.; Fan, Y. Applying Modified Cuckoo Search Algorithm for Solving Systems of Nonlinear Equations. Neural Comput. Appl. 2017, 31, 553–576. [Google Scholar] [CrossRef]
  88. Ibrahim, A.M.; Tawhid, M.A. A Hybridization of Cuckoo Search and Particle Swarm Optimization for Solving Nonlinear Systems. Evol. Intell. 2019, 12, 541–561. [Google Scholar] [CrossRef]
  89. Xie, J.; Zhou, Y.; Chen, H. A Novel Bat Algorithm Based on Differential Operator and Lévy Flights Trajectory. Comput. Intell. Neurosci. 2013, 2013, 1–13. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  90. Hassan, O.F.; Jamal, A.; Abdel-Khalek, S. Genetic Algorithm and Numerical Methods for Solving Linear and Nonlinear System of Equations: A Comparative Study. J. Intell. Fuzzy Syst. 2020, 38, 2867–2872. [Google Scholar] [CrossRef]
  91. Tawhid, M.A.; Ibrahim, A.M. A Hybridization of Grey Wolf Optimizer and Differential Evolution for Solving Nonlinear Systems. Evol. Syst. 2019, 11, 65–87. [Google Scholar] [CrossRef]
  92. Luo, Y.-Z.; Tang, G.-J.; Zhou, L.-N. Hybrid Approach for Solving Systems of Nonlinear Equations Using Chaos Optimization and Quasi-Newton Method. Appl. Soft Comput. 2008, 8, 1068–1073. [Google Scholar] [CrossRef]
  93. Wu, J.; Gong, W.; Wang, L. A Clustering-Based Differential Evolution with Different Crowding Factors for Nonlinear Equations system. Appl. Soft Comput. 2021, 98, 106733. [Google Scholar] [CrossRef]
  94. Mangla, C.; Ahmad, M.; Uddin, M. Optimization of Complex Nonlinear Systems Using Genetic Algorithm. Int. J. Inf. Technol. 2020. [Google Scholar] [CrossRef]
  95. Pourjafari, E.; Mojallali, H. Solving Nonlinear Equations Systems with a New Approach Based on Invasive Weed Optimization Algorithm and Clustering. Swarm Evol. Comput. 2012, 4, 33–43. [Google Scholar] [CrossRef]
  96. Storn, R. Differential Evolution-A Simple and Efficient Adaptive Scheme for Global Optimization over Continuous Spaces; Technical Report; International Computer Science Institute: Berkeley, CA, USA, 1995; p. 11. [Google Scholar]
  97. MiarNaeimi, F.; Azizyan, G.; Rashki, M. Horse Herd Optimization Algorithm: A Nature-Inspired Algorithm for High-Dimensional Optimization Problems. Knowl. Based Syst. 2021, 213, 106711. [Google Scholar] [CrossRef]
  98. Ahmadianfar, I.; Heidari, A.A.; Gandomi, A.H.; Chu, X.; Chen, H. RUN Beyond the Metaphor: An Efficient Optimization Algorithm Based on Runge Kutta Method. Expert Syst. Appl. 2021, 181, 115079. [Google Scholar] [CrossRef]
  99. Brest, J.; Maučec, M.S. Population Size Reduction for the Differential Evolution Algorithm. Appl. Intell. 2007, 29, 228–247. [Google Scholar] [CrossRef]
  100. Song, W.; Wang, Y.; Li, H.-X.; Cai, Z. Locating Multiple Optimal Solutions of Nonlinear Equation Systems Based on Multiobjective Optimization. IEEE Trans. Evol. Comput. 2015, 19, 414–431. [Google Scholar] [CrossRef]
  101. Sacco, W.; Henderson, N. Finding All Solutions of Nonlinear Systems Using a Hybrid Metaheuristic with Fuzzy Clustering Means. Appl. Soft Comput. 2011, 11, 5424–5432. [Google Scholar] [CrossRef]
  102. Hirsch, M.J.; Pardalos, P.M.; Resende, M.G. Solving Systems of Nonlinear Equations with Continuous GRASP. Nonlinear Anal. Real World Appl. 2009, 10, 2000–2006. [Google Scholar] [CrossRef]
  103. Sharma, J.R.; Arora, H. On Efficient Weighted-Newton Methods for Solving Systems of Nonlinear Equations. Appl. Math. Comput. 2013, 222, 497–506. [Google Scholar] [CrossRef]
  104. Junior, H.A.E.O.; Ingber, L.; Petraglia, A.; Petraglia, M.R.; Machado, M.A.S. Stochastic Global Optimization and Its Applications with Fuzzy Adaptive Simulated Annealing; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2012; pp. 33–62. [Google Scholar]
  105. Morgan, A.; Shapiro, V. Box-Bisection for Solving Second-Degree Systems and the Problem of Clustering. ACM Trans. Math. Softw. 1987, 13, 152–167. [Google Scholar] [CrossRef]
  106. Grau-Sánchez, M.; Grau, À.; Noguera, M. Frozen Divided Difference Scheme for Solving Systems of Nonlinear Equations. J. Comput. Appl. Math. 2011, 235, 1739–1743. [Google Scholar] [CrossRef]
  107. Waziri, M.; Leong, W.J.; Hassan, M.A.; Monsiet, M. An Efficient Solver for Systems of Nonlinear Equations with Singular Jacobian via Diagonal Updating. Appl. Math. Sci. 2010, 4, 3403–3412. [Google Scholar]
Figure 1. Flowchart of the paper organization.
Figure 2. Flowchart of MFPA for tackling the NESs.
Figure 3. Framework of the proposed algorithm: HFPA.
Figure 4. Sensitivity analysis for the parameters of the proposed algorithms: (a) Depiction of outcomes under various values for parameter p on F1; (b) Depiction of outcomes under various values for parameter p on F5; (c) Depiction of outcomes under various values for parameter a on F14; (d) Depiction of outcomes under various values for parameter p₁ on F14.
Figure 5. The landscape of the mathematical test functions: (a) The landscape of F1; (b) The landscape of F2; (c) The landscape of F3; (d) The landscape of F4; (e) The landscape of F5; (f) The landscape of F6; (g) The landscape of F7; (h) The landscape of F8; (i) The landscape of F9; (j) The landscape of F10; (k) The landscape of F11; (l) The landscape of F12; (m) The landscape of F13; (n) The landscape of F14; (o) The landscape of F15; (p) The landscape of F16; (q) The landscape of F17; (r) The landscape of F18; (s) The landscape of F19; (t) The landscape of F20; (u) The landscape of F21; (v) The landscape of F22; (w) The landscape of F23.
Figure 6. Outperformance and competitiveness percentage of each algorithm in terms of best and average values on global test functions.
Figure 7. Convergence curve for unimodal and multimodal benchmark functions: (a) Convergence curve for F1; (b) Convergence curve for F2; (c) Convergence curve for F3; (d) Convergence curve for F6; (e) Convergence curve for F7; (f) Convergence curve for F12; (g) Convergence curve for F19; (h) Convergence curve for F20; (i) Convergence curve for F22.
Figure 8. Comparison in terms of CPU time on global optimization problems.
Figure 9. Outperformance and competitiveness percentage of each algorithm in terms of best and average values on NESs.
Figure 10. Convergence curve for the NESs: (a) Convergence curve for f1; (b) Convergence curve for f2; (c) Convergence curve for f3; (d) Convergence curve for f4; (e) Convergence curve for f5; (f) Convergence curve for f7; (g) Convergence curve for f8; (h) Convergence curve for f9; (i) Convergence curve for f10; (j) Convergence curve for f11; (k) Convergence curve for f12; (l) Convergence curve for f13; (m) Convergence curve for f14; (n) Convergence curve for f15; (o) Convergence curve for f17.
Figure 11. Comparison in terms of CPU time on NESs.
Figure 12. Comparison among FPA variants under Boxplot: (a) Boxplot for f1; (b) Boxplot for f2; (c) Boxplot for f3; (d) Boxplot for f5; (e) Boxplot for f6; (f) Boxplot for f7; (g) Boxplot for f8; (h) Boxplot for f9; (i) Boxplot for f10; (j) Boxplot for f11; (k) Boxplot for f12; (l) Boxplot for f13.
Table 1. Descriptions of benchmark test functions.
Name | Formula | D | R
Unimodal Test Functions
Beale | $F_1(x) = (1.5 - x_1 + x_1 x_2)^2 + (2.25 - x_1 + x_1 x_2^2)^2 + (2.625 - x_1 + x_1 x_2^3)^2$ | 2 | $x_i \in [-4.5, 4.5]$, $i = 1, 2$
Matyas | $F_2(x) = 0.26(x_1^2 + x_2^2) - 0.48 x_1 x_2$ | 2 | $x_i \in [-10, 10]$, $i = 1, 2$
Three-hump camel | $F_3(x) = 2x_1^2 - 1.05x_1^4 + x_1^6/6 + x_1 x_2 + x_2^2$ | 2 | $x_i \in [-5, 5]$, $i = 1, 2$
Exponential | $F_4(x) = -e^{-0.5\sum_{i=1}^{D} x_i^2}$ | 30 | $x_i \in [-1, 1]$, $i = 1, \dots, D$
Ridge | $F_5(x) = x_1 + 2\left(\sum_{i=2}^{D} x_i^2\right)^{0.1}$ | 30 | $x_i \in [-5, 5]$, $i = 1, \dots, D$
Sphere | $F_6(x) = \sum_{i=1}^{D} x_i^2$ | 30 | $x_i \in [-100, 100]$, $i = 1, \dots, D$
Step | $F_7(x) = \sum_{i=1}^{D} (x_i + 0.5)^2$ | 30 | $x_i \in [-5.12, 5.12]$, $i = 1, \dots, D$
Multimodal Test Functions
Drop wave | $F_8(x) = -\frac{1 + \cos\left(12\sqrt{x_1^2 + x_2^2}\right)}{0.5(x_1^2 + x_2^2) + 2}$ | 2 | $x_i \in [-5.2, 5.2]$, $i = 1, 2$
Egg holder | $F_9(x) = -(x_2 + 47)\sin\left(\sqrt{\left|x_2 + \frac{x_1}{2} + 47\right|}\right) - x_1 \sin\left(\sqrt{\left|x_1 - (x_2 + 47)\right|}\right)$ | 2 | $x_i \in [-512, 512]$, $i = 1, 2$
Himmelblau | $F_{10}(x) = (x_1^2 + x_2 - 11)^2 + (x_2^2 + x_1 - 7)^2$ | 2 | $x_1, x_2 \in [-30, 30]$
Levi 13 | $F_{11}(x) = \sin^2(3\pi x_1) + (x_1 - 1)^2\left(1 + \sin^2(3\pi x_2)\right) + (x_2 - 1)^2\left(1 + \sin^2(2\pi x_2)\right)$ | 2 | $x_i \in [-10, 10]$, $i = 1, 2$
Ackley 1 | $F_{12}(x) = -20e^{-0.2\sqrt{\frac{1}{D}\sum_{i=1}^{D} x_i^2}} - e^{\frac{1}{D}\sum_{i=1}^{D} \cos(2\pi x_i)} + 20 + e$ | 20 | $x_i \in [-1, 1]$, $i = 1, \dots, D$
Happy cat | $F_{13}(x) = \left(\left(\|x\|^2 - D\right)^2\right)^{1/8} + \frac{1}{D}\left(\frac{1}{2}\|x\|^2 + \sum_{i=1}^{D} x_i\right) + \frac{1}{2}$ | 5 | $x_i \in [-2, 2]$, $i = 1, \dots, D$
Griewank | $F_{14}(x) = 1 + \frac{1}{4000}\sum_{i=1}^{D} x_i^2 - \prod_{i=1}^{D} \cos\left(\frac{x_i}{\sqrt{i}}\right)$ | 30 | $x_i \in [-2, 2]$, $i = 1, \dots, D$
Michalewicz | $F_{15}(x) = -\sum_{i=1}^{D} \sin(x_i)\left(\sin\left(\frac{i x_i^2}{\pi}\right)\right)^{20}$ | 10 | $x_i \in [0, \pi]$, $i = 1, \dots, D$
Penalized 1 | $F_{16}(x) = \frac{\pi}{D}\left[10\sin^2(\pi y_1) + \sum_{i=1}^{D-1} (y_i - 1)^2\left(1 + 10\sin^2(\pi y_{i+1})\right) + (y_D - 1)^2\right] + \sum_{i=1}^{D} u(x_i, 10, 100, 4)$, where $y_i = 1 + \frac{x_i + 1}{4}$ and $u(x_i, a, k, m) = k(x_i - a)^m$ if $x_i > a$; $0$ if $-a \le x_i \le a$; $k(-x_i - a)^m$ if $x_i < -a$ | 30 | $x_i \in [-50, 50]$, $i = 1, \dots, D$
Penalized 2 | $F_{17}(x) = 0.1\left[\sin^2(3\pi x_1) + \sum_{i=1}^{D-1} (x_i - 1)^2\left(1 + \sin^2(3\pi x_{i+1})\right) + (x_D - 1)^2\left(1 + \sin^2(2\pi x_D)\right)\right] + \sum_{i=1}^{D} u(x_i, 10, 100, 4)$ | 30 | $x_i \in [-50, 50]$, $i = 1, \dots, D$
Periodic | $F_{18}(x) = 1 + \sum_{i=1}^{D} \sin^2(x_i) - 0.1e^{-\sum_{i=1}^{D} x_i^2}$ | 30 | $x_i \in [-50, 50]$, $i = 1, \dots, D$
Qing | $F_{19}(x) = \sum_{i=1}^{D} (x_i^2 - i)^2$ | 30 | $x_i \in [-500, 500]$, $i = 1, \dots, D$
Rastrigin | $F_{20}(x) = 10D + \sum_{i=1}^{D} \left(x_i^2 - 10\cos(2\pi x_i)\right)$ | 30 | $x_i \in [-5.12, 5.12]$, $i = 1, \dots, D$
Rosenbrock | $F_{21}(x) = \sum_{i=1}^{D-1} \left(100(x_{i+1} - x_i^2)^2 + (1 - x_i)^2\right)$ | 30 | $x_i \in [-5, 10]$, $i = 1, \dots, D$
Salomon | $F_{22}(x) = 1 - \cos\left(2\pi\sqrt{\sum_{i=1}^{D} x_i^2}\right) + 0.1\sqrt{\sum_{i=1}^{D} x_i^2}$ | 30 | $x_i \in [-100, 100]$, $i = 1, \dots, D$
Yang 4 | $F_{23}(x) = \left(\sum_{i=1}^{D} \sin^2(x_i) - e^{-\sum_{i=1}^{D} x_i^2}\right) e^{-\sum_{i=1}^{D} \sin^2\left(\sqrt{|x_i|}\right)}$ | 30 | $x_i \in [-10, 10]$, $i = 1, \dots, D$
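For readers reproducing the benchmark suite, the formulas above translate directly into code. The following minimal Python sketch (ours, not the authors' implementation) evaluates three of the Table 1 functions as reconstructed above; the function names and NumPy dependency are our own choices:

```python
import numpy as np

def sphere(x):
    # F6: sum of squares, global minimum 0 at x = 0
    return np.sum(x ** 2)

def rastrigin(x):
    # F20: 10*D + sum(x_i^2 - 10*cos(2*pi*x_i)), global minimum 0 at x = 0
    return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def ackley(x):
    # F12: global minimum 0 at x = 0
    d = x.size
    return (-20 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / d))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / d) + 20 + np.e)

x0 = np.zeros(20)
print(sphere(x0), rastrigin(x0), ackley(x0))  # all (numerically) 0
```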
Table 2. The descriptions of used NESs.
Function | Formulas | D | Range | Reference
f1:
$x_1 - \sin(5\pi x_2) = 0$
$x_1 - x_2 = 0$
D = 2; $x_i \in [-1, 1]$, $i = 1, 2$; [100]
f2:
$x_1 - \cos(4\pi x_2) = 0$
$x_1^2 + x_2^2 - 1 = 0$
D = 2; $x_i \in [-10, 10]$, $i = 1, 2$; [100]
f3:
$x_1 - 0.25428722 - 0.18324757\, x_4 x_3 x_9 = 0$
$x_2 - 0.37842197 - 0.16275449\, x_1 x_{10} x_6 = 0$
$x_3 - 0.27162577 - 0.16955071\, x_1 x_2 x_{10} = 0$
$x_4 - 0.19807914 - 0.15585316\, x_7 x_1 x_6 = 0$
$x_5 - 0.44166728 - 0.19950920\, x_7 x_6 x_3 = 0$
$x_6 - 0.14654113 - 0.18922793\, x_8 x_5 x_{10} = 0$
$x_7 - 0.42937161 - 0.21180486\, x_2 x_5 x_8 = 0$
$x_8 - 0.07056438 - 0.17081208\, x_1 x_7 x_6 = 0$
$x_9 - 0.34504906 - 0.19612740\, x_{10} x_6 x_8 = 0$
$x_{10} - 0.42651102 - 0.21466544\, x_4 x_8 x_1 = 0$
D = 10; $x_i \in [-10, 10]$, $i = 1, \dots, 10$; [77]
f4:
$3.0x_1 - x_3^2 = 0$
$x_3 \sin(\pi x_2) - x_3 x_4 = 0$
$x_2 x_3 - \exp(1.0 - x_1 x_3) + 0.2707 = 0$
$2x_1^2 - x_3 x_2^4 - x_3 x_2 = 0$
D = 4; $x_i \in [0, 5]$, $i = 1, \dots, 4$; [95]
f5:
$4x_1^3 + 4x_1 x_2 + 2x_2^2 - 42x_1 - 14 = 0$
$4x_2^3 + 2x_1^2 + 4x_1 x_2 - 26x_2 - 22 = 0$
D = 2; $x_i \in [-20, 20]$, $i = 1, 2$; [101]
f6:
$\sin(x_1)\cos(x_2) - 2\cos(x_1)\sin(x_2) = 0$
$\cos(x_1)\sin(x_2) - 2\sin(x_1)\cos(x_2) = 0$
D = 2; $x_i \in [0, \pi]$, $i = 1, 2$; [102]
f7:
$x_1^2 + x_2^2 - 1.0 = 0$
$x_3^2 + x_4^2 - 1.0 = 0$
$x_5^2 + x_6^2 - 1.0 = 0$
$x_7^2 + x_8^2 - 1.0 = 0$
$4.731 \cdot 10^{-3}\, x_1 x_3 - 0.3578 x_2 x_3 - 0.1238 x_1 + x_7 - 1.637 \cdot 10^{-3}\, x_2 - 0.9338 x_4 - 0.3571 = 0$
$0.2238 x_1 x_3 + 0.7623 x_2 x_3 + 0.2638 x_1 - x_7 - 0.07745 x_2 - 0.6734 x_4 - 0.6022 = 0$
$x_6 x_8 + 0.3578 x_1 + 4.731 \cdot 10^{-3}\, x_2 = 0$
$-0.7623 x_1 + 0.2238 x_2 + 0.3461 = 0$
D = 8; $x_i \in [-1, 1]$, $i = 1, \dots, 8$; [52]
f8:
$x_i - \cos\left(2x_i - \sum_{j=1}^{D} x_j\right) = 0$, $i = 1, \dots, D$
D = 3; $x_i \in [-20, 20]$, $i = 1, \dots, D$; [103]
f9:
$x_1^2 - x_2^2 = 0$
$x_1 + \sin\left(\frac{\pi}{2} x_2\right) = 0$
D = 2; $x_1 \in [0, 1]$, $x_2 \in [-10, 0]$; [95]
f10:
$x_1^2 + x_2^2 + x_1 + x_2 - 8 = 0$
$x_1|x_2| + x_1 + |x_2| - 5 = 0$
D = 2; $x_1, x_2 \in [-30, 30]$; [104]
f11:
$x_1^2 - |x_2| + 1 + \frac{1}{9}|x_1 - 1| = 0$
$x_2^2 + 5x_1^2 - 7 + \frac{1}{9}|x_2| = 0$
D = 2; $x_1 \in [-1, 1]$, $x_2 \in [-10, 10]$; [104]
f12:
$\sum_{i=1}^{D} x_i^2 - 1 = 0$
$|x_1 - x_2| + \sum_{i=3}^{D} x_i^2 = 0$
D = 20; $x_i \in [-1, 1]$, $i = 1, \dots, D$; [100]
f13:
$2x_1 + x_2 + x_3 + x_4 + x_5 - 6.0 = 0$
$x_1 + 2x_2 + x_3 + x_4 + x_5 - 6.0 = 0$
$x_1 + x_2 + 2x_3 + x_4 + x_5 - 6.0 = 0$
$x_1 + x_2 + x_3 + 2x_4 + x_5 - 6.0 = 0$
$x_1 x_2 x_3 x_4 x_5 - 1.0 = 0$
D = 5; $x_i \in [-2, 2]$, $i = 1, \dots, D$; [105]
f14:
$x_1^2 - x_1 - x_2^2 - x_2 + x_3^2 = 0$
$\sin\left(x_2 - \exp(x_1)\right) = 0$
$x_3 - \log|x_2| = 0$
D = 3; $x_1 \in [0, 2]$, $x_2 \in [-10, 10]$, $x_3 \in [-1, 1]$; [106]
f15:
$x_i + \sum_{j=1}^{D} x_j - (D + 1) = 0$, $i = 1, \dots, D-1$
$\prod_{i=1}^{D} x_i - 1 = 0$
D = 20; $x_i \in [-2, 2]$, $i = 1, \dots, D$; [106]
f16:
$x_1 - x_2^2 + 3\log(x_1) = 0$
$2x_1^2 - x_1 x_2 - 5x_1 + 1 = 0$
D = 2; $x_1 \in [0, 4]$, $x_2 \in [-3, 4]$; [106]
f17:
$\cos(x_2) - \sin(x_1) = 0$
$x_3^{x_1} - \frac{1}{x_2} = 0$
$\exp(x_1) - x_3^2 = 0$
D = 3; $x_i \in [0, 5]$, $i = 1, 2, 3$; [106]
f18:
$x_1^3 - x_1 x_2 x_3 = 0$
$x_2^2 - x_1 x_3 = 0$
$10 x_1 x_2 x_3 - x_1 - 0.1 = 0$
D = 3; $x_i \in [-5, 5]$, $i = 1, 2, 3$; [107]
f19:
$\sin(x_1^3) - 3x_1 x_2^2 - 1 = 0$
$\cos(3x_1^2 x_2) - |x_2^3| + 1 = 0$
D = 2; $x_1, x_2 \in [-2, 2]$; [52]
f20:
$4x_1^3 - 3x_1 - \cos(x_2) = 0$
$\sin(x_1^2) - |x_2| = 0$
D = 2; $x_1, x_2 \in [-2, 2]$; [52]
f21:
$\exp(x_1^2 + x_2^2) - 3 = 0$
$|x_2| + x_1 + x_2^2 - \sin(3|x_2| + x_1) = 0$
D = 2; $x_1, x_2 \in [-2, 2]$; [52]
f22:
$-3.84x_1^2 + 3.84x_1 - x_2 = 0$
$-3.84x_2^2 + 3.84x_2 - x_3 = 0$
$-3.84x_3^2 + 3.84x_3 - x_1 = 0$
D = 3; $x_1 \in [0, 10]$, $x_2 \in [0, 10]$, $x_3 \in [0, 1]$; [52]
f23:
$x_1^4 + x_2^4 - x_1 x_2^3 - 6 = 0$
$|1 - x_1^2 x_2^2| - 0.6787 = 0$
D = 2; $x_1, x_2 \in [-20, 20]$; [52]
f24:
$0.5x_1^2 + 0.5x_2^2 + x_1 + x_2 - 8 = 0$
$|x_1|x_2 + x_1 + |x_2|x_1 - 5 = 0$
D = 2; $x_1, x_2 \in [-5, 5]$; [52]
f25:
$4\sin(4x_1) - x_2 = 0$
$x_1^2 + x_2^2 - 15 = 0$
D = 2; $x_1, x_2 \in [-20, 20]$; [52]
f26:
$\cos(2x_1) - \cos(2x_2) - 0.4 = 0$
$2(x_2 - x_1) + \sin(2x_2) - \sin(2x_1) - 1.2 = 0$
D = 2; $x_1, x_2 \in [-15, 15]$; [52]
f27:
$x_1 + 0.5x_2^2 - 5 = 0$
$x_1 + 5\sin\left(\frac{\pi x_2}{2}\right) = 0$
D = 2; $x_1, x_2 \in [-5, 5]$; [52]
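Before a metaheuristic such as MFPA or HFPA can tackle an NES, the system $F(x) = 0$ is folded into a single fitness value to be minimized. A common merit function, assumed in the sketch below (the paper's exact formulation may differ), is the sum of squared residuals, which vanishes exactly at a root; test case f1, as reconstructed above, serves as the example:

```python
import numpy as np

def f1_residuals(x):
    # Test case f1 from Table 2 (as reconstructed above)
    return np.array([x[0] - np.sin(5 * np.pi * x[1]),
                     x[0] - x[1]])

def nes_objective(x, residuals=f1_residuals):
    # Sum-of-squares merit function: equals 0 exactly at a root of the system
    r = residuals(x)
    return float(np.sum(r ** 2))

print(nes_objective(np.zeros(2)))  # x = (0, 0) solves f1, so this prints 0.0
```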
Table 3. Comparison under the unimodal mathematical test functions.
F | Metric | EO | MPA | RUN | SMA | DE | PSO | HOA | FPA | MFPA | HFPA
F1Best06.34 × 10−89.23 × 10−193.07 × 10−1101.26 × 10−295.64 × 10−52.23 × 10−300
Avg9.24 × 10−342.37 × 10−52.36 × 10−131.28 × 10−801.02 × 10−12.33 × 10−21.47 × 10−100
Worst2.77 × 10−321.94 × 10−42.19 × 10−121.17 × 10−707.62 × 10−11.02 × 10−11.0100
SD5.06 × 10−333.91 × 10−54.56 × 10−132.70 × 10−802.63 × 10−12.71 × 10−22.90 × 10−100
F2Best4.14 × 10−1768. × 10−12007.09 × 10−851.01 × 10−312.79 × 10−1371.54 × 10−400
Avg4.60 × 10−1301.78 × 10−801.1 × 10−3181.06 × 10−802.01 × 10−275.71 × 10−58.98 × 10−300
Worst1.38 × 10−1281.11 × 10−703.5 × 10−3172.12 × 10−792.40 × 10−261.37 × 10−33.70 × 10−200
SD2.52 × 10−1292.37 × 10−8003.90 × 10−805.21 × 10−272.50 × 10−48.90 × 10−300
F3Best4.12 × 10−2471.56 × 10−26004.61 × 10−1201.09 × 10−343.10 × 10−2561.26 × 10−400
Avg1.61 × 10−1982.66 × 10−9006.20 × 10−1129.95 × 10−35.04 × 10−773.53 × 10−200
Worst4.83 × 10−1971.79 × 10−8007.57 × 10−1112.99 × 10−11.51 × 10−752.36 × 10−100
SD04.48 × 10−9001.97 × 10−1115.45 × 10−22.76 × 10−765.18 × 10−200
F4Best−1.0000−1.0000−1.0000−1.0000−1.0000−1.0000−1.0000−6.58 × 10−1−1.0000−1.0000
Avg−1.0000−9.90 × 10−1−1.0000−1.0000−1.0000−1.0000−1.0000−2.92 × 10−1−1.0000−1.0000
Worst−1.0000−9.68 × 10−1−1.0000−1.0000−1.0000−1.0000−1.0000−6.46 × 10−2−1.0000−1.0000
SD5.45 × 10−179.64 × 10−3006.94 × 10−91.21 × 10−86.84 × 10−171.50 × 10−100
F5Best−5.0000−3.42−5.0000−4.98−4.57−4.47−3.14−1.85−4.51−4.75
Avg−5.0000−2.86−5.0000−4.96−4.54−4.22−2.85−1.67−4.22−4.67
Worst−5.0000−2.40−5.0000−4.93−4.48−2.85−2.70−1.54−3.86−4.61
SD5.79 × 10−42.63 × 10−15.73 × 10−71.61 × 10−22.34 × 10−23.39 × 10−11.12 × 10−17.98 × 10−21.55 × 10−13.60 × 10−2
F6Best1.97 × 10−432.83 × 10−23.33 × 10−19001.05 × 10−49.15 × 10−53.03 × 10−2381.39 × 10400
Avg1.94 × 10−403.52 × 1021.56 × 10−1631.3 × 10−3193.50 × 10−46.96 × 10−41.11 × 10−1252.67 × 10400
Worst1.65 × 10−391.39 × 1034.67 × 10−1624.0 × 10−3188.85 × 10−42.72 × 10−33.34 × 10−1246.19 × 10400
SD3.57 × 10−403.95 × 102001.86 × 10−46.35 × 10−46.10 × 10−1251.00 × 10400
F7Best1.24 × 10−64.61 × 10−21.62 × 10−71.65 × 10−53.24 × 10−76.89 × 10−74.332.89 × 102.98 × 10−27.44 × 10−8
Avg5.33 × 10−66.91 × 10−13.28 × 10−79.95 × 10−48.90 × 10−77.39 × 10−66.048.50 × 105.76 × 10−12.83 × 10−6
Worst1.39 × 10−53.575.43 × 10−72.91 × 10−32.24 × 10−63.72 × 10−57.021.51 × 1021.232.15 × 10−5
SD3.11 × 10−68.92 × 10−18.52 × 10−88.20 × 10−44.67 × 10−79.04 × 10−67.18 × 10−13.08 × 102.97 × 10−14.36 × 10−6
F8Best−1.0000−1.0000−1.0000−1.0000−1.0000−1.0000−1.0000−9.92 × 10−1−1.0000−1.0000
Avg−1.0000−9.98 × 10−1−1.0000−1.0000−1.0000−9.88 × 10−1−9.99 × 10−1−9.11 × 10−1−1.0000−1.0000
Worst−1.0000−9.36 × 10−1−1.0000−1.0000−1.0000−9.36 × 10−1−9.77 × 10−1−7.82 × 10−1−1.0000−1.0000
SD01.16 × 10−20002.18 × 10−24.39 × 10−34.71 × 10−200
Bold values indicate the best outcomes.
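Each row of Tables 3–6 condenses repeated independent runs of one algorithm into Best, Avg, Worst, and SD statistics of the final objective value. The sketch below shows the computation; the run count of 30 and the stand-in optimizer are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def summarize(run_once, n_runs=30):
    # Collect the final objective value of each independent run and tabulate
    # the statistics reported in the comparison tables (minimization).
    finals = np.array([run_once() for _ in range(n_runs)])
    return {"Best": finals.min(), "Avg": finals.mean(),
            "Worst": finals.max(), "SD": finals.std()}

rng = np.random.default_rng(0)
print(summarize(lambda: rng.random() * 1e-6))  # placeholder for MFPA/HFPA
```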
Table 4. Comparison under the multimodal mathematical test functions.
F | Metric | EO | MPA | RUN | SMA | DE | PSO | HOA | FPA | MFPA | HFPA
F9Best−9.60 × 102−9.60 × 102−9.60 × 102−9.60 × 102−9.60 × 102−9.60 × 102−9.60 × 102−9.60 × 102−9.60 × 102−9.60 × 102
Avg−9.52 × 102−9.60 × 102−9.60 × 102−9.60 × 102−9.53 × 102−7.60 × 102−8.98 × 102−9.18 × 102−9.10 × 102−9.28 × 102
Worst−7.87 × 102−9.60 × 102−9.60 × 102−9.60 × 102−8.21 × 102−5.25 × 102−7.18 × 102−7.79 × 102−7.17 × 102−7.18 × 102
SD3.34 × 105.78 × 10−131.47 × 10−91.47 × 10−82.71 × 101.04 × 1026.77 × 105.60 × 107.41 × 106.25 × 10
F10Best02.88 × 10−91.17 × 10−181.24 × 10−90003.76 × 10−300
Avg1.15 × 10−318.24 × 10−62.90 × 10−111.82 × 10−71.05 × 10−316.27 × 10−271.57 × 10−32.68 × 10−13.16 × 10−311.05 × 10−31
Worst7.89 × 10−318.05 × 10−51.94 × 10−106.35 × 10−77.89 × 10−311.16 × 10−251.05 × 10−21.877.89 × 10−317.89 × 10−31
SD2.77 × 10−311.74 × 10−54.73 × 10−112.00 × 10−72.73 × 10−312.17 × 10−262.45 × 10−34.04 × 10−13.93 × 10−312.73 × 10−31
F11Best1.35 × 10−311.97 × 10−101.61 × 10−175.03 × 10−131.35 × 10−312.23 × 10−302.72 × 10−33.75 × 10−31.35 × 10−311.35 × 10−31
Avg1.35 × 10−314.65 × 10−61.26 × 10−113.68 × 10−91.35 × 10−312.30 × 10−251.74 × 10−13.15 × 10−11.35 × 10−311.35 × 10−31
Worst1.35 × 10−316.73 × 10−58.72 × 10−112.31 × 10−81.35 × 10−316.62 × 10−243.63 × 10−17.91 × 10−11.35 × 10−311.35 × 10−31
SD6.68 × 10−471.26 × 10−52.13 × 10−115.21 × 10−96.68 × 10−471.21 × 10−249.32 × 10−22.28 × 10−16.68 × 10−476.68 × 10−47
F12Best4.44 × 10−153.658.88 × 10−168.88 × 10−162.64 × 10−32.124.44 × 10−151.59 × 108.88 × 10−168.88 × 10−16
Avg9.06 × 10−156.608.88 × 10−168.88 × 10−165.25 × 10−36.286.57 × 10−151.89 × 108.88 × 10−168.88 × 10−16
Worst1.51 × 10−141.15 × 108.88 × 10−168.88 × 10−169.38 × 10−38.911.51 × 10−142.06 × 108.88 × 10−168.88 × 10−16
SD2.97 × 10−152.22001.74 × 10−31.612.57 × 10−151.4600
F13Best4.38 × 10−31.249.44 × 10−53.45 × 10−23.54 × 10−43.71 × 10−31.06 × 101.41 × 1021.111.87 × 10−4
Avg2.61 × 10−27.751.25 × 10−25.54 × 10−11.57 × 10−26.23 × 10−23.69 × 102.69 × 1025.172.89 × 10−2
Worst8.88 × 10−22.54 × 104.67 × 10−29.88 × 10−11.18 × 10−14.49 × 10−16.61 × 104.64 × 1021.32 × 101.18 × 10−1
SD2.44 × 10−25.841.32 × 10−23.57 × 10−12.85 × 10−29.00 × 10−21.88 × 107.68 × 102.692.60 × 10−2
F14Best2.04 × 10−13.88 × 10−11.26 × 10−11.59 × 10−13.54 × 10−15.05 × 10−11.058.42 × 10−14.52 × 10−12.67 × 10−1
Avg3.42 × 10−17.10 × 10−12.48 × 10−14.12 × 10−15.38 × 10−17.09 × 10−11.451.376.93 × 10−15.19 × 10−1
Worst5.61 × 10−19.08 × 10−13.85 × 10−17.12 × 10−16.73 × 10−19.82 × 10−11.991.761.017.29 × 10−1
SD8.06 × 10−21.08 × 10−16.57 × 10−21.54 × 10−17.85 × 10−21.20 × 10−12.33 × 10−12.23 × 10−11.51 × 10−11.01 × 10−1
F15Best−9.58−7.86−9.36−9.36−9.60−9.14−6.00−5.47−8.95−9.66
Avg−8.50−5.99−8.06−7.73−9.20−7.53−5.17−3.72−7.30−9.39
Worst−7.07−4.61−6.74−6.32−8.15−5.05−4.46−2.97−5.53−8.71
SD7.23 × 10−18.40 × 10−17.53 × 10−19.51 × 10−12.30 × 10−11.144.37 × 10−15.62 × 10−19.19 × 10−12.00 × 10−1
F16Best3.77 × 10−83.27 × 10−16.52 × 10−92.00 × 10−66.79 × 10−52.62 × 10−58.64 × 10−11.12 × 1069.66 × 10−36.40 × 10−9
Avg3.46 × 10−33.25 × 1032.06 × 10−75.17 × 10−35.44 × 10−41.171.301.08 × 1083.40 × 10−26.97 × 10−8
Worst1.04 × 10−19.37 × 1044.16 × 10−62.50 × 10−24.66 × 10−33.423.224.46 × 1085.97 × 10−22.61 × 10−7
SD1.89 × 10−21.71 × 1048.19 × 10−76.63 × 10−38.57 × 10−49.25 × 10−14.17 × 10−11.20 × 1081.41 × 10−28.19 × 10−8
F17Best2.69 × 10−64.17 × 10−11.31 × 10−81.84 × 10−41.90 × 10−45.41 × 10−12.861.24 × 1072.581.18
Avg4.26 × 10−24.73 × 1036.13 × 10−34.25 × 10−31.71 × 10−31.26 × 103.041.69 × 1082.802.28
Worst1.96 × 10−15.57 × 1042.10 × 10−21.62 × 10−28.23 × 10−32.87 × 103.468.03 × 1082.972.97
SD5.47 × 10−21.37 × 1047.25 × 10−33.44 × 10−31.75 × 10−36.131.28 × 10−11.67 × 1081.22 × 10−14.94 × 10−1
F18Best2.742.079.00 × 10−19.00 × 10−11.715.732.953.029.00 × 10−19.00 × 10−1
Avg2.862.471.109.11 × 10−12.027.403.054.649.00 × 10−19.00 × 10−1
Worst3.002.742.911.002.349.373.077.119.00 × 10−19.00 × 10−1
SD5.08 × 10−21.57 × 10−15.27 × 10−13.05 × 10−21.40 × 10−18.65 × 10−12.87 × 10−21.364.52 × 10−164.52 × 10−16
F19Best5.38 × 10−33.05 × 1033.52 × 10−32.418.69 × 1023.27 × 10−26.23 × 1031.59 × 1091.22 × 1021.79 × 10−4
Avg5.44 × 10−14.50 × 1073.37 × 10−15.821.38 × 1035.20 × 10−17.46 × 1034.63 × 10105.97 × 1021.05 × 10−1
Worst8.694.76 × 1083.901.71 × 101.81 × 1038.148.54 × 1031.87 × 10111.49 × 1032.78
SD1.709.89 × 1079.68 × 10−13.592.45 × 1021.475.83 × 1023.87 × 10103.96 × 1025.06 × 10−1
F20Best04.05 × 10001.22 × 1021.82 × 1002.97 × 10200
Avg09.56 × 10001.54 × 1029.48 × 102.28 × 103.69 × 10200
Worst01.63 × 102001.74 × 1022.29 × 1022.44 × 1024.26 × 10200
SD03.11 × 10001.16 × 107.06 × 107.00 × 103.73 × 1000
F21Best2.48 × 102.53 × 1022.41 × 106.95 × 10−32.55 × 106.15 × 10−22.87 × 101.35 × 1052.58 × 102.23 × 10
Avg2.54 × 102.12 × 1032.56 × 103.88 × 10−13.03 × 108.14 × 102.89 × 104.40 × 1052.68 × 102.36 × 10
Worst2.60 × 105.16 × 1032.87 × 101.359.32 × 103.17 × 1022.90 × 101.08 × 1062.84 × 102.50 × 10
SD3.19 × 10−11.53 × 1031.133.44 × 10−11.30 × 107.43 × 106.73 × 10−22.36 × 1056.16 × 10−17.38 × 10−1
F22Best9.99 × 10−21.301.14 × 10−8406.21 × 10−13.102.00 × 10−18.8900
Avg1.03 × 10−14.288.72 × 10−646.30 × 10−1457.94 × 10−14.438.42 × 10−11.74 × 1000
Worst2.00 × 10−19.002.61 × 10−621.89 × 10−1431.015.702.472.50 × 1000
SD1.83 × 10−21.934.76 × 10−633.45 × 10−1449.74 × 10−27.60 × 10−14.76 × 10−13.4200
F23Best1.78 × 10−173.70 × 10−15−1.0000−1.00002.50 × 10−125.99 × 10−141.36 × 10−123.34 × 10−10−1.0000−1.0000
Avg2.08 × 10−166.68 × 10−13−1.0000−1.00005.60 × 10−127.81 × 10−129.00 × 10−125.45 × 10−9−1.0000−1.0000
Worst5.79 × 10−164.66 × 10−12−1.0000−1.00001.21 × 10−115.97 × 10−112.45 × 10−112.36 × 10−8−1.0000−1.0000
SD1.59 × 10−161.16 × 10−12002.04 × 10−121.43 × 10−114.89 × 10−126.06 × 10−900
Bold values indicate the best outcomes.
Table 5. Comparison on test cases f1–f13.
F | Metric | EO | MPA | RUN | SMA | DE | PSO | HOA | FPA | MFPA | HFPA
f1Best8.0 × 10−1833.39 × 10−14009.75 × 10−865.55 × 10−321.1 × 10−1284.31 × 10−500
Avg1.36 × 10−102.15 × 10−61.21 × 10−479.3 × 10−3121.73 × 10−323.78 × 10−153.71 × 10−51.42 × 10−300
Worst4.09 × 10−92.93 × 10−53.63 × 10−462.7 × 10−3102.47 × 10−321.06 × 10−131.72 × 10−41.05 × 10−200
SD7.46 × 10−105.70 × 10−66.63 × 10−4701.09 × 10−321.93 × 10−144.96 × 10−52.08 × 10−300
f2Best02.93 × 10−901.25 × 10−17002.54 × 10−52.49 × 10−400
Avg4.63 × 10−254.09 × 10−64.78 × 10−123.16 × 10−95.81 × 10−324.45 × 10−254.11 × 10−37.00 × 10−25.87 × 10−323.52 × 10−32
Worst1.39 × 10−232.73 × 10−56.41 × 10−116.15 × 10−82.74 × 10−311.24 × 10−232.28 × 10−22.55 × 10−13.08 × 10−312.74 × 10−31
SD2.54 × 10−246.37 × 10−61.48 × 10−111.12 × 10−81.10 × 10−312.26 × 10−244.67 × 10−36.37 × 10−21.06 × 10−318.20 × 10−32
f3Best2.11 × 10−231.76 × 10−69.38 × 10−121.30 × 10−43.62 × 10−271.25 × 10−168.89 × 10−26.91 × 10−11.80 × 10−68.65 × 10−30
Avg1.44 × 10−195.60 × 10−57.91 × 10−102.53 × 10−31.12 × 10−251.50 × 10−141.83 × 10−11.684.31 × 10−51.96 × 10−26
Worst3.00 × 10−183.32 × 10−45.00 × 10−98.72 × 10−31.09 × 10−241.39 × 10−133.89 × 10−13.761.81 × 10−43.20 × 10−25
SD5.54 × 10−197.18 × 10−51.15 × 10−92.35 × 10−32.19 × 10−252.92 × 10−147.34 × 10−28.07 × 10−15.15 × 10−55.87 × 10−26
f4Best1.56 × 10−53.48 × 10−57.08 × 10−93.69 × 10−13.61 × 10−181.14 × 10−44.36 × 10−22.47 × 10−13.61 × 10−183.61 × 10−18
Avg9.20 × 10−35.92 × 10−21.78 × 10−34.062.34 × 10−35.20 × 10−22.102.981.54 × 10−16.46 × 10−2
Worst1.36 × 10−21.72 × 10−11.75 × 10−24.364.92 × 10−23.29 × 10−19.078.896.64 × 10−16.64 × 10−1
SD4.58 × 10−34.96 × 10−24.60 × 10−38.05 × 10−19.36 × 10−31.02 × 10−12.892.342.53 × 10−11.76 × 10−1
f5Best05.15 × 10−67.07 × 10−157.44 × 10−1102.02 × 10−282.78 × 10−15.50 × 10−100
Avg01.07 × 10−21.08 × 10−83.12 × 10−81.35 × 10−295.22 × 10−232.03 × 107.81 × 106.06 × 10−290
Worst01.88 × 10−13.21 × 10−74.29 × 10−72.02 × 10−281.40 × 10−212.32 × 1023.59 × 1028.08 × 10−280
SD03.49 × 10−25.87 × 10−87.83 × 10−85.12 × 10−292.55 × 10−224.22 × 108.36 × 101.60 × 10−280
f6Best000006.17 × 10−320000
Avg1.36 × 10−320001.00 × 10−321.14 × 10−2202.04 × 10−3100
Worst3.00 × 10−310003.00 × 10−313.43 × 10−2101.08 × 10−3000
SD5.76 × 10−320005.48 × 10−326.26 × 10−2203.69 × 10−3100
f7Best9.99 × 10−252.62 × 10−72.23 × 10−102.63 × 10−63.35 × 10−82.84 × 10−123.04 × 10−22.13 × 10−21.39 × 10−81.69 × 10−27
Avg1.87 × 10−156.47 × 10−33.17 × 10−74.14 × 10−32.04 × 10−42.85 × 10−22.48 × 10−11.96 × 10−12.16 × 10−58.08 × 10−20
Worst2.00 × 10−141.03 × 10−14.48 × 10−61.23 × 10−13.27 × 10−31.42 × 10−16.40 × 10−13.87 × 10−11.76 × 10−42.38 × 10−18
SD4.78 × 10−152.35 × 10−21.03 × 10−62.25 × 10−26.17 × 10−44.13 × 10−21.54 × 10−19.36 × 10−23.97 × 10−54.34 × 10−19
f8Best02.51 × 10−86.11 × 10−155.69 × 10−1004.28 × 10−261.94 × 10−36.35 × 10−200
Avg9.44 × 10−242.65 × 10−56.26 × 10−136.37 × 10−81.19 × 10−326.72 × 10−11.05 × 10−13.036.98 × 10−339.04 × 10−33
Worst2.83 × 10−222.07 × 10−43.20 × 10−123.35 × 10−71.23 × 10−324.033.87 × 10−11.56 × 102.47 × 10−321.23 × 10−32
SD5.17 × 10−234.80 × 10−57.56 × 10−131.01 × 10−72.25 × 10−331.538.55 × 10−24.097.01 × 10−335.54 × 10−33
f9Best09.03 × 10−272.21 × 10−291.27 × 10−160004.96 × 10−700
Avg4.50 × 10−338.78 × 10−85.63 × 10−114.34 × 10−121.50 × 10−331.94 × 10−2207.69 × 10−45.00 × 10−342.00 × 10−33
Worst1.50 × 10−322.63 × 10−61.06 × 10−99.75 × 10−111.50 × 10−322.80 × 10−2106.15 × 10−31.50 × 10−321.50 × 10−32
SD6.99 × 10−334.80 × 10−72.19 × 10−101.78 × 10−114.58 × 10−336.78 × 10−2201.31 × 10−32.74 × 10−335.19 × 10−33
f10Best01.88 × 10−71.82 × 10−164.56 × 10−11002.66 × 10−32.22 × 10−200
Avg1.31 × 10−302.01 × 10−48.36 × 10−124.02 × 10−808.71 × 10−253.83 × 10−14.461.31 × 10−311.05 × 10−31
Worst3.16 × 10−292.15 × 10−35.86 × 10−114.08 × 10−702.54 × 10−232.487.80 × 107.89 × 10−317.89 × 10−31
SD5.76 × 10−304.20 × 10−41.36 × 10−118.94 × 10−804.64 × 10−245.88 × 10−11.41 × 102.99 × 10−312.73 × 10−31
f11Best2.95 × 10−322.75 × 10−88.91 × 10−171.03 × 10−102.95 × 10−327.57 × 10−304.37 × 10−54.11 × 10−32.95 × 10−322.95 × 10−32
Avg1.09 × 10−315.68 × 10−52.35 × 10−123.74 × 10−81.08 × 10−319.96 × 10−184.16 × 10−28.53 × 10−21.20 × 10−311.07 × 10−31
Worst2.50 × 10−315.37 × 10−41.49 × 10−112.40 × 10−72.25 × 10−312.99 × 10−162.56 × 10−13.81 × 10−12.25 × 10−312.25 × 10−31
SD9.70 × 10−321.01 × 10−43.67 × 10−126.26 × 10−89.74 × 10−325.46 × 10−176.97 × 10−28.82 × 10−29.92 × 10−329.74 × 10−32
f12Best1.91 × 10−141.75 × 10−51.31 × 10−145.64 × 10−122.10 × 10−146.46 × 10−131.92 × 10−38.53 × 10−11.75 × 10−57.87 × 10−18
Avg8.62 × 10−112.35 × 10−32.75 × 10−85.56 × 10−99.79 × 10−63.36 × 10−24.66 × 10−23.891.20 × 10−35.48 × 10−14
Worst1.16 × 10−91.71 × 10−23.32 × 10−76.55 × 10−82.92 × 10−45.00 × 10−15.06 × 10−11.18 × 105.34 × 10−37.13 × 10−13
SD2.24 × 10−103.80 × 10−37.38 × 10−81.29 × 10−85.33 × 10−51.27 × 10−11.22 × 10−13.011.20 × 10−31.53 × 10−13
f13Best2.74 × 10−115.45 × 10−64.57 × 10−111.91 × 10−61.38 × 10−114.70 × 10−111.54 × 10−37.63 × 10−200
Avg2.03 × 10−59.56 × 10−49.15 × 10−95.60 × 10−52.05 × 10−81.12 × 10−11.43 × 10−24.91 × 10−100
Worst2.12 × 10−45.33 × 10−36.05 × 10−83.01 × 10−43.24 × 10−73.053.66 × 10−22.0100
SD5.02 × 10−51.45 × 10−31.47 × 10−88.92 × 10−56.21 × 10−85.58 × 10−18.44 × 10−34.22 × 10−100
Bold values indicate the best outcomes.
Table 6. Comparison on test cases f14–f27.
F | Metric | EO | MPA | RUN | SMA | DE | PSO | HOA | FPA | MFPA | HFPA
f14Best1.51 × 10−326.13 × 10−84.29 × 10−131.43 × 10−91.51 × 10−329.17 × 10−293.41 × 10−37.09 × 10−31.51 × 10−321.51 × 10−32
Avg3.12 × 10−324.39 × 10−52.15 × 10−69.31 × 10−72.66 × 10−327.48 × 10−174.11 × 10−21.16 × 10−12.62 × 10−323.50 × 10−32
Worst1.60 × 10−312.40 × 10−44.93 × 10−51.32 × 10−51.60 × 10−312.24 × 10−151.02 × 10−14.08 × 10−11.59 × 10−311.59 × 10−31
SD3.85 × 10−326.03 × 10−59.08 × 10−62.56 × 10−63.07 × 10−324.10 × 10−162.95 × 10−29.63 × 10−23.70 × 10−324.60 × 10−32
f15Best9.78 × 10−131.49 × 10−181.30 × 10−131.98 × 10−81.84 × 10−41.10 × 10−67.44 × 10−14.03 × 10−100
Avg2.21 × 10−51.06 × 10−52.47 × 10−43.47 × 10−54.35 × 10−22.65 × 1032.63 × 101.40 × 1021.93 × 10−322.27 × 10−14
Worst2.35 × 10−47.90 × 10−54.06 × 10−33.12 × 10−41.26 × 10−11.43 × 1045.29 × 1023.75 × 1031.97 × 10−315.14 × 10−12
SD5.27 × 10−52.04 × 10−58.37 × 10−46.06 × 10−53.57 × 10−24.22 × 1039.96 × 106.84 × 1024.18 × 10−329.31 × 10−13
f16Best6.16 × 10−326.79 × 10−82.10 × 10−195.10 × 10−26.16 × 10−323.46 × 10−303.67 × 10−51.59 × 10−46.16 × 10−326.16 × 10−32
Avg6.16 × 10−323.36 × 10−56.23 × 10−139.80 × 10−26.16 × 10−321.29 × 10−242.70 × 10−31.63 × 10−16.16 × 10−326.16 × 10−32
Worst6.16 × 10−321.90 × 10−43.59 × 10−122.33 × 10−16.16 × 10−321.59 × 10−238.80 × 10−32.126.16 × 10−326.16 × 10−32
SD04.36 × 10−59.88 × 10−132.90 × 10−203.12 × 10−242.69 × 10−33.96 × 10−100
f17Best3.79 × 10−121.57 × 10−75.75 × 10−153.94 × 10−72.84 × 10−279.15 × 10−211.76 × 10−61.92 × 10−400
Avg7.33 × 10−63.03 × 10−55.54 × 10−75.22 × 10−63.66 × 10−74.14 × 10−55.80 × 10−46.06 × 10−300
Worst1.09 × 10−42.32 × 10−49.99 × 10−71.22 × 10−49.99 × 10−71.22 × 10−47.91 × 10−33.73 × 10−200
SD2.75 × 10−55.73 × 10−54.94 × 10−72.22 × 10−54.65 × 10−75.03 × 10−51.53 × 10−38.08 × 10−300
f18Best4.93 × 10−325.20 × 10−114.52 × 10−181.22 × 10−104.93 × 10−321.23 × 10−312.84 × 10−57.32 × 10−64.93 × 10−324.93 × 10−32
Avg1.33 × 10−24.78 × 10−69.25 × 10−122.38 × 10−86.25 × 10−323.79 × 10−253.88 × 10−32.96 × 10−26.25 × 10−326.41 × 10−32
Worst9.94 × 10−23.73 × 10−51.08 × 10−101.82 × 10−71.23 × 10−315.77 × 10−243.04 × 10−21.24 × 10−11.23 × 10−311.23 × 10−31
SD3.44 × 10−28.19 × 10−62.09 × 10−114.28 × 10−82.73 × 10−321.31 × 10−246.29 × 10−33.62 × 10−22.48 × 10−322.79 × 10−32
f19Best01.13 × 10−101.88 × 10−181.13 × 10−12001.22 × 10−61.09 × 10−600
Avg2.05 × 10−344.21 × 10−71.43 × 10−141.80 × 10−1001.45 × 10−287.66 × 10−51.33 × 10−300
Worst3.08 × 10−332.82 × 10−61.55 × 10−131.03 × 10−901.52 × 10−272.49 × 10−47.62 × 10−300
SD7.82 × 10−347.62 × 10−73.41 × 10−142.31 × 10−1003.97 × 10−286.82 × 10−51.91 × 10−300
f20Best01.26 × 10−91.72 × 10−192.02 × 10−12002.09 × 10−51.37 × 10−300
Avg2.97 × 10−324.53 × 10−62.52 × 10−81.50 × 10−83.93 × 10−322.65 × 10−269.80 × 10−33.97 × 10−22.39 × 10−322.35 × 10−32
Worst1.97 × 10−316.41 × 10−53.83 × 10−72.14 × 10−72.47 × 10−317.89 × 10−251.17 × 10−11.43 × 10−11.97 × 10−311.97 × 10−31
SD5.73 × 10−321.28 × 10−58.06 × 10−84.07 × 10−87.37 × 10−321.44 × 10−252.13 × 10−23.67 × 10−24.77 × 10−324.78 × 10−32
f21Best000001.56 × 10−260000
Avg3.43 × 10−140003.29 × 10−243.83 × 10−1201.45 × 10−200
Worst1.03 × 10−120009.87 × 10−239.46 × 10−1101.12 × 10−100
SD1.88 × 10−130001.80 × 10−231.75 × 10−1102.81 × 10−200
f22Best05.73 × 10−221.60 × 10−182.74 × 10−1108.01 × 10−311.38 × 10−32.53 × 10−200
Avg3.71 × 10−181.59 × 10−44.63 × 10−112.44 × 10−81.65 × 10−313.61 × 10−232.85 × 10−22.312.29 × 10−311.48 × 10−31
Worst1.11 × 10−161.05 × 10−33.68 × 10−103.44 × 10−78.01 × 10−316.96 × 10−221.59 × 10−11.13 × 108.01 × 10−318.01 × 10−31
SD2.03 × 10−172.57 × 10−48.74 × 10−116.72 × 10−83.24 × 10−311.32 × 10−223.38 × 10−23.443.55 × 10−313.02 × 10−31
f23Best01.94 × 10−81.13 × 10−173.11 × 10−11005.67 × 10−43.13 × 10−400
Avg3.42 × 10−317.74 × 10−65.01 × 10−122.56 × 10−801.47 × 10−253.38 × 10−21.56 × 10−15.26 × 10−321.05 × 10−31
Worst3.16 × 10−306.94 × 10−53.11 × 10−113.14 × 10−703.69 × 10−245.07 × 10−19.85 × 10−17.89 × 10−313.16 × 10−30
SD9.65 × 10−311.45 × 10−58.26 × 10−125.91 × 10−806.75 × 10−259.18 × 10−22.25 × 10−12.00 × 10−315.76 × 10−31
f24Best01.02 × 10−213.36 × 10−181.69 × 10−11007.41 × 10−42.20 × 10−200
Avg1.05 × 10−27.27 × 10−32.45 × 10−27.01 × 10−33.51 × 10−31.05 × 10−21.59 × 10−11.692.45 × 10−21.40 × 10−2
Worst1.05 × 10−11.05 × 10−11.05 × 10−11.05 × 10−11.05 × 10−11.05 × 10−19.21 × 10−16.581.05 × 10−11.05 × 10−1
SD3.21 × 10−22.67 × 10−24.53 × 10−22.67 × 10−21.92 × 10−23.21 × 10−22.13 × 10−11.804.53 × 10−23.64 × 10−2
f25Best02.06 × 10−94.33 × 10−181.25 × 10−1101.11 × 10−314.69 × 10−47.49 × 10−300
Avg2.47 × 10−31.89 × 10−31.21 × 10−31.24 × 10−31.45 × 10−47.78 × 10−211.16 × 10−27.73 × 10−29.18 × 10−148.55 × 10−30
Worst7.27 × 10−37.27 × 10−37.27 × 10−37.27 × 10−34.36 × 10−32.27 × 10−193.97 × 10−23.13 × 10−13.22 × 10−131.28 × 10−29
SD3.46 × 10−33.07 × 10−32.75 × 10−32.75 × 10−37.97 × 10−44.14 × 10−201.06 × 10−27.78 × 10−21.22 × 10−143.24 × 10−30
f26Best07.06 × 10−101.11 × 10−181.12 × 10−1001.97 × 10−314.70 × 10−64.06 × 10−400
Avg7.85 × 10−26.12 × 10−63.17 × 10−127.16 × 10−86.57 × 10−322.76 × 10−251.67 × 10−21.33 × 10−12.04 × 10−315.89 × 10−32
Worst1.183.72 × 10−52.65 × 10−116.12 × 10−71.58 × 10−303.90 × 10−242.27 × 10−11.311.58 × 10−301.58 × 10−30
SD2.99 × 10−11.07 × 10−56.39 × 10−121.26 × 10−72.90 × 10−318.29 × 10−254.12 × 10−23.13 × 10−14.73 × 10−312.41 × 10−31
f27Best01.36 × 10−91.57 × 10−196.09 × 10−1209.12 × 10−315.52 × 10−31.01 × 10−300
Avg2.15 × 10−28.90 × 10−61.26 × 10−22.33 × 10−22.54 × 10−21.44 × 10−25.72 × 10−26.87 × 10−23.05 × 10−23.05 × 10−2
Worst5.39 × 10−29.55 × 10−55.39 × 10−25.39 × 10−25.39 × 10−25.39 × 10−21.26 × 10−11.91 × 10−15.39 × 10−25.39 × 10−2
SD2.68 × 10−21.95 × 10−52.32 × 10−22.72 × 10−22.71 × 10−22.42 × 10−22.48 × 10−24.38 × 10−22.72 × 10−22.72 × 10−2
Bold values indicate the best outcomes.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
