Review

Recent Advances in Harris Hawks Optimization: A Comparative Study and Applications

1 Department of Computer and Information Science, Linköping University, 581 83 Linköping, Sweden
2 Faculty of Science, Fayoum University, Fayoum 2933110, Egypt
3 Faculty of Computer Sciences and Informatics, Amman Arab University, Amman 11953, Jordan
4 Sorbonne Center of Artificial Intelligence, Sorbonne University-Abu Dhabi, Abu Dhabi 38044, United Arab Emirates
5 Faculty of Engineering, Helwan University, Cairo 4034572, Egypt
6 Faculty of Science, Menoufia University, Menoufia 6131567, Egypt
7 Department of Computer Science, Faculty of Computers and Information, Kafr El-Sheikh University, Kafr El-Sheikh 6860404, Egypt
8 Computer Engineering Department, Computer and Information Systems College, Umm Al-Qura University, Makkah 21955, Saudi Arabia
9 Faculty of Engineering and Information Technology, University of Technology Sydney, Ultimo, NSW 2007, Australia
* Authors to whom correspondence should be addressed.
Electronics 2022, 11(12), 1919; https://doi.org/10.3390/electronics11121919
Submission received: 3 May 2022 / Revised: 6 June 2022 / Accepted: 10 June 2022 / Published: 20 June 2022
(This article belongs to the Special Issue Big Data Analytics Using Artificial Intelligence)

Abstract
The Harris hawks optimizer is a recent population-based metaheuristic algorithm that simulates the hunting behavior of hawks. This swarm-based optimizer performs the optimization procedure using novel exploration and exploitation strategies organized into multiple search phases. In this review, we focus on the applications and developments of the well-established and robust Harris hawks optimizer (HHO), one of the most popular swarm-based techniques of 2020. Moreover, several experiments were carried out to demonstrate the power and effectiveness of HHO compared with nine other state-of-the-art algorithms using the Congress on Evolutionary Computation benchmark suites (CEC2005 and CEC2017). The review also provides deep insight into possible future directions and ideas worth investigating regarding new variants of the HHO algorithm and its widespread applications.

1. Introduction

The optimization area has witnessed a wide range of applications in solving many new problems due to the fast progress of industrial technologies and artificial intelligence [1,2,3,4,5]. Such rapid progress requires the development of new techniques for tackling hard and challenging problems in a reasonable time. Many proposals and research attempts have been introduced by experienced researchers, falling into two classes: deterministic methods and stochastic methods [6,7,8,9]. The first class needs gradient information and details of the search space. The latter class needs no such information and can handle black-box optimization without knowing the mathematical details of the objective function, relying instead on sampling the surface of the problem. One popular class of these stochastic optimizers comprises swarm-based and evolutionary methods [10,11,12,13]. Table 1 shows a list of metaheuristic algorithms belonging to this class and their algorithmic behavior.
These swarm-based and evolutionary methods seed a random set of solutions as initial guesses and then evolve and improve them until convergence to a high-quality suboptimal or optimal solution [37,38,39,40,41]. The core operators of evolutionary techniques are crossover and mutation, which generate new offspring from parent solutions based on ideas inspired by natural evolution. Swarm-based solvers may also use such operations, but they organize the initial swarm through the interaction of two fundamental mechanisms: scattering agents across the space and then intensifying the search in specific high-potential areas [42,43,44]. Swarm-based approaches are developed and utilized in many areas of machine learning and artificial intelligence according to the cooperative intelligence of self-organized and reorganized coordination, e.g., artificial clusters of randomly generated agents. This way of problem solving is well known in the swarm-intelligence community; the delicate balance it requires between the exploration and exploitation steps is a hard target to achieve [45,46,47].
There are many ways to classify these methods; some researchers classify them based on their source of inspiration [48,49,50]. A more scientific classification, based on the algorithmic behavior of these methods, has been conducted by [51,52,53]. However, this field suffers from some persistent problems. First, many population-based methods offer a new source of inspiration but no high performance or novelty in their mathematical rationale; these are known as questionable metaphor-based methods. For instance, the well-known gray wolf optimizer (GWO) has a structural defect and shows uncertain performance on problems whose optimal solutions are not zero but near-zero points such as epsilon, as discovered by [54,55]. Moreover, the core equations of many new optimizers can be constructed from the particle swarm optimizer (PSO), differential evolution (DE), or other popular swarm-based methods [56]. This means the inspiration- and metaphor-based language has made it easy to develop unoriginal or “pseudonovel” solvers that show “pseudoefficient” efficacy. As the wisdom of new researchers grows, they increasingly reach the conclusion that “performance matters”, not the source of inspiration. The Harris hawks optimizer (HHO) was an attempt to reach not only better performance but also low-cost and efficient operators within a new stochastic optimizer.
Another classification that exists in the literature distinguishes each method by its source of inspiration. Based on this system of classification, metaheuristic algorithms (MAs) can be classified into five different categories: Evolutionary Algorithms, Swarm Intelligence, Physics-based algorithms, Human-based algorithms, and Sport-based algorithms [57]. The first category, Evolutionary Algorithms (EAs), refers to algorithms that are inspired by nature and simulate natural evolutionary mechanisms such as mutation, crossover, selection, elitism, and reproduction. Examples of these algorithms are the Genetic Algorithm (GA) by [14], Genetic Programming (GP) by [58], Differential Evolution (DE) by [16], Evolutionary Programming (EP) by [59], Evolution Strategy (ES) by [60], Biogeography-Based Optimizer (BBO) by [24], and Backtracking Search Algorithm (BSA) by [61]. The second category is Swarm-Intelligence (SI)-based algorithms, which are inspired by the social behavior of swarms, birds, insects, fish, and animals. The top three most popular examples of SI algorithms are Particle Swarm Optimization (PSO) by [19], Ant Colony Optimization (ACO) by [20], and the Artificial Bee Colony (ABC) Algorithm by [62]. Some other SI-based algorithms that have their place in the literature regardless of their performance and originality include the Cuckoo Search Algorithm (CS) by [25], Firefly Algorithm (FA) by [63], COOT bird [64], Krill Herd (KH) by [31], Cat Swarm Optimization (CSO) by [65], Bat Algorithm (BA) by [66], Symbiotic Organisms Search (SOS) [67], Grey Wolf Optimizer (GWO) by [32], Moth–Flame Optimization (MFO) Algorithm, reviewed in [68,69], Virus Colony Search (VCS) [70], Whale Optimization Algorithm (WOA), reviewed in [71,72], Grasshopper Optimization Algorithm (GOA) by [73], Salp Swarm Algorithm by [74,75], Crow Search Algorithm (CSA), reviewed in [76], Symbiotic Organisms Search (SOS) by [77], Reptile Search Algorithm (RSA) by [78], Butterfly Optimization Algorithm (BOA) by [79], Remora Optimization Algorithm (ROA) [80], Wild Horse Optimizer (WHO) [81], Seagull Optimization Algorithm (SOA) by [82], and Ant Lion Optimizer (ALO), reviewed in [83]. The third category is Physics-based algorithms, which refers to algorithms inspired by chemical rules or physical phenomena. Some examples of this class are Simulated Annealing (SA) by [84], Gravitational Search Algorithm (GSA) by [27], Big-Bang Big-Crunch (BBBC) by [85], Lightning Search Algorithm (LSA) by [86], Electromagnetic Field Optimization (EFO) by [87], Thermal Exchange Optimization (TEO) by [88], Vortex Search Algorithm (VSA) by [89], Electrosearch Algorithm (ESA) by [90], Atom Search Optimization (ASO) by [91], Chemical Reaction Optimization by [92], and Henry Gas Solubility Optimization (HGSO) by [34]. The fourth class is Human-based algorithms (HA), which refers to algorithms inspired by human behaviors, such as Harmony Search (HS) by [21], Teaching–Learning-Based Algorithm (TLBO) by [29], the Ideology Algorithm by [93], Human Mental Search (HMS) by [94], Brain Storm Optimization (BSO) by [95], Social Emotional Optimization Algorithm (ESOA) by [96], Socio-Evolution and Learning Optimization (SELO) by [97], and Human Group Formation (HGF) by [98]. The last category is Sport-based algorithms, which simulate sport activities and games.
Some interesting examples include League Championship Algorithm by [99], Golden Ball Algorithm by [100], World Cup Optimization by [101], Arithmetic Optimization Algorithm (AOA) by [18], and others [81,102,103].
The Harris hawks optimization is a recently developed algorithm that simulates the special hunting behavior of Harris hawks known as the “seven kills”. The HHO algorithm [33] has some unique features compared with other popular swarm-based optimization methods. The first is that this optimizer utilizes a time-varying rule that evolves over the iterations of the method during exploration and exploitation. Such a way of shifting between exploration and exploitation propensities keeps the optimizer flexible when it faces an undesirable difficulty in the feature space. Another advantage is that the algorithm has a progressive trend during the convergence process when shifting from the initial diversification/exploration phase to the intensification/exploitation core. The quality of results in HHO is relatively high compared with other popular methods, and this feature has also supported the widespread application of this solver. Moreover, the exploitation phase of this method is compelling, greedily keeping the best solutions explored so far and ignoring low-quality solutions obtained up to that iteration. We review different applications of this optimizer in the next parts of this survey to see how its features make it a fitting choice for real-world cases.
Like all metaheuristic algorithms, HHO has both advantages and disadvantages, with the former outnumbering the latter. Its advantages can be listed as follows:
  • Good convergence speed.
  • Powerful neighborhood search characteristic.
  • Good balance between exploration and exploitation.
  • Suitable for many kinds of problems.
  • Easy to implement.
  • Adaptability, scalability, flexibility, and robustness.
The disadvantages of HHO, as with all other algorithms, are that it may get stuck in local optima and that there is no theoretical framework for studying its convergence.
This paper is organized as follows. Section 2 describes the review methodology. Section 3 presents the original procedure of the studied HHO. Section 4 reports the experimental results. Section 5 presents the variants of the HHO, and Section 6 presents its applications; a discussion, the conclusion, and future work follow in the final sections.

2. Review Methodology

The aim of this study is to present a comprehensive review of all aspects of HHO and of how researchers and scholars have been encouraged and motivated to use it in various disciplines. Since this algorithm was proposed, it has received huge attention from scholars all over the world. According to Google Scholar (https://scholar.google.com/scholar?cites=16912266037349375725&as_sdt=2005&sciodt=0,5&hl=en (accessed on 31 December 2021)), it has been cited more than 1231 times. Moreover, the original study has been selected as a hot paper and is one of the most highly ranked studies in both Scopus (https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85063421586&origin=recordpage (accessed on 31 December 2021)) and Web of Science (https://www.webofscience.com/wos/woscc/full-record/WOS:000469154500064 (accessed on 31 December 2021)). First, a preliminary study was conducted; after that, we searched the studies that cited the original HHO paper to build the list of keywords used for the search. Second, skimming and scanning methods were used to select relevant studies. Third, a checking and screening process was performed to extract data. Finally, we sorted the data and classified the ideas (papers).

3. Harris Hawks Optimization (HHO)

The Harris hawks optimization is a recently introduced population-based optimization method suggested by Heidari et al. [33], in which the authors simulate the cooperative behavior of Harris hawks in nature. The hawks surprise their prey using a combination of techniques (tracing, encircling, approaching, and attacking). The HHO pseudocode can be found in Algorithm 1, and the logical diagram of the HHO is given in Figure 1. The authors simulated the hawks' behavior in three steps.
(i) The first step (exploration phase) can be formulated as follows:
$$X(t+1)=\begin{cases}X_{rand}(t)-r_1\left|X_{rand}(t)-2r_2X(t)\right|, & q\ge 0.5\\ \left(X_{rabbit}(t)-X_m(t)\right)-r_3\left(LB+r_4(UB-LB)\right), & q<0.5\end{cases}\tag{1}$$
where $X(t)$ and $X(t+1)$ refer to the hawk's location at the current and the next iteration, respectively; $r_1$, $r_2$, $r_3$, $r_4$, and $q$ are random numbers in the interval $[0,1]$; $X_{rabbit}(t)$ and $X_{rand}(t)$ refer to the rabbit's location (best position) and a randomly selected hawk's location, respectively; and $X_m(t)$ is calculated from Equation (2).
$$X_m(t)=\frac{1}{N}\sum_{i=1}^{N}X_i(t)\tag{2}$$
where $X_i(t)$ refers to the position of each hawk and $N$ is the total number of hawks.
(ii) Transition from exploration to exploitation
$$E=2E_0\left(1-\frac{t}{T}\right)\tag{3}$$
where $E_0$ and $E$ are the initial energy and the escaping energy, respectively, and $T$ is the maximum number of iterations. If $|E|\ge 1$, exploration occurs; otherwise, exploitation happens. The exploitation step comprises four scenarios, selected according to $|E|$ and a random number $r\in[0,1]$ denoting the chance of the prey successfully escaping ($r<0.5$) or not ($r\ge 0.5$) before the surprise pounce:
Soft besiege: occurs if $r\ge 0.5$ and $|E|\ge 0.5$; the new position is obtained from Equations (4) and (5):
$$X(t+1)=\Delta X(t)-E\left|J\,X_{rabbit}(t)-X(t)\right|\tag{4}$$
$$\Delta X(t)=X_{rabbit}(t)-X(t)\tag{5}$$
where $\Delta X(t)$ refers to the difference between the location of the prey and the current hawk, and $J=2(1-r_5)$ is the jump strength, with $r_5\in[0,1]$.
Hard besiege: occurs if $r\ge 0.5$ and $|E|<0.5$. This scenario is formulated as:
$$X(t+1)=X_{rabbit}(t)-E\left|\Delta X(t)\right|\tag{6}$$
Advanced rapid dives while soft surround: occur if $r<0.5$ and $|E|\ge 0.5$. The next hawk move is obtained from Equations (7)–(10).
$$Y=X_{rabbit}(t)-E\left|J\,X_{rabbit}(t)-X(t)\right|\tag{7}$$
$$Z=Y+S\times LF(D)\tag{8}$$
where $S$ is a random vector of size $1\times D$ and $D$ refers to the problem dimension. $LF$ is the Lévy flight function, obtained from Equation (9):
$$LF(x)=\frac{u\times\sigma}{|\nu|^{1/\beta}},\qquad \sigma=\left(\frac{\Gamma(1+\beta)\times\sin\left(\frac{\pi\beta}{2}\right)}{\Gamma\left(\frac{1+\beta}{2}\right)\times\beta\times 2^{\frac{\beta-1}{2}}}\right)^{1/\beta}\tag{9}$$
where $\beta$ is fixed at 1.5 and $u$ and $\nu$ are random numbers in $(0,1)$. The final position update for this scenario is given by Equation (10):
$$X(t+1)=\begin{cases}Y, & F(Y)<F(X(t))\\ Z, & F(Z)<F(X(t))\end{cases}\tag{10}$$
Advanced rapid dives while hard surround: occur if $r<0.5$ and $|E|<0.5$. This behavior is formulated as follows:
$$X(t+1)=\begin{cases}Y, & F(Y)<F(X(t))\\ Z, & F(Z)<F(X(t))\end{cases}\tag{11}$$
where $Y$ and $Z$ are calculated from Equations (12) and (13):
$$Y=X_{rabbit}(t)-E\left|J\,X_{rabbit}(t)-X_m(t)\right|\tag{12}$$
$$Z=Y+S\times LF(D)\tag{13}$$
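Equation (9) follows the standard Mantegna construction of a Lévy step. A minimal Python sketch is given below; the function name is ours and, following common implementations, $u$ is drawn from $N(0,\sigma^2)$ and $\nu$ from $N(0,1)$:

```python
import numpy as np
from math import gamma, sin, pi

def levy_flight(dim, beta=1.5):
    """Levy-flight step LF(D) from Equation (9), with beta fixed at 1.5."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma, dim)   # u ~ N(0, sigma^2)
    v = np.random.normal(0.0, 1.0, dim)     # v ~ N(0, 1)
    return u / np.abs(v) ** (1 / beta)
```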
Algorithm 1 Harris Hawks Optimization Algorithm.
  • Initialize the parameters: population size N, maximum number of iterations T, LB, UB, and Dim
  • Initialize a random population x_i, i = 1, 2, …, N
  • while (t ≤ T) do
  •     Compute the fitness function for each hawk x_i
  •     X_rabbit = the best search agent
  •     for each hawk (x_i) do
  •         Update the initial energy E_0, the jump strength J, and the escaping energy E (Equation (3))
  •         if (|E| ≥ 1) then
  •             Update the hawk position using Equation (1) (exploration)
  •         end if
  •         if (|E| < 1) then
  •             if (r ≥ 0.5 and |E| ≥ 0.5) then
  •                 Update the hawk position using Equation (4) (soft besiege)
  •             else if (r ≥ 0.5 and |E| < 0.5) then
  •                 Update the hawk position using Equation (6) (hard besiege)
  •             else if (r < 0.5 and |E| ≥ 0.5) then
  •                 Update the hawk position using Equation (10) (soft besiege with rapid dives)
  •             else
  •                 Update the hawk position using Equation (11) (hard besiege with rapid dives)
  •             end if
  •         end if
  •     end for
  • end while
  • Return X_rabbit
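To make Algorithm 1 concrete, the following is a minimal NumPy sketch of the full update loop for a box-constrained minimization problem. It is a sketch under our own naming conventions (`hho`, `fobj`, `n_hawks`), not the authors' original Matlab implementation, and it inlines the Mantegna Lévy step from the previous sketch:

```python
import numpy as np
from math import gamma, sin, pi

def levy(dim, beta=1.5):
    # Mantegna-style Levy step (Equation (9)); see the earlier sketch
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return np.random.normal(0, sigma, dim) / np.abs(np.random.normal(0, 1, dim)) ** (1 / beta)

def hho(fobj, lb, ub, dim, n_hawks=30, max_iter=500):
    """Sketch of Algorithm 1: returns the best position and fitness found."""
    X = lb + np.random.rand(n_hawks, dim) * (ub - lb)   # random initial population
    x_rabbit, f_rabbit = np.zeros(dim), np.inf
    for t in range(max_iter):
        # evaluate hawks and update the rabbit (best solution so far)
        for i in range(n_hawks):
            X[i] = np.clip(X[i], lb, ub)
            f = fobj(X[i])
            if f < f_rabbit:
                f_rabbit, x_rabbit = f, X[i].copy()
        for i in range(n_hawks):
            E0 = 2 * np.random.rand() - 1          # initial energy in (-1, 1)
            E = 2 * E0 * (1 - t / max_iter)        # escaping energy, Equation (3)
            J = 2 * (1 - np.random.rand())         # jump strength
            r = np.random.rand()
            if abs(E) >= 1:                        # exploration, Equation (1)
                if np.random.rand() >= 0.5:
                    x_rand = X[np.random.randint(n_hawks)]
                    X[i] = x_rand - np.random.rand() * np.abs(
                        x_rand - 2 * np.random.rand() * X[i])
                else:
                    X[i] = (x_rabbit - X.mean(axis=0)) - np.random.rand() * (
                        lb + np.random.rand() * (ub - lb))
            elif r >= 0.5 and abs(E) >= 0.5:       # soft besiege, Equations (4)-(5)
                X[i] = (x_rabbit - X[i]) - E * np.abs(J * x_rabbit - X[i])
            elif r >= 0.5:                         # hard besiege, Equation (6)
                X[i] = x_rabbit - E * np.abs(x_rabbit - X[i])
            else:                                  # rapid dives, Equations (7)-(13)
                ref = X[i] if abs(E) >= 0.5 else X.mean(axis=0)
                Y = x_rabbit - E * np.abs(J * x_rabbit - ref)
                Z = Y + np.random.rand(dim) * levy(dim)
                if fobj(np.clip(Y, lb, ub)) < fobj(X[i]):
                    X[i] = Y
                elif fobj(np.clip(Z, lb, ub)) < fobj(X[i]):
                    X[i] = Z
    # final evaluation of the last generation
    for i in range(n_hawks):
        f = fobj(np.clip(X[i], lb, ub))
        if f < f_rabbit:
            f_rabbit, x_rabbit = f, np.clip(X[i], lb, ub)
    return x_rabbit, f_rabbit

# Example: 30-dimensional sphere function on [-100, 100]
best_x, best_f = hho(lambda x: float(np.sum(x ** 2)), -100.0, 100.0, dim=30)
print(best_f)
```

The greedy Y/Z comparison appears only in the two dive scenarios, matching Equations (10) and (11); the other updates replace the hawk position unconditionally.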

4. Experimental Results

4.1. Parameter Setting

In all these experiments, we used the same parameter settings, given in Table 2: the number of dimensions, the number of individuals, and the maximum number of iterations. The experiments were carried out using Matlab 2021b on an Intel(R) Core(TM) i5-5200 machine with 6 GB of RAM. The settings of the compared algorithms in both CEC2005 and CEC2017 can be found in Table 3.

4.2. Experimental Results of CEC 2005

CEC 2005 is a classical benchmark suite that contains many types of mathematical functions (unimodal, multimodal, hybrid, and composite). Here, we compared HHO with the Genetic Algorithm (GA) [104], Covariance Matrix Adaptation Evolution Strategy (CMAES) [105], Linear population size reduction Success-History Adaptation for Differential Evolution (L-SHADE) [106], Ensemble Sinusoidal incorporated with L-SHADE (LSHADE-EpSin) [107], Sine Cosine Algorithm (SCA) [108], Grasshopper Optimization Algorithm (GOA) [73], Whale Optimization Algorithm (WOA) [48], Thermal Exchange Optimization (TEO) [88], and Artificial Ecosystem Optimization (AEO) [109].
Table 4 shows the results of these algorithms. It is easy to notice that the HHO performs well compared with the others. Moreover, Figure 2 and Figure 3 show the convergence curves of these algorithms, whereas Figure 4 and Figure 5 show their boxplots. The Wilcoxon Rank Sum (WRS) test was carried out at a 5% significance level between the HHO and the other algorithms to verify its superiority. The results of the Wilcoxon test can be found in Table 5.
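For readers wishing to reproduce this kind of comparison, the rank-sum test is available in SciPy. A minimal sketch follows, assuming two arrays of per-run best fitness values; the synthetic numbers below are placeholders, not results from this paper:

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(42)
hho_runs = rng.normal(1.0, 0.10, 30)   # placeholder: 30 per-run best fitness values for HHO
ga_runs = rng.normal(1.2, 0.15, 30)    # placeholder: 30 per-run best fitness values for GA

stat, p = ranksums(hho_runs, ga_runs)
print(f"statistic={stat:.3f}, p-value={p:.4g}, significant at 5%: {p < 0.05}")
```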

4.3. Experimental Results of CEC 2017

To demonstrate the power and effectiveness of the HHO, we performed a comparison between HHO and a number of metaheuristic algorithms using CEC 2017. These algorithms are the Genetic Algorithm (GA) [104], Linear population size reduction Success-History Adaptation for Differential Evolution (L-SHADE) [106], Ensemble Sinusoidal incorporated with L-SHADE (LSHADE-EpSin) [107], Sine Cosine Algorithm (SCA) [108], Grasshopper Optimization Algorithm (GOA) [73], Whale Optimization Algorithm (WOA) [48], Thermal Exchange Optimization (TEO) [88], Artificial Ecosystem Optimization (AEO) [109], and Henry Gas Solubility Optimization (HGSO) [34].
The results of this comparison are shown in Table 6 in terms of the average and standard deviation of the fitness value. From this table, it is clear that the HHO performs well. Moreover, Figure 6, Figure 7 and Figure 8 show the convergence curves of the HHO compared with the above-mentioned algorithms, from which we conclude that the HHO has good convergence speed. Moreover, Figure 9, Figure 10 and Figure 11 show the boxplots of the HHO compared with the other algorithms.

5. HHO Variants

5.1. Enhancement to HHO

In the literature, there exist many studies that enhanced the HHO using various mathematical operators. These studies are summarized in Table 7.

5.1.1. Enhanced HHO

Li et al. [121] tried to speed up HHO convergence by introducing an enhanced HHO version using two strategies: (1) enhancing HHO exploration using opposition-based learning and a logarithmic spiral and (2) fusing the modified Rosenbrock Method (RM) into HHO to improve convergence accuracy and enhance its local search capability. The authors tested their algorithm, called RLHHO, using the 30 IEEE CEC 2014 functions and 23 traditional benchmark functions and compared their results with eight standard metaheuristic algorithms and six advanced ones. In [122], Ariui et al. developed an enhanced version of the HHO by hybridizing the JOS operator with the HHO. JOS consists of two other operators, Dynamic Opposite (DO) and Selective Leading Opposition (SLO), which are assumed to jointly strengthen the HHO's exploration and exploitation capabilities.

5.1.2. Binary HHO

Many binary variants of the HHO have been introduced. For example, Too et al. [110] introduced a binary version of the HHO called BHHO using two transfer functions: S-shaped and V-shaped. Furthermore, they proposed another version called Quadratic Binary HHO (QBHHO). They compared their versions with other binary MAs, namely binary differential evolution, the binary flower pollination algorithm, the genetic algorithm, and the binary salp swarm algorithm. Similar work was conducted by Thaher et al. [123], in which the authors applied their algorithm to feature selection. Moreover, Thaher and Arman [124] developed another version of the HHO called enhanced binary HHO (EBHHO) by using a multiswarm technique, in which the population is divided into three groups. Each group has a leader, and the fittest leader is able to guide more agents. Furthermore, they used three classifiers: K-nearest neighbors (KNN), Linear Discriminant Analysis (LDA), and Decision Tree (DT). Likewise, in [125], Chellal and Benmessahed developed a binary version of the HHO for the accurate detection of protein complexes.
Dokeroglu et al. [126] developed a binary version of a multiobjective HHO to solve classification problems. They introduced novel discrete besiege (exploitation) and perching (exploration) operators and used four machine learning techniques, Support Vector Machines (SVM), Decision Trees, Logistic Regression, and Extreme Learning Machines, on a COVID-19 dataset. Moreover, Chantar et al. [127] developed a binary HHO version using a time-varying scheme. The new version, called BHHO-TVS, is used to solve classification problems; the authors used 18 different datasets to demonstrate its significant performance.
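The common machinery behind these binary variants is a transfer function that maps the continuous HHO step to a bit-flip probability. A minimal sketch of one representative S-shaped/V-shaped pair follows; these specific formulas are typical choices from the transfer-function literature, not necessarily the exact ones in [110]:

```python
import numpy as np

def s_shaped(x):
    """S-shaped transfer: maps a continuous value to a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def v_shaped(x):
    """One common V-shaped transfer; variants based on erf or x/sqrt(1+x^2) also appear."""
    return np.abs(np.tanh(x))

def binarize(x_continuous, transfer=s_shaped, rng=np.random.default_rng()):
    """Turn a continuous hawk position into a 0/1 vector (e.g., a feature mask)."""
    prob = transfer(np.asarray(x_continuous))
    return (rng.random(prob.shape) < prob).astype(int)
```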

5.1.3. Opposite HHO

Hans et al. [111] proposed a new version of the HHO, called OHHO, based on the opposition-based learning (OBL) strategy and applied it to feature selection in breast cancer classification. Fan et al. [113] proposed a novel HHO version called NCOHHO, which improves the HHO through two mechanisms: the neighborhood centroid and opposition-based learning. In NCOHHO, the neighborhood centroid serves as a reference point for generating the opposite particle. Likewise, Gupta et al. [128] applied four strategies to the HHO, namely a new nonlinear parameter for the prey energy, greedy selection, different rapid dives, and OBL. For other studies that used optimization methods to solve parameter extraction problems, refer to [129,130,131].
Jiao et al. [112] introduced a novel HHO called EHHO, which employs OBL and Orthogonal Learning (OL).
Likewise, another improved HHO approach, called IHHO, was proposed by Song et al. [114], in which two techniques were employed: (1) quasi-oppositional learning and (2) chaos theory.
Amer et al. [132] developed another version of the HHO called Elite Learning HHO (ELHHO). They used elite opposition-based learning to improve the quality of the exploration phase.
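The OBL idea shared by these variants reflects each candidate inside the search bounds and keeps the fitter of the pair. A minimal sketch of the basic scheme follows (the elite and quasi-oppositional variants above modify the reflection point; the function names are ours):

```python
import numpy as np

def opposite_population(X, lb, ub):
    """Basic opposition-based learning: reflect each solution inside [lb, ub]."""
    return lb + ub - X

def obl_step(fobj, X, lb, ub):
    """Keep the fitter of each solution and its opposite (a common OBL pattern)."""
    X_opp = opposite_population(X, lb, ub)
    fit = np.apply_along_axis(fobj, 1, X)
    fit_opp = np.apply_along_axis(fobj, 1, X_opp)
    keep_original = fit <= fit_opp
    return np.where(keep_original[:, None], X, X_opp)
```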

5.1.4. Modified HHO

Akdag et al. [133] developed a modified HHO version using seven different random distribution functions in order to show how the stochastic search affects HHO performance. Moreover, Zhang et al. [134] tried to balance the exploration and exploitation phases by focusing on the prey's escaping energy (E), introducing six different strategies to update E.
Yousri et al. [135] proposed an improved version of the HHO that modifies the exploration strategy by using Lévy flight instead of depending on the prey position. Moreover, they updated the hawks' positions based on three randomly chosen hawks instead of the random update technique. Likewise, Zhao et al. [136] developed another modified version of the HHO using a chaotic method called the Singer mechanism. They also used the Lévy flight mechanism to enhance HHO convergence.
Hussain et al. [115] developed a long-term memory HHO algorithm (LMHHO) that shares information about multiple promising regions of the search space. With such information, the HHO gains more exploration capability. The authors used CEC 2017 and classical functions to validate their algorithm.
Liu et al. [137] proposed an improved HHO algorithm that combines the Nelder–Mead simplex algorithm with the crisscross algorithm's crossover technique, known as the horizontal and vertical crossover mechanism. The authors applied their algorithm, called CCNMHHO, to photovoltaic parameter estimation.
Rizk-Allah and Hassanien [138] developed a hybrid HHO algorithm which combined Nelder–Mead with the Harris algorithm. They validated their algorithm using six differential equations and four engineering differential equations.
Youssri et al. [139] developed a modified version of the HHO, called Fractional-Order modified HHO (FMHHO), which uses the fractional calculus (FOC) memory concept. They tested FMHHO using 23 benchmark functions in addition to the IEEE CEC 2017 ones and applied it to the modeling of proton exchange membrane fuel cells. Moreover, Irfan et al. [140] proposed a modified HHO (MHHO) using crowding distance and roulette wheel selection, tested on the IEEE 8- and 15-bus systems.
Ge et al. [141] suggested another improved HHO version using the predator–rabbit distribution method.
Singh et al. [142] used opposition-based learning (OBL) to enhance the HHO. They applied the novel algorithm, called OHHO, to data clustering and tested it using 10 benchmark datasets.

5.1.5. Improved HHO

Kardani et al. [143] introduced another improved version of the HHO combined with the extreme learning machine (ELM). Their novel algorithm, called ELM-IHHO, tries to overcome the limitations of the HHO by using a mutation mechanism. The authors applied ELM-IHHO to the prediction of light carbon permeability and compared their results with ELM-based algorithms such as PSO, GA, and SMA. Moreover, Guo et al. [144] used a random unscented sigma mutation strategy to improve the HHO. Their novel HHO version uses quasi-reflection learning and quasi-opposite learning strategies to enhance generation diversity. They also implemented a logarithmic nonlinear convergence factor to maintain a good balance between local and global searches.
Liu [145] developed an improved version of the HHO (IHHO), in which a new search process is added to improve candidate solution quality, and applied it to the job-shop scheduling problem. Duan and Liu [146] introduced the golden sine strategy to improve the HHO; the novel algorithm enhances population diversity and improves performance.
Hu et al. [147] developed another variant of the HHO, also called IHHO, using two techniques: (1) adding velocity to the HHO, borrowed from the Particle Swarm Optimization algorithm, and (2) using the crossover scheme of the artificial tree algorithm [148]. They used 23 functions to test the IHHO, compared the results with 11 metaheuristic algorithms, and applied it to stock market prediction.
Moreover, Selim et al. [149] tried to enhance the HHO by returning hawks to the rabbit position instead of returning them to the variables' maximum and minimum limits; they also developed a multiobjective version. In addition, a novel search mechanism was proposed by Sihwail et al. [150] to improve HHO performance using mutation, neighborhood search, and rollback techniques.
Another enhanced HHO version was proposed in [116], in which the authors improved the HHO using three techniques: (1) chaos theory, (2) a multipopulation topological structure, and (3) the differential evolution operators of mutation and crossover. The authors applied their algorithm, known as CMDHHO, to image applications.
In order to strike the right balance between exploitation and exploration in the HHO, Song et al. [151] developed an enhanced HHO version called GCHHO, in which two techniques were employed, namely Gaussian mutation and the Cuckoo Search dimension decision technique. To test GCHHO, the CEC 2017 functions were used in addition to three engineering problems, and the authors compared GCHHO with classical HHO, WOA, MFO, BA, SCA, FA, and PSO.
Moreover, Yin et al. [152] tried to prevent the HHO from falling into local optima by developing an improved version called NOL-HHO, which uses a nonlinear control parameter and a random opposition-based learning strategy.
Ridha et al. [153] developed a boosted HHO (BHHO) algorithm that employs the random exploration strategy of the Flower Pollination Algorithm (FPA) and the mutation strategy of differential evolution. Wei et al. [120] developed another improved HHO approach that uses the Gaussian barebone (GB) mechanism; they tested their algorithm, called GBHHO, using the CEC 2014 problems. Zhang et al. [154] used an adaptive cooperative foraging technique and a dispersed foraging strategy to improve the HHO; their algorithm, known as ADHHO, was tested using the CEC 2014 benchmark functions.
A vibrational HHO (VHHO) was proposed by Shao et al. [119] to prevent the HHO particles from converging around local optima by embedding SVM into the HHO and using frequent mutation. VHHO was compared with SCA, PSO, and classical HHO.

5.1.6. Chaotic HHO

Many chaotic HHO variants have been proposed. Menesy et al. [155] proposed a chaotic HHO algorithm (CHHO) using ten chaotic functions. They compared their algorithm with conventional HHO, GWO, CS-EO, and SSO and claimed that the experimental results show the superiority of CHHO over the other algorithms. Likewise, Chen et al. [156] developed a new version termed EHHO, in which a chaotic local search method is used in addition to OBL techniques; statistical results show that EHHO achieved better results than its competitors, and the authors applied it to identifying photovoltaic cell parameters. Moreover, Gao et al. [157] used the tent map with the HHO.
Dhawale et al. [158] developed an improved chaotic HHO (CHHO). They tested their algorithm using 23 benchmark functions and compared it with SSA, DE, PSO, GWO, MVO, MFO, SCA, CS, TSA, GA, MMA, ACO, and HS, arguing that the CHHO outperformed all the mentioned algorithms.
Basha et al. [159] developed a chaotic HHO variant using quasi-reflection learning. They applied it to enhance CNN design for classifying different brain tumor grades in magnetic resonance imaging, testing their model using 10 benchmark functions and then two datasets. Likewise, Hussien and Amin [160] developed a novel HHO version based on chaotic local search, opposition-based learning, and a self-adaptation mechanism. They evaluated their model using IEEE CEC 2017 and applied the novel algorithm, called m-HHO, to feature selection.
Dehkordi et al. [161] introduced a nonlinear chaotic HHO algorithm. The new algorithm, known as NCHHO, was applied to solve optimization problems in the Internet of Vehicles (IoV).
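The common thread of these chaotic variants is replacing uniform random draws with a deterministic chaotic sequence. A minimal sketch using the logistic map, one typical choice among the maps cited above, is given below (function and parameter names are ours):

```python
def logistic_map(x, r=4.0):
    """Logistic map on (0, 1): a standard chaotic generator used in CHHO-style variants."""
    return r * x * (1.0 - x)

def chaotic_sequence(length, x0=0.7):
    """Chaotic values that can stand in for the r1..r5 draws in the HHO update rules."""
    seq, x = [], x0
    for _ in range(length):
        x = logistic_map(x)
        seq.append(x)
    return seq

print(chaotic_sequence(5))  # deterministic, but highly sensitive to the seed x0
```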

5.1.7. Dynamic HHO with Mutation Mechanism

Jia et al. [118] developed a dynamic HHO algorithm with a mutation strategy. Instead of decreasing the escaping energy parameter E from 2 to 0, they introduced a dynamic control parameter to avoid local optima. The authors claimed that this dynamic control prevents the solution from getting stuck in local optima, since E in the original algorithm cannot take values greater than 1 in the second half of the iterations.

5.1.8. Other HHO Variants

An adaptive HHO technique called AHHO has been proposed by Wunnava et al. [117]. The authors used a mutation strategy to force the escape energy into the interval [0, 2]. The AHHO was tested using 23 classical functions and 30 functions from CEC 2014.
To strengthen HHO performance, two strategies from Cuckoo Search were introduced into the HHO by Song et al. [151]: the dimension decision strategy and Gaussian mutation. They tested their algorithm, GCHHO, using the 30 functions of IEEE CEC 2017.
Jiao et al. [162] proposed a multistrategy search HHO using the Least Squares Support Vector Machine (LSSVM). They used the Gauss chaotic method for initialization and employed a neighborhood perturbation mechanism, a variable spiral search strategy, and adaptive weights. They used 23 functions in addition to the CEC 2017 test suite to demonstrate the effectiveness of their algorithm.
Zhong and Li [163] developed a hybrid algorithm called the Comprehensive Learning Harris Hawks Equilibrium Optimizer (CLHHEO). The authors used three operators: the equilibrium optimizer operator, comprehensive learning, and a terminal replacement mechanism. The EO operators enhance the HHO's exploration capacity; comprehensive learning shares agent knowledge to improve convergence; and the terminal replacement technique prevents the CLHHEO from stagnating in local optima. The authors compared the CLHHEO with CLPSO, PSO, GWO, BBBC, WOA, DA, SSA, AOA, SOA, and classical HHO using 15 benchmark functions and 10 constrained problems.
Abd Elaziz et al. [164] developed a multileader HHO based on Differential Evolution (MLHHDE). They introduced a memory structure that lets hawks learn from the global best positions and the best historical ones, and DE is employed to strengthen the exploration phase. They used the CEC 2017 benchmark functions.
Bujok [165] developed an advanced HHO algorithm that archives old solutions. He compared his algorithm using 22 real-world problems from CEC 2011.
Al-Batar et al. [166] enhanced the exploration phase of the original HHO by introducing the survival-of-the-fittest evolutionary principle, which helps smooth the transition from exploration to exploitation. The authors used three selection strategies, namely the proportional, linear rank-based, and tournament methods. They tested their algorithms using 23 mathematical functions and 3 constrained problems.
Qu et al. [167] tried to improve the HHO by employing Variable Neighborhood Learning (VNL) to balance exploration and exploitation. They also used the F-score to narrow down the selection range and mutation to increase diversity.
Nandi and Kamboj [168] combined the grey wolf (Canis lupus) optimizer with the HHO. The novel algorithm, called hHHO-GWO, was tested on CEC 2005, CEC-BC-2017, and 11 different engineering problems.
Gölcük and Ozsoydan [169] developed a hybrid algorithm that combines Teaching–Learning-Based Optimization (TLBO) with the HHO. The new algorithm (ITLHHO) is designed to strike a proper balance between exploitation and exploration. The authors tested it using 33 benchmark functions (CEC 2006 and CEC 2009) and 10 multidisciplinary engineering problems.
Yu et al. [170] proposed a modified HHO called compact HHO (cHHO) in which only one hawk is used to search for the optimal solution instead of many hawks.

5.2. Hybrid HHO

The HHO has been hybridized with many other algorithms, as shown in Table 8. Wang et al. [171,172] presented a hybrid algorithm called the Improved Hybrid AOHHO (IHAOHHO), which combines the HHO and the Aquila Optimizer [173]. They used the opposition-based learning strategy and Representative-based Hunting (RH) to improve both the exploitation and exploration phases. They tested their algorithm using 23 benchmark functions and compared their results with classical HHO, AO, and five other state-of-the-art algorithms.
In order to improve HHO performance, the authors in [174] used SSA as a local search within the HHO. They argued that their algorithm, called MHHO, performs better than the conventional HHO. Attiya et al. [175] tried to enhance HHO convergence and solution quality by employing Simulated Annealing (SA) as a local search strategy. Their algorithm, called HHOSA, was compared with PSO, FA, SSA, MFO, and HHO, and the authors applied it to a job scheduling problem.
The HHO has also been integrated with Differential Evolution by Fu et al. [176], who developed an algorithm combining an improved differential evolution with the HHO, termed IHDEHHO. They applied it to forecast wind speed using kernel extreme learning and phase space reconstruction.
Abd Elaziz et al. [177] proposed a hybrid algorithm, called HHOSSA, that combines the HHO with SSA. In HHOSSA, the population is divided into two halves: HHO updates the first half of the solutions, and SSA updates the other half. To evaluate their algorithm, they used thirty-six functions from IEEE CEC 2005.
Moreover, a hybrid HHO algorithm called CHHO-CS, the chaotic HHO cuckoo search algorithm, was developed by Houssein et al. [178]. In CHHO-CS, the cuckoo search controls the HHO's main position vectors, and a chaotic map is employed to update the control energy parameters. Another hybrid algorithm, the hybrid multipopulation algorithm, was presented by Barshandeh et al. [179]. First, the population is divided into many subpopulations; then, a hybrid of the HHO and Artificial Ecosystem Optimization (AEO) is applied. Furthermore, the authors used chaos theory, Lévy flight, a local search algorithm, and quasi-oppositional learning.
Xie et al. [180] tried to address the HHO's shortcomings by proposing a hybrid algorithm that combines the HHO and the Henry Gas Solubility Optimization (HGSO) algorithm. The authors used two benchmark suites, CEC 2005 and CEC 2017, to verify their algorithm and compared HHO-HGSO with the Marine Predator Algorithm (MPA), WOA, LSA, HHO, WCA, and HGSO.
Likewise, in [181], Fu et al. integrated GWO into the HHO using two layers of population activity. In the first (bottom) layer, solutions are updated using a mutation-based GWO; the solutions in the upper layer are updated by the HHO.
Qu et al. [182] introduced an improved HHO approach that uses information exchange. In the novel algorithm, termed IEHHO, hawks share sufficient information while exploring prey. They also used a dynamic disturbance term based on sine and cosine to enhance the escaping energy.
In [183], an intensified HHO was proposed by Kamboj et al., who combined a hybrid HHO with SCA (hHHO-SCA). They verified their model using CEC 2017 and CEC 2018 and compared it with SCA, HHO, ALO, MFO, GWO, GSA, BA, CS, and GOA.
Moreover, an algorithm called hHHO-IGWO was proposed by Dhawale and Kamboj [184], which improves GWO with the HHO. Another hybrid algorithm between SCA and the HHO was developed in [185], in which a mutation strategy was used.
Suresh et al. [186] developed a hybrid chaotic multiverse HHO (CMVHHO). The authors used the chaotic multiverse optimizer in the first generation, after which the HHO was used to update positions.
Table 8. Summary of the literature review on hybrid HHO algorithms.

SN. | Modification Name | Ref. | Authors | Journal/Conf. | Year | Remarks
1 | MHHO | [174] | Jouhari et al. | Symmetry | 2020 | SSA has been used as a local search in HHO.
2 | HHOSA | [175] | Attiya et al. | Computational Intelligence and Neuroscience | 2020 | SA is used as a local search strategy. The authors applied HHOSA to a job scheduling problem.
3 | IHDEHHO | [176] | Fu et al. | Renewable Energy | 2020 | An improved Differential Evolution version is hybridized with HHO.
4 | HHOSSA | [177] | Abd Elaziz et al. | Applied Soft Computing | 2020 | The population is divided into two halves: the first half uses HHO and the other half uses SSA.
5 | CHHO-CS | [178] | Houssein et al. | Scientific Reports | 2020 | HHO is hybridized with cuckoo search and chaos theory.
6 | HMPA | [179] | Barshandeh et al. | Engineering with Computers | 2020 | The population is divided into many subpopulations; then, HHO and AEO are used.
7 | HHO-HGSO | [180] | Xie et al. | IEEE Access | 2020 | HHO is combined with the HGSO algorithm.
8 | IEHHO | [181] | Fu et al. | Energy Conversion and Management | 2020 | Two layers of population activity were developed: the first layer uses a mutation-based GWO, and the second uses HHO.
9 | hHHO-SCA | [183] | Kamboj et al. | Applied Soft Computing | 2020 | HHO is combined with SCA (hHHO-SCA).
ElSayed and Elattar [187] developed a hybrid algorithm that combines the HHO with Sequential Quadratic Programming (HHO-SQP) to obtain the optimal coordination of overcurrent relays in systems incorporating distributed generation.
Kaveh et al. [188] developed a hybrid algorithm called Imperialist Competitive HHO (ICHHO), which uses the Imperialist Competitive Algorithm (ICA) [189] to improve the HHO's exploration performance. They tested their model using 23 mathematical functions and many common engineering problems.
Sihwail et al. [190] developed a hybrid algorithm called the Newton-Harris Hawks Optimization (NHHO), in which Newton's second-order technique is used for digit correction when solving systems of nonlinear equations.
Another hybrid algorithm combines Differential Evolution (DE) with the HHO and the Gaining-Sharing Knowledge algorithm (GSK) [191]. The new algorithm, abbreviated DEGH, was developed using the “rand/1” DE operator and the two-phase GSK strategy. The authors also used a self-adaptive crossover probability to strengthen the relationship between selection, mutation, and crossover. They used 32 benchmark functions and compared DEGH with 8 state-of-the-art algorithms.
Another hybrid of DE and the HHO was proposed by Abualigah et al. [192], in which DE is used to enhance the HHO's exploitation. The novel hybrid algorithm, known as H-HHO, was used to obtain the optimal number of clusters for each dataset.
Azar et al. [193] developed a prediction model using the Least Squares Support Vector Machine (LS-SVM) and the Adaptive Neuro-Fuzzy Inference System (ANFIS), with ANFIS optimized by the HHO (ANFIS-HHO). The authors argued that the HHO increases the prediction performance of ANFIS.
Firouzi et al. [194] developed a hybrid algorithm that combines the HHO with Nelder–Mead optimization and used it to determine the depth and location of a microsystem crack. They compared their algorithm with classical HHO, GOA, WOA, and DA. Lie et al. [195] developed a hybrid algorithm that combines the Fireworks Algorithm (FWA) [98] with the HHO based on a dynamic competition mechanism. The authors used CEC 2005 to verify the power of their algorithm, called DCFW-HHO, and compared it with MPA, WOA, WCA, LSA, FWA, and HHO.
Li et al. [196] developed an enhanced hybrid algorithm called Elite Evolutionary Strategy HHO (EESHHO). They tested their algorithm using 29 functions (6 hybrid functions from CEC 2017 plus 23 basic functions).
Ahmed [197] developed a hybrid of the HHO and the homotopy analysis method (HAM) to solve partial differential equations.
Yuan et al. [198] developed a hybrid HHO algorithm that combines Harris hawks with an instinctive reaction strategy (IRS). The novel algorithm (IRSHHO) was tested using five mathematical functions.
Setiawan et al. [199] developed a hybrid algorithm called HHO-SVR, in which SVR is optimized using the HHO.

5.3. Multiobjective HHO

Hossain et al. [200] introduced a multiobjective version of HHO based on the 2-hop routing mechanism and applied it to cognitive radio vehicular ad hoc networks. Dabba et al. [201] introduced another multiobjective binary algorithm based on the HHO; the novel algorithm, MOBHHO, was applied to gene selection in microarray data using two fitness functions, SVM and KNN. Another binary multiobjective version was proposed in [126].
Jangir et al. [202] developed a nondominated sorting multiobjective HHO (NSHHO) and tested it using 46 multiobjective real-world problems.
Du et al. [203] proposed a multiobjective HHO called MOHHO to tune extreme learning machine (ELM) parameters and applied it to air pollution prediction and forecasting, evaluating the resulting model, MOHHO-ELM, using 12 air pollutant concentration series recorded in 3 cities. Moreover, Islam et al. [204] tried to solve multiobjective optimal power flow. Likewise, an improved multiobjective version of the HHO, termed MOIHHO, was introduced by Selim et al. [149], who applied it to find the optimal DG size and location.
Fu and Lu [205] developed an improved version of the HHO called HMOHHO, which integrates hybrid techniques, including Latin hypercube sampling initialization, a mutation technique, and a modified differential evolution operator. The authors used the UF and ZDT test suites to validate their algorithm.
Piri and Mohapatra [206] presented a multiobjective Quadratic Binary HHO (MOQBHHO) approach using KNN. The authors used the Crowding Distance (CD) to pick the best solutions from the nondominated set and compared MOQBHHO with MOBHHO-S, MOGA, MOALO, and NSGA-II.

6. HHO Applications

HHO has been successfully applied to many applications, as shown in Figure 12 and Table 9.

6.1. Power

6.1.1. Optimal Power Flow

Recently, the use of electrical energy has been increasing rapidly along with demand [215,216]. To face this challenge, interconnected networks have emerged with differential power systems [217].
Optimal Power Flow (OPF) can be considered an optimization problem. Hussain et al. [115] applied their algorithm, LMHHO, to solve the OPF problem and claimed that it achieves better results than classical HHO. Another attempt to solve OPF was made by Islam et al. [218], in which the HHO was compared with ALO, WOA, SSA, MFO, and the glowworm swarm algorithm. The same authors solved the same problem in [204] with consideration of environmental emissions. Likewise, the modified HHO introduced in [133] has been applied to solve OPF.
Akdag et al. [133] developed a modified HHO version using seven random distribution functions, namely the normal, F, Rayleigh, chi-square, exponential, Student's t, and lognormal distributions. The authors applied their algorithm to Optimal Power Flow (OPF) on the IEEE 30-bus test system.
Paital et al. [219] tuned an interval type-2 fuzzy lead-lag controller using the HHO. The novel approach (Dual-IT2FL) was applied to enhance stability in unified power flow controllers (UPFC).
Shekarappa et al. [207] used a hybrid algorithm between HHO and PSO, called HHOPSO, to solve a reactive power planning problem; HHOPSO was tested using the IEEE 57-bus system. Mohanty and Panda [220] adapted the HHO using the Sine Cosine Algorithm. Their algorithm, called ScaHHO, was utilized to tune an adaptive fuzzy proportional–integral–derivative (AFPID) controller for frequency control in a hybrid power system.

6.1.2. Distributed Generation

Abdel Aleem et al. [221] used the HHO to solve the optimal design of a C-type resonance-free harmonic filter in a distribution system. The authors claimed that the HHO achieved better results than the other compared algorithms.
Moreover, Diaaeldin et al. [222] used the HHO to obtain the optimal network reconfiguration in distribution systems. Abdelsalam et al. [223] presented a smart unit model for multisource operation and cost management based on the HHO algorithm. Mohandas and Devanathan [208] used crossover and mutation with the HHO; the novel algorithm, called CMBHHO, was applied to configure the network in terms of distributed generator (DG) size and optimal location. They compared it with GSA, ALO, LSA, HSA, GA, FWA, RGA, TLBO, and HHO.
Mossa et al. [224] developed a model to estimate the parameters of proton exchange membrane fuel cells (PEMFC) based on the HHO.
Chakraborty et al. [225] used the HHO to select the optimal capacity, site, and number of solar DG units, using two benchmark systems: the IEEE 33-bus and IEEE 69-bus radial networks.

6.1.3. Photovoltaic Models

Due to the continuous and huge increase in energy demand, solar photovoltaic (PV) systems based on solar cells have gained huge momentum. The HHO has been successfully applied to PV problems. Liu et al. [137] used their algorithm, CCNMHHO, to find optimal PV model parameters; the authors state that CCNMHHO achieves competitive results compared with other well-known, state-of-the-art algorithms. Likewise, Jiao et al. [112] used their algorithm (EHHO) to find PV parameters and construct a high-precision model. Moreover, in [135], the authors used their approach to find the optimal PV array reconfiguration to alleviate the influence of partial shading. Likewise, Qias et al. [226] extracted the parameters of the three-diode PV (TDPV) model.
Chen et al. [156] identified PV cell parameters using an enhanced HHO version termed EHHO and compared their algorithm with CLPSO, IJAYA, and GOTLBO. Similar work was conducted by Sahoo and Panda [227], in which solar PV frequency is controlled.
Naeijian et al. [209] used their HHO version, called Whippy HHO (WHHO), to estimate the parameters of PV solar cells.

6.1.4. Wind Applications

In [228], Fang et al. developed a multiobjective mutation-operator version (HMOHHO), which is used to identify Volterra kernel parameters, and applied it to forecast wind speed.
Roy et al. [229] used the HHO to reduce the interconnected wind turbines. They compared it with PSO, FPA, GWO, MVO, WOA, MFO, and BOA.

6.1.5. Economic Load Dispatch Problem

The Economic Load Dispatch (ELD) problem is one of the most common and important problems in power systems [230,231].
Pham et al. [232] employed a multirestart strategy and OBL to enhance the HHO. The novel algorithm was applied to solve ELD with nonsmooth cost functions, and the authors argued that its results are superior to those of previous studies.

6.1.6. Unit Commitment Problem

Nandi and Kamboj [233] hybridized the Sine Cosine Algorithm and a memetic algorithm with the HHO and applied the result to solve the unit commitment problem with photovoltaic applications. SCA is used for power provision, whereas ELD is performed by the HHO.

6.2. Computer Science

6.2.1. Artificial Neural Network

Sammen et al. [234] used the HHO to enhance Artificial Neural Network (ANN) performance and proposed a hybrid model called ANN-HHO. The authors compared their novel model with ANN-GA, ANN-PSO, and classical ANN and argued that ANN-HHO outperformed the compared algorithms. Similar work was conducted by Essa et al. [235], in which their model was compared against the Support Vector Machine (SVM) and the traditional ANN and applied to improve the prediction of active solar still productivity.
Fan et al. [113] used their novel algorithm, NCOHHO, to train a multilayer feed-forward neural network on five different datasets. Similar work combining the HHO with an ANN was conducted by Moayedi et al. [236] and applied to predict the compression coefficient of soil; the authors argued that the HHO is better than GOA at training ANNs.
Moreover, in [237], Kolli and Tatavarth developed the Harris Water Optimization (HWO)-based deep recurrent neural network (RNN) to detect fraud in bank transactions.
Artificial Neural Networks (ANNs) are among the most famous and popular learning methods that simulate the biological nervous system, and the HHO has been used by many authors to train them. Bacanin et al. [238] adapted the HHO algorithm to train ANNs, using two popular classification datasets to test their proposed approach. Moreover, Atta et al. [239] applied their enhanced version of the HHO to train a feed-forward neural network and compared their method with eight metaheuristic algorithms on five classification datasets. A hybrid algorithm between the HHO and the Whale Optimization Algorithm was developed by Agarwal et al. [240] to enhance ANNs.
Bac et al. [241] developed a hybrid model based on the HHO and the Multilayer Perceptron (MLP). They used their system, called HHO-MLP, to estimate the efficiency of heavy metal absorption using halloysite nanotubes.
Alamir [242] proposed an enhanced ANN using the HHO to predict food liking in the presence of different levels and types of masking background noise. Simsek and Alagoz [243] used an ANN learning model and the HHO to develop analysis schemes for optimal engine behavior. Likewise, Zhang et al. [244] estimated clay's friction angle using a deep NN and the HHO to evaluate slope stability. Murugadoss used a Deep Convolutional ANN (DCANN) with the HHO for early diabetes prediction.

6.2.2. Image Processing

In [210], Bao et al. applied their hybrid algorithm combining the HHO with differential evolution (HHO-DE) to color multilevel-thresholding image segmentation using two techniques: Otsu's method and Kapur's entropy. They compared their results with seven other algorithms on ten images and argued that HHO-DE outperforms all of them in terms of the structural similarity index, peak signal-to-noise ratio, and feature similarity index. Similar work has been conducted by Wunnava et al. [245]; in addition to using DE, they modified the exploration phase by limiting the escape energy to the interval [2, 3].
Moreover, Golilarz et al. [246] utilized the HHO to obtain the optimal thresholding function parameters for satellite images. Jia et al. [118] employed their algorithm, the dynamic HHO with mutation (DHHO/M), to segment satellite images using three criteria: Kapur's entropy, Tsallis entropy, and Otsu's method. Similar work was conducted by Shahid et al. [247], in which image denoising was performed in the wavelet domain.
In [248], an efficient HHO variant was introduced by Esparza et al., in which minimum cross-entropy is used as the fitness function for image segmentation. To validate their method, they compared it with K-means and fuzzy IterAg.
Naik et al. [249] proposed a leader HHO (LHHO) to enhance the exploration of the algorithm. They applied it to 2-D Masi entropy multilevel image thresholding, using segmentation metrics such as PSNR, FSIM, and SSIM [250].
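In this line of work, each hawk encodes a vector of gray-level thresholds and the optimizer maximizes an entropy criterion over the image histogram. A minimal sketch of Kapur's entropy as such a fitness function follows; this is our implementation of the textbook formula, not the cited papers' code:

```python
import numpy as np

def kapur_entropy(hist, thresholds):
    """Kapur's entropy of the classes induced by `thresholds` on a gray-level
    histogram; higher is better for multilevel thresholding."""
    p = hist.astype(float) / hist.sum()
    bounds = [0, *sorted(int(t) for t in thresholds), len(p)]
    total = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = p[lo:hi].sum()          # class probability mass
        if w > 0:
            q = p[lo:hi] / w        # within-class distribution
            q = q[q > 0]
            total -= (q * np.log(q)).sum()
    return total

# Example with a synthetic bimodal 256-bin histogram and one threshold at 128
hist = np.concatenate([np.full(128, 10), np.full(128, 40)])
print(kapur_entropy(hist, [128]))
```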

6.2.3. Scheduling Problem

Machine scheduling can be considered a decision-making optimization process that plays a vital role in transport, manufacturing, etc.
Jouhari et al. [174] used SSA with the HHO to solve a machine scheduling problem. Moreover, Attiya et al. [175] used a modified version of the HHO called HHOSA to solve job scheduling in the cloud environment.
Utama and Widodo [211] developed a hybrid algorithm based on the HHO to solve the flow shop scheduling problem (FSSP).

6.2.4. Feature Selection

Feature Selection (FS) is one of the most important preprocessing techniques, aiming to reduce the number of features that may influence machine learning performance [251,252,253]. The HHO has been used to solve FS problems. For example, Thaher et al. [123] used a binary version of the HHO to solve the FS problem, using nine high-dimensional datasets.
Moreover, the authors in [254] used the HHO with Simulated Annealing and bitwise operators (AND and OR). They used 19 datasets of various sizes and claimed that HHOBSA achieves good results compared with other methods.
A similar work was conducted by Thaher and Arman [124], who used five different datasets and the ADASYN technique. Moreover, Sihwail et al. [150] used IHHO to solve the FS problem on 20 datasets with different feature dimensionality levels (low, moderate, and high), comparing IHHO with seven other algorithms. Thaher et al. [255] detected false information in Arabic tweets using a hybrid HHO algorithm based on ML models and feature selection; their algorithm, called Binary HHO with Logistic Regression (LR), achieved better results than previous work on the same dataset. In [254], Abdel-Basset et al. hybridized the HHO with Simulated Annealing based on bitwise operators. They applied the novel algorithm, called HHOBSA, to feature selection using 24 standard datasets and 19 artificial ones.
In [256], Turabieh et al. proposed an enhanced version of the HHO and applied it to a feature selection problem with K-Nearest Neighbor (KNN) in order to predict student performance. They evaluated their prediction system using several machine learning classifiers, such as KNN, Naïve Bayes, a layered recurrent neural network (LRNN), and an artificial neural network.
Al-Wajih et al. [257] introduced a hybrid algorithm combining the HHO with the Grey Wolf Optimizer. The new algorithm, called HBGWOHHO, uses a sigmoid function to transfer solutions from the continuous to the binary domain. They compared it with Binary Particle Swarm Optimization (BPSO), the Binary Genetic Algorithm (BGA), the Binary Grey Wolf Optimizer (BGWO), the Binary Harris Hawks Optimizer (BHHO), and the binary hybrid BWOPSO, and reported better accuracy with a smaller number of selected features.
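The sigmoid transfer step and a typical wrapper fitness used in such binary-HHO studies can be sketched as follows, assuming scikit-learn and a weighted error-plus-subset-size objective; the weighting constant `alpha` and the helper names are illustrative rather than taken from [257].

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def sigmoid_binarize(position, rng):
    # S-shaped transfer function: continuous position -> 0/1 feature mask.
    prob = 1.0 / (1.0 + np.exp(-position))
    return (rng.random(position.shape) < prob).astype(int)

def fs_fitness(mask, X, y, alpha=0.99):
    """Wrapper fitness: weighted sum of KNN error and subset-size ratio
    (lower is better); alpha is an illustrative weighting constant."""
    if mask.sum() == 0:
        return 1.0  # penalize empty feature subsets
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask == 1], y, cv=5).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.sum() / mask.size
```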
Khurma et al. [258] developed two binary HHO versions based on a filter approach for feature selection. The first applies the HHO with the mutual information between each pair of features; the second applies the HHO with the entropy of each feature group. The authors found that the first approach selects smaller feature subsets, whereas the second achieves higher classification accuracy.
Too et al. [212] enhanced the HHO by adding a memory-saving mechanism and adopting a learning strategy. The resulting algorithms, called MEHHO1 and MEHHO2, were employed to solve feature selection problems and were evaluated on thirteen low-dimensional benchmark datasets and eight high-dimensional ones.

6.2.5. Traveling Salesman Problem

Yaser and Ku-Mahamud [259] applied their hybrid algorithm, called the Harris Hawks Optimizer Ant Colony System (HHO-ACS), to the Traveling Salesman Problem (TSP). They used several symmetric TSP instances, such as bayg29, att48, berlin52, bays29, eil51, st70, eil76, and eil101, and compared the hybrid against the Black Hole algorithm [260], PSO, DA, GA, and ACO, arguing that HHO-ACS performs competitively.
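When a purely continuous optimizer is applied to the TSP, the same random-key decoding shown above for scheduling is frequently used, as in the minimal sketch below; note that this is a generic decoding under our own assumptions, not the pheromone-based mechanism of HHO-ACS.

```python
import numpy as np

def tour_length(position, dist):
    """Decode a continuous position into a city permutation via argsort
    and return the closed-tour length (dist is a symmetric matrix)."""
    tour = np.argsort(position)
    return sum(dist[tour[i], tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))
```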
Ismael et al. [261] developed a new version of the HHO to solve an FS problem using v-support vector regression.

6.3. Wireless Sensor Network

The wireless sensor network (WSN) has recently been used in many fields, such as smart homes, health care, and environment monitoring, owing to features such as self-organization and environmental friendliness [262,263]. Srinivas and Amgoth [264] used the HHO with the SSA to propose an energy-efficient WSN.
Likewise, Bhat and Venkata [265] used the HHO to classify node coverage ranges into incoming and outgoing neighbors; this technique is based on area minimization. The authors tested the HHO-MA on 2D square, 2D C-shape, 3D G-shape, 3D cube, and 3D mountain scenarios. In [266], Singh and Prakash used the HHO to find the optimal placement of multiple optical networks. Xu et al. [267] used the HHO for intelligent reflecting surfaces, maximizing the received signal power by optimizing access-point beamforming.
Sharma and Prakash [268] developed a model called HHO-LPWSN, which uses the HHO to localize sensor nodes in a wireless sensor network.
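Range-based node localization is typically cast as minimizing the mismatch between measured anchor distances and the distances implied by a candidate node position; a minimal sketch of such an objective, under our own assumptions rather than the exact HHO-LPWSN formulation, is given below.

```python
import numpy as np

def localization_error(node_xy, anchors, measured_dists):
    """Range-based localization objective: squared mismatch between the
    distances measured to known anchors and the distances implied by a
    candidate node position (lower is better)."""
    est = np.linalg.norm(anchors - node_xy, axis=1)
    return float(((est - measured_dists) ** 2).sum())
```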

6.4. Medical Applications

A pulse-coupled neural network (PCNN) based on the HHO, called HHO-PCNN, was developed by Jia et al. [269]. They applied it to image segmentation, evaluating it with several performance indicators (UM, CM, Recall, Dice, and Precision). The results were compared with WOA-PCNN, SCA-PCNN, PSO-PCNN, SSA-PCNN, MVO-PCNN, and GWO-PCNN, and the authors claimed that their method gives the best results.
In [270], Rammurthy and Mahesh used their hybrid Whale HHO (WHHO) with a deep learning classifier to detect brain tumors in MRI images from two datasets: BRATS and SimBRATS.
Moreover, Abd Elaziz et al. [177] employed their method, called competitive chain HHO, for multilevel image thresholding on eleven natural grayscale images. An adaptive HHO, proposed by Wunnava et al. [117], was applied to two-dimensional gray-gradient multilevel image thresholding; they showed that their algorithm, using I2DGG, outperforms the other methods.
Likewise, Golilarz et al. [116] used their algorithm CMDHHO, which is based on multipopulation differential evolution, to denoise satellite images.
Suresh et al. [186] used their chaotic hybrid algorithm, based on a deep kernel machine learning classifier (CMVHHO-DKMLC), for medical diagnosis classification.
In [271], Kaur et al. employed Dimension Learning-based Hunting (DLH) with the original HHO. The novel algorithm, known as DLHO, was developed for biomedical datasets. The authors applied it to detect breast cancer.
Likewise, Chacko and Chacko [272] used the HHO in watermarking at various embedding strengths. The watermark bits were merged using a Deep Learning Convolutional Neural Network (DLCNN), which used the HHO to identify the watermarks.
Bandyopadhyay et al. [273] used the altruism concept and chaotic initialization to improve the HHO. Their novel algorithm was applied to segment brain Magnetic Resonance Images (MRI) using 18 benchmark images taken from the BrainWeb and WBE databases.
Iswisi et al. [274] used the HHO to select optimal cluster centers for fuzzy C-means (FCM) segmentation and tested their model on several brain MRIs.
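When a metaheuristic selects FCM cluster centers, the fitness is usually the fuzzy C-means objective evaluated with memberships computed for the candidate centers; the sketch below follows the standard FCM formulas and is not taken from [274].

```python
import numpy as np

def fcm_objective(centers, X, m=2.0):
    """Fuzzy C-means objective J_m for candidate cluster centers; the
    memberships follow the standard FCM update for fixed centers."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    inv = d ** (-2.0 / (m - 1.0))
    u = inv / inv.sum(axis=1, keepdims=True)   # memberships, rows sum to 1
    return float(((u ** m) * d ** 2).sum())
```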
Balamurugan et al. [275] classified heart disease using an adaptive HHO and an enhanced deep GA: features were clustered with the adaptive HHO, whereas the enhanced deep GA carried out the classification. They tested their approach on a UCI dataset.
Abd Elaziz et al. [164] used their multilayer HHO based on DE (MLHHDE) to predict influenza H1N1 viruses.
Qu et al. [167] used their algorithm, called VNLHHO, to enhance gene expression classification by improving feature selection performance, with a Bayesian classifier as the fitness function. They tested their model on gene expression profile data of eight different tumor types.

6.5. Chemical Engineering and Drug Discovery

Cheminformatics, or chemical informatics, is a field concerned with discovering, analyzing, and predicting the properties of molecules by combining methods from mathematics and information science [276].
Houssein et al. [277] used the HHO with two classification techniques, SVM and KNN, on two datasets (MonoAmine Oxidase and QSAR Biodegradation) and showed that HHO-SVM achieves better results than HHO-KNN. Moreover, in [178], the same authors hybridized the HHO with Cuckoo Search (CS) and chaotic maps, using an SVM as the objective function, and showed that CHHO-CS outperforms PSO, MFO, GWO, SSA, and SCA.
Abd Elaziz and Yousri [213] used Henry Gas Solubility Optimization (HGSO) to enhance the HHO and applied the resulting algorithm (DHGHHD) to drug discovery and design using two real-world datasets.
Houssein et al. [278] added genetic algorithm operators (crossover and mutation), together with OBL and random OBL strategies, to the classical HHO in order to select chemical descriptors/features and compound activities.

6.6. Electronic and Control Engineering

Proportional–Integral–Derivative (PID) controllers are used to improve Automatic Voltage Regulator (AVR) performance. Ekinci et al. [279] used the HHO to tune PID parameters in AVR systems. The same authors used the algorithm in [280] to tune a DC motor PID controller by minimizing the integral of time-multiplied absolute error (ITAE), comparing their approach with the Grey Wolf Optimizer, the Atom Search Algorithm, and the Sine Cosine Algorithm. They also applied the HHO [281] to tune a Fractional-Order PID (FOPID) controller for a DC–DC buck converter, comparing their algorithm, called HHO-FOPID, with WOA-PID and GA-PID.
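As a hedged illustration of the tuning objective, the sketch below computes the ITAE of a unit-step response for a PID controller acting on a simple first-order plant; the plant, time step, and horizon are toy assumptions of our own, not the AVR or DC motor models of the cited papers.

```python
def itae_cost(gains, tau=0.1, dt=1e-3, t_end=2.0):
    """ITAE of the unit-step response of a PID-controlled first-order
    plant dy/dt = (u - y) / tau; gains = (Kp, Ki, Kd)."""
    kp, ki, kd = gains
    y, integ, prev_err, itae = 0.0, 0.0, 1.0, 0.0
    for k in range(int(t_end / dt)):
        err = 1.0 - y
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        y += dt * (u - y) / tau          # forward-Euler plant step
        prev_err = err
        itae += (k * dt) * abs(err) * dt  # time-weighted absolute error
    return itae

# Toy usage: an optimizer would minimize this over (Kp, Ki, Kd).
print(itae_cost((2.0, 1.0, 0.05)))
```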
Likewise, Fu and Lu [205] developed a multiobjective PID controller based on the HHO with hybrid strategies (HMOHHO) for Hydraulic Turbine Governing Systems (HTGS).
Yousri et al. [282] used the HHO to tune PI controller parameters for load frequency control in multi-interconnected systems, considering two test systems: (1) two interconnected thermal areas with PV and (2) four PV plants, two thermal plants, and a wind turbine. They compared the HHO against SCA, MVO, ALO, and GWO.
Barakat et al. [283] developed load frequency control (LFC) for an interconnected power system using PD-PI cascade control.
Munagala and Jatoth [284] designed a Fractional-Order PID (FOPID) controller for speed control using the HHO.

6.7. Geological Engineering

Landslides can be defined as gravity-triggered downward movements of mass. Bui et al. [285] applied the HHO to landslide susceptibility analysis, using 208 historical landslides to build an ANN-based predictive tool.
Moreover, Moayedi et al. applied the HHO to train a multilayer perceptron (MLP) for assessing the bearing capacity of footings over two-layer foundation soils.
Murlidhar et al. [286] introduced a novel version of the HHO based on the multilayer perceptron neural network to predict flyrock distance induced by mine blasting.
Yu et al. [287] developed an ELM model based on the HHO in order to forecast mine blasting peak particle velocity.
Payani et al. [288] used the HHO and the Bat Algorithm with machine learning tools (ANFIS/SVR) to improve landslide spatial modeling.

6.8. Building and Construction or Civil Engineering

Golafshani et al. [289] used a radial basis function neural network (RBFNN) and a multilayer neural network (MLNN) with the HHO to estimate the compressive strength (CS) of concrete. Parsa and Naderpour [290] estimated the shear strength of reinforced concrete walls using the HHO with support vector regression.
Kardani et al. [143] applied ICHHO to structural optimization using five truss structures.

6.9. Coronavirus COVID-19

The novel coronavirus disease, known as COVID-19, was declared an infectious disease by the World Health Organization (WHO). The outbreak began in Wuhan, China, and has affected billions of people's lives [291,292]. Computer science researchers have tried to use the HHO to analyze and detect this virus.
Balaha et al. [214] proposed an approach called CovH2SD to detect COVID-19 from chest Computed Tomography (CT) images. They applied transfer learning techniques using nine convolutional neural networks: ResNet101, ResNet50, VGG16, VGG19, MobileNetV1, MobileNetV2, Xception, DenseNet121, and DenseNet169.
Another work was conducted by Houssein et al. [293] to classify COVID-19 genes using an SVM; they tested their model on a large gene expression cancer (RNA-Seq) dataset with 20,531 features. Hu et al. [294] employed the HHO with the Extreme Learning Machine (ELM) to detect COVID-19 severity from blood gas analysis, using Specular Reflection Learning; they named the new algorithm HHOSRL.
Ye et al. [295] developed a fuzzy KNN HHO method, called HHO-FKNN, to predict and diagnose COVID-19; compared with several machine learning algorithms, it achieved higher classification accuracy and better stability. Moreover, Bandyopadhyay et al. [296] used a hybrid of the chaotic HHO with SA to screen COVID-19 CT scans.

6.10. Other Applications

6.10.1. Microchannel Heat Sinks Design

Thermal management of electronic devices has become very important in product designs that demand higher efficiency and power. Abbasi et al. [297] applied the HHO to microchannel heat sink design and compared their results with the Newton–Raphson method, PSO, GOA, WOA, DA, and the Bees Optimization Algorithm.

6.10.2. Chart Patterns Recognition

Golilarz et al. [298] proposed an automatic approach based on the HHO and deep learning (ConvNet) for recognizing nine control chart patterns (CCP). In this CCP recognition method, the raw data are passed through more than one hidden layer in order to extract representative features.

6.10.3. Water Distribution

Khalifeh et al. [299] developed an HHO-based model to optimize the water distribution network of Homashahr, a city in Iran, using data from September 2018 to October 2019. The authors stated that the HHO proved efficient in finding the optimal water network design.

6.10.4. Internet of Things

The Internet of Things (IoT) gives different entities the ability to access the environment, monitor it, and communicate with other entities [300]. Seyfollahi and Ghaffari [301] developed an HHO-based scheme for Reliable Data Dissemination in IoT (RDDI). The authors evaluated their scheme against three comparative approaches using five metrics: reliability, energy consumption, end-to-end delay, packet forwarding distance, and computational overhead. Moreover, Saravanan et al. [302] proposed an HHO-based PI controller for tuning BLDC motor parameters in an IoT-enabled setup.

6.10.5. Short-Term Load Forecasting

Tayab et al. [303] proposed a hybrid approach, called HHO-FNN, in which a feed-forward neural network (FNN) is trained by the HHO. They applied it to forecasting load demand in the Queensland electricity market and compared HHO-FNN with a PSO-trained ANN, a PSO-based support vector machine, and a back-propagation neural network.
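Training a neural network with the HHO amounts to treating the flattened weight vector as the hawk's position and the prediction error as the fitness; the sketch below illustrates this encoding for a one-hidden-layer network under our own assumptions (layer size, activation), not the exact HHO-FNN setup of [303].

```python
import numpy as np

def fnn_mse(weights, X, y, hidden=10):
    """MSE of a one-hidden-layer feed-forward network whose parameters
    are packed into one flat vector, so a hawk's position can encode
    the whole network. Layout: W1, b1, W2, b2."""
    n_in = X.shape[1]
    i = 0
    W1 = weights[i:i + n_in * hidden].reshape(n_in, hidden); i += n_in * hidden
    b1 = weights[i:i + hidden]; i += hidden
    W2 = weights[i:i + hidden]; i += hidden
    b2 = weights[i]
    pred = np.tanh(X @ W1 + b1) @ W2 + b2   # forward pass
    return float(np.mean((pred - y) ** 2))
```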

6.10.6. Cardiomyopathy

Ding et al. [304] developed a fuzzy HHO algorithm (FHHO) to monitor cardiomyopathy patients through wearable devices and sensors, introducing Wearable Sensing Data Optimization (WSDO) to obtain accurate cardiomyopathy data.

6.10.7. QoS-Aware Service Composition

Li et al. [305] addressed QoS-aware Web Service Composition (QWSC) problems with a metaheuristic approach, developing a method to construct fuzzy neighborhood relations and combining the HHO with a logistic chaotic function.

6.10.8. PEMFC Parameter Estimation

Mossa et al. [224] employed the HHO to estimate the unknown parameters of Proton Exchange Membrane Fuel Cells (PEMFCs). They tested it on three PEMFC stacks (500-W SR-12 PEM, BCS 500-W PEM, and a 250-W stack) and claimed that the HHO-based approach surpassed other algorithms in convergence speed and accuracy.
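Such parameter-extraction studies typically minimize the squared error between measured and model-predicted stack voltages; a minimal, model-agnostic sketch of this fitness is given below, where `stack_model` is a placeholder for a PEMFC voltage model rather than a function from [224].

```python
import numpy as np

def sse_fitness(params, currents, measured_v, stack_model):
    """Parameter-estimation fitness: sum of squared errors between the
    measured stack voltages and the voltages a PEMFC model predicts for
    the candidate parameter vector (lower is better)."""
    predicted = np.array([stack_model(params, i) for i in currents])
    return float(((predicted - measured_v) ** 2).sum())
```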

6.10.9. DVR Control System

ElKady et al. [306] employed the HHO to develop an optimized, enhanced, and less complex Dynamic Voltage Restorer (DVR) control system. The results were compared with PSO and WOA. The authors simulated the system in MATLAB/Simulink and validated it on a Typhoon HIL402 real-time emulator.

7. A Brief Discussion

In this paper, we reviewed the most recent works and developments in the improvement and verification of the HHO. As the collected works show, several points should be considered for both the conventional HHO and its enhanced variants. First, there is the matter of performance. The reviewed studies verified that one of the reasons for the interest in the HHO over its competitors is its performance: owing to features such as its dynamic components, greedy selection, and multiphase search rules, the HHO can deliver high-quality results under the same conditions in which other methods are applied. This is one of the main reasons the HHO is adopted in the reviewed papers. A second motivation is the need for performance optimization. In most of the variants, the authors note that the stability of the optimizer needs to be improved to reach faster convergence and fewer stagnation problems; this requirement is common to all population-based methods, which must maintain a stable balance between local and global search tendencies.
Beyond this, most of the studies have enhanced the balance between the exploratory and exploitative propensities of the HHO; the accuracy of results and the convergence speed are the features most frequently enhanced in the literature to date. We also observed that authors have applied the HHO and its variants to many new problems and datasets. In line with the no-free-lunch (NFL) theorem [307], authors increasingly understand, compared with before 2019, how to adapt an algorithm's features to new problems and how each method's enhanced operators contribute to the efficacy of the final results; hence, they could investigate different performance aspects of the HHO on many real-world problems. Last but not least, the quality of the HHO's results can be further enhanced with deeper evolutionary foundations, such as coevolutionary techniques, multipopulation approaches, memetic methods, and parallel computing; such foundations can further help harmonize global and local search trends, resulting in better HHO variants. Finally, we suggest that authors attach the source code of their enhanced variants, which would make future research on the HHO more productive and allow direct comparison with previous ideas.

8. Conclusions and Future Work

In this survey paper, we reviewed the recent applications and variants of the well-established robust optimizer, the Harris hawk optimizer (HHO), one of the most popular swarm-based techniques of recent years. As a population-based method, the original HHO maintains a set of random solutions and performs two phases of global search and four phases of local search. The HHO shows high flexibility in the transition between phases and has several dynamic components that support more efficient exploratory and exploitative behavior. This review also provided an in-depth look at the enhanced versions, the ways they were enhanced, and their application domains.
There are several possible future directions and ideas worth investigating regarding new variants of the HHO algorithm and its widespread applications. First, the HHO is still a young algorithm, and several problems in the machine learning domain, mainly feature selection, remain unresolved. Moreover, authors are expected to include more balance measures in their analyses and to report how modifications affect the computational complexity of the variants. Although the HHO is a relatively fast method, such analysis would make the literature more transparent regarding computation time.

Author Contributions

A.G.H.: conceptualization, supervision, methodology, formal analysis, resources, data curation, and writing—original draft preparation. L.A. and K.H.A.: conceptualization, supervision, writing—review and editing, project administration, and funding acquisition. R.A.Z., F.A.H., M.A. and A.S.: conceptualization, writing—review and editing, and supervision. A.H.G.: conceptualization and writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

The authors would like to thank the Deanship of Scientific Research at Umm Al-Qura University for supporting this work by Grant Code: (22UQU4320277DSR07).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Abualigah, L. Group search optimizer: A nature-inspired meta-heuristic optimization algorithm with its results, variants, and applications. Neural Comput. Appl. 2020, 33, 2949–2972. [Google Scholar] [CrossRef]
  2. Zitar, R.A.; Abualigah, L.; Al-Dmour, N.A. Review and analysis for the Red Deer Algorithm. J. Ambient. Intell. Humaniz. Comput. 2021, 1–11. [Google Scholar] [CrossRef] [PubMed]
  3. Hashim, F.A.; Salem, N.M.; Seddik, A.F. Automatic segmentation of optic disc from color fundus images. Jokull J. 2013, 63, 142–153. [Google Scholar]
  4. Fathi, H.; AlSalman, H.; Gumaei, A.; Manhrawy, I.I.; Hussien, A.G.; El-Kafrawy, P. An Efficient Cancer Classification Model Using Microarray and High-Dimensional Data. Comput. Intell. Neurosci. 2021, 2021, 7231126. [Google Scholar] [CrossRef] [PubMed]
  5. Nadimi-Shahraki, M.H.; Taghian, S.; Mirjalili, S.; Zamani, H.; Bahreininejad, A. GGWO: Gaze cues learning-based grey wolf optimizer and its applications for solving engineering problems. J. Comput. Sci. 2022, 61, 101636. [Google Scholar] [CrossRef]
  6. Almotairi, K.H.; Abualigah, L. Hybrid reptile search algorithm and remora optimization algorithm for optimization tasks and data clustering. Symmetry 2022, 14, 458. [Google Scholar] [CrossRef]
  7. Shah, A.; Azam, N.; Alanazi, E.; Yao, J. Image blurring and sharpening inspired three-way clustering approach. Appl. Intell. 2022, 1–25. [Google Scholar] [CrossRef]
  8. Alotaibi, Y. A New Meta-Heuristics Data Clustering Algorithm Based on Tabu Search and Adaptive Search Memory. Symmetry 2022, 14, 623. [Google Scholar] [CrossRef]
  9. Tejani, G.G.; Kumar, S.; Gandomi, A.H. Multi-objective heat transfer search algorithm for truss optimization. Eng. Comput. 2021, 37, 641–662. [Google Scholar] [CrossRef]
  10. Shehab, M.; Abualigah, L.; Al Hamad, H.; Alabool, H.; Alshinwan, M.; Khasawneh, A.M. Moth–flame optimization algorithm: Variants and applications. Neural Comput. Appl. 2020, 32, 9859–9884. [Google Scholar] [CrossRef]
  11. Abualigah, L. Multi-verse optimizer algorithm: A comprehensive survey of its results, variants, and applications. Neural Comput. Appl. 2020, 32, 12381–12401. [Google Scholar] [CrossRef]
  12. Islam, M.J.; Basalamah, S.; Ahmadi, M.; Sid-Ahmed, M.A. Capsule image segmentation in pharmaceutical applications using edge-based techniques. In Proceedings of the 2011 IEEE International Conference on Electro/Information Technology, Mankato, MN, USA, 15–17 May 2011; pp. 1–5. [Google Scholar]
  13. Kumar, S.; Tejani, G.G.; Pholdee, N.; Bureerat, S. Multi-objective passing vehicle search algorithm for structure optimization. Expert Syst. Appl. 2021, 169, 114511. [Google Scholar] [CrossRef]
  14. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
  15. Koza, J.R. Genetic Programming II, Automatic Discovery of Reusable Subprograms; MIT Press: Cambridge, MA, USA, 1992. [Google Scholar]
  16. Storn, R.; Price, K. Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  17. Beyer, H.G.; Schwefel, H.P. Evolution strategies—A comprehensive introduction. Nat. Comput. 2002, 1, 3–52. [Google Scholar] [CrossRef]
  18. Abualigah, L.; Diabat, A.; Mirjalili, S.; Abd Elaziz, M.; Gandomi, A.H. The Arithmetic Optimization Algorithm. Comput. Methods Appl. Mech. Eng. 2021, 376, 113609. [Google Scholar] [CrossRef]
  19. Eberhart, R.; Kennedy, J. A new optimizer using particle swarm theory. In Proceedings of the MHS’95. Proceedings of the Sixth International Symposium on Micro Machine and Human Science, Nagoya, Japan, 4–6 October 1995; pp. 39–43. [Google Scholar]
  20. Dorigo, M.; Di Caro, G. Ant colony optimization: A new meta-heuristic. In Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406), Washington, DC, USA, 6–9 July 1999; Volume 2, pp. 1470–1477. [Google Scholar]
  21. Geem, Z.W.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68. [Google Scholar] [CrossRef]
  22. Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. J. Glob. Optim. 2007, 39, 459–471. [Google Scholar] [CrossRef]
  23. Formato, R.A. Central Force Optimization. Prog. Electromagn. Res. 2007, 77, 425–491. [Google Scholar] [CrossRef] [Green Version]
  24. Simon, D. Biogeography-based optimization. IEEE Trans. Evol. Comput. 2008, 12, 702–713. [Google Scholar] [CrossRef] [Green Version]
  25. Yang, X.S.; Deb, S. Cuckoo search via Lévy flights. In Proceedings of the 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC), Coimbatore, India, 9–11 December 2009; pp. 210–214. [Google Scholar]
  26. Das, S.; Biswas, A.; Dasgupta, S.; Abraham, A. Bacterial foraging optimization algorithm: Theoretical foundations, analysis, and applications. In Foundations of Computational Intelligence; Springer: Berlin/Heidelberg, Germany, 2009; Volume 3, pp. 23–55. [Google Scholar]
  27. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  28. Yang, X.S. Firefly algorithm, stochastic test functions and design optimisation. Int. J. Bio-Inspired Comput. 2010, 2, 78–84. [Google Scholar] [CrossRef]
  29. Rao, R.V.; Savsani, V.J.; Vakharia, D. Teaching–learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput.-Aided Des. 2011, 43, 303–315. [Google Scholar] [CrossRef]
  30. Pan, W.T. A new fruit fly optimization algorithm: Taking the financial distress model as an example. Knowl.-Based Syst. 2012, 26, 69–74. [Google Scholar] [CrossRef]
  31. Gandomi, A.H.; Alavi, A.H. Krill herd: A new bio-inspired optimization algorithm. Commun. Nonlinear Sci. Numer. Simul. 2012, 17, 4831–4845. [Google Scholar] [CrossRef]
  32. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef] [Green Version]
  33. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  34. Hashim, F.A.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W.; Mirjalili, S. Henry gas solubility optimization: A novel physics-based algorithm. Future Gener. Comput. Syst. 2019, 101, 646–667. [Google Scholar] [CrossRef]
  35. Li, S.; Chen, H.; Wang, M.; Heidari, A.A.; Mirjalili, S. Slime mould algorithm: A new method for stochastic optimization. Future Gener. Comput. Syst. 2020, 111, 300–323. [Google Scholar] [CrossRef]
  36. Hashim, F.A.; Hussien, A.G. Snake Optimizer: A novel meta-heuristic optimization Algorithm. Knowl.-Based Syst. 2022, 242, 108320. [Google Scholar] [CrossRef]
  37. Abualigah, L.; Shehab, M.; Alshinwan, M.; Mirjalili, S.; Abd Elaziz, M. Ant Lion Optimizer: A Comprehensive Survey of Its Variants and Applications. Arch. Comput. Methods Eng 2020, 242, 108320. [Google Scholar] [CrossRef]
  38. Alsalibi, B.; Mirjalili, S.; Abualigah, L.; Gandomi, A.H. A Comprehensive Survey on the Recent Variants and Applications of Membrane-Inspired Evolutionary Algorithms. Arch. Comput. Methods Eng. 2022, 1–17. [Google Scholar] [CrossRef]
  39. Hashim, F.; Salem, N.; Seddik, A. Optic disc boundary detection from digital fundus images. J. Med. Imaging Health Inform. 2015, 5, 50–56. [Google Scholar] [CrossRef]
  40. Abualigah, L.; Almotairi, K.H.; Abd Elaziz, M.; Shehab, M.; Altalhi, M. Enhanced Flow Direction Arithmetic Optimization Algorithm for mathematical optimization problems with applications of data clustering. Eng. Anal. Bound. Elem. 2022, 138, 13–29. [Google Scholar] [CrossRef]
  41. Kumar, S.; Tejani, G.G.; Pholdee, N.; Bureerat, S.; Mehta, P. Hybrid heat transfer search and passing vehicle search optimizer for multi-objective structural optimization. Knowl.-Based Syst. 2021, 212, 106556. [Google Scholar] [CrossRef]
  42. Abualigah, L.; Shehab, M.; Alshinwan, M.; Alabool, H. Salp swarm algorithm: A comprehensive survey. Neural Comput. Appl. 2019, 32, 11195–11215. [Google Scholar] [CrossRef]
  43. Nadimi-Shahraki, M.H.; Zamani, H. DMDE: Diversity-maintained multi-trial vector differential evolution algorithm for non-decomposition large-scale global optimization. Expert Syst. Appl. 2022, 198, 116895. [Google Scholar] [CrossRef]
  44. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. Starling murmuration optimizer: A novel bio-inspired algorithm for global and engineering optimization. Comput. Methods Appl. Mech. Eng. 2022, 392, 114616. [Google Scholar] [CrossRef]
  45. Abualigah, L.; Diabat, A.; Geem, Z.W. A Comprehensive Survey of the Harmony Search Algorithm in Clustering Applications. Appl. Sci. 2020, 10, 3827. [Google Scholar] [CrossRef]
  46. Abualigah, L.; Gandomi, A.H.; Elaziz, M.A.; Hussien, A.G.; Khasawneh, A.M.; Alshinwan, M.; Houssein, E.H. Nature-inspired optimization algorithms for text document clustering—A comprehensive analysis. Algorithms 2020, 13, 345. [Google Scholar] [CrossRef]
  47. Nadimi-Shahraki, M.H.; Fatahi, A.; Zamani, H.; Mirjalili, S.; Abualigah, L.; Abd Elaziz, M. Migration-based moth-flame optimization algorithm. Processes 2021, 9, 2276. [Google Scholar] [CrossRef]
  48. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  49. Nadimi-Shahraki, M.H.; Taghian, S.; Mirjalili, S.; Ewees, A.A.; Abualigah, L.; Abd Elaziz, M. MTV-MFO: Multi-Trial Vector-Based Moth-Flame Optimization Algorithm. Symmetry 2021, 13, 2388. [Google Scholar] [CrossRef]
  50. Fatani, A.; Dahou, A.; Al-Qaness, M.A.; Lu, S.; Elaziz, M.A. Advanced feature extraction and selection approach using deep learning and Aquila optimizer for IoT intrusion detection system. Sensors 2021, 22, 140. [Google Scholar] [CrossRef] [PubMed]
  51. Molina, D.; Poyatos, J.; Del Ser, J.; García, S.; Hussain, A.; Herrera, F. Comprehensive Taxonomies of Nature- and Bio-inspired Optimization: Inspiration versus Algorithmic Behavior, Critical Analysis and Recommendations. arXiv 2020, arXiv:2002.08136. [Google Scholar] [CrossRef]
  52. Nadimi-Shahraki, M.H.; Taghian, S.; Mirjalili, S.; Abualigah, L.; Abd Elaziz, M.; Oliva, D. EWOA-OPF: Effective Whale Optimization Algorithm to Solve Optimal Power Flow Problem. Electronics 2021, 10, 2975. [Google Scholar] [CrossRef]
  53. Kharrich, M.; Abualigah, L.; Kamel, S.; AbdEl-Sattar, H.; Tostado-Véliz, M. An Improved Arithmetic Optimization Algorithm for design of a microgrid with energy storage system: Case study of El Kharga Oasis, Egypt. J. Energy Storage 2022, 51, 104343. [Google Scholar] [CrossRef]
  54. Niu, P.; Niu, S.; Chang, L. The defect of the Grey Wolf optimization algorithm and its verification method. Knowl.-Based Syst. 2019, 171, 37–43. [Google Scholar] [CrossRef]
  55. Hashim, F.; Mabrouk, M.S.; Al-Atabany, W. GWOMF: Grey Wolf Optimization for Motif Finding. In Proceedings of the 2017 13th International Computer Engineering Conference (ICENCO), Cairo, Egypt, 27–28 December 2017; pp. 141–146. [Google Scholar]
  56. Sörensen, K. Metaheuristics—The metaphor exposed. Int. Trans. Oper. Res. 2015, 22, 3–18. [Google Scholar] [CrossRef]
  57. Hassanien, A.E.; Emary, E. Swarm Intelligence: Principles, Advances, and Applications; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
  58. Koza, J.R. Genetic Programming II; MIT Press: Cambridge, MA, USA, 1994; Volume 17. [Google Scholar]
  59. Yao, X.; Liu, Y.; Lin, G. Evolutionary programming made faster. IEEE Trans. Evol. Comput. 1999, 3, 82–102. [Google Scholar]
  60. Andrew, A.M. Evolution and optimum seeking. Kybernetes 1998, 27, 975–978. [Google Scholar] [CrossRef]
  61. Civicioglu, P. Backtracking search optimization algorithm for numerical optimization problems. Appl. Math. Comput. 2013, 219, 8121–8144. [Google Scholar] [CrossRef]
  62. Karaboga, D.; Basturk, B. Artificial bee colony (ABC) optimization algorithm for solving constrained optimization problems. In International Fuzzy Systems Association World Congress; Springer: Berlin/Heidelberg, Germany, 2007; pp. 789–798. [Google Scholar]
  63. Yang, X.S. Firefly algorithms for multimodal optimization. In International Symposium on Stochastic Algorithms; Springer: Berlin/Heidelberg, Germany, 2009; pp. 169–178. [Google Scholar]
  64. Mostafa, R.R.; Hussien, A.G.; Khan, M.A.; Kadry, S.; Hashim, F. Enhanced COOT optimization algorithm for Dimensionality Reduction. In Proceedings of the 2022 Fifth International Conference of Women in Data Science at Prince Sultan University (WiDS PSU), Riyadh, Saudi Arabia, 28–29 March 2022. [Google Scholar] [CrossRef]
  65. Chu, S.C.; Tsai, P.W.; Pan, J.S. Cat swarm optimization. In Pacific Rim International Conference on Artificial Intelligence; Springer: Berlin/Heidelberg, Germany, 2006; pp. 854–858. [Google Scholar]
  66. Yang, X.S. A new metaheuristic bat-inspired algorithm. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010); Springer: Berlin/Heidelberg, Germany, 2010; pp. 65–74. [Google Scholar]
  67. Kumar, S.; Tejani, G.G.; Mirjalili, S. Modified symbiotic organisms search for structural optimization. Eng. Comput. 2019, 35, 1269–1296. [Google Scholar] [CrossRef] [Green Version]
  68. Nadimi-Shahraki, M.H.; Fatahi, A.; Zamani, H.; Mirjalili, S.; Abualigah, L. An Improved Moth-Flame Optimization Algorithm with Adaptation Mechanism to Solve Numerical and Mechanical Engineering Problems. Entropy 2021, 23, 1637. [Google Scholar] [CrossRef] [PubMed]
  69. Hussien, A.G.; Amin, M.; Abd El Aziz, M. A comprehensive review of moth-flame optimisation: Variants, hybrids, and applications. J. Exp. Theor. Artif. Intell. 2020, 32, 705–725. [Google Scholar] [CrossRef]
  70. Hussien, A.G.; Heidari, A.A.; Ye, X.; Liang, G.; Chen, H.; Pan, Z. Boosting whale optimization with evolution strategy and Gaussian random walks: An image segmentation method. Eng. Comput. 2022, 1–45. [Google Scholar] [CrossRef]
  71. Hussien, A.G.; Hassanien, A.E.; Houssein, E.H.; Amin, M.; Azar, A.T. New binary whale optimization algorithm for discrete optimization problems. Eng. Optim. 2020, 52, 945–959. [Google Scholar] [CrossRef]
  72. Hussien, A.G.; Oliva, D.; Houssein, E.H.; Juan, A.A.; Yu, X. Binary whale optimization algorithm for dimensionality reduction. Mathematics 2020, 8, 1821. [Google Scholar] [CrossRef]
  73. Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper optimisation algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47. [Google Scholar] [CrossRef] [Green Version]
  74. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
  75. Hussien, A.G. An enhanced opposition-based Salp Swarm Algorithm for global optimization and engineering problems. J. Ambient. Intell. Humaniz. Comput. 2021, 13, 129–150. [Google Scholar] [CrossRef]
  76. Hussien, A.G.; Amin, M.; Wang, M.; Liang, G.; Alsanad, A.; Gumaei, A.; Chen, H. Crow Search Algorithm: Theory, Recent Advances, and Applications. IEEE Access 2020, 8, 173548–173565. [Google Scholar] [CrossRef]
  77. Cheng, M.Y.; Prayogo, D. Symbiotic organisms search: A new metaheuristic optimization algorithm. Comput. Struct. 2014, 139, 98–112. [Google Scholar] [CrossRef]
  78. Abualigah, L.; Abd Elaziz, M.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 2022, 191, 116158. [Google Scholar] [CrossRef]
  79. Arora, S.; Singh, S. Butterfly optimization algorithm: A novel approach for global optimization. Soft Comput. 2019, 23, 715–734. [Google Scholar] [CrossRef]
  80. Wang, S.; Hussien, A.G.; Jia, H.; Abualigah, L.; Zheng, R. Enhanced Remora Optimization Algorithm for Solving Constrained Engineering Optimization Problems. Mathematics 2022, 10, 1696. [Google Scholar] [CrossRef]
  81. Zheng, R.; Jia, H.; Abualigah, L.; Liu, Q.; Wang, S. An improved arithmetic optimization algorithm with forced switching mechanism for global optimization problems. Math. Biosci. Eng. 2022, 19, 473–512. [Google Scholar] [CrossRef]
  82. Dhiman, G.; Kumar, V. Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowl.-Based Syst. 2019, 165, 169–196. [Google Scholar] [CrossRef]
  83. Assiri, A.S.; Hussien, A.G.; Amin, M. Ant Lion Optimization: Variants, hybrids, and applications. IEEE Access 2020, 8, 77746–77764. [Google Scholar] [CrossRef]
  84. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef]
  85. Erol, O.K.; Eksin, I. A new optimization method: Big bang–big crunch. Adv. Eng. Softw. 2006, 37, 106–111. [Google Scholar] [CrossRef]
  86. Shareef, H.; Ibrahim, A.A.; Mutlag, A.H. Lightning search algorithm. Appl. Soft Comput. 2015, 36, 315–333. [Google Scholar] [CrossRef]
  87. Abedinpourshotorban, H.; Shamsuddin, S.M.; Beheshti, Z.; Jawawi, D.N. Electromagnetic field optimization: A physics-inspired metaheuristic optimization algorithm. Swarm Evol. Comput. 2016, 26, 8–22. [Google Scholar] [CrossRef]
  88. Kaveh, A.; Dadras, A. A novel meta-heuristic optimization algorithm: Thermal exchange optimization. Adv. Eng. Softw. 2017, 110, 69–84. [Google Scholar] [CrossRef]
  89. Doğan, B.; Ölmez, T. A new metaheuristic for numerical function optimization: Vortex Search algorithm. Inf. Sci. 2015, 293, 125–145. [Google Scholar] [CrossRef]
  90. Tabari, A.; Ahmad, A. A new optimization method: Electro-Search algorithm. Comput. Chem. Eng. 2017, 103, 1–11. [Google Scholar] [CrossRef]
  91. Zhao, W.; Wang, L.; Zhang, Z. A novel atom search optimization for dispersion coefficient estimation in groundwater. Future Gener. Comput. Syst. 2019, 91, 601–610. [Google Scholar] [CrossRef]
  92. Lam, A.Y.; Li, V.O. Chemical-reaction-inspired metaheuristic for optimization. IEEE Trans. Evol. Comput. 2009, 14, 381–399. [Google Scholar] [CrossRef] [Green Version]
  93. Huan, T.T.; Kulkarni, A.J.; Kanesan, J.; Huang, C.J.; Abraham, A. Ideology algorithm: A socio-inspired optimization methodology. Neural Comput. Appl. 2017, 28, 845–876. [Google Scholar] [CrossRef]
  94. Mousavirad, S.J.; Ebrahimpour-Komleh, H. Human mental search: A new population-based metaheuristic optimization algorithm. Appl. Intell. 2017, 47, 850–887. [Google Scholar] [CrossRef]
  95. Shi, Y. Brain storm optimization algorithm. In International Conference in Swarm Intelligence; Springer: Berlin/Heidelberg, Germany, 2011; pp. 303–309. [Google Scholar]
  96. Xu, Y.; Cui, Z.; Zeng, J. Social emotional optimization algorithm for nonlinear constrained optimization problems. In International Conference on Swarm, Evolutionary, and Memetic Computing; Springer: Berlin/Heidelberg, Germany, 2010; pp. 583–590. [Google Scholar]
  97. Kumar, M.; Kulkarni, A.J.; Satapathy, S.C. Socio evolution & learning optimization algorithm: A socio-inspired optimization methodology. Future Gener. Comput. Syst. 2018, 81, 252–272. [Google Scholar]
  98. Tan, Y.; Zhu, Y. Fireworks algorithm for optimization. In International Conference in Swarm Intelligence; Springer: Berlin/Heidelberg, Germany, 2010; pp. 355–364. [Google Scholar]
  99. Kashan, A.H. League championship algorithm: A new algorithm for numerical function optimization. In Proceedings of the 2009 International Conference of Soft Computing and Pattern Recognition, Malacca, Malaysia, 4–7 December 2009; pp. 43–48. [Google Scholar]
  100. Osaba, E.; Diaz, F.; Onieva, E. Golden ball: A novel meta-heuristic to solve combinatorial optimization problems based on soccer concepts. Appl. Intell. 2014, 41, 145–166. [Google Scholar] [CrossRef]
  101. Razmjooy, N.; Khalilpour, M.; Ramezani, M. A new meta-heuristic optimization algorithm inspired by FIFA world cup competitions: Theory and its application in PID designing for AVR system. J. Control Autom. Electr. Syst. 2016, 27, 419–440. [Google Scholar] [CrossRef]
  102. Zheng, R.; Jia, H.; Abualigah, L.; Liu, Q.; Wang, S. Deep ensemble of slime mold algorithm and arithmetic optimization algorithm for global optimization. Processes 2021, 9, 1774. [Google Scholar] [CrossRef]
  103. Wang, S.; Liu, Q.; Liu, Y.; Jia, H.; Abualigah, L.; Zheng, R.; Wu, D. A Hybrid SSA and SMA with mutation opposition-based learning for constrained engineering problems. Comput. Intell. Neurosci. 2021, 2021. [Google Scholar] [CrossRef] [PubMed]
  104. Goldberg, D.E.; Holland, J.H. Genetic algorithms and machine learning. In Proceedings of the Sixth Annual Conference on Computational Learning Theory, Santa Cruz, CA, USA, 26–28 July 1993. [Google Scholar]
  105. Hansen, N.; Müller, S.D.; Koumoutsakos, P. Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES). Evol. Comput. 2003, 11, 1–18. [Google Scholar] [CrossRef]
  106. Tanabe, R.; Fukunaga, A.S. Improving the search performance of SHADE using linear population size reduction. In Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, China, 6–11 July 2014; pp. 1658–1665. [Google Scholar]
  107. Awad, N.H.; Ali, M.Z.; Suganthan, P.N.; Reynolds, R.G. An ensemble sinusoidal parameter adaptation incorporated with L-SHADE for solving CEC2014 benchmark problems. In Proceedings of the 2016 IEEE Congress on Evolutionary Computation (CEC), Vancouver, BC, Canada, 24–29 July 2016; pp. 2958–2965. [Google Scholar]
  108. Mirjalili, S. SCA: A sine cosine algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  109. Zhao, W.; Wang, L.; Zhang, Z. Artificial ecosystem-based optimization: A novel nature-inspired meta-heuristic algorithm. Neural Comput. Appl. 2020, 32, 9383–9425. [Google Scholar] [CrossRef]
  110. Too, J.; Abdullah, A.R.; Mohd Saad, N. A new quadratic binary harris hawk optimization for feature selection. Electronics 2019, 8, 1130. [Google Scholar] [CrossRef] [Green Version]
  111. Hans, R.; Kaur, H.; Kaur, N. Opposition-based Harris Hawks optimization algorithm for feature selection in breast mass classification. J. Interdiscip. Math. 2020, 23, 97–106. [Google Scholar]
  112. Jiao, S.; Chong, G.; Huang, C.; Hu, H.; Wang, M.; Heidari, A.A.; Chen, H.; Zhao, X. Orthogonally adapted Harris Hawk Optimization for parameter estimation of photovoltaic models. Energy 2020, 203, 117804. [Google Scholar] [CrossRef]
  113. Fan, C.; Zhou, Y.; Tang, Z. Neighborhood centroid opposite-based learning Harris Hawks optimization for training neural networks. Evol. Intell. 2020, 14, 1847–1867. [Google Scholar] [CrossRef]
  114. Song, Y.; Tan, X.; Mizzi, S. Optimal parameter extraction of the proton exchange membrane fuel cells based on a new Harris Hawks Optimization algorithm. Energy Sources Part A Recover. Util. Environ. Eff. 2020, 1–18. [Google Scholar] [CrossRef]
  115. Hussain, K.; Zhu, W.; Salleh, M.N.M. Long-term memory Harris’ hawk optimization for high dimensional and optimal power flow problems. IEEE Access 2019, 7, 147596–147616. [Google Scholar] [CrossRef]
  116. Golilarz, N.A.; Mirmozaffari, M.; Gashteroodkhani, T.A.; Ali, L.; Dolatsara, H.A.; Boskabadi, A.; Yazdi, M. Optimized wavelet-based satellite image de-noising with multi-population differential evolution-assisted harris hawks optimization algorithm. IEEE Access 2020, 8, 133076–133085. [Google Scholar] [CrossRef]
  117. Wunnava, A.; Naik, M.K.; Panda, R.; Jena, B.; Abraham, A. An adaptive Harris hawks optimization technique for two dimensional grey gradient based multilevel image thresholding. Appl. Soft Comput. 2020, 95, 106526. [Google Scholar] [CrossRef]
  118. Jia, H.; Lang, C.; Oliva, D.; Song, W.; Peng, X. Dynamic harris hawks optimization with mutation mechanism for satellite image segmentation. Remote Sens. 2019, 11, 1421. [Google Scholar] [CrossRef] [Green Version]
  119. Shao, K.; Fu, W.; Tan, J.; Wang, K. Coordinated Approach Fusing Time-shift Multiscale Dispersion Entropy and Vibrational Harris Hawks Optimization-based SVM for Fault Diagnosis of Rolling Bearing. Measurement 2020, 173, 108580. [Google Scholar] [CrossRef]
  120. Wei, Y.; Lv, H.; Chen, M.; Wang, M.; Heidari, A.A.; Chen, H.; Li, C. Predicting Entrepreneurial Intention of Students: An Extreme Learning Machine With Gaussian Barebone Harris Hawks Optimizer. IEEE Access 2020, 8, 76841–76855. [Google Scholar] [CrossRef]
  121. Li, C.; Li, J.; Chen, H.; Jin, M.; Ren, H. Enhanced Harris hawks optimization with multi-strategy for global optimization tasks. Expert Syst. Appl. 2021, 185, 115499. [Google Scholar] [CrossRef]
  122. Arini, F.Y.; Chiewchanwattana, S.; Soomlek, C.; Sunat, K. Joint Opposite Selection (JOS): A premiere joint of selective leading opposition and dynamic opposite enhanced Harris’ hawks optimization for solving single-objective problems. Expert Syst. Appl. 2022, 188, 116001. [Google Scholar] [CrossRef]
  123. Thaher, T.; Heidari, A.A.; Mafarja, M.; Dong, J.S.; Mirjalili, S. Binary Harris Hawks optimizer for high-dimensional, low sample size feature selection. In Evolutionary Machine Learning Techniques; Springer: Berlin/Heidelberg, Germany, 2020; pp. 251–272. [Google Scholar]
  124. Thaher, T.; Arman, N. Efficient Multi-Swarm Binary Harris Hawks Optimization as a Feature Selection Approach for Software Fault Prediction. In Proceedings of the 2020 11th International Conference on Information and Communication Systems (ICICS), Irbid, Jordan, 7–9 April 2020; pp. 249–254. [Google Scholar]
  125. Chellal, M.; Benmessahel, I. Dynamic Complex Protein Detection using Binary Harris Hawks Optimization. In Journal of Physics: Conference Series; IOP Publishing: Bristol, UK, 2020; Volume 1642, p. 012019. [Google Scholar]
  126. Dokeroglu, T.; Deniz, A.; Kiziloz, H.E. A robust multiobjective Harris’ Hawks Optimization algorithm for the binary classification problem. Knowl.-Based Syst. 2021, 227, 107219. [Google Scholar] [CrossRef]
  127. Chantar, H.; Thaher, T.; Turabieh, H.; Mafarja, M.; Sheta, A. BHHO-TVS: A binary harris hawks optimizer with time-varying scheme for solving data classification problems. Appl. Sci. 2021, 11, 6516. [Google Scholar] [CrossRef]
  128. Gupta, S.; Deep, K.; Heidari, A.A.; Moayedi, H.; Wang, M. Opposition-based Learning Harris Hawks Optimization with Advanced Transition Rules: Principles and Analysis. Expert Syst. Appl. 2020, 158, 113510. [Google Scholar] [CrossRef]
  129. Ridha, H.M.; Hizam, H.; Mirjalili, S.; Othman, M.L.; Ya’acob, M.E.; Abualigah, L. A Novel Theoretical and Practical Methodology for Extracting the Parameters of the Single and Double Diode Photovoltaic Models (December 2021). IEEE Access 2022, 10, 11110–11137. [Google Scholar] [CrossRef]
  130. Abbassi, A.; Mehrez, R.B.; Touaiti, B.; Abualigah, L.; Touti, E. Parameterization of Photovoltaic Solar Cell Double-Diode Model based on Improved Arithmetic Optimization Algorithm. Optik 2022, 253, 168600. [Google Scholar] [CrossRef]
  131. Jamei, M.; Karbasi, M.; Mosharaf-Dehkordi, M.; Olumegbon, I.A.; Abualigah, L.; Said, Z.; Asadi, A. Estimating the density of hybrid nanofluids for thermal energy application: Application of non-parametric and evolutionary polynomial regression data-intelligent techniques. Measurement 2021, 189, 110524. [Google Scholar] [CrossRef]
  132. Amer, D.A.; Attiya, G.; Zeidan, I.; Nasr, A.A. Elite learning Harris hawks optimizer for multi-objective task scheduling in cloud computing. J. Supercomput. 2021, 78, 2793–2818. [Google Scholar] [CrossRef]
  133. Akdag, O.; Ates, A.; Yeroglu, C. Modification of Harris hawks optimization algorithm with random distribution functions for optimum power flow problem. Neural Comput. Appl. 2021, 33, 1959–1985. [Google Scholar] [CrossRef]
  134. Zhang, Y.; Zhou, X.; Shih, P.C. Modified Harris Hawks Optimization Algorithm for Global Optimization Problems. Arab. J. Sci. Eng. 2020, 45, 10949–10974. [Google Scholar] [CrossRef]
  135. Yousri, D.; Allam, D.; Eteiba, M.B. Optimal photovoltaic array reconfiguration for alleviating the partial shading influence based on a modified harris hawks optimizer. Energy Convers. Manag. 2020, 206, 112470. [Google Scholar] [CrossRef]
  136. Zhao, L.; Li, Z.; Chen, H.; Li, J.; Xiao, J.; Yousefi, N. A multi-criteria optimization for a CCHP with the fuel cell as primary mover using modified Harris Hawks optimization. Energy Sources Part A Recover. Util. Environ. Eff. 2020, 1–16. [Google Scholar] [CrossRef]
  137. Liu, Y.; Chong, G.; Heidari, A.A.; Chen, H.; Liang, G.; Ye, X.; Cai, Z.; Wang, M. Horizontal and vertical crossover of Harris hawk optimizer with Nelder-Mead simplex for parameter estimation of photovoltaic models. Energy Convers. Manag. 2020, 223, 113211. [Google Scholar] [CrossRef]
  138. Rizk-Allah, R.M.; Hassanien, A.E. A hybrid Harris hawks-Nelder-Mead optimization for practical nonlinear ordinary differential equations. Evol. Intell. 2020, 15, 141–165. [Google Scholar] [CrossRef]
  139. Yousri, D.; Mirjalili, S.; Machado, J.T.; Thanikanti, S.B.; Fathy, A. Efficient fractional-order modified Harris hawks optimizer for proton exchange membrane fuel cell modeling. Eng. Appl. Artif. Intell. 2021, 100, 104193. [Google Scholar] [CrossRef]
  140. Irfan, M.; Oh, S.R.; Rhee, S.B. An Effective Coordination Setting for Directional Overcurrent Relays Using Modified Harris Hawk Optimization. Electronics 2021, 10, 3007. [Google Scholar] [CrossRef]
  141. Ge, L.; Liu, J.; Yan, J.; Rafiq, M.U. Improved Harris Hawks Optimization for Configuration of PV Intelligent Edge Terminals. IEEE Trans. Sustain. Comput. 2021. [Google Scholar] [CrossRef]
  142. Singh, T.; Panda, S.S.; Mohanty, S.R.; Dwibedy, A. Opposition learning based Harris hawks optimizer for data clustering. J. Ambient. Intell. Humaniz. Comput. 2021, 1–16. [Google Scholar] [CrossRef]
  143. Kardani, N.; Bardhan, A.; Roy, B.; Samui, P.; Nazem, M.; Armaghani, D.J.; Zhou, A. A novel improved Harris Hawks optimization algorithm coupled with ELM for predicting permeability of tight carbonates. Eng. Comput. 2021, 1–24. [Google Scholar] [CrossRef]
  144. Guo, W.; Xu, P.; Dai, F.; Zhao, F.; Wu, M. Improved Harris hawks optimization algorithm based on random unscented sigma point mutation strategy. Appl. Soft Comput. 2021, 113, 108012. [Google Scholar] [CrossRef]
  145. Liu, C. An improved Harris hawks optimizer for job-shop scheduling problem. J. Supercomput. 2021, 77, 14090–14129. [Google Scholar] [CrossRef]
  146. Duan, Y.X.; Liu, C.Y. An improved Harris Hawk algorithm based on Golden Sine mechanism. In Proceedings of the 2021 4th International Conference on Advanced Electronic Materials, Computers and Software Engineering (AEMCSE), Changsha, China, 26–28 March 2021; pp. 493–497. [Google Scholar]
  147. Hu, H.; Ao, Y.; Bai, Y.; Cheng, R.; Xu, T. An Improved Harris’s Hawks Optimization for SAR Target Recognition and Stock Market Index Prediction. IEEE Access 2020, 8, 65891–65910. [Google Scholar] [CrossRef]
  148. Li, Q.; Song, K.; He, Z.; Li, E.; Cheng, A.; Chen, T. The artificial tree (AT) algorithm. Eng. Appl. Artif. Intell. 2017, 65, 99–110. [Google Scholar] [CrossRef] [Green Version]
  149. Selim, A.; Kamel, S.; Alghamdi, A.S.; Jurado, F. Optimal Placement of DGs in Distribution System Using an Improved Harris Hawks Optimizer Based on Single-and Multi-Objective Approaches. IEEE Access 2020, 8, 52815–52829. [Google Scholar] [CrossRef]
  150. Sihwail, R.; Omar, K.; Ariffin, K.A.Z.; Tubishat, M. Improved Harris Hawks Optimization Using Elite Opposition-Based Learning and Novel Search Mechanism for Feature Selection. IEEE Access 2020, 8, 121127–121145. [Google Scholar] [CrossRef]
  151. Song, S.; Wang, P.; Heidari, A.A.; Wang, M.; Zhao, X.; Chen, H.; He, W.; Xu, S. Dimension decided Harris hawks optimization with Gaussian mutation: Balance analysis and diversity patterns. Knowl.-Based Syst. 2021, 215, 106425. [Google Scholar] [CrossRef]
  152. Yin, Q.; Cao, B.; Li, X.; Wang, B.; Zhang, Q.; Wei, X. An Intelligent Optimization Algorithm for Constructing a DNA Storage Code: NOL-HHO. Int. J. Mol. Sci. 2020, 21, 2191. [Google Scholar] [CrossRef] [Green Version]
  153. Ridha, H.M.; Heidari, A.A.; Wang, M.; Chen, H. Boosted mutation-based Harris hawks optimizer for parameters identification of single-diode solar cell models. Energy Convers. Manag. 2020, 209, 112660. [Google Scholar] [CrossRef]
  154. Zhang, X.; Zhao, K.; Niu, Y. Improved Harris Hawks Optimization Based on Adaptive Cooperative Foraging and Dispersed Foraging Strategies. IEEE Access 2020, 8, 160297–160314. [Google Scholar] [CrossRef]
  155. Menesy, A.S.; Sultan, H.M.; Selim, A.; Ashmawy, M.G.; Kamel, S. Developing and applying chaotic harris hawks optimization technique for extracting parameters of several proton exchange membrane fuel cell stacks. IEEE Access 2019, 8, 1146–1159. [Google Scholar] [CrossRef]
  156. Chen, H.; Jiao, S.; Wang, M.; Heidari, A.A.; Zhao, X. Parameters identification of photovoltaic cells and modules using diversification-enriched Harris hawks optimization with chaotic drifts. J. Clean. Prod. 2020, 244, 118778. [Google Scholar] [CrossRef]
  157. Gao, Z.-M.; Zhao, J.; Hu, Y.-R.; Chen, H.-F. The improved Harris hawk optimization algorithm with the Tent map. In Proceedings of the 2019 3rd International Conference on Electronic Information Technology and Computer Engineering (EITCE), Xiamen, China, 18–20 October 2019; pp. 336–339. [Google Scholar]
  158. Dhawale, D.; Kamboj, V.K.; Anand, P. An improved Chaotic Harris Hawks Optimizer for solving numerical and engineering optimization problems. Eng. Comput. 2021, 1–46. [Google Scholar] [CrossRef]
  159. Basha, J.; Bacanin, N.; Vukobrat, N.; Zivkovic, M.; Venkatachalam, K.; Hubálovskỳ, S.; Trojovskỳ, P. Chaotic Harris Hawks Optimization with Quasi-Reflection-Based Learning: An Application to Enhance CNN Design. Sensors 2021, 21, 6654. [Google Scholar] [CrossRef]
  160. Hussien, A.G.; Amin, M. A self-adaptive Harris Hawks optimization algorithm with opposition-based learning and chaotic local search strategy for global optimization and feature selection. Int. J. Mach. Learn. Cybern. 2021, 13, 309–336. [Google Scholar] [CrossRef]
  161. Dehkordi, A.A.; Sadiq, A.S.; Mirjalili, S.; Ghafoor, K.Z. Nonlinear-based Chaotic Harris Hawks Optimizer: Algorithm and Internet of Vehicles application. Appl. Soft Comput. 2021, 109, 107574. [Google Scholar] [CrossRef]
  162. Jiao, S.; Wang, C.; Gao, R.; Li, Y.; Zhang, Q. Harris Hawks Optimization with Multi-Strategy Search and Application. Symmetry 2021, 13, 2364. [Google Scholar] [CrossRef]
  163. Zhong, C.; Li, G. Comprehensive learning Harris hawks-equilibrium optimization with terminal replacement mechanism for constrained optimization problems. Expert Syst. Appl. 2021, 192, 116432. [Google Scholar] [CrossRef]
  164. Abd Elaziz, M.; Yang, H.; Lu, S. A multi-leader Harris hawk optimization based on differential evolution for feature selection and prediction influenza viruses H1N1. Artif. Intell. Rev. 2022, 55, 2675–2732. [Google Scholar] [CrossRef]
  165. Bujok, P. Harris Hawks Optimisation: Using of an Archive. In International Conference on Artificial Intelligence and Soft Computing; Springer: Berlin/Heidelberg, Germany, 2021; pp. 415–423. [Google Scholar]
  166. Al-Betar, M.A.; Awadallah, M.A.; Heidari, A.A.; Chen, H.; Al-Khraisat, H.; Li, C. Survival exploration strategies for harris hawks optimizer. Expert Syst. Appl. 2021, 168, 114243. [Google Scholar] [CrossRef]
  167. Qu, C.; Zhang, L.; Li, J.; Deng, F.; Tang, Y.; Zeng, X.; Peng, X. Improving feature selection performance for classification of gene expression data using Harris Hawks optimizer with variable neighborhood learning. Briefings Bioinform. 2021, 22, bbab097. [Google Scholar] [CrossRef]
  168. Nandi, A.; Kamboj, V.K. A Canis lupus inspired upgraded Harris hawks optimizer for nonlinear, constrained, continuous, and discrete engineering design problem. Int. J. Numer. Methods Eng. 2021, 122, 1051–1088. [Google Scholar] [CrossRef]
  169. Gölcük, İ.; Ozsoydan, F.B. Quantum particles-enhanced multiple Harris Hawks swarms for dynamic optimization problems. Expert Syst. Appl. 2021, 167, 114202. [Google Scholar] [CrossRef]
  170. Yu, Z.; Du, J.; Li, G. Compact Harris Hawks Optimization Algorithm. In Proceedings of the 2021 40th Chinese Control Conference (CCC), Shanghai, China, 26–28 July 2021; pp. 1925–1930. [Google Scholar]
  171. Wang, S.; Jia, H.; Liu, Q.; Zheng, R. An improved hybrid Aquila Optimizer and Harris Hawks Optimization for global optimization. Math. Biosci. Eng 2021, 18, 7076–7109. [Google Scholar] [CrossRef] [PubMed]
  172. Wang, S.; Jia, H.; Abualigah, L.; Liu, Q.; Zheng, R. An improved hybrid aquila optimizer and harris hawks algorithm for solving industrial engineering optimization problems. Processes 2021, 9, 1551. [Google Scholar] [CrossRef]
  173. Abualigah, L.; Yousri, D.; Abd Elaziz, M.; Ewees, A.A.; Al-qaness, M.A.; Gandomi, A.H. Aquila Optimizer: A novel meta-heuristic optimization Algorithm. Comput. Ind. Eng. 2021, 157, 107250. [Google Scholar] [CrossRef]
  174. Jouhari, H.; Lei, D.; Al-qaness, M.A.; Elaziz, M.A.; Damaševičius, R.; Korytkowski, M.; Ewees, A.A. Modified Harris Hawks Optimizer for Solving Machine Scheduling Problems. Symmetry 2020, 12, 1460. [Google Scholar] [CrossRef]
  175. Attiya, I.; Abd Elaziz, M.; Xiong, S. Job scheduling in cloud computing using a modified harris hawks optimization and simulated annealing algorithm. Comput. Intell. Neurosci. 2020, 2020. [Google Scholar] [CrossRef] [Green Version]
  176. Fu, W.; Zhang, K.; Wang, K.; Wen, B.; Fang, P.; Zou, F. A hybrid approach for multi-step wind speed forecasting based on two-layer decomposition, improved hybrid DE-HHO optimization and KELM. Renew. Energy 2021, 164, 211–229. [Google Scholar] [CrossRef]
  177. Abd Elaziz, M.; Heidari, A.A.; Fujita, H.; Moayedi, H. A competitive chain-based Harris Hawks Optimizer for global optimization and multi-level image thresholding problems. Appl. Soft Comput. 2020, 95, 106347. [Google Scholar] [CrossRef]
  178. Houssein, E.H.; Hosney, M.E.; Elhoseny, M.; Oliva, D.; Mohamed, W.M.; Hassaballah, M. Hybrid Harris hawks optimization with cuckoo search for drug design and discovery in chemoinformatics. Sci. Rep. 2020, 10, 14439. [Google Scholar] [CrossRef]
  179. Barshandeh, S.; Piri, F.; Sangani, S.R. HMPA: An innovative hybrid multi-population algorithm based on artificial ecosystem-based and Harris Hawks optimization algorithms for engineering problems. Eng. Comput. 2022, 38, 1581–1625. [Google Scholar] [CrossRef]
  180. Xie, W.; Xing, C.; Wang, J.; Guo, S.; Guo, M.W.; Zhu, L.F. Hybrid Henry Gas Solubility Optimization Algorithm Based on the Harris Hawk Optimization. IEEE Access 2020, 8, 144665–144692. [Google Scholar] [CrossRef]
  181. Fu, W.; Wang, K.; Tan, J.; Zhang, K. A composite framework coupling multiple feature selection, compound prediction models and novel hybrid swarm optimizer-based synchronization optimization strategy for multi-step ahead short-term wind speed forecasting. Energy Convers. Manag. 2020, 205, 112461. [Google Scholar] [CrossRef]
  182. Qu, C.; He, W.; Peng, X.; Peng, X. Harris Hawks Optimization with Information Exchange. Appl. Math. Model. 2020, 84, 52–75. [Google Scholar] [CrossRef]
  183. Kamboj, V.K.; Nandi, A.; Bhadoria, A.; Sehgal, S. An intensify Harris Hawks optimizer for numerical and engineering optimization problems. Appl. Soft Comput. 2020, 89, 106018. [Google Scholar] [CrossRef]
  184. Dhawale, D.; Kamboj, V.K. hHHO-IGWO: A New Hybrid Harris Hawks Optimizer for Solving Global Optimization Problems. In Proceedings of the 2020 International Conference on Computation, Automation and Knowledge Management (ICCAKM), Dubai, United Arab Emirates, 9–10 January 2020; pp. 52–57. [Google Scholar]
  185. Fu, W.; Shao, K.; Tan, J.; Wang, K. Fault diagnosis for rolling bearings based on composite multiscale fine-sorted dispersion entropy and SVM with hybrid mutation SCA-HHO algorithm optimization. IEEE Access 2020, 8, 13086–13104. [Google Scholar] [CrossRef]
  186. Suresh, T.; Brijet, Z.; Sheeba, T.B. CMVHHO-DKMLC: A Chaotic Multi Verse Harris Hawks optimization (CMV-HHO) algorithm based deep kernel optimized machine learning classifier for medical diagnosis. Biomed. Signal Process. Control 2021, 70, 103034. [Google Scholar] [CrossRef]
  187. ElSayed, S.K.; Elattar, E.E. Hybrid Harris hawks optimization with sequential quadratic programming for optimal coordination of directional overcurrent relays incorporating distributed generation. Alex. Eng. J. 2021, 60, 2421–2433. [Google Scholar] [CrossRef]
  188. Kaveh, A.; Rahmani, P.; Eslamlou, A.D. An efficient hybrid approach based on Harris Hawks optimization and imperialist competitive algorithm for structural optimization. Eng. Comput. 2021, 1–29. [Google Scholar] [CrossRef]
  189. Atashpaz-Gargari, E.; Lucas, C. Imperialist competitive algorithm: An algorithm for optimization inspired by imperialistic competition. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation, Singapore, 25–28 September 2007; pp. 4661–4667. [Google Scholar]
  190. Sihwail, R.; Solaiman, O.S.; Omar, K.; Ariffin, K.A.Z.; Alswaitti, M.; Hashim, I. A Hybrid Approach for Solving Systems of Nonlinear Equations Using Harris Hawks Optimization and Newton’s Method. IEEE Access 2021, 9, 95791–95807. [Google Scholar] [CrossRef]
  191. Mohamed, A.W.; Hadi, A.A.; Mohamed, A.K. Gaining-sharing knowledge based algorithm for solving optimization problems: A novel nature-inspired algorithm. Int. J. Mach. Learn. Cybern. 2020, 11, 1501–1529. [Google Scholar] [CrossRef]
  192. Abualigah, L.; Abd Elaziz, M.; Shehab, M.; Ahmad Alomari, O.; Alshinwan, M.; Alabool, H.; Al-Arabiat, D.A. Hybrid Harris Hawks Optimization with Differential Evolution for Data Clustering. In Metaheuristics in Machine Learning: Theory and Applications; Springer: Berlin/Heidelberg, Germany, 2021; pp. 267–299. [Google Scholar]
  193. Azar, N.A.; Milan, S.G.; Kayhomayoon, Z. The prediction of longitudinal dispersion coefficient in natural streams using LS-SVM and ANFIS optimized by Harris hawk optimization algorithm. J. Contam. Hydrol. 2021, 240, 103781. [Google Scholar] [CrossRef] [PubMed]
  194. Firouzi, B.; Abbasi, A.; Sendur, P. Identification and evaluation of cracks in electrostatically actuated resonant gas sensors using Harris Hawk/Nelder Mead and perturbation methods. Smart Struct. Syst. 2021, 28, 121–142. [Google Scholar]
  195. Li, W.; Shi, R.; Zou, H.; Dong, J. Fireworks Harris Hawk Algorithm Based on Dynamic Competition Mechanism for Numerical Optimization. In International Conference on Swarm Intelligence; Springer: Berlin/Heidelberg, Germany, 2021; pp. 441–450. [Google Scholar]
  196. Li, C.; Li, J.; Chen, H.; Heidari, A.A. Memetic Harris Hawks Optimization: Developments and perspectives on project scheduling and QoS-aware web service composition. Expert Syst. Appl. 2021, 171, 114529. [Google Scholar] [CrossRef]
  197. Ahmad, A.A. Solving partial differential equations via a hybrid method between homotopy analytical method and Harris hawks optimization algorithm. Int. J. Nonlinear Anal. Appl. 2022, 13, 663–671. [Google Scholar]
  198. Yuan, Y.; Ren, J.; Zu, J.; Mu, X. An adaptive instinctive reaction strategy based on Harris hawks optimization algorithm for numerical optimization problems. AIP Adv. 2021, 11, 025012. [Google Scholar] [CrossRef]
  199. Setiawan, I.N.; Kurniawan, R.; Yuniarto, B.; Caraka, R.E.; Pardamean, B. Parameter Optimization of Support Vector Regression Using Harris Hawks Optimization. Procedia Comput. Sci. 2021, 179, 17–24. [Google Scholar] [CrossRef]
  200. Hossain, M.A.; Noor, R.M.; Yau, K.L.A.; Azzuhri, S.R.; Z’Abar, M.R.; Ahmedy, I.; Jabbarpour, M.R. Multi-Objective Harris Hawks Optimization Algorithm Based 2-Hop Routing Algorithm for CR-VANET. IEEE Access 2021, 9, 58230–58242. [Google Scholar] [CrossRef]
  201. Dabba, A.; Tari, A.; Meftali, S. A new multi-objective binary Harris Hawks optimization for gene selection in microarray data. J. Ambient. Intell. Humaniz. Comput. 2021, 1–20. [Google Scholar] [CrossRef]
  202. Jangir, P.; Heidari, A.A.; Chen, H. Elitist non-dominated sorting Harris hawks optimization: Framework and developments for multi-objective problems. Expert Syst. Appl. 2021, 186, 115747. [Google Scholar] [CrossRef]
  203. Du, P.; Wang, J.; Hao, Y.; Niu, T.; Yang, W. A novel hybrid model based on multi-objective Harris hawks optimization algorithm for daily PM2.5 and PM10 forecasting. Appl. Soft Comput. 2020, 96, 106620. [Google Scholar] [CrossRef]
  204. Islam, M.Z.; Wahab, N.I.A.; Veerasamy, V.; Hizam, H.; Mailah, N.F.; Guerrero, J.M.; Mohd Nasir, M.N. A Harris Hawks Optimization Based Single-and Multi-Objective Optimal Power Flow Considering Environmental Emission. Sustainability 2020, 12, 5248. [Google Scholar] [CrossRef]
  205. Fu, W.; Lu, Q. Multiobjective Optimal Control of FOPID Controller for Hydraulic Turbine Governing Systems Based on Reinforced Multiobjective Harris Hawks Optimization Coupling with Hybrid Strategies. Complexity 2020, 2020, 9274980. [Google Scholar] [CrossRef]
  206. Piri, J.; Mohapatra, P. An Analytical Study of Modified Multi-objective Harris Hawk Optimizer Towards Medical Data Feature Selection. Comput. Biol. Med. 2021, 135, 104558. [Google Scholar] [CrossRef]
  207. Shekarappa G, S.; Mahapatra, S.; Raj, S. Voltage constrained reactive power planning problem for reactive loading variation using hybrid harris hawk particle swarm optimizer. Electr. Power Components Syst. 2021, 49, 421–435. [Google Scholar] [CrossRef]
  208. Mohandas, P.; Devanathan, S.T. Reconfiguration with DG location and capacity optimization using crossover mutation based Harris Hawk Optimization algorithm (CMBHHO). Appl. Soft Comput. 2021, 113, 107982. [Google Scholar] [CrossRef]
  209. Naeijian, M.; Rahimnejad, A.; Ebrahimi, S.M.; Pourmousa, N.; Gadsden, S.A. Parameter estimation of PV solar cells and modules using Whippy Harris Hawks Optimization Algorithm. Energy Rep. 2021, 7, 4047–4063. [Google Scholar] [CrossRef]
  210. Bao, X.; Jia, H.; Lang, C. A novel hybrid harris hawks optimization for color image multilevel thresholding segmentation. IEEE Access 2019, 7, 76529–76546. [Google Scholar] [CrossRef]
  211. Utama, D.M.; Widodo, D.S. An energy-efficient flow shop scheduling using hybrid Harris hawks optimization. Bull. Electr. Eng. Inform. 2021, 10, 1154–1163. [Google Scholar] [CrossRef]
  212. Too, J.; Liang, G.; Chen, H. Memory-based Harris hawk optimization with learning agents: A feature selection approach. Eng. Comput. 2021, 1–22. [Google Scholar] [CrossRef]
  213. Abd Elaziz, M.; Yousri, D. Automatic selection of heavy-tailed distributions-based synergy Henry gas solubility and Harris hawk optimizer for feature selection: Case study drug design and discovery. Artif. Intell. Rev. 2021, 54, 4685–4730. [Google Scholar] [CrossRef]
  214. Balaha, H.M.; El-Gendy, E.M.; Saafan, M.M. CovH2SD: A COVID-19 detection approach based on Harris Hawks Optimization and stacked deep learning. Expert Syst. Appl. 2021, 186, 115805. [Google Scholar] [CrossRef] [PubMed]
  215. Abualigah, L.; Abd Elaziz, M.; Hussien, A.G.; Alsalibi, B.; Jalali, S.M.J.; Gandomi, A.H. Lightning search algorithm: A comprehensive survey. Appl. Intell. 2021, 51, 2353–2376. [Google Scholar] [CrossRef] [PubMed]
  216. Abualigah, L.; Zitar, R.A.; Almotairi, K.H.; Hussein, A.M.; Abd Elaziz, M.; Nikoo, M.R.; Gandomi, A.H. Wind, Solar, and Photovoltaic Renewable Energy Systems with and without Energy Storage Optimization: A Survey of Advanced Machine Learning and Deep Learning Techniques. Energies 2022, 15, 578. [Google Scholar] [CrossRef]
  217. Al Shinwan, M.; Abualigah, L.; Huy, T.D.; Younes Shdefat, A.; Altalhi, M.; Kim, C.; El-Sappagh, S.; Abd Elaziz, M.; Kwak, K.S. An Efficient 5G Data Plan Approach Based on Partially Distributed Mobility Architecture. Sensors 2022, 22, 349. [Google Scholar] [CrossRef] [PubMed]
  218. Islam, M.Z.; Wahab, N.I.A.; Veerasamy, V.; Hizam, H.; Mailah, N.F.; Khan, A.; Sabo, A. Optimal Power Flow using a Novel Harris Hawk Optimization Algorithm to Minimize Fuel Cost and Power loss. In Proceedings of the 2019 IEEE Conference on Sustainable Utilization and Development in Engineering and Technologies (CSUDET), Penang, Malaysia, 7–9 November 2019; pp. 246–250. [Google Scholar]
  219. Paital, S.R.; Ray, P.K.; Mohanty, S.R. A robust dual interval type-2 fuzzy lead-lag based UPFC for stability enhancement using Harris Hawks Optimization. ISA Trans. 2022, 123, 425–442. [Google Scholar] [CrossRef]
  220. Mohanty, D.; Panda, S. Sine cosine adopted Harris’ hawks optimization for function optimization and power system frequency controller design. Int. Trans. Electr. Energy Syst. 2021, 31, e12915. [Google Scholar] [CrossRef]
  221. Abdel Aleem, S.H.E.; Zobaa, A.F.; Balci, M.E.; Ismael, S.M. Harmonic overloading minimization of frequency-dependent components in harmonics polluted distribution systems using harris hawks optimization algorithm. IEEE Access 2019, 7, 100824–100837. [Google Scholar] [CrossRef]
  222. Diaaeldin, I.M.; Aleem, S.H.A.; El-Rafei, A.; Abdelaziz, A.Y.; Ćalasan, M. Optimal Network Reconfiguration and Distributed Generation Allocation using Harris Hawks Optimization. In Proceedings of the 2020 24th International Conference on Information Technology (IT), Zabljak, Montenegro, 18–22 February 2020; pp. 1–6. [Google Scholar]
  223. Abdelsalam, M.; Diab, H.Y.; El-Bary, A. A Metaheuristic Harris Hawk Optimization Approach for Coordinated Control of Energy Management in Distributed Generation Based Microgrids. Appl. Sci. 2021, 11, 4085. [Google Scholar] [CrossRef]
  224. Mossa, M.A.; Kamel, O.M.; Sultan, H.M.; Diab, A.A.Z. Parameter estimation of PEMFC model based on Harris Hawks’ optimization and atom search optimization algorithms. Neural Comput. Appl. 2020, 33, 5555–5570. [Google Scholar] [CrossRef]
  225. Chakraborty, S.; Verma, S.; Salgotra, A.; Elavarasan, R.M.; Elangovan, D.; Mihet-Popa, L. Solar-Based DG Allocation Using Harris Hawks Optimization While Considering Practical Aspects. Energies 2021, 14, 5206. [Google Scholar] [CrossRef]
  226. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S. Parameters extraction of three-diode photovoltaic model using computation and Harris Hawks optimization. Energy 2020, 195, 117040. [Google Scholar] [CrossRef]
  227. Sahoo, B.P.; Panda, S. Load Frequency Control of Solar Photovoltaic/Wind/Biogas/Biodiesel Generator Based Isolated Microgrid Using Harris Hawks Optimization. In Proceedings of the 2020 First International Conference on Power, Control and Computing Technologies (ICPC2T), Raipur, India, 3–5 January 2020; pp. 188–193. [Google Scholar]
  228. Fang, P.; Fu, W.; Wang, K.; Xiong, D.; Zhang, K. A compositive architecture coupling outlier correction, EWT, nonlinear Volterra multi-model fusion with multi-objective optimization for short-term wind speed forecasting. Appl. Energy 2021, 307, 118191. [Google Scholar] [CrossRef]
  229. Roy, R.; Mukherjee, V.; Singh, R.P. Harris hawks optimization algorithm for model order reduction of interconnected wind turbines. ISA Trans. 2021. [Google Scholar] [CrossRef]
  230. Hassan, M.H.; Kamel, S.; Abualigah, L.; Eid, A. Development and application of slime mould algorithm for optimal economic emission dispatch. Expert Syst. Appl. 2021, 182, 115205. [Google Scholar] [CrossRef]
  231. Houssein, E.H.; Dirar, M.; Abualigah, L.; Mohamed, W.M. An efficient equilibrium optimizer with support vector regression for stock market prediction. Neural Comput. Appl. 2021, 34, 3165–3200. [Google Scholar] [CrossRef]
  232. Pham, T.N.; Van Tran, L.; Dao, S.V.T. A Multi-Restart Dynamic Harris Hawk Optimization Algorithm for the Economic Load Dispatch Problem. IEEE Access 2021, 9, 122180–122206. [Google Scholar] [CrossRef]
  233. Nandi, A.; Kamboj, V.K. A meliorated Harris Hawks optimizer for combinatorial unit commitment problem with photovoltaic applications. J. Electr. Syst. Inf. Technol. 2021, 8, 1–73. [Google Scholar] [CrossRef]
  234. Sammen, S.S.; Ghorbani, M.A.; Malik, A.; Tikhamarine, Y.; AmirRahmani, M.; Al-Ansari, N.; Chau, K.W. Enhanced Artificial Neural Network with Harris Hawks Optimization for Predicting Scour Depth Downstream of Ski-Jump Spillway. Appl. Sci. 2020, 10, 5160. [Google Scholar] [CrossRef]
  235. Essa, F.; Abd Elaziz, M.; Elsheikh, A.H. An enhanced productivity prediction model of active solar still using artificial neural network and Harris Hawks optimizer. Appl. Therm. Eng. 2020, 170, 115020. [Google Scholar] [CrossRef]
  236. Moayedi, H.; Gör, M.; Lyu, Z.; Bui, D.T. Herding Behaviors of grasshopper and Harris hawk for hybridizing the neural network in predicting the soil compression coefficient. Measurement 2020, 152, 107389. [Google Scholar] [CrossRef]
  237. Kolli, C.S.; Tatavarthi, U.D. Fraud detection in bank transaction with wrapper model and Harris water optimization-based deep recurrent neural network. Kybernetes 2021, 50, 1731–1750. [Google Scholar] [CrossRef]
  238. Bacanin, N.; Vukobrat, N.; Zivkovic, M.; Bezdan, T.; Strumberger, I. Improved Harris Hawks Optimization Adapted for Artificial Neural Network Training. In International Conference on Intelligent and Fuzzy Systems; Springer: Berlin/Heidelberg, Germany, 2021; pp. 281–289. [Google Scholar]
  239. Atta, E.A.; Ali, A.F.; Elshamy, A.A. Chaotic Harris Hawk Optimization Algorithm for Training Feed-Forward Neural Network. In International Conference on Advanced Intelligent Systems and Informatics; Springer: Berlin/Heidelberg, Germany, 2021; pp. 382–391. [Google Scholar]
  240. Agarwal, P.; Farooqi, N.; Gupta, A.; Mehta, S.; Khandelwal, S. A New Harris Hawk Whale Optimization Algorithm for Enhancing Neural Networks. In Proceedings of the 2021 Thirteenth International Conference on Contemporary Computing (IC3-2021), Noida, India, 5–7 August 2021; pp. 179–186. [Google Scholar]
  241. Bac, B.H.; Nguyen, H.; Thao, N.T.T.; Hanh, V.T.; Duyen, L.T.; Dung, N.T.; Du, N.K.; Hiep, N.H. Estimating heavy metals absorption efficiency in an aqueous solution using nanotube-type halloysite from weathered pegmatites and a novel Harris hawks optimization-based multiple layers perceptron neural network. Eng. Comput. 2021, 1–16. [Google Scholar] [CrossRef]
  242. Alamir, M.A. An enhanced artificial neural network model using the Harris Hawks optimiser for predicting food liking in the presence of background noise. Appl. Acoust. 2021, 178, 108022. [Google Scholar] [CrossRef]
  243. Simsek, O.I.; Alagoz, B.B. A Computational Intelligent Analysis Scheme for Optimal Engine Behavior by Using Artificial Neural Network Learning Models and Harris Hawk Optimization. In Proceedings of the 2021 International Conference on Information Technology (ICIT), Amman, Jordan, 14–15 July 2021; pp. 361–365. [Google Scholar]
  244. Zhang, H.; Nguyen, H.; Bui, X.N.; Pradhan, B.; Asteris, P.G.; Costache, R.; Aryal, J. A generalized artificial intelligence model for estimating the friction angle of clays in evaluating slope stability using a deep neural network and Harris Hawks optimization algorithm. Eng. Comput. 2021, 1–14. [Google Scholar] [CrossRef]
245. Wunnava, A.; Naik, M.K.; Panda, R.; Jena, B.; Abraham, A. A differential evolutionary adaptive Harris hawks optimization for two dimensional practical Masi entropy-based multilevel image thresholding. J. King Saud Univ.-Comput. Inf. Sci. 2020. [Google Scholar] [CrossRef]
  246. Golilarz, N.A.; Gao, H.; Demirel, H. Satellite image de-noising with harris hawks meta heuristic optimization algorithm and improved adaptive generalized gaussian distribution threshold function. IEEE Access 2019, 7, 57459–57468. [Google Scholar] [CrossRef]
  247. Shahid, M.; Li, J.P.; Golilarz, N.A.; Addeh, A.; Khan, J.; Haq, A.U. Wavelet Based Image DE-Noising with Optimized Thresholding Using HHO Algorithm. In Proceedings of the 2019 16th International Computer Conference on Wavelet Active Media Technology and Information Processing, Chengdu, China, 14–15 December 2019; pp. 6–12. [Google Scholar]
  248. Rodríguez-Esparza, E.; Zanella-Calzada, L.A.; Oliva, D.; Heidari, A.A.; Zaldivar, D.; Pérez-Cisneros, M.; Foong, L.K. An Efficient Harris Hawks-inspired Image Segmentation Method. Expert Syst. Appl. 2020, 155, 113428. [Google Scholar] [CrossRef]
  249. Naik, M.K.; Panda, R.; Wunnava, A.; Jena, B.; Abraham, A. A leader Harris hawks optimization for 2-D Masi entropy-based multilevel image thresholding. Multimed. Tools Appl. 2021, 80, 35543–35583. [Google Scholar] [CrossRef]
  250. Lin, S.; Jia, H.; Abualigah, L.; Altalhi, M. Enhanced Slime Mould Algorithm for Multilevel Thresholding Image Segmentation Using Entropy Measures. Entropy 2021, 23, 1700. [Google Scholar] [CrossRef]
  251. Hussien, A.G.; Hassanien, A.E.; Houssein, E.H.; Bhattacharyya, S.; Amin, M. S-shaped Binary Whale Optimization Algorithm for Feature Selection. In Recent Trends in Signal and Image Processing; Springer: Berlin/Heidelberg, Germany, 2019; pp. 79–87. [Google Scholar]
  252. Hussien, A.G.; Houssein, E.H.; Hassanien, A.E. A binary whale optimization algorithm with hyperbolic tangent fitness function for feature selection. In Proceedings of the 2017 Eighth International Conference on Intelligent Computing and Information Systems (ICICIS), Cairo, Egypt, 5–7 December 2017; pp. 166–172. [Google Scholar]
  253. Abualigah, L.M.Q. Feature Selection and Enhanced Krill Herd Algorithm for Text Document Clustering; Springer: Berlin/Heidelberg, Germany, 2019. [Google Scholar]
  254. Abdel-Basset, M.; Ding, W.; El-Shahat, D. A hybrid Harris Hawks optimization algorithm with simulated annealing for feature selection. Artif. Intell. Rev. 2021, 54, 593–637. [Google Scholar] [CrossRef]
  255. Thaher, T.; Saheb, M.; Turabieh, H.; Chantar, H. Intelligent Detection of False Information in Arabic Tweets Utilizing Hybrid Harris Hawks Based Feature Selection and Machine Learning Models. Symmetry 2021, 13, 556. [Google Scholar] [CrossRef]
  256. Turabieh, H.; Al Azwari, S.; Rokaya, M.; Alosaimi, W.; Alharbi, A.; Alhakami, W.; Alnfiai, M. Enhanced harris hawks optimization as a feature selection for the prediction of student performance. Computing 2021, 103, 1417–1438. [Google Scholar] [CrossRef]
  257. Al-Wajih, R.; Abdulkadir, S.J.; Aziz, N.; Al-Tashi, Q.; Talpur, N. Hybrid Binary Grey Wolf With Harris Hawks Optimizer for Feature Selection. IEEE Access 2021, 9, 31662–31677. [Google Scholar] [CrossRef]
  258. Khurma, R.A.; Awadallah, M.A.; Aljarah, I. Binary Harris Hawks Optimisation Filter Based Approach for Feature Selection. In Proceedings of the 2021 Palestinian International Conference on Information and Communication Technology (PICICT), Gaza, Palestine, 28–29 September 2021; pp. 59–64. [Google Scholar]
  259. Yasear, S.A.; Ku-Mahamud, K.R. Fine-Tuning the Ant Colony System Algorithm Through Harris’s Hawk Optimizer for Travelling Salesman Problem. Int. J. Intell. Eng. Syst. 2021, 14, 136–145. [Google Scholar] [CrossRef]
  260. Hatamlou, A. Black hole: A new heuristic optimization approach for data clustering. Inf. Sci. 2013, 222, 175–184. [Google Scholar] [CrossRef]
  261. Ismael, O.M.; Qasim, O.S.; Algamal, Z.Y. A new adaptive algorithm for v-support vector regression with feature selection using Harris hawks optimization algorithm. In Journal of Physics: Conference Series; IOP Publishing: Bristol, UK, 2021; Volume 1897, p. 012057. [Google Scholar]
  262. Safaldin, M.; Otair, M.; Abualigah, L. Improved binary gray wolf optimizer and SVM for intrusion detection system in wireless sensor networks. J. Ambient. Intell. Humaniz. Comput. 2021, 12, 1559–1576. [Google Scholar] [CrossRef]
  263. Khasawneh, A.M.; Kaiwartya, O.; Abualigah, L.M.; Lloret, J. Green computing in underwater wireless sensor networks pressure centric energy modeling. IEEE Syst. J. 2020, 14, 4735–4745. [Google Scholar] [CrossRef]
  264. Srinivas, M.; Amgoth, T. EE-hHHSS: Energy-efficient wireless sensor network with mobile sink strategy using hybrid Harris hawk-salp swarm optimization algorithm. Int. J. Commun. Syst. 2020, 33, e4569. [Google Scholar] [CrossRef]
  265. Bhat, S.J.; Venkata, S.K. An optimization based localization with area minimization for heterogeneous wireless sensor networks in anisotropic fields. Comput. Netw. 2020, 179, 107371. [Google Scholar] [CrossRef]
  266. Singh, P.; Prakash, S. Optimizing multiple ONUs placement in Fiber-Wireless (FiWi) access network using Grasshopper and Harris Hawks Optimization Algorithms. Opt. Fiber Technol. 2020, 60, 102357. [Google Scholar] [CrossRef]
  267. Xu, H.; Zhang, G.; Zhao, J.; Pham, Q.-V. Intelligent reflecting surface aided wireless networks-Harris Hawks optimization for beamforming design. arXiv 2020, arXiv:2010.01900. [Google Scholar]
  268. Sharma, R.; Prakash, S. HHO-LPWSN: Harris Hawks Optimization Algorithm for Sensor Nodes Localization Problem in Wireless Sensor Networks. EAI Endorsed Trans. Scalable Inf. Syst. 2021, 8, e5. [Google Scholar] [CrossRef]
  269. Jia, H.; Peng, X.; Kang, L.; Li, Y.; Jiang, Z.; Sun, K. Pulse coupled neural network based on Harris hawks optimization algorithm for image segmentation. Multimed. Tools Appl. 2020, 79, 28369–28392. [Google Scholar] [CrossRef]
270. Rammurthy, D.; Mahesh, P. Whale Harris hawks optimization based deep learning classifier for brain tumor detection using MRI images. J. King Saud Univ.-Comput. Inf. Sci. 2020. [Google Scholar] [CrossRef]
  271. Kaur, N.; Kaur, L.; Cheema, S.S. An enhanced version of Harris Hawks Optimization by dimension learning-based hunting for Breast Cancer Detection. Sci. Rep. 2021, 11, 21933. [Google Scholar] [CrossRef]
  272. Chacko, A.; Chacko, S. Deep learning-based robust medical image watermarking exploiting DCT and Harris hawks optimization. Int. J. Intell. Syst. 2021. [Google Scholar] [CrossRef]
  273. Bandyopadhyay, R.; Kundu, R.; Oliva, D.; Sarkar, R. Segmentation of brain MRI using an altruistic Harris Hawks’ Optimization algorithm. Knowl.-Based Syst. 2021, 232, 107468. [Google Scholar] [CrossRef]
  274. Iswisi, A.F.; Karan, O.; Rahebi, J. Diagnosis of Multiple Sclerosis Disease in Brain Magnetic Resonance Imaging Based on the Harris Hawks Optimization Algorithm. BioMed Res. Int. 2021, 2021. [Google Scholar] [CrossRef]
  275. Balamurugan, R.; Ratheesh, S.; Venila, Y.M. Classification of heart disease using adaptive Harris hawk optimization-based clustering algorithm and enhanced deep genetic algorithm. Soft Comput. 2021, 26, 2357–2373. [Google Scholar] [CrossRef]
  276. Hussien, A.G.; Hassanien, A.E.; Houssein, E.H. Swarming behaviour of salps algorithm for predicting chemical compound activities. In Proceedings of the 2017 Eighth International Conference on Intelligent Computing and Information Systems (ICICIS), Cairo, Egypt, 5–7 December 2017; pp. 315–320. [Google Scholar]
  277. Houssein, E.H.; Hosney, M.E.; Oliva, D.; Mohamed, W.M.; Hassaballah, M. A novel hybrid Harris hawks optimization and support vector machines for drug design and discovery. Comput. Chem. Eng. 2020, 133, 106656. [Google Scholar] [CrossRef]
  278. Houssein, E.H.; Neggaz, N.; Hosney, M.E.; Mohamed, W.M.; Hassaballah, M. Enhanced harris hawks optimization with genetic operators for selection chemical descriptors and compounds activities. Neural Comput. Appl. 2021, 33, 13601–13618. [Google Scholar] [CrossRef]
  279. Ekinci, S.; Hekimoğlu, B.; Eker, E. Optimum Design of PID Controller in AVR System Using Harris Hawks Optimization. In Proceedings of the 2019 3rd International Symposium on Multidisciplinary Studies and Innovative Technologies (ISMSIT), Ankara, Turkey, 11–13 October 2019; pp. 1–6. [Google Scholar]
  280. Ekinci, S.; Izci, D.; Hekimoğlu, B. PID Speed Control of DC Motor Using Harris Hawks Optimization Algorithm. In Proceedings of the 2020 International Conference on Electrical, Communication, and Computer Engineering (ICECCE), Istanbul, Turkey, 12–13 June 2020; pp. 1–6. [Google Scholar]
  281. Ekinci, S.; Hekimoğlu, B.; Demirören, A.; Kaya, S. Harris Hawks Optimization Approach for Tuning of FOPID Controller in DC-DC Buck Converter. In Proceedings of the 2019 International Artificial Intelligence and Data Processing Symposium (IDAP), Malatya, Turkey, 21–22 September 2019; pp. 1–9. [Google Scholar]
  282. Yousri, D.; Babu, T.S.; Fathy, A. Recent methodology based Harris Hawks optimizer for designing load frequency control incorporated in multi-interconnected renewable energy plants. Sustain. Energy, Grids Netw. 2020, 22, 100352. [Google Scholar] [CrossRef]
  283. Barakat, M.; Donkol, A.; Hamed, H.F.; Salama, G.M. Harris hawks-based optimization algorithm for automatic LFC of the interconnected power system using PD-PI cascade control. J. Electr. Eng. Technol. 2021, 16, 1845–1865. [Google Scholar] [CrossRef]
  284. Munagala, V.K.; Jatoth, R.K. Design of Fractional-Order PID/PID Controller for Speed Control of DC Motor Using Harris Hawks Optimization. In Intelligent Algorithms for Analysis and Control of Dynamical Systems; Springer: Berlin/Heidelberg, Germany, 2021; pp. 103–113. [Google Scholar]
285. Bui, D.T.; Moayedi, H.; Kalantar, B.; Osouli, A.; Gör, M.; Pradhan, B.; Nguyen, H.; Rashid, A.S.A. Harris hawks optimization: A novel swarm intelligence technique for spatial assessment of landslide susceptibility. Sensors 2019, 19, 3590. [Google Scholar] [CrossRef] [PubMed]
  286. Murlidhar, B.R.; Nguyen, H.; Rostami, J.; Bui, X.; Armaghani, D.J.; Ragam, P.; Mohamad, E.T. Prediction of flyrock distance induced by mine blasting using a novel Harris Hawks optimization-based multi-layer perceptron neural network. J. Rock Mech. Geotech. Eng. 2021, 13, 1413–1427. [Google Scholar] [CrossRef]
  287. Yu, C.; Koopialipoor, M.; Murlidhar, B.R.; Mohammed, A.S.; Armaghani, D.J.; Mohamad, E.T.; Wang, Z. Optimal ELM–Harris Hawks optimization and ELM–Grasshopper optimization models to forecast peak particle velocity resulting from mine blasting. Nat. Resour. Res. 2021, 30, 2647–2662. [Google Scholar] [CrossRef]
  288. Paryani, S.; Neshat, A.; Pradhan, B. Improvement of landslide spatial modeling using machine learning methods and two Harris hawks and bat algorithms. Egypt. J. Remote Sens. Space Sci. 2021, 24, 845–855. [Google Scholar] [CrossRef]
  289. Golafshani, E.M.; Arashpour, M.; Behnood, A. Predicting the compressive strength of green concretes using Harris hawks optimization-based data-driven methods. Constr. Build. Mater. 2022, 318, 125944. [Google Scholar] [CrossRef]
  290. Parsa, P.; Naderpour, H. Shear strength estimation of reinforced concrete walls using support vector regression improved by Teaching–learning-based optimization, Particle Swarm optimization, and Harris Hawks Optimization algorithms. J. Build. Eng. 2021, 44, 102593. [Google Scholar] [CrossRef]
  291. Zaim, S.; Chong, J.H.; Sankaranarayanan, V.; Harky, A. COVID-19 and multiorgan response. Curr. Probl. Cardiol. 2020, 45, 100618. [Google Scholar] [CrossRef]
  292. Xu, S.; Li, Y. Beware of the second wave of COVID-19. Lancet 2020, 395, 1321–1322. [Google Scholar] [CrossRef]
  293. Houssein, E.H.; Ahmad, M.; Hosney, M.E.; Mazzara, M. Classification Approach for COVID-19 Gene Based on Harris Hawks Optimization. In Artificial Intelligence for COVID-19; Springer: Berlin/Heidelberg, Germany, 2021; pp. 575–594. [Google Scholar]
  294. Hu, J.; Heidari, A.A.; Shou, Y.; Ye, H.; Wang, L.; Huang, X.; Chen, H.; Chen, Y.; Wu, P. Detection of COVID-19 severity using blood gas analysis parameters and Harris hawks optimized extreme learning machine. Comput. Biol. Med. 2021, 142, 105166. [Google Scholar] [CrossRef]
  295. Ye, H.; Wu, P.; Zhu, T.; Xiao, Z.; Zhang, X.; Zheng, L.; Zheng, R.; Sun, Y.; Zhou, W.; Fu, Q.; et al. Diagnosing coronavirus disease 2019 (COVID-19): Efficient Harris Hawks-inspired fuzzy K-nearest neighbor prediction methods. IEEE Access 2021, 9, 17787–17802. [Google Scholar] [CrossRef]
  296. Bandyopadhyay, R.; Basu, A.; Cuevas, E.; Sarkar, R. Harris Hawks optimisation with Simulated Annealing as a deep feature selection method for screening of COVID-19 CT-scans. Appl. Soft Comput. 2021, 111, 107698. [Google Scholar] [CrossRef]
  297. Abbasi, A.; Firouzi, B.; Sendur, P. On the application of Harris hawks optimization (HHO) algorithm to the design of microchannel heat sinks. Eng. Comput. 2019, 37, 1409–1428. [Google Scholar] [CrossRef]
  298. Golilarz, N.A.; Addeh, A.; Gao, H.; Ali, L.; Roshandeh, A.M.; Munir, H.M.; Khan, R.U. A new automatic method for control chart patterns recognition based on ConvNet and Harris Hawks meta heuristic optimization algorithm. IEEE Access 2019, 7, 149398–149405. [Google Scholar] [CrossRef]
  299. Khalifeh, S.; Akbarifard, S.; Khalifeh, V.; Zallaghi, E. Optimization of Water Distribution of Network Systems Using the Harris Hawks Optimization Algorithm (Case study: Homashahr City). MethodsX 2020, 7, 100948. [Google Scholar] [CrossRef]
  300. Abd Elaziz, M.; Abualigah, L.; Ibrahim, R.A.; Attiya, I. IoT Workflow Scheduling Using Intelligent Arithmetic Optimization Algorithm in Fog Computing. Comput. Intell. Neurosci. 2021, 2021. [Google Scholar] [CrossRef]
  301. Seyfollahi, A.; Ghaffari, A. Reliable data dissemination for the Internet of Things using Harris hawks optimization. Peer-to-Peer Netw. Appl. 2020, 13, 1886–1902. [Google Scholar] [CrossRef]
  302. Saravanan, G.; Ibrahim, A.M.; Kumar, D.S.; Vanitha, U.; Chandrika, V. Iot Based Speed Control Of BLDC Motor With Harris Hawks Optimization Controller. Int. J. Grid Distrib. Comput. 2020, 13, 1902–1915. [Google Scholar]
  303. Tayab, U.B.; Zia, A.; Yang, F.; Lu, J.; Kashif, M. Short-term load forecasting for microgrid energy management system using hybrid HHO-FNN model with best-basis stationary wavelet packet transform. Energy 2020, 203, 117857. [Google Scholar] [CrossRef]
  304. Ding, W.; Abdel-Basset, M.; Eldrandaly, K.A.; Abdel-Fatah, L.; de Albuquerque, V.H.C. Smart Supervision of Cardiomyopathy Based on Fuzzy Harris Hawks Optimizer and Wearable Sensing Data Optimization: A New Model. IEEE Trans. Cybern. 2020, 51, 4944–4958. [Google Scholar] [CrossRef]
  305. Li, C.; Li, J.; Chen, H. A Meta-Heuristic-Based Approach for Qos-Aware Service Composition. IEEE Access 2020, 8, 69579–69592. [Google Scholar] [CrossRef]
  306. Elkady, Z.; Abdel-Rahim, N.; Mansour, A.A.; Bendary, F.M. Enhanced DVR Control System based on the Harris Hawks Optimization Algorithm. IEEE Access 2020, 8, 177721–177733. [Google Scholar] [CrossRef]
307. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
Figure 1. The logical diagram of HHO.
Figure 2. Convergence curve for HHO against other competitors from F1–F12 CEC2005.
Figure 3. Convergence curve for HHO against other competitors from F13–F23 CEC2005.
Figure 4. Boxplot of some functions from F1–F12 for all algorithms.
Figure 5. Boxplot of some functions from F13–F23 for all algorithms.
Figure 6. Convergence curve for HHO against other competitors from F1–F10 CEC2017.
Figure 7. Convergence curve for HHO against other competitors from F11–F20 CEC2017.
Figure 8. Convergence curve for HHO against other competitors from F21–F30 CEC2017.
Figure 9. Boxplot of some functions from F1–F10 for all algorithms.
Figure 10. Boxplot of some functions from F11–F20 for all algorithms.
Figure 11. Boxplot of some functions from F21–F30 for all algorithms.
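As a reading aid for Figures 2–11, the following is a minimal matplotlib sketch (placeholder data, not the study's results or code) of how convergence curves and boxplots of this kind are typically produced, assuming the best-so-far fitness is recorded per iteration and the final fitness per run:

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder best-so-far histories for two algorithms over 500 iterations.
iters = np.arange(500)
history = {"HHO": 1e3 * np.exp(-0.05 * iters) + 1e-9,
           "GA": 1e3 * np.exp(-0.005 * iters) + 1.0}

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3.5))
for name, curve in history.items():
    ax1.semilogy(iters, curve, label=name)   # log scale suits wide fitness ranges
ax1.set(xlabel="Iteration", ylabel="Best fitness so far", title="Convergence curve")
ax1.legend()

# Placeholder final-fitness samples over 30 runs for the boxplot panel.
rng = np.random.default_rng(0)
finals = [curve[-1] * (1 + 0.2 * rng.random(30)) for curve in history.values()]
ax2.boxplot(finals, labels=list(history.keys()))
ax2.set_yscale("log")
ax2.set_title("Final fitness over 30 runs")
plt.tight_layout()
plt.show()
```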
Figure 12. Distribution of HHO-related papers in many applications, as reported by Scopus.
Table 1. A list of some optimization algorithms based on their components.

| Class | Algorithmic Behavior | Algorithm | Ref. | Year |
|---|---|---|---|---|
| Evolutionary | Breeding-based evolution | Genetic Algorithm (GA) | [14] | 1992 |
| Evolutionary | Breeding-based evolution | Genetic Programming (GP) | [15] | 1992 |
| Evolutionary | Influenced by representative solutions | Differential Evolution (DE) | [16] | 1997 |
| Evolutionary | Breeding-based evolution | Evolution Strategies | [17] | 2002 |
| Evolutionary | Mathematical arithmetic operators | Arithmetic Optimization Algorithm | [18] | 2021 |
| Swarm intelligence | Influenced by representative solutions | Particle Swarm Optimization (PSO) | [19] | 1995 |
| Swarm intelligence | Creation and stigmergy | Ant Colony Optimization (ACO) | [20] | 1999 |
| Swarm intelligence | Creation–combination | Harmony Search Algorithm (HS) | [21] | 2001 |
| Swarm intelligence | Influenced by representative solutions | Artificial Bee Colony (ABC) | [22] | 2007 |
| Swarm intelligence | Influenced by the entire population | Central Force Optimization (CFO) | [23] | 2007 |
| Swarm intelligence | Creation–combination | Biogeography-Based Optimization (BBO) | [24] | 2008 |
| Swarm intelligence | Influenced by representative solutions | Cuckoo Search (CS) | [25] | 2009 |
| Swarm intelligence | Influenced by neighborhoods | Bacterial Foraging Optimization (BFO) | [26] | 2009 |
| Swarm intelligence | Influenced by the entire population | Gravitational Search Algorithm (GSA) | [27] | 2009 |
| Swarm intelligence | Influenced by the entire population | Firefly Optimizer (FFO) | [28] | 2010 |
| Swarm intelligence | Influenced by representative solutions | Teaching–Learning-Based Optimizer (TLBO) | [29] | 2011 |
| Swarm intelligence | Influenced by representative solutions | Fruit Fly Optimization (FFO) | [30] | 2012 |
| Swarm intelligence | Influenced by representative solutions | Krill Herd (KH) | [31] | 2012 |
| Swarm intelligence | Influenced by representative solutions | Grey Wolf Optimizer (GWO) | [32] | 2014 |
| Swarm intelligence | Influenced by representative solutions | Harris Hawks Optimizer (HHO) | [33] | 2019 |
| Swarm intelligence | Influenced by representative solutions | Henry Gas Solubility Optimization (HGSO) | [34] | 2019 |
| Swarm intelligence | Influenced by representative solutions | Slime Mould Algorithm (SMA) | [35] | 2020 |
| Swarm intelligence | Influenced by mating behavior of snakes | Snake Optimizer | [36] | 2022 |
Table 2. Experiment parameter settings.

| No. | Parameter Name | Value |
|---|---|---|
| 1 | Population size | 30 |
| 2 | Dimension (Dim) | 30 |
| 3 | Max number of iterations | 500 |
Table 3. Metaheuristic algorithm parameter settings.

| Alg. | Parameter | Value |
|---|---|---|
| GOA | GMax | 1 |
| GOA | GMin | 0.004 |
| GOA | f, l | 2 |
| TEO | u | 1 |
| TEO | v | 0.001 |
| SCA | a | 2 |
| EHO | Elephants number | 50 |
| EHO | Clans number | 5 |
| EHO | Kept elephants number | 2 |
| EHO | Scale factor α | 0.5 |
| EHO | Scale factor β | 0.1 |
| SSA | c1, c2, c3 | rand |
| WOA | a | 2 |
| HHO | β | 1.5 |
| AEO | r1, r2, r | rand |
| AEO | h | 2 × rand − 1 |
| L-SHADE | Pbest | 0.1 |
| L-SHADE | Arc rate | 2 |
| LSHADE-EpSin | Pbest | 0.1 |
| LSHADE-EpSin | Arc rate | 2 |
| CMAES | α | 2 |
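To make these settings concrete, the following minimal Python sketch (illustrative only, not the code used in this study) shows how the budget from Table 2 and HHO's β from Table 3 would appear in a driver loop, together with the escaping-energy schedule E = 2E0(1 − t/T) from the original HHO paper that switches the search between exploration and exploitation; all identifiers are placeholders.

```python
import numpy as np

# Settings from Tables 2 and 3: 30 agents, 30 dimensions, 500 iterations,
# and HHO's beta = 1.5 (the Levy-flight exponent, unused in this fragment).
POP_SIZE, DIM, MAX_ITER, BETA = 30, 30, 500, 1.5
rng = np.random.default_rng(42)

def escaping_energy(t: int, max_iter: int = MAX_ITER) -> float:
    """E = 2*E0*(1 - t/T); |E| >= 1 triggers exploration, |E| < 1 exploitation."""
    e0 = rng.uniform(-1.0, 1.0)  # initial prey energy, redrawn each iteration
    return 2.0 * e0 * (1.0 - t / max_iter)

phase_counts = {"exploration": 0, "exploitation": 0}
for t in range(MAX_ITER):
    E = escaping_energy(t)
    phase_counts["exploration" if abs(E) >= 1 else "exploitation"] += 1
print(phase_counts)  # exploitation dominates late iterations as |E| shrinks
```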
Table 4. The comparison results of all algorithms over 23 functions.

| F | Stat | GA | CMAES | L-SHADE | LSHADE-EpSin | SCA | GOA | WOA | TEO | AEO | HHO |
|---|---|---|---|---|---|---|---|---|---|---|---|
| F1 | Avg | 2.79 × 10 3 | 9.74 × 10 6 | 3.52 × 10 4 | 2.57 × 10 4 | 8.87 | 1.10 × 10 4 | 4.38 × 10 74 | 1.58 × 10 87 | 6.64 × 10 4 | 8.61 × 10 95 |
| F1 | STD | 2.20 × 10 3 | 4.55 × 10 6 | 6.18 × 10 3 | 2.62 × 10 3 | 1.24 × 10 1 | 4.15 × 10 3 | 1.56 × 10 73 | 7.38 × 10 88 | 6.18 × 10 3 | 4.50 × 10 94 |
| F2 | Avg | 1.53 × 10 2 | 4.98 × 10 3 | 6.03 × 10 3 | 4.50 × 10 3 | 1.24 × 10 2 | 8.48 × 10 2 | 1.06 × 10 74 | 6.57 × 10 51 | 1.44 × 10 10 | 2.24 × 10 51 |
| F2 | STD | 8.70 | 2.28 × 10 3 | 1.68 × 10 4 | 1.26 × 10 4 | 1.56 × 10 2 | 3.53 × 10 3 | 5.82 × 10 74 | 1.27 × 10 50 | 3.03 × 10 10 | 8.46 × 10 51 |
| F3 | Avg | 5.66 × 10 1 | 4.60 × 10 1 | 6.45 × 10 4 | 3.70 × 10 4 | 8.80 × 10 3 | 3.91 × 10 4 | 4.79 × 10 4 | 1.08 × 10 63 | 9.11 × 10 4 | 3.26 × 10 67 |
| F3 | STD | 3.55 × 10 1 | 2.38 × 10 1 | 9.94 × 10 3 | 5.70 × 10 3 | 5.10 × 10 3 | 2.81 × 10 4 | 1.06 × 10 4 | 5.89 × 10 63 | 1.44 × 10 4 | 1.78 × 10 66 |
| F4 | Avg | 1.02 | 2.34 × 10 2 | 2.05 × 10 1 | 5.83 × 10 1 | 4.01 × 10 1 | 4.40 × 10 1 | 5.57 × 10 1 | 8.53 × 10 52 | 8.58 × 10 1 | 1.22 × 10 48 |
| F4 | STD | 1.22 × 10 1 | 5.16 × 10 3 | 1.37 | 3.89 | 1.36 × 10 1 | 8.62 | 2.94 × 10 1 | 3.09 × 10 52 | 3.59 | 4.49 × 10 48 |
| F5 | Avg | 9.04 × 10 1 | 8.08 × 10 1 | 5.03 × 10 6 | 4.02 × 10 7 | 8.70 × 10 4 | 1.01 × 10 7 | 2.79 × 10 1 | 2.90 × 10 1 | 2.24 × 10 8 | 9.37 × 10 3 |
| F5 | STD | 1.16 × 10 2 | 1.88 × 10 2 | 1.19 × 10 6 | 9.55 × 10 6 | 2.50 × 10 5 | 6.05 × 10 6 | 5.39 × 10 1 | 3.36 × 10 2 | 3.71 × 10 7 | 1.36 × 10 2 |
| F6 | Avg | 2.91 × 10 3 | 1.13 × 10 5 | 3.15 × 10 4 | 2.51 × 10 4 | 3.63 × 10 1 | 1.15 × 10 4 | 4.53 × 10 1 | 6.71 | 6.49 × 10 4 | 2.28 × 10 4 |
| F6 | STD | 2.03 × 10 3 | 4.74 × 10 6 | 4.19 × 10 3 | 3.34 × 10 3 | 7.15 × 10 1 | 4.43 × 10 3 | 2.31 × 10 1 | 7.18 × 10 1 | 6.69 × 10 3 | 2.94 × 10 4 |
| F7 | Avg | 2.65 | 2.88 × 10 2 | 1.25 × 10 1 | 1.87 × 10 1 | 1.21 × 10 1 | 6.26 × 10 1 | 3.15 × 10 3 | 8.50 × 10 5 | 1.11 × 10 2 | 1.05 × 10 4 |
| F7 | STD | 7.81 × 10 1 | 9.43 × 10 3 | 2.61 | 3.91 | 1.20 × 10 1 | 6.74 × 10 1 | 3.83 × 10 3 | 6.77 × 10 5 | 1.85 × 10 1 | 9.05 × 10 5 |
| F8 | Avg | 2.29 × 10 3 | 1.19 × 10 5 | 9.60 × 10 2 | 4.28 × 10 3 | 3.79 × 10 3 | 4.52 × 10 3 | 9.99 × 10 3 | 3.63 × 10 3 | 7.46 × 10 3 | 1.26 × 10 4 |
| F8 | STD | 5.31 × 10 2 | 1.10 × 10 5 | 5.71 × 10 1 | 2.55 × 10 2 | 4.11 × 10 2 | 5.50 × 10 2 | 1.76 × 10 3 | 9.25 × 10 2 | 5.94 × 10 2 | 1.40 |
| F9 | Avg | 5.68 × 10 1 | 9.64 × 10 1 | 3.34 × 10 1 | 2.98 × 10 2 | 3.39 × 10 1 | 2.74 × 10 2 | 0.00 | 0.00 | 4.29 × 10 2 | 0.00 |
| F9 | STD | 2.45 × 10 1 | 7.02 × 10 1 | 1.94 | 1.74 × 10 1 | 3.07 × 10 1 | 4.03 × 10 1 | 0.00 | 0.00 | 2.48 × 10 1 | 0.00 |
| F10 | Avg | 8.22 × 10 1 | 9.78 × 10 4 | 1.53 × 10 1 | 1.85 × 10 1 | 1.43 × 10 1 | 1.59 × 10 1 | 4.09 × 10 15 | 8.88 × 10 16 | 2.05 × 10 1 | 8.88 × 10 16 |
| F10 | STD | 3.65 × 10 1 | 2.01 × 10 4 | 3.11 × 10 1 | 3.77 × 10 1 | 8.79 | 1.45 | 2.70 × 10 16 | 0.00 | 1.64 × 10 1 | 0.00 |
| F11 | Avg | 2.47 × 10 1 | 1.31 × 10 4 | 4.06 × 10 2 | 2.24 × 10 2 | 7.81 × 10 1 | 1.01 × 10 2 | 6.97 × 10 3 | 0.00 | 5.73 × 10 2 | 0.00 |
| F11 | STD | 2.07 × 10 1 | 5.06 × 10 5 | 6.57 × 10 1 | 3.63 × 10 1 | 3.78 × 10 1 | 2.85 × 10 1 | 2.69 × 10 2 | 0.00 | 6.20 × 10 1 | 0.00 |
| F12 | Avg | 4.89 × 10 7 | 1.23 × 10 6 | 8.70 × 10 7 | 4.46 × 10 7 | 3.38 × 10 2 | 6.18 × 10 6 | 2.58 × 10 2 | 1.33 | 5.33 × 10 8 | 1.10 × 10 5 |
| F12 | STD | 4.40 × 10 7 | 5.97 × 10 7 | 3.63 × 10 7 | 1.86 × 10 7 | 9.13 × 10 2 | 7.74 × 10 6 | 1.49 × 10 2 | 2.64 × 10 1 | 9.99 × 10 7 | 1.64 × 10 5 |
| F13 | Avg | 9.06 × 10 7 | 1.89 × 10 5 | 1.21 × 10 7 | 1.18 × 10 8 | 1.24 × 10 5 | 2.87 × 10 7 | 4.75 × 10 1 | 3.00 | 9.85 × 10 8 | 1.00 × 10 4 |
| F13 | STD | 7.53 × 10 7 | 6.95 × 10 6 | 2.91 × 10 6 | 2.84 × 10 7 | 4.31 × 10 5 | 3.03 × 10 7 | 3.02 × 10 1 | 2.79 × 10 3 | 1.35 × 10 8 | 2.29 × 10 4 |
| F14 | Avg | 4.02 × 10 1 | 4.33 | 9.07 × 10 1 | 1.00 | 1.66 | 9.9 × 10 1 | 2.77 | 9.78 | 1.15 | 1.20 |
| F14 | STD | 3.75 × 10 3 | 2.45 | 8.47 × 10 3 | 9.36 × 10 3 | 9.48 × 10 1 | 4.30 × 10 16 | 2.65 | 3.73 | 2.47 × 10 1 | 4.04 × 10 1 |
| F15 | Avg | 3.19 × 10 3 | 2.49 × 10 3 | 3.38 × 10 3 | 2.17 × 10 3 | 9.25 × 10 4 | 3.80 × 10 3 | 7.30 × 10 4 | 6.71 × 10 2 | 2.86 × 10 3 | 3.75 × 10 4 |
| F15 | STD | 1.38 × 10 3 | 1.65 × 10 3 | 1.47 × 10 3 | 9.40 × 10 4 | 3.53 × 10 4 | 5.57 × 10 3 | 5.05 × 10 4 | 5.22 × 10 2 | 1.16 × 10 3 | 1.67 × 10 4 |
| F16 | Avg | −1.03 | −1.03 | −1.03 | −1.03 | −1.03 | −1.03 | −1.03 | 9.70 × 10 1 | −1.02 | −1.03 |
| F16 | STD | 1.43 × 10 3 | 6.78 × 10 16 | 1.43 × 10 1 | 1.43 × 10 3 | 4.76 × 10 5 | 3.21 × 10 13 | 2.53 × 10 9 | 1.20 × 10 1 | 1.08 × 10 2 | 5.55 × 10 9 |
| F17 | Avg | 3.98 × 10 1 | 3.98 × 10 1 | 3.98 × 10 1 | 3.98 × 10 1 | 4.01 × 10 1 | 3.98 × 10 1 | 3.98 × 10 1 | 3.98 × 10 1 | 4.01 × 10 1 | 3.98 × 10 1 |
| F17 | STD | 8.87 × 10 6 | 8.87 × 10 6 | 8.87 × 10 6 | 8.87 × 10 6 | 4.24 × 10 6 | 8.87 × 10 6 | 8.87 × 10 6 | 8.87 × 10 6 | 2.86 × 10 3 | 3.88 × 10 5 |
| F18 | Avg | 6.21 × 10 1 | 3.00 | 3.04 | 3.04 | 3.00 | 5.70 | 3.00 | 2.75 × 10 1 | 3.18 | 3.00 |
| F18 | STD | 6.18 × 10 1 | 1.13 × 10 15 | 4.38 × 10 2 | 4.38 × 10 2 | 1.00 × 10 4 | 1.48 × 10 1 | 1.46 × 10 4 | 2.91 × 10 1 | 1.68 × 10 1 | 5.56 × 10 7 |
| F19 | Avg | −3.86 | −3.86 | −3.86 | −3.86 | −3.85 | −3.86 | −3.86 | −3.44 | −3.85 | −3.86 |
| F19 | STD | 2.83 × 10 3 | 2.71 × 10 15 | 2.83 × 10 3 | 2.83 × 10 3 | 3.63 × 10 3 | 2.83 × 10 3 | 1.00 × 10 2 | 2.87 × 10 1 | 5.09 × 10 3 | 3.78 × 10 3 |
| F20 | Avg | −3.07 | −3.27 | −3.07 | −3.07 | −2.86 | −2.87 | −3.21 | −2.19 | −2.93 | −3.10 |
| F20 | STD | 5.62 × 10 2 | 5.99 × 10 2 | 5.62 × 10 2 | 5.62 × 10 2 | 3.83 × 10 1 | 3.00 × 10 1 | 1.80 × 10 1 | 4.42 × 10 1 | 1.12 × 10 1 | 1.31 × 10 1 |
| F21 | Avg | −4.30 | −6.39 | −4.30 | −4.30 | −2.46 | −2.81 | −8.36 | −4.08 | −2.78 | −5.38 |
| F21 | STD | 1.41 | 3.65 | 1.41 | 1.41 | 1.96 | 1.73 | 2.59 | 1.12 | 9.83 × 10 1 | 1.27 |
| F22 | Avg | −4.71 | 1.01 × 10 1 | −4.71 | −4.71 | −3.79 | −3.88 | −7.03 | −4.29 | −2.84 | −5.4137 |
| F22 | STD | 1.69 | 1.39 | 1.69 | 1.69 | 1.72 | 1.94 | 3.11 | 1.29 | 1.01 | 1.261067 |
| F23 | Avg | −4.70 | −9.77 | −4.70 | −4.70 | −3.44 | −3.91 | −6.51 | −4.33 | −3.04 | −5.21 |
| F23 | STD | 1.30 | 2.34 | 1.30 | 1.30 | 1.70 | 2.32 | 3.12 | 1.31 | 1.07 | 1.10 |
Table 5. Wilcoxon rank sum test results for HHO against other algorithms CEC2020 Dim = 20.

| F | GA | L-SHADE | LSHADE-EpSin | SCA | GOA | WOA | TEO | HGSO | AEO |
|---|---|---|---|---|---|---|---|---|---|
| F1 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 |
| F2 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 0.673495053 | 9.53321 × 10^−11 | 3.01986 × 10^−11 |
| F3 | 1.21178 × 10^−12 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 5.57265 × 10^−11 | 3.01986 × 10^−11 |
| F4 | 1.72025 × 10^−12 | 3.01986 × 10^−12 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 0.853381737 | 3.01986 × 10^−11 |
| F5 | 1.21178 × 10^−12 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 | 3.01986 × 10^−11 |
| F6 | 3.019 × 10^−11 | 6.356 × 10^−5 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 |
| F7 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 2.921 × 10^−11 | 0.559230536 | 3.019 × 10^−11 |
| F8 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 1.776 × 10^−10 | 3.019 × 10^−11 | 3.019 × 10^−11 |
| F9 | 1.211 × 10^−12 | 1.211 × 10^−12 | 1.211 × 10^−12 | 1.211 × 10^−12 | 1.211 × 10^−12 | 1.211 × 10^−12 | NaN | NaN | 1.211 × 10^−12 |
| F10 | 1.193 × 10^−12 | 1.211 × 10^−12 | 1.211 × 10^−12 | 1.211 × 10^−12 | 1.211 × 10^−12 | 1.211 × 10^−12 | 1.024 × 10^−7 | NaN | 1.211 × 10^−12 |
| F11 | 1.211 × 10^−12 | 1.211 × 10^−12 | 1.211 × 10^−12 | 1.211 × 10^−12 | 1.211 × 10^−12 | 1.211 × 10^−12 | 0.160802121 |  | 1.211 × 10^−12 |
| F12 | 3.017 × 10^−11 | 0.005322078 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 |
| F13 | 8.480 × 10^−9 | 0.818745653 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.017 × 10^−11 | 3.017 × 10^−11 | 3.017 × 10^−11 | 3.017 × 10^−11 | 3.019 × 10^−11 |
| F14 | 3.019 × 10^−11 | 9.755 × 10^−10 | 3.019 × 10^−11 | 6.76501 × 10^−5 | 1.107 × 10^−6 | 2.110 × 10^−11 | 0.000556111 | 5.494 × 10^−11 | 6.765 × 10^−5 |
| F15 | 3.019 × 10^−11 | 6.695 × 10^−11 | 3.019 × 10^−11 | 4.504 × 10^−11 | 5.072 × 10^−10 | 4.077 × 10^−11 | 3.157 × 10^−5 | 3.019 × 10^−11 | 3.338 × 10^−11 |
| F16 | 3.019 × 10^−11 | 1.211 × 10^−12 | 3.019 × 10^−11 | 3.019 × 10^−11 | 3.019 × 10^−11 | 1.697 × 10^−8 | 0.773119942 | 2.389 × 10^−8 | 3.019 × 10^−11 |
| F17 | 0.311053163 | 0.311053163 | 0.311053163 | 0.311053163 | 5.427 × 10^−11 | 0.311053163 | 0.311053163 | 0.311053163 | 5.427 × 10^−11 |
| F18 | 3.019 × 10^−11 | 4.107 × 10^−12 | 3.019 × 10^−11 | 3.019 × 10^−11 | 8.993 × 10^−11 | 2.002 × 10^−6 | 1.956 × 10^−10 | 3.019 × 10^−11 | 3.019 × 10^−11 |
| F19 | 0.000268057 | 1.211 × 10^−12 | 0.000268057 | 0.000268057 | 2.601 × 10^−8 | 0.000268057 | 0.157975689 | 8.152 × 10^−11 | 9.832 × 10^−8 |
| F20 | 0.045146208 | 8.885 × 10^−9 | 0.045146208 | 0.045146208 | 0.000158461 | 0.000149316 | 0.00033679 | 9.918 × 10^−11 | 6.282 × 10^−11 |
| F21 | 0.000178356 | 0.725538189 | 0.000178356 | 0.000178356 | 5.967 × 10^−9 | 6.045 × 10^−7 | 3.081 × 10^−8 | 5.967 × 10^−9 | 4.615 × 10^−10 |
| F22 | 0.006097142 | 1.943 × 10^−10 | 0.006097142 | 0.006097142 | 5.967 × 10^−9 | 0.000178356 | 0.003182959 | 4.744 × 10^−6 | 4.615 × 10^−10 |
| F23 | 0.003848068 | 2.251 × 10^−8 | 0.003848068 | 0.003848068 | 3.964 × 10^−8 | 0.002052334 | 0.000356384 | 3.157 × 10^−5 | 3.351 × 10^−8 |
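For readers who want to reproduce this kind of comparison, the following is a minimal sketch (not the authors' pipeline) of computing a Wilcoxon rank-sum p-value with SciPy from the final fitness values of two algorithms; the run count of 30 and the sample values are placeholders, not taken from the paper's setup.

```python
import numpy as np
from scipy.stats import ranksums

# Placeholder final-fitness samples over 30 independent runs each.
rng = np.random.default_rng(0)
hho_runs = rng.normal(1e-5, 1e-6, size=30)
competitor_runs = rng.normal(1e-2, 1e-3, size=30)

stat, p_value = ranksums(hho_runs, competitor_runs)
print(f"p = {p_value:.3e}")  # p < 0.05 suggests a significant difference
```

Degenerate samples, for example two algorithms that both return the exact optimum on every run, can make the rank statistic undefined; this is one plausible reading of the NaN cells in Table 5, though the paper does not state it explicitly.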
Table 6. The comparison results of all algorithms over 30 functions.

| F | Stat | GA | L-SHADE | LSHADE-EpSin | SCA | GOA | WOA | TEO | HGSO | AEO | HHO |
|---|---|---|---|---|---|---|---|---|---|---|---|
| F1 | Avg | 4.83 × 10 10 | 1.11 × 10 10 | 5.36 × 10 10 | 2.10 × 10 10 | 6.68 × 10 10 | 5.29 × 10 9 | 6.32 × 10 10 | 2.31 × 10 10 | 1.13 × 10 11 | 5.13 × 10 8 |
| F1 | STD | 9.34 × 10 8 | 4.83 × 10 9 | 8.13 × 10 9 | 3.75 × 10 10 | 1.16 × 10 10 | 1.56 × 10 9 | 8.36 × 10 9 | 4.77 × 10 8 | 1.37 × 10 10 | 3.09 × 10 8 |
| F2 | Avg | NA | NA | NA | NA | NA | NA | NA | NA | NA | NA |
| F2 | STD | NA | NA | NA | NA | NA | NA | NA | NA | NA | NA |
| F3 | Avg | 6.28 × 10 9 | 1.77 × 10 5 | 1.28 × 10 5 | 7.87 × 10 4 | 3.29 × 10 5 | 2.85 × 10 5 | 9.12 × 10 4 | 7.15 × 10 4 | 2.50 × 10 5 | 5.57 × 10 4 |
| F3 | STD | 2.66 × 10 10 | 5.49 × 10 4 | 1.88 × 10 4 | 1.29 × 10 4 | 1.51 × 10 5 | 7.90 × 10 4 | 7.33 × 10 3 | 6.75 × 10 3 | 4.56 × 10 4 | 6.11 × 10 3 |
| F4 | Avg | 2.54 × 10 4 | 1.94 × 10 3 | 1.33 × 10 4 | 3.29 × 10 3 | 1.61 × 10 4 | 1.32 × 10 3 | 2.37 × 10 4 | 4.54 × 10 3 | 3.53 × 10 4 | 7.31 × 10 2 |
| F4 | STD | 8.39 × 10 3 | 6.54 × 10 2 | 2.63 × 10 3 | 8.81 × 10 2 | 6.18 × 10 3 | 4.43 × 10 2 | 4.20 × 10 3 | 1.34 × 10 3 | 5.98 × 10 3 | 1.43 × 10 2 |
| F5 | Avg | 8.34 × 10 2 | 8.19 × 10 2 | 9.42 × 10 2 | 8.30 × 10 2 | 1.00 × 10 3 | 8.61 × 10 2 | 9.38 × 10 2 | 8.60 × 10 2 | 1.14 × 10 3 | 7.57 × 10 2 |
| F5 | STD | 2.91 × 10 1 | 2.16 × 10 1 | 1.89 × 10 1 | 2.40 × 10 1 | 5.82 × 10 1 | 5.06 × 10 1 | 3.59 × 10 1 | 2.00 × 10 1 | 5.19 × 10 1 | 3.45 × 10 1 |
| F6 | Avg | 6.60 × 10 2 | 6.53 × 10 2 | 6.91 × 10 2 | 6.65 × 10 2 | 7.07 × 10 2 | 6.84 × 10 2 | 6.97 × 10 2 | 6.79 × 10 2 | 7.27 × 10 2 | 6.69 × 10 2 |
| F6 | STD | 9.87 | 8.04 | 7.11 | 6.21 | 1.64 × 10 1 | 1.30 × 10 1 | 6.63 | 6.38 | 7.30 | 6.94 |
| F7 | Avg | 1.36 × 10 3 | 1.24 × 10 3 | 2.07 × 10 3 | 1.23 × 10 3 | 1.78 × 10 3 | 1.32 × 10 3 | 1.45 × 10 3 | 1.26 × 10 3 | 3.40 × 10 3 | 1.32 × 10 3 |
| F7 | STD | 6.94 × 10 1 | 6.71 × 10 1 | 9.70 × 10 1 | 5.82 × 10 1 | 1.67 × 10 2 | 7.93 × 10 1 | 5.77 × 10 1 | 5.13 × 10 1 | 2.64 × 10 2 | 6.23 × 10 1 |
| F8 | Avg | 1.08 × 10 3 | 1.10 × 10 3 | 1.20 × 10 3 | 1.10 × 10 3 | 1.24 × 10 3 | 1.07 × 10 3 | 1.16 × 10 3 | 1.10 × 10 3 | 1.36 × 10 3 | 9.85 × 10 2 |
| F8 | STD | 3.58 × 10 1 | 3.45 × 10 1 | 2.60 × 10 1 | 1.98 × 10 1 | 4.93 × 10 1 | 6.39 × 10 1 | 2.48 × 10 1 | 1.99 × 10 1 | 4.66 × 10 1 | 1.96 × 10 1 |
| F9 | Avg | 1.63 × 10 4 | 8.37 × 10 3 | 1.68 × 10 4 | 8.52 × 10 3 | 2.13 × 10 4 | 1.06 × 10 4 | 1.08 × 10 4 | 9.02 × 10 3 | 3.32 × 10 4 | 8.94 × 10 3 |
| F9 | STD | 3.73 × 10 3 | 2.57 × 10 3 | 1.86 × 10 3 | 1.91 × 10 3 | 6.80 × 10 3 | 3.48 × 10 3 | 1.27 × 10 3 | 9.62 × 10 2 | 4.37 × 10 3 | 7.75 × 10 2 |
| F10 | Avg | 7.91 × 10 3 | 9.39 × 10 3 | 8.83 × 10 3 | 8.88 × 10 3 | 8.99 × 10 3 | 7.51 × 10 3 | 9.22 × 10 3 | 7.71 × 10 3 | 9.02 × 10 3 | 5.93 × 10 3 |
| F10 | STD | 5.12 × 10 2 | 6.82 × 10 2 | 3.02 × 10 2 | 4.0 × 10 2 | 6.72 × 10 2 | 7.13 × 10 2 | 4.89 × 10 2 | 4.55 × 10 2 | 2.99 × 10 2 | 7.74 × 10 2 |
| F11 | Avg | 2.19 × 10 5 | 7.38 × 10 3 | 9.12 × 10 3 | 3.58 × 10 3 | 2.87 × 10 4 | 1.05 × 10 4 | 4.02 × 10 4 | 6.01 × 10 3 | 2.29 × 10 4 | 1.65 × 10 3 |
| F11 | STD | 3.42 × 10 5 | 3.92 × 10 3 | 1.85 × 10 3 | 1.11 × 10 3 | 1.16 × 10 4 | 4.64 × 10 3 | 7.88 × 10 4 | 7.81 × 10 2 | 5.70 × 10 3 | 1.58 × 10 2 |
| F12 | Avg | 1.37 × 10 10 | 8.41 × 10 8 | 8.03 × 10 9 | 2.90 × 10 9 | 9.36 × 10 9 | 5.23 × 10 8 | 1.94 × 10 9 | 3.77 × 10 9 | 1.64 × 10 10 | 9.86 × 10 7 |
| F12 | STD | 3.05 × 10 9 | 3.82 × 10 8 | 1.53 × 10 9 | 7.99 × 10 8 | 3.60 × 10 8 | 3.82 × 10 8 | 3.20 × 10 8 | 1.13 × 10 8 | 3.09 × 10 8 | 6.97 × 10 7 |
| F13 | Avg | 1.99 × 10 10 | 2.56 × 10 8 | 3.64 × 10 9 | 1.15 × 10 8 | 5.93 × 10 8 | 1.47 × 10 7 | 2.17 × 10 10 | 1.70 × 10 9 | 9.16 × 10 9 | 1.07 × 10 6 |
| F13 | STD | 6.87 × 10 9 | 2.73 × 10 8 | 9.23 × 10 8 | 5.04 × 10 8 | 3.28 × 10 9 | 1.93 × 10 7 | 7.87 × 10 9 | 8.10 × 10 8 | 2.62 × 10 9 | 6.62 × 10 5 |
| F14 | Avg | 8.26 × 10 7 | 1.00 × 10 6 | 1.66 × 10 6 | 7.71 × 10 5 | 6.71 × 10 6 | 2.26 × 10 6 | 3.47 × 10 7 | 1.46 v | 3.61 × 10 6 | 1.70 × 10 6 |
| F14 | STD | 5.40 × 10 7 | 1.13 × 10 6 | 7.57 × 10 5 | 8.49 × 10 5 | 5.18 v | 2.55 × 10 6 | 3.77 × 10 7 | 6.04 × 10 5 | 1.61 × 10 6 | 1.69 × 10 6 |
| F15 | Avg | 4.01 × 10 9 | 1.96 × 10 7 | 4.42 × 10 8 | 6.70 × 10 7 | 7.82 × 10 8 | 6.64 × 10 6 | 1.99 × 10 8 | 1.15 × 10 7 | 1.15 × 10 9 | 1.27 × 10 5 |
| F15 | STD | 1.67 × 10 9 | 2.10 × 10 7 | 1.74 × 10 8 | 5.54 × 10 7 | 4.90 × 10 8 | 7.77 × 10 6 | 9.49 × 10 8 | 6.46 × 10 6 | 4.44 × 10 8 | 8.11 × 10 4 |
| F16 | Avg | 7.76 × 10 3 | 4.17 × 10 3 | 4.97 × 10 3 | 4.07 × 10 3 | 5.07 × 10 3 | 4.52 × 10 3 | 8.20 × 10 3 | 4.30 × 10 3 | 5.98 × 10 3 | 3.71 × 10 3 |
| F16 | STD | 1.37 × 10 3 | 3.11 × 10 2 | 2.42 × 10 2 | 3.05 × 10 2 | 5.74 × 10 2 | 6.95 × 10 2 | 2.17 × 10 3 | 2.58 × 10 2 | 3.87 × 10 2 | 5.04 × 10 2 |
| F17 | Avg | 5.87 × 10 4 | 2.97 × 10 3 | 3.29 × 10 3 | 2.82 × 10 3 | 3.79 × 10 3 | 2.78 × 10 3 | 1.56 × 10 4 | 2.97 × 10 3 | 4.28 × 10 3 | 2.75 × 10 3 |
| F17 | STD | 5.39 × 10 4 | 2.23 × 10 2 | 1.38 × 10 2 | 1.76 × 10 2 | 4.24 × 10 2 | 3.41 × 10 2 | 1.54 × 10 4 | 1.79 × 10 2 | 4.47 × 10 2 | 2.49 × 10 2 |
| F18 | Avg | 6.24 × 10 8 | 1.12 × 10 7 | 2.39 × 10 7 | 1.43 × 10 7 | 8.22 × 10 7 | 1.34 × 10 7 | 3.95 × 10 8 | 1.18 × 10 7 | 5.33 × 10 7 | 3.37 × 10 6 |
| F18 | STD | 4.07 × 10 8 | 9.78 × 10 6 | 1.08 × 10 7 | 9.37 × 10 6 | 7.69 × 10 7 | 1.53 × 10 7 | 3.66 × 10 8 | 5.69 × 10 6 | 3.47 × 10 7 | 3.79 × 10 6 |
| F19 | Avg | 4.48 × 10 9 | 2.26 × 10 7 | 6.65 × 10 8 | 8.85 × 10 7 | 1.08 × 10 9 | 2.17 × 10 7 | 2.65 × 10 9 | 5.60 × 10 7 | 2.58 × 10 9 | 1.58 × 10 6 |
| F19 | STD | 2.27 × 10 9 | 2.25 × 10 7 | 2.89 × 10 8 | 3.94 × 10 7 | 6.25 × 10 8 | 3.15 × 10 7 | 1.59 × 10 9 | 3.52 × 10 7 | 8.11 × 10 8 | 1.44 × 10 6 |
| F20 | Avg | 3.24 × 10 3 | 3.23 × 10 3 | 2.97 × 10 3 | 2.96 × 10 3 | 3.25 × 10 3 | 2.93 × 10 3 | 3.28 × 10 3 | 2.84 × 10 3 | 3.20 × 10 3 | 2.83 × 10 3 |
| F20 | STD | 1.53 × 10 2 | 1.70 × 10 2 | 1.24 × 10 2 | 1.53 × 10 2 | 1.15 × 10 2 | 2.31 × 10 2 | 2.28 × 10 2 | 1.27 × 10 2 | 1.15 × 10 2 | 1.92 × 10 2 |
| F21 | Avg | 2.66 × 10 3 | 2.59 × 10 3 | 2.71 × 10 3 | 2.60 × 10 3 | 2.77 × 10 3 | 2.62 × 10 3 | 2.81 × 10 3 | 2.64 × 10 3 | 2.87 × 10 3 | 2.61 × 10 3 |
| F21 | STD | 7.17 × 10 1 | 3.12 × 10 1 | 2.28 × 10 1 | 2.38 × 10 1 | 6.02 × 10 1 | 6.05 × 10 1 | 5.82 × 10 1 | 3.05 × 10 1 | 3.13 × 10 1 | 6.02 × 10 1 |
| F22 | Avg | 8.87 × 10 3 | 6.67 × 10 3 | 8.68 × 10 3 | 1.01 × 10 4 | 9.70 × 10 3 | 8.19 × 10 3 | 1.00 × 10 4 | 5.02 × 10 3 | 1.02 × 10 4 | 7.05 v |
| F22 | STD | 1.54 × 10 3 | 2.62 × 10 3 | 6.30 × 10 2 | 1.02 × 10 3 | 1.28 × 10 3 | 1.73 × 10 3 | 7.17 × 10 2 | 8.07 × 10 2 | 4.15 × 10 2 | 2.03 × 10 3 |
| F23 | Avg | 3.42 × 10 3 | 3.00 × 10 3 | 3.34 × 10 3 | 3.08 × 10 3 | 3.23 × 10 3 | 3.15 × 10 3 | 4.03 × 10 3 | 3.23 × 10 3 | 3.41 × 10 3 | 3.32 × 10 3 |
| F23 | STD | 1.25 × 10 2 | 4.88 × 10 1 | 4.35 × 10 1 | 3.41 × 10 1 | 1.08 × 10 2 | 7.73 × 10 1 | 2.33 × 10 2 | 6.06 × 10 1 | 5.20 × 10 1 | 1.92 × 10 2 |
| F24 | Avg | 3.59 × 10 3 | 3.15 v | 3.55 × 10 3 | 3.25 × 10 3 | 3.28 × 10 3 | 3.23 × 10 3 | 4.48 × 10 3 | 3.44 × 10 3 | 3.51 × 10 3 | 3.46 × 10 3 |
| F24 | STD | 1.37 × 10 2 | 4.37 × 10 1 | 5.89 × 10 1 | 4.67 × 10 1 | 5.97 × 10 1 | 9.13 × 10 1 | 2.27 × 10 2 | 6.22 × 10 1 | 4.62 × 10 1 | 1.68 × 10 2 |
| F25 | Avg | 1.15 × 10 4 | 3.54 × 10 3 | 6.81 × 10 3 | 3.62 × 10 3 | 6.89 × 10 3 | 3.26 × 10 3 | 6.78 × 10 3 | 3.68 × 10 3 | 1.56 × 10 4 | 3.01 × 10 3 |
| F25 | STD | 3.54 × 10 3 | 1.96 × 10 2 | 8.53 × 10 2 | 2.44 × 10 2 | 1.71 × 10 3 | 8.79 × 10 1 | 9.49 × 10 2 | 2.18 × 10 2 | 2.27 × 10 3 | 3.43 × 10 1 |
| F26 | Avg | 1.30 × 10 4 | 7.34 × 10 3 | 1.04 × 10 4 | 8.03 × 10 3 | 1.02 × 10 4 | 8.41 × 10 3 | 1.31 × 10 4 | 8.40 × 10 3 | 1.25 × 10 4 | 8.10 × 10 3 |
| F26 | STD | 1.78 × 10 3 | 4.72 × 10 2 | 5.16 × 10 2 | 3.66 × 10 2 | 1.19 × 10 3 | 1.31 × 10 3 | 1.16 × 10 3 | 6.20 × 10 2 | 7.50 × 10 2 | 1.23 × 10 3 |
| F27 | Avg | 4.47 × 10 3 | 3.38 × 10 3 | 3.97 × 10 3 | 3.56 × 10 3 | 3.66 × 10 3 | 3.48 × 10 3 | 5.66 × 10 3 | 3.20 × 10 3 | 3.70 × 10 3 | 3.61 × 10 3 |
| F27 | STD | 3.75 × 10 2 | 4.85 × 10 1 | 1.25 × 10 2 | 7.56 × 10 1 | 2.39 × 10 2 | 1.37 × 10 2 | 5.72 × 10 2 | 8.63 × 10 5 | 9.64 × 10 1 | 1.96 × 10 2 |
| F28 | Avg | 9.66 × 10 3 | 4.22 × 10 3 | 6.96 × 10 3 | 4.52 × 10 3 | 7.55 × 10 3 | 3.89 × 10 3 | 8.33 × 10 3 | 4.65 × 10 3 | 8.82 × 10 3 | 3.50 × 10 3 |
| F28 | STD | 1.75 × 10 3 | 3.56 × 10 2 | 5.10 × 10 2 | 4.34 × 10 2 | 1.24 × 10 3 | 2.27 × 10 2 | 8.61 × 10 2 | 6.86 × 10 2 | 8.00 × 10 2 | 1.34 × 10 2 |
| F29 | Avg | 1.09 × 10 5 | 5.23 × 10 3 | 5.98 × 10 3 | 5.35 × 10 3 | 6.57 × 10 3 | 5.66 × 10 3 | 1.71 × 10 4 | 5.28 × 10 3 | 6.76 × 10 3 | 5.06 × 10 3 |
| F29 | STD | 1.15 × 10 5 | 3.13 × 10 2 | 3.76 × 10 2 | 3.78 × 10 2 | 8.11 × 10 2 | 6.46 × 10 2 | 9.77 × 10 3 | 4.97 × 10 2 | 6.18 × 10 2 | 4.11 × 10 2 |
| F30 | Avg | 3.25 × 10 9 | 3.64 × 10 7 | 4.81 × 10 9 | 2.03 × 10 8 | 7.21 × 10 8 | 6.96 × 10 7 | 3.63 × 10 8 | 2.07 × 10 8 | 9.92 × 10 8 | 1.52 × 10 7 |
| F30 | STD | 1.25 × 10 9 | 2.41 × 10 7 | 1.29 × 10 8 | 8.22 × 10 7 | 5.21 × 10 8 | 6.95 × 10 7 | 1.79 × 10 9 | 6.29 × 10 7 | 2.79 × 10 8 | 1.32 × 10 7 |
Table 7. Summary of literature review on variants and modified HHO algorithms.

| SN | Modification Name | Ref. | Authors | Journal/Conf. | Year | Remarks |
|---|---|---|---|---|---|---|
| 1 | Binary HHO (BHHO) | [110] | Too et al. | Electronics | 2019 | Introduces two binary versions of HHO: BHHO and Quadratic Binary HHO (QBHHO). |
| 2 | Opposite HHO (OHHO) | [111] | Hans et al. | Journal of Interdisciplinary Mathematics | 2020 | Applies OHHO to feature selection for breast cancer classification. |
| 3 | EHHO | [112] | Jiao et al. | Energy | 2020 | Combines OBL and OL in HHO. |
| 4 | NCOHHO | [113] | Fan et al. | Evolutionary Intelligence | 2020 | Improves HHO with two mechanisms: neighborhood centroid and opposition-based learning. |
| 5 | IHHO | [114] | Song et al. | Energy Sources | 2020 | Employs two techniques: quasi-oppositional learning and chaos theory. |
| 6 | LMHHO | [115] | Hussain et al. | IEEE Access | 2019 | Long-term memory HHO (LMHHO), in which information from multiple promising regions is shared. |
| 7 | CMDHHO | [116] | Golilarz et al. | IEEE Access | 2020 | Uses three techniques with HHO: chaos theory, a multi-population topological structure, and the DE mutation and crossover operators. |
| 8 | GCHHO | [117] | Song et al. | Knowledge-Based Systems | 2020 | Employs Gaussian mutation and Cuckoo Search in HHO. |
| 9 | AHHO | [117] | Wunnava et al. | Applied Soft Computing | 2020 | Uses a mutation strategy to force the escaping energy into the interval [0, 2]. |
| 10 | DHHO/M | [118] | Jia et al. | Remote Sensing | 2019 | Proposes a dynamic HHO algorithm with a mutation strategy. |
| 11 | Vibrational HHO (VHHO) | [119] | Shao et al. | Measurement | 2020 | Embeds SVM into HHO and uses a periodic mutation. |
| 12 | GBHHO | [120] | Wei et al. | IEEE Access | 2020 | Develops an improved HHO approach using Gaussian bare-bone (GB) sampling. |
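Several of the variants in Table 7 (OHHO, EHHO, NCOHHO, IHHO) build on opposition-based learning. The following is a minimal sketch of the basic OBL step under the usual definition x_opp = lb + ub − x for box bounds [lb, ub]; the variant-specific refinements (neighborhood centroids, quasi-opposition, chaotic maps) are omitted, and the bounds and population shape are placeholders.

```python
import numpy as np

def opposite_population(X: np.ndarray, lb: float, ub: float) -> np.ndarray:
    """Return the opposite candidate of each solution in X (rows = agents)."""
    return lb + ub - X

# Typical CEC-style benchmark bounds; 30 agents in 30 dimensions as in Table 2.
lb, ub = -100.0, 100.0
rng = np.random.default_rng(1)
X = rng.uniform(lb, ub, size=(30, 30))
X_opp = opposite_population(X, lb, ub)
# A variant would then keep, per agent, whichever of x / x_opp has better fitness,
# which widens early exploration at negligible extra cost.
```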
Table 9. The applications of the HHO algorithm.

| Proposed | Application | Description | Results and Conclusion | Authors | Ref. |
|---|---|---|---|---|---|
| HHOPSO | Reactive power planning problem | HHO hybridized with PSO | HHOPSO obtains better results than HHO | Shekarappa et al. | [207] |
| CMBHHO | Distributed generator (DG) placement | Crossover and mutation are used in HHO | CMBHHO outperforms HHO, LSA, GA, and TLBO | Mohandas and Devanathan | [208] |
| WHHO | PV solar | Whippy HHO | WHHO achieves better results | Naeijian et al. | [209] |
| NCOHHO | ANN | Training a multilayer feed-forward ANN | NCOHHO tested on 5 different datasets | Fan et al. | [113] |
| HHO-DE | Multilevel image thresholding | Otsu's and Kapur's entropy methods are used | Outperforms all other algorithms | Bao et al. | [210] |
| HHO | Flow shop scheduling | A hybrid algorithm based on HHO is designed | HHO achieved good results compared with others | Utama and Widodo | [211] |
| MEHHO1 and MEHHO2 | Feature selection | A memory-saving mechanism and a learning strategy are adopted | MEHHO1 achieved good results compared with HHO | Too et al. | [212] |
| DHGHHD | Drug discovery | HGSO-enhanced HHO | Evaluated on 2 real-world datasets | Abd Elaziz and Yousri | [213] |
| CovH2SD | COVID-19 detection | HHO is used to optimize a CNN | Transfer learning techniques using 9 convolutional NNs | Balaha et al. | [214] |