Article

HPSBA: A Modified Hybrid Framework with Convergence Analysis for Solving Wireless Sensor Network Coverage Optimization Problem

1 Electrical Engineering College, Guizhou University, Guiyang 550025, China
2 School of Computer Science and Engineering, South China University of Technology, Guangzhou 510006, China
3 Guizhou Provincial Key Laboratory of Internet + Intelligent Manufacturing, Guiyang 550025, China
4 College of Forestry, Guizhou University, Guiyang 550025, China
* Author to whom correspondence should be addressed.
Axioms 2022, 11(12), 675; https://doi.org/10.3390/axioms11120675
Submission received: 23 October 2022 / Revised: 22 November 2022 / Accepted: 24 November 2022 / Published: 27 November 2022
(This article belongs to the Special Issue Computational Intelligence and Software Engineering)

Abstract: Complex optimization (CO) problems have been solved using swarm intelligence (SI) methods. One such CO problem is the Wireless Sensor Network (WSN) coverage optimization problem, which plays an important role in the Internet of Things (IoT). A novel hybrid algorithm, named the hybrid particle swarm butterfly algorithm (HPSBA), is proposed for solving this problem by combining the strengths of particle swarm optimization (PSO) and the butterfly optimization algorithm (BOA). Notably, the individual scent intensity should be non-negative, a point not considered in the basic BOA; the proposed HPSBA therefore computes it with an absolute value. Moreover, the performance of HPSBA is comprehensively compared with the fundamental BOA, numerous potential BOA variants, and well-established algorithms on twenty-six commonly used benchmark functions. The results show that HPSBA has a competitive overall performance. Finally, HPSBA is applied to the node coverage optimization problem in WSNs and compared with PSO, BOA, and MBOA. The experimental results demonstrate that the coverage optimized by HPSBA achieves a higher coverage rate, which effectively reduces node redundancy and extends WSN survival time.

1. Introduction

With the emergence of heuristic intelligent optimization algorithms, new methods have been provided for solving complex engineering problems. The principle is mostly to imitate the biological habits, such as foraging and courtship, of biological communities. According to their theoretical principles, meta-heuristic optimization algorithms can be divided into four categories (see Figure 1). Typical swarm intelligence algorithms (SI-based) are: Particle swarm optimization (PSO) [1], Cuckoo search (CS) [2], Grey wolf optimizer (GWO) [3], Whale optimization algorithm (WOA) [4], Marine predators algorithm (MPA) [5], Ant colony optimization (ACO) [6], Firefly algorithm (FA) [7], Moth-flame optimization (MFO) [8], Grasshopper optimization algorithm (GOA) [9], and Butterfly optimization algorithm (BOA) [10]. Evolution algorithms (Ev-based) are: Genetic algorithm (GA) [11], Differential evolution (DE) [12], Biogeography-based optimizer (BBO) [13], and Genetic programming (GP) [14]. Algorithms based on physical characteristics (Phy-based) are: Simulated annealing (SA) [15], Gravitational search algorithm (GSA) [16], Harmony search (HS) [17], Sine cosine algorithm (SCA) [18], Equilibrium optimizer (EO) [19], and Gradient-based optimizer (GBO) [20]. Algorithms based on human social behavior (Hu-based) are: Teaching learning based optimization (TLBO) [21], Tabu search (TS) [22], Socio evolution and learning optimization (SELO) [23], and Political optimizer (PO) [24]. For a more detailed review, readers may refer to the literature [25,26].
According to the biological characteristics of natural animals (such as oviparous animals or mammals, and insects) and plants, swarm intelligence optimization algorithms can be divided into three categories. Imitating animal habits such as: Particle swarm optimization (PSO) [1], Cuckoo search (CS) [2], Grey wolf optimizer (GWO) [3], Whale optimization algorithm (WOA) [4], Marine predators algorithm (MPA) [5], etc. Imitating the habits of insects such as: Ant colony optimization (ACO) [6], Firefly algorithm (FA) [7], Moth-flame optimization (MFO) [8], Grasshopper optimization algorithm (GOA) [9], Butterfly optimization algorithm (BOA) [10], etc. Imitating plant characteristics such as: Flower pollination algorithm (FPA) [27], Tree-seed algorithm (TSA) [28], etc.
PSO is a typical and widely used intelligent optimization algorithm with the advantage of fast convergence. Some recent works on hybridizing PSO with other SI algorithms are: hybrid PSO and DE [29], hybrid PSO and GSA [30], hybrid GA and PSO [31,32], hybrid PSO and SSA [33], etc. The butterfly optimization algorithm [10] is a novel swarm intelligence algorithm proposed by Arora and Singh, which has been used to solve the wireless sensor network node localization problem [34] and the optimization training of wavelet neural networks [35]. However, BOA is susceptible to local optima and suffers from premature convergence. Several recent works on BOA are: improved BOA [36], modified BOA [37], hybrid BOA and PSO [38], etc.
The wireless sensor network (WSN) consists of a large number of sensor nodes with limited energy [39], and it is crucial to the Internet of Things (IoT). Through mutual cooperation between nodes, a WSN processes the detection data of sensed objects and provides users with accurate and comprehensive real-time data. WSNs have been widely used in military, transportation, environmental monitoring, and other fields [40]. The coverage problem is one of the key tasks in WSN research, since it reflects the quality of service; the coverage ratio is an important indicator for evaluating the performance of WSN nodes.
In a working area, sensor nodes are usually arranged at random in the initial stage. This method produces regions of high node density, resulting in a low coverage rate that directly impacts monitoring quality [41,42]. As a result, optimizing sensor node coverage is critical to increasing the WSN coverage ratio in the work area. SI optimization algorithms have recently made significant contributions to the WSN node coverage optimization problem. Wang et al. [43] proposed a resampled PSO to solve the coverage control problem in the IoT. Yang et al. [44] used an improved FA to solve the sensor coverage problem, considering the target coverage and network connectivity of sensor nodes. Miao et al. [45] proposed a GWO-EH algorithm to address the WSN node coverage optimization problem. Wang et al. [46] proposed a coverage-oriented topology optimization method for a WSN based on the wolf pack algorithm (WPA). Dao et al. [47] proposed a WSN coverage optimization model based on an improved Archimedes optimization algorithm for the working area. Although the above-mentioned SI algorithms have achieved success, they still struggle to escape local optima when optimization problems become more challenging, which makes it necessary to find new methods.
Because of the advantages of BOA, with its simple structure and few adjustment parameters, and PSO, with its fast convergence, a novel chaotic hybrid butterfly optimization algorithm with particle swarm optimization (HPSOBOA) [38] was proposed for solving high-dimensional optimization problems. However, both BOA and PSO can easily fall into a local optimum and have low convergence accuracy, and HPSOBOA also performs poorly on engineering optimization problems, which makes further research necessary. Notably, the individual scent intensity should be non-negative in nature, which is not considered in the basic BOA and its variants. Thus, a novel hybrid particle swarm butterfly algorithm (HPSBA) is proposed on the basis of HPSOBOA. The main contributions and highlights are as follows:
  • A novel hybrid particle swarm butterfly algorithm is proposed. This combination strikes a balance between exploitation and exploration. The control strategy of parameter c is based on the Logistic map, and parameter ω follows an adaptive adjustment strategy, improving the optimization speed, convergence accuracy, and global search capability of the HPSBA. Moreover, the individual scent intensity value is computed with an absolute value in the proposed HPSBA.
  • To ensure that the proposed algorithm works, we compare the optimization results of twenty-six benchmark functions with ten intelligent optimization algorithms. According to the mean value (Mean), standard deviation (Std), Wilcoxon rank-sum (WRS) test findings, and convergence curves, the simulation results show that HPSBA has a competitive overall performance.
  • The node optimization coverage problem of the WSN is solved using the proposed HPSBA. The application and advantages of the HPSBA are also discussed.
The remaining sections of this study are as follows: The mathematical model for the WSN’s node coverage optimization (NCO) problem is established in Section 2, which goes over the underlying concepts of the PSO and BOA. The proposed HPSBA is explained in detail in Section 3. The outcomes of the algorithms’ comparison experiments are presented in Section 4. In Section 5, HPSBA is applied to solve the WSN’s NCO problem. Section 6 concludes with a discussion of the next steps.

2. Basic Knowledge

2.1. Particle Swarm Optimization

Two important characteristics of the PSO algorithm [1] are the position and velocity of the particles, which are updated according to Equations (1) and (2):

$$v_i^{t+1} = \omega \cdot v_i^t + c_1 \cdot rand_1 \times (pbest - x_i^t) + c_2 \cdot rand_2 \times (gbest - x_i^t) \quad (1)$$

$$x_i^{t+1} = x_i^t + v_i^{t+1} \quad (2)$$

where $v_i^t$ and $v_i^{t+1}$ are the velocities of the $i$-th particle at iterations $t$ and $t+1$, respectively. $pbest$ and $gbest$ represent the individual best position and the global best position of the particle. $rand_1$ and $rand_2$ are random numbers in (0, 1), and usually $c_1 = c_2 = 2$. $\omega$ is the inertia weight coefficient.
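To make the update rules concrete, the following is a minimal NumPy sketch of one PSO run on the sphere function $f(x) = \sum x^2$. The paper's experiments use MATLAB; this Python version, including the particle count, bounds, and inertia weight, is purely illustrative.

```python
import numpy as np

np.random.seed(0)

def pso_step(x, v, pbest, gbest, w=0.7, c1=2.0, c2=2.0):
    """Apply Equations (1) and (2): velocity update, then position update."""
    r1 = np.random.rand(*x.shape)  # rand_1
    r2 = np.random.rand(*x.shape)  # rand_2
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v

f = lambda x: np.sum(x**2, axis=1)        # sphere function as the objective
x = np.random.uniform(-10, 10, (30, 2))   # 30 particles in a 2-D search space
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), f(x)
f0 = pbest_f.min()                        # best fitness before optimization
for _ in range(100):
    gbest = pbest[np.argmin(pbest_f)]     # global best position
    x, v = pso_step(x, v, pbest, gbest)
    fx = f(x)
    better = fx < pbest_f                 # update personal bests
    pbest[better], pbest_f[better] = x[better], fx[better]
```

Because personal bests are only replaced when a particle improves, the best fitness found is non-increasing over iterations, which is the property the convergence curves in Section 4 visualize.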

2.2. Butterfly Optimization Algorithm

In BOA [10], each butterfly in the group has a unique sense and individual perception ability. The intensity of fragrance perception is generated between individuals. Figure 2 presents the food foraging of butterflies in the 2-D search space.
The following is an expression of the intensity of scent that other butterflies perceive:
$$F(x) = c I^{a} \quad (3)$$

where $F(x)$ denotes the scent intensity function, $c$ denotes the sensory modality, and $I$ indicates the stimulus intensity, that is, the function fitness value. $a$ denotes the intensity factor, and the value range of parameter $a$ is [0, 1]. The sensory modality $c$ is calculated as follows:
$$c^{t+1} = c^{t} + \frac{0.025}{c^{t} \cdot T_{max}} \quad (4)$$

where the initial value of $c$ is set to 0.01 in the basic BOA [10]. However, the parameter $c$ can in theory be set to any value within $[0, \infty)$. $T_{max}$ is the maximum number of iterations.
The switching probability S P determines the global search and local search of the BOA. The position update formula is expressed as follows:
$$x_i^{t+1} = \begin{cases} x_i^{t} + (r^2 \times g - x_i^{t}) \times F_i, & SP \ge rand \\ x_i^{t} + (r^2 \times x_j^{t} - x_k^{t}) \times F_i, & SP < rand \end{cases} \quad (5)$$

where $x_i^t$ denotes the spatial position of the $i$-th butterfly in the $t$-th iteration. $g$ is the best position among all butterfly individuals in the current iteration. $x_j^t$ and $x_k^t$ are the positions of the $j$-th and $k$-th butterfly individuals at iteration $t$, respectively. $r$ is a random number in (0, 1), and $F_i$ is the scent intensity value of the $i$-th butterfly.
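One BOA iteration under these equations can be sketched as follows. This is an illustrative Python version (the paper uses MATLAB); the fitness here is assumed non-negative, e.g. the sphere function, since the basic BOA does not address negative stimulus intensities, the issue revisited in Section 3.

```python
import numpy as np

np.random.seed(1)

def boa_step(X, fitness, c=0.01, a=0.1, SP=0.8):
    """One BOA iteration for minimization; fitness plays the role of I."""
    F = c * fitness**a                 # scent intensity F = c * I^a, Eq. (3)
    g = X[np.argmin(fitness)]          # best position in the current iteration
    Xn = np.empty_like(X)
    for i in range(len(X)):
        r = np.random.rand()
        if np.random.rand() < SP:      # global search toward the best position g
            Xn[i] = X[i] + (r**2 * g - X[i]) * F[i]
        else:                          # local search around two random butterflies
            j, k = np.random.randint(len(X), size=2)
            Xn[i] = X[i] + (r**2 * X[j] - X[k]) * F[i]
    return Xn

X = np.random.uniform(-1, 1, (20, 3))  # 20 butterflies in a 3-D search space
for _ in range(50):
    X = boa_step(X, np.sum(X**2, axis=1))
```

The switching probability $SP$ decides per butterfly, per iteration, whether the global-search or local-search branch of Equation (5) is applied.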

2.3. Node Coverage Optimization Problem Model

For the two-dimensional point coverage problem in a WSN [39], it is assumed that there are $n$ detection points to be covered in the two-dimensional coverage area, and that the coverage nodes use homogeneous sensors, that is, sensors with the same sensing radius. Suppose the sensing radius is $r_s$ and the communication radius is $r_c$, both measured in meters. The sensing radius is the maximum distance at which the received signal strength of a node is greater than the inherent noise, that is, the sensing range of the node. The communication radius is generally the maximum distance for transmitting data or signals between nodes.
Assuming that the monitoring area contains $n$ target points, the position coordinates of the $i$-th target point to be monitored are $(x_i, y_i)$, and the position coordinates of sensor node $s$ are $(x_s, y_s)$. Then, the Euclidean distance between the sensor and the target to be monitored can be expressed as:

$$d(i, s) = \sqrt{(x_s - x_i)^2 + (y_s - y_i)^2} \quad (6)$$
The binary perception model [48,49] is used in this study; the probability $p$ that target node $i$ is monitored by sensor node $s$ can be defined as:

$$p(i, s) = \begin{cases} 0, & d(i, s) \ge r_s \\ 1, & d(i, s) < r_s \end{cases} \quad (7)$$
We divide the two-dimensional plane area to be deployed along the $x$ and $y$ axes with step length $q$, so that the length of each segment is $l = q$ and the coverage area contains $q^2$ grid intersections. The node coverage rate is defined as the probability that the monitoring point set $T$ in the coverage area is covered by the node set $S$:

$$C_{ov} = \frac{p_{cov}}{q^2} = \frac{\sum_{i=1}^{S} p(i, s)}{q^2} \quad (8)$$
Assuming that the coverage area is a square with side length $L$, and $r_s$ denotes the node sensing radius, the number of deployed nodes needed in the coverage area can be calculated theoretically. The schematic diagram of full node coverage is shown in Figure 3.
In Figure 3, $O_1$, $O_2$, and $O_3$ indicate the positions of the three nodes. The triangle $O_1O_2O_3$ is an equilateral triangle in which $O_3A = r_s$, that is, the sensing radius of the sensor, $\angle AO_3B = \pi/3$, and $AB = BO_3 = O_3A = r_s$. According to the properties of the circle, $AB \perp O_2O_3$ and $\angle AO_3C = \frac{1}{2}\angle AO_3B = \pi/6$. By trigonometry, the length of the line segment $O_3C$ can be expressed as:

$$L_{O_3C} = L_{O_3A} \times \cos(\angle AO_3C) = r_s \times \cos(\pi/6) = \frac{\sqrt{3}}{2} r_s \quad (9)$$
Thus, the number of nodes in the coverage area can theoretically be calculated by Equation (10):

$$M = \left( \left\lceil \frac{L}{\frac{\sqrt{3}}{2} r_s + r_s} \right\rceil + 1 \right)^{2} \quad (10)$$
According to the above analysis, the NCO problem can be simplified as a constrained optimization problem, expressed as follows:

$$\max f(x) = C_{ov}, \quad \text{s.t.} \quad \begin{cases} g_1 = \sum_{i=1}^{S} p(i, s) \ge 0, \\ g_2 = \sum_{i=1}^{S} p(i, s) - q^2 \le 0, \\ g_3 = d(i, s) - r_s \le 0, \\ g_4 = S - M \le 0. \end{cases}$$

where $r_s$ denotes the perception radius of the node, $p(i, s)$ indicates the probability of target node $i$ being monitored and covered by sensor node $s$, and $d(i, s)$ indicates the Euclidean distance between sensor node $s$ and monitored target node $i$. $M$ is the theoretical number of nodes in the coverage area, and $S$ is the set of coverage nodes in the monitoring area.
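The coverage model can be sketched in a few lines. This is an illustrative Python version (the paper's experiments use MATLAB); the area size, grid resolution, and random deployment are assumptions for the example, and the ceiling in `node_count` follows our reading of Equation (10).

```python
import numpy as np

np.random.seed(2)

def coverage_rate(sensors, targets, rs):
    """Fraction of grid targets within rs of at least one sensor (binary model)."""
    # Pairwise Euclidean distances d(i, s), Eq. (6)
    d = np.linalg.norm(targets[:, None, :] - sensors[None, :, :], axis=2)
    covered = (d < rs).any(axis=1)     # p(i, s) = 1 iff d(i, s) < r_s, Eq. (7)
    return covered.mean()              # C_ov = (sum of p) / q^2, Eq. (8)

def node_count(L, rs):
    """Theoretical number of nodes M for an L x L area, per Equation (10)."""
    return int(np.ceil(L / (np.sqrt(3) / 2 * rs + rs)) + 1) ** 2

q, L, rs = 25, 50.0, 5.0                                   # q^2 monitoring points
gx, gy = np.meshgrid(np.linspace(0, L, q), np.linspace(0, L, q))
targets = np.column_stack([gx.ravel(), gy.ravel()])
sensors = np.random.uniform(0, L, (node_count(L, rs), 2))  # random deployment
cov = coverage_rate(sensors, targets, rs)                  # objective value C_ov
```

An optimizer such as HPSBA then searches over the sensor coordinates to maximize `cov` subject to the constraints $g_1$ to $g_4$.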

3. Method

In this section, a novel HPSBA is proposed to improve optimization speed, convergence accuracy, and global search capability over PSO and BOA. Combining the two algorithms makes it possible to strike a balance between exploitation and exploration and to benefit from the strengths of both. Notably, the individual scent intensity value is computed with an absolute value in the proposed HPSBA. Furthermore, HPSBA is used to solve the node optimization coverage problem of the WSN for the Internet of Things (IoT).

3.1. Hybrid Particle Swarm Butterfly Algorithm (HPSBA)

3.1.1. Algorithmic Population Initialization

In the $D$-dimensional search space, the randomly generated initial solutions are defined as follows:

$$X_i = L_b + (U_b - L_b) \cdot rand$$

where $X_i$ represents the spatial position of the $i$-th butterfly individual $(i = 1, 2, 3, \ldots, N)$ in the butterfly swarm, and $N$ denotes the number of initial individual solutions. $L_b$ and $U_b$ are the lower and upper bounds of the search space, and $rand$ is a random number matrix with entries in (0, 1).
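This initialization is one line in NumPy; the bounds and population size below are arbitrary illustrative values.

```python
import numpy as np

def init_population(N, D, Lb, Ub):
    """X_i = Lb + (Ub - Lb) * rand for N individuals in D dimensions."""
    return Lb + (Ub - Lb) * np.random.rand(N, D)

X = init_population(30, 10, -100.0, 100.0)   # 30 agents in [-100, 100]^10
```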

3.1.2. Algorithmic Exploration

The exploration stage of the proposed HPSBA is expressed as:

$$V_{i-1}^{t} = \omega V_{i-1}^{t} + C_1 r_1 \times (p_b - X_{i-1}^{t}) + C_2 r_2 \times (g_b - X_{i-1}^{t})$$

$$X_i^{t} = X_{i-1}^{t} + V_{i-1}^{t}$$

where $\omega$ is the inertia weight coefficient, and $C_1$ and $C_2$ are adjustment parameters. $X_i^t$ and $X_{i-1}^t$ represent the positions of the $i$-th and $(i-1)$-th agents at iteration $t$. $V_i^t$ and $V_{i-1}^t$ are the velocities of the $i$-th and $(i-1)$-th agents at the $t$-th iteration, respectively. $p_b$ and $g_b$ represent the personal best and global best positions of the agent. $r_1$ and $r_2$ are random numbers in (0, 1).

3.1.3. Algorithmic Exploitation

The exploitation stage of the proposed HPSBA is expressed as:

$$X_i^{t+1} = \begin{cases} \omega \cdot X_i^{t} + r^2 \cdot (g_b - X_i^{t}) \times |F_i|, & SP \ge rand \\ \omega \cdot X_i^{t} + r^2 \cdot (X_k^{t} - X_j^{t}) \times |F_i|, & SP < rand \end{cases} \quad (14)$$

where $\omega$ indicates the adaptive adjustment parameter. $X_i^{t+1}$ and $X_i^t$ represent the positions of the $i$-th particle at iterations $t+1$ and $t$, respectively. $X_j^t$ and $X_k^t$ are the positions of the $j$-th and $k$-th individuals randomly selected from the solutions. $F_i$ is the scent intensity value of the $i$-th individual. Most notably, the individual scent intensity should be non-negative, so we take the absolute value of $F_i$ in the proposed HPSBA.
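A sketch of this exploitation step follows (illustrative Python; the paper uses MATLAB). The absolute value on the scent intensity is the HPSBA modification highlighted above; computing the fragrance from the magnitude of the fitness is our assumption to keep the fractional power well-defined.

```python
import numpy as np

np.random.seed(3)

def hpsba_exploit(X, fitness, w, c=0.35, a=0.1, SP=0.8):
    """One HPSBA exploitation step, Eq. (14); scent intensity enters as |F_i|."""
    I = np.abs(fitness)                # stimulus intensity magnitude (assumption)
    F = np.abs(c * I**a)               # |F_i|: non-negative by construction
    gb = X[np.argmin(fitness)]         # global best position g_b
    Xn = np.empty_like(X)
    for i in range(len(X)):
        r = np.random.rand()
        if np.random.rand() < SP:      # move toward the global best
            Xn[i] = w * X[i] + r**2 * (gb - X[i]) * F[i]
        else:                          # move between two random agents
            j, k = np.random.randint(len(X), size=2)
            Xn[i] = w * X[i] + r**2 * (X[k] - X[j]) * F[i]
    return Xn

X = np.random.uniform(-5, 5, (20, 4))
for t in range(100):
    w = 0.9 - (0.9 - 0.2) * t / 100    # adaptive omega of Section 3.1.4
    X = hpsba_exploit(X, np.sum(X**2, axis=1), w)
best = np.sum(X**2, axis=1).min()
```

Compared with the BOA update, the inertia term $\omega \cdot X_i^t$ damps the position itself, which is how the adaptive $\omega$ schedule influences exploitation.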

3.1.4. The Chaotic Adjusting Strategies

Chaos theory has many applications in SI algorithms, such as chaotic population initialization [38] and chaotic adjustment strategies for the control parameters [50]. The expression of the Logistic map is defined in Ref. [51], and its chaotic sequence lies in (0, 1). When $\mu = 4$, the mapping produces strong chaotic behavior. In this paper, the control parameter $c$ of the proposed HPSBA is expressed as:

$$c(t+1) = 4 \cdot c(t) \cdot (1 - c(t))$$
The inertia weight coefficient $\omega$ has a direct impact on the particle flight speed of the PSO algorithm and can modify the algorithm's global and local search capabilities. We adopt an adaptive adjustment strategy, expressed as follows:

$$\omega(t) = \omega_u - (\omega_u - \omega_l) \cdot t / T_{max}$$

where $\omega_u = 0.9$, $\omega_l = 0.2$, and $T_{max}$ is the maximum iteration number of the algorithm. For the control parameters $c$ and $\omega$, $T_{max} = 500$ and different $c(0)$ values are taken; the corresponding chaotic sequences are shown in Figure 4. To select the best initial value of the control parameter, the Schwefel 1.2 and Salomon functions are used to perform optimization tests. The optimization results are shown in Table 1.
As seen from Figure 4, when $c(0) = 0.25$ or $c(0) = 0.75$, the control strategy of parameter $c$ falls into a fixed point of the Logistic map. That is, the initial value of parameter $c$ cannot be set to 0.25 or 0.75 in this study.
According to Figure 4 and Table 1, when $c(0) = 0.35$, HPSBA obtains the best value for the Schwefel 1.2 function (unimodal). For the Salomon function (multimodal), although $c(0) = 0.35$ does not yield the best optimal value, its search value is of the same order of magnitude as the best one, and the search times are the same. In summary, the initial value of parameter $c$ is set to 0.35 in the following experiments.
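The two parameter schedules can be written out directly in plain Python (an illustrative sketch using the settings above), which also makes the fixed-point behavior at 0.25 and 0.75 easy to verify:

```python
def c_schedule(c0, T):
    """Iterate the Logistic map c <- 4 * c * (1 - c) for T values."""
    cs = [c0]
    for _ in range(T - 1):
        cs.append(4.0 * cs[-1] * (1.0 - cs[-1]))
    return cs

def omega(t, T_max, w_u=0.9, w_l=0.2):
    """Linearly decreasing inertia weight omega(t)."""
    return w_u - (w_u - w_l) * t / T_max

# c(0) = 0.25 maps to 0.75, a fixed point of the map, and never moves again,
# which is why 0.25 and 0.75 are excluded as initial values.
print(c_schedule(0.25, 5))   # [0.25, 0.75, 0.75, 0.75, 0.75]
print(c_schedule(0.35, 5))   # a non-repeating, chaotic-looking sequence
print(omega(0, 500))         # starts at 0.9 and decays linearly toward 0.2
```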
Figure 5 shows the optimization process of the proposed algorithm. As shown in Figure 5, a new fitness value ($F_{new}$) and new agent positions are obtained in the exploration stage, which yields a local optimum. Then, using the parameter values from exploration, the proposed HPSBA obtains the global optimum in theory. Finally, the best fitness and agent position are output.

3.2. Complexity Analysis of the HPSBA

To better understand the computational complexity of the proposed HPSBA, its time and space complexity are given in this section.

3.2.1. Time Complexity

Assume that the population size is $n$, the search space dimension is $d$, and the maximum iteration number is $T_{max}$. The complexity of the HPSBA includes: population initialization $O(nd)$, fitness value calculation $O(nd)$, global and local search location updates $O(n^2 \log n)$, fitness value sorting $O(n^2)$, and control parameter updates $O(nd)$. Considering all of the aforementioned components, the total time complexity of the HPSBA is:

$$O_{HPSBA} = O(nd) + O\big(T_{max} \cdot (nd + n^2 \log n + n^2 + nd)\big)$$

The time complexity of BOA is:

$$O_{BOA} = O(nd) + O\big(T_{max} \cdot (n^2 \log n + n + nd)\big)$$

3.2.2. Space Complexity

The space complexity of an algorithm is the storage space it consumes. With population size $n$ and dimension $d$, the total space complexity of the proposed hybrid HPSBA is $O(nd)$. The butterfly optimization algorithm uses $n$ search agents, and its total space complexity is also $O(nd)$. The total space complexity of the basic BOA is therefore the same as that of the HPSBA, so the proposed algorithm has reliable and effective space efficiency.

3.3. The Pseudo-Code and Flowchart of the HPSBA

The pseudo-code of the HPSBA is presented in Algorithm 1.
Algorithm 1: Pseudo-code of HPSBA
In addition, the flowchart of the proposed HPSBA is shown as Figure 6.

3.4. Convergence Analysis of the HPSBA

Theorem 1. 
The population position vector sequence $\{X(t), t \ge 0\}$ of the proposed hybrid HPSBA method is a finite homogeneous Markov process.
Proof. 
The search space of any optimization algorithm is limited, so the population position vector sequence $\{X(t), t \ge 0\}$ of the hybrid particle swarm butterfly algorithm is also limited. In addition, the position vector of the population in the optimization process is determined by the odor behavior $F_i(t)$ and the flight speed $V_i(t)$. It can be seen that $X(t+1)$ is only related to $X(t)$; namely, $\{X(t), t \ge 0\}$ is a Markov chain. Individuals gradually approach the optimal position based on the search space's fitness value; that is, when $f(x^{t+1}) > f(x^t)$, the movement of the population is adjusted, and this adjustment does not depend on the time $t$. In summary, the population position vector sequence $\{X(t), t \ge 0\}$ of the HPSBA is a finite homogeneous Markov process. □
The essence of the HPSBA belongs to the category of random search algorithm, so the convergence criterion of random optimization algorithm [52] is used to prove the convergence of the hybrid algorithm HPSBA.

3.4.1. Convergence Criterion

For the problem $\langle Y, f \rangle$, there is a random optimization algorithm $Z$ [53]. The result of the $k$-th iteration is $x_k$, and the result of the next iteration is $x_{k+1} = Z(x_k, \zeta)$. $Y$ represents the space of potential solutions, $f$ denotes the fitness function, and $\zeta$ is the solution searched in an iteration of algorithm $Z$. The lower bound of the search on the Lebesgue measure space [52] is defined as follows:

$$\sigma = \inf \{ t \mid U(\{x \in Y \mid f(x) < t\}) > 0 \}$$

where $U(X)$ denotes the Lebesgue measure on the set $X$, and the optimal region can be defined as:

$$R_{\xi, M} = \begin{cases} \{x \in Y \mid f(x) < \sigma + \xi\}, & \sigma \text{ finite} \\ \{x \in Y \mid f(x) < -\epsilon\}, & \sigma = -\infty \end{cases}$$

where $\xi > 0$ and $\epsilon$ is a sufficiently large positive number. If the algorithm can find a point in $R_{\xi, M}$, it may have reached an acceptable global optimal point or an approximate global optimal point.
Condition 1: $f(Z(x, \zeta)) \le f(x)$, and if $\zeta \in Y$, then $f(Z(x, \zeta)) \le f(\zeta)$.
Condition 2: If $B \subseteq Y$ s.t. $U(B) > 0$, then

$$\prod_{k=0}^{\infty} (1 - U_k(B)) = 0$$

where $U_k(B)$ denotes the probability measure of algorithm $Z$ searching a solution on set $B$ at the $k$-th iteration.
Theorem 2. 
(Necessary and sufficient conditions for global convergence.) Suppose $f$ is measurable, the measurable space $Y$ is a measurable subset of $\mathbb{R}^n$, algorithm $Z$ fulfills both Conditions 1 and 2, and $\{x_k\}_{k=0}^{\infty}$ is the solution sequence generated by algorithm $Z$. Then there is a probability measure:

$$\lim_{k \to \infty} P(x_k \in R_{\xi, M}) = 1$$

Thus, algorithm $Z$ converges globally. $P(x_k \in R_{\xi, M})$ is the probability measure of the solution $x_k$ lying in $R_{\xi, M}$ at the $k$-th iterative search step of the algorithm.

3.4.2. Convergence Analysis

Lemma 1. 
Condition 1 is met by HPSBA; that is, the hybrid algorithm's direction of population optimization is monotonic.
Lemma 2. 
The transition probability from a general state of the HPSBA population state space to the optimal state is one, that is, $\lim_{t \to \infty} P^{(t)}(\zeta_i \to \zeta_j) = 1$.
Proof. 
Assume the population state $\zeta_j$ is the optimal solution. If the algorithm converges, then after infinitely many state transitions, the probability of moving from a general state of the state space to the optimal state should be 1. Due to

$$\lim_{t \to \infty} P^{(t)}(\zeta_i \to \zeta_j) = \prod_{k=1}^{N} P^{(t)}(\zeta_i^k \to \zeta_j^k)$$

and since each iteration of the HPSBA population state is based on the transfer of individual odor toward the optimal state, that is, the position of the worst individual state of the population is updated, we therefore have

$$\lim_{t \to \infty} P^{(t)}(\zeta_i \to \zeta_j) = 1$$ □
Lemma 3. 
HPSBA satisfies Condition 2.
Theorem 3. 
HPSBA converges to the global optimum, namely $\lim_{t \to \infty} P\{X(t) \in G \mid X(0) = \Phi_0\} = 1$.
Proof. 
Since HPSBA satisfies Conditions 1 and 2, in each iteration of the algorithm the individuals follow the retention mechanism of the optimal individual. That is, as the iteration count tends to infinity, $\lim_{k \to \infty} P(x_k \in R_{\xi, M}) = 1$, where $\{x_k\}_{k=0}^{\infty}$ is the solution sequence generated by the HPSBA iteration. It can be concluded from Theorem 2 that HPSBA is a globally convergent algorithm. □

4. Analyses and Results of Numerical Optimization

Twenty-six benchmark functions, listed in Table 2, are used to test the performance of the proposed hybrid algorithm. The benchmark functions fall into two categories: F1–F15 are unimodal functions (category U), and F16–F26 are multimodal functions (category M).

4.1. Parameter Setting of Comparison Algorithms

To verify the performance of HPSBA in solving numerical optimization problems, ten algorithms are employed as competitors in the experiment. The proposed algorithm is compared with five standard algorithms (PSO [1], GWO [3], BOA [10], EO [19], and MPA [5]), four BOA variants (LBOA [34], CBOA [36], HPSOBOA [38], and IBOA [35]), and SOGWO [54].
The parameter settings of the comparison algorithms for the numerical optimization experiments are shown in Table 3. All of the experiments in this study were carried out using MATLAB 2018a on an Intel(R) Core(TM) i5-10210U CPU @ 2.11 GHz with 8 GB RAM. Furthermore, to better set the number of nodes in the node optimization coverage experiment, the dimensions of the functions are set to 30 and 100 in the numerical optimization problems, corresponding to the range of the number of nodes in the 2-D monitoring area.

4.2. Comparison Results of HPSBA with Others (Dim = 30)

In order to ensure the reasonableness and fairness of the comparison results, the dimension of the test functions in the numerical optimization experiments is set to 30, and the maximum number of iterations is set to 500. For the same test function, each algorithm is independently run 30 times, and the mean (Mean) and standard deviation (Std) are calculated from the statistics. Table 4 shows the comparison results of the eleven algorithms with Dim = 30. Results of the Wilcoxon rank-sum test calculated at a significance level of α = 0.05 are also listed in Table 4 and Table 5. The second last row indicates the number of successes (+), failures (–), and approximations (≈) of the compared algorithms with respect to HPSBA, and the last row shows the rank of the compared algorithms.

4.2.1. Analysis of the Numerical Results

As can be seen from Table 4, for the benchmark functions F1, F2, F3, F4, F6, F8, F9, F11, F14, F20, and F25, the proposed algorithm outperforms all comparative algorithms. For the F16, F17, F19, F24, and F26, HPSBA achieved the theoretical optimal value. For the F18, HPSBA, HPSOBOA and CBOA have the superior results over the other algorithms. For the F5, F21, F22, and F23, MPA have the best results. For the F7, PSO achieved the theoretical optimal value. For the F10, EO has the best result. For the F12, although the Means of GWO, EO, SOGWO, and HPSBA are the same, the Std of EO is the smallest. For the F13, GWO, EO, and SOGWO achieved the theoretical optimal value.
Although the performance of the PSO algorithm is poor, its optimization time is the shortest, which shows that its optimization speed is relatively high. The conclusion that can be drawn from the results presented in Table 4 is that the ranking of the comparison algorithms is HPSBA > MPA > EO = HPSOBOA = CBOA > LBOA = SOGWO > IBOA > GWO > PSO > BOA.

4.2.2. Convergence Behavior Analysis

Figure 7 shows the twenty-six benchmark functions in a three-dimensional visualization of the 2-D search space. Figure 8 shows the convergence curves of the comparison algorithms for the functions F1 to F4 (unimodal) and F16 to F19 and F24 (multimodal) when Dim = 30. As can be seen from Figure 8, the convergence curves confirm the superiority of HPSBA over PSO, GWO, BOA, EO, MPA, and the other comparison algorithms on functions F1 to F4. In addition, for F16, F17, and F19, HPSBA, HPSOBOA, IBOA, MPA, and EO can obtain the theoretical best value of the functions. For F24, Figure 8 shows that two comparison algorithms, HPSBA and EO, can obtain the optimal value. From the curves of the proposed HPSBA, the performance of the algorithm still needs further improvement, especially in terms of convergence speed.

4.3. Comparison Results of HPSBA with Others (Dim = 100)

Table 5 shows the comparison results of the eleven algorithms with Dim = 100. Results of the Wilcoxon rank-sum test calculated at a significance level of α = 0.05 are also listed in Table 5. The number of successes (+), failures (–), and approximations (≈) of the compared algorithms with respect to HPSBA is listed in the second last row, and the last row shows the rank of the compared algorithms.

4.3.1. Analysis of the Numerical Results

As can be seen from Table 5, for the benchmark functions F1, F3, F4, F6, F8, F9, F11, F14, F15, F20, and F25, the proposed algorithm outperforms all comparative algorithms. For the F16, F17, F19, F24, and F26, HPSBA achieved the theoretical optimal value. For the F2, CBOA has the best result. For the F18, HPSBA, HPSOBOA and CBOA have the superior results over the other algorithms. For the F5, F10, F21, and F23, HPSOBOA has the best results. For the F7 and F12, EO algorithm has the best results. For the F13, GWO, EO, and SOGWO achieved the theoretical optimal value. For the F22, PSO algorithm has the best result.
It can be seen from Table 5 that, as the problem dimension increases, the optimization speed of the PSO algorithm decreases to some extent. This shows that the difficulty of solving a problem increases with its complexity. The conclusion that can be drawn from the results presented in Table 5 is that the ranking of the comparison algorithms is HPSBA > HPSOBOA > CBOA > EO = MPA > LBOA > IBOA > PSO = SOGWO = GWO > BOA.

4.3.2. Boxplot Results Analysis

To better illustrate the stability of the comparison algorithms in solving high-dimensional optimization problems, Figure 9 presents the boxplot results of the eleven algorithms on four test functions (F3, F8, F20, and F25). Thirty independent runs of each algorithm are conducted for the same test function.

5. Nodes Coverage Optimization in WSN

The above experiments show that HPSBA performs well on numerical optimization problems, but its effectiveness on practical problems remains to be verified. In this section, we apply the proposed HPSBA, without the parameter ω in Equation (14), to the node optimization coverage (NOC) problem of the WSN. The model description and objective function of the problem are detailed in Section 2.3.

5.1. Parameter Setting and Pseudo Code of Node Coverage Using HPSBA

To confirm the performance of the HPSBA in solving the NOC problem, two groups of experiments are designed: (1) three comparison algorithms, PSO, BOA, and MBOA, are employed as competitors to study the performance of HPSBA on the node optimization coverage problem; (2) the HPSBA is applied to the optimal coverage of nodes in the presence of obstacles.
Equation (10) can be used to determine the number of sensor nodes theoretically needed to cover the area. The node coverage area, the number of sensor nodes, and the parameter settings of the simulation experiment are shown in Table 6.
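Equation (10) itself is not reproduced in this section. Purely as an illustration, a common lower bound under the binary disc sensing model divides the monitored area by the area of one sensing disc, ignoring overlap; the function name and this formula are assumptions, not necessarily the paper's Equation (10):

```python
import math

def min_nodes(width, height, r_sense):
    """Lower bound on the sensor count: monitored area divided by the area of
    one sensing disc. Overlap, which real deployments require, is ignored."""
    return math.ceil(width * height / (math.pi * r_sense ** 2))
```

For the 100 m × 100 m area with a 10 m sensing radius this bound gives 32 nodes; the experiments deploy 40 to 50 nodes because full coverage of a square by discs necessarily involves overlap.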
The simulations of node optimization coverage of the WSN are conducted using PSO, BOA, MBOA, and HPSBA. The population size and the number of iterations of the comparison algorithms are set to 30 and 150, respectively. The parameters of the comparison methods are as follows:
  • PSO-based node optimization coverage: each target node runs the PSO to become a deployed node. The parameters considered for coverage are: inertia weight ω = 0.7, cognitive and social scaling parameters c1 = c2 = 2.
  • BOA-based node optimization coverage: each target node runs the BOA to become a deployed node. The parameters considered for coverage are: switch probability SP = 0.8, sensory modality c = 1, and power exponent a = 0.1.
  • MBOA-based node optimization coverage: each target node runs the MBOA to become a deployed node. The parameters considered for coverage are: switch probability SP = 0.5, sensory modality c(0) = r1 = 0.35 with a chaotic adjustment strategy, and power exponent a = 0.1.
  • HPSBA-based node optimization coverage: each target node runs the proposed HPSBA to become a deployed node. The parameters considered for coverage are: initial inertia weight ω = 0.9, switch probability SP = 0.6, cognitive and social scaling parameters c1 = c2 = 2, a = 0.1, and c(0) = 0.35.
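The coverage-rate objective that all four strategies optimize is defined in Section 2.3. A minimal sketch of a common grid-based formulation, assuming a binary disc sensing model rather than quoting the paper's exact fitness function, is:

```python
import math

def coverage_rate(nodes, width, height, r_sense, step=1.0):
    """Fraction of grid points covered by at least one sensor.

    Binary disc model: a point is covered iff it lies within r_sense of a node.
    Grid points sit at cell centers, `step` metres apart.
    """
    covered = total = 0
    y = step / 2
    while y < height:
        x = step / 2
        while x < width:
            total += 1
            if any(math.hypot(x - nx, y - ny) <= r_sense for nx, ny in nodes):
                covered += 1
            x += step
        y += step
    return covered / total
```

Each candidate solution of the optimizer is a flat vector of 2N node coordinates (hence the 80-, 90-, and 100-dimensional problems for N = 40, 45, 50), and the fitness to maximize is the coverage rate of those nodes.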

5.2. Results Analyses of Coverage Optimization Problem

5.2.1. The Effect of the Number of Nodes on Coverage

To further test the optimization performance of the HPSBA on the coverage optimization problem, the coverage rates of the algorithms in the monitoring area are compared under different numbers of sensor nodes. Sensor nodes are deployed in a 100 m × 100 m square monitoring area, with the sensing radius set to 10 m, the communication radius set to 20 m, and the maximum number of iterations set to 150. The comparison algorithms are run with 40, 45, and 50 sensor nodes, respectively. The coverage simulation results with 45 nodes and 150 iterations are shown in Figure 10. With the other parameters unchanged, the variation of the coverage rate with the number of nodes under the different coverage strategies is shown in Figure 12.
Figure 10a shows the initial random coverage with 45 sensor nodes, and Figure 10b shows the node positions after 150 iterations of HPSBA. Using the node coordinates before and after HPSBA optimization, the minimum-spanning-tree Prim algorithm [55] is used to draw the node communication network in the coverage area, as shown in Figure 10c,d.
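Prim's algorithm grows the tree from one node, repeatedly attaching the cheapest node outside the tree. A minimal sketch over 2-D node coordinates with Euclidean edge weights (illustrative, not the paper's implementation) is:

```python
import math

def prim_mst(points):
    """Minimum spanning tree over 2-D points (Euclidean weights), Prim's algorithm.

    Returns the tree as a list of (parent, child) index pairs.
    """
    n = len(points)
    in_tree = [False] * n
    best = [math.inf] * n      # cheapest known edge weight into the tree
    parent = [-1] * n
    best[0] = 0.0              # start growing from node 0
    edges = []
    for _ in range(n):
        # Pick the cheapest node not yet in the tree.
        u = min((i for i in range(n) if not in_tree[i]), key=best.__getitem__)
        in_tree[u] = True
        if parent[u] >= 0:
            edges.append((parent[u], u))
        # Relax edge costs of the remaining nodes through u.
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < best[v]:
                    best[v], parent[v] = d, u
    return edges
```

Applied to the optimized node coordinates, the resulting edges give the communication links drawn in Figure 10c,d; shorter total edge length corresponds to lower transmission energy.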
As seen from Figure 10c,d, the communication distances between the initially deployed nodes are poorly uniform: the sink node lies at the center of the coverage area, the data transmission distances between nodes are long, and the energy consumption is high. After optimization, the communication distances between nodes are more uniform and there are multiple sink nodes located at the boundary, which enhances the reliability of the network, reduces the energy consumed by node data transmission, and extends the network lifetime.
The coverage curves of node coverage based on BOA, MBOA, PSO, and HPSBA are shown in Figure 11. The coverage effect of the BOA is poor, while HPSBA takes the lead after 10 iterations. The final coverage rates of the four algorithms after 150 iterations are 83.65%, 83.90%, 94.12%, and 96.54%, respectively.
Figure 12 illustrates the average fitness values obtained with different numbers of nodes. The proposed HPSBA coverage strategy outperforms BOA, MBOA, and PSO on the WSN node coverage problem for every number of nodes considered: the coverage rate reaches 93.15% with N = 40, 96.54% with N = 45, and 98.42% with N = 50. Compared to the standard BOA, the coverage rate is increased by 11.49, 12.89, and 12.41 percentage points for N = 40, 45, and 50, respectively. Notably, as the coverage optimization problem grows in size (80 dimensions when N = 40, 90 dimensions when N = 45, and 100 dimensions when N = 50), the advantage of HPSBA over the basic BOA becomes more pronounced.

5.2.2. The Effect of the Number of Iterations on Coverage

To verify the influence of the number of algorithm iterations on the coverage rate, we set the number of iterations to 100, 150, and 200, respectively. The optimized coverage results of the four comparison algorithms with 45 sensor nodes are shown in Table 7.
As seen from Table 7, the most effective simulation results are obtained with the proposed HPSBA: the coverage rate reaches 93.28% with Tmax = 100, 96.54% with Tmax = 150, and 96.32% with Tmax = 200. Compared to the standard BOA, the coverage rate is increased by 8.49, 12.89, and 9.90 percentage points for Tmax = 100, 150, and 200, respectively. Compared with MBOA, HPSBA increases the coverage rate by 8.17 and 12.64 percentage points, and compared with PSO by 1.08, 2.42, and 2.00 percentage points. Overall, when it comes to the WSN node coverage optimization problem, HPSBA outperforms the other three competitors.

5.2.3. Node Obstacle Avoidance Coverage Based on HPSBA

The previous experimental results show that the proposed HPSBA has a positive effect on two-dimensional node coverage of the sensor network. To verify the effectiveness of HPSBA on the node obstacle-avoidance coverage problem, a node coverage experiment with an obstacle is designed: the coverage area is a two-dimensional plane of 100 m × 100 m, the number of sensor nodes is 40, the sensing radius rs is 10 m, the communication radius rc is 20 m, the number of iterations is 200, and the obstacle is a rectangular area of 20 m × 20 m. The coverage optimization results of the obstacle-avoidance nodes based on HPSBA and the coverage curve over 200 iterations are depicted in Figure 13.
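The paper does not spell out how candidate node positions that fall inside the obstacle are handled during the search. One simple repair strategy, shown here purely as an assumption and not as the authors' method, clamps such a node to the nearest point on the obstacle boundary:

```python
def push_out_of_obstacle(node, obstacle):
    """If a candidate node lies strictly inside the rectangular obstacle,
    move it to the nearest point on the obstacle boundary; otherwise
    return it unchanged. obstacle = (x_min, y_min, x_max, y_max)."""
    x, y = node
    x0, y0, x1, y1 = obstacle
    if not (x0 < x < x1 and y0 < y < y1):
        return node
    # Candidate projections onto the four edges; pick the closest one.
    candidates = [(x0, y), (x1, y), (x, y0), (x, y1)]
    return min(candidates, key=lambda p: abs(p[0] - x) + abs(p[1] - y))
```

Applying such a repair after each position update keeps every deployed node in the feasible region while leaving nodes outside the obstacle untouched.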
As seen from Figure 13, HPSBA performs well on obstacle-avoidance node coverage. After 200 iterations, the node coverage of the designated area increases from 77.01% to 98.67%, a gain of 21.66 percentage points, with a time consumption of 21.34 s.

6. Conclusions

Aiming at the uneven node distribution and low coverage of random deployment in sensor networks, a hybrid particle swarm butterfly algorithm (HPSBA) is proposed for deploying WSN nodes. HPSBA improves the convergence speed through a Logistic map and adaptive adjustment strategies, and its convergence accuracy is also improved. In optimization experiments on twenty-six benchmark functions against ten comparison swarm intelligence algorithms, the results show that HPSBA's optimization ability and convergence accuracy are improved and its stability is enhanced.
For the WSN node coverage problem, HPSBA effectively balances the global exploration and local exploitation capabilities of the algorithm. Compared with the other algorithms, HPSBA improves the coverage of WSN nodes while using fewer nodes, thus reducing the configuration cost of the network. However, real node coverage usually also has to account for the energy of the nodes. More advanced algorithms [56] will be considered in future work.
We will concentrate on the following tasks in future work: (i) given the high complexity of its main framework, we will further develop HPSBA while guaranteeing optimization precision; (ii) the proposed HPSBA will be applied to multi-objective optimization problems, such as energy, distance, and uniformity between nodes for WSNs in a three-dimensional environment.

Author Contributions

Methodology, M.Z.; software, M.Z.; writing—original draft preparation, M.Z. and J.Y.; supervision, D.W. and M.Y.; funding acquisition, D.W., W.T. and J.Y. All authors have read and agreed to the published version of the manuscript.

Funding

The work was supported by NNSF of China (No.61640014), Industrial Project of Guizhou province (No. Qiankehe Zhicheng [2022]017, [2019]2152), Engineering Research Center of Guizhou Education Department (No. Qianjiaoji[2022]043), Innovation group of Guizhou Education Department under Grant Qianjiaohe (No.KY[2021]012), Science and Technology Fund of Guizhou Province under Grant Qiankehe (No.[2020]1Y266), Qiankehejichu [No.ZK[2022]Yiban103], Science and Technology Foundation of Guizhou University (Guidateganghezi [2021]04).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  2. Gandomi, A.H.; Yang, X.S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35. [Google Scholar]
  3. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar]
  4. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar]
  5. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine Predators Algorithm: A nature-inspired metaheuristic. Expert Syst. Appl. 2020, 152, 113377. [Google Scholar]
  6. Dorigo, M.; Di Caro, G. Ant colony optimization: A new meta-heuristic. In Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406), Washington, DC, USA, 6–9 July 1999; Volume 2, pp. 1470–1477. [Google Scholar]
  7. Yang, X.S. Nature-Inspired Metaheuristic Algorithms; Luniver Press: Frome, UK, 2010. [Google Scholar]
  8. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar]
  9. Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper optimisation algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47. [Google Scholar]
  10. Arora, S.; Singh, S. Butterfly optimization algorithm: A novel approach for global optimization. Soft Comput. 2019, 23, 715–734. [Google Scholar]
  11. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar]
  12. Storn, R.; Price, K. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar]
  13. Simon, D. Biogeography-based optimization. IEEE Trans. Evol. Comput. 2008, 12, 702–713. [Google Scholar]
  14. Koza, J.R. Genetic Programming II: Automatic Discovery of Reusable Programs; MIT Press: Cambridge, MA, USA, 1994. [Google Scholar]
  15. Kirkpatrick, S.; Gelatt, C.D., Jr.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680. [Google Scholar]
  16. Rashedi, E.; Nezamabadi-pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  17. Geem, Z.W.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68. [Google Scholar]
  18. Mirjalili, S. SCA: A sine cosine algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar]
  19. Faramarzi, A.; Heidarinejad, M.; Stephens, B.; Mirjalili, S. Equilibrium optimizer: A novel optimization algorithm. Knowl.-Based Syst. 2020, 191, 105190. [Google Scholar]
  20. Ahmadianfar, I.; Bozorg-Haddad, O.; Chu, X. Gradient-based optimizer: A new metaheuristic optimization algorithm. Inf. Sci. 2020, 540, 131–159. [Google Scholar]
  21. Rao, R.V.; Savsani, V.J.; Vakharia, D. Teaching–learning-based optimization: An optimization method for continuous non-linear large scale problems. Inf. Sci. 2012, 183, 1–15. [Google Scholar]
  22. Glover, F. Tabu search: A tutorial. Interfaces 1990, 20, 74–94. [Google Scholar]
  23. Kumar, M.; Kulkarni, A.J.; Satapathy, S.C. Socio evolution & learning optimization algorithm: A socio-inspired optimization methodology. Future Gener. Comput. Syst. 2018, 81, 252–272. [Google Scholar]
  24. Askari, Q.; Younas, I.; Saeed, M. Political Optimizer: A novel socio-inspired meta-heuristic for global optimization. Knowl.-Based Syst. 2020, 195, 105709. [Google Scholar]
  25. Tu, J.; Chen, H.; Wang, M.; Gandomi, A.H. The colony predation algorithm. J. Bionic Eng. 2021, 18, 674–710. [Google Scholar]
  26. Li, C.; Chen, G.; Liang, G.; Luo, F.; Zhao, J.; Dong, Z.Y. Integrated optimization algorithm: A metaheuristic approach for complicated optimization. Inf. Sci. 2022, 586, 424–449. [Google Scholar]
  27. Yang, X.S. Flower pollination algorithm for global optimization. In International Conference on Unconventional Computing and Natural Computation; Springer: Berlin/Heidelberg, Germany, 2012; pp. 240–249. [Google Scholar]
  28. Kiran, M.S. TSA: Tree-seed algorithm for continuous optimization. Expert Syst. Appl. 2015, 42, 6686–6698. [Google Scholar]
  29. Liu, H.; Cai, Z.; Wang, Y. Hybridizing particle swarm optimization with differential evolution for constrained numerical and engineering optimization. Appl. Soft Comput. 2010, 10, 629–640. [Google Scholar]
  30. Razmjooy, N.; Ramezani, M. Training wavelet neural networks using hybrid particle swarm optimization and gravitational search algorithm for system identification. Int. J. Mechatronics, Electr. Comput. Technol. 2016, 6, 2987–2997. [Google Scholar]
  31. Garg, H. A hybrid GSA-GA algorithm for constrained optimization problems. Inf. Sci. 2019, 478, 499–523. [Google Scholar]
  32. Gong, Y.J.; Li, J.J.; Zhou, Y.; Li, Y.; Chung, H.S.H.; Shi, Y.H.; Zhang, J. Genetic learning particle swarm optimization. IEEE Trans. Cybern. 2015, 46, 2277–2290. [Google Scholar]
  33. El Sehiemy, R.A.; Selim, F.; Bentouati, B.; Abido, M. A novel multi-objective hybrid particle swarm and salp optimization algorithm for technical-economical-environmental operation in power systems. Energy 2020, 193, 116817. [Google Scholar]
  34. Arora, S.; Singh, S. Node localization in wireless sensor networks using butterfly optimization algorithm. Arab. J. Sci. Eng. 2017, 42, 3325–3335. [Google Scholar]
  35. Tan, L.S.; Zainuddin, Z.; Ong, P. Wavelet neural networks based solutions for elliptic partial differential equations with improved butterfly optimization algorithm training. Appl. Soft Comput. 2020, 95, 106518. [Google Scholar]
  36. Zhi, Y.; Weiqing, W.; Haiyun, W.; Khodaei, H. Improved butterfly optimization algorithm for CCHP driven by PEMFC. Appl. Therm. Eng. 2020, 173, 114766. [Google Scholar]
  37. Shams, I.; Mekhilef, S.; Tey, K.S. Maximum power point tracking using modified butterfly optimization algorithm for partial shading, uniform shading, and fast varying load conditions. IEEE Trans. Power Electron. 2020, 36, 5569–5581. [Google Scholar]
  38. Mengjian, Z.; Daoyin, L.; Qin, T.; Jing, Y. A Chaotic Hybrid Butterfly Optimization Algorithm with Particle Swarm Optimization for High-Dimensional Optimization Problems. Symmetry 2020, 12, 1800. [Google Scholar] [CrossRef]
  39. Akyildiz, I.F.; Su, W.; Sankarasubramaniam, Y.; Cayirci, E. Wireless sensor networks: A survey. Comput. Netw. 2002, 38, 393–422. [Google Scholar]
  40. Rashid, B.; Rehmani, M.H. Applications of wireless sensor networks for urban areas: A survey. J. Netw. Comput. Appl. 2016, 60, 192–219. [Google Scholar]
  41. Liao, W.H.; Kao, Y.; Wu, R.T. Ant colony optimization based sensor deployment protocol for wireless sensor networks. Expert Syst. Appl. 2011, 38, 6599–6605. [Google Scholar]
  42. Adulyasas, A.; Sun, Z.; Wang, N. Connected Coverage Optimization for Sensor Scheduling in Wireless Sensor Networks. IEEE Sensors J. 2015, 15, 3877–3892. [Google Scholar] [CrossRef]
  43. Wang, X.; Zhang, H.; Fan, S.; Gu, H. Coverage Control of Sensor Networks in IoT Based on RPSO. IEEE Internet Things J. 2018, 5, 3521–3532. [Google Scholar] [CrossRef]
  44. Yang, M.; Wang, A.; Sun, G.; Zhang, Y. Deploying charging nodes in wireless rechargeable sensor networks based on improved firefly algorithm. Comput. Electr. Eng. 2018, 72, 719–731. [Google Scholar] [CrossRef]
  45. Miao, Z.; Yuan, X.; Zhou, F.; Qiu, X.; Song, Y.; Chen, K. Grey wolf optimizer with an enhanced hierarchy and its application to the wireless sensor network coverage optimization problem. Appl. Soft Comput. 2020, 96, 106602. [Google Scholar] [CrossRef]
  46. Wang, S.; You, H.; Yue, Y.; Cao, L. A novel topology optimization of coverage-oriented strategy for wireless sensor networks. Int. J. Distrib. Sens. Netw. 2021, 17, 1550147721992298. [Google Scholar] [CrossRef]
  47. Dao, T.K.; Chu, S.C.; Nguyen, T.T.; Nguyen, T.D.; Nguyen, V.T. An Optimal WSN Node Coverage Based on Enhanced Archimedes Optimization Algorithm. Entropy 2022, 24, 1018. [Google Scholar] [CrossRef]
  48. Abdollahzadeh, S.; Navimipour, N.J. Deployment strategies in the wireless sensor network: A comprehensive review. Comput. Commun. 2016, 91–92, 1–16. [Google Scholar] [CrossRef]
  49. Zhang, M.; Yang, J.; Qin, T. An Adaptive Three-Dimensional Improved Virtual Force Coverage Algorithm for Nodes in WSN. Axioms 2022, 11, 199. [Google Scholar] [CrossRef]
  50. Zhang, M.; Wang, D.; Yang, J. Hybrid-Flash Butterfly Optimization Algorithm with Logistic Mapping for Solving the Engineering Constrained Optimization Problems. Entropy 2022, 24, 525. [Google Scholar] [CrossRef]
  51. May, R.M. Simple mathematical models with very complicated dynamics. Nature 1976, 261, 459–467. [Google Scholar] [CrossRef]
  52. Solis, F.J.; Wets, R.J.B. Minimization by random search techniques. Math. Oper. Res. 1981, 6, 19–30. [Google Scholar] [CrossRef]
  53. Zhang, M.-J.; Long, D.-Y.; Wang, X.; Yang, J. Research on Convergence of Grey Wolf Optimization Algorithm Based on Markov Chain. Acta Electron. Sin. 2020, 48, 1587–1595. [Google Scholar] [CrossRef]
  54. Dhargupta, S.; Ghosh, M.; Mirjalili, S.; Sarkar, R. Selective Opposition based Grey Wolf Optimization. Expert Syst. Appl. 2020, 151, 113389. [Google Scholar] [CrossRef]
  55. Knowles, J.; Corne, D. A new evolutionary approach to the degree-constrained minimum spanning tree problem. IEEE Trans. Evol. Comput. 2000, 4, 125–134. [Google Scholar] [CrossRef]
  56. LaTorre, A.; Molina, D.; Osaba, E.; Poyatos, J.; Del Ser, J.; Herrera, F. A prescription of methodological guidelines for comparing bio-inspired optimization algorithms. Swarm Evol. Comput. 2021, 67, 100973. [Google Scholar]
Figure 1. Classification of heuristic optimization algorithms.
Figure 2. Food foraging of butterflies.
Figure 3. Node coverage diagram.
Figure 4. Chaotic sequence with different c(0) values.
Figure 5. The optimization process of HPSBA.
Figure 6. The flowchart of the proposed HPSBA.
Figure 7. Search space of twenty-six benchmark functions.
Figure 8. Convergence curves of comparison algorithms when Dim = 30.
Figure 9. Box plot results for the function F3, F8, F20 and F25.
Figure 10. Node coverage and communication distribution diagram.
Figure 11. Coverage comparison curves and optimized coverage time.
Figure 12. Coverage rates of different number of nodes.
Figure 13. Coverage of obstacle avoidance nodes based on HPSBA.
Table 1. Optimal value with different c(0) values.

| c(0) | Schwefel 1.2 Mean | Schwefel 1.2 Std | Schwefel 1.2 Time/s | Solomon Mean | Solomon Std | Solomon Time/s |
|------|-------------------|------------------|---------------------|--------------|-------------|----------------|
| 0.15 | 5.28E-299 | 0 | 1.15 | 3.85E-301 | 0 | 0.17 |
| 0.25 | 4.75E-300 | 0 | 1.11 | 1.49E-223 | 0 | 0.17 |
| 0.35 | 7.70E-300 | 0 | 1.12 | 5.73E-301 | 0 | 0.17 |
| 0.45 | 3.49E-299 | 0 | 1.11 | 1.87E-301 | 0 | 0.17 |
| 0.55 | 4.54E-299 | 0 | 1.13 | 2.40E-301 | 0 | 0.17 |
| 0.65 | 3.16E-299 | 0 | 1.11 | 8.90E-301 | 0 | 0.17 |
| 0.75 | 7.87E-298 | 0 | 1.11 | 9.75E-222 | 0 | 0.17 |
| 0.85 | 4.82E-299 | 0 | 1.12 | 4.40E-301 | 0 | 0.17 |
| 0.95 | 3.16E-299 | 0 | 1.12 | 1.93E-301 | 0 | 0.17 |
Table 2. Twenty-six benchmark functions.

| Name | Formula | Search Range | Dim | $f_{\min}$ | Category |
|------|---------|--------------|-----|------------|----------|
| Sphere | $F_1=\sum_{i=1}^{Dim}x_i^2$ | [−100, 100] | 30/100 | 0 | U |
| Schwefel 2.22 | $F_2=\sum_{i=1}^{Dim}\lvert x_i\rvert+\prod_{i=1}^{Dim}\lvert x_i\rvert$ | [−10, 10] | 30/100 | 0 | U |
| Schwefel 1.2 | $F_3=\sum_{i=1}^{Dim}\left(\sum_{j=1}^{i}x_j\right)^2$ | [−10, 10] | 30/100 | 0 | U |
| Schwefel 2.21 | $F_4=\max_i\{\lvert x_i\rvert,\;1\le i\le Dim\}$ | [−10, 10] | 30/100 | 0 | U |
| Step | $F_5=\sum_{i=1}^{Dim}\left(\lfloor x_i+0.5\rfloor\right)^2$ | [−10, 10] | 30/100 | 0 | U |
| Quartic | $F_6=\sum_{i=1}^{Dim}i\,x_i^4+\mathrm{rand}(0,1)$ | [−1.28, 1.28] | 30/100 | 0 | U |
| Exponential | $F_7=\exp\left(0.5\sum_{i=1}^{Dim}x_i\right)$ | [−10, 10] | 30/100 | 0 | U |
| Sum Power | $F_8=\sum_{i=1}^{Dim}\lvert x_i\rvert^{(i+1)}$ | [−1, 1] | 30/100 | 0 | U |
| Sum Square | $F_9=\sum_{i=1}^{Dim}i\,x_i^2$ | [−10, 10] | 30/100 | 0 | U |
| Rosenbrock | $F_{10}=\sum_{i=1}^{Dim-1}\left[100\left(x_{i+1}-x_i^2\right)^2+\left(x_i-1\right)^2\right]$ | [−10, 10] | 30/100 | 0 | U |
| Zakharov | $F_{11}=\sum_{i=1}^{Dim}x_i^2+\left(\sum_{i=1}^{Dim}0.5\,i\,x_i\right)^2+\left(\sum_{i=1}^{Dim}0.5\,i\,x_i\right)^4$ | [−5.12, 5.12] | 30/100 | 0 | U |
| Trid | $F_{12}=(x_1-1)^2+\sum_{i=2}^{Dim}i\left(2x_i^2-x_{i-1}\right)^2$ | [−5, 5] | 30/100 | 0 | U |
| Elliptic | $F_{13}=\sum_{i=1}^{Dim}\left(10^6\right)^{\frac{i-1}{Dim-1}}x_i^2$ | [−100, 100] | 30/100 | 0 | U |
| Cigar | $F_{14}=x_1^2+10^6\sum_{i=2}^{Dim}x_i^2$ | [−100, 100] | 30/100 | 0 | U |
| Tablet | $F_{15}=10^6x_1^2+\sum_{i=2}^{Dim}x_i^6$ | [−10, 10] | 30/100 | 0 | U |
| Rastrigin | $F_{16}=\sum_{i=1}^{Dim}\left[x_i^2-10\cos(2\pi x_i)+10\right]$ | [−5.12, 5.12] | 30/100 | 0 | M |
| NCRastrigin | $F_{17}=\sum_{i=1}^{Dim}\left[y_i^2-10\cos(2\pi y_i)+10\right],\; y_i=\begin{cases}x_i,&\lvert x_i\rvert<0.5\\\mathrm{round}(2x_i)/2,&\lvert x_i\rvert\ge 0.5\end{cases}$ | [−5.12, 5.12] | 30/100 | 0 | M |
| Ackley | $F_{18}=-20\exp\left(-0.2\sqrt{\frac{1}{Dim}\sum_{i=1}^{Dim}x_i^2}\right)-\exp\left(\frac{1}{Dim}\sum_{i=1}^{Dim}\cos(2\pi x_i)\right)+20+e$ | [−20, 20] | 30/100 | 0 | M |
| Griewank | $F_{19}=\frac{1}{4000}\sum_{i=1}^{Dim}x_i^2-\prod_{i=1}^{Dim}\cos\left(\frac{x_i}{\sqrt{i}}\right)+1$ | [−600, 600] | 30/100 | 0 | M |
| Alpine | $F_{20}=\sum_{i=1}^{Dim}\lvert x_i\sin(x_i)+0.1x_i\rvert$ | [−10, 10] | 30/100 | 0 | M |
| Penalized 1 | $F_{21}=\frac{\pi}{Dim}\left\{10\sin^2(\pi y_1)+\sum_{i=1}^{Dim-1}(y_i-1)^2\left[1+10\sin^2(\pi y_{i+1})\right]+(y_{Dim}-1)^2\right\}+\sum_{i=1}^{Dim}u(x_i,10,100,4)$, with $y_i=1+\frac{x_i+1}{4}$, $u(x_i,a,k,m)=\begin{cases}k(x_i-a)^m,&x_i>a\\0,&-a\le x_i\le a\\k(-x_i-a)^m,&x_i<-a\end{cases}$ | [−10, 10] | 30/100 | 0 | M |
| Penalized 2 | $F_{22}=0.1\left\{\sin^2(3\pi x_1)+\sum_{i=1}^{Dim-1}(x_i-1)^2\left[1+\sin^2(3\pi x_{i+1})\right]+(x_{Dim}-1)^2\left[1+\sin^2(2\pi x_{Dim})\right]\right\}+\sum_{i=1}^{Dim}u(x_i,5,100,4)$ | [−5, 5] | 30/100 | 0 | M |
| Levy | $F_{23}=\sin^2(3\pi x_1)+\sum_{i=1}^{Dim-1}(x_i-1)^2\left[1+\sin^2(3\pi x_{i+1})\right]+\lvert x_{Dim}-1\rvert\left[1+\sin^2(2\pi x_{Dim})\right]$ | [−2, 2] | 30/100 | 0 | M |
| Weierstrass | $F_{24}=\sum_{i=1}^{Dim}\sum_{k=0}^{k_{\max}}a^k\cos\left(2\pi b^k(x_i+0.5)\right)-Dim\sum_{k=0}^{k_{\max}}a^k\cos\left(2\pi b^k\cdot 0.5\right),\; a=0.5,\,b=3,\,k_{\max}=20$ | [−1, 1] | 30/100 | 0 | M |
| Solomon | $F_{25}=1-\cos\left(2\pi\sqrt{\sum_{i=1}^{Dim}x_i^2}\right)+0.1\sqrt{\sum_{i=1}^{Dim}x_i^2}$ | [−20, 20] | 30/100 | 0 | M |
| Bohachevsky | $F_{26}=\sum_{i=1}^{Dim-1}\left[x_i^2+2x_{i+1}^2-0.3\cos(3\pi x_i)-0.4\cos(4\pi x_{i+1})+0.7\right]$ | [−5, 5] | 30/100 | 0 | M |
Table 3. Parameter settings of the comparison algorithms.

| Algorithms | Parameter Settings |
|------------|--------------------|
| PSO | N = 30, c1 = c2 = 2, vmax = 1, vmin = −1, ω = 0.7 |
| GWO | N = 30, a_first = 2, a_final = 0 |
| BOA | N = 30, a = 0.1, c(0) = 0.01, SP = 0.6 |
| EO | N = 25, a1 = 2, a2 = 1, GP = 0.5, λ ∈ (0, 1) |
| MPA | N = 30, p = 0.5, FADs = 0.2 |
| LBOA | N = 30, a = 0.1, c(0) = 0.01, p = 0.6, γ = 1.5 |
| CBOA | N = 30, a(0) = 0.1, c(0) = 0.01, p = 0.6, r(0) = 0.33, μ = 4 |
| HPSOBOA | N = 30, a_first = 0.1, a_final = 0.3, c(0) = 0.01, p = 0.6, x(0) = 0.315, ρ = 0.295, c1 = c2 = 0.5 |
| IBOA | N = 30, a = 0.1, c(0) = 0.01, and p is dynamic |
| SOGWO | N = 50, a_first = 2, a_final = 0 |
| HPSBA | N = 30, a = 0.1, c(0) = 0.35, SP = 0.6, μ = 4, ωu = 0.9, ωl = 0.2, C1 = C2 = 2 |
Table 4. Comparison results of eleven algorithms: Dim = 30.

| Function | | PSO | GWO | BOA | EO | MPA | LBOA | CBOA | HPSOBOA | IBOA | SOGWO | HPSBA |
|----------|------|-----|-----|-----|----|-----|------|------|---------|------|-------|-------|
| F1 | Mean | 1.16E-01 | 1.87E-27 | 7.70E-11 | 3.35E-40 | 5.72E-23 | 3.54E-12 | 2.20E-30 | 3.59E-152 | 7.47E-15 | 3.46E-33 | 3.29E-252 |
| | Std | 3.89E-02 | 3.11E-27 | 6.78E-12 | 1.70E-39 | 5.83E-23 | 3.57E-12 | 4.29E-30 | 7.89E-153 | 1.72E-15 | 7.28E-33 | 0.00E+00 |
| F2 | Mean | 6.35E-01 | 9.50E-17 | 2.35E-08 | 6.78E-24 | 2.53E-13 | 1.24E-09 | 3.92E-19 | 5.04E-60 | 4.97E+12 | 8.25E-20 | 2.52E-134 |
| | Std | 2.05E-01 | 7.46E-17 | 6.49E-09 | 6.18E-24 | 2.68E-13 | 2.10E-09 | 6.67E-19 | 2.06E-59 | 1.22E+13 | 9.41E-20 | 6.87E-134 |
| F3 | Mean | 4.16E+00 | 7.10E-08 | 5.31E-11 | 2.38E-11 | 2.71E-06 | 2.77E-12 | 1.11E-30 | 4.05E-153 | 9.32E-15 | 1.97E-09 | 7.60E-300 |
| | Std | 9.64E-01 | 1.69E-07 | 5.83E-12 | 1.04E-10 | 7.11E-06 | 2.71E-12 | 3.47E-30 | 1.13E-153 | 1.11E-15 | 8.79E-09 | 0.00E+00 |
| F4 | Mean | 3.36E-01 | 7.05E-08 | 2.65E-08 | 2.10E-11 | 3.39E-10 | 2.48E-09 | 1.48E-19 | 1.05E-77 | 8.05E-12 | 1.82E-09 | 6.02E-152 |
| | Std | 4.82E-02 | 4.84E-08 | 2.94E-09 | 4.31E-11 | 2.12E-10 | 3.13E-09 | 2.49E-19 | 7.23E-79 | 1.06E-12 | 1.47E-09 | 1.69E-152 |
| F5 | Mean | 7.12E-02 | 5.40E-01 | 5.23E+00 | 7.61E-06 | 3.42E-08 | 3.40E+00 | 4.55E+00 | 4.12E-02 | 3.36E+00 | 3.46E-01 | 5.38E+00 |
| | Std | 3.20E-02 | 3.28E-01 | 6.84E-01 | 6.22E-06 | 1.73E-08 | 6.50E-01 | 5.83E-01 | 2.40E-02 | 7.86E-01 | 2.66E-01 | 6.36E-01 |
| F6 | Mean | 2.60E-01 | 1.79E-03 | 1.99E-03 | 1.36E-03 | 1.28E-03 | 1.92E-03 | 1.17E-04 | 2.31E-04 | 3.10E-04 | 1.20E-03 | 9.21E-05 |
| | Std | 8.64E-02 | 9.30E-04 | 5.51E-04 | 9.12E-04 | 7.43E-04 | 9.45E-04 | 1.17E-04 | 3.77E-04 | 2.58E-04 | 5.60E-04 | 9.73E-05 |
| F7 | Mean | 0.00E+00 | 3.19E-58 | 4.94E-11 | 7.18E-66 | 7.18E-66 | 6.36E-21 | 3.84E-19 | 1.53E-62 | 7.09E-14 | 8.16E-61 | 8.51E-16 |
| | Std | 0.00E+00 | 1.20E-57 | 1.34E-10 | 1.02E-78 | 1.40E-69 | 2.34E-20 | 1.16E-18 | 6.12E-63 | 3.06E-13 | 4.36E-60 | 4.27E-15 |
| F8 | Mean | 7.23E-07 | 1.75E-95 | 8.88E-14 | 1.97E-134 | 1.98E-60 | 8.27E-16 | 1.46E-36 | 1.02E-156 | 4.45E-19 | 2.00E-116 | 1.70E-307 |
| | Std | 1.17E-06 | 9.25E-95 | 5.52E-14 | 9.42E-134 | 4.89E-60 | 9.08E-16 | 6.48E-36 | 8.09E-158 | 2.44E-19 | 9.70E-116 | 0.00E+00 |
| F9 | Mean | 7.51E-01 | 2.35E-28 | 6.94E-11 | 1.36E-41 | 4.83E-24 | 3.11E-12 | 8.60E-31 | 2.02E-152 | 9.61E-15 | 2.03E-34 | 3.19E-263 |
| | Std | 2.87E-01 | 4.02E-28 | 8.22E-12 | 3.56E-41 | 6.23E-24 | 4.11E-12 | 1.77E-30 | 2.89E-153 | 1.34E-15 | 2.28E-34 | 0.00E+00 |
| F10 | Mean | 5.99E+01 | 2.72E+01 | 2.89E+01 | 2.53E+01 | 2.51E+01 | 2.88E+01 | 2.89E+01 | 2.71E+01 | 2.89E+01 | 2.68E+01 | 2.89E+01 |
| | Std | 3.87E+01 | 8.55E-01 | 2.70E-02 | 1.54E-01 | 3.89E-01 | 3.25E-02 | 3.73E-02 | 6.30E+00 | 3.53E-02 | 8.00E-01 | 3.37E-02 |
| F11 | Mean | 1.47E+00 | 3.17E-28 | 6.67E-11 | 3.42E-41 | 1.23E-23 | 3.52E-12 | 2.81E-30 | 6.89E-153 | 8.32E-15 | 1.18E-33 | 1.28E-252 |
| | Std | 8.03E-01 | 5.15E-28 | 7.26E-12 | 1.26E-40 | 2.24E-23 | 3.22E-12 | 7.66E-30 | 1.17E-153 | 1.52E-15 | 2.59E-33 | 0.00E+00 |
| F12 | Mean | 4.22E+00 | 6.67E-01 | 9.74E-01 | 6.67E-01 | 6.67E-01 | 9.18E-01 | 9.76E-01 | 1.00E+00 | 9.35E-01 | 6.67E-01 | 6.67E-01 |
| | Std | 1.82E+00 | 3.76E-05 | 8.43E-03 | 3.08E-10 | 3.87E-08 | 2.48E-02 | 9.00E-03 | 1.25E-05 | 1.81E-02 | 4.37E-06 | 1.86E-04 |
| F13 | Mean | 6.08E-31 | 0.00E+00 | 2.80E-21 | 0.00E+00 | 3.55E-174 | 5.49E-26 | 1.77E-34 | 2.30E-148 | 8.87E-31 | 0.00E+00 | 2.44E-302 |
| | Std | 2.42E-30 | 0.00E+00 | 8.91E-21 | 0.00E+00 | 0.00E+00 | 1.14E-25 | 5.23E-34 | 1.14E-147 | 4.33E-30 | 0.00E+00 | 0.00E+00 |
| F14 | Mean | 1.16E-24 | 2.82E-205 | 1.92E-17 | 7.30E-207 | 1.34E-61 | 3.31E-18 | 3.49E-31 | 1.95E-147 | 5.73E-23 | 4.59E-228 | 6.12E-296 |
| | Std | 2.26E-24 | 0.00E+00 | 2.08E-17 | 0.00E+00 | 7.35E-61 | 4.01E-18 | 7.94E-31 | 4.53E-147 | 1.22E-22 | 0.00E+00 | 0.00E+00 |
| F15 | Mean | 3.02E-30 | 6.90E-261 | 4.54E-19 | 8.38E-255 | 8.23E-94 | 1.65E-19 | 1.09E-34 | 1.92E-153 | 3.69E-22 | 1.06E-313 | 3.61E-304 |
| | Std | 1.65E-29 | 0.00E+00 | 8.61E-19 | 0.00E+00 | 3.33E-93 | 3.98E-19 | 5.42E-34 | 6.79E-153 | 7.04E-22 | 0.00E+00 | 0.00E+00 |
| F16 | Mean | 2.37E+02 | 4.02E+00 | 6.54E+01 | 1.89E-15 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 2.27E+00 | 0.00E+00 |
| | Std | 5.65E+01 | 3.88E+00 | 9.09E+01 | 1.04E-14 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 3.20E+00 | 0.00E+00 |
| F17 | Mean | 2.76E+02 | 8.31E+00 | 1.24E+02 | 2.33E-01 | 3.96E-07 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 8.87E+00 | 0.00E+00 |
| | Std | 7.80E+01 | 4.39E+00 | 7.02E+01 | 6.26E-01 | 2.17E-06 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 5.03E+00 | 0.00E+00 |
| F18 | Mean | 2.47E-01 | 9.05E-14 | 2.75E-08 | 8.47E-15 | 8.53E-13 | 2.47E-09 | 8.88E-16 | 8.88E-16 | 7.10E-12 | 4.14E-14 | 8.88E-16 |
| | Std | 7.83E-02 | 1.67E-14 | 2.47E-09 | 1.80E-15 | 5.41E-13 | 1.38E-09 | 0.00E+00 | 0.00E+00 | 7.10E-13 | 2.89E-15 | 0.00E+00 |
| F19 | Mean | 3.41E+01 | 3.23E-03 | 9.73E-12 | 0.00E+00 | 0.00E+00 | 1.79E-13 | 0.00E+00 | 0.00E+00 | 4.18E-16 | 2.70E-03 | 0.00E+00 |
| | Std | 5.57E+00 | 8.81E-03 | 1.06E-11 | 0.00E+00 | 0.00E+00 | 3.96E-13 | 0.00E+00 | 0.00E+00 | 1.90E-15 | 5.90E-03 | 0.00E+00 |
| F20 | Mean | 1.53E-01 | 4.74E-04 | 3.47E-09 | 6.29E-09 | 6.61E-14 | 6.42E-14 | 1.00E-19 | 9.42E-60 | 6.67E-12 | 3.52E-04 | 7.05E-136 |
| | Std | 9.79E-02 | 7.66E-04 | 7.60E-09 | 3.44E-08 | 4.58E-14 | 1.86E-13 | 1.28E-19 | 2.99E-59 | 7.69E-13 | 5.84E-04 | 3.83E-135 |
| F21 | Mean | 7.09E+00 | 5.03E-02 | 5.39E-01 | 3.46E-03 | 7.59E-05 | 2.90E-01 | 4.78E-01 | 2.47E-03 | 1.48E+00 | 3.38E-02 | 5.49E-01 |
| | Std | 3.04E+00 | 2.12E-02 | 1.58E-01 | 1.89E-02 | 4.15E-04 | 9.21E-02 | 1.36E-01 | 2.64E-03 | 2.31E-01 | 1.49E-02 | 1.37E-01 |
| F22 | Mean | 8.06E-03 | 7.07E-01 | 3.40E+00 | 2.18E-02 | 3.45E-03 | 2.37E+00 | 3.00E+00 | 4.07E+00 | 2.63E+00 | 5.15E-01 | 3.42E+00 |
| | Std | 4.77E-03 | 2.10E-01 | 4.83E-01 | 4.72E-02 | 1.65E-02 | 6.28E-01 | 5.74E-01 | 2.15E+00 | 5.90E-01 | 1.88E-01 | 5.79E-01 |
| F23 | Mean | 3.88E-01 | 1.67E+00 | 1.18E+01 | 1.52E-01 | 1.38E-01 | 9.31E+00 | 1.01E+01 | 8.89E-01 | 1.06E+01 | 1.25E+00 | 1.09E+01 |
| | Std | 2.43E-01 | 1.02E+00 | 2.10E+00 | 3.22E-01 | 1.89E-01 | 2.67E+00 | 2.71E+00 | 1.18E+00 | 1.94E+00 | 8.19E-01 | 3.23E+00 |
| F24 | Mean | 5.70E+00 | 4.93E+00 | 1.23E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 4.61E+00 | 0.00E+00 |
| | Std | 2.76E+00 | 2.04E+00 | 2.35E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 1.67E+00 | 0.00E+00 |
| F25 | Mean | 7.11E-01 | 2.79E-01 | 7.66E-01 | 9.95E-02 | 9.95E-02 | 2.99E-02 | 1.68E-32 | 6.43E-02 | 9.95E-02 | 2.89E-01 | 4.07E-301 |
| | Std | 3.41E-01 | 1.49E-01 | 2.18E-01 | 2.08E-12 | 7.05E-17 | 4.64E-02 | 3.64E-32 | 4.99E-02 | 1.24E-06 | 1.46E-01 | 0.00E+00 |
| F26 | Mean | 1.24E+00 | 0.00E+00 | 8.02E-11 | 0.00E+00 | 0.00E+00 | 4.45E-12 | 0.00E+00 | 0.00E+00 | 9.79E-15 | 0.00E+00 | 0.00E+00 |
| | Std | 5.92E-01 | 0.00E+00 | 8.59E-12 | 0.00E+00 | 0.00E+00 | 5.45E-12 | 0.00E+00 | 0.00E+00 | 1.33E-15 | 0.00E+00 | 0.00E+00 |
| +/-/≈ | | 1/25/0 | 0/24/2 | 0/26/0 | 1/20/5 | 4/18/4 | 0/23/3 | 0/20/6 | 0/20/6 | 0/23/3 | 1/22/3 | |
| Rank | | 7 | 6 | 8 | 3 | 2 | 4 | 3 | 3 | 5 | 4 | 1 |
Table 5. Comparison results of eleven algorithms: Dim = 100.
Table 5. Comparison results of eleven algorithms: Dim = 100.
| Function | Metric | PSO | GWO | BOA | EO | MPA | LBOA | CBOA | HPSOBOA | IBOA | SOGWO | HPSBA |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| F1 | Mean | 8.45E+00 | 1.48E-12 | 8.75E-11 | 4.14E-29 | 1.77E-19 | 6.12E-12 | 6.86E-30 | 3.31E-152 | 9.05E-15 | 6.44E-15 | 7.12E-299 |
| F1 | Std | 7.28E-01 | 1.83E-12 | 9.69E-12 | 5.38E-29 | 2.10E-19 | 7.39E-12 | 1.51E-29 | 1.35E-152 | 1.87E-15 | 8.86E-15 | 0.00E+00 |
| F2 | Mean | 1.82E+01 | 4.04E-08 | 4.71E+50 | 1.85E-17 | 1.43E-11 | 1.52E+50 | 3.74E-18 | 1.48E+36 | 8.10E+49 | 1.60E-09 | 4.35E+50 |
| F2 | Std | 2.23E+00 | 1.35E-08 | 1.92E+51 | 1.36E-17 | 1.15E-11 | 6.27E+50 | 6.73E-18 | 2.80E+35 | 3.95E+50 | 6.14E-10 | 1.42E+51 |
| F3 | Mean | 2.80E+02 | 4.54E+00 | 6.08E-11 | 3.15E-01 | 1.04E-01 | 3.40E-12 | 4.42E-31 | 1.48E-152 | 9.74E-15 | 1.39E+00 | 1.06E-299 |
| F3 | Std | 7.38E+01 | 4.04E+00 | 5.91E-12 | 1.11E+00 | 1.50E-01 | 3.36E-12 | 8.91E-31 | 1.35E-152 | 1.11E-15 | 1.98E+00 | 0.00E+00 |
| F4 | Mean | 1.77E+00 | 5.79E-02 | 2.98E-08 | 2.64E-01 | 2.40E-08 | 2.86E-09 | 1.20E-19 | 1.08E-77 | 8.47E-12 | 1.60E-02 | 7.43E-152 |
| F4 | Std | 1.71E-01 | 6.77E-02 | 2.71E-09 | 1.45E+00 | 1.16E-08 | 2.88E-09 | 1.58E-19 | 5.19E-79 | 1.17E-12 | 1.52E-02 | 2.49E-152 |
| F5 | Mean | 5.47E+00 | 9.20E+00 | 2.21E+01 | 2.94E+00 | 2.56E+00 | 2.03E+01 | 2.12E+01 | 1.48E-01 | 2.07E+01 | 7.67E+00 | 2.24E+01 |
| F5 | Std | 9.43E-01 | 9.87E-01 | 1.06E+00 | 5.46E-01 | 7.71E-01 | 1.60E+00 | 1.18E+00 | 1.27E-01 | 1.06E+00 | 9.06E-01 | 8.81E-01 |
| F6 | Mean | 6.02E+01 | 7.14E-03 | 2.11E-03 | 2.44E-03 | 1.87E-03 | 2.01E-03 | 1.03E-04 | 1.13E-04 | 2.77E-04 | 4.90E-03 | 6.85E-05 |
| F6 | Std | 1.38E+01 | 2.79E-03 | 8.96E-04 | 1.41E-03 | 9.55E-04 | 1.18E-03 | 7.42E-05 | 1.28E-04 | 2.49E-04 | 1.70E-03 | 6.09E-05 |
| F7 | Mean | 0.00E+00 | 9.84E-135 | 1.71E-23 | 7.16E-218 | 1.70E-202 | 2.74E-28 | 4.05E-32 | 1.93E-207 | 1.92E-20 | 2.96E-155 | 4.15E-27 |
| F7 | Std | 0.00E+00 | 5.39E-134 | 8.68E-23 | 0.00E+00 | 0.00E+00 | 1.39E-27 | 1.12E-31 | 0.00E+00 | 8.47E-20 | 1.62E-154 | 2.27E-26 |
| F8 | Mean | 9.97E-02 | 1.28E-66 | 7.32E-14 | 1.16E-129 | 2.23E-60 | 9.96E-16 | 4.86E-37 | 9.98E-157 | 4.00E-19 | 2.38E-64 | 1.82E-306 |
| F8 | Std | 2.20E-01 | 4.84E-66 | 6.04E-14 | 6.03E-129 | 4.85E-60 | 1.62E-15 | 1.49E-36 | 7.46E-158 | 2.59E-19 | 1.30E-63 | 0.00E+00 |
| F9 | Mean | 2.82E+02 | 6.63E-13 | 8.60E-11 | 1.94E-29 | 8.17E-20 | 3.97E-12 | 1.13E-30 | 4.23E-152 | 9.06E-15 | 1.38E-15 | 4.72E-299 |
| F9 | Std | 5.17E+01 | 5.34E-13 | 8.92E-12 | 2.56E-29 | 5.83E-20 | 3.66E-12 | 2.60E-30 | 1.02E-152 | 1.95E-15 | 1.01E-15 | 0.00E+00 |
| F10 | Mean | 1.24E+03 | 9.79E+01 | 9.89E+01 | 9.66E+01 | 9.69E+01 | 9.88E+01 | 9.89E+01 | 9.07E+01 | 9.89E+01 | 9.77E+01 | 9.89E+01 |
| F10 | Std | 2.28E+02 | 5.73E-01 | 2.93E-02 | 1.08E+00 | 8.66E-01 | 3.76E-02 | 4.76E-02 | 2.30E+01 | 3.48E-02 | 7.22E-01 | 3.49E-02 |
| F11 | Mean | 3.65E+02 | 7.60E-13 | 8.13E-11 | 3.57E-29 | 3.77E-20 | 4.82E-12 | 3.26E-30 | 9.41E-153 | 1.00E-14 | 1.93E-15 | 4.59E-299 |
| F11 | Std | 7.04E+01 | 6.45E-13 | 6.27E-12 | 1.09E-28 | 3.04E-20 | 3.99E-12 | 1.45E-29 | 5.44E-153 | 1.66E-15 | 1.49E-15 | 0.00E+00 |
| F12 | Mean | 6.01E+02 | 6.67E-01 | 9.98E-01 | 6.67E-01 | 6.67E-01 | 9.95E-01 | 9.98E-01 | 1.00E+00 | 9.96E-01 | 6.67E-01 | 9.99E-01 |
| F12 | Std | 1.55E+02 | 3.47E-05 | 8.04E-04 | 3.93E-08 | 1.41E-06 | 9.93E-04 | 5.10E-04 | 8.11E-05 | 8.74E-04 | 5.49E-06 | 4.19E-04 |
| F13 | Mean | 1.06E-33 | 0.00E+00 | 5.13E-22 | 0.00E+00 | 3.61E-169 | 3.15E-25 | 3.19E-34 | 1.33E-150 | 4.62E-31 | 0.00E+00 | 1.53E-302 |
| F13 | Std | 5.50E-33 | 0.00E+00 | 1.69E-21 | 0.00E+00 | 0.00E+00 | 8.18E-25 | 1.63E-33 | 3.20E-150 | 1.56E-30 | 0.00E+00 | 0.00E+00 |
| F14 | Mean | 6.38E-24 | 1.24E-205 | 3.84E-17 | 2.66E-201 | 1.86E-63 | 4.88E-18 | 3.01E-31 | 6.40E-150 | 9.63E-23 | 2.87E-186 | 3.05E-298 |
| F14 | Std | 1.76E-23 | 0.00E+00 | 6.24E-17 | 0.00E+00 | 1.02E-62 | 7.45E-18 | 6.15E-31 | 1.34E-149 | 2.26E-22 | 0.00E+00 | 0.00E+00 |
| F15 | Mean | 1.88E-32 | 8.39E-261 | 1.23E-19 | 3.97E-253 | 9.63E-92 | 3.93E-19 | 1.55E-34 | 1.71E-153 | 3.26E-22 | 6.24E-310 | 1.69E-303 |
| F15 | Std | 5.59E-32 | 0.00E+00 | 2.55E-19 | 0.00E+00 | 4.94E-91 | 9.39E-19 | 8.12E-34 | 5.23E-153 | 4.34E-22 | 0.00E+00 | 0.00E+00 |
| F16 | Mean | 4.87E+02 | 9.55E+00 | 1.77E-06 | 3.79E-15 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 9.04E+00 | 0.00E+00 |
| F16 | Std | 6.30E+01 | 8.37E+00 | 9.70E-06 | 2.08E-14 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 5.80E+00 | 0.00E+00 |
| F17 | Mean | 4.34E+02 | 2.34E+01 | 7.57E+01 | 1.00E-01 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 5.92E-17 | 1.32E+01 | 0.00E+00 |
| F17 | Std | 6.22E+01 | 2.09E+01 | 2.31E+02 | 3.05E-01 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 3.24E-16 | 9.07E+00 | 0.00E+00 |
| F18 | Mean | 2.52E+00 | 8.15E-08 | 3.08E-08 | 3.27E-14 | 2.87E-11 | 2.76E-09 | 8.88E-16 | 8.88E-16 | 7.78E-12 | 4.06E-09 | 8.88E-16 |
| F18 | Std | 1.91E-01 | 3.11E-08 | 2.55E-09 | 6.08E-15 | 1.33E-11 | 2.31E-09 | 0.00E+00 | 0.00E+00 | 9.15E-13 | 1.29E-09 | 0.00E+00 |
| F19 | Mean | 1.38E+02 | 4.01E-03 | 6.69E-11 | 0.00E+00 | 0.00E+00 | 1.98E-12 | 0.00E+00 | 0.00E+00 | 9.25E-15 | 1.83E-03 | 0.00E+00 |
| F19 | Std | 1.42E+01 | 1.15E-02 | 2.72E-11 | 0.00E+00 | 0.00E+00 | 1.80E-12 | 0.00E+00 | 0.00E+00 | 4.81E-15 | 5.61E-03 | 0.00E+00 |
| F20 | Mean | 1.33E+01 | 3.92E-03 | 2.01E-09 | 3.66E-18 | 3.01E-12 | 3.26E-11 | 1.05E-19 | 1.15E-57 | 7.60E-12 | 2.49E-03 | 1.27E-151 |
| F20 | Std | 3.35E+00 | 2.70E-03 | 1.85E-09 | 2.04E-18 | 2.42E-12 | 5.92E-11 | 1.32E-19 | 6.30E-57 | 9.27E-13 | 1.50E-03 | 3.56E-152 |
| F21 | Mean | 1.13E-01 | 2.05E-01 | 9.83E-01 | 2.83E-02 | 3.74E-02 | 7.49E-01 | 9.33E-01 | 1.49E-03 | 7.70E-01 | 1.51E-01 | 1.09E+00 |
| F21 | Std | 8.19E-02 | 4.33E-02 | 8.89E-02 | 8.01E-03 | 9.89E-03 | 1.14E-01 | 1.20E-01 | 6.36E-04 | 1.10E-01 | 4.35E-02 | 7.82E-02 |
| F22 | Mean | 1.02E+00 | 5.66E+00 | 9.99E+00 | 5.16E+00 | 6.05E+00 | 9.99E+00 | 9.98E+00 | 9.70E+00 | 9.94E+00 | 4.96E+00 | 9.99E+00 |
| F22 | Std | 1.98E-01 | 4.20E-01 | 5.25E-03 | 1.35E+00 | 2.96E+00 | 4.55E-03 | 4.85E-03 | 6.11E-01 | 1.34E-01 | 4.19E-01 | 2.48E-03 |
| F23 | Mean | 2.35E+01 | 1.81E+01 | 6.84E+01 | 3.96E+00 | 4.54E+00 | 6.05E+01 | 6.85E+01 | 1.94E+00 | 6.54E+01 | 1.37E+01 | 6.69E+01 |
| F23 | Std | 7.20E+00 | 4.80E+00 | 4.20E+00 | 1.59E+00 | 1.69E+00 | 6.39E+00 | 4.94E+00 | 8.69E-01 | 5.49E+00 | 3.09E+00 | 5.65E+00 |
| F24 | Mean | 5.16E+01 | 1.67E+01 | 2.66E+00 | 3.20E-05 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 1.33E+01 | 0.00E+00 |
| F24 | Std | 9.49E+00 | 1.08E+01 | 3.18E+00 | 1.75E-04 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 4.05E+00 | 0.00E+00 |
| F25 | Mean | 3.97E+00 | 6.80E-01 | 4.00E-01 | 2.29E-01 | 2.49E-01 | 3.33E-02 | 1.09E-32 | 8.86E-02 | 9.95E-02 | 6.47E-01 | 7.39E-301 |
| F25 | Std | 1.06E+00 | 2.51E-01 | 3.16E-03 | 1.50E-01 | 1.52E-01 | 4.77E-02 | 3.81E-32 | 3.58E-02 | 3.00E-06 | 2.53E-01 | 0.00E+00 |
| F26 | Mean | 6.60E+01 | 1.01E-13 | 8.60E-11 | 0.00E+00 | 0.00E+00 | 5.22E-12 | 0.00E+00 | 0.00E+00 | 7.87E-15 | 2.96E-16 | 0.00E+00 |
| F26 | Std | 6.71E+00 | 1.50E-13 | 8.49E-12 | 0.00E+00 | 0.00E+00 | 6.98E-12 | 0.00E+00 | 0.00E+00 | 1.50E-15 | 4.86E-16 | 0.00E+00 |
| +/−/≈ | | 1/25/0 | 1/25/0 | 0/26/0 | 2/21/3 | 0/21/5 | 0/23/3 | 1/19/6 | 4/16/6 | 0/24/2 | 1/25/0 | - |
| Rank | | 7 | 7 | 8 | 4 | 4 | 5 | 3 | 2 | 6 | 7 | 1 |
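The summary rows of these tables can be reproduced mechanically: each competitor is tallied against HPSBA per function, and per-function ranks are averaged. The sketch below is a simplified illustration that compares mean values directly; the paper's +/−/≈ tallies are typically derived from a statistical significance test (e.g. a Wilcoxon rank-sum test) rather than raw means, and the three-algorithm `means` array here is illustrative, not the full experimental data.

```python
import numpy as np

# means[f, a]: mean error of algorithm a on function f
# (a small illustrative subset, not the full result table)
algos = ["PSO", "BOA", "HPSBA"]
means = np.array([
    [8.45e+00, 8.75e-11, 7.12e-299],   # F1
    [1.82e+01, 4.71e+50, 4.35e+50],    # F2
    [2.80e+02, 6.08e-11, 1.06e-299],   # F3
])

base = algos.index("HPSBA")
for a, name in enumerate(algos):
    if a == base:
        continue
    better = int((means[:, a] < means[:, base]).sum())   # "+": beats HPSBA
    worse  = int((means[:, a] > means[:, base]).sum())   # "-": loses to HPSBA
    ties   = int((means[:, a] == means[:, base]).sum())  # "≈": indistinguishable
    print(f"{name}: {better}/{worse}/{ties}")

# per-function ranks (1 = smallest mean error), then averaged per algorithm
ranks = means.argsort(axis=1).argsort(axis=1) + 1
print("avg rank:", dict(zip(algos, ranks.mean(axis=0))))
```

The double `argsort` converts each row of means into dense ascending ranks; averaging those ranks over all functions gives the ordering reported in the Rank row.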
Table 6. Node coverage parameter settings.
| Parameters | Setting Values (Exp. 1) | Setting Values (Exp. 2) |
|---|---|---|
| Side length of coverage area/m | 100 × 100 | 100 × 100 |
| Number of nodes | 45 | 40, 45, 50 |
| Perception radius r_s/m | 10 | 10 |
| Communication radius r_c/m | 20 | 20 |
| Maximum iterations (T_max) | 100, 150, 200 | 150 |
| Boundary threshold/m | r_s/3 | r_s/3 |
Table 7. Node coverage rate of different iterations.
| Item | Cov/% (T_max = 100) | Time/s | Cov/% (T_max = 150) | Time/s | Cov/% (T_max = 200) | Time/s |
|---|---|---|---|---|---|---|
| BOA | 84.79 | 12.21 | 83.65 | 16.55 | 86.42 | 40.98 |
| MBOA | 85.11 | 10.93 | 83.90 | 16.04 | 85.82 | 21.10 |
| PSO | 92.20 | 11.94 | 94.12 | 24.4 | 94.32 | 51.92 |
| HPSBA | 93.28 | 11.09 | 96.54 | 16.55 | 96.32 | 21.31 |
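The coverage rates above are evaluated from candidate node positions under a perception model. As a minimal sketch, the function below assumes the binary (Boolean) disc perception model and a uniform grid discretization of the 100 m × 100 m field, both standard in WSN coverage studies; the function name `coverage_rate`, the grid resolution, and the random deployment are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def coverage_rate(nodes, r_s=10.0, side=100.0, grid=100):
    """Fraction of grid points covered by at least one sensor disc.

    nodes: (N, 2) array of sensor coordinates in a side x side field.
    r_s:   perception (sensing) radius in metres (Table 6: 10 m).
    grid:  number of monitoring points sampled per axis (assumed).
    """
    xs = np.linspace(0.0, side, grid)
    px, py = np.meshgrid(xs, xs)                      # monitoring points
    pts = np.stack([px.ravel(), py.ravel()], axis=1)  # (grid*grid, 2)
    # squared distance from every monitoring point to every node;
    # a point is covered if any node lies within r_s of it
    d2 = ((pts[:, None, :] - nodes[None, :, :]) ** 2).sum(axis=2)
    covered = (d2 <= r_s ** 2).any(axis=1)
    return covered.mean()

# 45 randomly deployed nodes, as in the first setting of Table 6;
# an optimizer such as HPSBA would move these positions to raise the rate
rng = np.random.default_rng(0)
nodes = rng.uniform(0.0, 100.0, size=(45, 2))
print(f"coverage = {coverage_rate(nodes):.2%}")
```

An optimizer treats the flattened node coordinates as the decision vector and this coverage rate (or one minus it) as the fitness to be optimized, which is how the Cov/% column in Table 7 is obtained for each algorithm.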
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Zhang, M.; Wang, D.; Yang, M.; Tan, W.; Yang, J. HPSBA: A Modified Hybrid Framework with Convergence Analysis for Solving Wireless Sensor Network Coverage Optimization Problem. Axioms 2022, 11, 675. https://doi.org/10.3390/axioms11120675