Article

An Optimal WSN Node Coverage Based on Enhanced Archimedes Optimization Algorithm

1 Fujian Provincial Key Laboratory of Big Data Mining and Applications, Fujian University of Technology, Fuzhou 350118, China
2 College of Computer Science and Engineering, Shandong University of Science and Technology, Qingdao 266590, China
3 University of Information Technology, Ho Chi Minh City 700000, Vietnam
4 Vietnam National University, Ho Chi Minh City 700000, Vietnam
* Author to whom correspondence should be addressed.
Entropy 2022, 24(8), 1018; https://doi.org/10.3390/e24081018
Submission received: 14 June 2022 / Revised: 15 July 2022 / Accepted: 19 July 2022 / Published: 23 July 2022
(This article belongs to the Special Issue Wireless Sensor Networks and Their Applications)

Abstract

Node coverage is one of the crucial metrics for wireless sensor networks’ (WSNs’) quality of service, directly affecting the target monitoring area’s monitoring capacity. Pursuit of the optimal node coverage encounters increasing difficulties because of the limited computational power of individual nodes, the scale of the network, and the operating environment’s complexity and constant change. This paper proposes a solution to the optimal node coverage of unbalanced WSN distribution during random deployment based on an enhanced Archimedes optimization algorithm (EAOA). The best findings for network coverage from several sub-areas are combined using the EAOA. In order to address the shortcomings of the original Archimedes optimization algorithm (AOA) in handling complicated scenarios, we suggest an EAOA based on the AOA by adapting its equations with reverse learning and multidirection techniques. The obtained results from testing the benchmark function and the optimal WSN node coverage of the EAOA are compared with the other algorithms in the literature. The results show that the EAOA algorithm performs effectively, increasing the feasible range and convergence speed.

1. Introduction

Wireless sensor networks (WSNs) are mainly composed of several autonomous devices called sensor nodes implemented for specific purposes and scattered in wide areas [1,2]. As wireless communication technology has improved and time has passed [3], WSNs have become more common in the information field [4]. They are utilized in various crucial fields, including the military, intelligent transportation, urban planning, industrial and agricultural automation, and environmental monitoring [5]. The sensor node’s job is to send captured information to the base station (BS) or the destination node by sensing and collecting ambient data, including sound vibration, pressure, temperature, and light intensity, among other things [6].
Due to their ease of implementation, cheap maintenance costs, and high flexibility, WSNs have successfully replaced wired networks and been embraced in the industrial field in recent years [7]. However, due to the nature of wireless communication, interference and conflict are invariably present during data transmission [8], and data packets may be lost or delayed past their planned deadline [9].
One of the most fundamental difficulties in WSNs is coverage, which is a critical metric for evaluating coverage optimization efforts. Because coverage affects the monitoring capability of the target monitoring area, it substantially impacts WSNs’ quality of service [10]. A node coverage optimization technique has been developed to increase the coverage of wireless sensor nodes in a big-data environment, considering the characteristics of large wireless sensor networks with limited node computing capabilities [11,12]. However, wireless sensor networks’ operating environment is complex and changing, and sensor energy is limited and cannot be supplemented [13].
Deployment of sensor nodes that is both sensible and effective reduces network expenses and energy consumption [14]. All WSN coverage applications try to deploy a minimal number of sensor nodes to monitor a defined target region of interest to improve coverage efficiency. Sensor nodes are typically placed randomly in the target monitoring region, resulting in an uneven distribution of nodes and limited coverage [15]. As a result, the network coverage control problem is the central research problem in wireless sensor networks [16]. Adopting an effective and acceptable network coverage control technique is beneficial to optimizing sensor node deployment to increase wireless sensor network performance. Sensor nodes are randomly placed around the monitoring region [17]. Strategically positioning sensor nodes in the monitoring zone is crucial to increasing WSN node coverage. For large-scale sensor node deployment challenges, logical and efficient deployment of WSNs has been demonstrated to be an NP-hard problem, and finding the best solution remains challenging [18]. Multiple nodes must be deployed to meet the monitoring needs, resulting in significant network redundancy coverage issues, the repeated transmission of vast amounts of data in the network, and a rise in the number of network nodes [19].
The metaheuristic algorithm is one of the promising approaches being examined as a solution for dealing with WSN node coverage in this scenario [20]. Metaheuristic algorithms can identify near-optimal solutions in a fair amount of time with limited nodes and computational resources, making them a convenient approach to the WSN coverage optimization problem [21]. Approximation optimization techniques whose solutions can tackle high-dimensional optimization problems effectively are known as metaheuristic algorithms [22]. Natural phenomena, such as human behaviors, physical processes, animal swarm behaviors, and evolutionary concepts, are frequently used to inspire metaheuristic algorithms [23]. Metaheuristic optimization algorithms are widely used in a variety of fields, including technology, health, society, and finance, and are especially good at meeting time deadlines [20]. They are usually fairly easy to implement, have few parameters, and are relatively simple to understand, yet powerful; many draw on natural selection in biology, social swarm behavior in nature, and autocatalytic physical phenomena, e.g., simulated annealing (SA) [24], genetic algorithms (GAs) [25,26], particle swarm optimization (PSO) [27], cat swarm optimization (CSO) [28], parallel PSO (PPSO) [29], ant colony optimization (ACO) [30], artificial bee colony (ABC) [31], bat algorithms (BA) [32,33], moth–flame optimization (MFO) [34,35], whale optimization algorithm (WOA) [36], flower pollination algorithm (FPA) [37,38], sine–cosine algorithm (SCA) [39,40], etc.
A new metaheuristic optimization method based on suggested physical laws is the Archimedes optimization algorithm (AOA) [41], which is mimicked by the location update technique that uses object collisions for processing optimization equations. The optimization is carried out by modeling Archimedes’ buoyancy principle process: following a crash, the object progressively assumes neutral buoyancy. The AOA has advantages and the potential to optimize various engineering problems because of its fewer parameters, making it more easily understandable in programming. However, there are specific problems with the AOA algorithm approach to particular issues, such as the solution’s slow convergence time and poor quality.
This paper suggests an enhanced Archimedes optimization algorithm (EAOA) for global optimization problems and node coverage optimization in WSN deployment. The difficulties of uneven node distribution and low coverage in the random deployment of WSN monitoring applications are approached based on the EAOA. The entire WSN monitoring area can be divided into multiple sub-areas, and node coverage optimization is then implemented in each sub-area by evaluating the objective function values. The objective function is modeled as the ratio of the coverage probability of all nodes to the deployed surface of the 2D WSN monitoring area. We implemented the EAOA by adapting its updating equations with reverse learning and multidirection strategies to overcome the limitations of the original approach. The following list briefly highlights the contributions of this paper:
  • Offering strategies for enhancing the AOA to prevent the original algorithm’s drawbacks in dealing with complex situations, evaluating the recommended method’s performance by using the CEC2017 test suite, and comparing the proposed method’s results with the other algorithms in the literature.
  • Establishing the objective function of the optimal WSN node coverage issues in applying the EAOA and AOA for the first time, and analyzing and discussing the results of the experiment in comparison with swarm intelligence optimization algorithms.
The paper’s remaining parts are organized as follows: Section 2 describes the WSN node coverage model as a statement problem, and reviews the AOA algorithm as related work. Section 3 presents the proposed EAOA, and evaluates its performance under the test suite. Section 4 offers the EAOA for tackling the node coverage issues by applying the EAOA algorithm and analyzing the simulation results. The conclusions are presented in Section 5.

2. System Definition

This section presents the WSN node coverage model as the problem statement, and the original algorithm—called the Archimedes optimization algorithm (AOA)—as a recent metaheuristic optimization algorithm. The subsections are reviewed as follows.

2.1. WSN Node Coverage Model

The coverage optimization problem seeks the desired location of each deployed node, with a fixed sensing radius for each sensor. Each node is deployed with a limited sensing radius, and each sensor can only sense and detect targets within that radius. Assume that the WSN is deployed in a two-dimensional (2D) monitoring area of W × L m², with M nodes set up randomly [15,42]. Let S be the set of nodes, denoted S = {S_1, S_2, …, S_i, …, S_M}, i = 1, 2, …, M; the coordinates of each node S_i are (x_i, y_i). A sensor node's sensing range is a circle centered at the node with sensing radius R_s. The model of the two-dimensional WSN monitoring area network is assumed as follows:
  • The sensing radius of each sensor node is R_s and the communication radius is R_c, both measured in meters, with R_c ≥ 2R_s.
  • The sensor nodes can normally communicate, have sufficient energy, and can access time and data information.
  • The sensor nodes have the same parameters, structure, and communication capabilities.
  • The sensor nodes can move freely and update their location information in time.
Let T be the set of target monitoring points, T = {T_1, T_2, …, T_j, …, T_n}, j = 1, 2, …, n; the coordinates of T_j are (x_j, y_j) in the two-dimensional WSN monitoring area. If the distance between the target monitoring point T_j and any sensor node is less than or equal to the sensing radius R_s, then T_j is covered by the sensor nodes. For the sensor node S_i and target monitoring point T_j, the Euclidean distance is defined as follows:
d(S_i, T_j) = √((x_i − x_j)² + (y_i − y_j)²),
where d(S_i, T_j) is the distance from node S_i(x_i, y_i) to target point T_j(x_j, y_j). The node sensing model is binary over the sensing radius: if R_s is greater than or equal to d(S_i, T_j), the probability p that the target is sensed is set to 1; otherwise, it is set to 0. The probability formula is given as follows:
p(S_i, T_j) = { 1, if d(S_i, T_j) ≤ R_s; 0, if d(S_i, T_j) > R_s },
where p(S_i, T_j) is the sensing probability between the sensor node S_i and target monitoring point T_j. The sensor nodes can work cooperatively with their neighbor nodes in the deployed two-dimensional WSN monitoring area. Whenever a target monitoring point can be covered by more than one sensor simultaneously, the probability of monitoring the target point T_j is given by the following formula:
P(S, T_j) = 1 − ∏_{i=1}^{M} (1 − p(S_i, T_j)),
The coverage rate is the ratio of the total area covered by all sensor nodes in the monitoring area to the area's overall size. Accordingly, the ratio of the coverage probability to the network's deployed 2D monitoring surface is used to calculate the coverage ratio:
CovR = (Σ_{j=1}^{n} P(S, T_j)) / (W × L),
where CovR is the coverage ratio of the WSN nodes in the target monitoring area, P(S, T_j) is the probability that target point T_j is sensed by the deployed nodes, and W × L is the deployed area of the 2D network surface.
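Under the binary sensing model above, CovR can be approximated numerically by sampling the monitoring area on a grid, with each grid point standing in for a target point T_j. The sketch below is illustrative only; the function names, the grid step, and the example area are assumptions, not values from the paper:

```python
import math

def covered(node, target, rs):
    """Binary sensing model: the target is covered iff it lies within radius rs."""
    return math.hypot(node[0] - target[0], node[1] - target[1]) <= rs

def coverage_ratio(nodes, rs, w, l, grid=1.0):
    """Approximate CovR by sampling the W x L area on a regular grid.

    Under the binary model, the joint probability 1 - prod(1 - p) collapses
    to "covered by at least one node" for each sample point.
    """
    covered_pts, total_pts = 0, 0
    y = grid / 2
    while y < l:
        x = grid / 2
        while x < w:
            total_pts += 1
            if any(covered(n, (x, y), rs) for n in nodes):
                covered_pts += 1
            x += grid
        y += grid
    return covered_pts / total_pts

# One node at the centre of a 20 m x 20 m area with rs = 5 m.
ratio = coverage_ratio([(10.0, 10.0)], rs=5.0, w=20.0, l=20.0, grid=0.5)
```

With a finer grid the estimate approaches the analytic value; a single disc of radius 5 m in a 20 m × 20 m area covers π·25/400 ≈ 0.196 of the surface.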

2.2. Archimedes Optimization Algorithm (AOA)

The AOA is a recent metaheuristic optimization algorithm based on the physics of Archimedes' buoyancy principle [41]. An object's position is updated by imitating the process of the object gradually reaching neutral buoyancy following a collision. The AOA forms its population of individuals from immersed objects with volume, density, and acceleration properties; the objects determine their positions in the fluid based on these attributes. The attributes and positions of the objects are randomly initialized at the start of the process. The AOA updates each object's volume, density, and acceleration during the optimization, and the object's position is updated based on its individual attributes. Initialization, updating object properties, updating the object's position, and evaluation are the main processing steps of the AOA.
Initialization of the position and attributes of the object is conducted as follows:
X_i = lb_i + rand() · (ub_i − lb_i),
where X_i is the i-th candidate solution vector of the object population of size N, i = 1, 2, …, N; lb_i and ub_i are the lower and upper boundaries, respectively; and rand() is a d-dimensional vector generated randomly in [0, 1]. The acceleration, volume, and density of the i-th object are denoted ac_i, vo_i, and de_i, respectively, and are initialized as vo_i = rand(), de_i = rand(), and ac_i = lb_i + rand() · (ub_i − lb_i). The position and attributes of the optimal object—X_best, de_best, vo_best, and ac_best—are those of the object with the best fitness value according to the evaluation of each object.
Updating object properties phase: During the iteration, the volume and density of the object are updated according to the following formula:
vo_i^{t+1} = vo_i^t + rand · (vo_best − vo_i^t),
de_i^{t+1} = de_i^t + rand · (de_best − de_i^t),
where vo_i^{t+1} and de_i^{t+1} denote the volume and density of the i-th object at iteration t + 1, respectively. The AOA mimics collisions between objects during the optimization process; as the iterations proceed, the algorithm gradually reaches equilibrium. A transform variable is used to simulate this process and realize the algorithm's transition from exploration to exploitation, as follows:
TF = exp((t − t_max) / t_max),
where TF is the transfer operator, while t_max and t are the maximum number of iterations and the current iteration, respectively. TF gradually increases to 1 over time. While TF ≤ 0.5—roughly the first half of the iterations—the algorithm is in the exploration phase. The update of an object's acceleration depends on the colliding objects:
ac_i^{t+1} = (de_mr + vo_mr · ac_mr) / (de_i^{t+1} · vo_i^{t+1}), if TF ≤ 1/2
ac_i^{t+1} = (de_best + vo_best · ac_best) / (de_i^{t+1} · vo_i^{t+1}), otherwise
where de_mr, vo_mr, and ac_mr are the density, volume, and acceleration of a random material (mr), respectively. If TF ≤ 0.5, there is a collision between objects, and the first case updates the acceleration of object i; otherwise, there is no collision, and the second case applies. The acceleration is then normalized as follows:
ac_{i,norm}^{t+1} = ur · (ac_i^{t+1} − min(ac)) / (max(ac) − min(ac)) + lr,
where ac_{i,norm}^{t+1} represents the normalized acceleration of the i-th object at iteration t + 1, while ur and lr are the normalization range bounds, set to 0.8 and 0.2, respectively.
Updating the objects' positions is conducted as follows: if TF ≤ 1/2 (exploration phase), the position update of object i at iteration t + 1 helps the search move from global to local and converge toward the region where the optimal solution exists; otherwise, the update is in the exploitation phase. When an object is far from the best position, its acceleration value is large and the object is in the exploration phase; when the acceleration value is small, the object is close to the optimal solution. The exploration phase can be described as follows:
X_i^{t+1} = X_i^t + C1 · rand · ac_{i,norm}^{t+1} · d · (X_rand − X_i^t),
where C1 is a constant set to 2, and d is the density factor that decreases over time, i.e., d = exp((t_max − t) / t_max) − (t / t_max). The acceleration changes from large to small, marking the algorithm's transition from exploration to exploitation and helping the objects approach the global optimal solution. The exploitation-phase update is as follows:
X_i^{t+1} = X_best^t + F · C2 · rand · ac_{i,norm}^{t+1} · d · (T · X_best − X_i^t),
where C2 is a constant; T is a variable proportional to the transfer operator—the percentage used to attain the best position—T = C3 × TF; and F is the direction of motion, expressed as follows:
F = { +1, if P ≤ 0.5; −1, if P > 0.5 },
where P = 2 · rand − C4.
Evaluating the objective function involves computing each object's fitness value after its position is updated in each iteration. Each object is evaluated against the objective function, the best fitness value found at each position is recorded, and X_best, de_best, vo_best, and ac_best are updated for the next iteration.
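The AOA update steps above can be condensed into a single-iteration sketch. This is one minimal reading of the update rules, not the reference implementation: the values of C2, C3, and C4 are assumptions, as only C1 = 2 and the normalization bounds ur = 0.8, lr = 0.2 are fixed by the text.

```python
import numpy as np

rng = np.random.default_rng(0)

def aoa_step(X, vo, de, ac, X_best, vo_best, de_best, ac_best, t, t_max,
             C1=2.0, C2=6.0, C3=2.0, C4=0.5, ur=0.8, lr=0.2):
    """One AOA iteration: update volume/density, the transfer operator TF,
    the (normalized) acceleration, and finally the positions.
    C2, C3, and C4 are assumed values; the text only fixes C1, ur, and lr."""
    N, D = X.shape
    TF = np.exp((t - t_max) / t_max)                 # transfer operator, grows toward 1
    dens = np.exp((t_max - t) / t_max) - t / t_max   # density factor, decreasing
    vo = vo + rng.random((N, D)) * (vo_best - vo)
    de = de + rng.random((N, D)) * (de_best - de)
    if TF <= 0.5:   # exploration: collision with a random material (object)
        mr = rng.integers(N)
        ac_new = (de[mr] + vo[mr] * ac[mr]) / (de * vo)
    else:           # exploitation: no collision, follow the best object
        ac_new = (de_best + vo_best * ac_best) / (de * vo)
    span = ac_new.max() - ac_new.min() + 1e-12
    ac_norm = ur * (ac_new - ac_new.min()) / span + lr
    if TF <= 0.5:   # move relative to a random peer (exploration)
        X_rand = X[rng.integers(N)]
        X = X + C1 * rng.random((N, D)) * ac_norm * dens * (X_rand - X)
    else:           # move relative to the best object (exploitation)
        T = C3 * TF
        P = 2 * rng.random((N, D)) - C4
        F = np.where(P <= 0.5, 1.0, -1.0)
        X = X_best + F * C2 * rng.random((N, D)) * ac_norm * dens * (T * X_best - X)
    return X, vo, de, ac_new
```

A full run would wrap this step in an iteration loop, evaluating the objective after each call and keeping the best object found.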

3. Enhanced Archimedes Optimization Algorithm

In order to enhance the diversity of the object population, an enhanced version of the Archimedes optimization algorithm (EAOA), based on opposition-based learning and diversity-guiding techniques, is presented in this section. The proposed procedures are described first, followed by a detailed presentation and discussion of the evaluation findings.

3.1. Enhanced Archimedes Optimization Algorithm

The AOA is a new metaheuristic algorithm with several advantages, including ease of understanding and implementation, along with local search capability. Still, it has drawbacks, such as difficulty jumping out of local optima and slow convergence when dealing with complex problems, such as the optimal WSN node coverage issue.
A multidirection guiding strategy: in the original expression in Equation (13), the direction of motion F has just two possible directions. For complicated problems, motion in the search space may occur at more scales, and this can be exploited to increase the number of search directions in complex spaces. A direction guiding factor G is used in place of the fixed direction value. An alternative formula for the motion direction can be expressed as follows:
F_new = { +G · rand(), if P ≤ 0.5; −G · rand(), otherwise },
where F_new is the alternative direction guiding factor, and rand() is a random number in [0, 1] that produces different search direction values.
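A tiny sketch of the multidirection factor F_new: instead of a fixed ±1, the sign flip on P ≤ 0.5 is combined with a random magnitude scaled by G. The function name and the value of G below are illustrative assumptions:

```python
import random

random.seed(42)

def f_new(G, p=None):
    """Multidirection guiding factor: sign chosen by the coin flip P <= 0.5,
    magnitude drawn uniformly and scaled by the guiding factor G."""
    p = random.random() if p is None else p
    r = random.random()
    return G * r if p <= 0.5 else -G * r

# A sample of direction values spread over [-G, G] rather than just {-1, +1}.
vals = [f_new(G=1.5) for _ in range(1000)]
```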
Opposite direction strategy: the original and reversed solutions are sorted by their fitness values under the objective function, steering objects in a forward-exploiting search through the optimization problem space. By identifying new objects with the best fitness values—whether by direct vetting or by other strategies for creating new objects in the solution space—the agents can converge swiftly toward the ideal solution. A new solution set is generated by applying reverse learning at a specific rate and joining it to the original set for further optimization.
Let S(x_1, x_2, …, x_i, …, x_D) and S′(x′_1, x′_2, …, x′_D) be a forward solution and its corresponding inverse, where x_i ∈ [a_i, b_i], i = 1, 2, …, D. Over the range [a, b], the opposite solution can be expressed as x′_i = a_i + b_i − x_i. Applying the same idea of opposite learning, a new solution is generated as follows:
S_i^new = S′ · β_r,
where β_r is an adjustment coefficient for generating and perturbing a new set of solution objects. A portion of the worst solutions—about 15% of the sorted evaluation values of object positions—is eliminated and used to generate a new object set in dimension D of the solution space. The adjustment coefficient is calculated as follows:
β_r = (R_istar · rand(β, γ)) / D,
where rand(β, γ) is a random function over the range from β to γ. In the experiments, β is set to −0.5 and γ to 0.5. D is the dimension of the problem space, while R_istar is the distance between the ideal solution and the one closest to optimal. The adjustment coefficient is applied in the exploiting search of the algorithm for generating and perturbing a new solution object set, merged into Equation (17).
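The opposite-solution generation can be sketched as below. The 15% elimination rate follows the text; the helper names, the toy objective, and the replace-the-worst policy details are illustrative assumptions:

```python
import random

def opposite(x, a, b):
    """Componentwise opposite point x'_i = a_i + b_i - x_i within the box [a, b]."""
    return [ai + bi - xi for xi, ai, bi in zip(x, a, b)]

def inject_opposites(pop, fitness, a, b, worst_frac=0.15):
    """Replace roughly the worst 15% of solutions (as stated in the text)
    with the opposites of those solutions; lower fitness is better here."""
    ranked = sorted(range(len(pop)), key=lambda i: fitness(pop[i]))
    k = max(1, int(worst_frac * len(pop)))
    for i in ranked[-k:]:
        pop[i] = opposite(pop[i], a, b)
    return pop

sphere = lambda x: sum(v * v for v in x)
random.seed(0)
pop = [[random.uniform(0.0, 100.0) for _ in range(5)] for _ in range(20)]
pop = inject_opposites(pop, sphere, a=[0.0] * 5, b=[100.0] * 5)
```

Because the opposite point stays inside the same box, the injected solutions remain feasible while re-seeding diversity in regions the forward search has not visited.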
The reverse learning (β_r) and multidirection guiding (F_new) strategies and equations are hybridized into updated formulas for generating new solutions. The positions of the objects are updated as follows:
x_i^{t+1} = { x_i^t + β_r · C1 · rand · ac_{i,norm}^{t+1} · d · (x_rand − x_i^t), if TF ≤ 0.5; x_best^t + F_new · C2 · rand · ac_{i,norm}^{t+1} · d · (T × x_best − x_i^t), otherwise },
Algorithm 1 depicts the pseudocode of the enhanced Archimedes optimization algorithm (EAOA).
Algorithm 1 A pseudocode of the EAOA.
1. Input: N_p: the population size; D: dimensions; T: the Max_iter; C1, C2, C3, C4: variables; ub, lb: upper and lower boundaries.
2. Output: the global best optimal solution.
3. Initialization: initialize the locations, vol., de., and acc. of each object in the population by Equation (8); evaluate each object's position by calculating the objective function and select the best object in the population; set the iteration t to 1.
4. While t < T do
5.   For i = 1 : N_p do
6.     Update the vol. and de. of the object by Equations (6) and (7).
7.     Update the transfer operator TF and density factor d by Equation (8).
8.     If TF ≤ 1/2 then
9.       Update the object's acceleration by Equation (10).
10.      Update the local solution by Equation (11).
11.    Else
12.      Update the object's acceleration by Equations (9) and (10).
13.      Update the global solution position by Equation (17).
14.    End-if
15.    Evaluate each object at its position and select the best object of the whole population.
16.  End-for
17.  Record the best global outcome of the optimal object.
18.  Set the iteration t to t + 1.
19. End-while
20. Output: the best optimized object of the whole population.
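The pseudocode can be exercised end to end as a greatly simplified loop on a toy objective. The volume/density/acceleration bookkeeping is omitted, greedy acceptance is an added assumption, and the constants are illustrative except where the text fixes them (C1 = 2, the 15% opposition rate):

```python
import numpy as np

def sphere(x):
    return float(np.sum(x * x))

def eaoa_sketch(obj, D=5, N=20, t_max=200, lb=-100.0, ub=100.0, G=1.0, seed=1):
    """Greatly simplified EAOA-style loop: exploration pulls objects toward a
    random peer, exploitation toward the best (with the multidirection factor
    F_new = +/-G*rand), and opposition replaces the worst 15% each iteration."""
    rng = np.random.default_rng(seed)
    X = lb + rng.random((N, D)) * (ub - lb)
    fit = np.array([obj(x) for x in X])
    best = X[fit.argmin()].copy()
    for t in range(1, t_max + 1):
        TF = np.exp((t - t_max) / t_max)
        dens = np.exp((t_max - t) / t_max) - t / t_max
        for i in range(N):
            if TF <= 0.5:   # exploration toward a random peer
                peer = X[rng.integers(N)]
                cand = X[i] + 2.0 * rng.random(D) * dens * (peer - X[i])
            else:           # exploitation toward the best, multidirection F_new
                F_new = np.where(rng.random(D) <= 0.5, G, -G) * rng.random(D)
                cand = best + F_new * 6.0 * rng.random(D) * dens * (TF * best - X[i])
            cand = np.clip(cand, lb, ub)
            f = obj(cand)
            if f < fit[i]:  # greedy acceptance (an added assumption)
                X[i], fit[i] = cand, f
        for i in np.argsort(fit)[-max(1, int(0.15 * N)):]:   # opposition on worst 15%
            opp = np.clip(lb + ub - X[i], lb, ub)
            if obj(opp) < fit[i]:
                X[i], fit[i] = opp, obj(opp)
        if fit.min() < obj(best):
            best = X[fit.argmin()].copy()
    return best, obj(best)

best_x, best_f = eaoa_sketch(sphere)
```

On this convex toy function the population's best value is monotonically non-increasing, so the loop mainly illustrates the control flow of Algorithm 1 rather than the algorithm's full search behavior.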

3.2. Experimental Results for Global Optimization

The suggested algorithm's performance needs to be tested and verified on benchmark functions. The CEC 2017 [43] test suite, with 29 different test functions, is used to evaluate the EAOA algorithm. The suite contains functions of various complexity types and dimensions, e.g., f1~f3: unimodal, f4~f10: multimodal, f11~f20: hybrid, and f21~f29: composition test functions. The achieved results of the EAOA are compared not only with the original AOA [41], but also with other popular algorithms selected from the literature, e.g., genetic algorithms (GAs) [25], simulated annealing (SA) [24], particle swarm optimization (PSO) [27], moth–flame optimization (MFO) [34], improved MFO (IMFO) [35], flower pollination algorithm (FPA) [37], sine–cosine algorithm (SCA) [39], enhanced SCA (ESCA) [40], parallel PSO (PPSO) [29], and parallel bat algorithm (PBA) [33]. The error Δf = f_i − f_i* is the difference between the obtained result f_i and the function minimum value f_i* of the i-th function. To ensure a fair experiment, the fundamental conditions are the same for all algorithms: the population size is set to 40; the maximum number of iterations to 1000; the number of dimensions to 30; the solution range of all test functions to [−100, 100]; and the number of runs to 25. Table 1 lists the basic parameters of each algorithm.
The outcomes of the proposed EAOA approach are verified in several ways—such as comparing the effect of each strategy against the original algorithm—and compared with the other algorithms. First, the outcomes of the individual strategies are contrasted with those of the original AOA algorithm; the EAOA's findings are then contrasted with those of the other algorithms. Table A1 compares the strategies applied in the EAOA with the original AOA algorithm, verifying the impact of the suggested techniques. The data values are the mean outcomes of 25 runs, showing the best obtained global results along with runtime and CPU execution data. In some cases, strategies 1 and 2 individually outperform the original algorithm; in most test function cases, the combined strategies 1 and 2 in the proposed EAOA produce better results than the AOA, with a runtime not much longer than that of the AOA.
Moreover, the obtained results from the EAOA were further evaluated to verify the proposed approach's performance. The findings of the EAOA compared with the other algorithms—e.g., the GA [25], PSO [27], BA [32], PPSO [29], MFO [34], and WOA [36] algorithms—are presented in Table A2, Table A3, Table A4 and Table A5 and Figure A1. The data values in Table A2, Table A3 and Table A4 are the Mean, Best, and Std. (standard deviation), which assess the search capability, quality, and robustness of the algorithm, respectively. The highlighted value in each row of Table A2, Table A3 and Table A4 is the best in each pairwise comparison between the EAOA and the other algorithms. The symbols Win, Loss, and Draw at the end of each table provide a brief statistical summary. The proposed EAOA algorithm has the highest number of 'Wins', meaning that the EAOA produces better results than the other algorithms and has excellent optimization performance.
Figure A1 compares the convergence curves of the EAOA with the ESCA [40], IMFO [35], AOA [41], PPSO [29], WOA [36], and PBA [33] algorithms for the selected functions. The Y-axis represents the average over 25 runs of the best output found so far by each algorithm, and the X-axis shows the iteration of the search. The figure shows that the EAOA curve converges faster than those of the other algorithms.
Furthermore, for another view of the evaluation, we applied the Wilcoxon signed-rank test to rank the outcomes. This test compares the pairwise results between the EAOA and the other enhanced methods—e.g., the PBA, WOA, PPSO, AOA, IMFO, and ESCA algorithms. Table A5 lists the pairwise comparison results under the Wilcoxon signed-rank test; the bold-highlighted results are the outcomes with p < 0.05. Most values have p < 0.05, indicating that the optimization results of the EAOA differ significantly from those of the other algorithms. The average ranking value of the EAOA is 2.25204—the lowest—showing that its output is superior to that of the other algorithms. In general, the proposed EAOA can compete with some of the other popular algorithms.
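The pairwise comparison described here can be reproduced with SciPy's `wilcoxon`. The error values below are made-up placeholders for illustration, not data from the paper:

```python
from scipy.stats import wilcoxon

# Hypothetical per-function mean errors for the EAOA and the AOA over the
# same ten benchmark functions (made-up values, for illustration only).
eaoa_err = [1.2, 0.8, 3.1, 0.2, 5.4, 0.9, 2.2, 1.1, 0.5, 4.0]
aoa_err = [2.0, 1.5, 3.0, 0.9, 7.1, 1.8, 2.9, 1.6, 1.2, 5.5]

# Paired two-sided Wilcoxon signed-rank test on the per-function errors.
stat, p = wilcoxon(eaoa_err, aoa_err)
significant = p < 0.05  # the paper bolds results with p < 0.05
```

The test is paired by benchmark function, which matches the per-function comparison layout of Table A5.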

4. Optimal WSN Node Coverage Based on EAOA

This section demonstrates how the EAOA algorithm can be used to deploy a WSN with the best node coverage possible, followed by a subsection covering the majority of the processing stages, analysis, and discussion of the findings.

4.1. Optimal Node Coverage Strategy

The feasible solution to the optimal node coverage problem is the deployment of each node with a limited sensing radius, where each sensor can only sense and detect within its sensing radius. Assuming that the sensing radius of all nodes is the same, any point in the monitoring area is covered if it is located within the sensing radius of at least one sensor node. The monitoring area is divided into the coverage area and the blind spots: any point in the coverage area is covered by at least one sensor node, and the blind spots complement the coverage area. Some applications need to monitor events with high accuracy; any point in the coverage area must then lie within the sensing radius of at least M nodes simultaneously—otherwise, it is regarded as a blind spot—which we call M-coverage. The location-seeking process of the nodes is abstracted as the varied movement behaviors of the object group toward food or a specific site.
The purpose of WSN coverage optimization utilizing the EAOA approach is to maximize the coverage of the target monitoring area by using a limited number of sensor nodes and optimizing their deployment locations. Let F(x) be the objective function of the WSN node coverage optimization; the coverage ratio—the ratio of the coverage probability to the network's deployed 2D monitoring surface—is maximized. According to Equation (4), the objective is as follows:
F(x) = Maximize CovR = (Σ_{j=1}^{n} P(S, T_j)) / (W × L),
where CovR and P(S, T_j) are the coverage ratio of the WSN nodes and the probability that target point T_j is sensed within the W × L deployed area of the 2D monitoring network, respectively. Each individual object in the algorithm represents a coverage distribution, and the specific steps of the algorithm scheme for the coverage optimization are listed as follows:
  • Step 1: Input parameters such as a number of nodes M , perception radius R s , area of region W × L , etc.
  • Step 2: Set the parameters of population size N, the maximum number of iterations max_Iter, the density factor, and prey attraction, and randomly initialize the object’s positions using Equations (5)–(7).
  • Step 3: Enhance the initialized population using the parameters of Equations (8)–(10), (14), and (15), and calculate the objective function of the initial coverage according to Equation (18).
  • Step 4: Update the position of objects and the strategy according to Equation (17), and then compare them to select the best fitness value according to the objective function value.
  • Step 5: Calculate the fitness values of the individual objects and retain the global best solution.
  • Step 6: Determine whether the end condition is reached; if yes, proceed to the next step; otherwise, return to Step 4.
  • Step 7: The program ends and outputs the optimal fitness value and the objects’ best locations, which represent the optimal node coverage rate.
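Steps 1–7 can be sketched as a generic population loop. Since Equations (5)–(18) are not reproduced here, the AOA/EAOA position-update rules are replaced by a simple random perturbation; only the two enhancement ideas named in the paper (an opposition-based candidate and a multidirection-style choice between candidates) are illustrated, so this is a structural sketch, not the authors' algorithm.

```python
import random

def opposite(pos, low, high):
    """Opposition-based (reverse) learning: reflect a position across
    the centre of the search interval [low, high]."""
    return [low + high - x for x in pos]

def optimize(fitness, dim, low, high, pop_size=20, iters=100):
    """Generic maximization loop in the spirit of Steps 1-7: random
    initialization, candidate update, opposition-based candidate, and
    retention of the global best."""
    pop = [[random.uniform(low, high) for _ in range(dim)]
           for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(iters):
        for i, pos in enumerate(pop):
            # Placeholder update rule standing in for Equations (5)-(18).
            cand = [min(high, max(low, x + random.gauss(0.0, 0.1 * (high - low))))
                    for x in pos]
            opp = opposite(cand, low, high)
            cand = max((cand, opp), key=fitness)  # multidirection-style choice
            if fitness(cand) > fitness(pos):      # greedy selection (Step 4)
                pop[i] = cand
        best = max(best, max(pop, key=fitness), key=fitness)  # Step 5
    return best
```

Plugging a coverage-ratio fitness into `optimize` (with each position encoding node coordinates) mirrors the scheme above: the returned best position is the reported optimal node deployment.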

4.2. Analysis and Discussion of Results

The scenarios assume that the WSN’s sensor nodes are deployed in a square monitoring area of W × L; the scenario areas are set to 40 m × 40 m, 80 m × 80 m, 100 m × 100 m, and 160 m × 160 m. Table 2 lists the experimental parameters of the WSN node deployment areas. The sensing radius R s of the sensor nodes is set to 10 m, and the communication radius R c is set to 20 m. The number of sensor nodes, denoted by M, is 20, 40, 50, or 60. Iter indicates the number of iterations, which may be set to 500, 1000, or 1500.
The optimal results of the EAOA were compared with the other selected schemes—i.e., the SSA [44], PSO [45], GWO [46], SCA [47], and AOA [48]—for the coverage optimization of WSN node deployment to verify the adequate performance of the algorithm. Figure 1 displays a graphical diagram of the nodes’ initialization with the EAOA for the statistical coverage optimization scheme with different numbers of sensor nodes: (a) 20, (b) 40, (c) 50, and (d) 60.
Table 3 compares the proposed EAOA approach to the other strategies—i.e., the SSA, PSO, GWO, SCA, and AOA algorithms—in terms of percentage coverage rate, running time, convergence iterations, and monitoring area size. It can be seen that the EAOA scheme produces the best global solution in the coverage areas, with a high coverage rate, fuller coverage of the nodes’ area, and a faster runtime than the other approaches.
Figure 2 indicates the graphical coverage of six different metaheuristic algorithms—i.e., the AOA, SSA, PSO, GWO, SCA, and EAOA approaches—for the WSN node area deployment scenarios for optimal coverage rates, with the same density and environmental setting conditions. Because the EAOA algorithm can avoid premature convergence, its coverage rate is reasonably high, with less overlap, and it can better adjust the node configuration than the other competitors for the monitoring area’s network coverage. The differences in the coverage distributions are so small that the plots look nearly identical. Furthermore, Figure 3 and Figure 4 show that the convergence curves of the proposed EAOA approach reach higher percentages of statistical coverage than the other methods used.
Figure 3 shows four different sizes of WSN monitoring node area deployment scenarios for the metaheuristic approaches’ optimal coverage rates; the convergence curves of the proposed EAOA approach provide higher percentages of statistical coverage than the other methods used.
Figure 4 shows the coverage rate of the EAOA compared against the SSA, PSO, GWO, SCA, and AOA algorithms for statistical sensor node count deployment in the 2D monitoring of different areas. It can be seen that the EAOA achieves a reasonably high coverage rate in the monitoring area’s network coverage, with less overlap and a better sensor node configuration than the average coverage rate under the same test conditions.

5. Conclusions

This paper suggests an enhanced Archimedes optimization algorithm (EAOA) to solve the wireless sensor network (WSN) nodes’ uneven distribution and low coverage issues in random deployment. Each divided sub-area of the monitoring area of the entire WSN was subjected to node coverage optimization based on the EAOA. The objective function of the optimal node coverage was modeled mathematically by calculating the distances between nodes from each sensor node’s sensing radius and its communication capability in the deployed WSN. The optimization results of multiple sub-areas were fused, combining the sub-areas’ coverage into the complete network node coverage via a mapping mechanism. The update equations of the EAOA were modified with reverse learning and multidirection strategies to avoid the original drawbacks of the AOA, e.g., slow convergence speed and a tendency to fall into local extrema when dealing with complicated situations. The compared results on the selected benchmark functions and the WSN node coverage show that the proposed EAOA obtains effective optimal solutions for both the coverage and benchmark problems. The suggested algorithm will be applied in future works to address WSN node localization [49,50] and optimal WSN deployment [51,52].

Author Contributions

Conceptualization, T.-K.D. and V.-T.N.; methodology, T.-T.N.; software, T.-T.N.; validation, T.-K.D., V.-T.N. and T.-T.N.; formal analysis, T.-T.N.; investigation, S.-C.C.; resources, T.-K.D.; data curation, S.-C.C.; writing—original draft preparation, T.-D.N.; writing—review and editing, T.-D.N.; visualization, V.-T.N.; supervision, T.-T.N.; project administration, V.-T.N.; funding acquisition, V.-T.N. All authors have read and agreed to the published version of the manuscript.

Funding

This study was partially supported by the VNUHCM-University of Information Technology’s Scientific Research Support Fund.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Verifying the impact of the suggested techniques used in the EAOA in comparison with the original AOA algorithm.
| Fun | Original: AOA Mean | CPU Runtime (s) | Suggested Strategy 01: Multidirection Mean | CPU Runtime (s) | Suggested Strategy 02: Opposite Learning Mean | CPU Runtime (s) | Suggested Strategies 01 and 02: EAOA Mean | CPU Runtime (s) |
|---|---|---|---|---|---|---|---|---|
| f1 | 2.95 × 10−1 | 37.93 | 1.86 × 10−1 | 36.10 | 1.91 × 10−1 | 34.30 | 1.71 × 10−1 | 38.52 |
| f2 | 2.71 × 10+1 | 32.76 | 1.94 × 10+1 | 34.02 | 1.61 × 10+1 | 34.12 | 1.65 × 10+1 | 38.32 |
| f3 | 3.66 × 10−1 | 45.34 | 2.58 × 10−1 | 47.09 | 6.52 × 10−2 | 47.23 | 2.44 × 10−1 | 53.04 |
| f4 | 3.02 × 10−1 | 44.16 | 1.45 × 10−1 | 45.86 | 4.59 × 10−2 | 46.00 | 1.27 × 10−2 | 52.12 |
| f5 | 7.99 × 10−2 | 40.32 | 7.84 × 10−3 | 42.81 | 1.38 × 10−2 | 42.00 | 5.38 × 10−3 | 48.03 |
| f6 | 5.58 × 10−1 | 85.44 | 2.11 × 10−1 | 88.73 | 6.21 × 10−2 | 89.00 | 1.92 × 10−1 | 98.89 |
| f7 | 2.21 × 10−1 | 203.52 | 1.07 × 10−1 | 221.31 | 2.44 × 10−1 | 212.10 | 1.26 × 10−1 | 237.18 |
| f8 | 6.32 × 100 | 117.12 | 6.61 × 10−1 | 121.41 | 1.97 × 100 | 122.00 | 7.25 × 10−1 | 136.23 |
| f9 | 7.20 × 100 | 229.60 | 4.82 × 100 | 234.61 | 4.25 × 100 | 235.01 | 4.36 × 100 | 251.72 |
| f10 | 2.25 × 100 | 224.61 | 2.63 × 10−1 | 233.42 | 2.05 × 10−1 | 234.10 | 2.10 × 10−2 | 263.69 |
| f11 | 4.95 × 10+3 | 274.65 | 1.77 × 10+3 | 275.31 | 8.09 × 10+3 | 278.01 | 1.06 × 10+3 | 278.59 |
| f12 | 1.66 × 10+2 | 229.44 | 3.65 × 10+1 | 238.28 | 8.09 × 10+1 | 239.00 | 2.29 × 10+1 | 268.40 |
| f13 | 3.58 × 10+1 | 120.01 | 2.87 × 100 | 124.61 | 3.30 × 10+1 | 125.10 | 1.53 × 100 | 140.28 |
| f14 | 2.96 × 10+1 | 96.26 | 1.62 × 100 | 100.71 | 1.09 × 10+1 | 101.10 | 1.26 × 100 | 113.41 |
| f15 | 2.05 × 100 | 221.76 | 7.88 × 10−1 | 231.31 | 4.74 × 10−1 | 231.10 | 7.27 × 10−1 | 259.42 |
| f16 | 4.73 × 10−1 | 126.72 | 1.85 × 10−1 | 131.61 | 2.59 × 10−1 | 132.01 | 1.30 × 10−1 | 148.34 |
| f17 | 4.04 × 10+2 | 223.69 | 5.63 × 10+1 | 232.31 | 5.53 × 10+2 | 233.10 | 7.90 × 10+1 | 262.71 |
| f18 | 2.49 × 10+2 | 100.81 | 3.70 × 10−1 | 104.35 | 1.46 × 10+1 | 105.10 | 1.09 × 100 | 117.92 |
| f19 | 4.06 × 10−1 | 206.40 | 3.24 × 10−1 | 214.36 | 3.79 × 10−1 | 215.00 | 3.86 × 10−1 | 241.45 |
| f20 | 5.87 × 10−1 | 298.56 | 4.11 × 10−1 | 310.07 | 4.34 × 10−2 | 311.00 | 3.98 × 10−2 | 349.25 |
| f21 | 6.51 × 10−1 | 327.36 | 2.25 × 10−1 | 339.98 | 8.29 × 10−2 | 341.00 | 2.10 × 10−1 | 384.15 |
| f22 | 8.94 × 10−1 | 312.96 | 6.22 × 10−1 | 325.76 | 7.03 × 10−1 | 326.00 | 6.34 × 10−1 | 367.09 |
| f23 | 1.02 × 100 | 303.36 | 7.63 × 10−1 | 315.05 | 5.72 × 10−2 | 316.00 | 7.59 × 10−2 | 354.87 |
| f24 | 7.38 × 10−1 | 282.24 | 6.63 × 10−1 | 294.32 | 4.75 × 10−1 | 294.00 | 4.12 × 10−1 | 331.25 |
| f25 | 3.28 × 100 | 206.40 | 7.26 × 10−1 | 215.36 | 1.51 × 100 | 215.00 | 7.74 × 10−1 | 243.15 |
| f26 | 8.53 × 10−1 | 253.44 | 8.03 × 10−1 | 263.22 | 2.78 × 10−2 | 264.00 | 7.78 × 10−1 | 297.17 |
| f27 | 7.28 × 10−1 | 273.44 | 7.74 × 10−1 | 265.45 | 5.19 × 10−1 | 284.00 | 7.41 × 10−2 | 295.92 |
| f28 | 2.37 × 100 | 225.60 | 1.09 × 100 | 234.30 | 3.34 × 10−1 | 235.00 | 9.39 × 10−1 | 263.91 |
| f29 | 2.15 × 10+3 | 221.76 | 8.37 × 10+1 | 230.31 | 3.14 × 10+2 | 231.12 | 4.67 × 10+1 | 259.82 |
| Avg. | 1.72 × 10−1 | 198.01 | 6.88 × 10−1 | 199.91 | 3.15 × 10−1 | 199.47 | 4.45 × 10−2 | 219.12 |
The bold data values in each row of the Table are the best ones in each pair compared with the EAOA approach.
Table A2. The performance presentation of the EAOA, SA, and GA for the CEC 2017 test suite with each paired comparison.
| Funs | GA Mean | GA Best | GA Std. | SA Mean | SA Best | SA Std. | EAOA Mean | EAOA Best | EAOA Std. |
|---|---|---|---|---|---|---|---|---|---|
| f1 | 5.66 × 10−5 | 1.46 × 10−5 | 5.19 × 10−5 | 7.16 × 10−5 | 1.25 × 10−5 | 2.38 × 10−5 | 1.29 × 10−5 | 2.71 × 10−5 | 1.11 × 10−5 |
| f2 | 3.72 × 10−1 | 1.54 × 10−1 | 1.01 × 10−1 | 3.78 × 10+1 | 2.21 × 10+1 | 9.86 × 10+1 | 3.57 × 10−1 | 1.85 × 10−1 | 1.11 × 10−1 |
| f3 | 2.57 × 10−1 | 1.58 × 10−1 | 5.11 × 10−2 | 4.92 × 10−1 | 2.66 × 10−1 | 1.28 × 10−1 | 2.33 × 10−1 | 1.44 × 10−1 | 5.52 × 10−2 |
| f4 | 2.31 × 10−1 | 1.45 × 10−1 | 4.84 × 10−2 | 4.58 × 10−1 | 3.02 × 10−1 | 4.34 × 10−2 | 1.91 × 10−1 | 1.11 × 10−1 | 4.59 × 10−2 |
| f5 | 3.90 × 10−2 | 7.86 × 10−3 | 1.68 × 10−2 | 1.12 × 10−2 | 7.99 × 10−2 | 1.63 × 10−2 | 2.57 × 10−2 | 5.38 × 10−3 | 1.38 × 10−2 |
| f6 | 3.28 × 10−1 | 2.11 × 10−1 | 7.81 × 10−2 | 8.30 × 10−1 | 5.58 × 10−1 | 1.26 × 10−1 | 2.68 × 10−1 | 1.92 × 10−1 | 6.21 × 10−2 |
| f7 | 1.95 × 10−1 | 1.07 × 10−1 | 3.69 × 10−2 | 3.59 × 10−1 | 2.21 × 10−1 | 8.33 × 10−2 | 1.66 × 10−1 | 1.26 × 10−1 | 2.44 × 10−2 |
| f8 | 3.43 × 100 | 1.21 × 10+1 | 1.64 × 100 | 1.32 × 100 | 6.32 × 100 | 4.29 × 100 | 3.23 × 100 | 7.17 × 10−1 | 1.97 × 100 |
| f9 | 6.43 × 100 | 4.81 × 100 | 1.19 × 100 | 8.79 × 100 | 7.20 × 100 | 1.09 × 100 | 6.53 × 100 | 4.36 × 100 | 1.25 × 100 |
| f10 | 3.99 × 10−1 | 2.03 × 10−1 | 1.03 × 10−1 | 5.33 × 100 | 1.20 × 100 | 4.24 × 100 | 3.81 × 10−1 | 2.10 × 10−1 | 1.01 × 10−1 |
| f11 | 1.02 × 10+4 | 1.77 × 10+3 | 9.36 × 10+3 | 2.81 × 10+5 | 4.45 × 10+4 | 1.90 × 10+5 | 7.58 × 10+3 | 1.06 × 10+3 | 8.09 × 10+3 |
| f12 | 9.53 × 10+2 | 3.65 × 10+1 | 1.07 × 10+2 | 9.30 × 10+2 | 1.66 × 10+2 | 1.10 × 10+3 | 9.47 × 10+1 | 2.29 × 10+1 | 8.09 × 10+1 |
| f13 | 3.62 × 10+1 | 9.87 × 100 | 3.51 × 10+1 | 9.11 × 10+3 | 9.80 × 100 | 2.89 × 10+2 | 9.23 × 10+1 | 9.83 × 100 | 3.30 × 10+1 |
| f14 | 1.21 × 10+1 | 1.62 × 100 | 7.68 × 100 | 1.66 × 10+2 | 2.96 × 10+1 | 1.15 × 10+2 | 1.05 × 10+1 | 1.26 × 100 | 1.09 × 10+1 |
| f15 | 1.57 × 100 | 7.88 × 10−1 | 4.83 × 10−1 | 3.69 × 100 | 2.05 × 100 | 4.63 × 10−1 | 1.66 × 100 | 7.27 × 10−1 | 4.74 × 10−1 |
| f16 | 5.97 × 10−1 | 1.85 × 10−1 | 2.70 × 10−1 | 1.30 × 100 | 4.73 × 10−1 | 3.87 × 10−1 | 5.77 × 10−1 | 1.30 × 10−1 | 2.59 × 10−1 |
| f17 | 6.05 × 10+2 | 5.63 × 10+1 | 7.15 × 10+2 | 1.01 × 10+4 | 4.04 × 10+2 | 1.60 × 10+4 | 4.81 × 10+2 | 7.90 × 10+1 | 5.53 × 10+2 |
| f18 | 9.73 × 100 | 3.70 × 10−1 | 1.74 × 10+1 | 2.11 × 10+4 | 2.49 × 10+2 | 1.87 × 10+4 | 1.10 × 10+1 | 1.09 × 100 | 1.46 × 10+1 |
| f19 | 7.95 × 10−1 | 3.24 × 10−1 | 3.13 × 10−1 | 1.29 × 10−1 | 4.06 × 10−1 | 3.70 × 10−1 | 7.97 × 10−1 | 3.86 × 10−1 | 2.79 × 10−1 |
| f20 | 4.87 × 10−1 | 4.11 × 10−1 | 3.48 × 10−2 | 7.39 × 10−1 | 5.87 × 10−1 | 8.77 × 10−2 | 4.80 × 10−1 | 3.98 × 10−1 | 4.34 × 10−2 |
| f21 | 3.46 × 10−1 | 2.25 × 10−1 | 6.95 × 10−2 | 8.38 × 100 | 6.51 × 10−1 | 2.32 × 100 | 2.95 × 10−1 | 2.10 × 10−1 | 8.29 × 10−2 |
| f22 | 7.69 × 10−1 | 6.22 × 10−1 | 5.70 × 10−2 | 1.21 × 100 | 9.94 × 10−1 | 1.04 × 10−1 | 7.46 × 10−1 | 6.79 × 10−1 | 4.63 × 10−2 |
| f23 | 8.64 × 10−1 | 7.63 × 10−1 | 6.44 × 10−2 | 1.26 × 100 | 1.02 × 10−1 | 1.34 × 10−1 | 8.49 × 10−1 | 7.59 × 10−1 | 5.72 × 10−2 |
| f24 | 7.34 × 10−1 | 6.63 × 10−1 | 3.85 × 10−2 | 8.77 × 10−1 | 7.38 × 10−1 | 7.48 × 10−2 | 7.00 × 10−1 | 6.12 × 10−1 | 4.75 × 10−2 |
| f25 | 2.79 × 100 | 7.26 × 10−1 | 1.62 × 100 | 8.04 × 100 | 3.28 × 100 | 1.63 × 100 | 3.45 × 100 | 7.74 × 10−1 | 1.51 × 100 |
| f26 | 8.44 × 10−1 | 8.03 × 10−1 | 2.31 × 10−2 | 1.13 × 100 | 8.53 × 10−1 | 1.82 × 10−1 | 8.29 × 10−1 | 7.78 × 10−1 | 2.78 × 10−2 |
| f27 | 8.55 × 10−1 | 7.74 × 10−1 | 5.76 × 10−2 | 1.02 × 100 | 8.28 × 10−1 | 1.38 × 10−1 | 8.17 × 10−1 | 7.41 × 10−1 | 5.19 × 10−2 |
| f28 | 1.74 × 100 | 1.09 × 100 | 3.76 × 10−1 | 3.66 × 100 | 2.37 × 10−1 | 6.31 × 10−1 | 1.49 × 100 | 9.39 × 10−1 | 3.34 × 10−1 |
| f29 | 1.12 × 10+3 | 8.37 × 10+1 | 1.16 × 10+3 | 4.33 × 10+4 | 4.15 × 10+3 | 3.70 × 10+4 | 3.55 × 10+2 | 4.89 × 10+1 | 3.14 × 10+2 |
| Win | 5 | 9 | 7 | 6 | 5 | 5 | 20 | 18 | 19 |
| Lose | 21 | 18 | 20 | 22 | 22 | 22 | 9 | 11 | 10 |
| Draw | 3 | 4 | 4 | 3 | 2 | 4 | 0 | 0 | 0 |
The bold data values in each row of the Table are the best ones in each pair compared with the EAOA approach.
Table A3. The performance presentation of the EAOA, FPA, and PSO for the CEC 2017 test suite with each paired comparison.
| Funs | FPA Mean | FPA Best | FPA Std. | PSO Mean | PSO Best | PSO Std. | EAOA Mean | EAOA Best | EAOA Std. |
|---|---|---|---|---|---|---|---|---|---|
| f1 | 2.24 × 10−2 | 1.19 × 10−2 | 5.79 × 10−2 | 4.37 × 10−2 | 2.25 × 10−2 | 1.36 × 10−2 | 1.34 × 10−3 | 2.83 × 10−2 | 1.16 × 10−2 |
| f2 | 1.15 × 100 | 7.23 × 10−1 | 2.60 × 10−1 | 8.74 × 10−1 | 5.18 × 10−1 | 6.01 × 10−1 | 7.58 × 10−1 | 3.92 × 10−1 | 4.36 × 10−1 |
| f3 | 2.11 × 10−1 | 1.43 × 10−1 | 3.47 × 10−1 | 2.34 × 10−1 | 1.21 × 10−1 | 3.40 × 10−1 | 2.43 × 10−1 | 1.50 × 10−1 | 5.77 × 10−2 |
| f4 | 3.45 × 10−1 | 2.40 × 10−1 | 5.46 × 10−2 | 2.55 × 10−1 | 1.63 × 10−1 | 3.90 × 10−2 | 2.00 × 10−1 | 1.16 × 10−1 | 4.80 × 10−2 |
| f5 | 6.68 × 10−2 | 2.54 × 10−2 | 1.98 × 10−2 | 7.62 × 10−2 | 4.46 × 10−2 | 1.20 × 10−2 | 2.68 × 10−2 | 5.63 × 10−3 | 1.44 × 10−2 |
| f6 | 4.30 × 10−1 | 3.80 × 10−1 | 2.70 × 10−2 | 3.26 × 10−1 | 2.50 × 10−1 | 4.61 × 10−2 | 2.81 × 10−1 | 2.01 × 10−1 | 6.49 × 10−2 |
| f7 | 2.59 × 10−1 | 1.96 × 10−1 | 3.70 × 10−2 | 1.98 × 10−1 | 1.36 × 10−1 | 2.82 × 10−2 | 1.73 × 10−1 | 1.32 × 10−1 | 1.25 × 10−1 |
| f8 | 4.69 × 100 | 1.11 × 10−1 | 2.95 × 100 | 4.42 × 100 | 1.83 × 100 | 1.59 × 100 | 3.37 × 100 | 7.49 × 10−1 | 2.06 × 100 |
| f9 | 9.68 × 100 | 7.43 × 100 | 1.17 × 100 | 7.17 × 100 | 4.82 × 100 | 5.09 × 100 | 6.82 × 100 | 4.56 × 100 | 1.30 × 100 |
| f10 | 4.07 × 10−1 | 2.77 × 10−1 | 6.08 × 10−2 | 3.14 × 10−1 | 2.25 × 10−1 | 4.61 × 10−2 | 3.98 × 10−1 | 2.19 × 10−1 | 1.06 × 10−1 |
| f11 | 3.01 × 10+1 | 3.25 × 10+1 | 3.26 × 10+1 | 3.13 × 10+1 | 3.15 × 10+1 | 3.15 × 10+1 | 3.40 × 10+1 | 3.11 × 10+1 | 3.10 × 10+1 |
| f12 | 7.20 × 10+2 | 3.72 × 10+2 | 8.30 × 10+2 | 1.11 × 10+2 | 4.52 × 10+1 | 4.62 × 10+1 | 9.90 × 10+1 | 2.39 × 10+1 | 8.46 × 10+1 |
| f13 | 7.34 × 10+1 | 8.89 × 100 | 6.28 × 10+1 | 2.59 × 10+1 | 5.45 × 10−1 | 2.66 × 10+1 | 3.06 × 10+1 | 1.59 × 100 | 3.44 × 10+1 |
| f14 | 3.03 × 10+2 | 8.32 × 10+1 | 2.17 × 10+2 | 4.57 × 10+1 | 1.92 × 10+1 | 2.71 × 10+1 | 1.10 × 10+1 | 1.32 × 100 | 1.14 × 10+1 |
| f15 | 2.01 × 100 | 1.09 × 100 | 3.70 × 10−1 | 2.01 × 100 | 1.26 × 100 | 4.51 × 10−1 | 1.73 × 100 | 7.59 × 10−1 | 4.95 × 10−1 |
| f16 | 7.50 × 10−1 | 2.41 × 10−1 | 2.38 × 10−1 | 6.40 × 10−1 | 1.93 × 10−1 | 2.80 × 10−1 | 6.03 × 10−1 | 1.36 × 10−1 | 2.70 × 10−1 |
| f17 | 9.05 × 10+1 | 5.47 × 10+1 | 1.23 × 10+2 | 9.08 × 10+1 | 1.09 × 10+1 | 8.55 × 10+1 | 1.02 × 10+2 | 1.68 × 10+1 | 1.17 × 10+2 |
| f18 | 1.50 × 10+2 | 4.30 × 10+1 | 1.32 × 10+2 | 2.42 × 10+1 | 1.64 × 100 | 2.35 × 10+1 | 1.38 × 100 | 1.37 × 10−1 | 1.84 × 100 |
| f19 | 7.85 × 10−1 | 4.36 × 10−1 | 2.39 × 10−1 | 7.69 × 10−1 | 5.01 × 10−1 | 1.78 × 10−1 | 8.33 × 10−1 | 4.03 × 10−1 | 2.91 × 10−1 |
| f20 | 6.30 × 10−1 | 5.39 × 10−1 | 4.59 × 10−2 | 5.71 × 10−1 | 4.96 × 10−1 | 4.14 × 10−2 | 5.02 × 10−1 | 4.16 × 10−1 | 4.53 × 10−2 |
| f21 | 1.45 × 100 | 2.33 × 10−1 | 3.18 × 100 | 7.42 × 10−1 | 2.04 × 10−1 | 2.02 × 100 | 3.08 × 10−1 | 2.20 × 10−1 | 8.66 × 10−2 |
| f22 | 1.03 × 100 | 7.09 × 10−1 | 9.12 × 10−1 | 9.97 × 10−1 | 8.73 × 10−1 | 9.76 × 10−1 | 7.80 × 10−1 | 7.10 × 10−1 | 4.83 × 10−2 |
| f23 | 1.09 × 100 | 9.27 × 10−1 | 7.47 × 10−2 | 1.05 × 100 | 8.69 × 10−1 | 8.62 × 10−2 | 8.87 × 10−1 | 7.93 × 10−1 | 5.98 × 10−2 |
| f24 | 7.17 × 10−1 | 6.52 × 10−1 | 3.44 × 10−2 | 7.17 × 10−1 | 6.61 × 10−1 | 3.69 × 10−2 | 7.32 × 10−1 | 6.39 × 10−1 | 4.96 × 10−2 |
| f25 | 3.91 × 100 | 6.47 × 10−1 | 2.82 × 100 | 3.26 × 100 | 5.36 × 10−1 | 2.80 × 100 | 3.60 × 100 | 8.08 × 10−1 | 1.57 × 100 |
| f26 | 9.56 × 10−1 | 8.58 × 10−1 | 8.74 × 10−2 | 9.93 × 10−1 | 8.44 × 10−1 | 7.07 × 10−2 | 8.66 × 10−1 | 8.12 × 10−1 | 2.90 × 10−2 |
| f27 | 7.98 × 10−1 | 7.24 × 10−1 | 3.70 × 10−2 | 7.86 × 10−1 | 6.96 × 10−1 | 4.20 × 10−2 | 8.54 × 10−1 | 7.74 × 10−1 | 5.43 × 10−2 |
| f28 | 2.03 × 100 | 1.31 × 100 | 4.27 × 10−1 | 2.38 × 100 | 1.47 × 100 | 4.32 × 10−1 | 1.56 × 100 | 9.81 × 10−1 | 3.49 × 10−1 |
| f29 | 5.47 × 10+3 | 1.89 × 10+3 | 2.54 × 10+3 | 2.89 × 10+3 | 4.93 × 10+2 | 1.89 × 10+3 | 3.71 × 10+2 | 1.01 × 10+2 | 3.28 × 10+2 |
| Win | 5 | 5 | 6 | 7 | 7 | 10 | 18 | 18 | 13 |
| Lose | 23 | 23 | 21 | 21 | 21 | 12 | 11 | 10 | 16 |
| Draw | 3 | 3 | 2 | 1 | 1 | 1 | 0 | 1 | 0 |
The bold data values in each row of the Table are the best ones in each pair compared with the EAOA approach.
Table A4. The performance presentation of the EAOA, MFO, and SCA for the CEC 2017 test suite with each paired comparison.
| Funs | MFO Mean | MFO Best | MFO Std. | SCA Mean | SCA Best | SCA Std. | EAOA Mean | EAOA Best | EAOA Std. |
|---|---|---|---|---|---|---|---|---|---|
| f1 | 4.60 × 10−1 | 2.99 × 10−1 | 7.04 × 10−1 | 2.41 × 10−1 | 1.23 × 10−1 | 4.80 × 10−1 | 1.34 × 10−1 | 2.83 × 10−1 | 1.16 × 10−1 |
| f2 | 2.33 × 10+2 | 9.80 × 10+1 | 1.30 × 10+2 | 1.41 × 10+1 | 9.03 × 10+1 | 2.65 × 10+1 | 3.73 × 10+1 | 1.93 × 10+1 | 1.16 × 10+1 |
| f3 | 6.57 × 100 | 4.83 × 100 | 1.17 × 100 | 2.97 × 10−1 | 1.36 × 10−1 | 8.95 × 10−1 | 2.43 × 10−1 | 1.50 × 10−1 | 5.77 × 10−2 |
| f4 | 6.23 × 10−1 | 5.46 × 10−1 | 4.52 × 10−2 | 5.24 × 10−1 | 4.54 × 10−1 | 3.69 × 10−2 | 2.00 × 10−1 | 1.16 × 10−1 | 4.80 × 10−2 |
| f5 | 1.38 × 10−1 | 1.21 × 10−1 | 1.43 × 10−2 | 6.61 × 10−2 | 6.50 × 10−2 | 1.54 × 10−2 | 6.68 × 10−2 | 5.63 × 10−3 | 1.44 × 10−2 |
| f6 | 1.79 × 100 | 1.34 × 10−1 | 1.66 × 100 | 9.48 × 10−1 | 7.89 × 10−1 | 9.52 × 10−2 | 2.81 × 10−1 | 2.01 × 10−1 | 6.49 × 10−2 |
| f7 | 5.83 × 10−1 | 5.16 × 10−1 | 2.83 × 10−2 | 4.86 × 10−1 | 3.81 × 10−1 | 2.13 × 10−2 | 1.73 × 10−1 | 1.32 × 10−1 | 2.54 × 10−2 |
| f8 | 2.38 × 10−1 | 1.78 × 100 | 3.28 × 100 | 1.17 × 10+1 | 5.84 × 100 | 3.05 × 100 | 3.37 × 100 | 7.49 × 10−1 | 2.06 × 100 |
| f9 | 9.99 × 100 | 9.31 × 100 | 3.36 × 10−1 | 1.23 × 10+1 | 1.05 × 10+1 | 6.06 × 10−1 | 6.82 × 100 | 4.56 × 100 | 1.30 × 100 |
| f10 | 7.53 × 100 | 3.80 × 100 | 1.73 × 100 | 3.00 × 10−1 | 1.14 × 10−1 | 9.63 × 10−1 | 3.98 × 10−1 | 2.19 × 10−1 | 1.06 × 10−1 |
| f11 | 2.90 × 10+6 | 1.62 × 10+6 | 5.79 × 10+5 | 1.78 × 10+6 | 9.66 × 10+5 | 5.17 × 10+5 | 7.92 × 10+5 | 1.10 × 10+5 | 8.45 × 10+5 |
| f12 | 7.12 × 10+5 | 3.18 × 10+5 | 2.50 × 10+5 | 3.12 × 10+5 | 8.18 × 10+4 | 2.17 × 10+5 | 9.90 × 10+4 | 8.39 × 10+4 | 8.46 × 10+4 |
| f13 | 2.14 × 10+2 | 2.04 × 10+1 | 9.37 × 10+1 | 7.63 × 10+2 | 2.93 × 10+1 | 5.99 × 10+2 | 3.06 × 10+1 | 2.59 × 10+1 | 3.44 × 10+1 |
| f14 | 1.93 × 10+4 | 1.54 × 10+3 | 9.89 × 10+3 | 7.40 × 10+3 | 9.60 × 10+2 | 4.59 × 10+3 | 1.10 × 10+1 | 1.32 × 100 | 1.14 × 10+1 |
| f15 | 3.58 × 100 | 3.14 × 10−1 | 2.28 × 100 | 3.76 × 10−1 | 2.59 × 100 | 4.23 × 100 | 1.73 × 100 | 7.59 × 10−1 | 4.95 × 10−1 |
| f16 | 1.42 × 100 | 1.09 × 100 | 1.39 × 10−1 | 1.43 × 100 | 6.64 × 10−1 | 3.42 × 10−1 | 6.03 × 10−1 | 1.36 × 10−1 | 2.70 × 10−1 |
| f17 | 3.40 × 10+2 | 8.93 × 10+2 | 1.31 × 10+3 | 9.63 × 10+3 | 1.88 × 10+3 | 7.49 × 10+3 | 5.03 × 10+2 | 8.25 × 10+1 | 5.78 × 10+2 |
| f18 | 5.66 × 10+4 | 2.32 × 10+4 | 2.43 × 10+4 | 1.32 × 10+4 | 3.22 × 10+3 | 7.60 × 10+3 | 1.15 × 10+1 | 1.14 × 100 | 1.53 × 10+1 |
| f19 | 1.17 × 100 | 9.15 × 10−1 | 1.15 × 10−1 | 1.17 × 100 | 6.21 × 10−1 | 2.66 × 10−1 | 8.33 × 10−1 | 4.03 × 10−1 | 2.91 × 10−1 |
| f20 | 8.92 × 10−2 | 8.28 × 10−2 | 9.98 × 10−1 | 8.06 × 10−1 | 7.10 × 10−1 | 4.72 × 10−2 | 5.02 × 10−1 | 4.16 × 10−2 | 4.53 × 10−2 |
| f21 | 9.34 × 100 | 7.19 × 100 | 1.04 × 100 | 3.19 × 100 | 1.95 × 10−1 | 5.98 × 10−1 | 3.08 × 10−1 | 2.20 × 10−1 | 8.66 × 10−2 |
| f22 | 1.28 × 100 | 1.22 × 10−1 | 4.01 × 10−1 | 1.14 × 100 | 1.05 × 100 | 4.84 × 10−2 | 7.80 × 10−1 | 7.10 × 10−1 | 4.83 × 10−1 |
| f23 | 1.38 × 100 | 1.27 × 100 | 6.06 × 10−2 | 1.26 × 100 | 1.13 × 100 | 5.20 × 10−2 | 8.87 × 10−1 | 7.93 × 10−1 | 5.98 × 10−2 |
| f24 | 3.78 × 100 | 2.27 × 100 | 5.41 × 10−1 | 1.72 × 10−1 | 1.28 × 10−1 | 2.58 × 10−1 | 7.32 × 10−1 | 6.39 × 10−2 | 4.96 × 10−1 |
| f25 | 8.23 × 100 | 5.94 × 100 | 9.24 × 10−1 | 6.57 × 100 | 3.22 × 100 | 1.84 × 100 | 3.60 × 100 | 8.08 × 10−1 | 1.57 × 100 |
| f26 | 1.20 × 100 | 1.13 × 100 | 3.57 × 10−1 | 1.20 × 10−1 | 8.15 × 10−1 | 1.74 × 10−1 | 8.66 × 10−1 | 8.12 × 10−2 | 2.90 × 10−1 |
| f27 | 3.39 × 100 | 2.15 × 10−1 | 6.11 × 10−1 | 2.06 × 100 | 4.31 × 10−1 | 8.16 × 10−1 | 8.54 × 10−1 | 7.74 × 10−1 | 5.43 × 10−1 |
| f28 | 3.20 × 100 | 2.76 × 100 | 2.16 × 10−1 | 3.17 × 100 | 1.92 × 100 | 4.95 × 10−1 | 1.56 × 100 | 9.81 × 10−1 | 3.49 × 10−1 |
| f29 | 8.17 × 10+4 | 5.16 × 10+4 | 2.12 × 10+4 | 4.90 × 10+4 | 1.70 × 10+4 | 2.05 × 10+4 | 3.71 × 10+2 | 1.51 × 10+2 | 3.28 × 10+2 |
| Win | 4 | 5 | 6 | 7 | 6 | 6 | 21 | 20 | 17 |
| Lose | 23 | 21 | 21 | 21 | 22 | 23 | 8 | 9 | 14 |
| Draw | 2 | 3 | 2 | 1 | 1 | 0 | 0 | 0 | 0 |
The bold data values in each row of the Table are the best ones in each pair compared with the EAOA approach.
Table A5. Wilcoxon signed-rank results of the test pairs of the pairwise algorithms’ results between the EAOA and other algorithms, i.e., PBA [33], WOA [36], PPSO [29], AOA [41], IFMO [35], and ESCA [40].
| Funs | PBA [33] | WOA [36] | PPSO [29] | AOA [41] | IFMO [35] | ESCA [40] | EAOA-Itself |
|---|---|---|---|---|---|---|---|
| f1 | 2.4018 × 10−7 | 1.4018 × 10−11 | 1.7018 × 10−11 | 1.1205 × 10−5 | 6.9641 × 10−8 | 2.5668 × 10−7 | ~N/A |
| f2 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 2.2080 × 10−7 | 7.1665 × 10−3 | 1.3749 × 10−2 | ~N/A |
| f3 | 1.4018 × 10−11 | 1.4018 × 10−11 | 8.5710 × 10−11 | 1.8717 × 10−2 | 1.8717 × 10−2 | 4.6578 × 10−3 | ~N/A |
| f4 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 4.8753 × 10−11 | 1.5456 × 10−2 | 2.7237 × 10−5 | ~N/A |
| f5 | 1.4018 × 10−11 | 1.5447 × 10−11 | 1.4018 × 10−11 | 1.8376 × 10−9 | 5.3326 × 10−5 | 4.0332 × 10−11 | ~N/A |
| f6 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 4.4659 × 10−10 | 6.5678 × 10−4 | 9.8637 × 10−4 | ~N/A |
| f7 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.7018 × 10−11 | 9.4096 × 10−11 | 1.6922 × 10−3 | 6.2370 × 10−4 | ~N/A |
| f8 | 1.4018 × 10−11 | 3.6674 × 10−11 | 1.8745 × 10−11 | 8.0856 × 10−2 | 1.0244 × 10−1 | 1.2706 × 10−2 | ~N/A |
| f9 | 4.8753 × 10−11 | 1.4018 × 10−11 | 1.2873 × 10−8 | 2.0041 × 10−9 | 3.9045 × 10−1 | 2.6004 × 10−1 | ~N/A |
| f10 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 5.8296 × 10−1 | 9.7754 × 10−1 | 1.5366 × 10−3 | ~N/A |
| f11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.5439 × 10−9 | 1.4703 × 10−1 | 3.1620 × 10−6 | ~N/A |
| f12 | 1.4018 × 10−11 | 1.4018 × 10−11 | 6.4699 × 10−11 | 1.5447 × 10−11 | 1.3749 × 10−2 | 4.1212 × 10−2 | ~N/A |
| f13 | 7.1071 × 10−11 | 7.8055 × 10−11 | 7.1071 × 10−11 | 1.2847 × 10−4 | 8.7693 × 10−1 | 8.2178 × 10−1 | ~N/A |
| f14 | 1.4018 × 10−11 | 1.4018 × 10−11 | 2.5021 × 10−11 | 1.4018 × 10−11 | 3.7194 × 10−2 | 3.6588 × 10−9 | ~N/A |
| f15 | 1.4018 × 10−11 | 1.4018 × 10−11 | 4.0332 × 10−11 | 2.9096 × 10−2 | 7.2487 × 10−1 | 7.3779 × 10−2 | ~N/A |
| f16 | 1.4018 × 10−11 | 3.7291 × 10−10 | 6.1854 × 10−1 | 2.7082 × 10−2 | 7.3779 × 10−2 | 6.3217 × 10−1 | ~N/A |
| f17 | 7.1071 × 10−11 | 1.8745 × 10−11 | 1.6408 × 10−10 | 1.4729 × 10−6 | 9.8877 × 10−1 | 7.2487 × 10−1 | ~N/A |
| f18 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 6.1228 × 10−1 | 6.4699 × 10−11 | ~N/A |
| f19 | 2.7567 × 10−6 | 5.3326 × 10−5 | 1.8183 × 10−6 | 4.8148 × 10−1 | 8.9917 × 10−1 | 4.6412 × 10−1 | ~N/A |
| f20 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.8745 × 10−11 | 1.0328 × 10−10 | 2.5189 × 10−2 | 4.3218 × 10−7 | ~N/A |
| f21 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 3.3134 × 10−1 | 7.4745 × 10−3 | 6.9641 × 10−8 | ~N/A |
| f22 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 2.7539 × 10−11 | 3.7194 × 10−2 | 2.5021 × 10−11 | ~N/A |
| f23 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 9.4096 × 10−11 | 2.2599 × 10−1 | 2.5970 × 10−9 | ~N/A |
| f24 | 1.4018 × 10−11 | 1.4018 × 10−11 | 7.8055 × 10−11 | 2.0014 × 10−1 | 1.8717 × 10−2 | 1.7206 × 10−1 | ~N/A |
| f25 | 2.2729 × 10−11 | 1.3989 × 10−7 | 1.3643 × 10−10 | 4.4711 × 10−1 | 8.2178 × 10−1 | 3.1751 × 10−1 | ~N/A |
| f26 | 1.4018 × 10−11 | 3.9843 × 10−9 | 3.0304 × 10−11 | 1.1106 × 10−7 | 2.8074 × 10−2 | 2.3679 × 10−10 | ~N/A |
| f27 | 1.4018 × 10−11 | 9.4096 × 10−11 | 5.8443 × 10−10 | 2.1213 × 10−5 | 5.9218 × 10−4 | 2.2408 × 10−6 | ~N/A |
| f28 | 1.4018 × 10−11 | 2.2729 × 10−11 | 1.4018 × 10−11 | 7.1825 × 10−5 | 2.0181 × 10−2 | 4.3379 × 10−9 | ~N/A |
| f29 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 6.2370 × 10−4 | 7.8055 × 10−11 | ~N/A |
| Avg. rank | 6.5517 | 5.7241 | 5.6207 | 3.6207 | 2.5138 | 2.5483 | 2.2520 |
| Rank | 7 | 6 | 5 | 4 | 2 | 3 | 1 |
The bold data values in each row of the Table are the best ones in each pair compared with the EAOA approach.
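The p-values in Table A5 come from the Wilcoxon signed-rank test over paired per-function results. As an illustration of how such paired p-values can be obtained, the sketch below is a from-scratch exact two-sided test for small samples; it assumes no zero differences and does not average the ranks of tied absolute differences, unlike a full statistical implementation.

```python
from itertools import product

def wilcoxon_signed_rank(a, b):
    """Exact two-sided Wilcoxon signed-rank p-value for small paired
    samples (illustrative sketch: assumes no zero differences and
    breaks ties in |difference| arbitrarily rather than averaging)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    w_pos = sum(rank + 1 for rank, i in enumerate(order) if diffs[i] > 0)
    w_max = n * (n + 1) // 2
    w = min(w_pos, w_max - w_pos)  # two-sided test statistic
    # Enumerate all 2^n sign patterns to build the exact null distribution.
    hits = 0
    for signs in product([0, 1], repeat=n):
        s = sum(r for r, keep in zip(range(1, n + 1), signs) if keep)
        if min(s, w_max - s) <= w:
            hits += 1
    return hits / 2 ** n
```

For example, with eight pairs in which one algorithm is always better, the exact two-sided p-value is 2/256 ≈ 0.0078, i.e., significant at the 0.05 level.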

Appendix B

Figure A1. The EAOA’s convergence output curves represented graphically and compared to those of the GA, AOA, SA, FPA, PSO, MFO, and SCA algorithms for the selected functions. The green line is the set background of the worst one; here, the green line is the GA method.

References

  1. Yick, J.; Mukherjee, B.; Ghosal, D. Wireless sensor network survey. Comput. Netw. 2008, 52, 2292–2330. [Google Scholar] [CrossRef]
  2. Qiao, Y.; Dao, T.K.; Pan, J.S.; Chu, S.C.; Nguyen, T.T. Diversity teams in soccer league competition algorithm for wireless sensor network deployment problem. Symmetry 2020, 12, 445. [Google Scholar] [CrossRef] [Green Version]
  3. Dao, T.K.; Yu, J.; Nguyen, T.T.; Ngo, T.G. A Hybrid Improved MVO and FNN for Identifying Collected Data Failure in Cluster Heads in WSN. IEEE Access 2020, 8, 124311–124322. [Google Scholar] [CrossRef]
  4. Dao, T.K.; Nguyen, T.T.; Pan, J.S.; Qiao, Y.; Lai, Q. Identification Failure Data for Cluster Heads Aggregation in WSN Based on Improving Classification of SVM. IEEE Access 2020, 8, 61070–61084. [Google Scholar] [CrossRef]
  5. Ali, A.; Ming, Y.; Chakraborty, S.; Iram, S. A comprehensive survey on real-time applications of WSN. Future Internet 2017, 9, 77. [Google Scholar] [CrossRef] [Green Version]
  6. Chu, S.C.; Dao, T.K.; Pan, J.S.; Nguyen, T.T. Identifying correctness data scheme for aggregating data in cluster heads of wireless sensor network based on naive Bayes classification. Eurasip J. Wirel. Commun. Netw. 2020, 52. [Google Scholar] [CrossRef] [Green Version]
  7. Nguyen, T.T.; Pan, J.S.; Dao, T.K. An Improved Flower Pollination Algorithm for Optimizing Layouts of Nodes in Wireless Sensor Network. IEEE Access 2019, 7, 75985–75998. [Google Scholar] [CrossRef]
  8. Chai, Q.-W.; Chu, S.-C.; Pan, J.-S.; Zheng, W.-M. Applying Adaptive and Self Assessment Fish Migration Optimization on Localization of Wireless Sensor Network on 3-D Terrain. J. Inf. Hiding Multim. Signal Process. 2020, 11, 90–102. [Google Scholar]
  9. Kulkarni, R.V.R.; Ferster, A.; Venayagamoorthy, G.K. Computational intelligence in wireless sensor networks: A survey. IEEE Commun. Surv. Tutor. 2011, 13, 68–96. [Google Scholar] [CrossRef]
  10. Li, Z.; Chu, S.-C.; Pan, J.-S.; Hu, P.; Xue, X. A Mahalanobis Surrogate-Assisted Ant Lion Optimization and Its Application in 3D Coverage of Wireless Sensor Networks. Entropy 2022, 24, 586. [Google Scholar] [CrossRef]
  11. Pan, J.-S.; Dao, T.-K.; Pan, T.-S.; Nguyen, T.-T.; Chu, S.-C.; Roddick, J.F. An improvement of flower pollination algorithm for node localization optimization in WSN. J. Inf. Hiding Multimed. Signal. Process. 2017, 8, 486–499. [Google Scholar]
  12. Pan, J.-S.; Sun, X.-X.; Chu, S.-C.; Abraham, A.; Yan, B. Digital watermarking with improved SMS applied for QR code. Eng. Appl. Artif. Intell. 2021, 97, 104049. [Google Scholar] [CrossRef]
  13. Nguyen, T.-T.; Pan, J.-S.; Dao, T.-K.; Sung, T.-W.; Ngo, T.-G. Pigeon-Inspired Optimization for Node Location in Wireless Sensor Network. In Proceedings of the International Conference on Engineering Research and Applications, Thai Nguyen, Vietnam, 1–2 December 2019; Volume 104, pp. 589–598. [Google Scholar]
  14. Nguyen, T.-T.; Dao, T.-K.; Kao, H.-Y.; Horng, M.-F.; Shieh, C.-S. Hybrid Particle Swarm Optimization with Artificial Bee Colony Optimization for Topology Control Scheme in Wireless Sensor Networks. J. Internet Technol. 2017, 18, 743–752. [Google Scholar] [CrossRef]
  15. Nguyen, T.-T.; Pan, J.-S.; Wu, T.-Y.; Dao, T.-K.; Nguyen, T.-D. Node Coverage Optimization Strategy Based on Ions Motion Optimization. J. Netw. Intell. 2019, 4, 2414–8105. [Google Scholar]
  16. Pan, J.-S.; Nguyen, T.-T.; Dao, T.-K.; Pan, T.-S.; Chu, S.-C. Clustering Formation in Wireless Sensor Networks: A Survey. J. Netw. Intell. 2017, 2, 287–309. [Google Scholar]
  17. Dao, T.K.; Pan, T.S.; Nguyen, T.T.; Chu, S.C. A compact Artificial bee colony optimization for topology control scheme in wireless sensor networks. J. Inf. Hiding Multimed. Signal Process. 2015, 6, 297–310. [Google Scholar]
  18. Othman, M.F.; Shazali, K. Wireless sensor network applications: A study in environment monitoring system. Procedia Eng. 2012, 41, 1204–1210. [Google Scholar] [CrossRef] [Green Version]
  19. Liu, N.; Pan, J.S.; Nguyen, T.T. A bi-Population QUasi-Affine TRansformation Evolution algorithm for global optimization and its application to dynamic deployment in wireless sensor networks. Eurasip J. Wirel. Commun. Netw. 2019, 2019, 175. [Google Scholar] [CrossRef]
  20. Mahdavi, S.; Shiri, M.E.; Rahnamayan, S. Metaheuristics in large-Scale global continues optimization: A survey. Inf. Sci. 2015, 295, 407–428. [Google Scholar] [CrossRef]
  21. Shiva Prasad Yadav, S.G.; Chitra, A. Wireless Sensor Networks-Architectures, Protocols, Simulators and Applications: A Survey. Int. J. Electron. Comput. Sci. Eng. 2012, 1, 1941–1953. [Google Scholar]
  22. Akyildiz, I.F.; Su, W.; Sankarasubramaniam, Y.; Cayirci, E. Wireless sensor networks: A survey. Comput. Netw. 2002, 38, 393–422. [Google Scholar] [CrossRef] [Green Version]
  23. Pan, T.-S.; Dao, T.-K.; Nguyen, T.-T.; Chu, S.-C. Optimal base station locations in heterogeneous wireless sensor network based on hybrid particle swarm optimization with bat algorithm. J. Comput. 2014, 25, 14–25. [Google Scholar]
  24. Van Laarhoven, P.J.M.; Aarts, E.H.L. Simulated annealing. In Simulated Annealing: Theory and Applications; Springer Science & Business Media: Berlin/Heidelberg, Germany, 1987; pp. 7–15. [Google Scholar]
  25. Whitley, D. A genetic algorithm tutorial. Stat. Comput. 1994, 4, 65–85. [Google Scholar] [CrossRef]
  26. Srinivas, M.; Patnaik, L.M. Genetic Algorithms: A Survey. Computer 1994, 27, 17–26. [Google Scholar] [CrossRef]
  27. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; Volume 6, pp. 1942–1948. [Google Scholar]
  28. Chu, S.A.; Tsai, P.W.; Pan, J.S. Cat swarm optimization. In Proceedings of the 9th Pacific Rim International Conference on Artificial Intelligence, Guilin, China, 7–11 August 2006; Volume 4099, pp. 854–858. [Google Scholar]
  29. Chu, S.C.; Pan, J.-S. A parallel particle swarm optimization algorithm with communication strategies. J. Inf. Sci. Eng. 2005, 21, 9. [Google Scholar]
  30. Dorigo, M.; Di Caro, G. Ant colony optimization: A new meta-Heuristic. In Proceedings of the 1999 Congress on Evolutionary Computation, CEC, Washington, DC, USA, 6–9 July 1999; Volume 2, pp. 1470–1477. [Google Scholar]
  31. Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. J. Glob. Optim. 2007, 39, 459–471. [Google Scholar] [CrossRef]
  32. Yang, X.S. A new metaheuristic Bat-Inspired Algorithm. In Studies in Computational Intelligence; González, J., Pelta, D., Cruz, C., Terrazas, G., Krasnogor, N., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; Volume 284, pp. 65–74. ISBN 9783642125379. [Google Scholar]
  33. Tsai, C.F.; Dao, T.K.; Yang, W.J.; Nguyen, T.T.; Pan, T.S. Parallelized bat algorithm with a communication strategy. In Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science); Ali, M., Pan, J.-S., Chen, S.-M., Horng, M.-F., Eds.; Springer International Publishing: Cham, Switzerland, 2014; Volume 8481, pp. 87–95. [Google Scholar]
  34. Mirjalili, S. Moth-Flame optimization algorithm: A novel nature-Inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  35. Nguyen, T.-T.; Wang, H.-J.; Dao, T.-K.; Pan, J.-S.; Ngo, T.-G.; Yu, J. A Scheme of Color Image Multithreshold Segmentation Based on Improved Moth-Flame Algorithm. IEEE Access 2020, 8, 174142–174159. [Google Scholar] [CrossRef]
  36. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  37. Yang, X.S. Flower pollination algorithm for global optimization. In Proceedings of the Lecture Notes in Computer Science, Orléans, France, 3–7 September 2012; Volume 744, pp. 240–249. [Google Scholar]
  38. Nguyen, T.T.; Shieh, C.S.; Horng, M.F.; Dao, T.K.; Ngo, T.G. Parallelized Flower Pollination Algorithm with a Communication Strategy. In Proceedings of the 2015 IEEE International Conference on Knowledge and Systems Engineering, KSE, Ho Chi Minh City, Vietnam, 8–10 October 2015; pp. 103–107. [Google Scholar]
  39. Mirjalili, S. SCA: A sine cosine algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  40. Chen, H.; Wang, M.; Zhao, X. A multi-strategy enhanced sine cosine algorithm for global optimization and constrained practical engineering problems. Appl. Math. Comput. 2020, 369, 124872. [Google Scholar] [CrossRef]
  41. Hashim, F.A.; Hussain, K.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W. Archimedes optimization algorithm: A new metaheuristic algorithm for solving optimization problems. Appl. Intell. 2021, 51, 1531–1551. [Google Scholar] [CrossRef]
  42. Tian, J.; Gao, M.; Ge, G. Wireless sensor network node optimal coverage based on improved genetic algorithm and binary ant colony algorithm. EURASIP J. Wirel. Commun. Netw. 2016, 104. [Google Scholar] [CrossRef] [Green Version]
  43. Wu, G.; Mallipeddi, R.; Suganthan, P.N. Problem Definitions and Evaluation Criteria for the CEC 2017 Competition on Constrained Real-Parameter Optimization; Technical Report 2017. 2017. Available online: https://www3.ntu.edu.sg/home/epnsugan/index_files/CEC2017/CEC2017.htm (accessed on 1 July 2021).
  44. Chelliah, J.; Kader, N. Optimization for connectivity and coverage issue in target-based wireless sensor networks using an effective multiobjective hybrid tunicate and salp swarm optimizer. Int. J. Commun. Syst. 2021, 34, e4679. [Google Scholar] [CrossRef]
45. Ab Aziz, N.A.B.; Mohemmed, A.W.; Alias, M.Y. A wireless sensor network coverage optimization algorithm based on particle swarm optimization and Voronoi diagram. In Proceedings of the 2009 International Conference on Networking, Sensing and Control, Okayama, Japan, 26–29 March 2009; pp. 602–607. [Google Scholar]
  46. Wang, Z.; Xie, H.; Hu, Z.; Li, D.; Wang, J.; Liang, W. Node coverage optimization algorithm for wireless sensor networks based on improved grey wolf optimizer. J. Algorithms Comput. Technol. 2019, 13, 1748302619889498. [Google Scholar] [CrossRef] [Green Version]
  47. Fan, F.; Chu, S.-C.; Pan, J.-S.; Yang, Q.; Zhao, H. Parallel Sine Cosine Algorithm for the Dynamic Deployment in Wireless Sensor Networks. J. Internet Technol. 2021, 22, 499–512. [Google Scholar]
  48. Hashim, F.A.; Houssein, E.H.; Hussain, K.; Mabrouk, M.S.; Al-Atabany, W. Honey Badger Algorithm: New metaheuristic algorithm for solving optimization problems. Math. Comput. Simul. 2022, 192, 84–110. [Google Scholar] [CrossRef]
  49. Nguyen, T.-T.; Pan, J.-S.; Chu, S.-C.; Roddick, J.F.; Dao, T.-K. Optimization Localization in Wireless Sensor Network Based on Multi-Objective Firefly Algorithm. J. Netw. Intell. 2016, 1, 130–138. [Google Scholar]
  50. Pan, J.-S.; Nguyen, T.-T.; Chu, S.-C.; Dao, T.-K.; Ngo, T.-G. Diversity enhanced ion motion optimization for localization in wireless sensor network. J. Inf. Hiding Multimed. Signal Process. 2019, 10, 221–229. [Google Scholar]
  51. Nguyen, T.-T.; Yu, J.; Nguyen, T.-T.-T.; Dao, T.-K.; Ngo, T.-G. A Solution to Sensor Node Localization Using Glow-Worm Swarm Optimization Hybridizing Positioning Model BT—Advances in Intelligent Information Hiding and Multimedia Signal Processing; Pan, J.-S., Li, J., Ryu, K.H., Meng, Z., Klasnja-Milicevic, A., Eds.; Springer: Singapore, 2021; pp. 260–268. [Google Scholar]
  52. Liu, N.; Pan, J.-S.; Wang, J.; Nguyen, T.-T. An adaptation multi-group quasi-affine transformation evolutionary algorithm for global optimization and its application in node localization in wireless sensor networks. Sensors 2019, 19, 4112. [Google Scholar] [CrossRef] [Green Version]
Figure 1. The graphical initialization of the EAOA with the statistical node coverage optimization scheme for different numbers of sensor nodes: (a) 20, (b) 40, (c) 50, and (d) 60.
Figure 2. The graphical coverage of six different metaheuristic algorithms for the WSN node area deployment. (a) AOA, (b) EAOA, (c) GWO, (d) PSO, (e) SSA, (f) SCA algorithms.
Figure 3. Comparison of the optimal coverage rates of the EAOA with the other schemes in different-sized WSN monitoring node area deployment scenarios. (a) 160 m × 160 m, (b) 100 m × 100 m, (c) 80 m × 80 m, and (d) 40 m × 40 m.
Figure 4. Comparison of the EAOA optimization coverage rates for various sensor node counts deployed in the 2D monitoring of a 100 m × 100 m area.
Table 1. Algorithm settings for parameters and variables.
Algorithms    Setting Parameters
EAOA          C1 = 2.1, C2 = 5.6, C3 = 1.95, C4 = 0.65
AOA [41]      C1 = 2.1, C2 = 5.6, C3 = 1.95, C4 = 0.65
GA [25]       Rmu = 0.1, Rcr = 0.9
SA [24]       P = 0.6, α = 0.8, τ = 0.05, SN = 14.41
PSO [27]      Vmax = 10, Vmin = −10, ω = 0.9 to 0.4, c1 = c2 = 1.49455
PPSO [29]     G = 2, R = 10, Vmax = 10, Vmin = −10, ω = 0.9 to 0.4, c1 = c2 = 1.49465
PBA [33]      G = 2, R = 10, A0 = 0.7, r0 = 0.15, α = 0.25, γ = 0.16
FPA [37]      Pswitch = 0.65, λ = 1.5, s0 = 0.1
MFO [34]      a = 1, b = 1
IMFO [35]     a = 1, b = 1, ω = 0.9 to 0.4
WOA [36]      a = 2 to 0, b = 1, l ∈ [−1, 1]
SCA [39]      r1, r3 = rand(0, 2), r2 ∈ [0, 2π], r4 = rand(0, 1)
ESCA [40]     r1, r3 = rand(0, 2), r2 ∈ [0, 2π], r4 = rand(0, 1), ω = 0.9 to 0.5
Table 2. The parameter settings for the desired WSN node deployment areas.
Description                 Parameters    Values
Desired deployment areas    W × L         40 m × 40 m, 80 m × 80 m, 100 m × 100 m, 160 m × 160 m
Sensing radius              Rs            15 m
Communication radius        Rc            20 m
Number of sensor nodes      M             20, 40, 50, 60
Number of iterations        Iter          500, 1000, 1500
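As a rough illustration of how the coverage rate compared in the following results can be evaluated under the settings of Table 2, the sketch below approximates area coverage with the binary disk sensing model, counting the fraction of grid sample points within sensing radius Rs of at least one node. The function name, the grid step, and the random deployment are illustrative assumptions, not the paper's implementation.

```python
import math
import random

def coverage_rate(nodes, width, height, rs, step=1.0):
    """Fraction of grid sample points lying within sensing radius rs
    of at least one node (binary disk sensing model)."""
    covered = 0
    total = 0
    for j in range(int(height / step) + 1):
        y = j * step
        for i in range(int(width / step) + 1):
            x = i * step
            total += 1
            # A point is covered if any node's disk of radius rs contains it.
            if any(math.hypot(x - px, y - py) <= rs for px, py in nodes):
                covered += 1
    return covered / total

# Hypothetical example: 20 nodes deployed uniformly at random in a
# 40 m × 40 m area with Rs = 15 m (values taken from Table 2).
random.seed(1)
nodes = [(random.uniform(0, 40), random.uniform(0, 40)) for _ in range(20)]
print(f"coverage rate: {coverage_rate(nodes, 40, 40, 15):.2%}")
```

A finer grid step gives a more accurate estimate at a higher cost; the optimization algorithms compared below use such a coverage rate as the fitness value to maximize.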
Table 3. Comparison of the proposed EAOA method with the other techniques used—i.e., the SSA, PSO, GWO, SCA, and AOA algorithms—in terms of percentage coverage rate, running time, iterations to convergence, and monitoring area size.
Approach    Factor Variables                    40 m × 40 m    80 m × 80 m    100 m × 100 m    160 m × 160 m
SSA         Coverage rate (%)                   78%            74%            77%              74%
            Consumed execution time (s)         3.09           6.91           7.38             9.34
            No. of iterations to convergence    145            256            234              844
            WSN node numbers                    20             40             50               60
PSO         Coverage rate (%)                   79%            77%            79%              76%
            Consumed execution time (s)         2.78           6.22           6.65             8.41
            No. of iterations to convergence    396            343            578              754
            WSN node numbers                    20             40             50               60
GWO         Coverage rate (%)                   80%            80%            84%              78%
            Consumed execution time (s)         3.06           6.84           7.31             9.25
            No. of iterations to convergence    33             444            544              755
            WSN node numbers                    20             40             50               60
SCA         Coverage rate (%)                   78%            79%            82%              78%
            Consumed execution time (s)         2.92           6.29           7.23             9.22
            No. of iterations to convergence    445            555            665              876
            WSN node numbers                    20             40             50               60
AOA         Coverage rate (%)                   80%            79%            80%              79%
            Consumed execution time (s)         3.12           6.98           7.46             9.44
            No. of iterations to convergence    665            333            563              954
            WSN node numbers                    20             40             50               60
EAOA        Coverage rate (%)                   80%            82%            87%              80%
            Consumed execution time (s)         2.75           6.15           6.57             8.31
            No. of iterations to convergence    135            503            556              765
            WSN node numbers                    20             40             50               60
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
