1. Introduction
In the constant search for solutions to complex problems arising in a variety of fields, optimization plays a crucial role. One such challenging problem is the Set Covering Problem (SCP), a fundamental problem in combinatorial optimization with practical applications in logistics, route planning, resource allocation, and network optimization [1]. The SCP involves finding a minimum collection of subsets that covers all given elements, minimizing the cost or the number of selected subsets. Despite its apparent simplicity, the SCP is NP-hard, so the computational effort required by exact methods grows rapidly with instance size, making optimal solutions difficult to obtain for large-scale instances.
Researchers have explored various strategies to address the SCP efficiently and effectively in response to this complexity. Among these strategies, metaheuristic algorithms have emerged as a promising option due to their ability to find high-quality solutions in a reasonable time [2]. However, adapting these algorithms to combinatorial problems like the SCP presents unique challenges that require innovative approaches.
The field of metaheuristics has grown considerably in recent years, as demonstrated by Rajwar et al. in their systematic literature review [3], which documents more than 500 metaheuristics. One metaheuristic that has gained attention in this context is the White Shark Optimizer (WSO). Modeled after the natural behavior of great white sharks, WSO is a population-based algorithm capable of finding solutions close to the global optimum in complex, nonlinear problems. Its ability to explore the search space effectively makes it an attractive tool for addressing the SCP and other combinatorial optimization problems [4]. However, given its continuous nature, adapting WSO to combinatorial problems like the SCP requires additional techniques to ensure the quality of the solutions obtained.
Another interesting trend in the study of metaheuristics is hybridization, which aims to improve the exploration and exploitation capabilities of metaheuristics. Several types of hybridization are currently distinguished:
Metaheuristics improving metaheuristics: This type of hybridization combines two metaheuristic algorithms, usually one with strong exploration and the other with strong exploitation [5,6].
Hyperheuristics: Here we again have two metaheuristics, but arranged hierarchically, with a high-level metaheuristic guiding a low-level one [7].
Machine learning improving metaheuristics: In this approach, machine learning techniques are incorporated to improve the performance of metaheuristics [8].
Chaos improving metaheuristics: This approach uses chaos theory to modify the stochastic behavior of metaheuristics [9,10,11].
This is where chaotic maps come into play: nonlinear dynamic systems that exhibit seemingly random but deterministic behavior [12]. Integrating chaotic maps into the binarization process of WSO allows for effective exploration of the SCP search space, enhancing solution diversity and avoiding local optima.
The binarization process is essential for a continuous metaheuristic to solve binary combinatorial optimization problems. The motivation for studying multiple binarization schemes lies in the No Free Lunch theorem [13,14], which states that no single algorithm performs best across all optimization problems. The literature shows that the two-step technique is the most widely used method for binarizing metaheuristics [15].
In this context, this work proposes a hybrid approach that combines WSO with chaotic maps to address the SCP and obtain high-quality solutions. Through this innovative combination, the aim is to overcome the inherent challenges of the SCP and provide results with practical applications in various fields, from telecommunications network optimization to transportation route planning and limited resource allocation.
A summary of the content structure of the following sections:
Section 2 presents the conceptual and mathematical definition of the SCP.
Section 3 presents the work related to the SCP, binary problem solving, and chaotic maps in metaheuristics.
Section 4 presents the conceptual and mathematical definition of WSO.
Section 5 presents how to binarize continuous metaheuristics.
Section 6 presents our research proposal.
Section 7 presents the experimental results and their discussion, closing with the conclusions and future work in Section 8.
2. Set Covering Problem
In combinatorial optimization, numerous practical problems of great importance involve finding the best combination of binary or discrete variables. An example of this is the Set Covering Problem, which focuses on finding the minimum number of subsets required to cover all the given elements, thus reducing the overall cost or the number of chosen subsets.
The objective function in the context of Set Covering is a mathematical expression that represents the quality or efficiency of a proposed solution to the problem. In Set Covering, the objective is to minimize this function, which involves finding the most efficient combination of sets to cover all elements of the given universe.
Let us define a binary matrix $a$ with dimensions $m \times n$, where $a_{ij} = 1$ indicates that the $j$-th column includes the $i$-th row, and $a_{ij} = 0$ if it does not. Each column $j$ comes with a non-negative cost $c_j$. The sets $I = \{1, \dots, m\}$ and $J = \{1, \dots, n\}$ represent the rows and columns, respectively. The Set Covering Problem aims to find a subset of columns $S \subseteq J$ such that every row in $I$ is covered by at least one column in $S$, while minimizing the total cost of the selected columns:

$$\text{minimize} \quad \sum_{j \in J} c_j x_j$$

subject to

$$\sum_{j \in J} a_{ij} x_j \geq 1, \quad \forall i \in I,$$

where $x_j \in \{0, 1\}$ for all $j \in J$.

Here, $x_j$ represents the selection status of column $j$, where $x_j = 1$ means the column is included, and $x_j = 0$ means it is not. The element $a_{ij}$ corresponds to the respective entry in the binary matrix.
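To make the formulation concrete, the following Python sketch (function names are ours, chosen for illustration) evaluates the objective, checks feasibility, and runs the classic greedy heuristic, a standard SCP baseline rather than the method proposed in this work:

```python
import numpy as np

def scp_cost(x, c):
    """Objective value: total cost of the selected columns."""
    return float(np.dot(c, x))

def scp_feasible(x, a):
    """Feasible when every row i satisfies sum_j a[i, j] * x[j] >= 1."""
    return bool(np.all(a @ x >= 1))

def greedy_cover(a, c):
    """Classic greedy heuristic: repeatedly pick the column with the
    lowest cost per newly covered row until all rows are covered."""
    m, n = a.shape
    x = np.zeros(n, dtype=int)
    uncovered = np.ones(m, dtype=bool)
    while uncovered.any():
        newly = a[uncovered].sum(axis=0)           # rows each column would newly cover
        ratio = np.where(newly > 0, c / np.maximum(newly, 1), np.inf)
        j = int(np.argmin(ratio))                  # best cost-per-coverage column
        x[j] = 1
        uncovered &= a[:, j] == 0                  # rows covered by j are done
    return x

# A 3-row, 3-column instance: columns cover rows {0,1}, {1,2}, {0,2}.
a = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1]])
c = np.array([2.0, 1.0, 3.0])
x = greedy_cover(a, c)
```

Any metaheuristic for the SCP, including the binary WSO developed later, ultimately optimizes exactly this cost function subject to this coverage constraint.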
3. Related Work
3.1. Set Covering Optimization
The Set Covering Problem (SCP) is a key challenge in combinatorial optimization. Because of its importance in numerous real-world scenarios, it is widely studied in academic research and industrial applications. The SCP is defined as the task of selecting a minimum set of subsets from a given set to cover all elements of another set. This problem has diverse applications in logistics, route planning, resource allocation, and network optimization.
A wide range of approaches and techniques have been proposed in the literature to solve the SCP. Among the most prominent are exact methods and metaheuristics. Exact methods, such as integer linear programming, guarantee finding the optimal solution but can be computationally expensive for large-scale problems. On the other hand, metaheuristics, such as genetic algorithms [16], tabu search [17], and simulated annealing [18], offer approximate solutions in a reasonable time and are effective for solving moderate to large-scale SCP instances.
In addition to these approaches, hybrid and advanced approaches have been proposed that combine different optimization techniques to improve performance in solving the SCP [19]. These approaches may integrate metaheuristics with exact algorithms, data preprocessing techniques, or domain-specific heuristics to achieve even better solutions.
3.2. Binary Problems Resolution
Transfer functions are key mechanisms for adapting continuous metaheuristic algorithms to binary optimization problems. These functions map continuous values from the search space to real values in the range [0, 1] [20].
The sigmoid function has been widely used, but there are limitations regarding its ability to maintain a proper balance between exploration and exploitation of the search space. Therefore, several alternative transfer functions have been proposed and studied to overcome the shortcomings of the sigmoid function. These new functions aim to enable better transitions between exploration and exploitation phases during algorithm execution, potentially improving performance in solving binary optimization problems [21].
In addition to transfer functions, binarization rules also play a crucial role in the performance of binary metaheuristic algorithms. These rules determine how continuous values mapped to real numbers are further mapped to binary values. Some commonly used rules include the threshold rule, roulette rule, elite-based rule, and elite-roulette rule. Several studies have analyzed the relative importance of transfer functions and binarization rules. The results suggest that binarization rules can have a greater impact on performance than the transfer functions themselves. In particular, elite-based or elite-roulette rules have generally produced the best results [8].
However, according to the No Free Lunch principle [22], there is no single general technique capable of efficient binarization for all optimization problems. The performance of a binary metaheuristic algorithm depends on the specific problem, the transfer function, and the binarization rule used.
3.3. Chaotic Maps into Metaheuristic
Nonlinear, non-periodic dynamic systems exhibit chaotic behavior that is deterministic but appears random. In recent decades, chaotic maps derived from such systems have been widely studied and utilized in the field of metaheuristics to enhance the exploration of search spaces and avoid getting trapped in local optima [23]. The implementation of chaotic maps is valued for its computational efficiency and the small number of initial parameters required. These systems are highly sensitive to minor variations in initial conditions. The literature highlights several relevant chaotic maps [10].
Table 1 and Figure 1 show the mathematical definitions and behavior of these chaotic maps, respectively.
One of the earliest works introducing the use of chaotic maps in metaheuristics was by Ouyang et al. [24], where they proposed Chaotic Particle Swarm Optimization (CPSO) using the logistic map. Since then, numerous algorithms based on different chaotic maps have been proposed.
Several well-known chaotic maps used in metaheuristics include the logistic map, piecewise map, sine map, Singer map, and sinusoidal map [10]. These chaotic maps have been applied in various evolutionary algorithms, including chaotic genetic algorithms [25,26,27], chaotic particle swarm optimization [24], and chaotic artificial bee colony optimization [28].
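As a minimal illustration of this deterministic-yet-erratic behavior, the logistic map, one of the maps listed above, can be iterated as follows (the function name and parameter defaults are ours):

```python
def logistic_map(x0, r=4.0, n=10):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k).
    With r = 4 the map is fully chaotic on (0, 1): the sequence is
    deterministic, stays within [0, 1], and is highly sensitive to x0."""
    seq, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        seq.append(x)
    return seq
```

Because the sequence is fully determined by the seed, experiments that use a chaotic map in place of a pseudo-random generator remain reproducible while still covering the unit interval irregularly.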
4. White Shark Optimizer
In the oceanic environment, white sharks are capable of detecting the presence of prey in deep waters, thanks to their senses of electrical perception, vision, pressure, smell, and hearing, as shown in Figure 2. It is important to note, as shown in Figure 3, that the shark’s hearing perception is extensive, as sharks possess an auditory system that runs through their entire body. Despite these senses, they lack prior knowledge of the exact locations of food sources within a specific region. As a result, white sharks need to search extensively to locate food sources in the vast sea. This research identified three primary behaviors of white sharks that aid in efficient prey localization [4].
Movement speed towards prey: White sharks employ an undulating movement that harnesses the waves generated by the prey’s displacement. This movement, guided by their senses of hearing and smell, enables the shark to navigate towards the food source.
Random prey search: White sharks head towards regions where prey has been identified and stay near the most promising targets. This approach relies on randomly exploring possible food sources in the deep ocean.
Behavior of nearby prey localization: White sharks mimic the behavior of fish schools to locate nearby prey. In this approach, a shark moves towards the most successful individual in the search, facilitating the location of the optimal prey.
The main objective of WSO is to find optimal solutions in optimization problems, including feature selection [29], intrusion detection [30], proton exchange membrane fuel cell (PEMFC) design [31], optimal energy flow [32], and estimation of parameters for lithium-ion battery models [33].
In feature selection, WSO has been adapted to address high-dimensional problems, where identifying an optimal subset of features is crucial for model performance. Enhanced binary variants of WSO, such as Binary-Distribution-based WSO (BDWSO), Binary Sine Cosine WSO (BSCWSO), and Binary Hybrid Sine Cosine WSO (BHSCWSO), have been developed to improve the balance between exploration and exploitation in complex multi-peak search problems [29].
In intrusion detection, WSO has been applied to enhance intrusion prediction models by selecting informative features, leading to improved performance of machine-learning-based intrusion detection systems. Improved variants of WSO, such as BIWSO1, BIWSO2, and BIWSO3, have demonstrated their effectiveness in terms of classification accuracy, precision, recall, and F1 scores [30].
In the field of PEMFCs, WSO has been used to optimize the parameter design of fuel cells, resulting in improved efficiency and performance of these cells. WSO has proven to be more effective than other optimization approaches, such as the Salp Swarm Algorithm (SSA) and Harris Hawks Optimization (HHO) [31].
Furthermore, WSO has been successfully applied to optimal energy flow problems [32] and estimation of parameters for lithium-ion battery models [33]. Hybrid variants of WSO, such as the Hybrid White Shark Equilibrium Optimizer (WSEO), have effectively optimized energy scheduling problems in IoT-based systems.
In summary, the White Shark Optimizer has proven to be a promising and versatile metaheuristic that has been adapted and enhanced to address optimization problems in various fields. Its improved variants have shown superior performance in accuracy, convergence, and efficiency compared to other established optimization algorithms.
4.1. Mathematical Model
4.1.1. Parameter Initialization
The WSO framework begins by setting up a matrix that represents the initial population of solutions. This matrix has dimensions $N \times d$, where $N$ is the population size, and $d$ corresponds to the problem’s dimensionality, as detailed below:

$$w = \begin{bmatrix} w_1^1 & w_2^1 & \cdots & w_d^1 \\ w_1^2 & w_2^2 & \cdots & w_d^2 \\ \vdots & \vdots & \ddots & \vdots \\ w_1^N & w_2^N & \cdots & w_d^N \end{bmatrix}$$

Then, we initialize the population of initial solutions for the sharks based on the parameters above, and subsequently initialize the initial velocity and position, where rand is a continuous random number within the range [0, 1].
The initial population is generated within the search space using a uniform random distribution, as outlined below:

$$w_j^i = lb_j + r \times (ub_j - lb_j)$$

In this context, $w_j^i$ denotes the starting vector for the $i$-th white shark in the $j$-th dimension. The values $ub_j$ and $lb_j$ correspond to the maximum and minimum bounds of the search area in the $j$-th dimension, respectively, and $r$ is a random value generated between $[0, 1]$.
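A minimal sketch of this uniform initialization in Python (NumPy), with names chosen for illustration:

```python
import numpy as np

def init_population(n, d, lb, ub, seed=None):
    """Uniform random initialization: w[i][j] = lb[j] + r * (ub[j] - lb[j]),
    with r drawn independently from U[0, 1] for every entry."""
    rng = np.random.default_rng(seed)
    lb = np.asarray(lb, dtype=float)
    ub = np.asarray(ub, dtype=float)
    return lb + rng.random((n, d)) * (ub - lb)

# Five sharks in a 3-dimensional box with per-dimension bounds.
w = init_population(5, 3, lb=[0, 0, 0], ub=[1, 2, 3], seed=42)
```

Each row of `w` is one shark; each column respects its own bounds, matching the per-dimension formula above.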
4.1.2. Movement Speed towards Prey
White sharks are the ultimate hunters. They dedicate a significant part of their existence to searching for and pursuing their prey, using an extraordinary combination of highly developed senses: acute vision, fine hearing, and exceptional smell.
These marine predators have the ability to detect the location of their prey by interpreting disturbances in the water. The waves generated by the movements of the animals they hunt provide crucial information to the shark. Once the white shark has identified these aquatic signals, it begins its approach towards the prey.
This stalking and approaching behavior can be mathematically represented by a specific equation that models the shark’s movement in relation to its target.
Within this framework, the terms $v_{k+1}^i$ and $w_k^i$ are crucial: they correspond to the updated velocity and the current position of the $i$-th shark, respectively. The collective best position found by all sharks so far is indicated by $w_{gbest_k}$, while $w_{best}^{v_k^i}$ captures the best position that the $i$-th shark has discovered throughout its journey. The velocity update is:

$$v_{k+1}^i = \mu \left[ v_k^i + p_1 \left( w_{gbest_k} - w_k^i \right) \times c_1 + p_2 \left( w_{best}^{v_k^i} - w_k^i \right) \times c_2 \right]$$

Random parameters $c_1$ and $c_2$, drawn from the range $[0, 1]$, are utilized to influence the search process. The factor $\mu$ is essential for managing the convergence rate of the WSO algorithm. In addition, $p_1$ and $p_2$ serve as weights to control how $w_{gbest_k}$ and $w_{best}^{v_k^i}$ affect the shark’s current position $w_k^i$. The values of these parameters $p_1$, $p_2$, and $\mu$ are derived using the formulas presented below:

$$p_1 = p_{max} + (p_{max} - p_{min}) \times e^{-(4k/K)^2}$$

$$p_2 = p_{min} + (p_{max} - p_{min}) \times e^{-(4k/K)^2}$$

$$\mu = \frac{2}{\left| 2 - \tau - \sqrt{\tau^2 - 4\tau} \right|}$$

The forces $p_{max}$ and $p_{min}$ establish the upper and lower thresholds needed to guide the movement of the white sharks effectively. The symbols $k$ and $K$ represent the current iteration number and the total number of iterations, respectively. Additionally, $\tau$ acts as an acceleration coefficient to influence the process.
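Assuming the commonly reported defaults $p_{min} = 0.5$, $p_{max} = 1.5$, and $\tau = 4.125$ (these values are our assumption for illustration, not stated in this section), the three control parameters can be computed as follows:

```python
import math

def wso_weights(k, K, p_min=0.5, p_max=1.5, tau=4.125):
    """Convergence-control parameters of WSO at iteration k out of K.
    p1 and p2 share the same Gaussian-like decay term, so both shrink
    as the run progresses, shifting the search from exploration to
    exploitation.  mu is a constant contraction factor (tau > 4 keeps
    the square root real)."""
    decay = math.exp(-((4.0 * k / K) ** 2))
    p1 = p_max + (p_max - p_min) * decay
    p2 = p_min + (p_max - p_min) * decay
    mu = 2.0 / abs(2.0 - tau - math.sqrt(tau * tau - 4.0 * tau))
    return p1, p2, mu
```

At $k = 0$ the decay term equals 1, so $p_1 = 2 p_{max} - p_{min}$ and $p_2 = p_{max}$; by the final iteration both weights have dropped close to $p_{max}$ and $p_{min}$, respectively.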
4.1.3. Updating the Sharks’ Velocity in the Water
In this section, we encounter a decision equation where two scenarios are generated:

$$w_{k+1}^i = \begin{cases} w_k^i \cdot \neg \oplus w_0 + ub \cdot a + lb \cdot b; & \text{rand} < mv \\ w_k^i + v_k^i / f; & \text{rand} \geq mv \end{cases}$$

These two decisions depend on a random value rand and on $mv$, where $mv$ represents the movement strength that increases over time to ensure convergence by accelerating speed.
The first case updates the $i$-th position by element-wise multiplication with $\neg \oplus w_0$, where $w_0 = \oplus(a, b)$ and $\neg \oplus$ denotes the negation of a bitwise XOR operation between the logical vectors $a$ and $b$ derived from the boundaries. The second case sums the position with the velocity divided by the frequency $f$ of the shark’s undulatory movement.
The sign function is applied to check whether the difference between each element of the position vector and the boundaries is positive or negative:

$$a = sgn(w_k^i - ub) > 0, \qquad b = sgn(w_k^i - lb) < 0$$

The function outputs 1 if the input is positive, $-1$ if it is negative, and 0 when the input is zero. As a result, the variable $a$ is set to true if the difference is positive, while $b$ is marked true if the difference is negative. This approach allows for the categorization or filtering of data based on whether they lie above or below the defined boundaries.
The frequency $f$ of the oscillatory motion is given by

$$f = f_{min} + \frac{f_{max} - f_{min}}{f_{max} + f_{min}}$$

where $f_{min}$ and $f_{max}$ correspond to the lowest and highest frequencies of the oscillatory motion, respectively.
The parameter $mv$ measures the intensity of sensory abilities, such as hearing and smell, which grow stronger as the number of iterations increases:

$$mv = \frac{1}{a_0 + e^{(K/2 - k)/a_1}}$$

where $a_0$ and $a_1$ are constants that control exploration and exploitation behaviors.
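A small sketch of the frequency and movement-strength schedules; the constants $f_{min}$, $f_{max}$, $a_0$, and $a_1$ shown as defaults are illustrative assumptions, not values prescribed by the text:

```python
import math

def wso_movement_params(k, K, f_min=0.07, f_max=0.75, a0=6.25, a1=100.0):
    """Wave frequency f and movement strength mv at iteration k out of K.
    f is constant over the run; mv grows monotonically with k, so late
    iterations favor the velocity-driven (exploitative) branch."""
    f = f_min + (f_max - f_min) / (f_max + f_min)
    mv = 1.0 / (a0 + math.exp((K / 2.0 - k) / a1))
    return f, mv
```

Because the exponent $(K/2 - k)/a_1$ decreases as $k$ grows, the denominator shrinks and $mv$ rises, exactly the "increases over time" behavior described above.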
4.1.4. Updating the Sharks’ Position Based on the Schooling of Fish
This stage implements a behavior known as “nearby prey localization”. Here, the great white sharks adjust their positions using information from nearby best solutions. For each shark in the population, a new position is calculated based on the distance between its current location and the best global solution. If it is the first shark, its position is updated directly with a random displacement. For subsequent sharks, a new position is calculated by weighing the best global solution and the position of the previous shark. This allows sharks to explore and exploit nearby areas based on shared information, contributing to the efficiency of the solution search in the search space. Here, $ss$ represents a suggested parameter expressing the strength of the olfactory and visual senses of the great white sharks when following other sharks that are close to the optimal prey, and $\vec{D}_w$ represents the distance between the prey and the shark.
The approach of the white shark as it converges on the prey can be modeled in the following manner:

$$w_{k+1}^{\prime i} = w_{gbest_k} + r_1 \vec{D}_w \, sgn(r_2 - 0.5), \quad r_3 < ss$$

Here, $w_{k+1}^{\prime i}$ indicates the next position of the $i$-th white shark with respect to its prey. The function $sgn(\cdot)$ determines the sign of the argument: it returns 1 if positive, −1 if negative, and 0 if zero, guiding the direction of the search. The terms $r_1$, $r_2$, and $r_3$ are random values that are continuously distributed within the range $[0, 1]$. The distance to the prey is given by $\vec{D}_w = \left| rand \times (w_{gbest_k} - w_k^i) \right|$.
The parameter $ss$ is used to describe the strength of the visual and olfactory senses of the great white shark when closely following its prey, calculated as follows:

$$ss = \left| 1 - e^{-a_2 k / K} \right|$$

In this scenario, $a_2$ functions as a positive constant that regulates the trade-off between exploration and exploitation. In this study, $a_2$ is assigned a value of 0.0005 for all the experiments. The author made this choice [4] following an in-depth assessment of various problem instances.
To create a mathematical model of the behavior of the white shark group, the top two optimal solutions were retained, while the positions of the remaining sharks were adjusted in relation to these best solutions. The formula below was introduced to characterize the behavior of the white shark collective:

$$w_{k+1}^i = \frac{w_k^i + w_{k+1}^{\prime i}}{2 \times rand}$$

where rand represents a uniformly generated random value within the range $[0, 1]$.
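The sensory-strength schedule $ss$ is simple enough to verify numerically; with $a_2 = 0.0005$ as used in this study, $ss$ remains very small throughout the run, so the schooling move is triggered only occasionally:

```python
import math

def schooling_strength(k, K, a2=0.0005):
    """ss = |1 - exp(-a2 * k / K)|: strength of the visual and olfactory
    senses during schooling.  It grows monotonically with k but, with
    a2 = 0.0005, never exceeds roughly 5e-4 over the whole run."""
    return abs(1.0 - math.exp(-a2 * k / K))
```

Since the schooling branch fires only when a uniform random number $r_3$ falls below $ss$, the tiny value of $a_2$ makes this an occasional corrective move rather than the dominant update.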
4.1.5. Pseudocode for WSO
Algorithm 1 presents the pseudocode of WSO, detailing the initialization of parameters and the calculation of positions and velocities. This process is repeated until the defined termination criterion is met.
Algorithm 1 White Shark Optimizer
1: Define the problem parameters.
2: Define the parameters for the WSO.
3: Generate the initial positions of the WSO randomly.
4: Set the initial population’s velocity.
5: Evaluate the initial population’s positions.
6: while $k < K$ do
7:  Sequentially revise the parameters $v$, $p_1$, $p_2$, $\mu$, $a$, $b$, $w_0$, $f$, $mv$, and $ss$, respectively.
8:  for $i = 1$ to $n$ do
9:   $v_{k+1}^i = \mu [v_k^i + p_1 (w_{gbest_k} - w_k^i) c_1 + p_2 (w_{best}^{v_k^i} - w_k^i) c_2]$
10:  end for
11:  for $i = 1$ to $n$ do
12:   if $rand < mv$ then
13:    $w_{k+1}^i = w_k^i \cdot \neg \oplus w_0 + ub \cdot a + lb \cdot b$
14:   else
15:    $w_{k+1}^i = w_k^i + v_k^i / f$
16:   end if
17:  end for
18:  for $i = 1$ to $n$ do
19:   if $r_3 \leq ss$ then
20:    $\vec{D}_w = |rand \times (w_{gbest_k} - w_k^i)|$
21:    if $i = 1$ then
22:     $w_{k+1}^i = w_{gbest_k} + r_1 \vec{D}_w \, sgn(r_2 - 0.5)$
23:    else
24:     $w_{k+1}^{\prime i} = w_{gbest_k} + r_1 \vec{D}_w \, sgn(r_2 - 0.5)$
25:     $w_{k+1}^i = (w_k^i + w_{k+1}^{\prime i}) / (2 \times rand)$
26:    end if
27:   end if
28:  end for
29:  Adjust the white sharks’ positions that cross the boundary limits.
30:  Evaluate and adjust the updated positions.
31: end while
32: Return the best solution obtained so far.
5. Binarization Techniques
Within the context of continuous metaheuristics, binarization techniques stand out [15]; they aim to transform values from a continuous domain into a binary format. This process is carried out with the goal of preserving the ability to generate high-quality solutions characteristic of continuous metaheuristics, thereby facilitating the attainment of binary solutions of equivalent quality.
5.1. Two-Step Technique
This technique is a strategy for adapting continuous metaheuristics so they can solve binary combinatorial optimization problems [34]. It is often used when applying metaheuristics that operate on continuous variables to combinatorial optimization problems. The process consists of two main steps, as shown in Figure 4.
5.1.1. Transfer Function
The transfer function is fundamental in the binarization process, particularly in the realm of continuous metaheuristics. It is one of the most commonly used normalization techniques and was first introduced in [35]. Its main role is to transform continuous values into a predefined range, typically $[0, 1]$. This transformation is key for converting continuous solutions into a binary format, enhancing their effectiveness in optimization algorithms that rely on binary search methods.
By applying the transfer function, continuous values are appropriately scaled and adjusted, facilitating their subsequent binarization and efficient utilization in optimization algorithms. Therefore, the transfer function constitutes a fundamental step in the binarization process, significantly contributing to the effectiveness and efficiency of continuous metaheuristics in various optimization applications.
In Figure 5, the graphs generated by the S-shape and V-shape equations can be observed, which are named as such due to the shapes they produce. In Table 2, these equations and their variations are presented.
5.1.2. Binarization Rules
Binarization refers to converting continuous values into binary states, typically 0 and 1. In this approach, binarization rules are applied to probabilities derived from a transfer function to produce a binary outcome. Various methods for executing this binarization process are detailed in the literature [36]. Selecting an appropriate binarization rule is crucial, as it depends on the specific context and the problem’s requirements. Proper use of these binarization rules is key to achieving precise and dependable results [34].
In Table 3, four binarization rules can be observed along with their respective equations and conditions.
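A compact sketch of both steps of the two-step technique in Python (our function names; the S-shape, V-shape, standard, and complement rules follow their usual textbook definitions). Passing an explicit `r` makes the rules deterministic for testing:

```python
import math
import random

def s_shape(x):
    """S-shape (sigmoid) transfer function: maps any real x to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def v_shape(x):
    """V-shape transfer function: |tanh(x)|, also in [0, 1)."""
    return abs(math.tanh(x))

def standard_rule(p, r=None):
    """Standard binarization rule: output 1 when a uniform random
    number falls below the transfer probability p, else 0."""
    r = random.random() if r is None else r
    return 1 if r < p else 0

def complement_rule(p, x_bit, r=None):
    """Complement rule: flip the current bit with probability p,
    otherwise keep it unchanged."""
    r = random.random() if r is None else r
    return 1 - x_bit if r < p else x_bit
```

Step one squashes a continuous coordinate into a probability; step two turns that probability into a bit, which is exactly the pipeline Figure 4 depicts.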
6. A New Binary Approach for the White Shark Optimizer
The White Shark Optimizer (WSO) has proven to be effective in combinatorial optimization, but its adaptation to problems requiring binary solutions poses challenges, particularly in integrating continuous velocity values with binary populations. To address this issue, we propose binarizing both the population and the velocity, allowing the WSO to function effectively in discrete environments.
This hybrid approach combines the WSO with binarization techniques and chaotic maps. By using transfer functions such as the sigmoid and hyperbolic tangent, continuous solutions are converted into binary ones, improving solution diversity and preventing premature convergence. This method has practical applications in network optimization, route planning, and resource allocation.
6.1. Binarization of Velocity and Population
Since velocity is calculated in the WSO similarly to that in Particle Swarm Optimization (PSO), we adapted binarization techniques developed for PSO. This involves using transfer functions to convert continuous values into probabilities, determining binary states (0 or 1). Binarization is integrated into the WSO to ensure consistency between velocity and population, avoiding errors in position updates.
The decision to adapt the White Shark Optimizer by binarizing both the velocity and the population is based on the demonstrated effectiveness of similar techniques in variants of Particle Swarm Optimization (PSO), such as Binary PSO (BPSO). Various studies have explored and improved BPSO to address challenges in combinatorial problems, showing that binarization is crucial for optimizing in discrete spaces. For example, a modified BPSO that adopts concepts from genotype–phenotype representation and mutation operators from genetic algorithms has outperformed the original BPSO on benchmark functions [37]. Another study implemented an improved BPSO for feature selection in gene expression data, simplifying the selection process and improving classification accuracy [38]. Additionally, an improved BPSO has been used to solve unit commitment problems, demonstrating its superiority in terms of production cost and computation time [39]. In the field of association rule mining, a BPSO has been successfully employed to extract quality rules from transactional databases [40]. A novel approach for defining the velocity vector in BPSO has also been proposed, providing a more effective interpretation of continuous PSO in its binary version [41]. High-utility itemset mining has been optimized using BPSO, particularly in reducing the search space and improving efficiency [42]. Furthermore, studies have addressed the challenges of BPSO and proposed new solutions to improve its convergence rate and avoid trapping in local optima [43]. BPSO has been adapted for classical problems such as the knapsack problem, introducing probability functions that maintain diversity and effectiveness in the search process [44]. Finally, a BPSO has been developed for the lot-sizing problem, demonstrating its capability to find optimal solutions in most cases [45]. All these studies reinforce the validity of applying binarization techniques in the WSO, thus justifying the adaptation of the algorithm to discrete problems and improving its performance in combinatorial optimization tasks.
Below, in Equation (5), we can observe the velocity equation of WSO, where it can be seen that it has the same composition as the velocity of PSO [37].
To better illustrate the proposal, a flow chart has been created to give a clearer understanding of the solution.
6.1.1. Initialization of Parameters
The initialization of parameters is a critical step in any optimization algorithm, including the proposed White Shark Optimizer (WSO). This process involves setting up the necessary parameters that will govern the behavior and performance of the algorithm throughout its execution.
Prior to setting up the parameters for the WSO, it is crucial to identify the parameters specific to the problem. This involves determining the population size, the dimensions of the search space, and the range of values that the variables can take.
The WSO requires several parameters to be initialized, such as the initial velocity of the population, the weight factors for velocity updates, and the parameters for the chaotic maps. These parameters include:
Population size (n): Determines the number of candidate solutions in the population.
Velocity parameters ($v$, $p_1$, $p_2$, $\mu$, $f$, $mv$): These parameters control the speed and direction of the search process.
We set the initial velocities for each individual in the population. Typically, these velocities are initialized to small random values to ensure that the search process begins with minimal bias.
Proper initialization of these parameters is crucial for the algorithm’s convergence and efficiency. Careful consideration and tuning of these values can significantly impact the overall performance of the WSO.
6.1.2. Update Velocity and Positions
Updating the velocity and positions of the population is a fundamental step in the optimization process. This phase involves adjusting the velocity of each individual in the population and subsequently updating their positions based on the new velocities.
For better understanding, a flowchart has been created in Figure 6, representing the entire WSO algorithm process, including its binarization.
6.1.3. Binarize Velocity and Positions
Binarization is a crucial step for converting continuous values into discrete binary values. This step helps in adapting the continuous optimization process to a binary representation, which is often necessary for specific types of optimization problems.
Transfer Function for Population: Convert the continuous positions of the individuals into binary values using a transfer function. This function maps continuous values into binary space based on a predefined threshold or chaotic map values.
Transfer Function for Velocity: Similarly, apply a transfer function to the velocity values to obtain binary representations. This ensures that the velocities are also represented in a discrete binary format.
Chaotic Map Values: Incorporate chaotic maps to introduce randomness into the binarization process. Chaotic maps provide a way to create diverse binary patterns, enhancing the exploration capability of the algorithm.
Binarization Rule: Apply a binarization rule that uses the chaotic numbers to decide the binary representation of both positions and velocities. This rule determines the conversion threshold and ensures consistency in binary representation.
Proper binarization of velocity and positions allows the algorithm to operate effectively in a binary search space, which is essential for solving problems that require discrete solutions.
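The four steps above can be sketched as a single routine that binarizes both position and velocity with a sigmoid transfer function and a logistic chaotic sequence in place of uniform random thresholds. The seed value and the specific choice of the logistic map are illustrative assumptions:

```python
import math
import numpy as np

def _logistic(x):
    """One step of the logistic map with r = 4 (chaotic regime)."""
    return 4.0 * x * (1.0 - x)

def chaotic_binarize(w, v, chaos=0.7):
    """Binarize a continuous position vector w and velocity vector v.
    Each component is squashed by a sigmoid transfer function and then
    thresholded against the next value of a logistic chaotic sequence,
    so position and velocity stay consistent in the binary space.
    Returns the binary vectors and the final chaotic state (so the
    sequence can continue across iterations)."""
    wb = np.empty(len(w), dtype=int)
    vb = np.empty(len(v), dtype=int)
    for j in range(len(w)):
        chaos = _logistic(chaos)                       # chaotic threshold for position
        wb[j] = 1 if chaos < 1.0 / (1.0 + math.exp(-w[j])) else 0
        chaos = _logistic(chaos)                       # fresh threshold for velocity
        vb[j] = 1 if chaos < 1.0 / (1.0 + math.exp(-v[j])) else 0
    return wb, vb, chaos

wb, vb, state = chaotic_binarize(np.array([10.0, -10.0]),
                                 np.array([10.0, -10.0]), chaos=0.7)
```

Strongly positive components map to probabilities near 1 and thus almost always binarize to 1, while strongly negative ones binarize to 0; the chaotic sequence merely replaces the uniform random threshold, keeping the process deterministic for a given seed.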
6.1.4. Sort Solutions
Sorting solutions is an essential step for evaluating and selecting the best candidates from the population. This process involves ranking the solutions based on their fitness or quality, which helps in determining the optimal solution.
Sorting solutions ensures that the best candidates are given priority, facilitating the convergence of the algorithm towards optimal solutions.
6.1.5. Return Solutions
Returning the final solutions is the concluding step of the optimization process. This step involves selecting and presenting the best solution(s) obtained during the execution of the algorithm.
Best Solution Selection: Based on the sorted solutions, identify the optimal solution(s) with the highest fitness values. This solution represents the best outcome achieved by the algorithm.
Output: Present the optimal solution(s) and their corresponding fitness values as the result of the optimization process. This provides the final outcome of the algorithm and offers insights into the best solution found.
Reporting: Include additional information such as the number of iterations, parameter settings, and performance metrics to provide a comprehensive view of the algorithm’s performance.
Returning the solutions allows users to assess the effectiveness of the optimization process and make informed decisions based on the results.
6.2. Repairing Complex Solutions
The simultaneous binarization of population and velocity reduces errors in repairing complex solutions, ensuring consistency in the binary space. This process is reinforced by chaotic maps, which provide pseudo-random sequences useful for creating dynamic thresholds in the binarization process. Algorithm 2 summarizes the proposal.
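To make the role of the chaotic map concrete, the sketch below uses the logistic map — one common choice in the chaotic binarization literature; the experiments evaluate several maps, so this is an illustration rather than the specific map used.

```python
def logistic_map(x0=0.7, r=4.0):
    """Generator for the logistic map x_{k+1} = r * x_k * (1 - x_k).
    With r = 4 the sequence is chaotic and stays in (0, 1), which makes
    it usable as a dynamic threshold for the binarization rule."""
    x = x0
    while True:
        x = r * x * (1.0 - x)
        yield x

# Draw a few dynamic thresholds for the binarization rule.
chaos = logistic_map()
thresholds = [next(chaos) for _ in range(5)]
```

Because the sequence is deterministic but non-repeating, it yields diverse binary patterns while remaining reproducible from the seed value x0.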
Algorithm 2 Binary White Shark Optimizer
1: Define the problem parameters.
2: Define the parameters for the WSO.
3: Generate the initial positions of the WSO randomly.
4: Set the initial population’s velocity.
5: Evaluate the initial population’s positions.
6: while the maximum number of iterations is not reached do
7:   Sequentially update the WSO control parameters (ν, p1, p2, μ, a, b, wo, f, mv, and ss), respectively.
8:   for each white shark i do
9:     Update the velocity of white shark i.
10:  end for
11:  for each white shark i do
12:    if rand < mv then
13:      Update the position of white shark i using the boundary vectors a, b, and wo (exploration).
14:    else
15:      Update the position of white shark i using its velocity and the wavy-motion frequency f.
16:    end if
17:  end for
18:  for each white shark i do
19:    if rand ≤ ss then
20:      Compute the distance to the best white shark found so far.
21:      if i = 1 then
22:        Move the shark toward the best solution.
23:      else
24:        Generate a candidate position toward the best solution and combine it with the current position to model the fish-school behavior.
25:      end if
26:    end if
27:    for each white shark i do
28:      for each dimension j do
29:        Obtain the transfer value by applying the transfer function to the position.
30:        Get the chaotic map value.
31:        Obtain the binary position by applying the binarization rule using a chaotic number.
32:        Obtain the transfer value by applying the transfer function to the velocity.
33:        Get the chaotic map value.
34:        Obtain the binary velocity by applying the binarization rule using a chaotic number.
35:      end for
36:    end for
37:  end for
38:  Adjust the white sharks’ positions that cross the boundary limits.
39:  Evaluate and adjust the updated positions.
40: end while
41: Return the best solution obtained up to now.
7. Results and Discussion
To assess the proposed method, various continuous metaheuristics will be employed to tackle the Set Covering problem and then compared against the White Shark Optimizer (WSO). The metaheuristics include the Grey Wolf Optimizer (GWO), Pendulum Search Algorithm (PSA), Sine Cosine Algorithm (SCA), Whale Optimization Algorithm (WOA), and Moth–Flame Optimization (MFO), with each tested across 16 unique configurations, as shown in
Table 4.
7.1. Metaheuristics
The performance of WSO will be assessed using these algorithms to provide a comprehensive evaluation. The GWO, based on social hunting behaviors, will serve as a benchmark. The PSA and SCA offer innovative approaches with oscillatory movements and trigonometric functions, respectively, while the WOA and MFO are inspired by natural predator and moth behaviors. This range of metaheuristics ensures a thorough assessment of WSO’s strengths and weaknesses compared to both traditional and novel optimization techniques.
7.1.1. Grey Wolf Optimizer
Modeled after the social hierarchy found in grey wolf packs [
46], the Grey Wolf Optimizer (GWO) introduces four key hierarchical roles: α (the current best solution), β (the second-best solution), δ (the third-best, fostering diversity), and ω (the remaining candidate solutions). This hierarchy maintains a balance between exploration and exploitation within the search space, offering an effective method for tackling complex optimization challenges.
The pseudocode can be seen in Algorithm 3.
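The hierarchy-guided update at the heart of GWO can be sketched as follows; this is a minimal per-dimension illustration of the standard GWO equations from the literature, not the exact code used in the experiments.

```python
import random

def gwo_update(x, alpha, beta, delta, a):
    """Update one solution (list of floats) toward the alpha, beta, and
    delta wolves; `a` decreases linearly from 2 to 0 over the iterations."""
    new_x = []
    for j in range(len(x)):
        guided = []
        for leader in (alpha, beta, delta):
            A = 2.0 * a * random.random() - a   # exploration/exploitation coefficient
            C = 2.0 * random.random()           # prey-weighting coefficient
            D = abs(C * leader[j] - x[j])       # distance to this leader
            guided.append(leader[j] - A * D)
        new_x.append(sum(guided) / 3.0)         # average of the three guides
    return new_x

# One update with a = 1.0 (mid-run): the solution moves toward the leaders.
x_new = gwo_update([5.0, 5.0], [0.0, 0.0], [0.1, 0.1], [0.2, 0.2], a=1.0)
```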
7.1.2. Pendulum Search Algorithm
Developed in 2022, the Pendulum Search Algorithm (PSA) is a population-based metaheuristic that draws inspiration from the harmonic motion of a simple pendulum [
47]. The initial solutions are generated randomly, and their positions are adjusted through the following method:
Let x_ij(t) represent the position of the i-th solution in the j-th dimension during iteration t, and let Best_j denote the position of the optimal solution in the j-th dimension. The position is then updated as follows:

x_ij(t + 1) = x_ij(t) + pend_ij(t) · (Best_j − x_ij(t)),  where  pend_ij(t) = 2 · e^(−t/t_max) · cos(2π · rand).

Here, t denotes the current iteration, t_max represents the total number of iterations, and rand is a random value uniformly distributed within the interval [0, 1].
The pseudocode can be seen in Algorithm 4.
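The PSA update can be sketched as below; the damped-cosine pendulum factor follows the formulation in the PSA paper, and this minimal version is an illustration rather than the experimental implementation.

```python
import math
import random

def psa_update(x, best, t, t_max):
    """One Pendulum Search Algorithm position update: each coordinate moves
    toward the best solution, scaled by a damped harmonic 'pendulum' factor."""
    new_x = []
    for j in range(len(x)):
        # pend = 2 * e^(-t/t_max) * cos(2*pi*rand): oscillates with shrinking
        # amplitude over the iterations, mimicking a damped simple pendulum.
        pend = 2.0 * math.exp(-t / t_max) * math.cos(2.0 * math.pi * random.random())
        new_x.append(x[j] + pend * (best[j] - x[j]))
    return new_x

x_new = psa_update([1.0, 4.0], best=[0.0, 0.0], t=100, t_max=500)
```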
Algorithm 3 Grey Wolf Optimizer
1: Input: The population X
2: Output: The updated population X and the best individual X_α
3: Initialize the population X randomly
4: Evaluate the objective function for each individual in the population X
5: Determine the best individual in the population (X_α)
6: for iteration (t) do
7:   a = 2 − t · (2/T)
8:   Determine the alpha wolf (X_α)   ▹ X_α is the best solution
9:   Determine the beta wolf (X_β)   ▹ X_β is the second-best solution
10:  Determine the delta wolf (X_δ)   ▹ X_δ is the third-best solution
11:  for solution (i) do
12:    for dimension (j) do
13:      Update the random values r1 and r2
14:      A1 = 2a · r1 − a;  C1 = 2 · r2
15:      D_α = |C1 · X_α,j − X_i,j|
16:      X1 = X_α,j − A1 · D_α
17:      Update the random values r1 and r2
18:      A2 = 2a · r1 − a;  C2 = 2 · r2
19:      D_β = |C2 · X_β,j − X_i,j|
20:      X2 = X_β,j − A2 · D_β
21:      Update the random values r1 and r2
22:      A3 = 2a · r1 − a;  C3 = 2 · r2
23:      D_δ = |C3 · X_δ,j − X_i,j|
24:      X3 = X_δ,j − A3 · D_δ
25:      X_i,j = (X1 + X2 + X3)/3
26:    end for
27:  end for
28:  Determine the objective function value for each member of the population X
29:  Update X_α
30: end for
31: return the updated population X, with X_α as the best result
7.1.3. Sine Cosine Algorithm
Introduced by Mirjalili in 2016, the Sine Cosine Algorithm (SCA) is a metaheuristic designed to address continuous optimization problems. It takes inspiration from the alternating behavior of sine and cosine functions [
48]. The positions of the solutions are then updated using the equations provided below:

X_ij(t + 1) = X_ij(t) + r1 · sin(r2) · |r3 · P_j(t) − X_ij(t)|,  if r4 < 0.5
X_ij(t + 1) = X_ij(t) + r1 · cos(r2) · |r3 · P_j(t) − X_ij(t)|,  if r4 ≥ 0.5

X_i(t) represents the position of the i-th solution at iteration t, while P(t) denotes the position of the best solution found so far (the destination point). The variables r1, r2, r3, and r4 are random numbers: r1 = a − t · (a/T) decreases linearly over the iterations, r2 lies within the interval [0, 2π], r3 lies within [0, 2], and r4 ranges over [0, 1].
These equations enable both exploration and exploitation of the search space by utilizing sine and cosine functions.
The pseudocode can be seen in Algorithm 5.
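The SCA update can be sketched as follows, using the standard equations from Mirjalili's paper; this is a minimal illustration, not the experimental code.

```python
import math
import random

def sca_update(x, dest, t, t_max, a=2.0):
    """One Sine Cosine Algorithm update: each coordinate oscillates around the
    destination (best-so-far) point `dest` via sine or cosine, chosen at random."""
    r1 = a - t * (a / t_max)   # decreases linearly: exploration -> exploitation
    new_x = []
    for j in range(len(x)):
        r2 = random.uniform(0.0, 2.0 * math.pi)
        r3 = random.uniform(0.0, 2.0)
        r4 = random.random()
        step = r1 * (math.sin(r2) if r4 < 0.5 else math.cos(r2))
        new_x.append(x[j] + step * abs(r3 * dest[j] - x[j]))
    return new_x

x_new = sca_update([3.0, -1.0], dest=[0.0, 0.0], t=250, t_max=500)
```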
Algorithm 4 Pendulum Search Algorithm
1: Input: The set of individuals X
2: Output: The modified population X and the Best
3: Generate a random population X
4: Compute the objective function for each individual in the population X
5: Select the best individual in the population (Best)
6: for iteration (t) do
7:   for solution (i) do
8:     for dimension (j) do
9:       Update pend_ij(t)
10:      Update the position of x_ij(t)
11:    end for
12:  end for
13:  Assess the objective function for every individual in the population X
14:  Update Best
15: end for
16: return the modified population X, with Best as the optimal result
Algorithm 5 Sine Cosine Algorithm
1: Input: The set of individuals X
2: Output: The modified population X and P
3: Generate a random population X
4: Compute the objective function for each member of the population X
5: Select the best individual in the population (P)
6: a = 2
7: for iteration (t) do
8:   r1 = a − t · (a/T)
9:   for solution (i) do
10:    for dimension (j) do
11:      Update r2 ∈ [0, 2π], r3 ∈ [0, 2], and r4 ∈ [0, 1] at random
12:      if r4 < 0.5 then
13:        X_i,j = X_i,j + r1 · sin(r2) · |r3 · P_j − X_i,j|
14:      else
15:        X_i,j = X_i,j + r1 · cos(r2) · |r3 · P_j − X_i,j|
16:      end if
17:    end for
18:  end for
19:  Calculate the objective function for each individual in the population X
20:  Update P
21: end for
22: return the modified population X, with P as the optimal result
7.1.4. Whale Optimization Algorithm
Developed by Mirjalili and Lewis in 2016, the Whale Optimization Algorithm (WOA) is a metaheuristic inspired by the social structure and hunting strategies of whales [
49]. It emulates the ‘bubble-net feeding’ technique, which is a complex and coordinated approach used by whales to capture their prey.
The WOA employs three primary phases during its search and optimization process:
Search for Prey: The whale agents explore the search space to find the target (optimal solution). Each agent’s position is updated based on a randomly chosen agent, which may not always be the best identified so far, promoting a wider and more diverse exploration.
Surrounding the Prey: After identifying the prey (optimal solution), the whales arrange themselves to surround it. This phase marks an intensification step, concentrating the algorithm’s focus on the vicinity of the identified promising solution.
Bubble-Net Strategy: In the final phase, whales employ the bubble-net strategy to capture the prey, refining the search within the chosen region to optimize the solution further.
The position of the whales is updated using the following equations:

D = |C · X*(t) − X(t)|,  X(t + 1) = X*(t) − A · D  (encircling the prey)
X(t + 1) = D′ · e^(b·l) · cos(2π·l) + X*(t),  with D′ = |X*(t) − X(t)|  (bubble-net spiral)

Here, X* denotes the position of the optimal solution, with A = 2a · r − a and C = 2 · r serving as coefficient vectors, where a decreases linearly from 2 to 0 over the iterations, r is a random vector in [0, 1], b shapes the logarithmic spiral, and l is a random number in [−1, 1]. The term D defines the distance between the whale and the prey.
This approach allows WOA to achieve a balance between exploration and exploitation, making it a flexible method for solving a variety of complex optimization problems.
The pseudocode can be seen in Algorithm 6.
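The three-branch scheme above can be sketched for a single whale as follows; this is an illustration of the standard WOA equations, with the spiral constant b = 1 assumed for simplicity.

```python
import math
import random

def woa_update(x, best, population, a, b=1.0):
    """One Whale Optimization Algorithm update for a single whale, following
    the three branches: encircling, random search, or bubble-net spiral."""
    p = random.random()                     # branch selector
    A = 2.0 * a * random.random() - a       # coefficient vector (scalar here)
    C = 2.0 * random.random()
    l = random.uniform(-1.0, 1.0)           # spiral position parameter
    if p < 0.5:
        # Encircle the best solution when |A| < 1, otherwise explore
        # around a randomly chosen whale.
        ref = best if abs(A) < 1 else random.choice(population)
        return [ref[j] - A * abs(C * ref[j] - x[j]) for j in range(len(x))]
    # Bubble-net attack: logarithmic spiral around the best solution.
    return [abs(best[j] - x[j]) * math.exp(b * l) * math.cos(2.0 * math.pi * l) + best[j]
            for j in range(len(x))]

pop = [[1.0, 2.0], [3.0, 1.0]]
x_new = woa_update([3.0, 1.0], best=[1.0, 2.0], population=pop, a=0.5)
```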
7.1.5. Moth–Flame Optimization
The Moth–Flame Optimization (MFO) algorithm simulates the way moths navigate around artificial lights [
50]. In MFO, moth positions serve as candidate solutions and are organized in a matrix, while their fitness values are stored in an array. The optimal positions, known as flames, are updated and saved in a matrix as well.
Each moth’s position is adjusted based on its corresponding flame using the equation below:

M_i = S(M_i, F_j)

where S is a logarithmic spiral function defined by

S(M_i, F_j) = D_i · e^(b·t) · cos(2π·t) + F_j

and D_i represents the separation between the moth M_i and the flame F_j, computed as D_i = |F_j − M_i|. Parameter b determines the form of the logarithmic spiral, while t is a randomly chosen number within the interval [r, 1], where r gradually decreases from −1 to −2 as the iterations progress.
To prevent getting stuck in local optima, each moth adjusts its position based on a flame. During each iteration, the list of flames is updated and arranged according to their fitness values. The top-performing moth is aligned with the best flame, and the least-performing moth is matched with the worst flame. The total number of flames is dynamically reduced throughout the iterations using the following approach:

flame_no = round(N − iter · (N − 1)/MaxIter)

where N denotes the total number of moths, while iter and MaxIter refer to the current iteration count and the maximum number of iterations, respectively.
The pseudocode can be seen in Algorithm 7.
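The spiral movement and the shrinking flame count can be sketched as follows; the spiral constant b = 1 is assumed here for illustration.

```python
import math

def mfo_spiral(moth, flame, t, b=1.0):
    """Logarithmic spiral S(M_i, F_j) moving a moth around its flame:
    S = D * e^(b*t) * cos(2*pi*t) + F, with D = |F - M| per dimension."""
    return [abs(flame[j] - moth[j]) * math.exp(b * t) * math.cos(2.0 * math.pi * t) + flame[j]
            for j in range(len(moth))]

def flame_count(N, it, max_iter):
    """Number of flames shrinks from N down to 1 as the iterations progress."""
    return round(N - it * (N - 1) / max_iter)

# With t = -1 the moth spirals tightly toward its flame.
new_moth = mfo_spiral([2.0, -1.0], flame=[0.0, 0.0], t=-1.0)
```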
Algorithm 6 Whale Optimization Algorithm
1: Input: The set of individuals X
2: Output: The modified population X and X*
3: Generate a random population X
4: Calculate the objective function for each individual in X
5: Select the best individual in X (X*)
6: a = 2
7: for iteration (t) do
8:   a = 2 − t · (2/T)
9:   for solution (i) do
10:    Update r ∈ [0, 1], A = 2a · r − a, and C = 2 · r
11:    Update l ∈ [−1, 1] and p ∈ [0, 1] at random
12:    if p < 0.5 then
13:      if |A| < 1 then
14:        for dimension (j) do
15:          D = |C · X*_j − X_i,j|
16:          X_i,j = X*_j − A · D
17:        end for
18:      else
19:        X_rand = a randomly selected individual from the population
20:        for dimension (j) do
21:          D = |C · X_rand,j − X_i,j|
22:          X_i,j = X_rand,j − A · D
23:        end for
24:      end if
25:    else
26:      for dimension (j) do
27:        D′ = |X*_j − X_i,j|
28:        X_i,j = D′ · e^(b·l) · cos(2π·l) + X*_j
29:      end for
30:    end if
31:  end for
32:  Calculate the objective function for every individual in X
33:  Update X*
34: end for
35: return the modified population X, with X* as the best result
Algorithm 7 Moth–Flame Optimization
Input: N (size of the population), T (maximum iterations), dim (number of dimensions). Output: The global optimum (best flame).
1: procedure MFO
2:   Initialize the moth population M
3:   while iter ≤ T do
4:     Update the number of flames (flame_no)
5:     Compute the fitness function of M and store it as OM
6:     if iter = 1 then
7:       OF = sort(OM)
8:       F = sort(M)
9:     else
10:      OF = sort(OM_{iter−1}, OM_{iter})
11:      F = sort(F_{iter−1}, F_{iter})
12:    end if
13:    Identify the best flame
14:    for i = 1 to N do
15:      for j = 1 to dim do
16:        Update r and t
17:        Compute D with respect to the corresponding flame
18:        Modify M_{i,j} using the spiral function
19:      end for
20:    end for
21:    Compute the probability value
22:    Update the new positions
23:    iter = iter + 1
24:  end while
25:  return the global optimum (the best flame)
26: end procedure
7.2. Experimental Setup
For parameter configuration, a population of 30 individuals was used, and 500 iterations were performed, as shown in
Table 5.
Considering the number of metaheuristics (6), the number of instances (3), the number of versions of each metaheuristic (16), and the number of independent runs (3), the total number of experiments is 6 × 3 × 16 × 3 = 864.
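The experiment count follows directly from multiplying the four factors:

```python
metaheuristics = 6
instances = 3
versions = 16
runs = 3
total_experiments = metaheuristics * instances * versions * runs  # 6 * 3 * 16 * 3
```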
Concerning the software employed in the experiments, Python 3.10.5 (64-bit) was used on a system running Windows 10. In terms of hardware, the experiments were executed on a computer equipped with an Intel Core i7-8650U processor operating at 2.11 GHz and 16 GB of RAM.
7.3. Summary of Results
For
Table 6,
Table 7,
Table 8,
Table 9,
Table 10 and
Table 11, we have a summary of the experiments. The first column lists the configuration of each experiment, and the first row gives the SCP instance solved. For each SCP instance, we report Best, Avg, and RPD, where Best is the best result obtained, Avg is the average of the best fitness over all runs, and RPD is the Relative Percentage Deviation, which measures the percentage deviation of the obtained solution from the best-known solution and thus evaluates the quality of the algorithm. It is computed as RPD = 100 · (Best − Opt)/Opt, where Opt represents the optimal value for the given instance and Best refers to the highest-quality solution obtained from the experiment.
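The RPD metric can be computed as follows; the instance optima used in the check (429 for SCP41, 253 for SCP51) are the best-known values implied by the deviations reported in the tables, quoted here only to verify the arithmetic.

```python
def rpd(best, opt):
    """Relative Percentage Deviation: how far (in %) the best solution
    found is from the best-known optimum for the instance."""
    return 100.0 * (best - opt) / opt

# SCP41: a Best of 433 against the best-known value 429 gives RPD = 0.932,
# matching the deviation reported for the best configurations.
deviation = round(rpd(433, 429), 3)
```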
The results presented in
Table 6 allow for a detailed comparison of WSO’s performance in relation to five other metaheuristics—GWO, PSA, SCA, MFO, and WOA—across the three instances of the “Set Covering” problem (SCP41, SCP51, SCP61).
Figure 7,
Figure 8,
Figure 9,
Figure 10,
Figure 11,
Figure 12,
Figure 13,
Figure 14,
Figure 15,
Figure 16,
Figure 17 and
Figure 18 show the convergence curves of the experiments.
For the SCP41 instance, WSO stands out by achieving a Best value of 433, matched by GWO and slightly surpassed by PSA in some configurations. However, it is important to note that WSO’s performance varies across its different configurations. For example, while V3-STD_PIECE shows an RPD of 0.932, V3-ELIT presents an RPD of 1.399. Compared to other algorithms, WSO demonstrates competitive performance in some configurations, but it is not uniformly superior. For instance, SCA presents an RPD of 0.932 in its best configuration, matching WSO’s best result, while MFO and WOA generally show higher RPDs.
In the SCP51 instance, WSO maintains its competitiveness by achieving an RPD as low as 3.557 in the V3-STD_LOG configuration. This result is particularly noteworthy when compared to PSA and MFO, which present higher RPD values, indicating greater dispersion in their results. Although GWO achieves a competitive Best value, a closer examination suggests that it may exhibit greater stability than WSO. In
Table 7 (GWO, scp51), only two values differ from 5.534, while in
Table 6 (WSO, scp51), five RPD values differ from 5.534. This indicates that GWO could be the more stable method, with MFO showing comparable stability.
Finally, in SCP61, WSO continues to demonstrate its effectiveness by achieving a minimum RPD of 2.174 in several configurations, including V3-ELIT_CIRCLE and V3-STD_PIECE. Compared to WOA and SCA, which present higher RPDs, WSO demonstrates a superior ability to maintain consistent and near-optimal results. Although GWO also shows strong performance in this instance, WSO manages to match or surpass it in terms of stability and average consistency.
In conclusion, WSO positions itself as one of the most robust and consistent metaheuristics in this set of experiments, achieving an effective balance between exploration and exploitation. Compared to GWO, PSA, SCA, MFO, and WOA, WSO stands out for its ability to maintain low RPD values and obtain consistent average results, making it especially suitable for efficiently and reliably solving instances of the “Set Covering” problem.
7.4. Statistical Tests
Prior to performing the statistical analysis, it was crucial to select the most suitable type of test, deciding between parametric and non-parametric options. In this study, a non-parametric test was chosen because the data originated from machines and did not exhibit the normal distribution typical of natural processes. Given the independence of the samples, the Wilcoxon–Mann–Whitney test was employed for the analysis. This test helps identify significant differences between algorithms, offering detailed insights into their comparative performance.
To apply the chosen statistical test, we used the scipy.stats.mannwhitneyu function from the Python SciPy library [
51] (accessed on 6 October 2024). We set the “alternative” parameter to “less” in that function. Our analysis focused on comparing two different metaheuristics. A resulting
p-value less than 0.05 would indicate that the sample from the first metaheuristic (MhA) is statistically smaller than that of the second (MhB).
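The comparison logic can be sketched without SciPy as follows; this is a plain-Python normal-approximation version of the one-sided test, assuming untied samples, whereas the experiments called scipy.stats.mannwhitneyu directly.

```python
import math

def mann_whitney_less(sample_a, sample_b):
    """One-sided Mann-Whitney U test (H1: sample_a tends to be smaller than
    sample_b) via the normal approximation with continuity correction.
    Assumes no ties between the samples; for real use, prefer
    scipy.stats.mannwhitneyu(sample_a, sample_b, alternative='less')."""
    n1, n2 = len(sample_a), len(sample_b)
    pooled = sorted(sample_a + sample_b)
    # Rank sum of sample_a (ranks start at 1; no ties assumed).
    r1 = sum(pooled.index(v) + 1 for v in sample_a)
    u1 = r1 - n1 * (n1 + 1) / 2.0
    mean = n1 * n2 / 2.0
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (u1 + 0.5 - mean) / sd                          # continuity correction
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # Phi(z)

# Hypothetical fitness samples where MhA is clearly smaller (better) than MhB.
p_value = mann_whitney_less([430, 433, 435, 436], [440, 442, 445, 450])
```

A p-value below 0.05, as here, would support the claim that the first metaheuristic's results are statistically smaller than the second's.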
The statistical analysis of the metaheuristics on the SCP41, SCP51, and SCP61 instances was performed using the Wilcoxon–Mann–Whitney test to compare WSO with other metaheuristics: GWO, PSA, SCA, MFO, and WOA. The detailed results of the statistical test for each instance are presented below and are shown in
Table 12,
Table 13 and
Table 14.
Instance SCP41
For the SCP41 instance, the statistical test revealed that WSO configurations with chaotic maps (STD and ELITE) had p-values less than 0.05 compared to several other configurations. Specifically, WSO with STD and ELITE in chaotic configurations showed statistically significant differences compared to STD and ELITE configurations from other experiments, as well as against STD-SINGER, STD-CIRCLE, ELIT, and ELIT-SINGER. These p-values indicate that WSO performed significantly better in these specific configurations compared to the other evaluated metaheuristics.
Instance SCP51
In the SCP51 instance, the statistical test also showed that WSO with chaotic maps (STD and ELITE) had p-values less than 0.05 compared to STD and ELITE configurations, as well as compared to STD-SINU, ELIT, ELITE-SINU, and ELITE-CIRCLE. This finding suggests that WSO performed statistically superior in these configurations compared to the other metaheuristics in this instance, highlighting greater effectiveness compared to the evaluated alternatives.
Instance SCP61
Finally, in the SCP61 instance, WSO with chaotic maps (STD and ELITE) obtained p-values less than 0.05 compared to STD-CIRCLE and ELITE. These results demonstrate that WSO exhibited statistically superior performance in these specific configurations compared to other metaheuristics evaluated for SCP61.
In summary, the statistical analysis confirms that WSO, particularly with chaotic map configurations, shows superior performance compared to other metaheuristics across the three “Set Covering” problem instances. The p-values below 0.05 in multiple comparisons indicate that WSO performs statistically significantly better than the other evaluated techniques. These results reinforce the robustness and effectiveness of WSO in solving “Set Covering” problem instances.
7.5. Exploration vs. Exploitation
Within metaheuristics, a key challenge is achieving the right balance between exploration and exploitation throughout the search process. Exploration enables the algorithm to probe new regions of the search space, whereas exploitation hones in on enhancing solutions in areas already deemed promising. If these two elements are not properly balanced, the algorithm may either become stuck in local optima (excessive exploitation) or struggle to converge within a feasible timeframe (excessive exploration).
During the experimentation with the metaheuristic without the application of chaotic maps, it was observed that the algorithm maintained a reasonable balance between exploration and exploitation. As shown in
Figure 19 and
Figure 20, at the beginning of the execution, exploration is predominantly high, which is expected and desirable as the algorithm attempts to cover a large portion of the search space to avoid falling into early suboptimal solutions. As the iterations progress, exploitation starts to gain prominence, allowing for the refinement of the solutions found.
When introducing chaotic maps into the metaheuristic, a noticeable change in the dynamics between exploration and exploitation was observed. Chaotic maps, known for their ability to introduce variability and avoid repetitive cycles, significantly altered the behavior of the algorithm, favoring a faster convergence. As illustrated in
Figure 21 and
Figure 22, the algorithm exhibited an accelerated decrease in the exploration phase, with an almost immediate increase in exploitation.
While rapid convergence may seem advantageous, it is crucial to consider that excessive exploitation from early stages can lead the algorithm to premature convergence, limiting the diversity of solutions explored and potentially trapping the process in local optima. However, in this particular case, the application of chaotic maps proved beneficial in scenarios where the speed of convergence is prioritized over the diversity of solutions.
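One widely used way to quantify this balance — the text does not state which measure underlies Figures 19–22, so the dimension-wise diversity measure below is an assumption for illustration — derives exploration and exploitation percentages from the population's spread around its per-dimension median.

```python
def diversity(population):
    """Dimension-wise diversity: mean absolute deviation of each dimension
    from its median, averaged over all dimensions."""
    n, dim = len(population), len(population[0])
    div = 0.0
    for j in range(dim):
        column = sorted(ind[j] for ind in population)
        median = column[n // 2] if n % 2 else 0.5 * (column[n // 2 - 1] + column[n // 2])
        div += sum(abs(median - ind[j]) for ind in population) / n
    return div / dim

def xpl_xpt(div_t, div_max):
    """Exploration% and exploitation% for one iteration, relative to the
    maximum diversity observed over the whole run."""
    xpl = 100.0 * div_t / div_max
    xpt = 100.0 * abs(div_t - div_max) / div_max
    return xpl, xpt

# A spread-out population is mostly exploring; a collapsed one is exploiting.
spread = diversity([[0.0, 0.0], [10.0, 10.0], [5.0, 5.0], [2.0, 8.0]])
tight = diversity([[5.0, 5.0], [5.1, 5.0], [5.0, 5.1], [4.9, 5.0]])
```

Under this measure, the rapid drop in exploration described above corresponds to the population diversity collapsing early in the run when chaotic maps are enabled.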
8. Conclusions
This work presents a comprehensive evaluation of the White Shark Optimizer (WSO) for solving the Set Covering Problem (SCP) in comparison with several metaheuristics: Grey Wolf Optimizer (GWO), Pendulum Search Algorithm (PSA), Sine Cosine Algorithm (SCA), Whale Optimization Algorithm (WOA), and Moth–Flame Optimization (MFO). The study focuses on adapting WSO to binary optimization by incorporating population and velocity binarization techniques, along with chaotic maps to enhance the algorithm’s exploration and convergence.
The results show that WSO demonstrates competitive performance in the SCP41, SCP51, and SCP61 instances, though its superiority is not uniform across all configurations. For SCP41, while WSO achieves a Best value of 433 in some configurations, it is matched by GWO and slightly surpassed by PSA. In SCP51, WSO maintains its competitiveness with low RPD values but does not consistently outperform GWO in stability. In SCP61, WSO achieves minimum RPD values in several configurations, highlighting its effectiveness in certain scenarios compared to WOA and SCA.
Despite the variability in performance, WSO’s use of chaotic maps generally improves its effectiveness, as confirmed by statistical analysis. The Wilcoxon–Mann–Whitney tests indicate that certain WSO configurations with chaotic maps significantly differ from other methods, emphasizing their potential for solving binary optimization problems.
In summary, this work validates the use of WSO with advanced binarization techniques and chaotic maps for combinatorial optimization. Although it may not be uniformly superior across all instances, its ability to achieve near-optimal solutions and maintain consistency in select configurations underscores its applicability. This study contributes empirical evidence to the growing field of metaheuristics, highlighting WSO’s strengths and identifying areas for further research in binary optimization.
Author Contributions
Conceptualization, F.L.-S. and B.C.; methodology, F.C.-C. and B.C.; software, F.L.-S., F.C.-C. and J.B.-G.; validation, B.C. and R.S.; formal analysis, F.L.-S.; investigation, F.L.-S., B.C., F.C.-C., J.B.-G. and R.S.; resources, F.L.-S., F.C.-C. and J.B.-G.; writing—original draft F.L.-S. and B.C.; writing—review and editing, B.C., F.C.-C., J.B.-G. and R.S.; supervision, B.C. and R.S.; funding acquisition, B.C. and R.S. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Data Availability Statement
The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding authors.
Acknowledgments
Felipe Cisternas-Caneo is supported by National Agency for Research and Development (ANID)/Scholarship Program/DOCTORADO NACIONAL/2023-21230203, and Jose Barrera-García is supported by the National Agency for Research and Development (ANID)/Scholarship Program/DOCTORADO NACIONAL/2024-21242516.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Weerasena, L.; Aththanayake, C.; Bandara, D. Advances in the decision-making of set covering models under uncertainty. Ann. Oper. Res. 2024. [Google Scholar] [CrossRef]
- Crawford, B.; Soto, R.; Astorga, G.; García, J. Constructive Metaheuristics for the Set Covering Problem. In Proceedings of the Bioinspired Optimization Methods and Their Applications, Paris, France, 16–18 May 2018; Springer International Publishing: Cham, Switzerland, 2018; pp. 88–99. [Google Scholar]
- Rajwar, K.; Deep, K.; Das, S. An exhaustive review of the metaheuristic algorithms for search and optimization: Taxonomy, applications, and open challenges. Artif. Intell. Rev. 2023, 56, 13187–13257. [Google Scholar] [CrossRef] [PubMed]
- Braik, M.; Hammouri, A.; Atwan, J.; Al-Betar, M.A.; Awadallah, M.A. White Shark Optimizer: A novel bio-inspired meta-heuristic algorithm for global optimization problems. Knowl.-Based Syst. 2022, 243, 108457. [Google Scholar] [CrossRef]
- Abdel-Basset, M.; Sallam, K.M.; Mohamed, R.; Elgendi, I.; Munasinghe, K.; Elkomy, O.M. An Improved Binary Grey-Wolf Optimizer with Simulated Annealing for Feature Selection. IEEE Access 2021, 9, 139792–139822. [Google Scholar] [CrossRef]
- Korashy, A.; Salah Kamel, F.J.; Youssef, A.R. Hybrid Whale Optimization Algorithm and Grey Wolf Optimizer Algorithm for Optimal Coordination of Direction Overcurrent Relays. Electr. Power Components Syst. 2019, 47, 644–658. [Google Scholar] [CrossRef]
- Tapia, D.; Crawford, B.; Soto, R.; Cisternas-Caneo, F.; Lemus-Romani, J.; Castillo, M.; García, J.; Palma, W.; Paredes, F.; Misra, S. A Q-Learning Hyperheuristic Binarization Framework to Balance Exploration and Exploitation. In Proceedings of the 11th International Conference on Applied Informatics, Eger, Hungary, 29–31 January 2020; Florez, H., Misra, S., Eds.; Springer: Cham, Switzerland, 2020; pp. 14–28. [Google Scholar] [CrossRef]
- Lemus-Romani, J.; Crawford, B.; Cisternas-Caneo, F.; Soto, R.; Becerra-Rozas, M. Binarization of Metaheuristics: Is the Transfer Function Really Important? Biomimetics 2023, 8, 400. [Google Scholar] [CrossRef] [PubMed]
- Ibrahim, A.M.; Tawhid, M.A. Chaotic electromagnetic field optimization. Artif. Intell. Rev. 2022, 56, 9989–10030. [Google Scholar] [CrossRef]
- Cisternas-Caneo, F.; Crawford, B.; Soto, R.; Giachetti, G.; Paz, Á.; Peña Fritz, Á. Chaotic Binarization Schemes for Solving Combinatorial Optimization Problems Using Continuous Metaheuristics. Mathematics 2024, 12, 262. [Google Scholar] [CrossRef]
- Agrawal, P.; Ganesh, T.; Mohamed, A.W. Chaotic gaining sharing knowledge-based optimization algorithm: An improved metaheuristic algorithm for feature selection. Soft Comput. 2021, 25, 9505–9528. [Google Scholar] [CrossRef]
- Saremi, S.; Mirjalili, S.; Lewis, A. Biogeography-based optimisation with chaos. Neural Comput. Appl. 2014, 25, 1077–1097. [Google Scholar] [CrossRef]
- Igel, C. No Free Lunch Theorems: Limitations and Perspectives of Metaheuristics. In Theory and Principled Methods for the Design of Metaheuristics; Borenstein, Y., Moraglio, A., Eds.; Springer: Berlin/Heidelberg, Germany, 2014; pp. 1–23. [Google Scholar] [CrossRef]
- Wolpert, D.; Macready, W. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
- Becerra-Rozas, M.; Lemus-Romani, J.; Cisternas-Caneo, F.; Crawford, B.; Soto, R.; Astorga, G.; Castro, C.; García, J. Continuous Metaheuristics for Binary Optimization Problems: An Updated Systematic Literature Review. Mathematics 2023, 11, 129. [Google Scholar] [CrossRef]
- Beasley, J.; Chu, P. A genetic algorithm for the set covering problem. Eur. J. Oper. Res. 1996, 94, 392–404. [Google Scholar] [CrossRef]
- Caserta, M. Tabu Search-Based Metaheuristic Algorithm for Large-scale Set Covering Problems. In Metaheuristics: Progress in Complex Systems Optimization; Doerner, K.F., Gendreau, M., Greistorfer, P., Gutjahr, W., Hartl, R.F., Reimann, M., Eds.; Springer: Boston, MA, USA, 2007; pp. 43–63. [Google Scholar] [CrossRef]
- Caprara, A.; Toth, P.; Fischetti, M. Algorithms for the Set Covering Problem. Ann. Oper. Res. 2000, 98, 353–371. [Google Scholar] [CrossRef]
- Crawford, B.; Soto, R.; Monfroy, E.; Castro, C.; Palma, W.; Paredes, F. A Hybrid Soft Computing Approach for Subset Problems. Math. Probl. Eng. 2013, 2013, 716069. [Google Scholar] [CrossRef]
- Kaveh, A. Chaos Embedded Metaheuristic Algorithms. In Advances in Metaheuristic Algorithms for Optimal Design of Structures; Springer International Publishing: Cham, Switzerland, 2017; pp. 375–398. [Google Scholar] [CrossRef]
- Pan, J.S.; Hu, P.; Snášel, V.; Chu, S.C. A Survey on Binary Metaheuristic Algorithms and Their Engineering Applications. Artif. Intell. Rev. 2023, 56, 6101–6167. [Google Scholar] [CrossRef]
- Ho, Y.; Pepyne, D. Simple Explanation of the No-Free-Lunch Theorem and Its Implications. J. Optim. Theory Appl. 2002, 115, 549–570. [Google Scholar] [CrossRef]
- Atali, G.; Pehlivan, İ.; Gürevin, B.; Şeker, H.İ. Chaos in metaheuristic based artificial intelligence algorithms: A short review. Turk. J. Electr. Eng. Comput. Sci. 2021, 29, 1354–1367. [Google Scholar] [CrossRef]
- Sun, Y.; Qi, G.; Wang, Z.; van Wyk, B.J.; Hamam, Y. Chaotic particle swarm optimization. In Proceedings of the First ACM/SIGEVO Summit on Genetic and Evolutionary Computation, GEC ’09, New York, NY, USA, 12–14 June 2009; pp. 505–510. [Google Scholar] [CrossRef]
- Javidi, M.; Hosseinpourfard, R. Chaos Genetic Algorithm Instead Genetic Algorithm. Int. Arab J. Inf. Technol. 2015, 12, 163–168. [Google Scholar]
- Determan, J.; Foster, J. Using chaos in genetic algorithms. In Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406), Washington, DC, USA, 6–9 July 1999; Volume 3, pp. 2094–2101. [Google Scholar] [CrossRef]
- Snaselova, P.; Zboril, F. Genetic Algorithm using Theory of Chaos. Procedia Comput. Sci. 2015, 51, 316–325. [Google Scholar] [CrossRef]
- Xiao, W.S.; Li, G.X.; Liu, C.; Tan, L.P. A Novel Chaotic and Neighborhood Search-Based Artificial Bee Colony Algorithm for Solving Optimization Problems. Sci. Rep. 2023, 13, 20496. [Google Scholar] [CrossRef] [PubMed]
- Hammouri, A.I.; Braik, M.S.; Al-hiary, H.H.; Abdeen, R.A. A Binary Hybrid Sine Cosine White Shark Optimizer for Feature Selection. Clust. Comput. 2024, 27, 7825–7867. [Google Scholar] [CrossRef]
- Alawad, N.A.; Abed-alguni, B.H.; Al-Betar, M.A.; Jaradat, A. Binary Improved White Shark Algorithm for Intrusion Detection Systems. Neural Comput. Appl. 2023, 35, 19427–19451. [Google Scholar] [CrossRef]
- Fathy, A.; Alanazi, A. An Efficient White Shark Optimizer for Enhancing the Performance of Proton Exchange Membrane Fuel Cells. Sustainability 2023, 15, 11741. [Google Scholar] [CrossRef]
- Farhat, M.; Kamel, S.; Elseify, M.A.; Abdelaziz, A.Y. A Modified White Shark Optimizer for Optimal Power Flow Considering Uncertainty of Renewable Energy Sources. Sci. Rep. 2024, 14, 3051. [Google Scholar] [CrossRef]
- Fathy, A.; Yousri, D.; Alharbi, A.G.; Abdelkareem, M.A. A New Hybrid White Shark and Whale Optimization Approach for Estimating the Li-Ion Battery Model Parameters. Sustainability 2023, 15, 5667. [Google Scholar] [CrossRef]
- Crawford, B.; Soto, R.; Astorga, G.; García, J.; Castro, C.; Paredes, F. Putting Continuous Metaheuristics to Work in Binary Search Spaces. Complexity 2017, 2017, 8404231. [Google Scholar] [CrossRef]
- Kennedy, J.; Eberhart, R. A discrete binary version of the particle swarm algorithm. In Proceedings of the 1997 IEEE International Conference on Systems, Man, and Cybernetics, Computational Cybernetics and Simulation, Orlando, FL, USA, 12–15 October 1997; Volume 5, pp. 4104–4108. [Google Scholar] [CrossRef]
- Lanza-Gutierrez, J.M.; Crawford, B.; Soto, R.; Berrios, N.; Gomez-Pulido, J.A.; Paredes, F. Analyzing the effects of binarization techniques when solving the set covering problem through swarm optimization. Expert Syst. Appl. 2017, 70, 67–82. [Google Scholar] [CrossRef]
- Lee, S.; Soak, S.; Oh, S.; Pedrycz, W.; Jeon, M. Modified binary particle swarm optimization. Prog. Nat. Sci. 2008, 18, 1161–1166. [Google Scholar] [CrossRef]
- Chuang, L.Y.; Chang, H.W.; Tu, C.J.; Yang, C.H. Improved binary PSO for feature selection using gene expression data. Comput. Biol. Chem. 2008, 32, 29–38. [Google Scholar] [CrossRef]
- Yuan, X.; Nie, H.; Su, A.; Wang, L.; Yuan, Y. An improved binary particle swarm optimization for unit commitment problem. Expert Syst. Appl. 2009, 36, 8049–8055. [Google Scholar] [CrossRef]
- Sarath, K.; Ravi, V. Association rule mining using binary particle swarm optimization. Eng. Appl. Artif. Intell. 2013, 26, 1832–1840. [Google Scholar] [CrossRef]
- Khanesar, M.A.; Teshnehlab, M.; Shoorehdeli, M.A. A novel binary particle swarm optimization. In Proceedings of the 2007 Mediterranean Conference on Control & Automation, Athens, Greece, 27–29 June 2007; pp. 1–6. [Google Scholar] [CrossRef]
- Lin, J.C.W.; Yang, L.; Fournier-Viger, P.; Hong, T.P.; Voznak, M. A binary PSO approach to mine high-utility itemsets. Soft Comput. 2017, 21, 5103–5121. [Google Scholar] [CrossRef]
- Nezamabadi-pour, H.; Rostami-Shahrbabaki, M.; Farsangi, M. Binary Particle Swarm Optimization: Challenges and New Solutions. J. Comput. Soc. Iran (CSI) Comput. Sci. Eng. (JCSE) 2008, 6, 21–32. [Google Scholar]
- Bansal, J.C.; Deep, K. A Modified Binary Particle Swarm Optimization for Knapsack Problems. Appl. Math. Comput. 2012, 218, 11042–11061. [Google Scholar] [CrossRef]
- Tasgetiren, M.; Liang, Y.C. A Binary Particle Swarm Optimization Algorithm for Lot Sizing Problem. J. Econ. Soc. Res. 2004, 5, 1–20. [Google Scholar]
- Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
- Ab. Aziz, N.A.; Ab. Aziz, K. Pendulum search algorithm: An optimization algorithm based on simple harmonic motion and its application for a vaccine distribution problem. Algorithms 2022, 15, 214. [Google Scholar] [CrossRef]
- Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
- Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
- Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
- SciPy Documentation: scipy.stats.mannwhitneyu. Available online: https://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.mannwhitneyu.html (accessed on 6 October 2024).
Figure 2. The senses of the white shark: smell, sight, and hearing [4].
Figure 3. Auditory sensor located on the shark’s torso [4].
Figure 4. Two-step binarization scheme.
Figure 5. S-shaped and V-shaped graphs.
Figure 6. Flowchart for BWSO.
Figure 7. The convergence curve for WSO-STD on the SCP41 instance.
Figure 8. The convergence curve for WSO-ELIT on the SCP41 instance.
Figure 9. The convergence curve for WOA-STD on the SCP41 instance.
Figure 10. The convergence curve for WOA-ELIT on the SCP41 instance.
Figure 11. The convergence curve for SCA-STD on the SCP41 instance.
Figure 12. The convergence curve for SCA-ELIT on the SCP41 instance.
Figure 13. The convergence curve for PSA-STD on the SCP41 instance.
Figure 14. The convergence curve for PSA-ELIT on the SCP41 instance.
Figure 15. The convergence curve for MFO-STD on the SCP41 instance.
Figure 16. The convergence curve for MFO-ELIT on the SCP41 instance.
Figure 17. The convergence curve for GWO-STD on the SCP41 instance.
Figure 18. The convergence curve for GWO-ELIT on the SCP41 instance.
Figure 19. Exploration vs. exploitation in WSO_STD for SCP61.
Figure 20. Exploration vs. exploitation in WSO_ELIT for SCP61.
Figure 21. Exploration vs. exploitation in WSO_STD-SINU for SCP61.
Figure 22. Exploration vs. exploitation in WSO_ELIT-SINU for SCP61.
Table 1. Chaotic maps.
| Type | Chaotic Map |
|---|---|
| Logistic Map | x_{k+1} = a·x_k(1 − x_k), a = 4 |
| Piecewise Map | x_{k+1} = x_k/P if 0 ≤ x_k < P; (x_k − P)/(0.5 − P) if P ≤ x_k < 0.5; (1 − P − x_k)/(0.5 − P) if 0.5 ≤ x_k < 1 − P; (1 − x_k)/P if 1 − P ≤ x_k < 1, where P = 0.4 |
| Sine Map | x_{k+1} = (a/4)·sin(π·x_k), where a = 4 |
| Singer Map | x_{k+1} = μ(7.86·x_k − 23.31·x_k² + 28.75·x_k³ − 13.302875·x_k⁴), μ = 1.07 |
| Sinusoidal Map | x_{k+1} = a·x_k²·sin(π·x_k), where a = 2.3 |
| Tent Map | x_{k+1} = x_k/0.7 if x_k < 0.7; (10/3)(1 − x_k) otherwise |
| Circle Map | x_{k+1} = mod(x_k + b − (a/2π)·sin(2π·x_k), 1), where a = 0.5, b = 0.2 |
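As a concrete illustration of how such a map supplies the pseudo-random sequence consumed by the binarization step, here is a minimal sketch of the logistic map in its common form x_{k+1} = 4·x_k(1 − x_k); this is the standard textbook parameterization, and the paper's exact setting may differ.

```python
def logistic_map(x0: float, n: int, a: float = 4.0) -> list[float]:
    """Generate n iterates of the logistic map x_{k+1} = a * x_k * (1 - x_k)."""
    seq, x = [], x0
    for _ in range(n):
        x = a * x * (1 - x)
        seq.append(x)
    return seq

# Each iterate lies in [0, 1], so it can stand in for a uniform random draw
# when a binarization rule compares it against a transfer-function value.
chaotic_sequence = logistic_map(0.7, 500)
```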
Table 2. Transfer functions.
| S-Shaped | Equation | V-Shaped | Equation |
|---|---|---|---|
| S1 | T(x) = 1/(1 + e^(−2x)) | V1 | T(x) = \|erf((√π/2)·x)\| |
| S2 | T(x) = 1/(1 + e^(−x)) | V2 | T(x) = \|tanh(x)\| |
| S3 | T(x) = 1/(1 + e^(−x/2)) | V3 | T(x) = \|x/√(1 + x²)\| |
| S4 | T(x) = 1/(1 + e^(−x/3)) | V4 | T(x) = \|(2/π)·arctan((π/2)·x)\| |
Table 3. Binarization rules.
| Type | Binarization Rule |
|---|---|
| Standard (STD) | x_new = 1 if rand ≤ T(x); 0 otherwise |
| Complement (COM) | x_new = complement(x) if rand ≤ T(x); 0 otherwise |
| Static Probability (SP) | x_new = 0 if T(x) ≤ α; x if α < T(x) ≤ (1 + α)/2; 1 otherwise |
| Elitist (ELIT) | x_new = x_best if rand < T(x); 0 otherwise |
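The Standard and Elitist rules (the two compared in the experiments) can be sketched as follows; the rule forms follow common usage in the cited binarization surveys, and the random draw `r` may come either from a uniform generator or from a chaotic map, which is exactly how the scheme combinations in the next table arise.

```python
import math

def v3(x: float) -> float:
    """V-shaped transfer function (commonly cited form)."""
    return abs(x / math.sqrt(1.0 + x * x))

def binarize_std(x: float, r: float) -> int:
    """Standard rule: set the bit when the draw r falls below T(x)."""
    return 1 if r < v3(x) else 0

def binarize_elit(x: float, r: float, best_bit: int) -> int:
    """Elitist rule: copy the best-known solution's bit when r falls below T(x)."""
    return best_bit if r < v3(x) else 0
```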
Table 4. Binarization schemes formed from binarization rules and chaotic maps.
| Chaotic Map | STD | ELIT |
|---|---|---|
| None | STD | ELIT |
| Logistic Map | STD_LOG | ELIT_LOG |
| Piecewise Map | STD_PIECE | ELIT_PIECE |
| Sine Map | STD_SINE | ELIT_SINE |
| Singer Map | STD_SINGER | ELIT_SINGER |
| Sinusoidal Map | STD_SINU | ELIT_SINU |
| Tent Map | STD_TENT | ELIT_TENT |
| Circle Map | STD_CIRCLE | ELIT_CIRCLE |
Table 5. Experimental parameters.
| Parameter | Value |
|---|---|
| Number of metaheuristics | 6 |
| Independent runs | 3 |
| Transfer function | V3 |
| Number of binarization schemes | 16 |
| Number of SCP instances | 3 |
| Population size | 30 |
| Number of iterations | 500 |
| WSO: a | 2 |
| WSO: f_min | 0.07 |
| WSO: f_max | 0.75 |
| WSO: τ | 4.125 |
| WSO: a₀ | 6.25 |
| WSO: a₁ | 100 |
| WSO: a₂ | 0.0005 |
Table 6. Results of WSO for SCP41, SCP51, and SCP61.
| Experiment | scp41 Best | scp41 Avg. | scp41 RPD | scp51 Best | scp51 Avg. | scp51 RPD | scp61 Best | scp61 Avg. | scp61 RPD |
|---|---|---|---|---|---|---|---|---|---|
V3-STD | 438 | 440 | 2.098 | 272 | 274.667 | 7.51 | 141 | 143.667 | 2.174 |
V3-STD_LOG | 433 | 433.333 | 0.932 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-STD_PIECE | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-STD_SINE | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-STD_SINGER | 434 | 434.333 | 1.166 | 267 | 267.333 | 5.534 | 141 | 141.333 | 2.174 |
V3-STD_SINU | 433 | 433 | 0.932 | 268 | 268 | 5.929 | 141 | 141 | 2.174 |
V3-STD_TENT | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-STD_CIRCLE | 434 | 435.667 | 1.166 | 267 | 267.667 | 5.534 | 147 | 148 | 6.522 |
V3-ELIT | 435 | 435.333 | 1.399 | 268 | 268.667 | 5.929 | 142 | 142 | 2.899 |
V3-ELIT_LOG | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_PIECE | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_SINE | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_SINGER | 434 | 434 | 1.166 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_SINU | 433 | 433 | 0.932 | 268 | 268 | 5.929 | 141 | 141 | 2.174 |
V3-ELIT_TENT | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_CIRCLE | 433 | 433.333 | 0.932 | 269 | 269 | 6.324 | 141 | 142.333 | 2.174 |
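The RPD (relative percentage deviation) columns are consistent with RPD = 100 × (Best − Opt)/Opt rounded to three decimals, with reference optima 429, 253, and 138 for scp41, scp51, and scp61 respectively (values recoverable from the tables themselves, e.g. Best = 138 on scp61 yields RPD = 0). A minimal check:

```python
def rpd(best: int, opt: int) -> float:
    """Relative percentage deviation of a best-found cost from the optimum."""
    return round(100.0 * (best - opt) / opt, 3)

# Reproduces the table entries: rpd(433, 429) gives 0.932,
# rpd(267, 253) gives 5.534, and rpd(141, 138) gives 2.174.
```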
Table 7. Results of GWO for SCP41, SCP51, and SCP61.
| Experiment | scp41 Best | scp41 Avg. | scp41 RPD | scp51 Best | scp51 Avg. | scp51 RPD | scp61 Best | scp61 Avg. | scp61 RPD |
|---|---|---|---|---|---|---|---|---|---|
V3-STD | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-STD_LOG | 434 | 434 | 1.166 | 262 | 265.667 | 3.557 | 141 | 141 | 2.174 |
V3-STD_PIECE | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-STD_SINE | 434 | 434.333 | 1.166 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-STD_SINGER | 434 | 434.667 | 1.166 | 268 | 268 | 5.929 | 141 | 141.333 | 2.174 |
V3-STD_SINU | 434 | 434 | 1.166 | 267 | 267.667 | 5.534 | 141 | 141 | 2.174 |
V3-STD_TENT | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-STD_CIRCLE | 433 | 433.333 | 0.932 | 267 | 267.667 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_LOG | 433 | 433 | 0.932 | 267 | 267.667 | 5.534 | 141 | 141.667 | 2.174 |
V3-ELIT_PIECE | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 142.333 | 2.174 |
V3-ELIT_SINE | 433 | 434.333 | 0.932 | 267 | 267.333 | 5.534 | 141 | 142 | 2.174 |
V3-ELIT_SINGER | 434 | 434 | 1.166 | 267 | 267 | 5.534 | 141 | 142.333 | 2.174 |
V3-ELIT_SINU | 433 | 433.667 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_TENT | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 142.333 | 2.174 |
V3-ELIT_CIRCLE | 433 | 434.333 | 0.932 | 267 | 267.333 | 5.534 | 145 | 145 | 5.072 |
Table 8. Results of PSA for SCP41, SCP51, and SCP61.
| Experiment | scp41 Best | scp41 Avg. | scp41 RPD | scp51 Best | scp51 Avg. | scp51 RPD | scp61 Best | scp61 Avg. | scp61 RPD |
|---|---|---|---|---|---|---|---|---|---|
V3-STD | 433 | 433 | 0.932 | 267 | 267.667 | 5.534 | 144 | 144.667 | 4.348 |
V3-STD_LOG | 434 | 434 | 1.166 | 267 | 267.667 | 5.534 | 142 | 143.667 | 2.899 |
V3-STD_PIECE | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 142.333 | 2.174 |
V3-STD_SINE | 434 | 434 | 1.166 | 267 | 267.667 | 5.534 | 141 | 143.667 | 2.174 |
V3-STD_SINGER | 434 | 434.667 | 1.166 | 268 | 268.667 | 5.929 | 142 | 142 | 2.899 |
V3-STD_SINU | 434 | 434.667 | 1.166 | 267 | 268 | 5.534 | 142 | 142 | 2.899 |
V3-STD_TENT | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 142.333 | 2.174 |
V3-STD_CIRCLE | 438 | 440.667 | 2.098 | 277 | 279.333 | 9.486 | 141 | 143.667 | 2.174 |
V3-ELIT | 433 | 434.333 | 0.932 | 267 | 267 | 5.534 | 144 | 144.667 | 4.348 |
V3-ELIT_LOG | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 142.667 | 2.174 |
V3-ELIT_PIECE | 433 | 434.333 | 0.932 | 267 | 267.333 | 5.534 | 141 | 142.333 | 2.174 |
V3-ELIT_SINE | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 145 | 145 | 5.072 |
V3-ELIT_SINGER | 434 | 434 | 1.166 | 267 | 267 | 5.534 | 141 | 142.333 | 2.174 |
V3-ELIT_SINU | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 143.667 | 2.174 |
V3-ELIT_TENT | 433 | 434.333 | 0.932 | 267 | 267 | 5.534 | 138 | 141.333 | 0 |
V3-ELIT_CIRCLE | 437 | 440 | 1.865 | 272 | 274.333 | 7.51 | 148 | 150.333 | 7.246 |
Table 9. Results of SCA for SCP41, SCP51, and SCP61.
| Experiment | scp41 Best | scp41 Avg. | scp41 RPD | scp51 Best | scp51 Avg. | scp51 RPD | scp61 Best | scp61 Avg. | scp61 RPD |
|---|---|---|---|---|---|---|---|---|---|
V3-STD | 433 | 433.667 | 0.932 | 267 | 267.667 | 5.534 | 142 | 143 | 2.899 |
V3-STD_LOG | 434 | 434 | 1.166 | 268 | 268.333 | 5.929 | 142 | 143 | 2.899 |
V3-STD_PIECE | 433 | 434.667 | 0.932 | 268 | 268 | 5.929 | 141 | 142.333 | 2.174 |
V3-STD_SINE | 434 | 434.333 | 1.166 | 268 | 268.333 | 5.929 | 141 | 143.667 | 2.174 |
V3-STD_SINGER | 434 | 435 | 1.166 | 268 | 268.667 | 5.929 | 142 | 142 | 2.899 |
V3-STD_SINU | 435 | 435.667 | 1.399 | 268 | 268.667 | 5.929 | 142 | 142 | 2.899 |
V3-STD_TENT | 434 | 434.333 | 1.166 | 267 | 267.333 | 5.534 | 141 | 142.667 | 2.174 |
V3-STD_CIRCLE | 434 | 434 | 1.166 | 267 | 267.667 | 5.534 | 141 | 144 | 2.174 |
V3-ELIT | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 143.667 | 2.174 |
V3-ELIT_LOG | 433 | 434.333 | 0.932 | 267 | 267.667 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_PIECE | 433 | 434.333 | 0.932 | 267 | 267.333 | 5.534 | 141 | 143.667 | 2.174 |
V3-ELIT_SINE | 433 | 434.333 | 0.932 | 267 | 267 | 5.534 | 141 | 142.333 | 2.174 |
V3-ELIT_SINGER | 433 | 433.333 | 0.932 | 258 | 264.667 | 1.976 | 141 | 141 | 2.174 |
V3-ELIT_SINU | 434 | 434 | 1.166 | 267 | 267 | 5.534 | 141 | 143.333 | 2.174 |
V3-ELIT_TENT | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 145 | 145 | 5.072 |
V3-ELIT_CIRCLE | 433 | 433 | 0.932 | 267 | 267.667 | 5.534 | 145 | 145 | 5.072 |
Table 10. Results of MFO for SCP41, SCP51, and SCP61.
| Experiment | scp41 Best | scp41 Avg. | scp41 RPD | scp51 Best | scp51 Avg. | scp51 RPD | scp61 Best | scp61 Avg. | scp61 RPD |
|---|---|---|---|---|---|---|---|---|---|
V3-STD | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 142.333 | 2.174 |
V3-STD_LOG | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-STD_PIECE | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-STD_SINE | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-STD_SINGER | 434 | 434 | 1.166 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-STD_SINU | 433 | 433.667 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-STD_TENT | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-STD_CIRCLE | 434 | 436.333 | 1.166 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_LOG | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_PIECE | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_SINE | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_SINGER | 434 | 434 | 1.166 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_SINU | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_TENT | 433 | 433 | 0.932 | 267 | 267.667 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_CIRCLE | 434 | 435.333 | 1.166 | 267 | 268.667 | 5.534 | 141 | 142.333 | 2.174 |
Table 11. Results of WOA for SCP41, SCP51, and SCP61.
| Experiment | scp41 Best | scp41 Avg. | scp41 RPD | scp51 Best | scp51 Avg. | scp51 RPD | scp61 Best | scp61 Avg. | scp61 RPD |
|---|---|---|---|---|---|---|---|---|---|
V3-STD | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 143.333 | 2.174 |
V3-STD_LOG | 433 | 433.333 | 0.932 | 267 | 267.667 | 5.534 | 141 | 143.333 | 2.174 |
V3-STD_PIECE | 433 | 433 | 0.932 | 257 | 264.667 | 1.581 | 144 | 144.667 | 4.348 |
V3-STD_SINE | 433 | 434.333 | 0.932 | 267 | 267.333 | 5.534 | 141 | 143.333 | 2.174 |
V3-STD_SINGER | 434 | 434.333 | 1.166 | 267 | 267.333 | 5.534 | 141 | 142.333 | 2.174 |
V3-STD_SINU | 433 | 433.667 | 0.932 | 267 | 267.333 | 5.534 | 141 | 142 | 2.174 |
V3-STD_TENT | 433 | 435.667 | 0.932 | 267 | 267 | 5.534 | 143 | 144 | 3.623 |
V3-STD_CIRCLE | 437 | 439.333 | 1.865 | 269 | 270.333 | 6.324 | 141 | 146.667 | 2.174 |
V3-ELIT | 433 | 434.333 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141.667 | 2.174 |
V3-ELIT_LOG | 433 | 434.333 | 0.932 | 267 | 269 | 5.534 | 141 | 142 | 2.174 |
V3-ELIT_PIECE | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 142.333 | 2.174 |
V3-ELIT_SINE | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 144 | 144.333 | 4.348 |
V3-ELIT_SINGER | 433 | 433 | 0.932 | 267 | 268 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_SINU | 433 | 433 | 0.932 | 267 | 267.333 | 5.534 | 141 | 141 | 2.174 |
V3-ELIT_TENT | 433 | 433 | 0.932 | 267 | 267 | 5.534 | 141 | 143.333 | 2.174 |
V3-ELIT_CIRCLE | 439 | 441.667 | 2.331 | 268 | 273 | 5.929 | 141 | 145.333 | 2.174 |
Table 12. Average p-values of the WSO experiments compared pairwise for SCP41.
| | STD | STD_LOG | STD_PIECE | STD_SINE | STD_SINGER | STD_SINU | STD_TENT | STD_CIRCLE | ELIT | ELIT_LOG | ELIT_PIECE | ELIT_SINE | ELIT_SINGER | ELIT_SINU | ELIT_TENT | ELIT_CIRCLE |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
STD | X | 0.988 | 0.991 | 0.991 | 0.988 | 0.991 | 0.991 | 0.964 | 0.988 | 0.991 | 0.991 | 0.991 | 0.991 | 0.991 | 0.991 | 0.988 |
STD_LOG | 0.036 | X | 0.909 | 0.909 | 0.079 | 0.909 | 0.909 | 0.079 | 0.036 | 0.909 | 0.909 | 0.909 | 0.094 | 0.909 | 0.909 | 0.604 |
STD_PIECE | 0.03 | 0.252 | X | 1 | 0.03 | 1 | 1 | 0.03 | 0.03 | 1 | 1 | 1 | 0.023 | 1 | 1 | 0.252 |
STD_SINE | 0.03 | 0.252 | 1 | X | 0.03 | 1 | 1 | 0.03 | 0.03 | 1 | 1 | 1 | 0.023 | 1 | 1 | 0.252 |
STD_SINGER | 0.036 | 0.97 | 0.991 | 0.991 | X | 0.991 | 0.991 | 0.5 | 0.079 | 0.991 | 0.991 | 0.991 | 0.909 | 0.991 | 0.991 | 0.97 |
STD_SINU | 0.03 | 0.252 | 1 | 1 | 0.03 | X | 1 | 0.03 | 0.03 | 1 | 1 | 1 | 0.023 | 1 | 1 | 0.252 |
STD_TENT | 0.03 | 0.252 | 1 | 1 | 0.03 | 1 | X | 0.03 | 0.03 | 1 | 1 | 1 | 0.023 | 1 | 1 | 0.252 |
STD_CIRCLE | 0.089 | 0.97 | 0.991 | 0.991 | 0.697 | 0.991 | 0.991 | X | 0.327 | 0.991 | 0.991 | 0.991 | 0.909 | 0.991 | 0.991 | 0.97 |
ELIT | 0.036 | 0.988 | 0.991 | 0.991 | 0.97 | 0.991 | 0.991 | 0.816 | X | 0.991 | 0.991 | 0.991 | 0.991 | 0.991 | 0.991 | 0.988 |
ELIT_LOG | 0.03 | 0.252 | 1 | 1 | 0.03 | 1 | 1 | 0.03 | 0.03 | X | 1 | 1 | 0.023 | 1 | 1 | 0.252 |
ELIT_PIECE | 0.03 | 0.252 | 1 | 1 | 0.03 | 1 | 1 | 0.03 | 0.03 | 1 | X | 1 | 0.023 | 1 | 1 | 0.252 |
ELIT_SINE | 0.03 | 0.252 | 1 | 1 | 0.03 | 1 | 1 | 0.03 | 0.03 | 1 | 1 | X | 0.023 | 1 | 1 | 0.252 |
ELIT_SINGER | 0.03 | 0.967 | 0.994 | 0.994 | 0.252 | 0.994 | 0.994 | 0.252 | 0.03 | 0.994 | 0.994 | 0.994 | X | 0.994 | 0.994 | 0.967 |
ELIT_SINU | 0.03 | 0.252 | 1 | 1 | 0.03 | 1 | 1 | 0.03 | 0.03 | 1 | 1 | 1 | 0.023 | X | 1 | 0.252 |
ELIT_TENT | 0.03 | 0.252 | 1 | 1 | 0.03 | 1 | 1 | 0.03 | 0.03 | 1 | 1 | 1 | 0.023 | 1 | X | 0.252 |
ELIT_CIRCLE | 0.036 | 0.604 | 0.909 | 0.909 | 0.079 | 0.909 | 0.909 | 0.079 | 0.036 | 0.909 | 0.909 | 0.909 | 0.094 | 0.909 | 0.909 | X |
Table 13. Average p-values of the WSO experiments compared pairwise for SCP51.
| | STD | STD_LOG | STD_PIECE | STD_SINE | STD_SINGER | STD_SINU | STD_TENT | STD_CIRCLE | ELIT | ELIT_LOG | ELIT_PIECE | ELIT_SINE | ELIT_SINGER | ELIT_SINU | ELIT_TENT | ELIT_CIRCLE |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
STD | X | 0.991 | 0.988 | 0.988 | 0.988 | 0.991 | 0.991 | 0.988 | 0.988 | 0.988 | 0.991 | 0.988 | 0.991 | 0.991 | 0.988 | 0.991 |
STD_LOG | 0.03 | X | 0.252 | 0.252 | 0.252 | 0.023 | 1 | 0.094 | 0.03 | 0.252 | 1 | 0.252 | 1 | 0.023 | 0.252 | 0.023 |
STD_PIECE | 0.036 | 0.909 | X | 0.604 | 0.604 | 0.094 | 0.909 | 0.31 | 0.055 | 0.604 | 0.909 | 0.604 | 0.909 | 0.094 | 0.604 | 0.03 |
STD_SINE | 0.036 | 0.909 | 0.604 | X | 0.604 | 0.094 | 0.909 | 0.31 | 0.055 | 0.604 | 0.909 | 0.604 | 0.909 | 0.094 | 0.604 | 0.03 |
STD_SINGER | 0.036 | 0.909 | 0.604 | 0.604 | X | 0.094 | 0.909 | 0.31 | 0.055 | 0.604 | 0.909 | 0.604 | 0.909 | 0.094 | 0.604 | 0.03 |
STD_SINU | 0.03 | 0.994 | 0.967 | 0.967 | 0.967 | X | 0.994 | 0.909 | 0.094 | 0.967 | 0.994 | 0.967 | 0.994 | 1 | 0.967 | 0.023 |
STD_TENT | 0.03 | 1 | 0.252 | 0.252 | 0.252 | 0.023 | X | 0.094 | 0.03 | 0.252 | 1 | 0.252 | 1 | 0.023 | 0.252 | 0.023 |
STD_CIRCLE | 0.036 | 0.967 | 0.84 | 0.84 | 0.84 | 0.252 | 0.967 | X | 0.079 | 0.84 | 0.967 | 0.84 | 0.967 | 0.252 | 0.84 | 0.03 |
ELIT | 0.036 | 0.991 | 0.98 | 0.98 | 0.98 | 0.967 | 0.991 | 0.97 | X | 0.98 | 0.991 | 0.98 | 0.991 | 0.967 | 0.98 | 0.252 |
ELIT_LOG | 0.036 | 0.909 | 0.604 | 0.604 | 0.604 | 0.094 | 0.909 | 0.31 | 0.055 | X | 0.909 | 0.604 | 0.909 | 0.094 | 0.604 | 0.03 |
ELIT_PIECE | 0.03 | 1 | 0.252 | 0.252 | 0.252 | 0.023 | 1 | 0.094 | 0.03 | 0.252 | X | 0.252 | 1 | 0.023 | 0.252 | 0.023 |
ELIT_SINE | 0.036 | 0.909 | 0.604 | 0.604 | 0.604 | 0.094 | 0.909 | 0.31 | 0.055 | 0.604 | 0.909 | X | 0.909 | 0.094 | 0.604 | 0.03 |
ELIT_SINGER | 0.03 | 1 | 0.252 | 0.252 | 0.252 | 0.023 | 1 | 0.094 | 0.03 | 0.252 | 1 | 0.252 | X | 0.023 | 0.252 | 0.023 |
ELIT_SINU | 0.03 | 0.994 | 0.967 | 0.967 | 0.967 | 1 | 0.994 | 0.909 | 0.094 | 0.967 | 0.994 | 0.967 | 0.994 | X | 0.967 | 0.023 |
ELIT_TENT | 0.036 | 0.909 | 0.604 | 0.604 | 0.604 | 0.094 | 0.909 | 0.31 | 0.055 | 0.604 | 0.909 | 0.604 | 0.909 | 0.094 | X | 0.03 |
ELIT_CIRCLE | 0.03 | 0.994 | 0.991 | 0.991 | 0.991 | 0.994 | 0.994 | 0.991 | 0.909 | 0.991 | 0.994 | 0.991 | 0.994 | 0.994 | 0.991 | X |
Table 14. Average p-values of the WSO experiments compared pairwise for SCP61.
| | STD | STD_LOG | STD_PIECE | STD_SINE | STD_SINGER | STD_SINU | STD_TENT | STD_CIRCLE | ELIT | ELIT_LOG | ELIT_PIECE | ELIT_SINE | ELIT_SINGER | ELIT_SINU | ELIT_TENT | ELIT_CIRCLE |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
STD | X | 0.965 | 0.965 | 0.965 | 0.918 | 0.965 | 0.965 | 0.038 | 0.823 | 0.965 | 0.965 | 0.965 | 0.965 | 0.965 | 0.965 | 0.823 |
STD_LOG | 0.098 | X | 1 | 1 | 0.252 | 1 | 1 | 0.03 | 0.023 | 1 | 1 | 1 | 1 | 1 | 1 | 0.252 |
STD_PIECE | 0.098 | 1 | X | 1 | 0.252 | 1 | 1 | 0.03 | 0.023 | 1 | 1 | 1 | 1 | 1 | 1 | 0.252 |
STD_SINE | 0.098 | 1 | 1 | X | 0.252 | 1 | 1 | 0.03 | 0.023 | 1 | 1 | 1 | 1 | 1 | 1 | 0.252 |
STD_SINGER | 0.177 | 0.909 | 0.909 | 0.909 | X | 0.909 | 0.909 | 0.036 | 0.094 | 0.909 | 0.909 | 0.909 | 0.909 | 0.909 | 0.909 | 0.5 |
STD_SINU | 0.098 | 1 | 1 | 1 | 0.252 | X | 1 | 0.03 | 0.023 | 1 | 1 | 1 | 1 | 1 | 1 | 0.252 |
STD_TENT | 0.098 | 1 | 1 | 1 | 0.252 | 1 | X | 0.03 | 0.023 | 1 | 1 | 1 | 1 | 1 | 1 | 0.252 |
STD_CIRCLE | 0.987 | 0.991 | 0.991 | 0.991 | 0.988 | 0.991 | 0.991 | X | 0.991 | 0.991 | 0.991 | 0.991 | 0.991 | 0.991 | 0.991 | 0.988 |
ELIT | 0.321 | 0.994 | 0.994 | 0.994 | 0.967 | 0.994 | 0.994 | 0.03 | X | 0.994 | 0.994 | 0.994 | 0.994 | 0.994 | 0.994 | 0.827 |
ELIT_LOG | 0.098 | 1 | 1 | 1 | 0.252 | 1 | 1 | 0.03 | 0.023 | X | 1 | 1 | 1 | 1 | 1 | 0.252 |
ELIT_PIECE | 0.098 | 1 | 1 | 1 | 0.252 | 1 | 1 | 0.03 | 0.023 | 1 | X | 1 | 1 | 1 | 1 | 0.252 |
ELIT_SINE | 0.098 | 1 | 1 | 1 | 0.252 | 1 | 1 | 0.03 | 0.023 | 1 | 1 | X | 1 | 1 | 1 | 0.252 |
ELIT_SINGER | 0.098 | 1 | 1 | 1 | 0.252 | 1 | 1 | 0.03 | 0.023 | 1 | 1 | 1 | X | 1 | 1 | 0.252 |
ELIT_SINU | 0.098 | 1 | 1 | 1 | 0.252 | 1 | 1 | 0.03 | 0.023 | 1 | 1 | 1 | 1 | X | 1 | 0.252 |
ELIT_TENT | 0.098 | 1 | 1 | 1 | 0.252 | 1 | 1 | 0.03 | 0.023 | 1 | 1 | 1 | 1 | 1 | X | 0.252 |
ELIT_CIRCLE | 0.321 | 0.909 | 0.909 | 0.909 | 0.697 | 0.909 | 0.909 | 0.036 | 0.319 | 0.909 | 0.909 | 0.909 | 0.909 | 0.909 | 0.909 | X |
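The pairwise p-values in Tables 12–14 come from the Mann–Whitney U test, computed with scipy.stats.mannwhitneyu as cited in the references. As an illustrative sketch of the statistic underlying those values, the following pure-Python function computes U for two samples using midpoint ranks for ties (the p-value itself is left to SciPy):

```python
def mannwhitney_u(a: list[float], b: list[float]) -> float:
    """Mann-Whitney U statistic for sample a vs. b, with midpoint ranks for ties."""
    combined = sorted(a + b)
    # Assign each distinct value the average of its 1-based rank positions.
    ranks = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        ranks[combined[i]] = (i + 1 + j) / 2.0  # midpoint of ranks i+1 .. j
        i = j
    r1 = sum(ranks[x] for x in a)          # rank sum of sample a
    return r1 - len(a) * (len(a) + 1) / 2  # U1 = R1 - n1(n1+1)/2
```

Intuitively, U counts (with half-credit for ties) how many pairs (x, y) with x from `a` and y from `b` satisfy x > y; U near 0 or near n1·n2 signals that one sample dominates the other.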
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).