Article

An Innovative Enhanced JAYA Algorithm for the Optimization of Continuous and Discrete Problems

by Jalal Jabbar Bairooz and Farhad Mardukhi *
Department of Computer Engineering and Information Technology, Razi University, Kermanshah 6714414971, Iran
* Author to whom correspondence should be addressed.
Algorithms 2024, 17(11), 472; https://doi.org/10.3390/a17110472
Submission received: 30 July 2024 / Revised: 4 October 2024 / Accepted: 15 October 2024 / Published: 22 October 2024
(This article belongs to the Section Analysis of Algorithms and Complexity Theory)

Abstract

Metaheuristic algorithms have gained popularity in the past decade due to their remarkable ability to address various optimization challenges. Among these, the JAYA algorithm has emerged as a recent contender that demonstrates strong performance across different optimization problems, largely attributed to its simplicity. However, real-world problems have become increasingly complex, creating a demand for more robust and effective solutions to tackle these intricate challenges and achieve outstanding results. This article proposes an enhanced JAYA (EJAYA) method that addresses its inherent shortcomings, resulting in improved convergence and search capabilities when dealing with diverse problems. The current study evaluates the performance of the proposed optimization method on both continuous and discrete problems. Initially, EJAYA is applied to solve 20 prominent test functions and is validated by comparison with other contemporary algorithms in the literature, including moth–flame optimization, particle swarm optimization, the dragonfly algorithm, and the sine–cosine algorithm. The effectiveness of the proposed approach in discrete scenarios is tested using feature selection and compared to existing optimization strategies. Evaluations across various scenarios demonstrate that the proposed enhancements significantly improve the JAYA algorithm’s performance, facilitating escape from local minima, achieving faster convergence, and expanding the search capabilities.

1. Introduction

While the concept of optimization varies across different fields, it generally involves finding the best possible solution for a problem while considering both equality and inequality constraints. In the current era, solving and optimizing complex problems in a limited time is a critical topic in various fields. Traditional approaches for handling large-scale optimization problems become increasingly complex and time-consuming, preventing these strategies from achieving accurate global solutions. As a result, metaheuristic-based optimization approaches have been developed to address these issues. These algorithms have the capability to rapidly attain a global or near-global optimized value. For instance, the effective performance of such algorithms in optimizing nonlinear problems has been demonstrated in the literature. Population-based algorithms are widely acknowledged as highly effective methods for rapidly generating relevant responses to challenges. Given these advantages, researchers have developed various algorithms. Particle swarm optimization (PSO) was inspired by the collective behavior of bird flocks and fish schools [1]. The genetic algorithm (GA) is the best-known algorithm based on natural selection and survival in nature [2,3]. Differential evolution (DE) is another algorithm that follows three steps similar to the GA, including crossover, mutation, and selection [4,5]. However, it sometimes outperforms the GA in terms of solution quality. Tree Growth Optimization (TGO) is inspired by the behavior of trees in the forest [6]. Similarly, the firefly algorithm (FA) was developed based on the behavior of fireflies [7,8]. The hunting behavior of gray wolves inspired the design of the gray wolf optimization (GWO) algorithm [9]. The following algorithms are also inspired by various natural behaviors: the Adolescent Identity Search Algorithm (AISA) [10], Butterfly Optimization Algorithm (BOA) [11,12], Ant Lion Optimizer (ALO) [13], Cooperation Search Algorithm (CS) [14], Equilibrium Optimizer (EO) [15], Crow Search Algorithm (CSA) [16], Optimization Booster Algorithm (OBA) [17], Honey Badger Algorithm (HBA) [18], African Vultures Optimization Algorithm (AVOA) [19], Rain Optimization Algorithm (ROA) [20], Tunicate Swarm Algorithm (TSA) [21], Pity Beetle Algorithm (PBA) [22], and Queuing Search Algorithm (QSA) [23].
Based on the review of the aforementioned papers, it is evident that there is a compelling demand for the development of a novel optimization algorithm. This algorithm should aim to reduce the dependence on algorithm-specific parameters while effectively revealing the most optimal solution. Additionally, it is noteworthy that a substantial portion of the evaluations conducted in these studies primarily focused on specific problem domains, categorized as either continuous or discrete optimization problems. Consequently, the need arises for the creation of an algorithm capable of addressing the intricacies of both continuous and discrete scenarios.
Accordingly, the current study proposes a novel modified JAYA algorithm with enhanced search capability and convergence. In principle, an algorithm must be designed with balanced exploration and exploitation capabilities; this balance yields a more dependable algorithm with superior performance compared to the basic version. The performance of this approach is then compared with other recent algorithms in continuous and discrete problem domains. Various benchmark functions are considered to assess the proposed method’s ability to handle continuous problems. Regarding discrete problems, one of the most popular and applicable scenarios, feature selection, is addressed. Feature selection can be defined as an optimization problem aimed at extracting an optimal subset of features from large-scale data.

2. Literature Survey

Population-based algorithms possess inherent strength in problem solving, and researchers frequently find ways to enhance their effectiveness by addressing their intrinsic limitations. These modifications enable the algorithms to thoroughly explore the problem space, achieve quicker convergence, and gain other benefits. Various enhancements have been introduced thus far to improve the performance of these algorithms. Examples include enhanced particle swarm optimization [24,25,26], the improved flower pollination algorithm [27], improved whale optimization [28], the improved scatter search algorithm [29], the improved differential evolution algorithm [30], the improved bat algorithm [31], improved harmony search [32,33,34,35], the improved firefly algorithm [36], the improved bacteria foraging algorithm [37], the improved artificial bee colony algorithm [38,39,40], the improved genetic algorithm [41,42,43], the improved simulated annealing algorithm [44,45], the modified squirrel search algorithm [46], the improved firework algorithm [47], improved gray wolf optimization [48], and the improved sine–cosine algorithm [49].
Although the algorithms mentioned above, with their modifications, have their benefits, they must be tuned using algorithm-specific parameters. For example, the GA requires setting parameters such as the crossover probability, selection operator, and mutation probability. Similarly, the other algorithms listed above have specific parameters that must be set for optimal performance. These algorithm-specific parameters must be adjusted in addition to the common control parameters (i.e., the number of decision variables, the population size, and the maximum number of iterations), which are standard across all algorithms. Such settings have a significant impact on the performance of metaheuristic-based algorithms, and configuring them properly is critical: incorrectly tuned parameters may increase the computational burden or cause premature convergence to a local optimum.
A parameterless approach, the JAYA algorithm, has recently been presented to address the issue of establishing algorithm-specific parameters [50]. JAYA is an advanced and robust algorithm that draws inspiration from the principle of prioritizing the best member among candidate solutions while simultaneously disregarding the worst member [50]. Despite the notable benefits offered by this algorithm, such as the absence of parameter settings and rapid convergence, it struggles to adequately explore the search space and occasionally becomes trapped in local minima. In other words, because it employs a single learning technique with low population diversity, JAYA is vulnerable to being caught in local optima when confronted with complex optimization problems. Technically, as the population undergoes updates through a single position change, the diversity within the population diminishes. This reduction in diversity limits the algorithm’s ability to explore a broader range of potential solutions, ultimately decreasing its effectiveness in discovering optimal or near-optimal solutions in the search space. Maintaining diversity is crucial for avoiding premature convergence and ensuring that the algorithm does not become trapped in local optima, thus enhancing its exploratory capabilities. For instance, in the GA, applying mutation operators can help preserve diversity and prevent premature convergence, while, in PSO, balancing exploration and exploitation through velocity control can ensure broader search space exploration. This limitation hampers achieving high-quality results when dealing with significantly large problem sizes. To address these drawbacks, various attempts have been made. One such approach involves dividing the JAYA population into subpopulations to improve the algorithm’s efficiency [51], including greater search capability and better convergence. However, slicing the population into several groups adds complexity to the algorithm, such as difficulty in implementation and increased time consumption. Similarly, a dynamic weight parameter acting as a varying coefficient was incorporated in JAYA to enhance its convergence [52]. Although the algorithm’s performance experienced a significant improvement, some new parameters were added, making the algorithm sensitive to parameter settings. Additionally, integrating the shuffling process with JAYA was employed to achieve enhanced exploration within the search domain. A clustering technique was also used to replace solutions that exhibited the lowest quality [53]. Recently, a hybrid JAYA algorithm with a CSA was provided by Gholami and colleagues. This algorithm revealed excellent efficiency in solving optimization benchmarks [54].
Nevertheless, it suffers from higher time consumption and increased parameter tuning due to integration with the CSA, requiring nearly twice the time of the original JAYA. Luu and Nguyen introduced a modified algorithm by combining the population update of differential evolution with the original version of JAYA [55]. This integration enhanced the global search ability; nevertheless, its complexity increased substantially due to the mixing of the two algorithms and the addition of new parameters. JAYA was also integrated with Tabu search for solving the flexible job shop scheduling problem; in this case, hybridization resulted in increased time consumption and additional parameter tuning [56]. Given the complexity and ruggedness of some optimization problems, researchers have modified the JAYA algorithm to enhance its convergence behavior, often combining JAYA with other powerful optimization techniques and leading to various JAYA variants. These include binary JAYA [57], self-adaptive JAYA [58,59], elitism-based JAYA [60], elitism-based self-adaptive multi-population JAYA [61], chaotic JAYA [62], and neural network JAYA [63]. Related works include three metaphor-less simple algorithms for solving optimization problems [64], a search space division method [65], and a review of metaheuristic algorithms on 57 benchmarks [66].
From the provided literature review, the main contributions of this proposal are briefly outlined as follows:
  • The proposed approach is based on modifying the JAYA algorithm, which results in improved search capability, faster convergence, and higher-quality solutions.
  • To address continuous problems, the proposed approach is evaluated across 20 widely recognized benchmark functions. The results obtained from these benchmark functions are compared to those achieved by other existing algorithms, including the grasshopper optimization algorithm (GOA), dragonfly algorithm (DA), moth–flame optimization (MFO), and others.
  • Feature selection is also considered as a discrete problem. The proposed method is then applied to solve this problem, and its findings are compared to other algorithms in the literature, such as the genetic algorithm, particle swarm optimization algorithm, etc.

3. Research Methodology (Proposed Algorithm)

The standard version of the JAYA algorithm is first reviewed to establish a solid foundation for understanding the algorithm. The proposed modifications are then discussed to address its limitations.

3.1. Standard Version of JAYA Algorithm

Rao proposed a novel algorithm named JAYA [50]. The JAYA algorithm, a parameterless technique, was introduced to address the issue of defining algorithm-specific parameters and remains one of the more advanced and modern algorithms. JAYA draws inspiration from the notion that candidate solutions should move toward the best member while avoiding the worst. Although this technique has significant advantages, such as no parameter setup and quick convergence, it struggles to explore the search space properly and occasionally becomes trapped in local minima. In other words, because it employs a single learning technique with limited population diversity, the JAYA algorithm can encounter local optima when dealing with complex optimization problems. As the JAYA algorithm updates the population through a single position change, the population’s diversity gradually decreases. This loss of diversity restricts the algorithm’s ability to explore various regions of the search space, raising the risk of premature convergence to suboptimal solutions. The reduced exploration weakens the algorithm’s overall effectiveness, decreasing the chances of reaching the global optimum. Maintaining diversity throughout the optimization process is essential to strike a balance between exploration and exploitation. For large problem sizes, this limitation prevents high-quality outcomes. The phases of this strategy are as follows.

3.1.1. First Step: Setting the Parameters

Undoubtedly, certain parameters associated with each algorithm require tuning. These include the lower and upper bounds (LX and UX), the number of decision variables (DV), the maximum number of iterations (MI), and the population size (PS).

3.1.2. Second Step: Stochastically Generating Individuals

As shown in Equation (1), the initial population is generated within the search space defined by LX and UX using random numbers:
$$X_{n,m} = LX_m + (UX_m - LX_m)\cdot \mathrm{rand}[0,1], \qquad n = 1, 2, \ldots, PS, \quad m = 1, 2, \ldots, DV \qquad (1)$$
The above process is repeated until the population size is reached, as illustrated in matrix (2):
$$X_{pop} = \begin{bmatrix} X_{1,1} & X_{1,2} & \cdots & X_{1,DV} \\ X_{2,1} & X_{2,2} & \cdots & X_{2,DV} \\ \vdots & \vdots & \ddots & \vdots \\ X_{PS,1} & X_{PS,2} & \cdots & X_{PS,DV} \end{bmatrix} \qquad (2)$$
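As a small illustration of this initialization step, the following NumPy sketch builds such a population; the bounds, population size, and dimensionality below are placeholder values rather than settings taken from the paper.

import numpy as np

def initialize_population(lx, ux, ps, dv, rng=None):
    """Generate a PS-by-DV population uniformly within [LX, UX], as in Equation (1)."""
    rng = np.random.default_rng() if rng is None else rng
    # rand[0, 1] is drawn independently for every entry of the matrix in (2)
    return lx + (ux - lx) * rng.random((ps, dv))

# Example: 30 individuals, 30 decision variables, bounds [-100, 100]
X_pop = initialize_population(lx=-100.0, ux=100.0, ps=30, dv=30)
print(X_pop.shape)  # (30, 30)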

3.1.3. Third Step: Updating the Solutions’ Positions

As discussed, the population has been stochastically generated and must now enter the improvement procedure. In this algorithm, the best and worst individuals are identified, and a new solution is generated based on them, as follows:
$$X_{n,m}^{t+1} = X_{n,m}^{t} + \mathrm{rand}[0,1]\,\big(X_{m,best}^{t} - |X_{n,m}^{t}|\big) - \mathrm{rand}[0,1]\,\big(X_{m,worst}^{t} - |X_{n,m}^{t}|\big), \qquad n = 1, \ldots, PS, \quad m = 1, \ldots, DV$$
Here, $X_{n,m}^{t}$ and $X_{n,m}^{t+1}$ signify the position of the nth member at two consecutive iterations, t and t + 1; the population’s best and worst solutions are denoted by $X_{m,best}^{t}$ and $X_{m,worst}^{t}$, respectively.

3.1.4. Fourth Step: Computing Fitness

Each newly generated member is ranked based on its fitness. In this phase, the fitness of the proposed member is calculated via the objective function defined by the operator.

3.1.5. Fifth Step: Assessing the Solution

This step determines whether the generated solution is kept for the next iteration. If the fitness of the nth member is lower than that of the prior iteration, the new solution is saved for the next iteration; otherwise, the previous one is retained, as follows:
$$X_{n,m}^{t+1} = \begin{cases} X_{n,m}^{t+1}, & \text{if } f(X_{n,m}^{t+1}) < f(X_{n,m}^{t}) \\ X_{n,m}^{t}, & \text{otherwise} \end{cases}$$

3.1.6. Sixth Step: Terminating the Algorithm

Here, it is checked whether the number of iterations is sufficient. If it reaches MI, the algorithm stops; otherwise, the procedure is repeated.
Algorithm 1 shows the pseudocode of conventional JAYA.
Algorithm 1: Conventional JAYA
01: Define the parameters associated with the algorithm;
02: Stochastically generate a set of solutions, then calculate their objective functions;
03: while Iter < MI
04:   Find the best and worst members;
05:   for n = 1:PS do
06:     for m = 1:DV do
07:       $X_{n,m}^{t+1} = X_{n,m}^{t} + \mathrm{rand}[0,1]\,(X_{m,best}^{t} - |X_{n,m}^{t}|) - \mathrm{rand}[0,1]\,(X_{m,worst}^{t} - |X_{n,m}^{t}|)$
08:     end
09:     Keep the generated solution between the lower bound (LB) and upper bound (UB);
10:     Use the objective function to calculate the fitness of the solution;
11:     Save the new solution if it is better than the previous one;
12:   end
13: end
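For readers who prefer code, the following is a minimal NumPy sketch of Algorithm 1; the sphere function (F1) is used as a stand-in objective, and all function and variable names are ours, not the authors’.

import numpy as np

def sphere(x):
    """F1: sum of squares, used here only as an illustrative objective."""
    return np.sum(x ** 2)

def jaya(obj, lb, ub, ps=30, dv=30, mi=1000, seed=0):
    rng = np.random.default_rng(seed)
    X = lb + (ub - lb) * rng.random((ps, dv))      # second step: random population
    fit = np.array([obj(x) for x in X])
    for _ in range(mi):                            # sixth step: stop after MI iterations
        best = X[np.argmin(fit)]                   # third step: best and worst members
        worst = X[np.argmax(fit)]
        for n in range(ps):
            r1, r2 = rng.random(dv), rng.random(dv)
            # third step: move toward the best and away from the worst member
            x_new = X[n] + r1 * (best - np.abs(X[n])) - r2 * (worst - np.abs(X[n]))
            x_new = np.clip(x_new, lb, ub)         # keep the solution between LB and UB
            f_new = obj(x_new)                     # fourth step: fitness
            if f_new < fit[n]:                     # fifth step: greedy assessment
                X[n], fit[n] = x_new, f_new
    return X[np.argmin(fit)], fit.min()

best_x, best_f = jaya(sphere, lb=-100.0, ub=100.0, mi=200)
print(best_f)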

3.2. Modified JAYA Algorithm

This section addresses improving the efficiency of the JAYA algorithm. These improvements are discussed step by step as follows.

3.2.1. Setting the Parameters

Like in its original version, some parameters are set according to the problem. These are LX, UX, DVs, MI, and PS.

3.2.2. Stochastically Generating Individuals

A set of members can be generated randomly using the following relation:
$$X_{n,m} = LX_m + (UX_m - LX_m)\cdot \mathrm{rand}[0,1], \qquad n = 1, 2, \ldots, PS, \quad m = 1, 2, \ldots, DV$$
The following matrix is obtained as a result of the aforementioned generation process:
$$X_{pop} = \begin{bmatrix} X_{1,1} & X_{1,2} & \cdots & X_{1,DV} \\ X_{2,1} & X_{2,2} & \cdots & X_{2,DV} \\ \vdots & \vdots & \ddots & \vdots \\ X_{PS,1} & X_{PS,2} & \cdots & X_{PS,DV} \end{bmatrix}$$

3.2.3. Updating the Solutions’ Positions

The modification has been implemented here. Referring back to the standard version of JAYA in Section 3.1, two members, the best and worst members, have the highest involvement in updating the positions of all individuals. This, in turn, may have a negative effect by causing the algorithm to become stuck in local minima due to reduced population diversity. Therefore, some modifications are introduced here to tackle this problem. This is elaborated as follows:
for n = 1:PS do
    for m = 1:DV do
        if $r_j < AW$
            $X_{n,m}^{t+1} = X_{n,m}^{t} + \mathrm{rand}[0,1]\,(X_{m,best}^{t} - |X_{n,m}^{t}|) - \mathrm{rand}[0,1]\,(X_{m,worst}^{t} - |X_{n,m}^{t}|)$
        else
            MU = 2 × Iter × (a/MI)
            $X_{n,m}^{t+1} = X_{m,best}^{t} - (2 \times MU \times \mathrm{rand}[0,1] - MU) \times \mathrm{abs}(2 \times X_{m,best}^{t} - X_{n,m}^{t})$
        end
    end
end
Here, AW is a user-defined parameter ranging between [0.1, 0.9], Iter is the iteration counter, and MI signifies the maximum number of iterations. MU represents a self-adaptive mutation factor.
As can be seen, compared to the original JAYA algorithm, which uses a single learning strategy, the position in the proposed method is updated via one of two rules, which leads to better population diversity. Here, $r_j$ denotes a random number and a is a constant that takes a value between 1 and 5.
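As an illustration of this two-branch rule, the following sketch updates a single individual; drawing $r_j$ once per dimension reflects our reading of the pseudocode above, and the function name and signature are hypothetical.

import numpy as np

def ejaya_update(x, best, worst, aw, a, it, mi, rng):
    """Update one individual x (length DV) with the two-branch EJAYA rule."""
    x_new = np.empty_like(x)
    for m in range(x.size):
        if rng.random() < aw:                      # classic JAYA learning step
            x_new[m] = (x[m]
                        + rng.random() * (best[m] - abs(x[m]))
                        - rng.random() * (worst[m] - abs(x[m])))
        else:                                      # self-adaptive mutation step
            mu = 2.0 * it * (a / mi)
            x_new[m] = best[m] - (2.0 * mu * rng.random() - mu) * abs(2.0 * best[m] - x[m])
    return x_new

# Example usage with made-up vectors
rng = np.random.default_rng(0)
x = rng.uniform(-100, 100, 30)
best = rng.uniform(-100, 100, 30)
worst = rng.uniform(-100, 100, 30)
print(ejaya_update(x, best, worst, aw=0.3, a=3, it=10, mi=1000, rng=rng))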

3.2.4. Computing Fitness

Each member’s fitness is evaluated in this step according to the objective function defined by the operator.

3.2.5. Assessing the Solution

If the fitness of the nth member is lower than its fitness in the previous iteration, the new solution is saved for the next iteration. Otherwise, the former value is retained, as follows:
$$X_{n,m}^{t+1} = \begin{cases} X_{n,m}^{t+1}, & \text{if } f(X_{n,m}^{t+1}) < f(X_{n,m}^{t}) \\ X_{n,m}^{t}, & \text{otherwise} \end{cases}$$

3.2.6. Terminating the Algorithm

Here, it is checked whether the number of iterations is adequate or not. If Iter is less than MI, the algorithm is repeated. However, it terminates once it reaches MI.
Algorithm 2 shows the proposed approach (EJAYA).
Algorithm 2: The Proposed Approach (EJAYA)
01: Define the parameters related to the algorithm;
02: Stochastically generate a set of solutions, then calculate their objective functions;
03: while Iter < MI
04:   Find the best and worst members;
05:   for n = 1:PS do
06:     for m = 1:DV do
07:       if $r_j < AW$
08:         $X_{n,m}^{t+1} = X_{n,m}^{t} + \mathrm{rand}[0,1]\,(X_{m,best}^{t} - |X_{n,m}^{t}|) - \mathrm{rand}[0,1]\,(X_{m,worst}^{t} - |X_{n,m}^{t}|)$
09:       else
10:         MU = 2 × Iter × (a/MI)
11:         $X_{n,m}^{t+1} = X_{m,best}^{t} - (2 \times MU \times \mathrm{rand}[0,1] - MU) \times \mathrm{abs}(2 \times X_{m,best}^{t} - X_{n,m}^{t})$
12:       end
13:     end
14:     Keep the generated solution between LB and UB;
15:     Use the objective function to calculate the fitness of the solution;
16:     Save the new solution if it is better than the previous one;
17:   end
18: end
Generally, a heuristic method accepts a solution within an acceptable margin of error. Therefore, each generated solution is clamped to the search bounds using $X_{n,m}^{t+1} = \max(X_{n,m}^{t+1}, LB)$ and $X_{n,m}^{t+1} = \min(X_{n,m}^{t+1}, UB)$. This guarantees that the generated solution remains within the boundaries.
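In NumPy, for instance, the two clamping formulas above reduce to a single element-wise clip; a tiny illustration with made-up values:

import numpy as np

# Element-wise equivalent of X = max(X, LB) followed by X = min(X, UB)
LB, UB = -100.0, 100.0
x_new = np.array([-250.0, 3.7, 512.0])
x_new = np.clip(x_new, LB, UB)   # -> [-100.0, 3.7, 100.0]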
The binary version of the proposed method is similar to the continuous version, but the solutions are generated within the range of zero and one. The procedure for the binary version of EJAYA is outlined in Algorithm 3.
Algorithm 3: The Binary Version of the Proposed Approach (BEJAYA)
01: Define the parameters associated with the algorithm;
02: Stochastically generate a set of solutions within the boundary [0, 1], round the generated solutions to obtain binary populations, then calculate their objective functions;
03: while Iter < MI
04:   Find the best and worst members;
05:   for n = 1:PS do
06:     for m = 1:DV do
07:       if $r_j < AW$
08:         $X_{n,m}^{t+1} = X_{n,m}^{t} + \mathrm{rand}[0,1]\,(X_{m,best}^{t} - |X_{n,m}^{t}|) - \mathrm{rand}[0,1]\,(X_{m,worst}^{t} - |X_{n,m}^{t}|)$
09:         $X_{n,m}^{t+1} = \mathrm{round}(X_{n,m}^{t+1})$
10:       else
11:         MU = 2 × Iter × (a/MI)
12:         $X_{n,m}^{t+1} = X_{m,best}^{t} - (2 \times MU \times \mathrm{rand}[0,1] - MU) \times \mathrm{abs}(2 \times X_{m,best}^{t} - X_{n,m}^{t})$
13:         $X_{n,m}^{t+1} = \mathrm{round}(X_{n,m}^{t+1})$
14:       end
15:     end
16:     Keep the generated solution between LB and UB;
17:     Use the objective function to calculate the fitness of the solution;
18:     Save the new solution if it is better than the previous one;
19:   end
20: end
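To make the binary mechanics concrete, a small sketch of the rounding used in lines 09 and 13 of Algorithm 3 follows; the helper name and the sample values are our own illustration.

import numpy as np

def to_binary(x_new):
    """Round a continuous candidate in [0, 1] to a 0/1 feature mask (Algorithm 3, lines 09 and 13)."""
    return np.round(np.clip(x_new, 0.0, 1.0)).astype(int)

# Example: a continuous candidate produced by the EJAYA update
candidate = np.array([0.12, 0.87, 0.49, 0.51, 0.95])
mask = to_binary(candidate)
print(mask)   # [0 1 0 1 1] -> features 2, 4, and 5 are selected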

4. Results and Discussion

This section presents the numerical evaluation of the proposed optimization approach. The effectiveness of the approach was assessed through two different types of optimization problems. First, the approach was applied to address well-known benchmark functions that are commonly used in various research studies. These benchmark functions served as a standard reference for evaluating the optimization algorithms. Additionally, EJAYA was employed to tackle feature selection, a discrete problem of great significance in various fields. The following Section 4.1 and Section 4.2 discuss these continuous and discrete problems.

4.1. Numerical Evaluation

In this section, two kinds of optimization problems are considered to evaluate the effectiveness of the proposed optimization approach. To elaborate further, they are first utilized to address some of the most well-known benchmark functions frequently used in different studies. Following that, EJAYA is used to solve feature selection, a discrete problem.
Table 1 lists the functions utilized to evaluate the efficiency of the proposed approach [13]. For a better overview, these functions are also shown in Figure 1. Several algorithms, including the standard JAYA, CSA, particle swarm optimization (PSO) [67], DA [68], GOA [69], MFO [70], and the sine–cosine algorithm (SCA) [71], are considered for comparison with our method. All algorithms are executed under identical conditions: MI = 1000, DV = 30, and PS = 30, and each algorithm is run 30 times. To provide a measure that accounts for all of the results from the 30 runs and gives a comprehensive view of overall performance, we use the average value; the average is particularly useful for evaluating the general tendency and comparing different algorithms under consistent conditions (a minimal sketch of this protocol is given after the parameter list below). The tuning parameters for each algorithm are defined as follows [35,72]. The algorithms were implemented in MATLAB on a personal laptop equipped with an Intel i5 processor and 8 GB of RAM.
  • SCA tuning: r1 linearly reduces from 2 to 0.
  • DA tuning: The inertia weight (w) is set to decrease linearly from 0.9 to 0.5 over the maximum number of iterations. The enemy distraction weight (e) decreases linearly from 0.1 to 0.1/(MI × 0.5), where MI is the maximum number of iterations. The separation weight (s) is determined by multiplying 2 by a random value and e. Similarly, the alignment weight (a), cohesion weight (c), and food attraction weight (f) are calculated by multiplying 2 by random values.
  • GOA tuning: The parameter c is adjusted using the formula c = cMax − l × ((cMaxcMin)/MI), where cMax is set to 1 and cMin is set to 0.00004.
  • MFO tuning: The convergence constant linearly decreases from −1 to −2.
  • PSO tuning: Inertia factor (w) = 0.2, while the c1 and c2 = 2.
  • CSA tuning: The awareness probability (AP) is set to 0.1 and the flight length (fl) is set to 1.5.
  • JAYA tuning: No algorithm-specific parameters; solutions are updated via the best and worst members.
  • EJAYA tuning: AW = 0.3 and a = 3. Notably, EJAYA was run with different values of these parameters, and the resulting objective functions were compared; the algorithm found the best solution (minimum objective function) under these settings.
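Below is a minimal sketch of the evaluation protocol described above: 30 independent runs per algorithm, summarized by the mean, standard deviation, worst, and best fitness. The evaluate() helper is hypothetical and reuses the jaya() and sphere() placeholders from the sketch in Section 3.1.

import numpy as np

def evaluate(optimizer, obj, runs=30, **kwargs):
    """Run an optimizer several times and summarize its final fitness statistics."""
    finals = np.array([optimizer(obj, seed=s, **kwargs)[1] for s in range(runs)])
    return {"mean": finals.mean(), "std": finals.std(),
            "worst": finals.max(), "best": finals.min()}

# Example with the jaya() sketch from Section 3.1 on the sphere function F1
stats = evaluate(jaya, sphere, runs=30, lb=-100.0, ub=100.0, ps=30, dv=30, mi=1000)
print(stats)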
Table 2 displays the fitness findings for functions 1–6, including the mean, best value, worst value, and standard deviation (STD). Based on these findings, the algorithm that provides the lowest fitness is ranked higher. As can be observed, EJAYA is ranked as the best algorithm for optimizing F1, since it has the lowest mean, standard deviation, and worst and best values. Similarly, the mean values in F2 obtained by EJAYA and the SCA are 1.04480E−35 and 3.21435E−07, respectively; EJAYA identifies a higher-quality solution in optimizing F2, while the SCA is ranked second. Regarding F3, basic JAYA and EJAYA have the same efficiency in finding the optimal solution; however, it should be noted that better convergence is associated with EJAYA, as shown in Figure 2. In F4, the best fitness values for the SCA, DA, GOA, MFO, PSO, CSA, JAYA, and EJAYA are 1.30543E+01, 1.16565E+01, 8.20644E+00, 3.10525E−03, 2.62136E−01, 5.23108E+00, 4.99795E+00, and 4.33560E−08, respectively; EJAYA reaches the lowest value (the best solution in minimization). In solving F5, EJAYA is again recognized as the best method, with a value of 2.33569E+01. Similarly, in F6, various algorithms can discover the best solution, but EJAYA has a faster convergence rate than the others. The comparison among F1–F6 shows that the modifications applied to the JAYA algorithm are sufficiently effective to improve its efficiency compared with its basic version and other recent algorithms.
Table 3 summarizes the findings for F7–F12. According to this table, the proposed technique is ranked as the best algorithm by attaining the smallest values for the mean, standard deviation (STD), and worst and best results. Upon closer examination, EJAYA presents the best result in F7, with a value of 2.67151E−03. However, in F8, MFO is ranked first, followed by PSO and EJAYA. While EJAYA is not ranked as the best algorithm in F8, it outperforms basic JAYA. In the case of F9, several methods show outstanding performance, one of which, with quick convergence, is EJAYA. Similarly, EJAYA outperforms other algorithms in F10 and F11 due to its effective potential in discovering optimum solutions with adequate convergence. As for F12, EJAYA is placed second, while PSO obtains the best solution. Although some algorithms may have identified a superior solution in a few functions, they are not as robust as EJAYA: these algorithms only performed better on a few of the functions, while their efficiency was worse on others. As a result, a dependable algorithm is defined by its efficacy in discovering the optimal solution across most functions rather than by excelling in only a few. Consequently, EJAYA is sufficiently steady compared to the others because it ranks first in most of the functions.
Table 4 shows the results of the algorithms for F13–F20. When optimizing F13, as can be seen, all of the algorithms have roughly the same performance. EJAYA, on the other hand, is regarded as superior to the other methods for optimizing F14–F16. It is noteworthy that EJAYA not only discovers better solutions in F17 but also has a faster convergence rate. EJAYA is also the best method for solving F18 and F19, with values of 1.14162E−05 and 1.45847E+01, respectively. Finally, the suggested strategy is less effective than the other algorithms in F20. From these results, it can be stated that EJAYA overcomes the shortcomings of the standard version of JAYA by increasing the population diversity, which results in more dependable and accurate findings than the other algorithms.
The quantitative results of the proposed method against the other algorithms have been tabulated in Table 1, Table 2, Table 3 and Table 4. However, the convergence of the algorithms, which demonstrates their process for obtaining the solution, should also be examined. In other words, if an algorithm discovers a better solution in a finite number of iterations, it is considered to have a better convergence rate. To this end, Figure 2 depicts the convergence of all methods for functions 1–6. As shown for F1–F6, the proposed approach outperforms the previous algorithms in terms of the search capability to escape from local minima, convergence, and other factors. The value of quick convergence is highlighted when optimizing real-world problems, where reducing the computational burden plays a vital role.
Similarly, Figure 3 compares all of the methods for functions F7–F12 in terms of the convergence rate. The proposed optimization strategy is again tagged as the best algorithm in solving F7. On the other hand, it is classified as the third-best method for solving F8, coming after MFO and PSO. It should be noted that, although EJAYA is placed third in F8, its convergence demonstrates a significant improvement compared to its original form. Similarly, while optimizing F9, three algorithms, namely, the suggested method, basic JAYA, and PSO, are superior. A closer look shows that the proposed method is the most prominent among these three algorithms, since it can obtain the best solution in fewer iterations. Because it does not become trapped in local minima and continues to seek a better solution throughout the iterations, the suggested technique also provides a good-quality solution in F11. PSO, on the other hand, dominates the proposed method in F12. Nonetheless, EJAYA outperforms the standard version of JAYA and the other algorithms in terms of productivity while solving F12. To summarize, the suggested adjustment improves the performance of the original version of JAYA and dramatically enhances its convergence capabilities.
Similarly, Figure 4 depicts the convergence capacity of all algorithms in solving F13–F20. To start with, in F13, there are no substantial differences among the algorithms’ convergence. Furthermore, for F14–F19, EJAYA is rated as the best algorithm for efficiency and convergence. In F20, on the other hand, its performance is not the best, although its convergence remains acceptable. Based on the comparison of several algorithms, it is worth noting that EJAYA is placed third rather than first in certain circumstances. However, as observed, its performance, especially in reaching an optimal solution and improving convergence, improves significantly compared to the original version.

4.2. Feature Selection

Feature selection (FS) is a technique for handling high-dimensional data. In other words, it is a preprocessing stage applied to datasets as a dimensionality reduction strategy for prediction or classification by eliminating redundant and unnecessary features. Furthermore, it helps to reduce the computational time and enhances the classification accuracy. In FS, a binary vector composed of zeros and ones is generated: a value of one corresponds to a selected feature, whereas a value of zero indicates that the associated feature is unselected. Consequently, the EJAYA population is generated as a vector containing values of either zero or one. The classification accuracy and the number of selected features are then compared with those of various population-based FS approaches, such as the GA, ALO, and a combination of the whale optimization algorithm and simulated annealing (WOASAT-2) [73]. Table 5 summarizes a variety of standard University of California, Irvine (UCI) datasets used to test the proposed approach for feature selection.
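As a concrete illustration of such a wrapper formulation, the sketch below scores a 0/1 feature mask with a KNN classifier, trading classification error off against the fraction of selected features; the 0.99/0.01 weighting and the classifier choice are common conventions in the FS literature, not values reported in this paper.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def fs_fitness(mask, X, y, alpha=0.99):
    """Lower is better: weighted sum of classification error and feature ratio."""
    if mask.sum() == 0:                      # an empty subset is not a valid solution
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask.astype(bool)], y, cv=5).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.sum() / mask.size

# Example usage on a toy dataset (random data, illustrative only)
rng = np.random.default_rng(0)
X = rng.random((100, 10))
y = rng.integers(0, 2, 100)
print(fs_fitness(np.array([1, 0, 1, 1, 0, 0, 0, 1, 0, 0]), X, y))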
The findings of EJAYA in terms of the classification accuracy on eighteen UCI standard datasets are presented in Table 6, with the best outcomes highlighted in bold. As evidenced by the experimental results, EJAYA demonstrates impressive classification accuracies on eleven datasets, including Breast Cancer, Congress EW, Exactly2, Heart EW, Lymphography, Peng Lung EW, Sonar EW, Tic-Tac-Toe, Waveform EW, Wine EW, and Zoo. Moreover, EJAYA’s accuracy rates are comparable to the other algorithms on six datasets, including Exactly, Ionosphere EW, Krvskp EW, M-of-N, Spect EW, and Vote. However, it is less accurate than the others on the Breast EW dataset.
Notably, EJAYA’s average classification accuracy is higher than that of the other approaches, indicating that it is the most efficient in finding the optimal solution, achieving an accuracy rate of 0.93. As a result, EJAYA surpasses the other FS techniques in terms of average accuracy across the datasets.
Table 7 displays the average number of features selected from each dataset by the different algorithms. The algorithm with the fewest features is ranked as the best. Firstly, WOASAT-2 was able to find appropriate solutions in two datasets, Breast Cancer and Breast EW. In sharp contrast, EJAYA is rated as the best algorithm on Congress EW. Similarly, both WOASAT-2 and EJAYA demonstrate the best performance on the Exactly dataset. Compared to the other algorithms, EJAYA is then listed as the best algorithm due to selecting fewer features for Exactly2. However, on Heart EW, WOASAT-2 is slightly better than the other methods at finding the fewest features. The fewest features for the Ionosphere EW dataset are associated with ALO, and the proposed method is placed second. Another dataset is Krvskp EW, on which EJAYA achieves the best performance by finding the fewest features. For three datasets, Lymphography, M-of-N, and Tic-Tac-Toe, two algorithms, namely, WOASAT-2 and EJAYA, provide the same results. Interestingly, EJAYA is ranked as the best algorithm compared to the others for six datasets, including Peng Lung EW, Sonar EW, Spect EW, Vote, Waveform EW, and Wine EW. Overall, most of the previous methods tend to become trapped in local optima when coping with this problem. Nevertheless, EJAYA, owing to its balanced search strategy, is better at escaping from local optima and selects a set of features that yields results of satisfactory quality.
To evaluate the performance of the proposed strategy, a robust statistical test is required to ensure the reliability and effectiveness of the results. For this purpose, the Friedman test, a non-parametric method commonly used to compare multiple algorithms, is applied. This test assesses the performance of optimizers by assigning average rankings based on their results, in which a lower ranking indicates superior performance. Consequently, the objective is to determine which algorithm achieves the lowest rank, signifying the highest efficiency and reliability among the algorithms [23,74,75,76,77,78].
According to the results presented in Table 8, the proposed method, EJAYA, achieves the lowest average rank, thereby outperforming the other algorithms under consideration. This demonstrates that EJAYA is the most effective optimization approach among the tested methods, validating its suitability for the intended application. The application of the Friedman test ensures that this conclusion is statistically significant and reliable, highlighting the robustness of the proposed strategy in various scenarios and applications.
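For reference, the Friedman test used above is readily available in SciPy; a minimal sketch with purely illustrative accuracy values (not the paper’s results):

from scipy.stats import friedmanchisquare

# Hypothetical per-dataset accuracies for three algorithms (illustrative numbers only)
ejaya = [0.95, 0.91, 0.99, 0.88, 0.93]
ga    = [0.90, 0.89, 0.97, 0.85, 0.90]
pso   = [0.92, 0.88, 0.98, 0.86, 0.91]

stat, p_value = friedmanchisquare(ejaya, ga, pso)
print(f"Friedman statistic = {stat:.3f}, p-value = {p_value:.4f}")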
Another factor for evaluating the performance of the algorithms is the computational burden. To this end, the time consumption of the algorithms for the 14th function is represented in Figure 5. As can be seen, GOA and DA have a high time complexity, while JAYA and the proposed method converge in under 1 s. From this, it can be observed that, not only did the performance of EJAYA improve, but it also led to convergence in less time.
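Wall-clock measurements such as those in Figure 5 can be obtained by wrapping each run in a simple timer; a brief sketch, reusing the jaya() and sphere() placeholders from the earlier sketch:

import time

start = time.perf_counter()
jaya(sphere, lb=-10.0, ub=10.0, ps=30, dv=30, mi=1000)   # one run of the earlier sketch
elapsed = time.perf_counter() - start
print(f"elapsed: {elapsed:.3f} s")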

5. Conclusions

This study proposes an innovative modified JAYA algorithm to handle both numerical optimization problems and feature selection, addressing a real-world optimization challenge. The developed algorithm’s efficiency was measured on 20 standard test functions, and the analysis shows that EJAYA substantially improves the balance between exploitation and exploration, which is reflected in a significantly enhanced convergence rate. In addition, EJAYA was used to address feature selection as a discrete problem, and its efficacy was confirmed using multiple UCI standard datasets. EJAYA’s results were compared to other methodologies, including WOASAT, ALO, the GA, PSO, etc. The results revealed that EJAYA’s performance is highly competitive in terms of optimal solutions and classification accuracy.
Future research endeavors may explore the application of EJAYA in areas such as image processing, artificial intelligence training, and beyond, to assess its adaptability and performance in diverse optimization challenges. Additionally, investigating its integration with emerging technologies and exploring novel problem domains can lead to valuable insights into the algorithm’s extended capabilities and its relevance in addressing complex real-world problems. To demonstrate the practical results of the proposed algorithm, it is also recommended to implement it on large-scale datasets. Moreover, it is also worthwhile to apply EJAYA to solve scheduling problems under uncertainties.

Author Contributions

All authors (J.J.B. and F.M.) contributed to the study’s conception, design, material preparation, data collection, and the writing and editing of the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

This article does not involve any studies with human or animal participants.

Data Availability Statement

The datasets analyzed during this study are available in the UCI Machine Learning Repository, http://archive.ics.uci.edu/ml.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Abdelmalek, S.; Dali, A.; Bettayeb, M.; Bakdi, A. A New Effective Robust Nonlinear Controller Based on PSO for Interleaved DC–DC Boost Converters for Fuel Cell Voltage Regulation. Soft Comput. 2020, 24, 17051–17064. [Google Scholar] [CrossRef]
  2. Sun, N.; Lu, Y. A Self-Adaptive Genetic Algorithm with Improved Mutation Mode Based on Measurement of Population Diversity. Neural Comput. Appl. 2019, 31, 1435–1443. [Google Scholar] [CrossRef]
  3. Behera, R.K.; Naik, D.; Rath, S.K.; Dharavath, R. Genetic Algorithm-Based Community Detection in Large-Scale Social Networks. Neural Comput. Appl. 2020, 32, 9649–9665. [Google Scholar] [CrossRef]
  4. Cui, L.; Li, G.; Zhu, Z.; Wen, Z.; Lu, N.; Lu, J. A Novel Differential Evolution Algorithm with a Self-Adaptation Parameter Control Method by Differential Evolution. Soft Comput. 2018, 22, 6171–6190. [Google Scholar] [CrossRef]
  5. Wang, S.; Li, Y.; Yang, H. Self-Adaptive Mutation Differential Evolution Algorithm Based on Particle Swarm Optimization. Appl. Soft Comput. J. 2019, 81, 105496. [Google Scholar] [CrossRef]
  6. Emami, H.; Sharifi, A.A. A Novel Bio-Inspired Optimization Algorithm for Solving Peak-to-Average Power Ratio Problem in DC-Biased Optical Systems. Opt. Fiber Technol. 2020, 60, 102383. [Google Scholar] [CrossRef]
  7. Wu, J.; Wang, Y.G.; Burrage, K.; Tian, Y.C.; Lawson, B.; Ding, Z. An Improved Firefly Algorithm for Global Continuous Optimization Problems. Expert Syst. Appl. 2020, 149, 113340. [Google Scholar] [CrossRef]
  8. Tian, M.; Bo, Y.; Chen, Z.; Wu, P.; Yue, C. Multi-Target Tracking Method Based on Improved Firefly Algorithm Optimized Particle Filter. Neurocomputing 2019, 359, 438–448. [Google Scholar] [CrossRef]
  9. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  10. Bogar, E.; Beyhan, S. Adolescent Identity Search Algorithm (AISA): A Novel Metaheuristic Approach for Solving Optimization Problems. Appl. Soft Comput. J. 2020, 95, 106503. [Google Scholar] [CrossRef]
  11. Arora, S.; Singh, S. Butterfly Optimization Algorithm: A Novel Approach for Global Optimization. Soft Comput. 2019, 23, 715–734. [Google Scholar] [CrossRef]
  12. Soltani, P.; Hadavandi, E. A Monarch Butterfly Optimization-Based Neural Network Simulator for Prediction of Siro-Spun Yarn Tenacity. Soft Comput. 2019, 23, 10521–10535. [Google Scholar] [CrossRef]
  13. Mirjalili, S. The Ant Lion Optimizer. Adv. Eng. Softw. 2015, 83, 80–98. [Google Scholar] [CrossRef]
  14. Feng, Z.K.; Niu, W.J.; Liu, S. Cooperation Search Algorithm: A Novel Metaheuristic Evolutionary Intelligence Algorithm for Numerical Optimization and Engineering Optimization Problems. Appl. Soft Comput. J. 2020, 98, 106734. [Google Scholar] [CrossRef]
  15. Faramarzi, A.; Heidarinejad, M.; Stephens, B.; Mirjalili, S. Equilibrium Optimizer: A Novel Optimization Algorithm. Knowledge-Based Syst. 2020, 191, 105190. [Google Scholar] [CrossRef]
  16. Askarzadeh, A. A Novel Metaheuristic Method for Solving Constrained Engineering Optimization Problems: Crow Search Algorithm. Comput. Struct. 2016, 169, 1–12. [Google Scholar] [CrossRef]
  17. Pakzad-Moghaddam, S.H.; Mina, H.; Mostafazadeh, P. A Novel Optimization Booster Algorithm. Comput. Ind. Eng. 2019, 136, 591–613. [Google Scholar] [CrossRef]
  18. Hashim, F.A.; Houssein, E.H.; Hussain, K.; Mabrouk, M.S.; Al-Atabany, W. Honey Badger Algorithm: New Metaheuristic Algorithm for Solving Optimization Problems. Math. Comput. Simul. 2022, 192, 84–110. [Google Scholar] [CrossRef]
  19. Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. African Vultures Optimization Algorithm: A New Nature-Inspired Metaheuristic Algorithm for Global Optimization Problems. Comput. Ind. Eng. 2021, 158, 107408. [Google Scholar] [CrossRef]
  20. Moazzeni, A.R.; Khamehchi, E. Rain Optimization Algorithm (ROA): A New Metaheuristic Method for Drilling Optimization Solutions. J. Pet. Sci. Eng. 2020, 195, 107512. [Google Scholar] [CrossRef]
  21. Kaur, S.; Awasthi, L.K.; Sangal, A.L.; Dhiman, G. Tunicate Swarm Algorithm: A New Bio-Inspired Based Metaheuristic Paradigm for Global Optimization. Eng. Appl. Artif. Intell. 2020, 90, 103541. [Google Scholar] [CrossRef]
  22. Kallioras, N.A.; Lagaros, N.D.; Avtzis, D.N. Pity Beetle Algorithm—A New Metaheuristic Inspired by the Behavior of Bark Beetles. Adv. Eng. Softw. 2018, 121, 147–166. [Google Scholar] [CrossRef]
  23. Zhang, J.; Xiao, M.; Gao, L.; Pan, Q. Queuing Search Algorithm: A Novel Metaheuristic Algorithm for Solving Engineering Optimization Problems. Appl. Math. Model. 2018, 63, 464–490. [Google Scholar] [CrossRef]
  24. Song, B.; Wang, Z.; Zou, L. An Improved PSO Algorithm for Smooth Path Planning of Mobile Robots Using Continuous High-Degree Bezier Curve. Appl. Soft Comput. 2021, 100, 106960. [Google Scholar] [CrossRef]
  25. Pekel, E. Solving Technician Routing and Scheduling Problem Using Improved Particle Swarm Optimization. Soft Comput. 2020, 24, 19007–19015. [Google Scholar] [CrossRef]
  26. Deng, W.; Yao, R.; Zhao, H.; Yang, X.; Li, G. A Novel Intelligent Diagnosis Method Using Optimal LS-SVM with Improved PSO Algorithm. Soft Comput. 2019, 23, 2445–2462. [Google Scholar] [CrossRef]
  27. Fan, L.; Chen, H.; Gao, Y. An Improved Flower Pollination Algorithm to the Urban Transit Routing Problem. Soft Comput. 2020, 24, 5043–5052. [Google Scholar] [CrossRef]
  28. Yan, C.; Li, M.; Liu, W. Prediction of Bank Telephone Marketing Results Based on Improved Whale Algorithms Optimizing S_Kohonen Network. Appl. Soft Comput. J. 2020, 92, 106259. [Google Scholar] [CrossRef]
  29. Zhang, Z.; Mao, L.; Guan, C.; Zhu, L.; Wang, Y. An Improved Scatter Search Algorithm for the Corridor Allocation Problem Considering Corridor Width. Soft Comput. 2020, 24, 461–481. [Google Scholar] [CrossRef]
  30. Alaei, M.; Khorsand, R.; Ramezanpour, M. An Adaptive Fault Detector Strategy for Scientific Workflow Scheduling Based on Improved Differential Evolution Algorithm in Cloud. Appl. Soft Comput. 2020, 99, 106895. [Google Scholar] [CrossRef]
  31. Liu, L.; Luo, S.; Guo, F.; Tan, S. Multi-Point Shortest Path Planning Based on an Improved Discrete Bat Algorithm. Appl. Soft Comput. J. 2020, 95, 106498. [Google Scholar] [CrossRef]
  32. Ouaddah, A.; Boughaci, D. Harmony Search Algorithm for Image Reconstruction from Projections. Appl. Soft Comput. J. 2016, 46, 924–935. [Google Scholar] [CrossRef]
  33. Gholami, J.; Pourpanah, F.; Wang, X. Feature Selection Based on Improved Binary Global Harmony Search for Data Classification. Appl. Soft Comput. J. 2020, 93, 106402. [Google Scholar] [CrossRef]
  34. Ouyang, H.; Wu, W.; Zhang, C.; Li, S.; Zou, D.; Liu, G. Improved Harmony Search with General Iteration Models for Engineering Design Optimization Problems. Soft Comput. 2019, 23, 10225–10260. [Google Scholar] [CrossRef]
  35. Gholami, J.; Ghany, K.K.A.; Zawbaa, H.M. A Novel Global Harmony Search Algorithm for Solving Numerical Optimizations. Soft Comput. 2020, 25, 2837–2849. [Google Scholar] [CrossRef]
  36. Tian, M.; Bo, Y.; Chen, Z.; Wu, P.; Yue, C. A New Improved Firefly Clustering Algorithm for SMC-PHD Filter. Appl. Soft Comput. J. 2019, 85, 105840. [Google Scholar] [CrossRef]
  37. Sinha, A.K.; Anand, A. Optimizing Supply Chain Network for Perishable Products Using Improved Bacteria Foraging Algorithm. Appl. Soft Comput. J. 2020, 86, 105921. [Google Scholar] [CrossRef]
  38. Chang, T.; Kong, D.; Hao, N.; Xu, K.; Yang, G. Solving the Dynamic Weapon Target Assignment Problem by an Improved Artificial Bee Colony Algorithm with Heuristic Factor Initialization. Appl. Soft Comput. J. 2018, 70, 845–863. [Google Scholar] [CrossRef]
  39. Zhao, Y.; Liu, H.; Gao, K. An Evacuation Simulation Method Based on an Improved Artificial Bee Colony Algorithm and a Social Force Model. Appl. Intell. 2020, 51, 100–123. [Google Scholar] [CrossRef]
  40. Hakli, H.; Kiran, M.S. An Improved Artificial Bee Colony Algorithm for Balancing Local and Global Search Behaviors in Continuous Optimization. Int. J. Mach. Learn. Cybern. 2020, 11, 2051–2076. [Google Scholar] [CrossRef]
  41. Shao, G.; Shangguan, Y.; Tao, J.; Zheng, J.; Liu, T.; Wen, Y. An Improved Genetic Algorithm for Structural Optimization of Au–Ag Bimetallic Nanoparticles. Appl. Soft Comput. J. 2018, 73, 39–49. [Google Scholar] [CrossRef]
  42. Wang, R.L.; Okazaki, K. An Improved Genetic Algorithm with Conditional Genetic Operators and Its Application to Set-Covering Problem. Soft Comput. 2007, 11, 687–694. [Google Scholar] [CrossRef]
  43. Zhao, Z.; Liu, B.; Zhang, C.; Liu, H. An Improved Adaptive NSGA-II with Multi-Population Algorithm. Appl. Intell. 2019, 49, 569–580. [Google Scholar] [CrossRef]
  44. Morales-Castañeda, B.; Zaldívar, D.; Cuevas, E.; Maciel-Castillo, O.; Aranguren, I.; Fausto, F. An Improved Simulated Annealing Algorithm Based on Ancient Metallurgy Techniques. Appl. Soft Comput. J. 2019, 84, 105761. [Google Scholar] [CrossRef]
  45. Li, Y.; Wang, C.; Gao, L.; Song, Y.; Li, X. An Improved Simulated Annealing Algorithm Based on Residual Network for Permutation Flow Shop Scheduling. Complex Intell. Syst. 2020, 7, 1173–1183. [Google Scholar] [CrossRef]
  46. El-Ashmawi, W.H.; Elminaam, D.S.A. A Modified Squirrel Search Algorithm Based on Improved Best Fit Heuristic and Operator Strategy for Bin Packing Problem. Appl. Soft Comput. J. 2019, 82, 105565. [Google Scholar] [CrossRef]
  47. Zhang, T.; Yue, Q.; Zhao, X.; Liu, G. An Improved Firework Algorithm for Hardware/Software Partitioning. Appl. Intell. 2019, 49, 950–962. [Google Scholar] [CrossRef]
  48. Sankhwar, S.; Gupta, D.; Ramya, K.C.; Sheeba Rani, S.; Shankar, K.; Lakshmanaprabu, S.K. Improved Grey Wolf Optimization-Based Feature Subset Selection with Fuzzy Neural Classifier for Financial Crisis Prediction. Soft Comput. 2020, 24, 101–110. [Google Scholar] [CrossRef]
  49. Rizk-Allah, R.M. An Improved Sine–Cosine Algorithm Based on Orthogonal Parallel Information for Global Optimization. Soft Comput. 2019, 23, 7135–7161. [Google Scholar] [CrossRef]
  50. Venkata Rao, R. Jaya: A Simple and New Optimization Algorithm for Solving Constrained and Unconstrained Optimization Problems. Int. J. Ind. Eng. Comput. 2016, 7, 19–34. [Google Scholar] [CrossRef]
  51. Rao, R.V.; Saroj, A. An Elitism-Based Self-Adaptive Multi-Population Jaya Algorithm and Its Applications. Soft Comput. 2019, 23, 4383–4406. [Google Scholar] [CrossRef]
  52. Leghari, Z.H.; Hassan, M.Y.; Said, D.M.; Jumani, T.A.; Memon, Z.A. A Novel Grid-Oriented Dynamic Weight Parameter Based Improved Variant of Jaya Algorithm. Adv. Eng. Softw. 2020, 150, 102904. [Google Scholar] [CrossRef]
  53. Ding, Z.; Li, J.; Hao, H. Structural Damage Identification Using Improved Jaya Algorithm Based on Sparse Regularization and Bayesian Inference. Mech. Syst. Signal Process. 2019, 132, 211–231. [Google Scholar] [CrossRef]
  54. Gholami, K.; Olfat, H.; Gholami, J. An Intelligent Hybrid JAYA and Crow Search Algorithms for Optimizing Constrained and Unconstrained Problems. Soft Comput. 2021, 25, 14393–14411. [Google Scholar] [CrossRef]
  55. Luu, T.V.; Nguyen, N.S. Parameters Extraction of Solar Cells Using Modified JAYA Algorithm. Optik 2020, 203, 164034. [Google Scholar] [CrossRef]
  56. Fan, J.; Shen, W.; Gao, L.; Zhang, C.; Zhang, Z. A Hybrid Jaya Algorithm for Solving Flexible Job Shop Scheduling Problem Considering Multiple Critical Paths. J. Manuf. Syst. 2021, 60, 298–311. [Google Scholar] [CrossRef]
  57. Aslan, M.; Gunduz, M.; Kiran, M.S. JayaX: Jaya Algorithm with Xor Operator for Binary Optimization. Appl. Soft Comput. J. 2019, 82, 105576. [Google Scholar] [CrossRef]
  58. Rao, R.V.; More, K.C. Design Optimization and Analysis of Selected Thermal Devices Using Self-Adaptive Jaya Algorithm. Energy Convers. Manag. 2017, 140, 24–35. [Google Scholar] [CrossRef]
  59. Ravipudi, J.L.; Neebha, M. Synthesis of Linear Antenna Arrays Using Jaya, Self-Adaptive Jaya and Chaotic Jaya Algorithms. AEU—Int. J. Electron. Commun. 2018, 92, 54–63. [Google Scholar] [CrossRef]
Figure 1. The search space of the benchmark functions.
Figure 2. The convergence rate of the different algorithms on F1–F6.
Figure 3. The convergence rate of the different algorithms on F7–F12.
Figure 4. The convergence rate of the different algorithms on F13–F20.
Figure 5. The time complexity of the different methods (time in seconds).
Table 1. The benchmark functions.
Test Function | Iterations | Dimension | Range
$F_1(x) = \sum_{i=1}^{n} x_i^2$ | 1000 | 30 | [−100, 100]
$F_2(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|$ | 1000 | 30 | [−10, 10]
$F_3(x) = (1.5 - x_1 + x_1 x_2)^2 + (2.25 - x_1 + x_1 x_2^2)^2 + (2.625 - x_1 + x_1 x_2^3)^2$ | 1000 | 30 | [−4.5, 4.5]
$F_4(x) = \max_i \{ |x_i|,\ 1 \le i \le n \}$ | 1000 | 30 | [−100, 100]
$F_5(x) = \sum_{i=1}^{n-1} \left[ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right]$ | 1000 | 30 | [−30, 30]
$F_6(x) = 0.5 + \dfrac{\sin^2(x_1^2 - x_2^2) - 0.5}{\left[ 1 + 0.001 (x_1^2 + x_2^2) \right]^2}$ | 1000 | 30 | [−100, 100]
$F_7(x) = \sum_{i=1}^{n} i x_i^4 + \mathrm{random}[0, 1)$ | 1000 | 30 | [−1.28, 1.28]
$F_8(x) = \sin^2(\pi w_1) + \sum_{i=1}^{d-1} (w_i - 1)^2 \left[ 1 + 10 \sin^2(\pi w_i + 1) \right] + (w_d - 1)^2 \left[ 1 + 10 \sin^2(2\pi w_d) \right]$, where $w_i = 1 + \dfrac{x_i - 1}{4}$ | 1000 | 30 | [−10, 10]
$F_9(x) = \sin^2(3\pi x_1) + (x_1 - 1)^2 \left[ 1 + \sin^2(3\pi x_2) \right] + (x_2 - 1)^2 \left[ 1 + \sin^2(2\pi x_2) \right]$ | 1000 | 30 | [−10, 10]
$F_{10}(x) = -20 \exp\left( -0.2 \sqrt{\tfrac{1}{n} \sum_{i=1}^{n} x_i^2} \right) - \exp\left( \tfrac{1}{n} \sum_{i=1}^{n} \cos(2\pi x_i) \right) + 20 + e$ | 1000 | 30 | [−32, 32]
$F_{11}(x) = \dfrac{1}{4000} \sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\left( \dfrac{x_i}{\sqrt{i}} \right) + 1$ | 1000 | 30 | [−600, 600]
$F_{12}(x) = \dfrac{\pi}{n} \left\{ 10 \sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 \left[ 1 + 10 \sin^2(\pi y_{i+1}) \right] + (y_n - 1)^2 \right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, where $y_i = 1 + \dfrac{x_i + 1}{4}$ and $u(x_i, a, k, m) = \begin{cases} k (x_i - a)^m & x_i > a \\ 0 & -a \le x_i \le a \\ k (-x_i - a)^m & x_i < -a \end{cases}$ | 1000 | 30 | [−50, 50]
$F_{13}(x) = 0.1 \left\{ \sin^2(3\pi x_1) + \sum_{i=1}^{n} (x_i - 1)^2 \left[ 1 + \sin^2(3\pi x_i + 1) \right] + (x_n - 1)^2 \left[ 1 + \sin^2(3\pi x_n) \right] \right\}$ | 1000 | 30 | [−50, 50]
$F_{14}(x) = \sum_{i=2}^{n} i \left( 2 x_i^2 - x_{i-1}^2 \right)^2 + (x_1 - 1)^2$ | 1000 | 30 | [−10, 10]
$F_{15}(x) = \sum_{i=1}^{n} \left| x_i \sin(x_i) + 0.1 x_i \right|$ | 1000 | 30 | [−1, 1]
$F_{16}(x) = \sum_{i=1}^{n} \left( 10^6 \right)^{\frac{i-1}{n-1}} x_i^2$ | 1000 | 30 | [−100, 100]
$F_{17}(x) = (x_1 - 1)^2 + \sum_{i=1}^{n} i \left( 2 x_i^2 - x_{i-1} \right)^2$ | 1000 | 30 | [−1, 1]
$F_{18}(x) = \sum_{i=1}^{n} i x_i^2$ | 1000 | 30 | [−100, 100]
$F_{19}(x) = \sum_{i=1}^{n} (x_i - 1)^2 - \sum_{i=2}^{n} x_i x_{i-1}$ | 1000 | 30 | $[-n^2, n^2]$
$F_{20}(x) = 1 - \cos\left( 2\pi \sqrt{\sum_{i=1}^{n} x_i^2} \right) + 0.1 \sqrt{\sum_{i=1}^{n} x_i^2}$ | 1000 | 30 | [−100, 100]
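For readers who want to reproduce the benchmark setup, the following is a minimal NumPy sketch of two of the functions reconstructed in Table 1 (the sphere function F1 and the Ackley function F10). The function names, the random seed, and the 30-dimensional test point are illustrative choices made here, not part of the paper.

```python
import numpy as np

def f1_sphere(x):
    """F1 from Table 1: sum of squared components (global minimum 0 at the origin)."""
    return float(np.sum(x ** 2))

def f10_ackley(x):
    """F10 from Table 1 (Ackley function), as reconstructed above."""
    n = x.size
    term1 = -20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
    term2 = -np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n)
    return float(term1 + term2 + 20.0 + np.e)

if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)       # seed chosen only for reproducibility
    x = rng.uniform(-32.0, 32.0, size=30)     # a 30-dimensional point within F10's range
    print("F1 :", f1_sphere(x))
    print("F10:", f10_ackley(x))
```

Both functions attain their global minimum of 0 at the origin, which is the value the compared algorithms approach in Tables 2 and 3.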
Table 2. The findings of F1–F6.
Function | Algorithm | Mean | STD | Worst | Best
F1 | SCA | 1.03246E−02 | 2.21581E−02 | 4.99561E−02 | 4.70286E−05
F1 | DA | 7.11906E+02 | 3.98958E+02 | 1.17339E+03 | 2.35173E+02
F1 | GOA | 4.37911E+00 | 2.77415E+00 | 8.76370E+00 | 2.02460E+00
F1 | MFO | 2.00224E+03 | 4.47089E+03 | 1.00000E+04 | 1.55478E−02
F1 | PSO | 9.26402E−09 | 1.56966E−08 | 3.68461E−08 | 1.69501E−11
F1 | CSA | 8.15910E−02 | 3.91503E−02 | 1.34609E−01 | 5.01376E−02
F1 | JAYA | 1.67017E−04 | 7.75324E−05 | 2.40270E−04 | 5.68884E−05
F1 | EJAYA | 4.51194E−52 | 5.40138E−52 | 1.33419E−51 | 4.30324E−54
F2 | SCA | 1.66467E−05 | 2.84255E−05 | 6.62285E−05 | 3.21435E−07
F2 | DA | 1.25115E+01 | 4.74389E+00 | 1.84707E+01 | 5.28163E+00
F2 | GOA | 4.87563E+00 | 3.68567E+00 | 1.11171E+01 | 1.98699E+00
F2 | MFO | 8.00000E+03 | 8.36660E+03 | 2.00000E+04 | 5.23906E−03
F2 | PSO | 5.41294E−04 | 6.34133E−04 | 1.60348E−03 | 7.37568E−06
F2 | CSA | 3.78993E+00 | 1.64683E+00 | 5.37818E+00 | 1.57713E+00
F2 | JAYA | 4.80091E−03 | 1.41672E−03 | 6.55615E−03 | 2.68801E−03
F2 | EJAYA | 2.05072E−33 | 3.80692E−33 | 8.84524E−33 | 1.04480E−35
F3 | SCA | 1.56908E−04 | 2.71951E−04 | 6.40716E−04 | 7.97609E−06
F3 | DA | 3.16037E−05 | 5.17153E−05 | 1.20071E−04 | 1.12308E−08
F3 | GOA | 1.52414E−01 | 3.40808E−01 | 7.62070E−01 | 3.31072E−15
F3 | MFO | 4.00004E+03 | 5.47723E+03 | 1.00001E+04 | 1.40428E−02
F3 | PSO | 9.13802E−02 | 2.04332E−01 | 4.56901E−01 | 0.00000E+00
F3 | CSA | 4.46483E−29 | 6.23888E−29 | 1.52669E−28 | 5.54668E−31
F3 | JAYA | 0.00000E+00 | 0.00000E+00 | 0.00000E+00 | 0.00000E+00
F3 | EJAYA | 0.00000E+00 | 0.00000E+00 | 0.00000E+00 | 0.00000E+00
F4 | SCA | 2.07301E+01 | 5.63654E+00 | 2.88824E+01 | 1.30543E+01
F4 | DA | 2.48837E+01 | 1.09484E+01 | 4.00000E+01 | 1.16565E+01
F4 | GOA | 1.08496E+01 | 3.61725E+00 | 1.63163E+01 | 8.20644E+00
F4 | MFO | 2.00007E+03 | 4.47212E+03 | 1.00000E+04 | 3.10525E−03
F4 | PSO | 4.25763E−01 | 1.67833E−01 | 6.65839E−01 | 2.62136E−01
F4 | CSA | 7.75578E+00 | 2.16753E+00 | 9.96141E+00 | 5.23108E+00
F4 | JAYA | 6.84824E+00 | 1.48590E+00 | 8.79045E+00 | 4.99795E+00
F4 | EJAYA | 3.61570E−05 | 6.60817E−05 | 1.53172E−04 | 4.33560E−08
F5 | SCA | 3.18710E+02 | 4.81347E+02 | 1.15414E+03 | 2.83299E+01
F5 | DA | 1.98429E+05 | 1.16600E+05 | 3.96126E+05 | 1.09284E+05
F5 | GOA | 3.36730E+03 | 3.58249E+03 | 8.65663E+03 | 1.73102E+02
F5 | MFO | 6.00002E+03 | 8.94426E+03 | 2.00000E+04 | 2.06509E−02
F5 | PSO | 7.99993E+01 | 3.47184E+01 | 1.14977E+02 | 2.56017E+01
F5 | CSA | 7.54176E+01 | 3.79232E+01 | 1.23610E+02 | 3.56602E+01
F5 | JAYA | 7.66403E+01 | 6.16420E+01 | 1.63148E+02 | 1.36139E+01
F5 | EJAYA | 4.01170E+01 | 3.68858E+01 | 1.06099E+02 | 2.33569E+01
F6 | SCA | 0.00000E+00 | 0.00000E+00 | 0.00000E+00 | 0.00000E+00
F6 | DA | 4.96992E−10 | 8.02601E−10 | 1.84421E−09 | 0.00000E+00
F6 | GOA | 2.44249E−15 | 1.44755E−15 | 3.99680E−15 | 2.22045E−16
F6 | MFO | 4.00002E+03 | 5.47722E+03 | 1.00000E+04 | 5.82894E−03
F6 | PSO | 0.00000E+00 | 0.00000E+00 | 0.00000E+00 | 0.00000E+00
F6 | CSA | 0.00000E+00 | 0.00000E+00 | 0.00000E+00 | 0.00000E+00
F6 | JAYA | 0.00000E+00 | 0.00000E+00 | 0.00000E+00 | 0.00000E+00
F6 | EJAYA | 0.00000E+00 | 0.00000E+00 | 0.00000E+00 | 0.00000E+00
Table 3. The F7–F12 findings.
Function | Algorithm | Mean | STD | Worst | Best
F7 | SCA | 4.80880E−02 | 3.63449E−02 | 1.09292E−01 | 1.27893E−02
F7 | DA | 3.38066E−01 | 1.98080E−01 | 5.66407E−01 | 1.41424E−01
F7 | GOA | 2.16150E−02 | 1.07270E−02 | 3.66619E−02 | 9.20112E−03
F7 | MFO | 2.00009E+03 | 4.47209E+03 | 1.00000E+04 | 7.58884E−03
F7 | PSO | 6.16951E−02 | 2.78374E−02 | 1.06271E−01 | 3.43940E−02
F7 | CSA | 4.86380E−02 | 1.51938E−02 | 6.31833E−02 | 2.79767E−02
F7 | JAYA | 7.59642E−02 | 3.91281E−02 | 1.39709E−01 | 3.96932E−02
F7 | EJAYA | 5.22059E−03 | 2.49125E−03 | 8.46121E−03 | 2.67151E−03
F8 | SCA | 2.08951E+00 | 7.27978E−02 | 2.16510E+00 | 1.98118E+00
F8 | DA | 8.91103E+00 | 4.03396E+00 | 1.36540E+01 | 3.94314E+00
F8 | GOA | 2.08458E+01 | 8.52657E+00 | 3.39048E+01 | 1.08618E+01
F8 | MFO | 1.99721E−02 | 2.00586E−02 | 5.14874E−02 | 1.15134E−03
F8 | PSO | 1.81730E−01 | 2.48844E−01 | 4.54324E−01 | 1.41313E−09
F8 | CSA | 2.41661E+00 | 1.14717E+00 | 3.56754E+00 | 9.53489E−01
F8 | JAYA | 2.30841E+00 | 1.58824E+00 | 4.63521E+00 | 9.08648E−01
F8 | EJAYA | 7.01740E−01 | 2.13049E−01 | 9.38772E−01 | 4.69613E−01
F9 | SCA | 9.10575E−04 | 8.84160E−04 | 2.47096E−03 | 3.83725E−04
F9 | DA | 1.83618E−04 | 3.84288E−04 | 8.70693E−04 | 8.51019E−07
F9 | GOA | 2.24691E−12 | 3.54081E−12 | 8.40632E−12 | 4.53539E−14
F9 | MFO | 8.00003E+03 | 4.47212E+03 | 1.00001E+04 | 5.10610E−02
F9 | PSO | 1.34978E−31 | 0.00000E+00 | 1.34978E−31 | 1.34978E−31
F9 | CSA | 3.22994E−28 | 3.18425E−28 | 8.05694E−28 | 6.08306E−29
F9 | JAYA | 1.34978E−31 | 0.00000E+00 | 1.34978E−31 | 1.34978E−31
F9 | EJAYA | 1.34978E−31 | 0.00000E+00 | 1.34978E−31 | 1.34978E−31
F10 | SCA | 1.41044E+01 | 9.07295E+00 | 2.03084E+01 | 1.88488E−02
F10 | DA | 9.33009E+00 | 1.40125E+00 | 1.08356E+01 | 7.67650E+00
F10 | GOA | 4.63716E+00 | 1.36267E+00 | 6.54250E+00 | 3.04090E+00
F10 | MFO | 4.00001E+03 | 5.47722E+03 | 1.00000E+04 | 6.47865E−03
F10 | PSO | 1.84941E−04 | 2.07915E−04 | 4.31328E−04 | 9.85355E−06
F10 | CSA | 4.42033E+00 | 4.32812E−01 | 4.96195E+00 | 3.89060E+00
F10 | JAYA | 3.99985E+00 | 8.92096E+00 | 1.99581E+01 | 7.47414E−03
F10 | EJAYA | 7.99361E−15 | 0.00000E+00 | 7.99361E−15 | 7.99361E−15
F11 | SCA | 2.67656E−01 | 2.61150E−01 | 5.40061E−01 | 1.06340E−03
F11 | DA | 1.16399E+01 | 4.54763E+00 | 1.72099E+01 | 6.25351E+00
F11 | GOA | 8.62586E−01 | 1.13266E−01 | 1.00433E+00 | 7.25621E−01
F11 | MFO | 2.00006E+03 | 4.47211E+03 | 1.00000E+04 | 1.05132E−02
F11 | PSO | 7.87895E−03 | 8.05111E−03 | 1.72263E−02 | 7.31268E−11
F11 | CSA | 3.51615E−01 | 1.13874E−01 | 4.57751E−01 | 2.22855E−01
F11 | JAYA | 8.08963E−02 | 1.26610E−01 | 3.03887E−01 | 5.41900E−04
F11 | EJAYA | 0.00000E+00 | 0.00000E+00 | 0.00000E+00 | 0.00000E+00
F12 | SCA | 2.42008E+02 | 5.36482E+02 | 1.20169E+03 | 6.33283E−01
F12 | DA | 5.15708E+01 | 4.82337E+01 | 1.30150E+02 | 3.91676E+00
F12 | GOA | 6.92309E+00 | 9.24018E−01 | 8.30938E+00 | 6.02632E+00
F12 | MFO | 6.00004E+03 | 8.94432E+03 | 2.00001E+04 | 2.13515E−03
F12 | PSO | 2.07338E−02 | 4.63622E−02 | 1.03669E−01 | 2.84415E−12
F12 | CSA | 5.38686E+00 | 1.52415E+00 | 6.90146E+00 | 2.97901E+00
F12 | JAYA | 8.02845E+00 | 3.70934E+00 | 1.43468E+01 | 4.95010E+00
F12 | EJAYA | 5.85022E−02 | 3.06103E−02 | 1.06179E−01 | 2.41507E−02
Table 4. The F13–F20 outcomes.
Function | Algorithm | Mean | STD | Worst | Best
F13 | SCA | 2.18862E+00 | 1.08652E+00 | 2.98211E+00 | 9.98054E−01
F13 | DA | 3.17541E+00 | 1.62401E+00 | 4.95049E+00 | 9.98004E−01
F13 | GOA | 9.98004E−01 | 2.00148E−16 | 9.98004E−01 | 9.98004E−01
F13 | MFO | 2.04105E+03 | 4.45009E+03 | 1.00000E+04 | 6.82542E−03
F13 | PSO | 3.76477E+00 | 2.01645E+00 | 5.92885E+00 | 1.99203E+00
F13 | CSA | 9.98004E−01 | 0.00000E+00 | 9.98004E−01 | 9.98004E−01
F13 | JAYA | 9.98455E−01 | 7.86438E−04 | 9.99822E−01 | 9.98004E−01
F13 | EJAYA | 9.98004E−01 | 2.71948E−16 | 9.98004E−01 | 9.98004E−01
F14 | SCA | 4.00116E+01 | 4.78411E+01 | 1.19870E+02 | 1.23403E+00
F14 | DA | 1.85503E+03 | 1.55822E+03 | 4.35763E+03 | 2.88031E+02
F14 | GOA | 1.77073E+00 | 2.99038E+00 | 7.02426E+00 | 5.84282E−03
F14 | MFO | 6.00002E+03 | 8.94427E+03 | 2.00000E+04 | 1.22123E−02
F14 | PSO | 1.03821E−02 | 1.61032E−02 | 3.82335E−02 | 6.80767E−04
F14 | CSA | 3.61472E−01 | 1.90615E−01 | 5.70487E−01 | 1.44379E−01
F14 | JAYA | 6.06570E−01 | 6.68239E−01 | 1.53341E+00 | 1.31801E−03
F14 | EJAYA | 4.99191E−04 | 2.28724E−04 | 8.08109E−04 | 2.63417E−04
F15 | SCA | 7.14393E−01 | 1.55307E+00 | 3.49250E+00 | 4.10077E−03
F15 | DA | 1.56568E+01 | 8.10350E+00 | 2.48120E+01 | 6.28732E+00
F15 | GOA | 7.89100E+00 | 3.77821E+00 | 1.38955E+01 | 4.12501E+00
F15 | MFO | 6.00005E+03 | 8.94427E+03 | 2.00000E+04 | 6.34759E−03
F15 | PSO | 6.02316E−04 | 1.01510E−03 | 2.41332E−03 | 4.86309E−05
F15 | CSA | 1.55546E+00 | 1.38591E+00 | 3.93547E+00 | 6.87757E−01
F15 | JAYA | 7.20921E+00 | 8.28920E+00 | 2.02500E+01 | 2.75554E−01
F15 | EJAYA | 1.24909E−08 | 2.79305E−08 | 6.24544E−08 | 3.69422E−31
F16 | SCA | 3.07360E+00 | 4.81821E+00 | 1.10442E+01 | 6.16382E−03
F16 | DA | 2.10898E+07 | 1.38457E+07 | 4.32807E+07 | 1.04843E+07
F16 | GOA | 8.44624E+06 | 5.85351E+06 | 1.67968E+07 | 1.04565E+06
F16 | MFO | 1.00000E+04 | 7.07106E+03 | 2.00000E+04 | 3.94652E−02
F16 | PSO | 4.11067E−04 | 6.21091E−04 | 1.49353E−03 | 1.74388E−06
F16 | CSA | 1.89219E+06 | 5.04692E+05 | 2.41621E+06 | 1.11103E+06
F16 | JAYA | 2.39280E+01 | 4.74185E+01 | 1.08578E+02 | 2.48714E−01
F16 | EJAYA | 2.35507E−50 | 3.86420E−50 | 9.19815E−50 | 1.42800E−52
F17 | SCA | 2.01648E−01 | 1.16116E+00 | 1.34264E+00 | −1.04745E+00
F17 | DA | 7.64331E+02 | 7.72456E+02 | 2.10686E+03 | 2.13861E+02
F17 | GOA | 7.06963E+00 | 7.54351E+00 | 1.99448E+01 | 2.12165E−01
F17 | MFO | 4.00002E+03 | 5.47721E+03 | 1.00000E+04 | 1.61671E−02
F17 | PSO | −1.00752E+00 | 2.60779E−01 | −5.41030E−01 | −1.12500E+00
F17 | CSA | 2.20753E+00 | 2.86786E+00 | 7.21599E+00 | 3.35290E−02
F17 | JAYA | −1.09034E+00 | 1.92613E−02 | −1.05758E+00 | −1.10662E+00
F17 | EJAYA | −1.12500E+00 | 1.48667E−06 | −1.12500E+00 | −1.12500E+00
F18 | SCA | 1.44106E−04 | 1.15066E−04 | 2.60150E−04 | 3.83455E−07
F18 | DA | 1.63188E+02 | 9.89062E+01 | 2.76239E+02 | 3.26153E+01
F18 | GOA | 1.57916E+00 | 1.93039E+00 | 4.88053E+00 | 5.46532E−02
F18 | MFO | 4.00002E+03 | 5.47721E+03 | 1.00000E+04 | 2.25808E−03
F18 | PSO | 7.19291E−07 | 9.89483E−07 | 2.03067E−06 | 7.13614E−10
F18 | CSA | 1.03218E+00 | 5.14330E−01 | 1.73690E+00 | 4.04716E−01
F18 | JAYA | 2.51463E−05 | 1.45855E−05 | 4.44227E−05 | 1.14162E−05
F18 | EJAYA | 1.45987E−53 | 1.60395E−53 | 3.99903E−53 | 1.27040E−54
F19 | SCA | 3.89447E+02 | 4.41530E+02 | 7.01655E+02 | 7.72382E+01
F19 | DA | 7.64813E+04 | 4.62739E+04 | 1.09202E+05 | 4.37607E+04
F19 | GOA | 2.01840E+03 | 1.15525E+03 | 2.83529E+03 | 1.20152E+03
F19 | MFO | 5.00000E+03 | 7.07106E+03 | 1.00000E+04 | 5.75507E−03
F19 | PSO | 1.60102E+02 | 2.00487E+02 | 3.01868E+02 | 1.83363E+01
F19 | CSA | 5.09682E+02 | 2.18121E+02 | 6.63916E+02 | 3.55447E+02
F19 | JAYA | 2.37153E+05 | 2.84377E+05 | 4.38238E+05 | 3.60687E+04
F19 | EJAYA | 1.60948E+01 | 2.13564E+00 | 1.76049E+01 | 1.45847E+01
F20 | SCA | 3.46460E−08 | 4.59083E−09 | 3.78922E−08 | 3.13998E−08
F20 | DA | 0.00000E+00 | 0.00000E+00 | 0.00000E+00 | 0.00000E+00
F20 | GOA | 3.39673E−12 | 4.72614E−12 | 6.73861E−12 | 5.48450E−14
F20 | MFO | 5.00004E+03 | 7.07102E+03 | 1.00000E+04 | 7.49789E−02
F20 | PSO | 0.00000E+00 | 0.00000E+00 | 0.00000E+00 | 0.00000E+00
F20 | CSA | 1.11599E−08 | 1.57825E−08 | 2.23199E−08 | 1.09912E−14
F20 | JAYA | 8.92734E−10 | 9.63121E−10 | 1.57376E−09 | 2.11705E−10
F20 | EJAYA | 1.73648E−09 | 2.39446E−09 | 3.42962E−09 | 4.33438E−11
Table 5. The UCI datasets used in this evaluation.
Dataset | No. of Features | No. of Samples
Breast cancer | 9 | 699
Breast EW | 30 | 569
Congress EW | 16 | 453
Exactly | 13 | 1000
Exactly2 | 13 | 1000
HeartEW | 13 | 270
Ionosphere EW | 34 | 351
Krvskp EW | 36 | 3196
Lymphography | 18 | 148
M-of-n | 13 | 1000
Penglung EW | 325 | 73
Sonar EW | 60 | 208
Spect EW | 22 | 267
Tic-tac-toe | 9 | 958
Vote | 16 | 300
Waveform EW | 40 | 5000
Wine EW | 13 | 178
Zoo | 16 | 101
Table 6. The assessment of different algorithms in terms of the accuracy.
Dataset | WOASAT-2 | ALO | GA | PSO | FULL | EJAYA
Breast cancer | 0.97 | 0.96 | 0.96 | 0.95 | 0.94 | 0.98
Breast EW | 0.98 | 0.93 | 0.94 | 0.94 | 0.96 | 0.97
Congress EW | 0.98 | 0.93 | 0.94 | 0.94 | 0.92 | 0.99
Exactly | 1.00 | 0.66 | 0.67 | 0.68 | 0.67 | 1.00
Exactly2 | 0.75 | 0.75 | 0.76 | 0.75 | 0.74 | 0.76
Heart EW | 0.85 | 0.83 | 0.82 | 0.78 | 0.82 | 0.86
Ionosphere EW | 0.96 | 0.87 | 0.83 | 0.84 | 0.87 | 0.96
Krvskp EW | 0.98 | 0.96 | 0.92 | 0.94 | 0.92 | 0.98
Lymphography | 0.89 | 0.79 | 0.71 | 0.69 | 0.68 | 0.90
M-of-n | 1.00 | 0.86 | 0.93 | 0.86 | 0.85 | 1.00
Penglung EW | 0.94 | 0.63 | 0.70 | 0.72 | 0.66 | 1.00
Sonar EW | 0.97 | 0.74 | 0.73 | 0.74 | 0.62 | 0.99
Spect EW | 0.88 | 0.80 | 0.78 | 0.77 | 0.83 | 0.88
Tic-tac-toe | 0.79 | 0.73 | 0.71 | 0.73 | 0.72 | 0.80
Vote | 0.97 | 0.92 | 0.89 | 0.89 | 0.88 | 0.97
Waveform EW | 0.76 | 0.77 | 0.77 | 0.76 | 0.77 | 0.78
Wine EW | 0.99 | 0.91 | 0.93 | 0.95 | 0.93 | 1.00
Zoo | 0.97 | 0.91 | 0.88 | 0.83 | 0.79 | 1.00
Average | 0.92 | 0.83 | 0.83 | 0.82 | 0.81 | 0.93
The best results are shown in bold.
Table 7. The number of selected features for different algorithms.
Dataset | WOASAT-2 | ALO | GA | PSO | FULL | EJAYA
Breast cancer | 4.2 | 6.28 | 5.09 | 5.72 | 5.4 | 6
Breast EW | 11.6 | 16.08 | 16.35 | 16.56 | 14 | 14
Congress EW | 6.4 | 6.98 | 6.62 | 6.83 | 7.8 | 6.2
Exactly | 6 | 6.62 | 10.82 | 9.75 | 7 | 6
Exactly2 | 2.8 | 10.7 | 6.18 | 6.18 | 7.2 | 1.6
Heart EW | 5.4 | 10.31 | 9.49 | 7.94 | 6.8 | 7.3
Ionosphere EW | 12.8 | 9.42 | 17.31 | 19.18 | 14.6 | 10.2
Krvskp EW | 18.4 | 24.7 | 22.43 | 20.81 | 19.4 | 11.8
Lymphography | 7.2 | 11.05 | 11.05 | 8.98 | 10.8 | 7.2
M-of-n | 6 | 11.08 | 6.83 | 9.04 | 7.6 | 6
Penglung EW | 127.4 | 164.13 | 177.13 | 178.75 | 141.2 | 124.1
Sonar EW | 26.4 | 37.92 | 33.3 | 31.2 | 29.2 | 19
Spect EW | 9.4 | 16.15 | 11.75 | 12.5 | 11.4 | 8.7
Tic-tac-toe | 6 | 6.99 | 6.85 | 6.61 | 6.6 | 6
Vote | 5.2 | 9.52 | 6.62 | 8.8 | 6.8 | 3.1
Waveform EW | 20.6 | 35.72 | 25.28 | 22.72 | 19 | 14.5
Wine EW | 6.4 | 10.7 | 8.63 | 8.36 | 6.2 | 4.3
Zoo | 5.6 | 13.97 | 10.11 | 9.74 | 7.4 | 6.8
Average | 15.98 | 22.68 | 21.76 | 21.64 | 18.24 | 14.60
The best results are shown in bold.
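As a rough illustration of how wrapper-style figures like those in Tables 6 and 7 are typically obtained, the sketch below scores one candidate feature subset by training a classifier on the selected columns and counting the selected features. The k-nearest-neighbour classifier, the 5-fold cross-validation, and the scikit-learn breast-cancer loader are assumptions chosen here to keep the example self-contained; the paper's exact evaluation protocol is not restated in this back matter.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def evaluate_subset(mask, X, y):
    """Wrapper-style evaluation of one binary feature mask: mean cross-validated
    accuracy on the selected columns, plus the number of selected features."""
    if not mask.any():                            # an empty subset cannot be scored
        return 0.0, 0
    clf = KNeighborsClassifier(n_neighbors=5)     # classifier choice is an assumption
    acc = cross_val_score(clf, X[:, mask], y, cv=5).mean()
    return float(acc), int(mask.sum())

X, y = load_breast_cancer(return_X_y=True)        # stand-in for the UCI sets of Table 5
rng = np.random.default_rng(seed=0)
mask = rng.random(X.shape[1]) > 0.5               # a random candidate subset, for illustration
accuracy, n_features = evaluate_subset(mask, X, y)
print(f"accuracy = {accuracy:.3f}, selected features = {n_features}")
```

A discrete optimizer such as the binary EJAYA variant would repeatedly propose masks like the one above and keep those that raise the accuracy while shrinking the subset, which is what the averages in Tables 6 and 7 summarize.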
Table 8. The results of the Friedman test.
Functions | SCA | DA | GOA | MFO | PSO | CSA | JAYA | EJAYA
F1 | 3 | 8 | 7 | 5 | 2 | 6 | 4 | 1
F2 | 2 | 8 | 7 | 5 | 3 | 6 | 4 | 1
F3 | 7 | 6 | 5 | 8 | 2 | 4 | 2 | 2
F4 | 8 | 7 | 6 | 2 | 3 | 5 | 4 | 1
F5 | 5 | 8 | 7 | 1 | 4 | 6 | 2 | 3
F6 | 3.5 | 3.5 | 7 | 8 | 3.5 | 3.5 | 3.5 | 3.5
F7 | 4 | 8 | 3 | 2 | 6 | 5 | 7 | 1
F8 | 6 | 7 | 8 | 2 | 1 | 5 | 4 | 3
F9 | 7 | 6 | 5 | 8 | 2 | 4 | 2 | 2
F10 | 5 | 8 | 6 | 3 | 2 | 7 | 4 | 1
F11 | 4 | 8 | 7 | 5 | 2 | 6 | 3 | 1
F12 | 4 | 6 | 8 | 2 | 1 | 5 | 7 | 3
F13 | 7 | 4 | 4 | 1 | 8 | 4 | 4 | 4
F14 | 7 | 8 | 4 | 5 | 2 | 6 | 3 | 1
F15 | 3 | 8 | 7 | 4 | 2 | 6 | 5 | 1
F16 | 3 | 8 | 6 | 4 | 2 | 7 | 5 | 1
F17 | 4 | 8 | 7 | 5 | 1.5 | 6 | 3 | 1.5
F18 | 3 | 8 | 6 | 5 | 2 | 7 | 4 | 1
F19 | 4 | 8 | 6 | 1 | 3 | 5 | 7 | 2
F20 | 7 | 1.5 | 4 | 8 | 1.5 | 3 | 6 | 5
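The entries in Table 8 appear to be the standard per-function Friedman ranks: on each test function the eight algorithms are ordered by their result (1 = best), with ties receiving the average of the tied ranks, which is where fractional values such as 3.5 and 1.5 come from. The following is a minimal sketch of that computation; the small matrix of mean values is invented purely to illustrate the input shape and is not the paper's data.

```python
import numpy as np
from scipy.stats import friedmanchisquare, rankdata

# Mean objective values per test function (rows) and per algorithm (columns),
# i.e. the kind of data reported in the "Mean" columns of Tables 2-4.
# These numbers are illustrative only.
means = np.array([
    [1.0e-02, 7.1e+02, 4.5e-52],
    [1.7e-05, 1.3e+01, 2.1e-33],
    [1.6e-04, 3.2e-05, 0.0e+00],
    [2.1e+01, 2.5e+01, 3.6e-05],
])

# Rank the algorithms on every function (1 = best, i.e. smallest value);
# ties receive the average of the tied ranks, which is how fractional
# ranks such as 1.5 or 3.5 can appear in Table 8.
ranks = np.apply_along_axis(rankdata, 1, means)
print("per-function ranks:\n", ranks)
print("mean rank per algorithm:", ranks.mean(axis=0))

# Friedman test over the per-function measurements (one argument per algorithm).
stat, p = friedmanchisquare(*means.T)
print(f"Friedman statistic = {stat:.3f}, p-value = {p:.4f}")
```

A lower mean rank indicates more consistently good performance across the benchmark functions, which is how the overall comparison in Table 8 should be read.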