1. Introduction
Every day, new and complex optimization problems arise in fields such as mathematics, industry, and engineering [1]. As these problems grow more complex, traditional optimization approaches incur high computing costs and tend to become trapped in local optima while solving them [2]. As a result, scientists have been looking for new techniques to address these problems [3]. Metaheuristic algorithms are promising solutions inspired by the food-finding habits of herd animals or by natural phenomena. Metaheuristic (MH) algorithms have many benefits, including the ability to resist local optima, the use of a gradient-free mechanism, and the provision of reasonable solutions regardless of problem structure [4].
Mathematical optimization is the process of locating, within an accessible domain, an item that has a maximum or minimum value for a given problem [5]. The advancement of optimization methods is critical since optimization problems arise in many fields of analysis. The majority of traditional optimization approaches are deterministic and rely on derivative information [6]. However, in real life, determining the optimal values of a problem is not always feasible [7]. Because of their derivative-free behavior and promising optimization ability, MH techniques are becoming increasingly popular. Other benefits of these approaches include flexibility, ease of implementation, and the ability to avoid local optima [8].
Exploration and exploitation are the two main mechanisms used in MH techniques [9]. Exploration refers to the opportunity to discover or visit new areas of the solution space. In contrast, exploitation refers to retrieving valuable knowledge from nearby regions of the already-found search domain. The balance between exploitation and exploration determines the consistency of the solutions found by any MH. These algorithms (i.e., MH) search using either a single candidate agent or a set of agents. Single-agent approaches are those based on one candidate agent, whereas population-based approaches are those based on a group of candidate solutions. MH algorithms are designed to mimic natural phenomena, and they can be divided into three categories depending on their source of inspiration: swarm intelligence-based algorithms, evolutionary algorithms, and physics-based algorithms.
Swarm intelligence has risen to prominence among nature-inspired strategies in recent years [10]. It is utilized to address real-life optimization problems and is based on the collective behavior of swarms or colonies of animals [11]. Swarm-based optimization algorithms use a collaborative trial-and-error approach to find the optimal solution. The Arithmetic Optimization Algorithm (AOA) [10], the Aquila Optimizer (AO) [12], and the Barnacles Mating Optimizer (BMO) [13] are well-known methods in this category. Many complicated optimization problems have been solved using this class of optimization algorithms, such as scheduling problems [14].
Recently, a new swarm-based optimization technique named the Jellyfish Search Algorithm (JSA) has been proposed [15]. This algorithm emulates the behaviour of a jellyfish swarm in nature. Owing to its characteristics, JSA has been applied to solve different sets of optimization problems. For example, JSA has been used to determine the optimal solution of global benchmark functions in [15], where its efficiency over other metaheuristic (MH) techniques was established. Gouda et al. [16] proposed an alternative method for estimating the parameters of PEM fuel cells. In [17], a multi-objective version of JSA was proposed and applied to solve multi-objective engineering problems. In [18], JSA was implemented to solve the spectrum defragmentation problem and showed better performance compared to other methods. Chou et al. [19] presented a time-series forecasting method for energy consumption and compared it with the teaching-learning-based optimization (TLBO) and symbiotic organism search (SOS) algorithms; JSA outperformed both. In addition, JSA has been applied in other fields, such as video watermarking [20].
Previous applications of JSA have demonstrated its ability to solve different optimization problems. However, it still suffers from some drawbacks that can affect its performance. For example, its ability to balance the exploration and exploitation phases during the search process needs further improvement. This motivated us to propose an alternative version of JSA that avoids the limitations of the conventional algorithm and to apply it as a global optimization technique.
The proposed developed version of JSA, called DJSD, is a dynamic differential annealed technique. The DJSD method integrates the Jellyfish Search Algorithm operators with dynamic differential annealed optimization [21] and the disruption operator to gain the advantages of both approaches. The proposed method is evaluated using various classical benchmark problems. Its performance is analyzed and compared with that of other methods used to solve the same problems. The results show that the presented method outperforms the comparative approaches, and it found new best solutions for several test cases. In addition, the method is extended to serve as a task scheduling technique in a cloud computing environment.
In conclusion, the following contributions are included in this paper:
Enhancing the Jellyfish Search Algorithm using the concepts of dynamic annealing and the disruption operator to improve its exploration and exploitation abilities during the search process.
Applying the developed method, named DJSD, as a global optimization method through evaluating it using different classical optimization benchmark functions against other well-known MH methods.
Offering an alternative task scheduling method to address cloud task scheduling problems.
The remaining sections of this paper are organized as follows. Section 2 presents the background of the jellyfish optimization algorithm, the Simulated Annealing algorithm, and the disruption operator. Section 3 introduces the steps of the proposed method. The experimental results and discussions are presented in Section 4. Finally, Section 5 concludes the paper.
3. Developed Method
The framework of the presented DJSD method is illustrated in Figure 1. The improved DJSD aims to enhance the performance of the traditional JSA using dynamic differential Simulated Annealing and a disruption operator. Each of these techniques is applied to enhance the exploration and exploitation of JSA.
The developed DJSD starts by producing the initial population and then computing the fitness value of each agent in this population. This is followed by determining the best agent, namely the one with the smallest fitness value. The following process updates the agents according to the time control mechanism (TCM) value, which determines whether the agents will be updated using the ocean current or the jellyfish swarm. In the latter case (i.e., the jellyfish swarm), the traditional operators of the JSA are applied to update the current agent. Otherwise, a competition between the JSA ocean-current operator and dynamic differential Simulated Annealing is used to update the current agent. Then, the SA mechanism, which decreases the probability of accepting a worse agent as the temperature is reduced, is applied. Finally, after updating the current population, the disruption operator is used to improve the diversity of X.
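To make this control flow concrete, the following Python sketch mirrors the described loop. It is a minimal illustration only: the function name, the simplified swarm and differential moves, and the greedy replacement rule are assumptions made for readability, not the paper's exact operators.

```python
import numpy as np

def djsd_sketch(fitness, lb, ub, n_agents=30, t_max=200, seed=0):
    """Illustrative sketch of the DJSD control flow (simplified operators).

    The time control mechanism (TCM) follows the JSA definition:
    c(t) = |(1 - t/t_max) * (2*rand - 1)|.
    """
    rng = np.random.default_rng(seed)
    dim = len(lb)
    X = lb + rng.random((n_agents, dim)) * (ub - lb)   # initial population
    fit = np.array([fitness(x) for x in X])

    for t in range(1, t_max + 1):
        c = abs((1 - t / t_max) * (2 * rng.random() - 1))  # TCM value
        for i in range(n_agents):
            if c < 0.5:
                # jellyfish-swarm motion: step toward/away from a random peer
                j = rng.integers(n_agents)
                cand = X[i] + rng.random(dim) * (X[j] - X[i]) * np.sign(fit[i] - fit[j])
            else:
                # ocean current vs. SA-style differential move (simplified)
                r1, r2 = rng.choice(n_agents, 2, replace=False)
                cand = X[i] + rng.random(dim) * (X[r1] - X[r2])
            cand = np.clip(cand, lb, ub)
            f_cand = fitness(cand)
            if f_cand < fit[i]:                        # greedy replacement
                X[i], fit[i] = cand, f_cand
    best = X[fit.argmin()].copy()
    return best, float(fit.min())
```

A usage example on the sphere function: `djsd_sketch(lambda x: float((x**2).sum()), np.array([-5.0, -5.0]), np.array([5.0, 5.0]))` returns the best agent found and its fitness.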
3.1. Initial Stage
The developed DJSD starts at this point by constructing an initial population X with N agents, and this is formulated as:

X_{ij} = rand × (UB_j − LB_j) + LB_j,  i = 1, 2, …, N,  j = 1, 2, …, D  (21)

In Equation (21), rand stands for a vector of D random values in [0, 1], and LB_j and UB_j refer to the lower and upper limits of the search domain, respectively.
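As a quick illustration of this initialization, the population matrix can be drawn uniformly at random within the bounds. The variable names follow the text; the concrete values of N, D, and the bounds below are arbitrary examples, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
N, D = 30, 10                       # population size and problem dimension (example values)
LB = np.full(D, -100.0)             # lower bound of the search domain
UB = np.full(D, 100.0)              # upper bound of the search domain

# Equation (21): each entry is LB_j + rand * (UB_j - LB_j)
X = LB + rng.random((N, D)) * (UB - LB)

assert X.shape == (N, D)
assert (X >= LB).all() and (X <= UB).all()
```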
3.2. Updating Stage
At this stage, the DJSD starts updating the agents within the current population (X) by calculating the fitness value f(X_i) for each agent X_i. The next step in DJSD is to allocate the best agent X_b, which has the best fitness value f(X_b). Then the value of TCM is updated using Equation (17). In cases where TCM < 0.5, the operators of the jellyfish swarm are used to update X_i. Otherwise, the combination of the ocean current, DJSD, and Simulated Annealing is used to enhance X_i. This is achieved based on the dynamic characteristics of the hammer during the search for the optimal solution, which represents a fluctuating parameter between the ocean current and the operator of SA. Hence, this process is formulated as in Equation (22).
In Equation (22), r is a random number and mod denotes the remainder (modulus) function. X_{r1} and X_{r2} are random solutions chosen from the current population X, while X_{rand} denotes a random solution generated in the search space.
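A hedged sketch of such a differential candidate move is given below. The symbol names X_{r1}, X_{r2}, and X_{rand} are taken from the description above, but the exact form of Equation (22), including how its mod-based switch combines the terms, is assumed here and may differ from the paper.

```python
import numpy as np

def differential_candidate(X, i, lb, ub, t, rng):
    """Sketch of a DDAO-style differential move (assumed form).

    On even iterations the candidate mixes two random population members;
    on odd iterations it additionally pulls toward a fresh random point,
    mirroring the mod-based switch described in the text.
    """
    r1, r2 = rng.choice(len(X), size=2, replace=False)
    x_rand = lb + rng.random(len(lb)) * (ub - lb)   # random solution in the space
    if t % 2 == 0:
        cand = X[i] + rng.random() * (X[r1] - X[r2])
    else:
        cand = X[i] + rng.random() * (X[r1] - X[r2]) + rng.random() * (x_rand - X[i])
    return np.clip(cand, lb, ub)                     # keep the candidate feasible
```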
After that, the new agent X_i^{new} is accepted more readily at elevated temperatures than at low temperatures, depending on the probability value p as in the traditional SA: a worse solution is accepted with probability p = exp(−(f(X_i^{new}) − f(X_i))/T), where T is the current temperature.
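Since the acceptance rule follows the traditional SA, it can be sketched with the standard Metropolis criterion. This is the common textbook form; the paper's exact equation is assumed to match this shape.

```python
import math
import random

def accept(f_new, f_old, T, rng=random):
    """Metropolis acceptance rule of classical Simulated Annealing.

    Better (or equal) solutions are always accepted; worse solutions are
    accepted with probability p = exp(-(f_new - f_old) / T), which shrinks
    as the temperature T is reduced.
    """
    if f_new <= f_old:
        return True
    p = math.exp(-(f_new - f_old) / T)
    return rng.random() < p
```

For example, a worse candidate is almost never accepted at a very low temperature, while at a very high temperature it is accepted almost surely.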
The next step is to apply the disruption operator to the current updated population X. However, this process takes extra time, which increases the computational cost of the developed method. Accordingly, the disruption operator is applied only conditionally rather than at every iteration.
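For illustration, a common form of the disruption operator from the literature can be sketched as follows. The scaling rule, the threshold, and the perturbation ranges below are assumptions based on that common form; the paper's exact variant and its application condition may differ.

```python
import numpy as np

def disruption(X, fit, rng, threshold=1.0):
    """Sketch of a disruption operator (assumed common form).

    Each agent i is rescaled by a factor D_op computed from its distance to
    its nearest neighbour (R_ij) and to the best agent (R_ib):
        D_op = R_ij * U(-0.5, 0.5)          if R_ib >= threshold
        D_op = 1 + R_ib * U(-5e-5, 5e-5)    otherwise
    so agents far from the best are scattered, while agents near the best
    (including the best itself) are only slightly perturbed.
    """
    best = X[fit.argmin()]
    X_new = X.copy()
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf
        r_ij = d.min()                           # nearest-neighbour distance
        r_ib = np.linalg.norm(best - X[i])       # distance to the best agent
        if r_ib >= threshold:
            d_op = r_ij * rng.uniform(-0.5, 0.5)
        else:
            d_op = 1 + r_ib * rng.uniform(-5e-5, 5e-5)
        X_new[i] = X[i] * d_op
    return X_new
```

Note that the best agent is left unchanged (its distance to itself is zero, so D_op = 1), which preserves the incumbent solution while diversifying the rest of the population.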
3.3. Terminal Stage
The stop conditions are checked within this stage; if they are not met, the updating stage is repeated. Otherwise, the best solution found so far is returned.
5. Conclusions
The artificial Jellyfish Search Algorithm (JSA) is a recent, promising search method that simulates the behavior of jellyfish in the ocean. It has been applied to solve various optimization problems. However, it faces some problems in the search process while solving complicated problems, particularly the local optima problem and the low diversity of candidate solutions. This paper suggests a novel dynamic search method, called DJSD, based on combining the artificial jellyfish search optimizer with two search techniques: Simulated Annealing and the disruption operator. The enhancement of the proposed method occurs in two stages. In the first stage, the Simulated Annealing operators are incorporated into the artificial jellyfish search optimizer to enhance its ability to discover more feasible regions in a competitive manner. This modification is performed dynamically by using a fluctuating parameter representing a hammer's characteristics to keep the solutions diverse and balance the search processes. In the second stage, the disruption operator is employed in the exploitation phase to further improve the diversity of the candidate solutions throughout the optimization process and avoid the local optima problem.
Two experiment series were conducted to validate the performance of the proposed DJSD method. In the first, thirty classical benchmark functions were used to validate the effectiveness of DJSD compared with other well-known search methods. The findings revealed that the suggested DJSD approach obtained encouraging results, discovered new search regions, and found new best solutions for most test cases. In the second, a set of tests was conducted on the task scheduling problem in cloud computing applications to further prove DJSD's ability to function in real-world settings. The results confirmed that the proposed DJSD is competent in dealing with challenging real applications. Moreover, it achieved high performance compared to other similar methods on several standard evaluation measures, including fitness function, makespan, and energy consumption.
The proposed method can be tested further to find potential improvements in future work. Furthermore, it can be combined with other search methods to further improve its search capability for complicated problems. Its performance can also be investigated on different optimization problems, such as text clustering, photovoltaic cell parameter estimation, engineering and industrial optimization problems, forecasting models, feature selection, image segmentation, and multi-objective problems.