1. Introduction
Many problems in the natural and applied sciences are modeled by systems of nonlinear equations that must be solved, written as F(x) = 0, where F(x) = (f_1(x), f_2(x), …, f_n(x))^T and each component f_i : R^n → R is nonlinear. It is well known that determining the precise solution of such a nonlinear system is a difficult undertaking, especially when the equations contain logarithmic, exponential, or trigonometric terms, or a mix of transcendental terms. Thus, finding approximate solutions to this type of problem has become a necessity. Iterative methods, including Newton’s method, are among the most widely used techniques for finding approximate solutions to nonlinear equation systems (NESs) [
1]. Alternatively, optimization algorithms have been applied in attempts to extract the root solution of nonlinear systems.
In the last ten years, various optimization algorithms have been developed. Those methods can be divided into four primary categories: human-based methods, swarm-based methods, physical-based methods, and evolutionary-based methods [
2]. Human-based methods are inspired by human perception, attitudes, or lifestyles. Examples of these methods are the “Harmony Search Algorithm (HSA)” [
3] and the “Fireworks Algorithm (FA)” [
4]. Swarm-based methods mimic the behavior of swarms or animals as they reproduce or survive. Examples of these methods are “Sperm Swarm Optimization (SSO)” [
5,
6,
7,
8], “Harris Hawks Optimization (HHO)” [
9], “The Ant Lion Optimizer (ALO)” [
10], and “Butterfly Optimization Algorithm (BOA)” [
11]. Some representative swarm intelligence optimization methods and applications have also been proposed; see for example, [
12]. Physical-based methods are inspired by physical theories and the rules of the universe. Examples of these algorithms are the “Gravitational Search Algorithm (GSA)” [
2] and the “Equilibrium Optimizer (EO)” [
13]. Evolutionary-based methods are inspired by the Darwinian theory of evolution. An example of these methods is the “Genetic Algorithm (GA)” [
14]. Finally, some advanced optimization methods with real-life applications have been proposed; see, for example, [
15,
16].
The primary objectives of these methods are to yield the optimal solution and a high convergence rate. Meta-heuristic optimization should be based on the concepts of exploration and exploitation to reach globally optimal solutions. Exploitation refers to the ability of a method to converge to the best potential solution, whereas exploration refers to its ability to search the entire problem domain. Therefore, the main goal of meta-heuristic methods is to balance these two concepts.
Different meta-heuristic methods have been developed to solve various real-life tasks, and the use of optimization algorithms for solving NESs is both significant and critical. The main approaches reported in the literature may be summarized as follows:
By improving the performance of optimization algorithms, researchers have been able to target more accurate solutions. For example, Zhou and Li [
17] provided a unified solution to nonlinear equations using a modified CSA version. FA was modified by Ariyaratne et al. [
18], who made it possible to make the root approximation simultaneously with continuity, differentiation, and initial assumptions. Ren et al. [
19] proposed another variation by combining GA with harmonic and symmetric individuals. Chang [
20] also revised the GA to estimate better parameters for NESs.
Furthermore, complex systems were handled by Grosan and Abraham [
21] by reformulating them as multi-objective optimization problems. Jaberipour et al. [
22] addressed NESs using a modified PSO method; the modification aims to overcome the drawbacks of the core PSO, such as slow convergence and trapping in local minima. Further, NESs have been addressed by Mo and Liu [
23], who incorporated the “Conjugate Direction Method (CDM)” into the PSO algorithm. Using CDM increased the algorithm’s efficiency in solving high-dimensional problems and overcoming local minima [
24].
Several studies have combined two population-based algorithms (PBAs) to achieve more precise results for nonlinear systems. These combinations produce hybrid algorithms that inherit the benefits of both techniques while reducing their downsides [
25]. Hybrid ABC [
26], hybrid ABC and PSO [
27], hybrid FA [
28], hybrid GA [
29], hybrid KHA [
30], hybrid PSO [
31], and many others [
32,
33,
34,
35,
36] are some examples of hybridizing PBAs.
NESs have often been solved using optimization techniques, either a “Single Optimization Algorithm (SOA)” or a hybrid algorithm that combines two optimization procedures. Only a few researchers have attempted to combine an iterative method with an optimization approach. Karr et al. [
37] presented a hybrid method combining Newton’s method and GA for obtaining solutions for nonlinear testbed problems. After using GA to identify the most efficient starting solution, Newton’s approach was utilized. To solve systems of nonlinear models, a hybrid algorithm described by Luo et al. [
38] can be utilized; the combination includes GA, Powell algorithm, and Newton’s method. Luo et al. [
39] have provided a method for solving NESs by integrating chaos and quasi-Newton techniques. Most of the previous research has concentrated on a specific topic or application rather than examining NESs in general. In a relatively recent study, Sihwail et al. [
40] developed a hybrid algorithm known as NHHO, which combines the Harris Hawks optimization method and Newton’s method to solve arbitrary NESs. Very recently, Sihwail et al. [
41] proposed a new algorithm for solving NESs in which Jarratt’s iterative approach and the Butterfly optimization algorithm are combined to create the scheme known as JBOA.
A hybrid algorithm can leverage the benefits of one method while overcoming the drawbacks of the other. However, most hybrid methods face problems with premature convergence due to the technique used in the original algorithms [
42]. As a result, choosing a dependable combination of algorithms to produce an efficient hybrid algorithm is a crucial step.
One of the more recent swarm-based methods is Sperm Swarm Optimization (SSO), which is inspired by the motility of a swarm of sperm moving to fertilize an ovum. SSO has various benefits, which can be listed as follows [
2,
5,
6]:
SSO has a very strong exploitation capability.
Several kinds of research have validated its simplicity, efficiency, and ability to converge to the optimal solution.
Its theory can be applied to a wide range of problems in the areas of engineering and science.
Its mathematical formulation is easy to implement, understand, and utilize.
However, most NESs model data science and engineering problems that have more than one solution; hence, it is difficult to provide accurate solutions to these problems. Like other optimization algorithms, SSO may fall into a local minimum instead of the global optimum. To mitigate this drawback, we developed a hybrid approach that incorporates Newton’s iterative scheme into the SSO algorithm. It is worth mentioning that Newton’s method is the earliest known iterative scheme for solving nonlinear equations by successive approximation. With Newton’s method, the number of correct digits roughly doubles at each step, a property referred to as second-order (quadratic) convergence.
Newton’s method is highly dependent on choosing a suitable initial point. To achieve good convergence toward the root, Newton’s method, like other iterative approaches, requires a starting point that is close enough to the root. The scheme may converge slowly or diverge if the initial point is poorly chosen. Consequently, Newton’s method can perform only a limited local search in some cases.
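For illustration, the following is a minimal sketch of Newton’s iteration for a system, assuming the standard update x_{n+1} = x_n − J(x_n)^{-1} F(x_n); the residual function, Jacobian, and starting point below are hypothetical examples, not the benchmarks considered later in this paper.

```python
import numpy as np

def newton_system(F, J, x0, tol=1e-12, max_iter=50):
    """Newton iteration for F(x) = 0: x_{k+1} = x_k - J(x_k)^{-1} F(x_k).

    Converges quadratically near a root, but may stall or diverge if the
    starting point x0 is not close enough to the root.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        step = np.linalg.solve(J(x), F(x))   # solve J(x) s = F(x) rather than inverting J
        x = x - step
        if np.linalg.norm(F(x)) < tol:
            break
    return x

# Hypothetical two-equation example (for illustration only):
F = lambda x: np.array([x[0]**2 + x[1] - 2.0, x[0] - x[1]**2])
J = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, -2.0 * x[1]]])
print(newton_system(F, J, x0=[1.5, 0.5]))   # converges to the root (1, 1)
```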
For the reasons outlined above, a hybrid SSO algorithm (MSSO) has been proposed to solve NESs, where Newton’s method is applied to improve the search technique and SSO is used to enhance the selection of initial solutions and make global search more efficient.
This study is not concerned with demonstrating that hybridizing SSO and Newton’s method performs better than other optimization algorithms such as PSO or genetic algorithms. Rather, this work aims to highlight the benefits of hybridizing an optimization algorithm with an iterative method: to enhance the iterative method’s accuracy in solving nonlinear systems and to reduce its complexity. The hybrid is also able to overcome several drawbacks of Newton’s method, such as initial point selection, trapping in local optima, and divergence. Moreover, the hybridization in MSSO is beneficial in finding better roots for the selected NESs. Optimization algorithms alone are unlikely to provide solutions as precise as those of iterative methods such as Newton’s method and Jarratt’s method.
The proposed modification improves the initial solution distribution in the search space domain. Moreover, compared to the random distribution used by the original technique, Newton’s approach improves the computational accuracy of SSO and accelerates its convergence rate. Hence, this research paper aims to improve the accuracy of NES solutions. The following are the main contributions of this paper:
We present a Modified Newton–Sperm Swarm Optimization Algorithm (MSSO) that combines Newton’s method and SSO to enhance its search mechanism and speed up its convergence rate.
The proposed MSSO method is intended to solve nonlinear systems of different orders.
Different optimization techniques were compared with MSSO, including the original SSO, PSO, ALO, BOA, HHO, and EO. The comparison was made based on multiple metrics, such as accuracy, fitness value, stability, and convergence speed.
The rest of the paper is organized as follows:
Section 2 discusses SSO algorithms and Newton’s iterative method.
Section 3 describes the proposed MSSO.
Section 4 describes the experiments on the benchmark systems and their results. Further discussion of the findings is provided in
Section 5. Finally,
Section 6 presents the study’s conclusion.
3. Modified Sperm Swarm Optimization (MSSO)
SSO is a powerful optimization technique that can address various issues. No algorithm, however, is suitable for tackling all problems, according to the “No Free Lunch (NFL)” theorem [
48]. By using Newton’s method, the proposed MSSO outperforms the original SSO in solving nonlinear equation systems. In MSSO, Newton’s method is used as a local search to enhance the search process, as shown in
Figure 3.
When Newton’s method is applied to the sperm position at each iteration, the fitness value of the potential solution is compared to the fitness of the location calculated by Newton’s scheme. The location newly computed by Newton’s method is shown in
Figure 3 as Xn+1.
In each iteration, MSSO employs both the SSO algorithm and Newton’s method. SSO first determines the best sperm location among the twenty candidate locations and treats it as a temporary (potential) solution. This solution is then fed into Newton’s method, which, as an iterative method, calculates the next candidate solution according to Equation (6). Newton’s method is very likely to find a better candidate since it has second-order convergence. However, to avoid a locally optimal solution, the candidate obtained from Newton’s method (Xn+1) is compared to the solution calculated by SSO (Xsperm), and the location with the lower fitness value is taken as the current potential solution to the problem. The next iteration is then performed from this most promising solution. Algorithm 1 shows the pseudocode of the suggested MSSO algorithm.
Algorithm 1. Modified Sperm Swarm Optimization (MSSO).
Begin
Step 1:  Initialize the potential solutions (sperm swarm).
Step 2:  for i = 1 : size of flock do
Step 3:      evaluate the fitness of potential solution i.
             if the obtained fitness is better than the best fitness recorded for sperm i then
                 set the current value as the best solution of sperm i.
             end if
         end for
Step 4:  Determine the winner (the sperm with the best fitness) and record its value Xsperm.
Step 5:  for i = 1 : size of flock do
             Perform Equation (5).
             Perform Equation (1).
         end for
Step 6:  Calculate Newton’s location Xn+1 using Equation (6).
         Calculate the fitness of Xn+1 and Xsperm using Equation (7).
         if fitness(Xn+1) < fitness(Xsperm) then
             set Xsperm = Xn+1.
         end if
Step 7:  while the final iteration is not reached, go to Step 2.
End.
The initialization, exploitation, and exploration phases of the SSO method are shown in the algorithm. The alterations, highlighted in the red box, are applied at the end of each iteration: Newton’s location is compared with the sperm’s optimal location based on their fitness values, and the one with the better fitness value is selected.
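To make the hybrid step concrete, the following is a hedged sketch of the MSSO loop under stated assumptions: toy_update is only a placeholder for the SSO velocity and position rules (Equations (1) and (5)), which are not reproduced here; newton_step assumes Equation (6) is the standard Newton update; and the fitness assumes Equation (7) is the Euclidean-norm fitness described in Section 4. The names and parameters are illustrative, not the authors’ exact implementation.

```python
import numpy as np

def newton_step(F, J, x):
    """Newton candidate X_{n+1} = x - J(x)^{-1} F(x) (assumed form of Equation (6))."""
    return x - np.linalg.solve(J(x), F(x))

def fitness(F, x):
    """Euclidean-norm fitness of the residual (assumed form of Equation (7))."""
    return np.linalg.norm(F(x))

def toy_update(swarm, best, step=0.5):
    """Placeholder for the SSO movement rules (Equations (1) and (5)), not the real ones:
    each sperm simply drifts toward the current winner with random step sizes."""
    return swarm + step * np.random.rand(*swarm.shape) * (best - swarm)

def msso_like_search(F, J, sso_update, n_sperm=20, n_iter=50, lo=-10.0, hi=10.0, dim=2):
    """Sketch of one MSSO run: SSO proposes a swarm-best location, Newton's method
    refines it, and the candidate with the lower fitness is kept (Step 6 of Algorithm 1)."""
    swarm = np.random.uniform(lo, hi, size=(n_sperm, dim))   # Step 1: initialization
    best = min(swarm, key=lambda x: fitness(F, x))            # Steps 2-4: current winner
    for _ in range(n_iter):
        swarm = sso_update(swarm, best)                        # Step 5: SSO movement
        x_sperm = min(swarm, key=lambda x: fitness(F, x))      # swarm's best candidate
        try:
            x_newton = newton_step(F, J, x_sperm)              # Step 6: Newton refinement
        except np.linalg.LinAlgError:
            x_newton = x_sperm                                 # singular Jacobian: keep SSO candidate
        candidate = x_newton if fitness(F, x_newton) < fitness(F, x_sperm) else x_sperm
        if fitness(F, candidate) < fitness(F, best):
            best = candidate
    return best

# Hypothetical usage with the toy residual from the earlier Newton sketch:
F = lambda x: np.array([x[0]**2 + x[1] - 2.0, x[0] - x[1]**2])
J = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, -2.0 * x[1]]])
root = msso_like_search(F, J, toy_update)
print(root, np.linalg.norm(F(root)))
```

In an actual implementation, toy_update would be replaced by the SSO velocity and position updates of Equations (1) and (5).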
Computational Complexity
The complexity of the new MSSO can be obtained by adding the SSO’s complexity to Newton’s method’s complexity. At first glance, Newton’s technique appears computationally expensive compared to optimization methods. At each iteration, one has to solve a system of linear equations, which is time-consuming because, for a system of n equations, every Jacobian evaluation requires n² scalar function evaluations. As a result, combining Newton’s approach with any optimization process is likely to make it more complicated.
On the other hand, combining SSO with Newton’s technique did not significantly increase the processing time. Moreover, MSSO can overcome Newton’s method’s limitations, including the selection of starting points and divergence difficulties. As a result, MSSO is superior at solving nonlinear equation systems.
The MSSO’s time complexity is influenced by the initial phase, the process of updating the positions of the sperm, and the use of Newton’s scheme. The complexity of the initialization process is O(S), where S is the total number of sperm. The updating process, which includes determining the optimal solution and updating the sperm positions, has a complexity of O(I × S) + O(I × S × M), where I and M represent the maximum number of iterations and the complexity of the tested benchmark equation, respectively. Furthermore, the complexity of Newton’s scheme is O(I × T), where T is its computation time per iteration. Consequently, the proposed MSSO has an overall computational complexity of O(S × (I + IM + 1) + IT).
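Writing the sum out makes the total explicit (same symbols as above): O(S) + O(I × S) + O(I × S × M) + O(I × T) = O(S × (I + IM + 1) + IT).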
Every improvement certainly has a cost. The principal objective of the proposed hybrid algorithm is to improve the fitness value and the convergence speed of the existing algorithms. However, as a result of adding one algorithm to another, the complexity and time cost of the hybrid algorithm are increased compared to the original algorithm. Ultimately, a tradeoff between the merits and disadvantages should be considered when using any algorithm.
4. Numerical Tests
Eight nonlinear systems of several orders were selected as indicators to clarify the efficiency and capability of the new hybrid MSSO scheme. Comparisons between MSSO and the other six well-known optimization algorithms have been performed. Those optimization algorithms are the original SSO [
2], HHO [
9], PSO [
49], ALO [
10], BOA [
11], and EO [
13]. For consistency, all selected systems used in the comparisons are arbitrary problems that are common in the literature, for instance, [
19,
21,
40,
44,
50,
51,
52,
53].
The comparison between the optimization algorithms is based on the fitness value achieved by each algorithm on each benchmark. A solution with a lower fitness value is more accurate than one with a higher fitness value; hence, the most effective optimization algorithm is the one that attains the lowest fitness value. The fitness function used in the comparison is the Euclidean norm, also called the 2-norm, which measures the distance of the residual vector from the origin and is expressed as follows:
fitness(x) = ‖F(x)‖₂ = √(f₁(x)² + f₂(x)² + ⋯ + fₙ(x)²).
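As a small illustration of this fitness measure (a sketch only; the residual function below is a hypothetical example, not one of the benchmarks that follow):

```python
import numpy as np

def fitness(F, x):
    """Euclidean-norm (2-norm) fitness of the residual vector F(x);
    a value of zero means x solves the system exactly."""
    return np.linalg.norm(F(x))

# Hypothetical two-equation residual, for illustration only.
F = lambda x: np.array([np.cos(x[0]) - x[1], x[0]**2 + x[1]**2 - 1.0])
print(fitness(F, np.array([0.0, 1.0])))   # -> 0.0, since (0, 1) solves this toy system
```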
The same settings have been used in all benchmarks to guarantee a fair comparison of the selected algorithms. The parameter values of all optimization algorithms have been fine-tuned to improve their performance. Every optimization method was run 30 times, and its best solution was recorded. The number of search agents (population size) was set to 20 and the maximum number of iterations to 50. Furthermore, if a particular benchmark has more than one solution, the solution with the lowest fitness value is chosen. Finally, for lack of space, solutions are reported to 11 decimal places.
Calculations were conducted using MATLAB software version R2020a with the default variable precision of 16 digits. This was on an Intel Core i5 processor running at 2.2 GHz and 8 GB of RAM under the Microsoft Windows 8 operating system.
Problem 1: Let us consider the first problem to be the following nonlinear system of two equations:
The precise solution of this system is known. After running the algorithms 30 times, MSSO significantly surpassed all other optimization algorithms in the comparison.
Table 1 shows that the proposed hybrid MSSO algorithm attained the best solution, with a fitness value equal to zero. This means that the solution obtained by MSSO is an exact solution of the given system.
Problem 2: The second benchmark is the system of two nonlinear equations given by:
The exact zero of the system in this problem is known. As shown in
Table 2, it is evident that MSSO achieved the exact solution of this system with a fitness value of zero. It also outperformed all the other algorithms by a substantial margin, especially SSO, BOA, and HHO.
Problem 3: The third system of nonlinear equations is given by:
This NES of three equations has a known exact solution. According to
Table 3, the proposed MSSO achieved a zero fitness value. The superiority of MSSO is evident in this example, with a significant difference between MSSO and all other compared optimization algorithms.
Problem 4: Consider the following system of three nonlinear equations:
The precise solution of the nonlinear system in this problem is known. The best solution achieved by the compared schemes for the given system is illustrated in
Table 4. The proposed MSSO found a precise answer, with a fitness value of zero. ALO recorded the second-best solution with a fitness value of 2.27 × 10⁻⁶, while the rest of the compared algorithms were far from the exact answer. Again, the proposed MSSO has proved that it has an efficient local search mechanism; hence, it can achieve more accurate solutions for nonlinear systems.
Problem 5: The next benchmark is the following system of two nonlinear equations:
This nonlinear system has the trivial solution.
Table 5 illustrates the comparison between the different optimization algorithms for the given system. Compared with the other algorithms, the original SSO and HHO achieved excellent results, with fitness values of 5.36 × 10⁻¹⁵ and 6.92 × 10⁻¹⁴, respectively. However, MSSO outperformed both of them and delivered the exact solution for the given system.
Problem 6: The sixth system considered for the comparison is an interval arithmetic benchmark [
53] given by the following system of ten equations:
In this benchmark, MSSO has proven its efficiency.
Table 6 clearly shows the significant differences between MSSO and the other compared algorithms. MSSO achieved the best solution with a fitness value of 5.21 × 10⁻¹⁷, while all the other algorithms produced solutions far from the exact answer. Comparing the fitness values of the hybrid MSSO and the original SSO shows how substantially the modification of the original SSO’s local search mechanism improves the hybrid MSSO.
Problem 7: Consider a combustion chemistry problem for a temperature of 3000 °C [
21], which can be described by the following nonlinear system of equations:
In
Table 7, the comparison for this system shows that MSSO has the least fitness value of 7.09 × 10⁻²¹, while PSO and EO have fitness values of 2.85 × 10⁻⁹ and 3.45 × 10⁻⁸, respectively.
Problem 8: The last benchmark is an application from neurophysiology [
52], described by the nonlinear system of six equations:
There is more than one exact solution to this system.
Table 8 shows that the proposed MSSO algorithm achieved the most accurate solution, with a fitness value of 1.18 × 10⁻²⁴, and the PSO algorithm took second place with a fitness value of 5.26 × 10⁻⁷. In contrast, the rest of the algorithms recorded answers that differ significantly from the exact solutions. Further, the NESs in Problems 6–8 demonstrate the flexibility of the proposed hybrid MSSO, as it remains efficient even over the wide search interval [−10, 10].
The comparison results on all benchmarks confirm the hypothesis stated in the first section: the hybridization of two algorithms inherits the merits of both (SSO and Newton’s method). This can be seen in the comparison between MSSO and the original SSO, where MSSO outperformed the original SSO on all selected benchmarks. The reason for this remarkable performance is the use of Newton’s method as a local search, which strengthens the hybrid’s ability to escape local optima in Problems 1–5 (where MSSO obtained the exact solution) and significantly improves the fitness values obtained in Problems 6–8. The comparisons indicate that, unlike the majority of the other algorithms, the proposed hybrid MSSO avoided being trapped in local optima in all problems.
6. Conclusions
In this work, a hybrid method known as MSSO was introduced for solving systems of nonlinear equations, using Newton’s iterative method as a local search within the Sperm Swarm Optimization (SSO) algorithm. The main goal of MSSO is to overcome Newton’s method’s dependence on the initial guess; this results in a better selection of initial points and enables the method to be applied to a wider variety of real-world applications. Moreover, using Newton’s scheme as a local search in MSSO improved the accuracy of the tested solutions and substantially increased the convergence speed.
Eight nonlinear systems of varying orders were utilized to illustrate the effectiveness of the proposed MSSO. The novel MSSO was also compared to six well-known optimization methods, namely the original SSO, BOA, ALO, EO, HHO, and PSO. The Euclidean norm was utilized as the fitness function in all benchmarks. According to the results, MSSO outperforms all the compared algorithms in four metrics: fitness value, solution accuracy, stability, and speed of convergence. The consistency of MSSO was confirmed by running each method thirty times, and the standard deviation showed that MSSO was the most stable optimization algorithm.
Additionally, we compared the performance of MSSO and Newton’s method on four problems from the benchmarks. Across all four problems, MSSO outperformed Newton’s method. MSSO also overcomes some of Newton’s scheme’s limitations, such as divergence and the selection of initial guesses.
Future work can address some related issues, such as how the suggested method performs on common optimization benchmarks. Future research will also focus on solving nonlinear equations arising from real-world applications, such as Burgers’ equation, and on the efficiency of the proposed algorithm when solving large systems. Finally, replacing Newton’s method with a derivative-free iterative method, which would reduce the computational cost of the evaluations Newton’s method requires in each iteration, is an interesting topic for future study.