1. Introduction
Computer-Aided Geometric Design (CAGD for short) [1] takes the representation, drawing, display, analysis, and processing of product geometric shape information as its core research content, and it occupies an important position in manufacturing, medical diagnosis, artificial intelligence, computer vision, and other fields. Product geometry design is the focus of CAGD research, and free-form curves and surfaces are an important tool for describing product geometry. From the Ferguson method [2] to Bézier [3,4], B-spline [5,6], NURBS [7,8,9], and other methods, the representation of free-form curves and surfaces has gone through different stages of development driven by industrial software technology. The Ball method is also one of the commonly used representations today. It was proposed in 1974 as the mathematical basis for the former British Airways CONSURF fuselage surface modeling system [10,11,12]. Subsequent research by several scholars has led to the emergence of a variety of generalized forms such as the Said–Ball curve [13,14,15], the Wang–Ball curve [16], the generalized Ball curves of the Wang–Said type, and the generalized Ball curves of the Said–Bézier type [17]. The Wang–Ball curve not only has good properties such as stability, symmetry, endpoint interpolation, and geometric invariance but also significantly outperforms the Said–Ball and Bézier curves in terms of degree elevation, degree reduction, and recursive evaluation [15].
With the rapid development of the geometric modeling industry, the requirements for the accuracy, surface quality, and overall smoothness of free-form curves and surfaces are becoming increasingly stringent. The inaccuracies and limited precision caused by the floating-point environment in the modeling of curves and surfaces are major causes of the lack of robustness in solid modeling. Interval analysis [18], a tool that enhances the stability of algorithms by accounting for errors, has been gradually introduced into the fields of geometric modeling, computer graphics, and Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM) since the 1980s [19,20,21]. Subsequently, the concepts of interval Bézier curves, interval Ball curves, and interval B-spline curves were proposed. Unlike conventional curves, interval curves are constructed with rectangles instead of control vertices represented by real numbers. However, the interval method has the disadvantage of an expanding error domain under rotational transformation. To address this, Lin et al. [22] put forward disk Bézier curves in combination with disk arithmetic. A disk curve uses disks to represent the control vertices. Compared with interval curves, disk curves have the following advantages: (1) the shape of a disk curve remains unchanged under affine transformation; (2) in terms of data storage, the interval method requires eight data records whereas the disk requires only two, which reduces the amount of data. Since then, research on disk curves has advanced rapidly. Chen [23] studied the degree reduction problem of disk Bézier curves using both linear programming and optimization methods. In 2018, Ao et al. [24] proposed a high-precision intersection algorithm for disk B-spline curves and made this curve flexible for stroke representation. Seah et al. [25] applied disk B-spline curves to artistic brush strokes and 2D animation, and Hu et al. [26] constructed the disk Wang–Ball (DWB) curve and investigated its degree reduction problem.
The rapid development of the graphics industry and the manufacturing sector is accompanied by constant updating of geometric modeling systems, which has increased the exchange, integration, and sharing of geometric description data between different systems. Approximate merging [27] is an approximate transformation technique proposed by Hoschek in 1987, in which a curve consisting of multiple lower-degree curve segments is approximated by a single higher-degree curve. Approximate merging reduces the amount of data transferred during product design and development, thus enabling efficient data transfer and exchange between different systems. In 2001, Hu et al. [28] presented a method to merge two adjacent Bézier curves using control vertex perturbations and least squares and showed that the merging error could be reduced if the original curves were first raised to higher degrees before approximate merging. This method is simple, intuitive, and practical, so Tai et al. [29] used a similar method to solve the exact merging of B-spline curves and suggested a knot adjustment technique for tuning the end knots of a kth-order curve without changing the shape of the curve. Subsequently, Cheng et al. [30] gave a uniform matrix representation for exact merging by minimizing the distance between curves. Zhu and Wang [31] used the L2 distance to measure the approximation error of the curves before and after merging and solved the optimal merging problem for Bézier curves with G2 continuity. Lu et al. [32] also minimized the L2 distance and obtained the merged curve of two Bézier curves with G3 continuity in explicit form.
It is clear from the above research that most approximate merging work is confined to conventional curves and does not involve interval or disk curves. Therefore, the questions of how to extend the approximate merging method to interval and disk curves, how to estimate the merging error of interval and disk curves, and whether the approximate merging method of traditional curves can be directly extended to interval curves all need to be addressed. The main objective of this paper is to investigate an approximation method for merging disk Wang–Ball curves with improved robustness and accuracy. The merging error is one of the criteria for judging the effectiveness of curve merging, so we take the minimization of the merging error as the objective in establishing an approximate merging model for disk curves. To obtain the optimal coordinates of the merged curves, we invoke meta-heuristic optimization algorithms.
Meta-heuristic algorithms have the advantages of fast convergence, strong search power, and high solution accuracy, and they play an increasingly important role in optimization problems. Particle swarm optimization (PSO) [33] has appeared in various improved versions since its introduction in 1995, such as multi-objective particle swarm optimization [34] and adaptive particle swarm optimization [35], and it has been successfully applied in many fields such as image classification [36], path planning [37], and biomedicine [38]. The marine predators algorithm (MPA) [39] is inspired by the movements of ocean predators and prey, and its combination with problems such as shape optimization [40], image segmentation [41], wind power prediction [42], and the 0–1 knapsack problem [43] has achieved high-quality optimization results. A variety of other algorithms and enhanced variants have been developed from different inspirations, such as the chimp optimization algorithm [44], the nutcracker optimizer [45], enhanced black widow optimization (QIWBWO) [46], the snow ablation optimizer (SAO) [47], the multi-strategy enhanced chameleon swarm algorithm (MCSA) [48], and the multi-objective artificial hummingbird algorithm (MOAHA) [49].
The snake optimizer (SO) [50] is inspired by the unique lifestyle of snakes and has been successfully applied to Hammerstein adaptive filters [51] and power-aware task schedulers for wearable biomedical systems [52]. As SO is prone to falling into local optima and shows inadequate optimization capability on some problems, enhanced versions have been proposed successively to achieve better results. A multi-strategy fused snake optimizer (MFISO) was developed by Fu et al. for deep-learning prediction models of gas outburst in underground mines [53]. Rawa investigated a hybrid of SO and the sine cosine algorithm (SCA), called SO-SCA, using both parallel and tandem mechanisms and applied it to transmission expansion planning models [54]. Khurma et al. not only generated a binary version called BSO based on an S-shaped transfer function but also integrated a new evolutionary greedy crossover operator with SO to propose the BSO-CV algorithm for medical classification problems [55]. The multi-strategy enhanced snake optimizer (BEESO) [56] is an improved algorithm proposed by Hu et al. that introduces bi-directional search, modified evolutionary population dynamics, and an elite opposition-based learning strategy into SO. The experimental results in the literature demonstrate that it possesses highly competitive search capability and convergence speed compared with a variety of sophisticated algorithms. Therefore, BEESO will be used to solve the approximate merging problem for DWB curves, and the main contributions are summarized as follows:
- (1)
We discuss the approximate merging problem of DWB curves and establish an approximate merging optimization model with the merging error as the objective;
- (2)
We propose an approximate merging method of DWB curves based on BEESO and demonstrate the optimization capability of BEESO with numerical examples.
The remainder of this paper is structured as follows: Section 2 presents the definition of DWB curves, along with a discussion of the approximate merging of adjacent DWB curves and the specific optimization models; Section 3 introduces BEESO, proposes an approximate merging method of DWB curves based on BEESO, and gives three numerical examples; Section 4 concludes the paper.
3. DWB Curves Merging Based on BEESO
3.1. BEESO
BEESO [56] is an improved algorithm that integrates three strategies into SO to address its shortcomings and improve its optimization capability. The update phase of BEESO includes exploration, exploitation, and mutation operations. The initial population is generated using Equation (13), which gives the ith individual in the population.
where r is a random number in [0, 1], LB stands for the lower bound, and UB stands for the upper bound.
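As a concrete illustration, the initialization can be sketched in a few lines. This is a minimal sketch assuming the standard snake-optimizer form X = LB + r(UB − LB); the function name, dimensions, and bounds below are illustrative rather than taken from the source:

```python
import numpy as np

def initialize_population(n_individuals, dim, lb, ub, rng=None):
    """Draw each individual uniformly in [lb, ub]: X_i = lb + r * (ub - lb)."""
    rng = np.random.default_rng(rng)
    r = rng.random((n_individuals, dim))  # r uniform in [0, 1)
    return lb + r * (ub - lb)

# Split into 50% male / 50% female sub-groups, as described in the text.
pop = initialize_population(50, 4, lb=-5.0, ub=5.0, rng=0)
males, females = pop[:25], pop[25:]
```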
The population is composed of 50% females and 50% males and is randomly split into these two sub-groups before the start of the iteration, as follows:
where Nm and Nf represent the male and female subgroups, respectively.
The food quality P and temperature Temp that control the algorithmic process are calculated with the following formulas:
where k and K represent the current and maximum number of iterations, respectively, and s1 = 0.5.
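A minimal sketch of these two control quantities follows, assuming the exponential forms used in the original snake optimizer (Temp decays from 1 toward 0 over the run, and P grows toward s1 = 0.5); the function names are illustrative:

```python
import math

def temperature(k, K):
    """Environment temperature Temp: decays exponentially with iteration k (assumed SO form)."""
    return math.exp(-k / K)

def food_quality(k, K, s1=0.5):
    """Food quality P: approaches s1 as k approaches the maximum K (assumed SO form)."""
    return s1 * math.exp((k - K) / K)
```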
3.1.1. Exploration Phase
If the food quality P < Threshold, the food is of low quality, and the population needs to search for better food within the feasible range; this indicates that BEESO enters the exploration phase. The exploration phase consists of a random search and a bi-directional search; BEESO uses these two search methods to update the positions of individuals and selects the better ones for the next stage by comparing their fitness values. For the random search, the random update of each individual's position is formulated as follows:
where the two formulas update male and female individuals, respectively, AM and AF are the snakes' abilities to find food, r is a random number, and s2 = 0.5.
where f denotes the fitness value.
The bi-directional search makes use of the best and worst individuals to guide BEESO toward the optimum while enlarging the search area, which effectively mitigates the drawbacks of the random search, such as low randomness, high uncertainty, and a narrow search range. The mathematical approach is formulated as
where the guiding positions are the best and worst individuals, respectively, and r1 and r2 are uniformly generated random numbers.
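The bi-directional search can be sketched as below. Since Equations (21) and (22) are not reproduced here, this is only a plausible reading of the description, in which the best individual attracts and the worst individual repels each candidate; the function and variable names are illustrative:

```python
import numpy as np

def bidirectional_search(pop, fitness, rng=None):
    """Move each individual toward the best and away from the worst (illustrative form)."""
    rng = np.random.default_rng(rng)
    best = pop[np.argmin(fitness)]   # minimization: smaller merging error is better
    worst = pop[np.argmax(fitness)]
    r1 = rng.random(pop.shape)       # per-component uniform random numbers
    r2 = rng.random(pop.shape)
    return pop + r1 * (best - pop) - r2 * (worst - pop)
```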
3.1.2. Exploitation Phase
Food of higher quality indicates that BEESO enters the exploitation phase. When the temperature Temp > 0.6, the environment is hot, and the snake moves toward the food, as described by the following equation:
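The movement toward food can be sketched as follows, assuming the food position is the best solution found so far and the step size scales with Temp; the constant c3 = 2 and the random sign are assumptions carried over from the original snake optimizer, not stated above:

```python
import numpy as np

def move_to_food(pop, food, temp, c3=2.0, rng=None):
    """Hot environment (Temp > 0.6): each snake steps toward the food position."""
    rng = np.random.default_rng(rng)
    sign = rng.choice([-1.0, 1.0], size=pop.shape)  # random +/- for diversification
    return food + sign * c3 * temp * rng.random(pop.shape) * (food - pop)
```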
Mating behavior occurs when the temperature is suitable, and it takes place in either fighting mode or mating mode. In fighting mode, each female mates with the best male, and the males fight to obtain the best females, as in the following equations:
where FM and FF are the fighting abilities, calculated using the following formulas:
In mating mode, mating between each pair of individuals occurs, which is mathematically modeled as
where MM and MF stand for the mating abilities:
Eggs may be produced after mating. If the snake eggs hatch, modified evolutionary population dynamics (MEPD) is applied to the current parents to improve population quality by eliminating the poorer individuals and mutating the better ones. New offspring are first generated for the bottom 50% of individuals using Equations (32) and (33):
where i = 1, 2, ..., N/2.
The mutation operation is applied to the top 50% of individuals, as follows:
where p1, p2, and p3 are random integers in [1, N] with p1 ≠ p2 ≠ p3 ≠ i, and F is the scaling factor:
where freq is the vibration frequency of the sine function.
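The mutation of the top half can be sketched as a differential-evolution-style step with distinct indices p1 ≠ p2 ≠ p3 ≠ i. The sine-modulated scaling factor F below is an assumed form modeled on the description, not the exact Equations (34)–(36), and all names are illustrative:

```python
import math
import numpy as np

def mutate_top_half(pop, k, K, freq=0.25, rng=None):
    """DE/rand/1-style mutation of the top 50% with a sine-modulated F (illustrative)."""
    rng = np.random.default_rng(rng)
    n = len(pop)
    out = pop.copy()
    F = 0.5 * (math.sin(2.0 * math.pi * freq * k / K) + 1.0)  # assumed form of F
    for i in range(n // 2):  # top 50% (population assumed sorted best-first)
        # three distinct indices, all different from i
        p1, p2, p3 = rng.choice([j for j in range(n) if j != i], size=3, replace=False)
        out[i] = pop[p1] + F * (pop[p2] - pop[p3])
    return out
```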
3.1.3. Elite Opposition-Based Learning Strategy
The tendency to fall into local optima is a common problem with optimization algorithms. The elite opposition-based learning strategy is meant to optimize the better-performing individuals in the population, so that the algorithm approximates the global optimum with a higher probability.
stands for elite individuals, who rank in the top
EN of the population. The elite opposite solution of the current individual is calculated below:
where
EN = 0.1 ×
N, and
and
are the minimum and maximum values of elite individuals.
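A common formulation of elite opposition-based learning, which the sketch below assumes, reflects each individual across the dynamic bounds [da, db] of the elite set with a fresh random number per component; the names are illustrative:

```python
import numpy as np

def elite_opposition(pop, fitness, en_ratio=0.1, rng=None):
    """Reflect individuals across the elite set's dynamic bounds (EN = 0.1 * N)."""
    rng = np.random.default_rng(rng)
    n = len(pop)
    en = max(1, int(en_ratio * n))              # size of the elite set
    elite = pop[np.argsort(fitness)[:en]]       # top-EN individuals (minimization)
    da, db = elite.min(axis=0), elite.max(axis=0)
    r = rng.random(pop.shape)
    return r * (da + db) - pop                  # elite opposite solutions
```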
3.2. Steps for Solving the Approximate Merger Models by BEESO
In
Section 2, two optimization models, endpoint-preserving merging and non-endpoint-preserving merging, are established with the merging error of the curves before and after merging as the objective function. In this section, the BEESO will be used to solve the approximate merged optimized model for DWB curves, and the implementation steps are as follows:
Step one: Set the parameters of the algorithm;
Step two: Enter the coordinates and radii of the DWB curves to be merged, and calculate the subdivision parameters;
Step three: Initialization. Calculate the initial population according to Equation (13), and divide it into female and male sub-populations. Use the approximate merging model of Equation (9) (or Equation (10)) as the objective function;
Step four: Judge the food quality. If P < 0.25, execute the random search Equations (17)–(20) and the bi-directional search Equations (21) and (22) of the exploration phase to generate new individuals; calculate the fitness values of the individuals, and select the better individuals to proceed to the next phase. Otherwise, proceed to the exploitation phase;
Step five: The exploitation phase starts by calculating the temperature Temp. If Temp > 0.6, each individual moves closer to the food as in Equation (23); otherwise, it enters fight mode or mating mode. When r > 0.6, the individual is in fight mode and performs Equations (24)–(27); otherwise, it operates in mating mode;
Step six: Mating mode performs Equations (28)–(31) on individuals, followed by consideration of whether the eggs hatch after mating. If the eggs hatch, MEPD (Equations (32)–(36)) is used to produce the individuals that move on to the next stage;
Step seven: Find the elite individuals, and update the population again by Equations (37)–(39);
Step eight: Calculate the fitness values, and determine the optimal individual;
Step nine: Check the termination condition of the algorithm. Let k = k + 1. If k < K, return to step four; otherwise, output the minimum merging error and the coordinates of the control disks of the merged curve.
A detailed flowchart of the approximate merging procedure is shown in Figure 1.
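The nine steps can be condensed into a driver loop. The sketch below is a schematic of the control flow only: the exploration and exploitation updates are reduced to simple stand-ins rather than Equations (17)–(39), and the objective function stands in for the merging-error models of Equations (9) and (10); all names and constants are illustrative:

```python
import numpy as np

def beeso_merge(objective, dim, lb, ub, n=50, K=200, threshold=0.25, seed=0):
    """Skeleton of the BEESO-based merging driver (steps three to nine)."""
    rng = np.random.default_rng(seed)

    def explore(pop):                            # stand-in for Eqs. (17)-(22)
        return pop + rng.normal(0.0, 0.5, pop.shape)

    def exploit(pop, best, temp):                # stand-in for Eqs. (23)-(36)
        return best + temp * rng.random(pop.shape) * (best - pop)

    pop = lb + rng.random((n, dim)) * (ub - lb)  # step three: initialization
    fit = np.apply_along_axis(objective, 1, pop)
    best = pop[np.argmin(fit)].copy()
    best_f = float(fit.min())
    for k in range(1, K + 1):                    # step nine: iterate until k = K
        P = 0.5 * np.exp((k - K) / K)            # food quality (assumed SO form)
        temp = np.exp(-k / K)                    # temperature (assumed SO form)
        cand = explore(pop) if P < threshold else exploit(pop, best, temp)
        cand = np.clip(cand, lb, ub)
        cand_fit = np.apply_along_axis(objective, 1, cand)
        improved = cand_fit < fit                # greedy selection of the better
        pop[improved] = cand[improved]
        fit[improved] = cand_fit[improved]
        if fit.min() < best_f:                   # step eight: track the optimum
            best_f = float(fit.min())
            best = pop[np.argmin(fit)].copy()
    return best, best_f
```

In the real solver, the decision variables would be the control disk coordinates (and radii) of the merged DWB curve, and `objective` would evaluate the merging error of the model chosen in step three.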
3.3. Optimization Examples
To verify the feasibility of the algorithm, three numerical examples are given in this section. Each example is optimized in both the endpoint-preserving and non-endpoint-preserving cases. The population size and number of iterations in all experiments are 50 and 500, respectively. In addition to BEESO, several advanced algorithms are selected for comparison, namely CSA [48], SCA [57], the grey wolf optimizer (GWO) [58], the white shark optimizer (WSO) [59], the Coot optimization algorithm (COOT) [60], the sooty tern optimization algorithm (STOA) [61], and Harris hawks optimization (HHO) [62].
Example 1. Two adjacent DWB curves are given; their control vertex coordinates and control radii are shown in Table 1, and the resulting DWB curve shapes are shown in Figure 2. In this example, the two curves, of degrees 3 and 4, will be merged into a single 6-degree DWB curve according to the model built in Section 2.2.2.
Table 2 and Table 3 show the optimal results obtained by the eight intelligent algorithms, including BEESO and SCA, for the non-endpoint-preserving and endpoint-preserving cases, respectively, including the control disk coordinates of the merged curves and the merging errors. Comparisons of the curves before and after merging are shown in Figure 3 and Figure 4, where the merged DWB curve is shown in green. The change in the shape of the curves before and after merging can be observed in the figures; for example, the left segment of the curve obtained by CSA shows a large error. In addition, the convergence of the optimization process is shown in Figure 3i and Figure 4i.
Overall, the optimized "C"-shaped curve fits the original curve well. For the non-endpoint-preserving case, SCA and STOA obtain curves whose widths differ significantly from the original curve, and their errors are accordingly among the largest of all the algorithms. In the endpoint-preserving case, STOA also shows a larger deviation in its results and a slower convergence rate during optimization. Although the remaining algorithms produce similar results, BEESO performs better in terms of the fit and width of the two curves before and after merging. BEESO achieves the best results among the eight algorithms, in terms of both the error and the convergence speed.
Example 2. The purpose of this example is to approximately merge two adjacent 4-degree DWB curves into one 5-degree curve. The control disks of the two curves before merging are shown in Table 4, and the visualized curve shapes are shown in Figure 5. Table 5 and Table 6, respectively, give the optimization results of the eight algorithms for the non-endpoint-preserving and endpoint-preserving cases, including the optimal control vertices and the merging errors. Figure 6 and Figure 7 show the optimization results for the two adjacent 4-degree DWB curves merged into one 5-degree DWB curve under the two constraints, respectively.
In the non-endpoint-preserving case, there are large differences among the optimization results of the eight algorithms. For example, the visual difference between the merged curves obtained by HHO, GWO, and CSA and the original curve is significant, which is also evident from the errors in Table 5. In the endpoint-preserving case, BEESO, CSA, SO, WSO, and COOT obtain similar plots, whereas the optimized curve from GWO differs considerably in width from the pre-merged curve. Combining the error data in the tables, BEESO obtains the minimum error under both constraints.
Example 3. A 3-degree curve and a 4-degree curve are given in Table 7, and the two curves are constructed as shown in Figure 8. This example discusses the problem of merging the 3- and 4-degree curves based on the intelligent algorithms in both the non-endpoint-preserving and endpoint-preserving cases. The results obtained by the eight algorithms, including BEESO and CSA, are shown in Table 8 and Table 9. In addition, the merging results obtained by visualizing the control disks are shown in Figure 9 and Figure 10.
In the non-endpoint-preserving case, the approximate merging results of BEESO, COOT, and SO in Figure 9 are relatively good, with BEESO having the smallest merging error of all the algorithms. The other algorithms all leave considerable room for improvement, especially GWO, HHO, and STOA, which show large errors in both the position and the width of the curve relative to the curves before merging. Owing to the low dimensionality of the optimization variables in the endpoint-preserving case, the results of all seven algorithms other than STOA are close, with only small gaps in the reported merging errors. For BEESO, SO, and WSO, there is no visible gap when the errors are rounded to three decimal places; with more digits retained, the errors are 0.14669880, 0.14669887, and 0.14669885 for BEESO, SO, and WSO, respectively, so the error of BEESO is the smallest of the three.