2.1. Local Optimization and Fusion of the Texture Overlapping Area of Remote Sensing Images
In order to optimize and fuse the local features of the texture overlapping area of remote sensing images and to prevent distortion during stitching, visual communication technology is used to perform finite element analysis on the remote sensing images [14]. The finite element meshes are transformed into similar textures, and the texture overlapping area of the transformed remote sensing images is locally distorted. Taking a triangular remote sensing image to be stitched as an example, we set three transformation points, and the relationship between one vertex and the other two points is as follows:
where the coordinates, in the stitching texture coordinate system of the remote sensing image, from the chosen vertex to the other two transformation points are known. If the local remote sensing image texture to be optimized and the similarly transformed image are not matched and transformed, their overlapping position cannot be calculated; the similar texture transformation formula is then:
where the weighting term denotes the remote sensing image transformation weight. By calculating the local distortion coefficient of the similar texture transformation in the finite element grid [15], the local distortion energy formula of the texture overlapping area of the transformed remote sensing image is obtained.
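The triangle relationship and distortion energy above can be sketched as follows, assuming the vertex is expressed in the local frame spanned by the opposite edge and its 90-degree rotation; the function names and the toy triangles are illustrative, not from the paper:

```python
import numpy as np

def similarity_coords(p1, p2, p3):
    """Express p3 in the local (u, v) frame spanned by the edge p1->p2
    and its 90-degree rotation, as under a similarity transform."""
    e = p2 - p1                      # edge vector
    r = np.array([-e[1], e[0]])      # edge rotated by 90 degrees
    # Solve p3 - p1 = u*e + v*r for (u, v)
    u, v = np.linalg.solve(np.column_stack([e, r]), p3 - p1)
    return u, v

def distortion_energy(tri_src, tri_dst):
    """Squared deviation of the transformed third vertex from the
    position predicted by the source triangle's similarity coords."""
    u, v = similarity_coords(*tri_src)
    q1, q2, q3 = tri_dst
    e = q2 - q1
    r = np.array([-e[1], e[0]])
    predicted = q1 + u * e + v * r
    return float(np.sum((q3 - predicted) ** 2))

src = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])]
# A pure rotation/scale/translation of the triangle should cost nothing.
dst = [2 * p + 1 for p in src]
print(distortion_energy(src, dst))
```

A similarity transform (rotation, uniform scale, translation) preserves the (u, v) coordinates, so the energy above vanishes exactly when the deformation is locally similar.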
When visual communication technology is used to optimize and fuse parts of the texture overlapping area of remote sensing images [16], only the texture of the overlapping area of a single grid that needs optimization is locally similarity-transformed, such as the remote sensing image that needs to be locally optimized and fused. During the local similarity transformation, it is necessary to refer to the texture attributes of the overlapping area of the other image, to optimize the transformation of each coordinate point in the horizontal and vertical coordinate systems, and to apply local distortion constraints to the optimized, transformed grid links so that the fusion degree approaches zero. At this point, the texture overlapping areas of the two remote sensing images have the same local optimization. The least squares method is used to calculate the optimal local fusion solution [17], and the horizontal constraint term of the remote sensing image texture is obtained:
where the constraint term is the sum over the remote sensing image texture grid block to be stitched and the adjacent grid block that needs to be fused. Knowing the vertices in the remote sensing image, the following formula is used to obtain the vertical constraint term of the remote sensing image texture:
where the constraint term is the sum over the remote sensing image texture grid block to be stitched and the lower grid block that needs to be stitched. Knowing the vertices 8, 6 and 10 in the remote sensing image, the following formula is used to obtain the diagonal constraint term of the remote sensing image texture:
where the constraint term is the product of the remote sensing image texture grid block that needs to be fused and the connected grid block to be stitched; the vertices 11, 6 and 15 in the remote sensing image are known.
The energy formula of the remote sensing image texture constraint term and the grid energy formula are combined [18] to realize the local optimization of the remote sensing image texture:
where the two coefficients are weighted thresholds. After fusing the locally optimized remote sensing images, the stitched image of the texture overlapping area of the remote sensing images is finally obtained.
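The mixing of the two energy terms under weighted thresholds can be sketched as a stacked least-squares problem; the matrices, right-hand sides, and weight values below are toy assumptions:

```python
import numpy as np

# Two constraint systems A1 x = b1 (texture constraint term) and
# A2 x = b2 (grid energy term), mixed with weights lam1, lam2 and
# solved jointly by least squares.
rng = np.random.default_rng(0)
A1, b1 = rng.normal(size=(6, 3)), rng.normal(size=6)
A2, b2 = rng.normal(size=(4, 3)), rng.normal(size=4)
lam1, lam2 = 1.0, 0.5   # weighted thresholds (assumed values)

# Stack the weighted systems and minimize the combined residual.
A = np.vstack([lam1 * A1, lam2 * A2])
b = np.concatenate([lam1 * b1, lam2 * b2])
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x.shape)  # (3,)
```

Minimizing the stacked residual is equivalent to minimizing the weighted sum of the two energies, which is the usual way such mixed constraint terms are solved.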
By matching the feature points of remote sensing images, block link constraints and shape distortion constraints are introduced to obtain the corrected stitched image. The remote sensing image fusion method is used to obtain a smooth stitched image of the texture seams of remote sensing images, which optimizes the texture of the local overlapping area of remote sensing images.
2.2. Extraction of the Optimal Stitching Line of the Texture Seams of Remote Sensing Images
The main direction line of the texture overlapping area of the remote sensing images is divided into a vertical direction line and a horizontal direction line. Given the initial remote sensing image and the target image, the main direction line of the overlapping area is determined. The principle of gradient structure information [19] is used to calculate the gradient structure information in the horizontal and vertical directions, namely:
According to the above formula, the main direction of the texture seams of the remote sensing images can be determined, as shown in Figure 1.
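A minimal sketch of the horizontal/vertical comparison, assuming the main direction of the seam is taken orthogonal to the dominant gradient direction; the simple gradient-energy criterion here is an assumption, not the paper's exact formula:

```python
import numpy as np

def main_direction(image):
    """Compare horizontal vs vertical gradient energy of the overlap
    to pick the main direction of the seam (simplified criterion)."""
    gy, gx = np.gradient(image.astype(float))  # axis 0 (rows), axis 1 (cols)
    if np.sum(gx ** 2) >= np.sum(gy ** 2):
        return "vertical"    # strong horizontal gradients -> vertical seam
    return "horizontal"

# A vertical step edge: all gradients are horizontal, so the seam
# should run vertically along the edge.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
print(main_direction(img))  # vertical
```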
By calculating a set of initial points and then using them as starting points, the optimal stitching line is selected from the smooth stitching lines of the final multiple texture seams of the remote sensing images, thereby saving calculation time [20]. Taking Figure 1 as an example, we select five candidate initial points on NS, where the coordinates of point N, point U, and point S are known. Because the ordinate values of the candidate initial points on NS are the same, the line from a candidate point to the intersection point of UR and UT can be formed, and its gradient error can be calculated by:
where one term represents the smoothness of the gradient and the other represents the similarity.
Using the error calculation formula [21] that combines the gradient smoothness and the similarity, the gradient error value of the texture seams of remote sensing images is calculated. The following formula is used to divide the five candidate initial points on NS into five groups:
The smallest point in each group is marked, the number of smooth stitching lines of the texture seams of the remote sensing images is recorded, and five initial points are calculated.
Among the five calculated smooth stitching lines of the texture seams of remote sensing images, the one with the minimum cost is the optimal stitching line [22]. Assuming that the coordinate of the current point is known, one of the three candidate points is selected as the next growth point.
According to the calculated cumulative gradient error, the iteration point is determined. Let the cumulative gradient errors of the three candidate points be given; then, the error calculation formula for each candidate point is:
where the number of neighboring points of the candidate initial point is given, and the cumulative term is composed of the point and its six adjacent points. With the coordinate of the point known, the point with the smallest cumulative value can be selected according to the above formula. It also follows from the above formula that, when the accumulated values of two points are equal, the candidates are sorted by priority.
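The growth rule above, in which each point moves to one of three candidate neighbors with the smallest cumulative gradient error, can be sketched as a small dynamic program; the 3x3 cost grid is a toy example:

```python
import numpy as np

def grow_seam(cost):
    """Seam growth by cumulative error: from each pixel, the next row's
    candidate is one of three neighbors (down-left, down, down-right)
    with the smallest cumulative cost; backtrack from the minimum."""
    h, w = cost.shape
    acc = cost.copy()
    for y in range(1, h):
        for x in range(w):
            lo, hi = max(0, x - 1), min(w, x + 2)
            acc[y, x] += acc[y - 1, lo:hi].min()
    # Backtrack from the smallest cumulative value in the last row.
    seam = [int(np.argmin(acc[-1]))]
    for y in range(h - 2, -1, -1):
        x = seam[-1]
        lo, hi = max(0, x - 1), min(w, x + 2)
        seam.append(lo + int(np.argmin(acc[y, lo:hi])))
    return seam[::-1], float(acc[-1].min())

cost = np.array([[1., 9., 9.],
                 [9., 1., 9.],
                 [9., 9., 1.]])
seam, total = grow_seam(cost)
print(seam, total)  # [0, 1, 2] 3.0
```

The dynamic program guarantees the globally minimal cumulative error over all seams restricted to the three-neighbor growth rule, which is why the diagonal of low-cost cells is found here.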
Therefore, the conditions for selecting the candidate points of the current point are deduced and expressed as:
According to the five obtained smooth stitching lines of the texture seams of remote sensing images, suppose that the mth smooth stitching line contains a given number of points and that the coordinate of the jth point is known; then, the mean value of the minimum cumulative error of the smooth stitching lines can be calculated by:
The smooth stitching line with the smallest mean value of minimum cumulative error calculated by the above formula is the optimal stitching line of the texture seams of remote sensing images; it passes smoothly through the remote sensing image and optimizes the removal effect of the smooth stitching line of the texture seams [23].
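Selecting the line with the smallest mean cumulative error can be sketched as follows; the candidate lines and their per-point errors are made-up values:

```python
# Five candidate stitching lines would each carry a list of per-point
# cumulative errors; pick the line whose mean error is smallest
# (three toy lines shown for brevity).
lines = {
    "l1": [3.0, 2.0, 4.0],
    "l2": [1.0, 1.5, 2.0],
    "l3": [5.0, 0.5, 3.5],
}
best = min(lines, key=lambda k: sum(lines[k]) / len(lines[k]))
print(best)  # l2
```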
By calculating the gradient structure information of the texture seams of remote sensing images in the horizontal and vertical directions, the main direction of the texture seams is determined, the initial points are selected, the mean value of the minimum cumulative error of each smooth stitching line is computed, and finally the optimal stitching line is extracted.
2.3. Design of a Smooth Stitching Algorithm for the Texture Seams of Remote Sensing Images
Suppose that the texture feature point cluster after the local optimization of the remote sensing image texture overlapping area is given, with some feature points lying on the first remote sensing image texture and the others on the second. Then, the distance from any feature point to a given feature point of the remote sensing image texture, and the local texture feature vectors of the two point sets, can be calculated as follows:
We then calculate the relationship function between two local feature vectors of remote sensing image textures:
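A minimal sketch of one possible local feature vector and relationship function, assuming sorted Euclidean distances as the descriptor and normalized correlation as the relationship; both are assumptions, since the exact formulas are not reproduced here:

```python
import numpy as np

def local_feature_vector(p, points):
    """Distances from feature point p to every other texture feature
    point, sorted: a simple, rotation-invariant local descriptor."""
    d = np.linalg.norm(points - p, axis=1)
    return np.sort(d[d > 0])

def relationship(v1, v2):
    """Normalized correlation between two local feature vectors,
    truncated to a common length."""
    n = min(len(v1), len(v2))
    a, b = v1[:n], v2[:n]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

pts = np.array([[0., 0.], [1., 0.], [0., 1.], [2., 2.]])
v = local_feature_vector(pts[0], pts)
print(relationship(v, v))  # close to 1.0 for identical vectors
```

A relationship value near 1 indicates highly similar local neighborhoods, which is consistent with the thresholding step described next.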
Suppose that the matching threshold for the remote sensing image texture is given; texture feature points whose relationship value exceeds this threshold have better similarity under the transformation, so such a point is the best texture stitching point. In the process of stitching remote sensing image textures, texture feature points are the core elements of stitching [24]. Through the spatial summation method, the distribution of each feature point in the remote sensing image texture is described, and the stitching of the remote sensing image textures is completed through the fusion of the texture feature points. However, this method does not consider the sequence relationship, so its result is not accurate enough [25]. To address the problem that remote sensing images cannot achieve smooth stitching of texture seams, a new texture stitching method is proposed. This method uses a feature transformation reference point and rebuilds a new texture feature point matrix according to the order relationship of the remote sensing image textures [26].
The new texture stitching method has the same principle as the initial processing method, and the coordinates of the reference point for the feature transformation are obtained:
According to the natural texture relationship among the remote sensing image texture feature points (Equation (14)), the feature point set is rearranged into a matrix:
The above formula gives a new remote sensing image texture description. Compared with the initial description, the new method has a more accurate sequence relationship and can present all the information points of the remote sensing image texture without performing multiple segmentation calculations on a 2D image, which improves the stitching efficiency [27].
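Rebuilding the feature points into a matrix that preserves an order relationship might be sketched as follows; the scan order (sort by ordinate, then abscissa) and the row shape are assumptions made for illustration:

```python
import numpy as np

# Toy texture feature points (x, y); the rebuild sorts them into a
# deterministic scan order and reshapes into rows, so the matrix
# encodes the points' sequence relationship.
points = np.array([[3, 1], [1, 2], [2, 0], [4, 3], [0, 4], [5, 5]])
order = np.lexsort((points[:, 0], points[:, 1]))  # primary key: y, then x
M = points[order].reshape(3, 2, 2)                # 3 rows of 2 points each
print(M[0])
```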
The new remote sensing image texture description is processed uniformly to ensure that the original feature vector remains unchanged, and the feature points of the remote sensing image texture are arranged according to the description to form a trapezoidal structure information matrix:
Unified processing is performed on each row of the trapezoidal structure matrix:
where the structure information points are the entries of the trapezoidal structure matrix. Through the processing in Equation (17), the new remote sensing image texture stitching method achieves a better fusion effect.
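The unified row-wise processing of the trapezoidal structure matrix can be sketched as unit normalization of each row; the matrix values are illustrative:

```python
import numpy as np

# Row-wise unit normalization of a toy trapezoidal structure matrix,
# so that each row of structure points has comparable scale.
T = np.array([[3., 4., 0.],
              [1., 0., 0.],
              [2., 2., 1.]])
norms = np.linalg.norm(T, axis=1, keepdims=True)
T_unified = T / np.where(norms == 0, 1, norms)   # guard empty rows
print(np.round(np.linalg.norm(T_unified, axis=1), 6))  # [1. 1. 1.]
```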
In order to improve the stability of the proposed method, the texture seams are smoothly stitched:
where the sequence can be split into several mutually restricted trapezoidal structure sequences, with a smooth stitching coefficient for the texture seams. A weighted analysis is performed on each sequence to find the average value:
where the index is the sequence number of the cth ladder structure in the sorted ladder structure sequence.
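The splitting of the sequence into trapezoidal sub-sequences and the per-sequence weighted averaging can be sketched as follows; the sequence values and weights are assumed:

```python
import numpy as np

# Split a toy sequence into three sub-sequences and take a weighted
# average of each (weights assumed equal, summing to 1 per group).
seq = np.arange(1.0, 7.0)           # toy sequence 1..6
groups = np.split(seq, 3)           # three trapezoidal sub-sequences
w = np.array([0.5, 0.5])            # per-group weights (assumed)
means = [float(g @ w) for g in groups]
print(means)  # [1.5, 3.5, 5.5]
```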
After feature extraction and processing of the remote sensing image textures, the textures to be stitched have corresponding texture transformation points, and the minimum variance calculation is used to obtain the best stitching point of the remote sensing images [28].
Taking the central feature point of the remote sensing image texture as the reference point of the similarity transformation, the coordinate transformation formulas of A and B can be obtained as:
where the coordinates of the reference point of the similarity transformation in A and B are known. Equation (21) can be solved as a linear system, and the equations only need to be solved for the transformed reference point; finally, the stitching of the remote sensing images is verified.
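Solving a similarity (coordinate) transformation between A and B from matched reference points can be sketched as a linear least-squares system in the parameters a = s*cos(t), b = s*sin(t) and the translation; this is a standard formulation assumed here, not necessarily the paper's exact Equation (21):

```python
import numpy as np

def solve_similarity(src, dst):
    """Fit x' = a*x - b*y + tx, y' = b*x + a*y + ty by least squares,
    where a = s*cos(t), b = s*sin(t) encode rotation and scale."""
    rows, rhs = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        rows += [[x, -y, 1, 0], [y, x, 0, 1]]
        rhs += [xp, yp]
    params, *_ = np.linalg.lstsq(np.array(rows, float),
                                 np.array(rhs, float), rcond=None)
    return params  # a, b, tx, ty

src = [(0, 0), (1, 0), (0, 1)]
dst = [(1, 1), (1, 3), (-1, 1)]   # rotate 90 deg, scale 2, translate (1, 1)
a, b, tx, ty = solve_similarity(src, dst)
print(a, b, tx, ty)
```

Two point pairs already determine the four parameters; extra matches are absorbed by the least-squares fit, which makes the estimate robust to small matching errors.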
The fusion of the texture feature points of the remote sensing images is the final step of the entire image stitching process. The smooth fusion of the texture seams is carried out using the gradient structure information transformation method. In general, the images are divided into original remote sensing images and improved remote sensing images. The improved remote sensing image is rotated to obtain the rotation angle, the rotated improved image is overlapped with the original remote sensing image, and the effect of the smooth stitching is examined. The specific operation is shown in Figure 2.
The coordinate system XOY and the second coordinate system are the coordinate systems of remote sensing image one and remote sensing image two, respectively. The start and end points of remote sensing image two are obtained by line segment cutting, so that the relationship between the overlapping area and each point can be calculated after the coordinate conversion. The difference method is then used to obtain the converted pixel colors of the remote sensing image, and finally a linear transition is used to fuse the stitched remote sensing images. In summary, the boundary correlation constraint method is used to match the feature points of the texture seams of remote sensing images, the best matching pairs are selected, and a smooth stitching algorithm is designed for the texture seams of remote sensing images to achieve their smooth stitching.
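The final linear-transition fusion over the overlapping area can be sketched as an alpha ramp across the overlap width; the strip sizes and pixel values are toy assumptions:

```python
import numpy as np

def linear_blend(left, right):
    """Linear-transition fusion of two overlapping strips: the weight
    ramps from 1 (left image) to 0 across the overlap width."""
    w = left.shape[1]
    alpha = np.linspace(1.0, 0.0, w)
    return left * alpha + right * (1.0 - alpha)

L = np.full((2, 5), 10.0)   # overlap strip from image one
R = np.full((2, 5), 20.0)   # overlap strip from image two
blended = linear_blend(L, R)
print(blended[0])  # 10, 12.5, 15, 17.5, 20
```

Because the weights of the two images always sum to one, the transition is seamless at both borders of the overlap, which is exactly what the linear-transition fusion step requires.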