Article

Monocular Visual Position and Attitude Estimation Method of a Drogue Based on Coaxial Constraints

Navigation Research Center, College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China
*
Author to whom correspondence should be addressed.
Sensors 2021, 21(16), 5673; https://doi.org/10.3390/s21165673
Submission received: 12 July 2021 / Revised: 15 August 2021 / Accepted: 18 August 2021 / Published: 23 August 2021

Abstract

In aerial refueling, the circular features on the drogue's stabilizing umbrella deform to a certain extent, which causes a duality problem in position estimation from a single circular feature. In this paper, a monocular visual position and attitude estimation method for a drogue based on coaxial constraints is proposed. First, a procedure for scene recovery from a single circle is introduced. The coaxial constraints of the drogue are proposed and proved to be useful for eliminating the duality by analyzing the matrix of the spatial structure. Furthermore, we develop our method, which consists of fitting the parameters of the spatial circles by restoring the 3D points on them, using the two-level coaxial constraints to eliminate the duality, and optimizing the normal vector of the plane on which the inner circle is located. Finally, the effectiveness and robustness of the proposed method are verified, and the influence of the coaxial circles' spatial structure on the method is explored through simulations of and experiments on a drogue model. Even under a large amount of noise, the duality elimination success rate of our method remains more than 10% higher than that of other methods. In addition, the accuracy of the normal vector obtained by the fusion algorithm is improved, with the mean angle error reduced by more than 26.7%.

1. Introduction

Aerial refueling technology can significantly extend the endurance of aircraft, which is of great importance in strategic and tactical aviation operations [1]. At present, there are two kinds of aerial refueling systems: the probe-and-drogue refueling system pioneered by Flight Refueling Ltd. [2] and the flying boom refueling system developed by Boeing [3]. Owing to its economy and flexibility, and its adoption by many countries, the probe-and-drogue method is well suited to autonomous aerial refueling (AAR) [4]. In the docking situation, the principal issue is how to accurately and quickly measure the relative position and attitude between the probe (on the receiving aircraft) and the drogue (from the tanker) during the end game [4,5,6]. Monocular vision is the most popular and fastest-growing method and has been extensively studied owing to its feasibility, ease of calibration, low cost, passive nature, and large effective field of view [7,8]. Therefore, it is a common navigation method for the AAR docking stage [1,4,5].
Typical geometric features used for vision-based relative pose measurement include points, lines, and circles. Scholars worldwide have conducted extensive research on them, both theoretical [9,10,11,12] and applied [13,14,15,16]. Compared with other features, circular features are common and widespread in the aviation field, and their geometric characteristics give them a high anti-interference ability with respect to occlusion. Using the circular features of a target to obtain navigation information for aircraft and spacecraft tasks is an accepted and advantageous approach in aeronautics and astronautics [17,18].
The shape of a drogue is a typical body of revolution (BOR) that contains many coaxial circular features. However, with a change in aircraft speed, the stabilizing umbrella of the drogue is in different opening and closing states, which leads to the deformation of the circular features to some extent during the refueling process. It is difficult to obtain in real time the effective radius of circular features for monocular vision pose estimation. Only the shape of the circle near the oil joint is fixed, which is called the ‘inner circle’ of the drogue in this paper. The 5 degrees of freedom (DOF) pose of the drogue can be estimated by the inner circle.
Nonetheless, pose estimation from a single circular feature suffers from a dual-solution problem; that is, it yields two sets of circle-center positions and circle-plane normal vectors [19]. A unique solution exists if and only if the optical center and the target circle form a right circular cone.
To eliminate the duality, the authors of References [18,20] proposed constructing a reference point in the plane or in space and using the invariance of the Euclidean distance to the circle center as a constraint to select the correct pose; however, this method relies on a known point feature. In Reference [21], the constraint that a space angle does not change with the motion of a rigid body was utilized to eliminate the duality, but it requires dense point features in the scene for reconstruction. For the case of multiple circular features, it has been suggested that the pose estimation of the target be constrained by multiple parallel circular or cylindrical features [19,22]; the implicit condition is that the radii of all coaxial circles are known, which is not satisfied in the drogue case. The features of a double planar circle were applied to estimate the pose of a target in Reference [23] and to calibrate camera parameters in References [24,25]. In addition, stereo vision was introduced to pose estimation by providing additional information for scene reconstruction in References [26,27].
All of the methods mentioned above are limited by specific requirements and special conditions, such as prior knowledge of all the coaxial circles, the introduction of auxiliary equipment other than a monocular camera, the existence or installation of special recognizable structures, or the need for many features for reconstruction. However, it is impossible, or at least difficult, to install an artificial marker on the drogue or to introduce other auxiliary equipment into the airborne monocular vision system for visual navigation in AAR or in an actively stabilized refueling drogue system (ASRDS).
To address the problem that only a single circular feature of known size can be directly used for pose estimation, while the other circular features deform to a certain extent during aerial refueling, a monocular visual position and attitude estimation method for a drogue based on coaxial constraints is proposed in this paper. It makes full use of the multiple circular features of the drogue itself, and the proposed coaxial constraints can effectively eliminate the duality of the solutions. Moreover, the accuracy of the target's normal vector, optimized by fusing multiple circular features, is greatly improved.
The paper is organized as follows. Section 2 presents the method for scene recovery from a circular feature. In Section 3, we propose the coaxial constraints and prove them. The position and attitude estimation algorithm based on the coaxial constraints is presented in Section 4. Section 5 presents the simulations and experiments conducted to evaluate the proposed methodology. Section 6 presents the conclusions.

2. Scene Recovery from a Circle

2.1. Projection of a Circle

Define the world coordinate system, camera coordinate system, and image coordinate system of the object as $O_w X_w Y_w Z_w$, $O_c X_c Y_c Z_c$, and $O_I UV$, respectively. Let $\tilde{x}^{(w)} = (x_w, y_w, z_w, 1)^T$ denote the homogeneous coordinates of a 3D point in the world coordinate system, and let $\tilde{x}^{(I)} = (u, v, 1)^T$ be the homogeneous coordinates of its projection in the image coordinate system. The projection from the world coordinate system to the image can be described as:
$$z_c\, \tilde{x}^{(I)} = K x^{(c)} = K[R\,|\,t]\, \tilde{x}^{(w)} \quad \text{with} \quad K = \begin{bmatrix} f_u & 0 & u_0 \\ 0 & f_v & v_0 \\ 0 & 0 & 1 \end{bmatrix} \tag{1}$$
where $z_c$ is a scale factor (the projection depth of the point); $K$ is the camera intrinsic matrix, with focal lengths $f_u, f_v$ and principal point $(u_0, v_0)$; $R$ and $t$ are the rotation matrix and the translation vector from the world coordinate system to the camera coordinate system, respectively; and $x^{(c)} = (x_c, y_c, z_c)^T \in \mathbb{R}^3$ denotes the non-homogeneous coordinates of the corresponding 3D point in the camera coordinate system.
Without loss of generality, all the discussions in this paper assume that the camera intrinsic matrix $K$ has been calibrated. We establish a world coordinate system whose xy-plane lies on the plane of the circle and whose z-axis is perpendicular to that plane and faces away from the camera. A point $\tilde{x}_p^{(w)} = (x_p, y_p, z_p, 1)^T$ with $z_p = 0$ on the circle P, whose center has homogeneous coordinates $(x_o, y_o, 0, 1)^T$ and whose radius is $r_0$, can be expressed by the following formula.
$$(\tilde{x}_p^{(xy)})^T P\, \tilde{x}_p^{(xy)} = 0 \quad \text{with} \quad \tilde{x}_p^{(xy)} = (x_p, y_p, 1)^T,\ z_p = 0, \quad P = \begin{bmatrix} 1 & 0 & -x_o \\ 0 & 1 & -y_o \\ -x_o & -y_o & x_o^2 + y_o^2 - r_0^2 \end{bmatrix} \tag{2}$$
Combining Equation (1) with Equation (2), the projection of the circle in the image coordinate system is derived as:
$$(\tilde{x}^{(I)})^T Q\, \tilde{x}^{(I)} = 0, \qquad Q = \mu \left(K[r_1\ r_2\ t]\right)^{-T} P \left(K[r_1\ r_2\ t]\right)^{-1} \tag{3}$$
Denote $H = K[r_1\ r_2\ t]$. Then $Q$ can be rewritten as $Q = \mu H^{-T} P H^{-1}$, where $\mu$ is a scale factor equal to $z_c^2$; $r_1$ and $r_2$ are the first and second columns, respectively, of the rotation matrix $R$, and $t$ is the translation vector.
Since $Q$ is symmetric, the optical center of the camera and the spatial circle form an elliptical cone under the imaging geometry. Setting aside the degenerate case in which the projection of the circle collapses to a line, the projection of the circle is an ellipse, namely the intersection of the image plane with the elliptical cone. The projection of the spatial circle on the image plane therefore satisfies the ellipse constraint, which is consistent with the conclusion of Reference [25].
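As a concrete check of this projection model, the image conic of a spatial circle can be computed numerically from $K$, $R$, $t$, and the circle parameters. The following is a minimal numpy sketch; the function name and the test values are illustrative, not from the paper:

```python
import numpy as np

def circle_image_conic(K, R, t, center_xy, r0):
    """Image conic Q of a circle lying in the world z=0 plane.

    center_xy = (x_o, y_o) is the circle centre in the world xy-plane,
    r0 its radius.  Returns a symmetric 3x3 matrix Q (defined up to scale)
    such that u~^T Q u~ = 0 for homogeneous pixel coordinates u~.
    """
    x_o, y_o = center_xy
    # Conic matrix of the circle in its supporting plane (cf. Equation (2)).
    P = np.array([
        [1.0, 0.0, -x_o],
        [0.0, 1.0, -y_o],
        [-x_o, -y_o, x_o**2 + y_o**2 - r0**2],
    ])
    # Homography from the world plane to the image: H = K [r1 r2 t].
    H = K @ np.column_stack((R[:, 0], R[:, 1], t))
    Hinv = np.linalg.inv(H)
    Q = Hinv.T @ P @ Hinv
    return Q / np.linalg.norm(Q)   # fix the arbitrary scale
```

Any world point on the circle then satisfies the conic equation after projection, which gives a quick sanity test of an implementation.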

2.2. Position and Attitude Estimation from a Circle

The solution of a pose estimation problem from one circle is the position of the circle center O ( c ) = ( x , y , z ) T and the normal vector n ( c ) = ( n 1 , n 2 , n 3 ) T of the plane where the circle is located in the camera coordinate system due to the rotational symmetry of the circle feature. The normal vector is equivalent to the direction of the z-axis of the world coordinate system in the camera coordinate system.
When the camera intrinsic matrix $K$ has been calibrated, the image coordinates $\tilde{x}^{(I)}$ can easily be normalized with $K^{-1}$. For convenience of derivation, let $K = I_3 = \mathrm{diag}(1, 1, 1)$. Then, the matrix $H$ can be rewritten and its inverse derived as:
$$H = K[r_1\ r_2\ t] = R\,[e_1,\ e_2,\ R^T t], \qquad H^{-1} = [e_1,\ e_2,\ s]\, R^T \tag{4}$$
where $e_1 = (1, 0, 0)^T$, $e_2 = (0, 1, 0)^T$, and $s = (s_1, s_2, s_3)^T$. $R$, $t$, and $s$ satisfy $R^T t = s_3^{-1}(-s_1, -s_2, 1)^T$ with $s_3 \neq 0$. Combined with Equation (4), it can be derived from Equation (3) that:
$$Q = \mu R M R^T \quad \text{with} \quad M = \begin{bmatrix} 1 & 0 & s_1 \\ 0 & 1 & s_2 \\ s_1 & s_2 & s_1^2 + s_2^2 - (s_3 r_0)^2 \end{bmatrix} \tag{5}$$
Under perspective geometry, the parameter matrix of the projected ellipse and the parameter matrix of the spatial circle are similar. According to Reference [28], the analytical solutions for position and attitude can be obtained from the theorem that similar matrices have the same eigenvalues. Four sets of circle centers and normal vectors can be obtained by the eigen-decomposition of the matrix $Q$:
$$o^{(c)} = \omega_1 r \sqrt{\frac{\lambda_3(\lambda_1-\lambda_2)}{\lambda_1(\lambda_3-\lambda_1)}}\, v_1 + \omega_2 r \sqrt{\frac{\lambda_1(\lambda_2-\lambda_3)}{\lambda_3(\lambda_3-\lambda_1)}}\, v_3, \qquad n^{(c)} = \omega_1 \sqrt{\frac{\lambda_1-\lambda_2}{\lambda_1-\lambda_3}}\, v_1 + \omega_2 \sqrt{\frac{\lambda_3-\lambda_2}{\lambda_3-\lambda_1}}\, v_3 \tag{6}$$
where $\omega_1 = \pm 1$ and $\omega_2 = \pm 1$; $\lambda_i$ are the eigenvalues of the matrix $Q$ (ordered such that $\lambda_1 \geq \lambda_2 > 0 > \lambda_3$) and $v_i$ are the corresponding eigenvectors. According to the constraint that the z-axis of the camera coordinate system points toward the target, two sets of solutions can be eliminated. The remaining two sets of solutions constitute the duality problem of position and attitude estimation from a single circular feature under normal circumstances.
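The eigen-decomposition solution above can be sketched numerically. The fragment below implements one common form of the classical decomposition (the exact sign conventions depend on the eigenvalue ordering); the function name and the round-off guard are our own additions, and it returns the two candidates with the target in front of the camera, i.e. the dual solutions discussed in the text:

```python
import numpy as np

def circle_pose_candidates(Q, K, r0):
    """Candidate (centre, normal) pairs of a circle of radius r0
    from its image conic Q (pixel coordinates)."""
    # Cone through the optical centre, in normalised camera coordinates.
    C = K.T @ Q @ K
    lam, V = np.linalg.eigh(C)            # ascending eigenvalues
    if np.sum(lam > 0) == 1:              # fix the arbitrary sign of the conic
        lam = -lam
    order = np.argsort(lam)[::-1]         # lam1 >= lam2 > 0 > lam3
    l1, l2, l3 = lam[order]
    v1, v3 = V[:, order[0]], V[:, order[2]]
    # Guard against tiny negative values from round-off when l1 ~ l2.
    a = np.sqrt(max(l1 - l2, 0.0) / (l1 - l3))
    b = np.sqrt((l2 - l3) / (l1 - l3))
    s = r0 / np.sqrt(-l1 * l3)
    cands = []
    for w1 in (1.0, -1.0):
        for w2 in (1.0, -1.0):
            n = w1 * a * v1 + w2 * b * v3
            c = s * (w1 * l3 * a * v1 + w2 * l1 * b * v3)
            if c[2] > 0:                  # keep targets in front of the camera
                cands.append((c, n))
    return cands
```

For a circle centred on the optical axis the cone is right circular, the duality collapses, and both retained candidates coincide, matching the uniqueness condition stated earlier.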

3. The Coaxial Constraints on a Drogue

3.1. Analysis of a Drogue

During the docking stage of autonomous aerial refueling, the control system that regulates the motion states of the tanker and the receiver, such as position, attitude, and speed, depends heavily on the relative pose between the drogue and the receiver obtained by the visual navigation system. In addition, the actively stabilized refueling drogue system (ASRDS) for stabilizing the drogue also relies on visual navigation to monitor the position and attitude between the drogue and the probe in real time.
In these scenarios, the target with which the visual navigation system must deal is the drogue. The shape of the drogue shown in Figure 1 is approximately a BOR, and its outer contour is a surface of revolution (SOR) containing many coaxial circular features. As the motion state of the tanker changes, the stabilizing umbrella of the drogue transitions between different opening and closing states. As a result, the actual size of most circular features is not fixed, and their reference values cannot be obtained in advance. In this case, only the radius of the circle marked with the solid red line in Figure 1 is fixed; it is defined as the inner circle of the drogue in this paper and can be used directly to yield the analytical solution of position and attitude.
Nevertheless, the planes on which the circular features are located remain parallel, and the distance between the planes changes little. The vectors connecting the centers of circles are also collinear with the normal vector of each circle on the whole. Therefore, the coaxial constraints are proposed as follows:
  • There exists a structure of spatial circles on several planes parallel to the plane of the inner circle.
  • The centers of the multiple spatial circles are collinear, and the vectors formed by the circles' centers are collinear with the normal vector of the inner circle.

3.2. Proof

In this section, we prove that the constraints proposed in Section 3.1 can be adopted to eliminate the duality of the position and attitude estimation. According to the two sets of solutions calculated from the inner circle and the distances between the planes of the circles, two corresponding spatial structures can be restored in combination with the image. Denote by $\Omega_{j1}$ and $\Omega_{j2}$ the j-th planes parallel to the inner circle corresponding to the two sets of solutions, and by $O_{j1}$ and $O_{j2}$ the intersection curves between the space structures and those planes. If it can be proved that the space structure recovered from the false solution does not meet the coaxial constraints, that is, that $O_{j1}$ and $O_{j2}$ cannot both be circular while the vectors formed by the circles' centers are collinear with the normal vector of the inner circle, then the coaxial constraints can be used to eliminate the duality.
Unless otherwise stated, the coordinates below are all in the camera coordinate system. Suppose that the two sets of 5 DOF poses of the inner circle calculated by the method of Section 2.2 are $(O_1, n_1)$ and $(O_2, n_2)$. Among them, $O_1$ and $O_2$ are the candidate coordinates of the inner circle's center, while $n_1$ and $n_2$ are the candidate normal vectors of the inner circle's plane.
Two rotation matrices, denoted $\bar{R}_1$ and $\bar{R}_2$, are constructed from $n_1$ and $n_2$, respectively, such that the z-axis of the transformed camera coordinate system is parallel to the normal vector $n_1$ or $n_2$ and the z-axis coordinate of the target is positive. The coordinates transform as follows:
$$X_1 = \bar{R}_1 X, \qquad X_2 = \bar{R}_2 X \tag{7}$$
where $X$ are the coordinates in the original camera coordinate system, and $X_1$ and $X_2$ are the coordinates in the two transformed camera coordinate systems. The relationship between them is as follows:
$$X_1 = \bar{R}_1 \bar{R}_2^T X_2 = \bar{R} X_2 \tag{8}$$
In accordance with the cause of the duality, the camera's optical center and the inner circle, denoted O, constitute an oblique cone in each of the two transformed coordinate systems. The matrix expression of the oblique cone surface is given as follows:
$$X_i^T \Phi_i X_i = X_i^T \begin{bmatrix} 1 & 0 & \alpha \\ 0 & 1 & \beta \\ \alpha & \beta & \gamma \end{bmatrix} X_i = 0, \quad i = 1, 2 \tag{9}$$
That is, the two transformed coordinate systems are constrained to the following formula:
$$X_1^T \Phi_1 X_1 = 0, \qquad X_2^T \Phi_2 X_2 = 0 \tag{10}$$
Substituting Equation (8) into Equation (10), we can obtain:
$$\Phi_2 = \bar{R}^T \Phi_1 \bar{R} \tag{11}$$
Suppose, toward a contradiction, that the dual solutions of the circle O yield two space structures in which the coaxial circle P satisfies coaxial constraint (1). Then the camera's optical center and circle P also form oblique cones, $\Psi_1$ and $\Psi_2$, in the two coordinate systems transformed by $\bar{R}_1$ and $\bar{R}_2$, satisfying:
$$\Psi_2 = \bar{R}^T \Psi_1 \bar{R} \tag{12}$$
From the coaxial constraint (2), Ψ 1 , Φ 1 and Ψ 2 , Φ 2 satisfy the following formula:
$$\Phi_1 = \Psi_1 + \begin{bmatrix} 0 & 0 & (k-1)\alpha_{q1} \\ 0 & 0 & (k-1)\beta_{q1} \\ (k-1)\alpha_{q1} & (k-1)\beta_{q1} & \delta_\gamma \end{bmatrix} = \Psi_1 + \Delta_1, \qquad \Phi_2 = \Psi_2 + \begin{bmatrix} 0 & 0 & (k-1)\alpha_{q2} \\ 0 & 0 & (k-1)\beta_{q2} \\ (k-1)\alpha_{q2} & (k-1)\beta_{q2} & \delta_\gamma \end{bmatrix} = \Psi_2 + \Delta_2 \tag{13}$$
where $k$ is the ratio of the z-axis coordinates of circles O and P in the two transformed coordinate systems:
$$k = \frac{z_{q1}}{z_{p1}} = \frac{z_{q2}}{z_{p2}} = \frac{z_q}{z_q + z_j^{(w)}} \quad \text{with} \quad z_{q1} = O_1^T n_1,\ z_{q2} = O_2^T n_2 \tag{14}$$
$z_j^{(w)}$ is the z-axis coordinate of P in the world coordinate system. From Equation (6), we can obtain $z_{q1} = z_{q2} = z_q$. Denote
$$\bar{R} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}, \qquad \Delta_1 = \begin{bmatrix} 0 & 0 & \alpha_1 \\ 0 & 0 & \beta_1 \\ \alpha_1 & \beta_1 & \gamma \end{bmatrix}, \qquad \Delta_2 = \begin{bmatrix} 0 & 0 & \alpha_2 \\ 0 & 0 & \beta_2 \\ \alpha_2 & \beta_2 & \gamma \end{bmatrix} \tag{15}$$
Combining Equation (11) with Equation (14), it can be inferred that:
$$\Delta_1 \bar{R} = \bar{R} \Delta_2 \tag{16}$$
Substituting Equation (15) into Equation (16), Equation (16) can be written as:
$$\begin{bmatrix} \alpha_1 r_{31} & \alpha_1 r_{32} & \alpha_1 r_{33} \\ \beta_1 r_{31} & \beta_1 r_{32} & \beta_1 r_{33} \\ \alpha_1 r_{11}+\beta_1 r_{21}+\gamma r_{31} & \alpha_1 r_{12}+\beta_1 r_{22}+\gamma r_{32} & \alpha_1 r_{13}+\beta_1 r_{23}+\gamma r_{33} \end{bmatrix} = \begin{bmatrix} \alpha_2 r_{13} & \beta_2 r_{13} & \alpha_2 r_{11}+\beta_2 r_{12}+\gamma r_{13} \\ \alpha_2 r_{23} & \beta_2 r_{23} & \alpha_2 r_{21}+\beta_2 r_{22}+\gamma r_{23} \\ \alpha_2 r_{33} & \beta_2 r_{33} & \alpha_2 r_{31}+\beta_2 r_{32}+\gamma r_{33} \end{bmatrix} \tag{17}$$
Analyzing the formula above, there exist the following situations:
  • $r_{13} = r_{31} = 0$, $r_{23} = r_{32} = 0$
    • In this case, $r_{33}$ must be 1. There is then a rotation about the z-axis between the two coordinate systems, with $\bar{R}$ as the rotation matrix, which means that the position and the normal vector of the circle are unchanged. In other words, $(O_1, n_1)$ and $(O_2, n_2)$ are equal.
  • Other Situations
Comparing the left and right sides of Equation (17), we can obtain $\alpha_1 \doteq \alpha_2 \neq 0$, $\beta_1 \doteq \beta_2 \neq 0$, and $\gamma \neq 0$, where $\doteq$ means possibly equal. Combining Equation (13) and Equation (17), we can obtain $\alpha_{q1} = \alpha_{q2}$ and $\beta_{q1} = \beta_{q2}$. If the intersecting curves have the same radius, the following equation must hold.
$$\left(\alpha_{q1}^2 + \beta_{q1}^2 - \gamma_{q1}\right) z_{q1}^2 = r_z^2 = \left(\alpha_{q2}^2 + \beta_{q2}^2 - \gamma_{q2}\right) z_{q2}^2 \tag{18}$$
From Equation (18), we can obtain $\gamma_{q1} = \gamma_{q2}$. It can then be inferred that $(O_1, n_1)$ and $(O_2, n_2)$ are equal.
In summary, the space structure recovered from the false analytical solution, which is caused by the duality of the pose estimation based on one circular feature, does not meet the coaxial constraints. Therefore, the coaxial constraints proposed in Section 3.1 can be utilized to eliminate the duality.

4. The Position and Attitude Estimation Method Based on the Coaxial Constraints

The pseudocode of the position and attitude estimation method based on the coaxial constraints is shown in Algorithm 1.
Algorithm 1: Position and Attitude Estimation Algorithm Based on the Coaxial Constraints
Input:
  Image: The parameters of ellipses have been fitted and matched
  r: Radius of inner circle
   $z_j^{(w)}$: The z-axis coordinates of the coaxial circles' centers
Output:
  O: The position of drogue in the camera coordinate system
  n: The normal vector of drogue in the camera coordinate system
1: function COAXIALCIRCLEPOSE(Image, r, z j ( w ) )
2:   Transform the parameters of ellipses into matrix expression
3:   Calculate two sets of position and attitude of the inner circle
     ( O 1 , n 1 ) , ( O 2 , n 2 )
4:   for j = 1 to J do
5:    Restore the spatial structures of $(O_{j1}, O_{j2})$ by $x_*^{(c)} = \dfrac{d}{n^T K^{-1} \tilde{U}_*}\, K^{-1} \tilde{U}_*$
6:    Fit $O_{j1}, r_{j1}, \Delta r_{j1}$ and $O_{j2}, r_{j2}, \Delta r_{j2}$ from $(O_{j1}, O_{j2})$
7:    Calculate $(O_{j1}, n_{j1})$ and $(O_{j2}, n_{j2})$ from $\begin{cases} r_{j1}, & \text{if } \Delta r_{j1} < \Delta r_{j2} \\ r_{j2}, & \text{if } \Delta r_{j1} > \Delta r_{j2} \end{cases}$
8:   end for
9:   Calculate $\phi_1$ and $\phi_2$ by $\phi_i = \sum_{j=1}^{J} \dfrac{\Delta r_{ji}}{r_{j1} + r_{j2}},\ i = 1, 2$
10:  Eliminate duality by $(O, n) = \begin{cases} (O_1, n_1), & \text{if } \phi_1 < \phi_2 \\ (O_2, n_2), & \text{if } \phi_1 > \phi_2 \end{cases}$
11:  Eliminate duality by $(O, n) = \begin{cases} (O_1, n_1), & \text{if } (n_{j1}\ \text{or}\ n_{j2})^T n_1 \approx 1 \\ (O_2, n_2), & \text{if } (n_{j1}\ \text{or}\ n_{j2})^T n_2 \approx 1 \end{cases}$
12:  Optimize normal vector by $\hat{n} = \arg\min_{\zeta} \sum_{j=1}^{J} \|n_j^{(c)} - \zeta\|^2 + \|n^{(c)} - \zeta\|^2$ with $\zeta^T \zeta = 1$
13:  Obtain translation vector by $\hat{t} = O$
14:  result ← $(\hat{n}, \hat{t})$
15:   return result
16: end function
The algorithm takes as input the parameters of the ellipses (center coordinates, major axis, minor axis, and inclination angle), the radius of the inner circle, and the z-axis coordinates of the coaxial circles' centers in the world coordinate system.
First, the matrix expressions of the ellipses are computed from the input ellipse parameters, and the position and attitude of the inner circle are calculated from its known radius by the algorithm in Section 2.2. Then, the two corresponding spatial structures are restored using the method described in Section 4.1, and the method described in Section 4.2 is used to eliminate the duality. Finally, the normal vector of the target is optimized by fusing the multiple circular features in the spatial structure.

4.1. Spatial Structure Recovery

The spatial relationship of a 3D point and circle corresponding to the duality is shown in Figure 2. Take as an example the coaxial circle corresponding to $(O_1, n_1)$, which lies on the plane parallel to the inner circle whose z-axis coordinate is $z_j^{(w)}$ in the world coordinate system. The restored 3D point $x_*^{(c)} = (x_*, y_*, z_*)^T$ on the coaxial circle is the intersection of $O_c U_*$ (the ray connecting the camera's optical center $O_c$ and the imaging point $U_*$) with the space plane on which the coaxial circle is located. $x_*^{(c)}$ satisfies the following:
$$z_* \tilde{U}_* = K x_*^{(c)}, \qquad n_1^T x_*^{(c)} - d = 0 \tag{19}$$
where $d = n_1^T O_1 + z_j^{(w)}$ according to the relationship between the two planes. It can be inferred that:
$$x_*^{(c)} = \frac{d}{n_1^T K^{-1} \tilde{U}_*}\, K^{-1} \tilde{U}_* \tag{20}$$
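Equation (20) is a ray-plane intersection and can be sketched in a few lines of numpy (the function name and arguments are illustrative):

```python
import numpy as np

def backproject_to_plane(u, K, n, d):
    """Restore the 3D point on the plane n^T x = d that images at pixel u:
    intersect the viewing ray K^{-1} u~ with the plane (Equation (20))."""
    ray = np.linalg.inv(K) @ np.array([u[0], u[1], 1.0])
    return (d / (n @ ray)) * ray
```

By construction the returned point lies exactly on the requested plane, which is what makes the subsequent circle fitting a planar problem.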
Select N points on the curve uniformly and restore the coordinates of the 3D points on the plane $\Omega_1$ by Equation (20). Denote the N 3D points, the center of the coaxial circle, and the midpoint of any two different points as $x_{*i}^{(c)} = (x_i^*, y_i^*, z_i^*)^T$, $O^* = (x_o^*, y_o^*, z_o^*)^T$, and $x_{*ij}^{(c)} = \left(\frac{x_i^* + x_j^*}{2}, \frac{y_i^* + y_j^*}{2}, \frac{z_i^* + z_j^*}{2}\right)^T$, respectively, where $i, j = 1, 2, \ldots, N$. Since the center of a circle must lie on the perpendicular bisector plane of any two of its points, the vector from $O^*$ to $x_{*ij}^{(c)}$ must be perpendicular to the vector from $x_{*i}^{(c)}$ to $x_{*j}^{(c)}$. This means that:
$$\left( x_j^* - x_i^*,\ y_j^* - y_i^*,\ z_j^* - z_i^* \right) \left( \frac{x_i^* + x_j^*}{2} - x_o^*,\ \frac{y_i^* + y_j^*}{2} - y_o^*,\ \frac{z_i^* + z_j^*}{2} - z_o^* \right)^T = 0 \tag{21}$$
After sorting out the equation, we can obtain
$$\Delta x_{ij}^* x_o^* + \Delta y_{ij}^* y_o^* + \Delta z_{ij}^* z_o^* = c_{ij} \tag{22}$$
where $\Delta x_{ij}^* = x_i^* - x_j^*$, $\Delta y_{ij}^* = y_i^* - y_j^*$, $\Delta z_{ij}^* = z_i^* - z_j^*$, and $c_{ij} = \frac{1}{2}\left(x_i^{*2} + y_i^{*2} + z_i^{*2} - x_j^{*2} - y_j^{*2} - z_j^{*2}\right)$. N−1 linearly independent coefficient vectors $(\Delta x_{ij}^*, \Delta y_{ij}^*, \Delta z_{ij}^*)^T$ can be obtained from the N 3D points, giving the following system:
$$\begin{bmatrix} \Delta x_{12}^* & \Delta y_{12}^* & \Delta z_{12}^* \\ \Delta x_{23}^* & \Delta y_{23}^* & \Delta z_{23}^* \\ \vdots & \vdots & \vdots \\ \Delta x_{N-1,N}^* & \Delta y_{N-1,N}^* & \Delta z_{N-1,N}^* \end{bmatrix} \begin{bmatrix} x_o^* \\ y_o^* \\ z_o^* \end{bmatrix} = \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_{N-1} \end{bmatrix} \tag{23}$$
Rewrite Equation (23) as:
$$A O^* = C \tag{24}$$
Since the circle’s center must be on Ω 1 , which can be represented by n 1 (the normal vector) and d (the intercept), we can define the error equation as:
$$f(O^*) = \|A O^* - C\|^2 + \lambda \left(B O^* - 1\right) \quad \text{with} \quad B = n_1^T / d \tag{25}$$
The coordinates of the circle’s center are the least square solution of the above equation.
$$\begin{bmatrix} O^* \\ \lambda \end{bmatrix} = \begin{bmatrix} A^T A & B^T \\ B & 0 \end{bmatrix}^{-1} \begin{bmatrix} A^T C \\ 1 \end{bmatrix} \tag{26}$$
Furthermore, the radius and the roundness of the circle are obtained by Equation (27). We define the roundness, denoted $\Delta r^*$, as the average absolute deviation of the points' distances from the mean radius.
$$l_i = \sqrt{(x_i^* - x_o^*)^2 + (y_i^* - y_o^*)^2 + (z_i^* - z_o^*)^2}, \qquad r = \frac{1}{N}\sum_{i=1}^N l_i, \qquad \Delta r = \frac{1}{N}\sum_{i=1}^N \left| l_i - r \right| \tag{27}$$
The radius and the roundness of other coaxial circles can also be calculated.
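The circle-fitting step above is a linear least-squares problem with one linear constraint, solved through the KKT system of Equation (26). A minimal numpy sketch (the function layout is our own; consecutive point pairs are used to build the coefficient matrix A):

```python
import numpy as np

def fit_planar_circle(pts, n, d):
    """Least-squares centre, radius, and roundness of 3D points expected
    to lie on a circle in the plane n^T x = d (cf. Equations (22)-(27))."""
    p, q = pts[:-1], pts[1:]                       # consecutive point pairs
    A = p - q                                      # rows (dx, dy, dz)
    C = 0.5 * (np.sum(p**2, axis=1) - np.sum(q**2, axis=1))
    B = (n / d).reshape(1, 3)                      # plane constraint B O = 1
    # KKT system of: min ||A O - C||^2  s.t.  the centre lies in the plane.
    M = np.block([[A.T @ A, B.T], [B, np.zeros((1, 1))]])
    rhs = np.concatenate([A.T @ C, [1.0]])
    O = np.linalg.solve(M, rhs)[:3]
    l = np.linalg.norm(pts - O, axis=1)            # distances to the centre
    r = l.mean()
    return O, r, np.mean(np.abs(l - r))            # centre, radius, roundness
```

For points that truly lie on a planar circle the returned roundness is (numerically) zero; noise or a wrong pose hypothesis inflates it, which is exactly what the duality test below exploits.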

4.2. Elimination of Duality

As the name suggests, the two-level coaxial constraints algorithm consists of two parts. The first level is the coaxial constraint (1) in Section 3.1, that is, the elimination of the duality by using the roundness of the spatial structure. After restoring the spatial structures by Section 4.1, we can obtain the radius and roundness of the coaxial circles, denoted r j 1 , r j 2 and Δ r j 1 , Δ r j 2 , respectively, corresponding to the two pose solutions of the inner circle. If there are J coaxial circles, the definition is as follows:
$$\phi_i = \sum_{j=1}^{J} \frac{\Delta r_{ji}}{r_{j1} + r_{j2}} \quad \text{with} \quad i = 1, 2 \ \text{and} \ j = 1, 2, \ldots, J \tag{28}$$
Then, the first-level constraint can be expressed as the following equation.
$$(O, n) = \begin{cases} (O_1, n_1) & \text{if } \phi_1 < \phi_2 \\ (O_2, n_2) & \text{if } \phi_1 > \phi_2 \end{cases} \tag{29}$$
The second-level constraint is the coaxial constraint (2) in Section 3.1, that is, the normal vectors of multiple planes where the circles are located are parallel. Using the radius calculated by Equation (27), two sets of positions and attitudes of multiple coaxial circles, denoted O j 1 , n j 1 and O j 2 , n j 2 , respectively, can be solved for. The second-level coaxial constraint can be written as:
$$(O, n) = \begin{cases} (O_1, n_1) & \text{if } (n_{j1}\ \text{or}\ n_{j2})^T n_1 \approx 1 \\ (O_2, n_2) & \text{if } (n_{j1}\ \text{or}\ n_{j2})^T n_2 \approx 1 \end{cases} \tag{30}$$
After the coaxial constraints are applied, the normal vectors of the planes where multiple circles are located can also be obtained.
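The two-level selection can be sketched as follows. This is a simplified illustration in which the second level acts as a tie-break when the roundness scores do not decide; the data layout (`fits1`/`fits2` as lists of fitted (n_j, r_j, Δr_j) tuples per hypothesis) is our own, not from the paper:

```python
import numpy as np

def eliminate_duality(sol1, sol2, fits1, fits2):
    """Select between the dual inner-circle solutions (O, n).

    sol1, sol2: the two candidate (O, n) pairs.
    fits1, fits2: per coaxial circle j, the (n_j, r_j, dr_j) values fitted
    when the structure is recovered under solution 1 or 2, respectively.
    """
    # Level 1: roundness score -- the true hypothesis recovers rounder circles.
    phi1 = sum(dr / (r1 + r2) for (_, r1, dr), (_, r2, _) in zip(fits1, fits2))
    phi2 = sum(dr / (r1 + r2) for (_, r1, _), (_, r2, dr) in zip(fits1, fits2))
    if phi1 != phi2:
        return sol1 if phi1 < phi2 else sol2
    # Level 2: coaxial normals must align with the inner-circle normal.
    a1 = np.mean([abs(nj @ sol1[1]) for nj, _, _ in fits1])
    a2 = np.mean([abs(nj @ sol2[1]) for nj, _, _ in fits2])
    return sol1 if a1 > a2 else sol2
```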

4.3. Fusion of Multiple Circular Features

After the two-level coaxial constraints described in the previous subsection are applied, the center position of the inner circle and the normal vectors of multiple planes can be obtained. In the pose fusion problem of multiple coaxial circles with a known radius, there are the following relationships among the circles.
$$O_j^{(c)} = z_j^{(w)} n^{(c)} + t^{(c)} \tag{31}$$
where $z_j^{(w)}$ is the z-axis coordinate of $O_j$ in the world coordinate system. Let $O_{jk}^{(c)} = O_j^{(c)} - O_k^{(c)}$ with $j, k \in \{1, 2, \ldots, J\}$, where $O_j^{(c)}$ and $O_k^{(c)}$ are the centers of the coaxial circles obtained in Section 4.2; then
$$O_{jk}^{(c)} = \left(z_j^{(w)} - z_k^{(w)}\right) n^{(c)} \tag{32}$$
In Reference [29], it was shown that the pose fusion algorithm for multiple coaxial circles with a known radius is equivalent to solving the following equation.
$$\hat{n} = \arg\min_{\zeta} \sum_{j=1}^{J} \left\| n_j^{(c)} - \zeta \right\|^2 + \sum_{j, k \in \{1, 2, \ldots, J\}} \left\| \frac{O_{jk}^{(c)}}{z_j^{(w)} - z_k^{(w)}} - \zeta \right\|^2 \quad \text{with} \quad \zeta^T \zeta = 1 \tag{33}$$
The analysis of Equation (6) shows that the radius estimation error has little influence on the error of $n_j^{(c)}$ but some influence on the error of $O_j^{(c)}$. When the value of $z_j^{(w)} - z_k^{(w)}$ is small, the vector $O_{jk}^{(c)}/(z_j^{(w)} - z_k^{(w)})$ has a large error. Accordingly, we use the following formula to solve for $\hat{n}$.
$$\hat{n} = \arg\min_{\zeta} \sum_{j=1}^{J} \left\| n_j^{(c)} - \zeta \right\|^2 + \left\| n^{(c)} - \zeta \right\|^2 \quad \text{with} \quad \zeta^T \zeta = 1 \tag{34}$$
where $n_j^{(c)}$ and $n^{(c)}$ are the normal vectors of the coaxial circles and the inner circle solved in Section 4.2. Considering that the error of $O_j$ is somewhat larger, the position solution of the inner circle is selected as the corresponding $\hat{t}$.
So far, the position and the normal vector of the target with coaxial constraints have been solved.
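Expanding $\|n_j^{(c)} - \zeta\|^2 = \|n_j^{(c)}\|^2 - 2\, n_j^{(c)T} \zeta + 1$ under the unit constraint shows that the final optimization is minimized by the unit vector maximizing its inner product with the sum of all normal estimates, i.e. their normalized mean direction. The closed form is a one-liner (function name is illustrative):

```python
import numpy as np

def fuse_normals(n_inner, n_coaxial):
    """Unit vector minimising the summed squared distances to the inner-circle
    normal and all coaxial-circle normals: the normalised mean direction."""
    s = n_inner + np.sum(np.asarray(n_coaxial), axis=0)
    return s / np.linalg.norm(s)
```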

5. Results and Analysis

5.1. Evaluation Indices

The simulations of and experiments on the drogue model were designed to verify the effectiveness of the proposed algorithm. The performance of the algorithm was evaluated in terms of two aspects: the duality elimination success rate and the accuracy of the normal vector of the target.
Assuming that the position and the normal vector of the target calculated by our algorithm were n ^ , t ^ and the benchmarks we set were R g , t g , the error was defined to be composed of the angle error Δ θ (the angle between the solved normal vector and the ground truth r 3 , where r 3 is the third column of the matrix R g ) and the translation error Δ t .
$$\Delta\theta = \arccos\left(\hat{n}^T r_3\right), \qquad \Delta t = \left\| \hat{t} - t_g \right\| \tag{35}$$
The criterion for successful elimination of the duality was that the angle error and the translation error of the retained solution $(\hat{n}, \hat{t})$ were smaller than those of the eliminated solution.
In this paper, the success rate of duality elimination was defined as the percentage of frames with successful elimination of the duality with respect to the total number of image frames.
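The two error indices defined above can be computed directly; the sketch below is illustrative, with clipping added to guard the arccos domain against round-off:

```python
import numpy as np

def pose_errors(n_hat, t_hat, R_g, t_g):
    """Angle error (degrees) of the estimated normal against the third
    column of the ground-truth rotation, and translation error (norm)."""
    r3 = R_g[:, 2]
    cos_a = np.clip(n_hat @ r3, -1.0, 1.0)   # guard arccos domain
    dtheta = np.degrees(np.arccos(cos_a))
    dt = np.linalg.norm(np.asarray(t_hat) - np.asarray(t_g))
    return dtheta, dt
```

The success rate over a simulation run is then simply the fraction of frames in which the retained solution's errors are the smaller of the two candidates.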

5.2. Simulations

The simulations were carried out on a simulation system based on MATLAB, in which the resolution was 1600 × 1200 pixels, the pixel size was 0.01 × 0.01 mm, the focal distance was 8 mm, and the principal point of the photograph was (800.00, 600.00).
To verify the anti-interference ability and the robustness to relative distances of the algorithm, we carried out comparative simulations under different conditions, including noise in the image feature (an error in fitting the feature’s parameters), noise in the spatial structure (deformation of the drogue’s frame), and different relative distances between the camera and the target.
The world coordinate system was established with the inner circle’s center as the origin and the z-axis perpendicular to the plane of the inner circle and facing away from the camera.
The parameters of the target were set according to measurements of the actual drogue. The radius of the inner circle was r = 125.0 mm, with center coordinates (0, 0, 0); the radius of the circle O1 was r1 = 385.0 mm, with center coordinates (0, 0, 210); the radius of the circle O2 was r2 = 320.0 mm, with center coordinates (0, 0, 310). Only O2 was used in the duality elimination simulations.
1. Elimination of Duality
  • Noise in the Image Feature
The z-axis coordinate of the target in the camera coordinate system was 5 m, and the x-axis coordinate, y-axis coordinate, heading, pitch, and roll were set to random values within the ranges of ±1.5 m, ±1.5 m, ±30°, ±30°, and ±15°, respectively. We added white Gaussian noise with a mean of 0 and a variance of 10 to the center coordinates of O2, and noise with a mean of 0 and a variance of 20 to r2. The noise in the image features was set as white Gaussian noise with a mean of 0 and a variance ranging from 1 to 6 in steps of 1. According to the above simulation conditions, images of the target were generated, and the features in the images with the noise disturbance were extracted to solve for the position and the normal vector. Each test generated 2000 simulated images to run the algorithm, and the results of each simulation condition are the average of 10 tests. The results are shown in Figure 3.
  • Noise in the Spatial Structure
The setting of the target's pose in the camera coordinate system was the same as above. We added white Gaussian noise with a mean of 0 and a variance of 2 to the image features. This time, the noise in the coordinates of O2 was set as white Gaussian noise with a mean of 0 and a variance ranging from 0 to 40 in steps of 5. The noise in r2 was twice the noise in O2. Under the above conditions, each test generated 2000 simulated images to run the algorithm, and the results of each simulation condition are the average of 10 tests. The results are shown in Figure 3b.
  • Different z-axis Coordinates of Targets
We added white Gaussian noise with a mean of 0 and a variance of 2 to the image features, noise with a mean of 0 and a variance of 20 to the center coordinates of O 2 , and noise with a mean of 0 and a variance of 40 to r 2 . The x-axis coordinate, y-axis coordinate, and attitude were set as above, while the z-axis coordinate ranged from 4 to 10 m in intervals of 1 m. Under these conditions, each test generated 2000 simulated images to run the algorithm, and the result for each simulation condition is the average of 10 tests. The results are shown in Figure 3c.
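The noise-injection and averaging protocol above can be sketched as a small Monte Carlo harness. The function names and the trial callback are our assumptions for illustration; note that the paper specifies noise by its variance, so the standard deviation passed to the generator is the square root of that value.

```python
import numpy as np

def perturb_structure(center, radius, center_var, radius_var, rng):
    """Add zero-mean white Gaussian noise to a coaxial circle's center
    coordinates and radius; noise levels are given as variances (as in
    the paper), so the standard deviation is sqrt(variance)."""
    noisy_center = np.asarray(center, float) + rng.normal(0.0, np.sqrt(center_var), size=3)
    noisy_radius = radius + rng.normal(0.0, np.sqrt(radius_var))
    return noisy_center, noisy_radius

def success_rate(run_trial, n_trials=2000, n_tests=10, seed=0):
    """Average success rate over n_tests runs of n_trials simulated
    images each, mirroring the averaging protocol described above.
    run_trial(rng) should return True on successful duality elimination."""
    rng = np.random.default_rng(seed)
    rates = []
    for _ in range(n_tests):
        successes = sum(bool(run_trial(rng)) for _ in range(n_trials))
        rates.append(successes / n_trials)
    return float(np.mean(rates))
```

For example, the "variance of 20 on O2, variance of 40 on r2" condition corresponds to `perturb_structure(o2_center, r2, 20.0, 40.0, rng)` inside the trial callback.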
To verify the effectiveness of our algorithm and its applicability to drogues for aerial refueling, we selected the algorithms in References [18,20] for comparison; their curves are marked PC1 and PC2, respectively, in Figure 3.
The results in Figure 3a show that the proposed duality elimination algorithm has good resistance to noise in the image features and performs well even under large disturbances: when the variance is 5, the success rate remains higher than 90%. The algorithm in Reference [20] is also robust to image feature noise, but its success rate is about 3% lower than ours.
The results in Figure 3b demonstrate that the proposed algorithm is highly robust to noise in the spatial structure. Under noise with a variance of 40, the algorithm in Reference [18] almost fails, whereas our success rate remains higher than 98.5%, a clear advantage over the 78.20% of the algorithm in Reference [20].
This shows that the performance of our algorithm remains consistent as the radii of the coaxial circles and the distance between the planes change with the opening and closing of the drogue’s stabilizing umbrella during aerial refueling.
It is worth noting that there is a difference of about 30% between the results of the different simulations using the algorithms in References [18,20], which indicates that these two algorithms are sensitive to the pose of the target.
The results in Figure 3c show that our algorithm remains effective for target z-axis coordinates up to 10 m. Moreover, it obtains a success rate about 6% higher than that of the algorithm in [20] under the same conditions.
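The core idea behind the duality elimination compared above is that the true normals of coaxial circles coincide while the false (dual) solutions generally do not. A minimal sketch of selecting between the two dual candidates on that basis follows; the candidate representation and the angle-based score are our illustrative assumptions, not the paper's exact two-level procedure.

```python
import numpy as np

def eliminate_duality(inner_candidates, coaxial_candidates):
    """Pick the candidate normal of the inner circle that is most
    consistent with some candidate normal of a coaxial circle.
    Each argument is a list of two unit 3-vectors (the dual solutions
    from single-circle pose estimation). Returns (normal, index)."""
    best, best_idx, best_angle = None, -1, np.inf
    for i, n_inner in enumerate(inner_candidates):
        for n_coax in coaxial_candidates:
            # sign of a plane normal is ambiguous, so compare |cos(angle)|
            c = np.clip(abs(np.dot(n_inner, n_coax)), 0.0, 1.0)
            angle = np.arccos(c)
            if angle < best_angle:
                best_angle, best, best_idx = angle, n_inner, i
    return best, best_idx
```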
2. Accuracy of the Normal Vector
We also evaluated the accuracy of the normal vector solved by the fusion algorithm for multiple circular features under the same simulation conditions as above and compared it with that of the other algorithms. The results are shown in Figure 4 and Figure 5.
Comparing the curves marked OURS-1, OURS-2, OURS-3, and OC in Figure 4a–c shows that fusing multiple coaxial circles effectively improves the accuracy of the normal vector. The mean values of the curves for each simulation are listed in Table 1. The curves OC and OURS-3 show that the fusion algorithm using both O1 and O2 clearly improves the accuracy of the target’s normal vector: compared with the result that only eliminates duality by O1 and O2, the angle errors in the simulations with image feature noise, spatial structure noise, and different z-axis coordinates are reduced by 18.4%, 25.6%, and 15.4%, respectively. These results again show that our algorithm is robust against noise in the image features and the spatial structure and remains effective for target z-axis coordinates up to at least 10 m. Fusion with a single coaxial circle, O1 or O2 (marked OURS-1 and OURS-2, respectively), also improves the accuracy, but using two coaxial circles is clearly better than using one. Moreover, the results for O1 and O2 differ in the single-circle case; this is discussed in the next section.
Notice that the OC curve lies below the OURS-1 curve in Figure 4c when the z-axis coordinate of the target is greater than 8 m. This is because the success rate of duality elimination with two coaxial circles is higher than with one, and the pose estimation algorithm in Section 2.1 provides a highly accurate solution when the image feature noise is small.
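The fusion step evaluated above can be illustrated with a simple stand-in: averaging the per-circle normal estimates (optionally weighted) and renormalizing. This sketch is our assumption for illustration, not the paper's optimization; it only conveys why combining O1 and O2 reduces the angle error.

```python
import numpy as np

def fuse_normals(normals, weights=None):
    """Fuse normal-vector estimates from several coaxial circles by a
    (weighted) average followed by renormalization -- a simple stand-in
    for the paper's fusion/optimization of the inner circle's normal."""
    N = np.asarray(normals, dtype=float)
    w = np.ones(len(N)) if weights is None else np.asarray(weights, float)
    # flip any estimate that points away from the first one, so the
    # average is not cancelled by the sign ambiguity of plane normals
    N = np.where((N @ N[0])[:, None] < 0, -N, N)
    fused = (w[:, None] * N).sum(axis=0)
    return fused / np.linalg.norm(fused)
```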
3. Influence of the Spatial Structure on the Algorithm’s Performance
The striking result that emerges from Figure 4 is that the curves of OURS-1 and OURS-2 differ considerably. To investigate the influence of different spatial structures of the coaxial circles on the algorithm, further simulations were carried out.
White Gaussian noise with a mean of 0 and a variance of 2 was added to the image features, noise with a mean of 0 and a variance of 20 to the center coordinates of O 1 , and noise with a mean of 0 and a variance of 30 to r 1 . The x-axis coordinate, y-axis coordinate, and attitude were set as above.
Under the above simulation conditions, comparative simulations with different z-axis coordinates of the target (5 m, 7 m, 9 m), different radii of the coaxial circles (150 mm, 320 mm, 490 mm), and different center z-coordinates (within ± 500 mm in intervals of 50 mm) were carried out. Each test generated 2000 simulated images to run the algorithm, and the result for each simulation condition is the average of 10 tests. The results are shown in Figure 6, Figure 7, Figure 8.
The relationship between the success rate and the target’s spatial structure is presented in Figure 6a, Figure 7a, Figure 8a, while the relationship between the accuracy of the normal vector and the target’s spatial structure is presented in Figure 6b, Figure 7b, Figure 8b. All the curves show that the smaller the absolute value of the coaxial circle’s z-axis coordinate in the world coordinate system, the worse the performance of the algorithm. Moreover, the curves are not symmetrical about zero: when the absolute values of the z-axis coordinates of a coaxial circle with the same radius are equal, the negative coordinate gives better performance than the positive one, e.g., the points with abscissas of −100 and 100 on the curve marked r-490 in Figure 6, Figure 7, Figure 8.
More interestingly, Figure 6, Figure 7, Figure 8 reveal that the relationship between the performance of the proposed algorithm and the radius of the coaxial circle is opposite for positive and negative z-axis coordinates of the coaxial circle in the world coordinate system. When the z-axis coordinate is negative, the coaxial circle with a radius of 490 mm (r-490) achieves a better success rate and a more accurate normal vector; in contrast, the coaxial circle with a radius of 150 mm (r-150) performs better when the z-axis coordinate is positive.
Overall, the results reveal a relationship between the performance of the proposed algorithm and the spatial structure of the target, and the performance is not consistent across different z-axis coordinates of the drogue in the camera coordinate system. Coaxial circles with a reasonable structure can greatly improve the performance of the algorithm. Comparing the curves in the figures leads to the following conclusions:
  • The farther away the plane of the coaxial circle is from the plane of the inner circle, the better the performance of the algorithm is. It would be a good choice to select a circular feature in the plane that is more than twice the radius of the inner circle away from the plane of the inner circle. (When the two planes are very close, the false solutions of the two circles are also very similar, which leads to a reduction in the success rate of duality elimination.)
  • When the distance between the plane of the coaxial circle and the plane of the inner circle is the same, the closer the coaxial circle plane is to the camera, the better the performance of the algorithm is.
  • When the plane of the coaxial circle is closer to the camera than the inner circle, the larger the radius of the coaxial circle is, the better the performance of the algorithm is.
  • When the plane of the coaxial circle is farther away from the camera than the inner circle, the smaller the radius of the coaxial circle is, the better the performance of the algorithm is.
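The four guidelines above can be encoded as a rough ranking heuristic for choosing which coaxial circle to use. The scoring function and its weights are our illustrative assumptions, not a criterion from the paper; the frame convention follows the experiments, where the target's z-axis points away from the camera (so a negative z offset means the circle's plane is nearer the camera).

```python
def rank_coaxial_circles(circles, inner_radius):
    """Rank candidate coaxial circles best-first by a heuristic score
    encoding the four guidelines. circles: list of (z_mm, radius_mm)
    in the target frame, z pointing away from the camera."""
    def score(c):
        z, r = c
        # rule 1: prefer plane separation above twice the inner radius
        s = min(abs(z) / (2.0 * inner_radius), 1.0)
        # rule 2: a plane nearer the camera (z < 0) is preferable
        s += 0.5 if z < 0 else 0.0
        # rules 3-4: larger radius helps on the near side, smaller on the far side
        s += 0.25 * (r if z < 0 else -r) / inner_radius
        return s
    return sorted(circles, key=score, reverse=True)
```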

5.3. Experiments on the Drogue Model

The proposed algorithm was tested on a sequence of 49 images captured by the Point Grey BFLY-U3-23S6C-C camera shown in Figure 9a with the drogue model shown in Figure 9b. The image resolution was set to 960 × 600 pixels. The camera intrinsics were calibrated as fu = 1343.44, fv = 1347.53, with principal point u0 = 493.53, v0 = 289.09.
The drogue model was made according to a real drogue, in which the diameter of the inner circle marked with a red solid line is 250 mm. The origin of the world coordinate system of the target was located at the center of the inner circle, and the z-axis was perpendicular to the plane and faced away from the camera.
The centers of the other coaxial circles were O p = ( 0 , 0 , 210 ) and O q = ( 0 , 0 , 310 ) . To simulate the deformation of the drogue during aerial refueling, the images of the drogue model were captured with some stretching and compression.
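With the intrinsics calibrated above, image features of the model can be related to 3D points through the standard pinhole model. The following sketch projects camera-frame points with those calibrated parameters (no lens distortion is modeled here, which is our simplifying assumption):

```python
import numpy as np

# Calibrated intrinsics from the experiment (pixel units)
K = np.array([[1343.44,    0.0, 493.53],
              [   0.0, 1347.53, 289.09],
              [   0.0,    0.0,    1.0]])

def project(points_cam, K):
    """Project 3D points given in the camera frame (z > 0) to pixel
    coordinates using the pinhole model u = fu*x/z + u0, v = fv*y/z + v0."""
    P = np.asarray(points_cam, dtype=float)
    uv = (K @ P.T).T               # homogeneous image coordinates
    return uv[:, :2] / uv[:, 2:3]  # perspective division
```

A point on the optical axis, e.g. `project([[0, 0, 5000]], K)`, maps to the principal point (493.53, 289.09).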
The ground truth corresponding to each frame was obtained by fusing and optimizing the pose of the target with a chessboard in multiple images according to the method described in [13].
Experiments were carried out with circle O p , circle O q , and both together, marked O1, O2, and O3, respectively. The success rates of duality elimination are presented in Table 2, which shows that our duality elimination algorithm is highly practical: it succeeded on every image captured in the experiment.
The angle error of the normal vector is shown in Figure 10, where the curve marked OC gives the normal vector’s angle error when the duality is eliminated by both O p and O q without fusion. The mean values of the curves O1, O2, O3, and OC are 0.11°, 0.10°, 0.08°, and 0.15°, respectively.
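The angle error reported here is the angle between the estimated normal and the ground-truth normal; a straightforward way to compute it (sign of the normal ignored, since a plane normal is defined up to sign) is:

```python
import numpy as np

def normal_angle_error_deg(n_est, n_true):
    """Angle between an estimated and a ground-truth plane normal,
    in degrees, ignoring the sign ambiguity of the normals."""
    c = abs(np.dot(n_est, n_true)) / (np.linalg.norm(n_est) * np.linalg.norm(n_true))
    return float(np.degrees(np.arccos(np.clip(c, 0.0, 1.0))))
```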
Furthermore, to test the computational efficiency of the algorithm, 20 to 60 points, in intervals of 10, were selected to recover the spatial structures using the method in Section 4.1, and the case of one coaxial circle was compared with that of two coaxial circles.
The relationship between the computation time and the number of points used for the recovery of the spatial structures is presented in Table 3; the computation time grows with the number of points. In the experiments corresponding to Table 2 and Figure 10, 30 points were selected for the spatial structure recovery of each circle; the corresponding computation time (the 30-point row of Table 3) is 1.2 ms for one coaxial circle and 2.1 ms for two. This result shows that our algorithm achieves the desired real-time performance.
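Timings like those in Table 3 can be reproduced with a simple averaging harness; the function name and repeat count below are our choices for illustration.

```python
import time

def mean_runtime_ms(fn, *args, repeats=200):
    """Average wall-clock runtime of fn(*args) in milliseconds over
    several repeats, in the style of the Table 3 measurements."""
    t0 = time.perf_counter()
    for _ in range(repeats):
        fn(*args)
    return 1000.0 * (time.perf_counter() - t0) / repeats
```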
Overall, these results suggest that our proposed algorithm is capable of solving the problem of estimating the pose of the drogue, which means that it can effectively eliminate duality and simultaneously improve the accuracy of the target’s normal vector in real time.

6. Conclusions

In summary, we have proposed a monocular visual position and attitude estimation method for a drogue based on coaxial constraints, which effectively eliminates the duality in pose estimation from a single circular feature and greatly improves the accuracy of the optimized normal vector. The effectiveness and robustness of the method were verified by simulations and experiments, which also establish a basis for engineering applications. To our knowledge, the relationship between the algorithm’s performance and the target’s spatial structure had not been explored before.

Author Contributions

K.Z. contributed to the development of the methodology, performed the experiments, and wrote the paper; Y.S. contributed to the project’s administration, provided supervision, and reviewed this paper; and Y.Z. and H.L. collected and preprocessed the data for the experiments. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China, grant number 61533008.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhang, J.; Zhen, L. Tracking and position of drogue for autonomous aerial refueling. In Proceedings of the 2018 IEEE 3rd Optoelectronics Global Conference (OGC), Shenzhen, China, 4–7 September 2018; pp. 171–175.
  2. Latimer-Needham, C.H. Apparatus for Aircraft-Refueling in Flight and Aircraft-Towing. U.S. Patent Application No. 2,716,527, 30 August 1955.
  3. Clifford, J.L. Aircraft Interconnecting Mechanism. U.S. Patent Application No. 2,663,523, 12 December 1953.
  4. Maa, Y.; Zhao, R.; Liu, E.; Zhang, Z.; Yan, K. A novel autonomous aerial refueling drogue detection and pose estimation method based on monocular vision. Measurement 2019, 136, 132–142.
  5. Valasek, J.; Gunnam, K.; Kimmett, J.; Tandale, M.D.; Junkins, J.L.; Hughes, D. Vision-based sensor and navigation system for autonomous air refueling. J. Guid. Control Dyn. 2005, 28, 979–989.
  6. Martinez, C.; Richardson, T.; Thomas, P.; du Bois, J.L.; Campoy, P. A vision-based strategy for autonomous aerial refueling tasks. Robot. Auton. Syst. 2013, 61, 876–895.
  7. Lin, S.; Garratt, M.A.; Lambert, A.J. Monocular vision-based real-time target recognition and tracking for autonomously landing an UAV in a cluttered shipboard environment. Auton. Robot. 2017, 41, 881–901.
  8. Yang, Y.; Shen, Q.; Li, J.; Deng, Z.; Wang, H.; Gao, X. Position and Attitude Estimation Method Integrating Visual Odometer and GPS. Sensors 2020, 20, 2121.
  9. Lepetit, V.; Moreno-Noguer, F.; Fua, P. EPnP: An Accurate O(n) Solution to the PnP Problem. Int. J. Comput. Vis. 2009, 81, 155–166.
  10. Hesch, J.A.; Roumeliotis, S.I. A Direct Least-Squares (DLS) Method for PnP. In Proceedings of the IEEE International Conference on Computer Vision, Barcelona, Spain, 6–13 November 2011; pp. 383–390.
  11. Kneip, L.; Li, H.; Seo, Y. UPnP: An Optimal O(n) Solution to the Absolute Pose Problem with Universal Applicability. In Computer Vision—ECCV 2014; Fleet, D., Pajdla, T., Schiele, B., Tuytelaars, T., Eds.; Springer: Cham, Switzerland, 2014; Volume 8689, pp. 127–142.
  12. Ferraz, L.; Binefa, X.; Moreno-Noguer, F. Very Fast Solution to the PnP Problem with Algebraic Outlier Rejection. In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 501–508.
  13. Sun, Y.; Huang, B.; Yang, B.; Yao, S. Vision Calibration Method of Spatial Point Position Under Multi-View Scene Determination. J. Nanjing Univ. Aeronaut. Astronaut. 2015, 47, 343–347.
  14. Barone, F.; Marrazzo, M.; Oton, C.J. Camera Calibration with Weighted Direct Linear Transformation and Anisotropic Uncertainties of Image Control Points. Sensors 2020, 20, 1175.
  15. Qin, T.; Li, P.; Shen, S. VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator. IEEE Trans. Robot. 2018, 34, 1004–1020.
  16. Zhao, J.; Hu, Y.; Tian, M. Pose Estimation of Excavator Manipulator Based on Monocular Vision Marker System. Sensors 2021, 21, 4478.
  17. Huang, B.; Sun, Y.R.; Zeng, Q.H. Real-time drogue detection and template tracking strategy for autonomous aerial refueling. J. Real-Time Image Process. 2020, 17, 437–446.
  18. Miao, X.; Zhu, F.; Ding, Q.; Hao, Y.; Wu, Q.; Xia, R. Monocular vision pose measurement based on docking ring component. Guangxue Xuebao/Acta Opt. Sin. 2013, 33, 123–131.
  19. Lu, J.; Xia, M.; Li, W.; Guo, W.-P.; Yang, K.-C. 3-D location estimation of underwater circular features by monocular vision. Optik 2013, 124, 6444–6449.
  20. Zhang, L.M.; Zhu, F.; Hao, Y.M.; Wang, M.Y. Pose measurement based on a circle and a non-coplanar feature point. Guangzi Xuebao/Acta Photonica Sin. 2015, 44, 112002.
  21. Zhang, L.; Huang, X.; Feng, W.; Hu, T.; Liang, S. Solution of duality in pose estimation of a single circle utilizing constraint angles motion reconstruction. Guangxue Xuebao/Acta Opt. Sin. 2016, 36, 115002.
  22. Gurdjos, P.; Sturm, P.; Wu, Y. Euclidean structure from N ≥ 2 parallel circles: Theory and algorithms. In Computer Vision—ECCV 2006; Leonardis, A., Bischof, H., Pinz, A., Eds.; Springer: Berlin/Heidelberg, Germany, 2006; Volume 3951, pp. 238–252.
  23. Huang, B.; Sun, Y.; Zeng, Q. General fusion frame of circles and points in vision pose estimation. Optik 2018, 154, 47–57.
  24. Huang, H.; Zhang, H.; Cheung, Y.-m. The Common Self-polar Triangle of Concentric Circles and Its Application to Camera Calibration. In Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 4065–4072.
  25. Zhao, Z.; Liu, Y. Applications of projected circle centers in camera calibration. Mach. Vis. Appl. 2010, 21, 301–307.
  26. Ma, W.; Sun, S.; Song, J.; Li, W. Circle Pose Estimation Based on Stereo Vision. Appl. Mech. Mater. 2012, 263–266, 2408–2413.
  27. Liu, Z.Y.; Liu, X.; Duan, G.F.; Tan, J.R. Precise pose and radius estimation of circular target based on binocular vision. Meas. Sci. Technol. 2019, 30, 14.
  28. Huang, B.; Sun, Y.; Zhu, Y.; Xiong, Z.; Liu, J. Vision pose estimation from planar dual circles in a single image. Optik 2016, 127, 4275–4280.
  29. Huang, B. Research on Close-range High-accuracy Vision Navigation Technology for UAV Aerial Refueling; Nanjing University of Aeronautics and Astronautics: Nanjing, China, 2019.
Figure 1. A drogue for aerial refueling. (a) The drogue during aerial refueling, seen from the perspective of the receiver; (b) a side view of the drogue model.
Figure 2. The spatial relationship of the point and circle corresponding to the duality.
Figure 3. The success rate of the algorithms under different simulation conditions. The curves corresponding to the algorithms in References [18,20] are marked PC1 and PC2. The coordinates of the two points were set to (300, 300, 0) and (150, 150, −500), and Gaussian noise with a mean of 0 and a variance of 5 was added to their positions. (a) The success rate under different levels of noise in the image; (b) the success rate under different levels of noise in the spatial structure; (c) the success rate under different z-axis coordinates of the targets.
Figure 4. The angle error of the normal vector under different simulation conditions. The normal vector that only eliminates duality by O 1 and O 2 is marked OC, and the normal vectors solved by the fusion algorithm with O 1 , O 2 , and both O 1 and O 2 are marked OURS-1, OURS-2, and OURS-3, respectively. (a) The angle error under different levels of noise in the image features; (b) the angle error under different levels of noise in the spatial structure; (c) the angle error under different z-axis coordinates of the targets.
Figure 5. Examples of the simulations for the angle error of the normal vector. (a) The angle error when the variance of the image feature noise is 2; (b) the angle error when the variance of the spatial structure noise is 20; (c) the angle error at a z-axis coordinate of 6 m.
Figure 6. The relationship between the performance of the proposed algorithm and the target’s spatial structure when the z-axis coordinate of the target is 5 m. The curves for the different radii are marked r-150, r-320, and r-490. (a) The success rate; (b) the angle error of the normal vector.
Figure 7. The relationship between the performance of the proposed algorithm and the target’s spatial structure when the z-axis coordinate of the target is 7 m. (a) The success rate; (b) the angle error of the normal vector.
Figure 8. The relationship between the performance of the proposed algorithm and the target’s spatial structure when the z-axis coordinate of the drogue is 9 m. (a) The success rate; (b) the angle error of the normal vector.
Figure 9. The experimental equipment. (a) The camera and lens used for image capture; (b) the drogue model.
Figure 10. The angle error of the normal vector for the captured images.
Table 1. The mean angle error of the normal vector for the curves under the different simulation conditions.

Curves    Simulation (a)    Simulation (b)    Simulation (c)
Ours-1    0.13              0.13              0.15
Ours-2    0.11              0.12              0.12
Ours-3    0.07              0.08              0.08
OC        0.17              0.19              0.20
Table 2. The success rate of the algorithm with different coaxial circles.

Circles    Success Rate (%)
O1         100
O2         100
O3         100
Table 3. The relationship between the computation time and the number of points used for spatial structure recovery.

Number of Points per Circle    Computation Time, 1 Coaxial Circle (ms)    Computation Time, 2 Coaxial Circles (ms)
20                             0.9                                        1.3
30                             1.2                                        2.1
40                             1.4                                        2.6
50                             1.6                                        3.1
60                             1.9                                        3.8