Article

Improved Uncalibrated Visual Servo Strategy for Hyper-Redundant Manipulators in On-Orbit Automatic Assembly

1
CAS Key Laboratory of On-orbit Manufacturing and Integration for Space Optics System, Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130000, China
2
University of Chinese Academy of Sciences, Beijing 100049, China
3
Center of Materials Science and Optoelectronics Engineering, University of Chinese Academy of Sciences, Beijing 100049, China
*
Author to whom correspondence should be addressed.
Appl. Sci. 2020, 10(19), 6968; https://doi.org/10.3390/app10196968
Submission received: 27 August 2020 / Revised: 30 September 2020 / Accepted: 30 September 2020 / Published: 5 October 2020
(This article belongs to the Section Robotics and Automation)

Featured Application

An uncalibrated visual servo strategy for the on-orbit automatic assembly of an ultra-high-caliber optical system. It is mainly employed to realize the static positioning and dynamic tracking of the desired image features in a harsh space environment when the desired image features required for assembly are known.

Abstract

Aiming at on-orbit automatic assembly, an improved uncalibrated visual servo strategy for hyper-redundant manipulators based on projective homography is proposed. This strategy uses an improved homography-based task function with lower dimensions while maintaining its robustness to image defects and noise. This not only improves the real-time performance but also makes the joint space of the hyper-redundant manipulator redundant with respect to the homography-based task function. Considering that the kinematic parameters of the manipulator are easily changed in a space environment, the total Jacobian between the task function and the manipulator joints is estimated online and used to construct a controller that directly controls the manipulator joints without a kinematics model. When designing the controller, the above-mentioned redundancy is exploited to prevent the joint angles of the manipulator from exceeding their limits. The KF-SVSF method, which combines the optimality of the Kalman filter (KF) and the robustness of the smooth variable structure filter (SVSF), is introduced to the field of uncalibrated visual servoing for the first time to perform the online estimation of the total Jacobian. In addition, the singular value filtering (SVF) method is adopted for the first time to avoid the interference caused by the unstable condition number of the estimated total Jacobian. Finally, simulations and experiments verify the superiority of this strategy in terms of its real-time performance and precision.

1. Introduction

Operating an ultra-high-caliber optical system in high space orbit is one of the key goals of the Chinese space industry and will help scientists to make more detailed observations of the universe [1]. However, it is impossible to transport such a system to space as a whole or to install such a large space system manually in space. Large space structure construction through on-orbit services (OOS) may be the best solution [2,3,4,5,6,7,8,9,10,11,12,13]. OOS missions also include visual inspection, refueling and satellite repair, and are of increasing interest to the space industry because of their high economic potential and strategic benefits. There are many interesting challenges in the development and operation of space robots for OOS missions, such as the motion estimation and prediction of target objects, multi-physics modeling and simulation of space manipulators, practical dynamic parameter identification, dexterous and efficient manipulation, time delay and sensor errors in feedback controls, compliance and intelligent controls, and multi-arm coordinated controls and operations, among others [5,6,7]. Visual servoing is also an important technology for manipulators in OOS missions [8,9,10,11,12,13]. It uses the image information captured by a camera to position or track space manipulators. This paper mainly considers an uncalibrated visual servo strategy for a hyper-redundant manipulator with a hand–eye system to realize the alignment of both static and dynamic assembly objects when the desired image features are known. Joint flexibility is a problem in space manipulators that may cause vibration; it is ignored in the research presented in this paper, and details of the joint flexibility problem can be found in [14].
The traditional visual servo relies on the accurate calibration of the internal camera parameters and the hand–eye relationship and determines the pose of the manipulator based on the current image information and calibrated system parameters [13,15,16,17,18]. Various parameters of a robotic vision system are likely to change in the complex and harsh space environment, which will have a serious impact on the accuracy of the visual servo. Moreover, frequent system recalibration will greatly increase the cost and workload, and its feasibility is quite low [19,20,21,22]. For these reasons, research into uncalibrated visual servos is very significant.
The term “uncalibrated” mainly means that the internal and external parameters of a camera are unknown. Compared with the position-based visual servo and hybrid visual servo, which both require 3-D reconstruction and are highly sensitive to calibration error, an image-based visual servo is more applicable to this uncalibrated situation [21,22]. An image-based uncalibrated visual servo (IBUVS) uses image information such as the pixel coordinates of target feature points to construct task functions directly and design a controller based on the image Jacobian, which is estimated online in each control period [22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38].
In actual application, some image features may be missing owing to motion blur, inconspicuous background contrast and occlusion of the visual field, causing the visual servo task to fail. In addition, image transmission noise and the resulting image processing errors also impact the accuracy of visual servos. To solve these problems, researchers have proposed an uncalibrated visual servo based on projective homography (PHUVS) [23]. This method solves the homography matrix by matching as many feature points as possible and uses the elements of the homography matrix to construct the task function. The homography matrix can be solved as long as there are no fewer than four pairs of matched feature points; therefore, the visual servo will not be interrupted even if some feature points are lost. In addition, the PHUVS has a certain compensatory effect on image noise: as the number of matched feature points increases, this compensation becomes more obvious and the system shows better robustness to image noise [39]. It is worth mentioning that the dimensions of the homography matrix and task function remain constant regardless of the number of matched feature points, so the real-time performance of the system is not affected. However, the original homography-based task function fails to take full advantage of the eight degrees of freedom of the homography matrix. In addition to further improving the real-time performance, this property is of great significance to the application of the PHUVS to hyper-redundant manipulators.
The online estimation of the image Jacobian is another pivotal point of an uncalibrated visual servo. Classic image Jacobian estimation methods include the dynamic Broyden–Gauss–Newton method [37], Broyden–RLS method [29], group-based Broyden method [38], Kalman filter (KF) method [23,26,27,36], and particle filter [33,38]. Hao [22] showed that the KF method performs better than other methods when tracking targets with changing directions of motion. Music [29] also compared these methods and showed that the particle filter is slightly more accurate when the noise level is high, but the Kalman filter has better real-time and dynamic performance for mechanical systems. A limitation of Kalman filtering is that its accuracy may decrease or even diverge when there is modeling uncertainty and the statistical characteristics of noise are unknown [31,32]. To solve this problem, researchers have proposed several methods such as the fuzzy adaptive Kalman filter [31,32], Sage–Husa adaptive Kalman filter and Kalman neural network [34], but the performances of these methods are not always stable. A recently proposed variational Bayesian-based adaptive Kalman filter can improve accuracy to a certain extent, but its process is so cumbersome that it takes more than 20 times as much calculation time as the Kalman filter [40].
Considering the complexity of on-orbit maintenance tasks and the need for flexibility, a redundant manipulator with more degrees of freedom is a reasonable choice [41,42]. Redundancy is often used in space manipulators; for example, to minimize the reactions transferred from the manipulator to the spacecraft [43] or to minimize the kinetic energy and joint torques [44]. Therefore, a hyper-redundant manipulator with nine degrees of freedom has been designed to implement on-orbit maintenance tasks for large space facilities. However, its kinematic parameters are affected by factors such as thermal expansion and contraction, corrosion, etc., and often change. It is almost impossible to recalibrate a hyper-redundant manipulator with a complex structure when its parameters change after working in the harsh space environment for a long time. Meanwhile, a controller that is constructed with the estimated total Jacobian and directly controls the manipulator joints is more accurate and robust to changes in the manipulator parameters. In addition, the estimated total Jacobian of the homography-based task function tends to present an unstable condition number. This results in a sudden increase of the manipulator velocity in some directions, which causes the manipulator to shake and results in serious errors [45].
On the basis of the above description, an improved uncalibrated visual servo strategy for hyper-redundant manipulators in on-orbit automatic assembly is proposed. In this strategy, an improved task function with robustness to image noise and better real-time performance is constructed; another advantage of this is that the joint space of hyper-redundant manipulators has redundancy. To improve the system accuracy and make the system robust to the variable parameters of manipulator in space, the estimated total Jacobian, which maps the task function and manipulator joints, is used to construct a controller to control the manipulator joints directly without a kinematics model. When designing the controller, the above-mentioned redundancy is exploited to prevent damage to the mechanical structure caused by the over-limiting of the joint angles of the manipulator. Meanwhile, considering that the visual servo system uses an approximate linear model and the measurement noise of the task function is difficult to determine, the KF-smooth variable structure filter (SVSF) method for state space estimation is introduced into the visual servo to perform the online estimation of the total Jacobian. This method combines the KF method and SVSF method and shows better robustness to noise and modeling uncertainty [46,47,48]. In addition, the singular value filtering (SVF) method [45] is adopted to avoid the disturbance caused by the unstable conditional number of the estimated total Jacobian. Finally, this strategy is verified by several simulations and experiments.

2. Projective Homography

The current frame $\mathcal{F}$ and the desired frame $\mathcal{F}^*$ of an eye-in-hand camera are defined for problems with an eye-in-hand system in which the camera moves with the manipulator. Figure 1 shows the geometric relationship between these two camera frames. Suppose $M_i = [X_i, Y_i, Z_i]^T$ is the coordinate of the feature point $P_i$ in $\mathcal{F}$ and $M_i^* = [X_i^*, Y_i^*, Z_i^*]^T$ is its coordinate in the desired camera frame, while $p_i = [u_i\ v_i\ 1]^T$ and $p_i^* = [u_i^*\ v_i^*\ 1]^T$ are the homogeneous pixel coordinates of feature point $P_i$ captured by the current and desired cameras, respectively. The geometric relationship between $M_i$ and $M_i^*$ can be expressed as
$$M_i = R M_i^* + t \tag{1}$$
where $R \in \mathbb{R}^{3\times3}$ and $t \in \mathbb{R}^{3\times1}$ are the orientation matrix and displacement vector between $\mathcal{F}$ and $\mathcal{F}^*$ expressed in $\mathcal{F}$, respectively.
Suppose each target point $P_i$ is on the plane $\pi$, $d^*$ is the distance between the desired projection center and plane $\pi$, and $n^*$ is the normal vector of $\pi$ expressed in the desired camera frame $\mathcal{F}^*$ with length $\|n^*\| = 1/d^*$. Based on Equation (1), we can obtain
$$M_i = (R + t n^{*T}) M_i^* \tag{2}$$
$$G = R + t n^{*T} \tag{3}$$
where $G$ is the Euclidean homography matrix. In normalized coordinates, (2) can be expressed as
$$m_i = \frac{Z_i^*}{Z_i}(R + t n^{*T}) m_i^* \tag{4}$$
where $m_i = [X_i/Z_i, Y_i/Z_i, 1]^T$ and $m_i^* = [X_i^*/Z_i^*, Y_i^*/Z_i^*, 1]^T$. The depth scale is denoted as
$$\alpha_i = \frac{Z_i^*}{Z_i}. \tag{5}$$
The following relationship can be obtained from Equations (2), (3) and (5):
$$p_i = \alpha_i (K G K^{-1}) p_i^* \tag{6}$$
where $K$ is the internal parameter matrix of the eye-in-hand camera [36]. The homography matrix can be obtained as
$$H = K G K^{-1} \tag{7}$$
which refers to the mapping between the images of the same object projected onto two different image planes. It has been applied in many domains such as motion estimation, 3-D reconstruction [49] and of course the visual servo. Even in an uncalibrated situation, this matrix can be estimated up to a scale $\beta$ with more than four corresponding image features in different images via
$$\hat H = \beta H \tag{8}$$
where $\beta$ is independent of the depth ratio $\alpha_i$ in (5).
Because the images are always disturbed by noise during transmission and processing, the exact solution of $H$ cannot be obtained from the noisy measured image points $p_i$. However, it is possible to obtain the optimal estimated homography matrix $\hat H$ in the least-squares sense. The estimated image point of $P_i$ in the current image can be obtained via $\hat p_i = \hat H p_i^*$. The root-mean-squared (RMS) error between $\hat p_i$ and the true points satisfies $\hat\varepsilon \approx \varepsilon (4/n)^{1/2}$ [38], where $n$ is the number of image features and $\varepsilon$ is the RMS error between the measured points $p_i$ and the true points. It is worth noting that the estimated homography matrix $\hat H$ can be equivalently considered as being estimated from the matched points $\hat p_i \leftrightarrow p_i^*$. Obviously, if $n > 4$, building the task function with the elements of the homography matrix will have a certain compensation effect for image noise.
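The least-squares estimation of $\hat H$ from $n \ge 4$ correspondences can be sketched with a standard SVD-based direct linear transformation (a minimal NumPy sketch; `estimate_homography` is our illustrative name, and the normalization described later in Section 3.1 is omitted for brevity):

```python
import numpy as np

def estimate_homography(p_star, p):
    """Least-squares estimate of H (p_i ~ H p_i*) from n >= 4 matched
    pixel coordinates, via SVD of the stacked DLT system A x = 0."""
    A = []
    for (us, vs), (u, v) in zip(p_star, p):
        ps = np.array([us, vs, 1.0])
        A.append(np.concatenate([ps, np.zeros(3), -u * ps]))
        A.append(np.concatenate([np.zeros(3), ps, -v * ps]))
    A = np.asarray(A)                      # shape (2n, 9)
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)               # right singular vector of the
    return H / H[2, 2]                     # smallest singular value

# Synthetic check: project five points through a known homography.
H_true = np.array([[1.0, 0.02, 5.0],
                   [-0.01, 1.0, -3.0],
                   [1e-4, 2e-4, 1.0]])
p_star = np.array([[0, 0], [100, 0], [100, 100], [0, 100], [50, 50]], float)
ph = np.column_stack([p_star, np.ones(len(p_star))]) @ H_true.T
p = ph[:, :2] / ph[:, 2:]
H_est = estimate_homography(p_star, p)     # recovers H_true up to scale
```

With noise-free correspondences the nullspace of $A$ recovers $H$ exactly; with noisy points the same SVD gives the least-squares optimum discussed above.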

3. Improved Strategy for the Uncalibrated Visual Servo of a Hyper-Redundant Space Manipulator

In this section, our improved strategy as shown in Figure 2c will be described in detail. The first part details the new task function and provides related proofs. Then, the more robust Kalman filter–smooth variable structure filter (KF-SVSF) method will be introduced to estimate the total Jacobian. Finally, the estimated total Jacobian, processed via SVF to reduce the disturbance caused by an unstable conditional number, is used to build a controller to control the manipulator joints directly for both static positioning and dynamic tracking while limiting the joint angle.

3.1. Improved Homography-Based Task Function

In the original PHUVS strategy proposed by Zeyu Gong [23], the homography-based task function is defined as
$$\hat h = [\hat H_1\ \hat H_2\ \hat H_3]^T = [\hat h_1\ \hat h_2\ \cdots\ \hat h_9]^T_{9\times1} \tag{9}$$
where $\hat H_i$ is the $i$th row of the estimated homography matrix. The error vector of the system is denoted by
$$e = h_0 - \hat h \tag{10}$$
where $h_0$ is constructed by stacking the rows of the identity matrix $I_{3\times3}$. The purpose of our strategy can be considered to be making the current camera frame $\mathcal{F}$ coincide with $\mathcal{F}^*$. To guarantee a unique projective homography matrix $\hat H$ between the current and desired feature points in each iteration, the constraint $\det(\hat H) = 1$ is imposed. It can be proved that $e = 0$ if and only if the rotation matrix $R = I_{3\times3}$ and the position vector $t = 0$ [23].
In our strategy, the constraint $\hat h_9 = 1$ is defined, where $\hat h_9$ is the last element of the homography matrix, and a direct linear transformation (DLT) is introduced to estimate the homography matrix. First, it is necessary to normalize the two images, respectively, as follows:
  • Translate the feature points as $\tilde p_i = T_1 p_i$ and $\tilde p_i^* = T_1^* p_i^*$ to ensure that the centroid of these feature points is at the origin;
  • Scale the feature points as $\bar p_i = T_2 \tilde p_i$ and $\bar p_i^* = T_2^* \tilde p_i^*$ to make their average distance from the origin equal to $\sqrt{2}$.
On the basis of the properties of homogeneous coordinates, homogeneous simultaneous equations can be constructed with the transformed pixel coordinates as
$$\underbrace{\begin{pmatrix} \bar p_i^{*T} & 0^T & -\bar u_i \bar p_i^{*T} \\ 0^T & \bar p_i^{*T} & -\bar v_i \bar p_i^{*T} \end{pmatrix}}_{A_{2n\times9}} \underbrace{\begin{pmatrix} \bar H_1^T \\ \bar H_2^T \\ \bar H_3^T \end{pmatrix}}_{X} = 0 \tag{11}$$
where $A$ is the coefficient matrix and $\bar H_i$ is the $i$th row of $\bar H$. The purpose of the above normalization operation is to prevent the coefficient matrix $A$ from becoming ill-conditioned owing to image noise. This solves the problem of numerical instability.
The least-squares solution of Equation (11) can be calculated via singular value decomposition. Then, we conduct an inverse transform on $\bar H$ as
$$\tilde H = (T_2 T_1)^{-1} \bar H (T_2^* T_1^*)$$
where $\tilde H$ is the estimated homography between the current and desired feature points. The homography $\tilde H$ can be scaled at any scale.
Thus, $\tilde H$ is scaled as $\hat H = \tilde H / \tilde h_9$ and the new task function is constructed as
$$\hat h = [\hat h_1\ \hat h_2\ \cdots\ \hat h_8]^T_{8\times1}$$
where $\hat h$ reflects the eight degrees of freedom of the homography matrix and further reduces the dimensions of the system. The error vector of the system is defined as
$$e = [1\ 0\ 0\ 0\ 1\ 0\ 0\ 0]^T - [\hat h_1\ \hat h_2\ \cdots\ \hat h_8]^T$$
where $e = 0$ if and only if the rotation matrix $R = I$ and the position vector $t = 0$. Its sufficiency and necessity can be proved as follows.
  • Necessity proof: If $R = I$ and $t = 0$, then $G = I$. On the basis of the similarity of $H$ and $G$, we obtain $H = I$. From Equation (8), we obtain $\hat H = \beta H = \beta I$, so $\hat h_9 = \beta$. The constraint $\hat h_9 = 1$ then gives $\beta = 1$; i.e., $\hat H = I$ and $e = 0$.
  • Sufficiency proof: It is obvious that $\hat H = I$, and hence $H = \frac{1}{\beta} I$, if $e = 0$. From Equation (7), we have
    $$G = K^{-1} H K = \frac{1}{\beta} I.$$
Taking two feature points, the coordinates of these points in $\mathcal{F}$ and $\mathcal{F}^*$ can be described as
$$M_i = \frac{1}{\beta} M_i^*, \qquad M_j = \frac{1}{\beta} M_j^*.$$
Regardless of the frame in which it is expressed, the distance between $P_i$ and $P_j$ is constant, so
$$\|M_i - M_j\| = \|M_i^* - M_j^*\| = \frac{1}{\beta} \|M_i^* - M_j^*\|$$
where the second equality follows from the relations above. This means that $\beta = 1$, which implies $H = I$ and $G = I$. Thus, $R = I$ and $t = 0$.
This new task function takes full advantage of the property that the homography matrix has eight degrees of freedom. This not only reduces the dimensions of the task function but also simplifies the state space of the visual servo system that needs to be estimated online, so the real-time performance of the system is further improved. Meanwhile, it causes the total Jacobian $J_t \in \mathbb{R}^{8\times9}$, which maps the joint space of the hyper-redundant manipulator to the task function, to become redundant. Therefore, this redundancy can be used to accomplish secondary tasks, such as limiting the joint angles, without a manipulator kinematic model during the servo process. Obviously, this improves the robustness of the system to changes in the kinematic parameters of the manipulator.
Remark 1.
A flaw of the inhomogeneous solution of the homography matrix is that if the true value of $h_9$ is 0 or very close to 0, the estimated homography matrix becomes very unstable. In fact, this situation rarely occurs in the visual servo. In static positioning tasks, the task function generally changes close to the desired gradient direction; as long as $\hat h_9$ is not equal to 0 at the initial position, its value will not become 0 during the servo process. In tracking tasks, the task function always remains close to the desired value.
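The scaling and error-vector construction above can be sketched as follows (a minimal NumPy sketch; `task_function` and `error_vector` are our illustrative names, not from the paper):

```python
import numpy as np

def task_function(H_tilde):
    """Scale the estimated homography so that its last element is 1 and
    stack its first eight elements as the task function."""
    H_hat = H_tilde / H_tilde[2, 2]
    return H_hat.flatten()[:8]

def error_vector(h_hat):
    """e = h0 - h_hat, where h0 stacks the first eight elements of I3."""
    h0 = np.eye(3).flatten()[:8]
    return h0 - h_hat

# At the desired pose the homography equals the identity up to scale,
# so the error vector vanishes for any nonzero scale beta.
e = error_vector(task_function(3.7 * np.eye(3)))
```

Any homography proportional to the identity yields $e = 0$, which is exactly the if-and-only-if property proved above.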

3.2. Online Estimation of Total Jacobian

The kinematic parameters of the space manipulator change easily in the harsh space environment. To avoid the repeated calibration of the complex hyper-redundant manipulator, the total Jacobian between the task function and the joints of the manipulator is used to construct the controller. The expression of the total Jacobian is
$$J_t = \begin{bmatrix} \partial\hat h_1/\partial q_1 & \cdots & \partial\hat h_1/\partial q_9 \\ \vdots & \ddots & \vdots \\ \partial\hat h_8/\partial q_1 & \cdots & \partial\hat h_8/\partial q_9 \end{bmatrix}_{8\times9}$$
Therefore, the accuracy of the estimation of the total Jacobian is directly related to the system performance.
The uncalibrated visual servo system of the nine degree-of-freedom (9 DOF) manipulator is modeled as
$$X_{K+1} = X_K + W_K$$
$$Z_{K+1} = C_{K+1} X_{K+1} + V_{K+1}$$
where $X_K = [(J_{t1})_K\ (J_{t2})_K\ \cdots\ (J_{t8})_K]^T$ and $J_{ti}$ is the $i$th row of the estimated total Jacobian $J_t$; $C_K = \mathrm{diag}(\dot q_K^T, \ldots, \dot q_K^T)_{8\times72}$ is constructed from the joint velocities of the manipulator; $W_K$ and $V_{K+1}$ are the process and measurement noise, respectively; and $Z_K$ can be considered as the rate of change of the task function $\hat h$.
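The structure of this linear measurement model can be checked numerically: stacking the rows of $J_t$ into $X$ and repeating $\dot q^T$ along the block diagonal of $C$ reproduces $\dot{\hat h} = J_t \dot q$. A small NumPy sketch (all names and values are illustrative):

```python
import numpy as np
np.random.seed(0)

J_t = np.random.randn(8, 9)        # stand-in for the true total Jacobian
q_dot = np.random.randn(9)         # joint velocities of the 9-DOF arm

X = J_t.flatten()                  # state: stacked rows of J_t, shape (72,)
# C: block-diagonal with q_dot^T repeated on 8 blocks, shape (8, 72)
C = np.kron(np.eye(8), q_dot)

Z = C @ X                          # predicted task-function rate
```

Row $i$ of $C$ places $\dot q^T$ against the $i$th stored row of $J_t$, so $CX = J_t \dot q$ holds by construction.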
It is worth noting that this model is a linear approximation of the visual servo system, and it causes modeling uncertainty. Meanwhile, because the homography-based task function is used to compensate for the image noise, the statistical characteristics of the measurement noise of the task function become difficult to obtain as well. Therefore, the more robust KF-SVSF method [46,47,48] is introduced into the visual servo to improve the system accuracy.
This method combines the optimality of the KF method with the robustness of the SVSF method. As shown in Figure 3, it introduces the concept of an optimal smoothing variable boundary layer, which is an estimation of the impact of modeling uncertainty and noise. The KF-SVSF method seeks a balance between accuracy and robustness by introducing a saturation limit into the calculated optimal boundary layer. By switching between KF gain and SVSF gain, the optimality of KF is maintained within the limit, while the robustness and stability of SVSF remain outside the limited range.
First, the smooth variable boundary layer can be obtained as follows:
  • Calculate the a priori state estimate:
    $$\hat X_{K+1|K} = \hat X_{K|K}$$
  • Calculate the a priori state error covariance matrix:
    $$P_{K+1|K} = P_{K|K} + Q_K$$
  • Calculate the residual vectors:
    $$\hat Z_{K|K} = C \hat X_{K|K}, \qquad \hat Z_{K+1|K} = C \hat X_{K+1|K}$$
    $$e_{Z,K+1|K} = Z_{K+1} - \hat Z_{K+1|K}, \qquad e_{Z,K|K} = Z_K - \hat Z_{K|K}$$
  • Calculate the smoothing boundary layer $\psi_{K+1}$:
    $$S_{K+1|K} = C P_{K+1|K} C^T + R_{K+1}$$
    $$E_{K+1} = |e_{Z,K+1|K}|_{\mathrm{abs}} + \gamma |e_{Z,K|K}|_{\mathrm{abs}}$$
    $$\psi_{K+1} = \left(\bar E_{K+1}^{-1} C P_{K+1|K} C^T S_{K+1|K}^{-1}\right)^{-1}$$
    where $Q_K$ and $R_{K+1}$ are the process noise covariance matrix and measurement noise covariance matrix, respectively; $\gamma$ ($0 \le \gamma < 1$) is the SVSF convergence rate; $\bar E_{K+1}$ is the diagonal matrix constructed from $E_{K+1}$; and the calculated optimal boundary layer $\psi_{K+1}$ of the uncalibrated visual servo system is the diagonal matrix $\psi_{K+1} = \mathrm{diag}((\psi_1)_{K+1}, \ldots, (\psi_8)_{K+1})$.
  • Compare the calculated boundary layer $\psi_{K+1}$ with the threshold $\psi_S$, which is chosen according to the specific conditions of the system, and switch to the appropriate filter gain:
    If $\psi_{K+1} \le \psi_S$, the KF gain is used to provide an optimal estimate:
    $$K_{K+1} = C^+ \bar E_{K+1} \psi_{K+1}^{-1}$$
    If $\psi_{K+1} > \psi_S$, the SVSF gain is used to provide a robust estimate:
    $$K_{K+1} = C^+ \,\mathrm{diag}\!\left[E_{K+1} \circ \mathrm{sat}\!\left(\frac{e_{Z,K+1|K}}{\psi}\right)\right] \left[\mathrm{diag}(e_{Z,K+1|K})\right]^{-1}$$
    where $\psi \in \mathbb{R}^{8\times1}$ is the standard boundary layer in the SVSF, which can be adjusted according to the diagonal elements of the calculated smoothing boundary layer matrix $\psi_{K+1}$, and $\circ$ denotes the elementwise product.
Then, we complete the subsequent total Jacobian estimation process:
6. Calculate the a posteriori state error covariance matrix:
$$P_{K+1|K+1} = (I - K_{K+1} C) P_{K+1|K} (I - K_{K+1} C)^T + K_{K+1} R_{K+1} K_{K+1}^T$$
7. Obtain the estimated state vector $\hat X_{K+1|K+1}$:
$$\hat X_{K+1|K+1} = \hat X_{K+1|K} + K_{K+1} e_{Z,K+1|K}$$
8. Reshape the state vector $\hat X_{K+1|K+1}$ into the estimated Jacobian $J_t$ as
$$(J_t)_{K+1} = \begin{bmatrix} (\hat X_{K+1|K+1})_1 & \cdots & (\hat X_{K+1|K+1})_9 \\ \vdots & \ddots & \vdots \\ (\hat X_{K+1|K+1})_{64} & \cdots & (\hat X_{K+1|K+1})_{72} \end{bmatrix}_{8\times9}.$$
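One iteration of the estimation loop above can be sketched as follows. This is a simplified NumPy illustration of the gain-switching logic, not the authors' implementation; the toy dimensions, tuning values and the elementwise switching test are our assumptions:

```python
import numpy as np

def kf_svsf_step(X, P, Z_new, Z_prev, C, Q, R, gamma, psi_s, psi_svsf):
    """One KF-SVSF update of the stacked-Jacobian state (a sketch)."""
    # Steps 1-2: a priori state and covariance (random-walk state model)
    X_prior = X.copy()
    P_prior = P + Q
    # Step 3: residual vectors
    e_new = Z_new - C @ X_prior          # e_{Z,K+1|K}
    e_old = Z_prev - C @ X               # e_{Z,K|K}
    # Step 4: smoothing boundary layer
    S = C @ P_prior @ C.T + R
    E = np.abs(e_new) + gamma * np.abs(e_old)
    E_bar = np.diag(E)
    Psi = np.linalg.inv(np.linalg.inv(E_bar) @ C @ P_prior @ C.T
                        @ np.linalg.inv(S))
    # Step 5: switch between the optimal KF gain and the robust SVSF gain
    C_pinv = np.linalg.pinv(C)
    if np.all(np.diag(Psi) <= psi_s):
        K = C_pinv @ E_bar @ np.linalg.inv(Psi)            # KF gain
    else:
        sat = np.clip(e_new / psi_svsf, -1.0, 1.0)         # SVSF gain
        K = C_pinv @ np.diag(E * sat) @ np.linalg.inv(np.diag(e_new))
    # Steps 6-7: a posteriori covariance (Joseph form) and state update
    I = np.eye(len(X))
    P_post = (I - K @ C) @ P_prior @ (I - K @ C).T + K @ R @ K.T
    X_post = X_prior + K @ e_new
    return X_post, P_post

# Toy dimensions (the paper's system has X of length 72 and Z of length 8).
np.random.seed(1)
n, m = 6, 3
C = np.random.randn(m, n)
X1, P1 = kf_svsf_step(np.zeros(n), np.eye(n),
                      np.random.randn(m), np.random.randn(m), C,
                      Q=0.01 * np.eye(n), R=0.1 * np.eye(m),
                      gamma=0.9, psi_s=1e6 * np.ones(m),
                      psi_svsf=np.ones(m))
```

The Joseph form keeps the posterior covariance symmetric regardless of which gain was selected.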
Remark 2.
When carrying out dynamic tracking tasks, the expanded total Jacobian matrix $J_e = [J_t,\ \partial h/\partial t]_{8\times10}$ is estimated online, where $\partial h/\partial t$ represents the Jacobian between the gradient of the task function and the velocity of the target and is important for the construction of the dynamic tracking controller.

3.3. Controller Design

Several advanced controllers based on the estimated Jacobian can be used to realize a well-behaved uncalibrated visual servo; for example, a nonlinear controller with a disturbance observer and a disturbance-rejection controller with an extended-state observer [25]. However, the proportional controller is still the most widely used and developed method in visual servos [22,23,26,29,31,32,33,34,35,36,37,38], and its simplicity is beneficial for verifying this robust strategy.
Before constructing the proportional controller, two problems need to be solved. The first is the disturbance caused by the unstable condition number of the estimated total Jacobian. A large number of simulations show that the estimated Jacobian of the homography-based task function tends to present an unstable condition number, causing drastic changes of the manipulator velocity in certain directions and resulting in severe vibrations and serious errors. Therefore, the SVF method is introduced to deal with this problem [45]. Singular value decomposition is performed on the estimated total Jacobian $J_t$ of the hyper-redundant manipulator according to
$$(J_t)_{m\times n} = U_{m\times m} \Sigma_{m\times n} V_{n\times n}^T = \sum_{i=1}^{m} \sigma_i u_i v_i^T$$
where $\sigma_i$ is the $i$th singular value of the total Jacobian $J_t$ and $u_i$ and $v_i$ are the $i$th columns of $U$ and $V$, respectively. The filtering function $h_{\sigma_0,\upsilon}(\sigma)$ of the singular values proposed in the SVF method can maintain the stability of the smallest singular value of the estimated total Jacobian. It is defined as
$$h_{\sigma_0,\upsilon}(\sigma_i) = \frac{\sigma_i^3 + \upsilon\sigma_i^2 + 2\sigma_i + 2\sigma_0}{\sigma_i^2 + \upsilon\sigma_i + 2}$$
where $\sigma_0$ is the minimum acceptable singular value and $\upsilon$ is a scaling factor that regulates the shape of the filtering function $h_{\sigma_0,\upsilon}(\sigma)$. If $\sigma_0 < \upsilon$ and $\upsilon\sigma_0 < 2$, $h_{\sigma_0,\upsilon}(\sigma)$ stays monotonic. The pseudo-inverse of the total Jacobian $J_t$ can be developed via the SVF method as
$$J_S^+ = \begin{cases} J_t^T (J_t J_t^T)^{-1}, & \min_i |\sigma_i| > \sigma_0 \\ \displaystyle\sum_{i=1}^{m} \frac{1}{h_{\sigma_0,\upsilon}(\sigma_i)} v_i u_i^T, & \text{otherwise.} \end{cases}$$
The restriction on the minimum singular value guarantees the stability of the condition number of the estimated total Jacobian. Thus, the disturbance caused by the unstable condition number can be reduced without introducing much additional error into the algorithm.
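The SVF-filtered pseudo-inverse can be sketched as below (NumPy; `svf_pinv` is our name, and the demo Jacobian is illustrative). Note how the filter bounds the inverse of a nearly singular Jacobian at roughly $1/\sigma_0$ instead of letting it blow up:

```python
import numpy as np

def svf_pinv(J, sigma0=1e-4, upsilon=10.0):
    """Singular-value-filtered pseudo-inverse of a wide Jacobian.

    sigma0 is the minimum acceptable singular value and upsilon the
    shaping factor (sigma0 < upsilon and upsilon * sigma0 < 2)."""
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    if s.min() > sigma0:                       # well conditioned:
        return J.T @ np.linalg.inv(J @ J.T)    # plain right pseudo-inverse
    # filtered singular values: h(sigma) -> sigma0 as sigma -> 0
    h = (s**3 + upsilon * s**2 + 2 * s + 2 * sigma0) / (s**2 + upsilon * s + 2)
    return (Vt.T / h) @ U.T                    # sum_i v_i u_i^T / h(sigma_i)

# A nearly singular wide Jacobian: sigma = {1, 1e-8}.
J = np.array([[1.0, 0.0, 0.0],
              [0.0, 1e-8, 0.0]])
J_S = svf_pinv(J)      # entries stay near 1/sigma0, not 1e8
```

With $\sigma = 10^{-8}$, $h(\sigma) \approx \sigma_0 = 10^{-4}$, so the largest entry of $J_S^+$ is about $10^4$ rather than the $10^8$ a plain pseudo-inverse would produce.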
Another problem is that the manipulator joint angles may exceed their limits. If the joint angles are not limited, this may cause the visual servo to fail or even damage the space manipulator. Here, the redundancy of the joint space of the hyper-redundant manipulator resulting from the improved homography-based task function can be used to prevent this from happening. Gradient projection based on the redundant Jacobian can be adopted to avoid the joint limits [41,42]. When a joint moves away from its limit, allowing it to move freely reduces unnecessary energy consumption. The gradient function is defined as
$$\nabla H(q) = [\partial H(q)/\partial q_1\ \cdots\ \partial H(q)/\partial q_n]^T$$
$$\frac{\partial H(q)}{\partial q_i} = \begin{cases} 0, & \text{if } \Delta\left|\dfrac{\partial H(q)}{\partial q_i}\right| \le 0 \\[2mm] \dfrac{(q_{i,\max} - q_{i,\min})^2 (2q_i - q_{i,\max} - q_{i,\min})}{4 (q_{i,\max} - q_i)^2 (q_i - q_{i,\min})^2}, & \text{otherwise} \end{cases}$$
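The joint-limit gradient can be sketched as follows (NumPy; the rule for detecting that a joint is moving away from its limit is our assumption, implemented by comparing $|\partial H/\partial q_i|$ against its value at the previous joint position):

```python
import numpy as np

def joint_limit_gradient(q, q_min, q_max, q_prev=None):
    """Gradient of the joint-limit measure H(q); a component is zeroed
    when the joint is already moving away from its limit."""
    num = (q_max - q_min) ** 2 * (2 * q - q_max - q_min)
    den = 4 * (q_max - q) ** 2 * (q - q_min) ** 2
    g = num / den
    if q_prev is not None:
        g_prev = ((q_max - q_min) ** 2 * (2 * q_prev - q_max - q_min)
                  / (4 * (q_max - q_prev) ** 2 * (q_prev - q_min) ** 2))
        g[np.abs(g) - np.abs(g_prev) <= 0] = 0.0   # moving away: free
    return g

# 9-DOF arm with the +/-90 degree limits used in the simulations.
q_min, q_max = -np.pi / 2 * np.ones(9), np.pi / 2 * np.ones(9)
g_mid = joint_limit_gradient(np.zeros(9), q_min, q_max)   # zero mid-range
```

The gradient vanishes at the middle of each joint range and grows without bound as a joint approaches either limit, which is what drives the null-space term of the controller below away from the limits.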
On the basis of these optimizations, the controllers for static positioning and dynamic tracking are designed, respectively, as follows:
  • Static positioning controller:
    $$\dot q = \lambda (J_S)_K^+ e_K + \eta (I - (J_S)_K^+ (J_S)_K) \nabla H(q)$$
    where $(J_S)_K$ is the processed total Jacobian in the $K$th period, $\lambda$ is the proportional gain and $\eta$ is the gain of the secondary task.
  • Dynamic tracking controller:
    $$\dot q = \lambda (J_S)_K^+ (e_K - (e_t)_K) + \eta (I - (J_S)_K^+ (J_S)_K) \nabla H(q)$$
    where $(e_t)_K = (\partial h/\partial t)\Delta t$ is the estimate of the task-function variation in the $K$th period caused by target movement.
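The static positioning law can be sketched as one control-period computation (NumPy; all quantities are randomly generated stand-ins). The null-space projector guarantees that the joint-limit term does not disturb the primary task:

```python
import numpy as np
np.random.seed(2)

# Stand-in quantities for one control period of the 9-DOF arm.
J_S = np.random.randn(8, 9)            # SVF-processed total Jacobian
e = np.random.randn(8)                 # task-function error vector
grad_H = np.random.randn(9)            # joint-limit gradient
lam, eta = 0.5, 0.1                    # proportional / null-space gains

J_pinv = np.linalg.pinv(J_S)
N = np.eye(9) - J_pinv @ J_S           # null-space projector of J_S
q_dot = lam * J_pinv @ e + eta * N @ grad_H
```

Because $J_S N = 0$, the secondary joint-limit motion produces no change in the task function: the redundancy of the 8-by-9 Jacobian is exactly what makes this term available.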

4. Simulations

This section describes several simulations of our improved strategy. The performance of this strategy in aligning with both static and dynamic assembly objects is compared with that of several original strategies. Its ability to avoid joint limits and to handle an unstable condition number of the estimated total Jacobian is verified as well.

4.1. System Description

The following simulation adopted the model of a 9 DOF hyper-redundant manipulator with an eye-in-hand camera. This 9 DOF manipulator was independently designed and manufactured by our laboratory. Figure 4 shows the mechanical structure and schematic of this manipulator, and Table 1 gives the Denavit–Hartenberg parameters of the manipulator.
A five-megapixel industrial camera was mounted at the end-effector, and its intrinsic parameter matrix was
$$K = \begin{pmatrix} 1132.1 & 0 & 1280 \\ 0 & 1132.1 & 1024 \\ 0 & 0 & 1 \end{pmatrix}.$$
To simplify the system model, the camera frame $\mathcal{F}$ was considered to coincide with the end-effector frame $\mathcal{F}_e$.

4.2. Simulations and Discussions of Aligning with Static Assembly Objects

To align with static objects, the various errors after the system reaches the steady state are the most reasonable index for comparing the performances of different uncalibrated visual servo strategies. Several tests were performed with different noise levels to check the robustness of the IBUVS, the original PHUVS, the improved PHUVS with the KF method and the improved PHUVS with the KF-SVSF method. A square target was chosen with a side length of 200 mm and 49 feature points. Different types of initial pose errors, including a pure translation error $[\Delta\chi, \Delta\theta] = [350\ \mathrm{mm}, 350\ \mathrm{mm}, 100\ \mathrm{mm}, 0^\circ, 0^\circ, 0^\circ]$, a pure rotation error $[\Delta\chi, \Delta\theta] = [0\ \mathrm{mm}, 0\ \mathrm{mm}, 0\ \mathrm{mm}, 10^\circ, 10^\circ, 45^\circ]$ and a general error $[\Delta\chi, \Delta\theta] = [350\ \mathrm{mm}, 350\ \mathrm{mm}, 100\ \mathrm{mm}, 10^\circ, 10^\circ, 45^\circ]$, were set according to the expected camera motion. The threshold of the boundary layer in the KF-SVSF model is defined as
$$\psi_S = \mathrm{diag}[64\ 36\ 420\ 36\ 64\ 420\ 10\ 10]$$
Table 2 shows the errors of all the above tests. The improved PHUVS with the KF-SVSF method performs better than the IBUVS, original PHUVS and improved PHUVS with the KF. Sometimes, the IBUVS does not converge. Figure 5 shows the detailed test results of the improved PHUVS with the KF-SVSF method for general motion with a noise mean of | V | = 0.5 .

4.3. Simulations and Discussions of Aligning with Dynamic Assembly Objects

The mean square error (MSE) is chosen as an index to compare the performances of different strategies for aligning with dynamic assembly objects. It represents the average image error between the current and desired image features in the servo process, and is defined as
$$\mathrm{MSE} = \frac{1}{N}\sum_{i=1}^{N} \Delta I_i$$
where $\Delta I_i$ is the image error at the $i$th iteration and $N$ is the number of iterations.
The same target is used, moving along planar and 3D trajectories. Both 2D motion in a double leaf rose path and 3D motion in a triple leaf rose path are simulated with different noise levels. The paths of the target center are denoted as follows:

P_rose2 = [450 + 300 cos(3ωNΔT) sin(ωNΔT), 800 + 300 cos(ωNΔT) sin(3ωNΔT), 820]^T

P_rose3 = [450 + 250 sin(3ωNΔT) sin(ωNΔT), 800 + 300 cos(ωNΔT) sin(3ωNΔT), 820 + vNΔT]^T

where ω = π/30 rad·s⁻¹ is the angular velocity in the XY plane, v = 2.5 mm·s⁻¹ is the translation velocity along the Z axis, ΔT = 0.02 s is the sampling period and N is the number of iterations. The threshold of the KF-SVSF boundary layer was defined as

ψ_S = diag([128, 42, 1360, 42, 128, 1360, 16, 16])
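With the parameter values stated above, the two simulated target-center paths can be sketched as follows (a Python sketch; the function names are our own):

```python
import numpy as np

OMEGA = np.pi / 30    # rad/s, angular velocity in the XY plane
V = 2.5               # mm/s, translation velocity along the Z axis
DT = 0.02             # s, sampling period

def rose2(n):
    """Double leaf rose path at iteration n (planar motion, constant Z), in mm."""
    t = n * DT
    return np.array([450 + 300 * np.cos(3 * OMEGA * t) * np.sin(OMEGA * t),
                     800 + 300 * np.cos(OMEGA * t) * np.sin(3 * OMEGA * t),
                     820.0])

def rose3(n):
    """Triple leaf rose path at iteration n (adds a constant drift along Z), in mm."""
    t = n * DT
    return np.array([450 + 250 * np.sin(3 * OMEGA * t) * np.sin(OMEGA * t),
                     800 + 300 * np.cos(OMEGA * t) * np.sin(3 * OMEGA * t),
                     820 + V * t])
```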
Table 3 gives the MSEs of all the above tests. All of these MSEs are the average of 100 experiments. Figure 6 and Figure 7 show the image errors of the different strategies for the double leaf rose and triple leaf rose paths with a noise mean of | V | = 0.5 , and Figure 8 and Figure 9 show the camera trajectories of the different strategies. The IBUVS is the least robust method to image noise, and its tracking precision is unacceptable. The improved PHUVS, which directly controls the manipulator joints, performs better than the original PHUVS (whose structure is shown in Figure 2a). Meanwhile, the improved PHUVS with the KF-SVSF method shows better tracking precision than the improved PHUVS with the KF.
The abilities of the strategies to avoid the joint limits and handle an unstable condition number are examined as well. The joint limit was set as ±90°, and the minimum acceptable singular value of the total Jacobian was set as σ_0 = 10⁻⁴. Figure 10 shows the unlimited and limited joint angles for these two trajectories. Figure 11 shows the logarithm of the minimum singular value σ_min of the estimated total Jacobian (J_t)_K and the processed Jacobian (J_S)_K in each period. In each tracking trajectory, all joint angles of the hyper-redundant manipulator are limited to the acceptable range. When unstable condition numbers are caused by the appearance of excessively small minimum singular values, the minimum singular value σ_min of the processed Jacobian remains above the threshold σ_0 = 10⁻⁴.
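The idea behind processing the estimated total Jacobian can be illustrated by a minimal clamping variant of singular value filtering: decompose the matrix, raise any excessively small singular value to the threshold, and reconstruct, so the condition number stays bounded. This is a sketch under our own naming, not the exact SVF law used in the paper:

```python
import numpy as np

SIGMA_0 = 1e-4  # minimum acceptable singular value of the total Jacobian

def filter_jacobian(J, sigma_0=SIGMA_0):
    """Clamp the singular values of the estimated Jacobian from below.

    Excessively small singular values blow up the condition number (and
    hence the pseudo-inverse used by the controller); raising them to
    sigma_0 before reconstruction keeps the control output bounded.
    """
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    s_clamped = np.maximum(s, sigma_0)
    return U @ np.diag(s_clamped) @ Vt
```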

5. Experiments

5.1. Evaluation of Real-Time Performance

Several experiments on systems with different homography-based task functions were carried out. The time of consumption (TOC) is the most appropriate indicator of real-time performance. The TOC-H, which is the time consumed for homography matrix estimation, and the TOC-C, which includes the time consumed for Jacobian estimation and control output calculation, are selected as indicators to compare the real-time performance of the system. A PC with an Intel(R) Xeon(R) E5-1620 CPU at 3.5 GHz and a 64-bit operating system was used to run all experiments. The average values of TOC-H and TOC-C in each experiment, each covering more than 10,000 control periods, are shown in Table 4. This experiment is presented only to compare the real-time performance of the two strategies. In practical space applications, this part of the system may be implemented on an FPGA, and corresponding designs and improvements will be made to meet computational efficiency requirements.
As shown in Table 4 and Figure 12, the TOC-H and TOC-C of the improved homography-based task function are always lower than those of the original homography-based task function for different numbers of feature points. According to the comparison of the results, the improved homography-based task function, with its smaller dimensions, shows more convincing real-time performance, as expected.
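An averaged per-period TOC of this kind can be measured with a simple timing loop, as in this sketch (a hypothetical helper, not the authors' benchmarking code):

```python
import time

def average_toc(step, periods=10000):
    """Average per-period time of consumption (TOC) of one stage of the
    control loop, in milliseconds.

    step: the callable under test, e.g. the homography estimation
    (TOC-H) or the Jacobian estimation plus control output calculation
    (TOC-C); periods: number of control periods to average over.
    """
    start = time.perf_counter()
    for _ in range(periods):
        step()
    return 1000.0 * (time.perf_counter() - start) / periods
```

Averaging over many thousands of periods, as in the paper, smooths out scheduling jitter of the host operating system.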

5.2. Evaluation of System Performance

As shown in Figure 13, a 9 DOF hyper-redundant manipulator, a 6 DOF universal robot, a smart camera and an NI real-time system were used to build the experimental platform. The parameters of the manipulator and the camera are consistent with the parameters set in the simulations. Experiments for aligning with both static and dynamic assembly objects were conducted to verify the improvement of our strategy. A square target was chosen with a side length of 200 mm and 49 feature points. All experiments were performed under an illumination similar to that of actual working conditions to examine the reliability of our strategy.

5.2.1. Static Positioning Experiment

Considering that, in the actual work scene, manipulators often perform general movements to align with assembly targets, the initial pose error of the positioning experiment was set as

[Δχ, Δθ] = [226 mm, 224 mm, 386 mm, 30°, 10°, 10°]

The threshold of the boundary layer in the KF-SVSF model was defined as

ψ_S = diag([64, 36, 420, 36, 64, 420, 10, 10])
Table 5 gives the various steady-state errors of the different strategies in the positioning experiments; these results are the average of five repeated experiments. Although the noise of image transmission and processing has a certain impact on the accuracy of target recognition, the improved PHUVS with the KF-SVSF method clearly performs better than the other strategies in each comparison. This verifies the superiority of our improved strategy. Figure 14 shows the curves of the image error and the task function error for the proposed improved PHUVS. Figure 15 shows the angular velocity of each joint of the manipulator.

5.2.2. Dynamic Tracking Experiment

In the dynamic tracking experiment, the target was fixed to the end-effector of a universal robot and moved along different paths. To ensure reliability, both complex planar and 3D paths were tested. The planar and 3D paths of the target were set as

P_rose2 = [840 + 200 cos(3ωKΔT) sin(ωKΔT), 1690, 200 + 250 cos(ωKΔT) sin(3ωKΔT)]^T

P_rose3 = [840 + 250 sin(3ωKΔT) sin(ωKΔT), 1690 + vKΔT, 200 + 250 cos(ωKΔT) sin(3ωKΔT)]^T

where ω = π/6 rad·s⁻¹ is the angular velocity in the XY plane, v = 1.25 mm·s⁻¹ is the translation velocity in the Z direction, ΔT = 0.04 s is the sampling period and K is the number of iterations. The threshold of the KF-SVSF boundary layer was defined as

ψ_S = diag([128, 42, 1360, 42, 128, 1360, 16, 16])
Table 6 gives the mean square error (MSE) of the different strategies tracking the two paths; these results are the average of five repeated experiments. Figure 16, Figure 17, Figure 18 and Figure 19 show the image errors of the different strategies, and the joint angles and joint angular velocities of the proposed improved PHUVS with the KF-SVSF method, in the double leaf rose and triple leaf rose paths, respectively. Obviously, the improved PHUVS with the KF-SVSF method shows better accuracy and stability than the other strategies. During the tracking process, the tracking error is always kept in an acceptable range. It is worth mentioning that, due to the adoption of the SVF method, the system smoothly completes the servo process without severe disturbance. It can also be seen that, because the redundancy resulting from the improved homography-based task function is used in the design of the controller to limit the joint angles, there was no case in which a joint angle exceeded its limit, and the entire experiment was successfully completed.
It is worth noting that, in our experiment, we calculated the corresponding joint increment command Δq = q̇ΔT from the sampling period ΔT instead of sending continuous joint-velocity commands q̇ to the robot. During the execution of each sampling period, joint velocities were automatically planned when moving by the increment Δq. As shown in Figure 17a and Figure 19a, the joint angles remained relatively smooth. However, in Figure 5f, Figure 15, Figure 17b and Figure 19b, high-frequency oscillations caused by various noise sources were present in the joint velocities. If the manipulator were controlled in a speed mode in actual tasks, this might cause vibrations, and it might be necessary to add a filter to cut the high frequencies. This would add a delay and possibly make the method unsuitable for real-time operations in space. Researchers have noticed this problem and found that the PHUVS experiences a much smaller variation amplitude in joint velocities than the IBUVS [23]. However, this level of improvement is not sufficient to completely solve the problem, and this remains a shortcoming to be overcome in future applications.
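The increment-command scheme and the velocity filtering trade-off discussed above can be sketched as follows. The smoothing filter and its coefficient are hypothetical illustrations of the kind of low-pass filter mentioned, not part of the paper's controller:

```python
import numpy as np

DT = 0.04  # s, sampling period of the tracking experiment

def joint_increment(q_dot, dt=DT):
    """Position increment sent to the robot each sampling period,
    instead of a continuous joint-velocity command."""
    return np.asarray(q_dot, dtype=float) * dt

def smooth_velocity(q_dot, q_dot_prev, alpha=0.2):
    """Hypothetical first-order low-pass filter on the joint velocities.

    Attenuates high-frequency oscillations at the cost of added delay,
    which is the trade-off noted in the text for real-time operation.
    """
    return alpha * np.asarray(q_dot, dtype=float) \
        + (1.0 - alpha) * np.asarray(q_dot_prev, dtype=float)
```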

6. Conclusions

This paper proposed an improved uncalibrated visual servo strategy for hyper-redundant manipulators in on-orbit automatic assembly. An improved homography-based task function with robustness to image noise and better real-time performance was constructed, which made the joint space of the hyper-redundant manipulator redundant to the task function. To improve the system accuracy and avoid the interference caused by changes in manipulator parameters in space, a proportional controller based on the estimated total Jacobian was designed to control the manipulator joints directly, without a manipulator Jacobian. The KF-SVSF method, which combines the optimality of the KF method with the robustness of the SVSF method, was introduced to perform the online estimation of the total Jacobian and further reduce the impact of modeling uncertainty and image noise. The SVF method was adopted to reduce the disturbance caused by an unstable condition number of the estimated total Jacobian without generating much additional error in the algorithms. In addition, the redundancy of the joint space relative to the new task function was exploited to limit the joint angles. Several tasks for aligning with static or dynamic objects were simulated to test the performance of this strategy. The simulation results demonstrated the improved accuracy and robustness of this strategy. The abilities of the system to limit the joint angles and deal with an unstable condition number were verified as well. The reliability of this strategy was further validated by experiments based on actual working conditions.

Author Contributions

Conceptualization, J.G.; Data curation, J.G.; Formal analysis, J.G., A.L. and W.W.; Funding acquisition, M.Z.; Investigation, J.G.; Methodology, J.G.; Project administration, M.Z., L.C. and Z.X.; Resources, M.Z., L.C. and Z.X.; Software, J.G. and A.L.; Validation, J.G.; Visualization, J.G. and A.L.; Writing—original draft, J.G.; Writing—review & editing, M.Z. and W.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (grant numbers 11672290, 11972343, and 91848202); the Science and Technology Development Plan of Jilin province (grant number 2018020102GX); and the Jilin Province and the Chinese Academy of Sciences Cooperation in Science and Technology High-Tech Industrialization Special Funds Project (grant number 2018SYHZ0004).

Acknowledgments

Mark Kurban from Liwen Bianji, Edanz Editing China (www.liwenbianji.cn/ac), was invited to edit a draft of this manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hippler, S. Adaptive Optics for Extremely Large Telescopes. J. Astron. Instrum. 2019, 8, 1–21. [Google Scholar] [CrossRef] [Green Version]
  2. Stolfi, A.; Angeletti, F.; Gasbarri, P.; Panella, M. A Deep Learning Strategy for On-Orbit Servicing Via Space Robotic Manipulator. Aerotec. Missili Spaz. 2019, 98, 273–282. [Google Scholar] [CrossRef]
  3. Chen, G.; Yuan, B.; Jia, Q.; Sun, H.; Guo, W. Failure Tolerance Strategy of Space Manipulator for Large Load Carrying Tasks. Acta Astronaut. 2018, 148, 186–204. [Google Scholar] [CrossRef]
  4. Li, W.-J.; Cheng, D.-Y.; Liu, X.-G.; Wang, Y.-B.; Shi, W.-H.; Tang, Z.-X.; Gao, F.; Zeng, F.-M.; Chai, H.-Y.; Luo, W.-B.; et al. On-Orbit Service (OOS) of Spacecraft: A Review of Engineering Developments. Prog. Aerosp. Sci. 2019, 108, 32–120. [Google Scholar] [CrossRef]
  5. Flores-Abad, A.; Ma, O.; Pham, K.; Ulrich, S. A Review of Space Robotics Technologies for on-Orbit Servicing. Prog. Aerosp. Sci. 2014, 68, 1–26. [Google Scholar] [CrossRef] [Green Version]
  6. Xu, W.; Liang, B.; Xu, Y. Survey of Modeling, Planning, and Ground Verification of Space Robotic Systems. Acta Astronaut. 2011, 68, 1629–1649. [Google Scholar] [CrossRef]
  7. Menon, C.; Busolo, S.; Cocuzza, S.; Aboudan, A.; Bulgarelli, A.; Bettanini, C.; Marchesi, M.; Angrilli, F. Issues and Solutions for Testing Free-Flying Robots. Acta Astronaut. 2007, 60, 957–965. [Google Scholar] [CrossRef]
  8. Stolfi, A.; Gasbarri, P.; Sabatini, M. A Parametric Analysis of a Controlled Deployable Space Manipulator for Capturing a Non-Cooperative Flexible Satellite. Acta Astronaut. 2018, 148, 317–326. [Google Scholar] [CrossRef]
  9. Chen, H.; Huang, P.; Liu, Z.; Ma, Z. Time Delay Prediction for Space Telerobot System with a Modified Sparse Multivariate Linear Regression Method. Acta Astronaut. 2020, 166, 330–341. [Google Scholar] [CrossRef]
  10. Larouche, B.P.; Zhu, Z.H. Autonomous Robotic Capture of Non-Cooperative Target Using Visual Servoing and Motion Predictive Control. Auton. Robot. 2014, 37, 157–167. [Google Scholar] [CrossRef]
  11. Dong, G.; Zhu, Z.H. Predictive Visual Servo Kinematic Control for Autonomous Robotic Capture of Non-Cooperative Space Target. Acta Astronaut. 2018, 151, 173–181. [Google Scholar] [CrossRef]
  12. Rivolta, A.; Lunghi, P.; Lavagna, M. GNC & Robotics for on Orbit Servicing With Simulated Vision in the Loop. Acta Astronaut. 2019, 162, 327–335. [Google Scholar] [CrossRef]
  13. Dong, G.; Zhu, Z.H. Position-Based Visual Servo Control of Autonomous Robotic Manipulators. Acta Astronaut. 2015, 115, 291–302. [Google Scholar] [CrossRef]
  14. Bottin, M.; Cocuzza, S.; Comand, N.; Doria, A. Modeling and Identification of an Industrial Robot with a Selective Modal Approach. Appl. Sci. 2020, 10, 4619. [Google Scholar] [CrossRef]
  15. Wang, N.; He, H. Adaptive Homography-Based Visual Servo for Micro Unmanned Surface Vehicles. Int. J. Adv. Manuf. Technol. 2019, 105, 4875–4882. [Google Scholar] [CrossRef]
  16. Assa, A.; Janabi-Sharifi, F. Virtual Visual Servoing for Multicamera Pose Estimation. IEEE/ASME Trans. Mechatron. 2015, 20, 789–798. [Google Scholar] [CrossRef]
  17. Colombo, F.T.; Fontes, J.V.D.C.; Da Silva, M.M. A Visual Servoing Strategy Under Limited Frame Rates for Planar Parallel Kinematic Machines. J. Intell. Robot. Syst. 2019, 96, 95–107. [Google Scholar] [CrossRef]
  18. Li, S.; Li, D.; Zhang, C.; Wan, J.; Xie, M. RGB-D Image Processing Algorithm for Target Recognition and Pose Estimation of Visual Servo System. Sensors 2020, 20, 430. [Google Scholar] [CrossRef] [Green Version]
  19. Cai, C.; Somani, N.; Nair, S.; Mendoza, D.; Knoll, A. Uncalibrated Stereo Visual Servoing for Manipulators Using Virtual Impedance Control. In Proceedings of the 13th International Conference on Control Automation Robotics & Vision (ICARCV), Singapore, 10–12 December 2014; pp. 1888–1893. [Google Scholar]
  20. Liang, X.; Wang, H.; Liu, Y.-H.; Chen, W.; Zhao, J. A Unified Design Method for Adaptive Visual Tracking Control of Robots with Eye-in-Hand/Fixed Camera Configuration. Automatica 2015, 59, 97–105. [Google Scholar] [CrossRef]
  21. Shademan, A.; Jägersand, M. Three-View Uncalibrated Visual Servoing. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; pp. 6234–6239. [Google Scholar]
  22. Hao, M.; Sun, Z. A Universal State-Space Approach to Uncalibrated Model-Free Visual Servoing. IEEE/ASME Trans. Mechatron. 2011, 17, 833–846. [Google Scholar] [CrossRef]
  23. Gong, Z.; Tao, B.; Yang, H.; Yin, Z.; Ding, H. An Uncalibrated Visual Servo Method Based on Projective Homography. IEEE Trans. Autom. Sci. Eng. 2018, 15, 806–817. [Google Scholar] [CrossRef]
  24. Liu, M.; Pradalier, C.; Esiegwart, R. Visual Homing from Scale with an Uncalibrated Omnidirectional Camera. IEEE Trans. Robot. 2013, 29, 1353–1365. [Google Scholar] [CrossRef]
  25. Ma, Z.; Su, J. Robust Uncalibrated Visual Servoing Control Based on Disturbance Observer. ISA Trans. 2015, 59, 193–204. [Google Scholar] [CrossRef]
  26. Gong, Z.; Tao, B.; Qiu, C.; Yin, Z.; Ding, H. Trajectory Planning with Shortest Path for Modified Uncalibrated Visual Servoing Based on Projective Homography. IEEE Trans. Autom. Sci. Eng. 2020, 17, 1076–1083. [Google Scholar] [CrossRef]
  27. Wang, F.; Liu, Z.; Chen, C.L.P.; Zhang, Y. Robust Adaptive Visual Tracking Control for Uncertain Robotic Systems With Unknown Dead-Zone Inputs. J. Frankl. Inst. 2019, 356, 6255–6279. [Google Scholar] [CrossRef]
  28. Zhang, J.; Liu, D. Calibration-Free and Model-Independent Method for High-DOF Image-Based Visual Servoing. J. Control. Theory Appl. 2013, 11, 132–140. [Google Scholar] [CrossRef]
  29. Musić, J.; Bonković, M.; Cecić, M. Comparison of Uncalibrated Model-Free Visual Servoing Methods for Small-Amplitude Movements: A Simulation Study. Int. J. Adv. Robot. Syst. 2014, 11, 108. [Google Scholar] [CrossRef]
  30. Shi, H.; Sun, G.; Wang, Y.; Hwang, K.-S. Adaptive Image-Based Visual Servoing with Temporary Loss of the Visual Signal. IEEE Trans. Ind. Inform. 2019, 15, 1956–1965. [Google Scholar] [CrossRef]
  31. Xiaolin, R.; Hongwen, L.; Yuanchun, L. Online Image Jacobian Identification Using Optimal Adaptive Robust Kalman Filter for Uncalibrated Visual Servoing. In Proceedings of the 2017 2nd Asia-Pacific Conference on Intelligent Robot Systems (ACIRS), Wuhan, China, 16–18 June 2017; pp. 53–57. [Google Scholar]
  32. Lv, X.; Huang, X. Fuzzy Adaptive Kalman Filtering based Estimation of Image Jacobian for Uncalibrated Visual Servoing. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 2167–2172. [Google Scholar]
  33. Wang, F.; Sun, F.; Zhang, J.; Lin, B.; Li, X. Unscented Particle Filter for Online Total Image Jacobian Matrix Estimation in Robot Visual Servoing. IEEE Access 2019, 7, 92020–92029. [Google Scholar] [CrossRef]
  34. Zhong, X.; Zhong, X.; Peng, X. Robots Visual Servo Control with Features Constraint Employing Kalman-Neural-Network Filtering Scheme. Neurocomputing 2015, 151, 268–277. [Google Scholar] [CrossRef]
  35. Zhou, Z.; Zhang, R.; Zhu, Z. RETRACTED: Uncalibrated Dynamic Visual Servoing via Multivariate Adaptive Regression Splines and Improved Incremental Extreme Learning Machine. ISA Trans. 2019, 92, 298–314. [Google Scholar] [CrossRef] [PubMed]
  36. Gu, J.; Wang, W.; Zhu, M.; Lv, Y.; Huo, Q.; Xu, Z. Research on A Technology of Automatic Assembly Based on Uncalibrated Visual Servo System. In Proceedings of the 2018 IEEE International Conference on Mechatronics and Automation (ICMA), Changchun, China, 5–8 August 2018; pp. 872–877. [Google Scholar]
  37. Piepmeier, J.A.; McMurray, G.V.; Lipkin, H. A Dynamic Quasi-Newton Method for Uncalibrated Visual Servoing. Proc. Int. Conf. Robot. Autom. 2003, 2, 1595–1600. [Google Scholar] [CrossRef]
  38. Zhao, Q.; Zhang, L.; Chen, Y. Online Estimation Technique for Jacobian Matrix in Robot Visual Servo Systems. In Proceedings of the 2008 3rd IEEE Conference on Industrial Electronics and Applications, Singapore, 3–5 June 2008; pp. 1270–1275. [Google Scholar]
  39. Hartley, R.; Zisserman, A. Multiple View Geometry in Computer Vision; Cambridge University Press (CUP): New York, NY, USA, 2004. [Google Scholar]
  40. Huang, Y.; Zhang, Y.; Wu, Z.; Li, N.; Chambers, J. A Novel Adaptive Kalman Filter With Inaccurate Process and Measurement Noise Covariance Matrices. IEEE Trans. Autom. Control 2018, 63, 594–601. [Google Scholar] [CrossRef] [Green Version]
  41. Mu, Z.; Liu, T.; Xu, W.; Lou, Y.; Liang, B. Dynamic Feedforward Control of Spatial Cable-Driven Hyper-Redundant Manipulators for on-Orbit Servicing. Robots 2018, 37, 18–38. [Google Scholar] [CrossRef]
  42. Chan, T.F.; Dubey, R. A Weighted Least-Norm Solution Based Scheme for Avoiding Joint Limits for Redundant Joint Manipulators. IEEE Trans. Robot. Autom. 1995, 11, 286–292. [Google Scholar] [CrossRef]
  43. Cocuzza, S.; Pretto, I.; Debei, S. Least-Squares-Based Reaction Control of Space Manipulators. J. Guid. Control. Dyn. 2012, 35, 976–986. [Google Scholar] [CrossRef]
  44. Tringali, A.; Cocuzza, S. Globally Optimal Inverse Kinematics Method for a Redundant Robot Manipulator with Linear and Nonlinear Constraints. Robotics 2020, 9, 61. [Google Scholar] [CrossRef]
  45. Colome, A.; Torras, C. Closed-Loop Inverse Kinematics for Redundant Robots: Comparative Assessment and Two Enhancements. IEEE/ASME Trans. Mechatron. 2014, 20, 944–955. [Google Scholar] [CrossRef] [Green Version]
  46. Gadsden, S.; Habibi, S.R. A New Robust Filtering Strategy for Linear Systems. J. Dyn. Syst. Meas. Control. 2012, 135, 014503. [Google Scholar] [CrossRef]
  47. Gadsden, S.A.; Habibi, S.; Kirubarajan, T. Kalman and Smooth Variable Structure Filters for Robust Estimation. IEEE Trans. Aerosp. Electron. Syst. 2014, 50, 1038–1050. [Google Scholar] [CrossRef]
  48. Goodman, J.M.; Wilkerson, S.A.; Eggleton, C.; Gadsden, S.A. A Multiple Model Adaptive SVSF-KF Estimation Strategy. Signal Process. Sens. Inf. Fusion Target Recognit. XXVIII 2019, 11018, 110181K. [Google Scholar] [CrossRef]
  49. Nie, Y.; Zhang, Z.; Sun, H.-Q.; Su, T.; Li, G. Homography Propagation and Optimization for Wide-Baseline Street Image Interpolation. IEEE Trans. Vis. Comput. Graph. 2016, 23, 2328–2341. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Projective homography model.
Figure 2. Visual servo system: (a) Original PHUVS block diagram. (b) Improved PHUVS block diagram for static positioning tasks. (c) Improved PHUVS block diagram for dynamic tracking tasks.
Figure 3. Concept of boundary layers of the Kalman filter–smooth variable structure filter (KF-SVSF) method.
Figure 4. Manipulator structure and link coordinate systems of the 9 DOF manipulator.
Figure 5. Detailed result of the improved PHUVS with the KF-SVSF method. (a) Trajectories of partial feature points in the image plane. (b) Norm of image error. (c) Norm of position error. (d) Norm of attitude error. (e) Norm of error vector. (f) Joint velocities of the manipulator.
Figure 6. Image error of different strategies in a double leaf rose path. (a) Image error of IBUVS. (b) Image error of the original PHUVS. (c) Image error of the improved PHUVS with the KF method. (d) Image error of the improved PHUVS with the KF-SVSF method.
Figure 7. Image error of different strategies in a triple leaf rose path. (a) Image error of IBUVS. (b) Image error of the original PHUVS. (c) Image error of the improved PHUVS with the KF method. (d) Image error of the improved PHUVS with the KF-SVSF method.
Figure 8. Camera trajectories of different strategies in a double leaf rose path. (a) Camera trajectory of IBUVS. (b) Camera trajectory of the original PHUVS. (c) Camera trajectory of the improved PHUVS with the KF method. (d) Camera trajectory of the improved PHUVS with the KF-SVSF method.
Figure 9. Camera trajectories of different strategies in a triple leaf rose path. (a) Camera trajectory of IBUVS. (b) Camera trajectory of the original PHUVS. (c) Camera trajectory of the improved PHUVS with the KF method. (d) Camera trajectory of the improved PHUVS with the KF-SVSF method.
Figure 10. Joint angle of the manipulator in different paths. (a) Unlimited joint angles of the manipulator in a double leaf rose path. (b) Limited joint angles of the manipulator in a double leaf rose path. (c) Unlimited joint angles of the manipulator in a triple leaf rose path. (d) Limited joint angles of the manipulator in a triple leaf rose path.
Figure 11. (a) Inaccurate trajectory caused by an unstable conditional number in a double leaf rose path. (b) Logarithm of the minimum singular value of the total Jacobian in a double leaf rose path. (c) Inaccurate trajectory caused by an unstable conditional number in a triple leaf rose path. (d) Logarithm of the minimum singular value of the total Jacobian in a triple leaf rose path.
Figure 12. TOC of original PHUVS and improved PHUVS with different numbers of feature points.
Figure 13. Experiment platform and target.
Figure 14. Detailed result of the improved PHUVS with the KF-SVSF method in the static positioning experiment. (a) Image error norm. (b) Norm of error vector.
Figure 15. Joint velocities of the improved PHUVS with the KF-SVSF method in the static positioning experiment.
Figure 16. Image error of different strategies in the experiment of the double leaf rose path. (a) Image error of IBUVS. (b) Image error of the original PHUVS. (c) Image error of the improved PHUVS with the KF method. (d) Image error of the improved PHUVS with the KF-SVSF method.
Figure 17. Joint angles and joint velocity of the improved PHUVS with the KF-SVSF method in the tracking experiment of a double leaf rose path. (a) Joint angles. (b) Joint velocities.
Figure 18. Image error of different strategies in experiment of triple leaf rose path. (a) Image error of IBUVS. (b) Image error of the original PHUVS. (c) Image error of the improved PHUVS with the KF method. (d) Image error of the improved PHUVS with the KF-SVSF method.
Figure 19. Joint angles and joint velocity of the improved PHUVS with the KF-SVSF method in the tracking experiment of a triple leaf rose path. (a) Joint angles. (b) Joint velocities.
Table 1. Denavit–Hartenberg parameters of the nine degree-of-freedom (9 DOF) manipulator.
| Joint | q_n | α_{n−1} | a_{n−1} | d_n | Joint Limit |
|---|---|---|---|---|---|
| 1 | q1 | 0° | 0 | 0 | ±90° |
| 2 | q2 | 90° | 165.5 | 0 | ±90° |
| 3 | q3 | −90° | 165.5 | 0 | ±90° |
| 4 | q4 | 90° | 165.5 | 0 | ±90° |
| 5 | q5 | −90° | 165.5 | 0 | ±90° |
| 6 | q6 | 90° | 165.5 | 0 | ±90° |
| 7 | q7 | −90° | 165.5 | 0 | ±90° |
| 8 | q8 | 90° | 165.5 | 0 | ±90° |
| 9 | q9 | −90° | 165.5 | 0 | ±90° |
Table 2. Steady-state error of different strategies in the static positioning test. Δ χ is in millimeters, Δ θ is in degrees and Δ I is in pixels. IBUVS: image-based uncalibrated visual servo. OPHUVS: original PHUVS.
| Task | Method | Noise Mean 0.2 (Δχ / Δθ / ΔI) | Noise Mean 0.5 (Δχ / Δθ / ΔI) | Noise Mean 1 (Δχ / Δθ / ΔI) |
|---|---|---|---|---|
| Pure Translation | IBUVS | 0.425 / 0.014 / 1.121 | 0.823 / 0.032 / 4.541 | 1.225 / 0.067 / 6.346 |
| | OPHUVS | 0.183 / 0.004 / 0.991 | 0.302 / 0.017 / 2.453 | 0.395 / 0.031 / 3.637 |
| | KF | 0.181 / 0.004 / 0.987 | 0.289 / 0.015 / 2.278 | 0.379 / 0.029 / 3.308 |
| | KF-SVSF | 0.174 / 0.004 / 0.886 | 0.273 / 0.015 / 2.052 | 0.364 / 0.029 / 3.081 |
| Pure Rotation | IBUVS | 0.341 / 0.023 / 1.427 | 0.537 / 0.045 / 5.253 | 0.873 / 0.069 / 7.641 |
| | OPHUVS | 0.126 / 0.004 / 0.932 | 0.268 / 0.016 / 2.466 | 0.385 / 0.034 / 3.826 |
| | KF | 0.124 / 0.004 / 0.929 | 0.251 / 0.015 / 2.349 | 0.374 / 0.031 / 3.527 |
| | KF-SVSF | 0.117 / 0.003 / 0.906 | 0.242 / 0.014 / 2.235 | 0.304 / 0.029 / 3.174 |
| General Motion | IBUVS | 0.374 / 0.029 / 1.435 | 0.911 / 0.049 / 6.213 | 1.352 / 0.072 / 8.214 |
| | OPHUVS | 0.187 / 0.006 / 0.813 | 0.261 / 0.018 / 2.716 | 0.406 / 0.033 / 3.826 |
| | KF | 0.185 / 0.006 / 0.811 | 0.243 / 0.017 / 2.529 | 0.394 / 0.032 / 3.672 |
| | KF-SVSF | 0.169 / 0.006 / 0.798 | 0.228 / 0.015 / 2.482 | 0.363 / 0.031 / 3.396 |
Table 3. Mean square error (MSE) of different strategies in the dynamic tracking test.
| Path | Method | MSE (Noise Mean 0.2) | MSE (Noise Mean 0.5) | MSE (Noise Mean 1) |
|---|---|---|---|---|
| Double leaf rose | IBUVS | 4.97 | 8.54 | 11.47 |
| | OPHUVS | 3.94 | 5.21 | 7.56 |
| | IPHUVS with KF | 2.59 | 3.84 | 5.54 |
| | IPHUVS with KF-SVSF | 2.37 | 3.56 | 5.31 |
| Triple leaf rose | IBUVS | 5.12 | 8.96 | 11.67 |
| | OPHUVS | 3.91 | 5.13 | 7.48 |
| | IPHUVS with KF | 2.66 | 3.75 | 5.62 |
| | IPHUVS with KF-SVSF | 2.43 | 3.47 | 5.39 |

MSE is in pixels.
Table 4. Detailed value of the TOC of the original PHUVS and improved PHUVS.
| Number of Feature Points | IPHUVS TOC-H | IPHUVS TOC-C | IPHUVS Total | OPHUVS TOC-H | OPHUVS TOC-C | OPHUVS Total |
|---|---|---|---|---|---|---|
| 4 | 0.5486 | 0.8109 | 1.3595 | 0.5624 | 1.0891 | 1.6515 |
| 9 | 0.7504 | 0.8109 | 1.5613 | 0.7608 | 1.0892 | 1.8500 |
| 16 | 0.9712 | 0.8109 | 1.7821 | 0.9804 | 1.0892 | 2.0696 |
| 25 | 1.2942 | 0.8108 | 2.1050 | 1.3022 | 1.0891 | 2.3913 |
| 36 | 1.6472 | 0.8107 | 2.4579 | 1.6591 | 1.0892 | 2.7483 |
| 49 | 2.2026 | 0.8108 | 3.0134 | 2.2176 | 1.0893 | 3.3069 |
Table 5. Steady-state error of different strategies in the static positioning experiment. Δ χ is in millimeters, Δ θ is in degrees and Δ I is in pixels.
| Method | Δχ | Δθ | ΔI |
|---|---|---|---|
| IBUVS | 1.08 | 0.023 | 4.14 |
| OPHUVS | 0.96 | 0.012 | 2.81 |
| IPHUVS with KF | 0.91 | 0.012 | 2.76 |
| IPHUVS with KF-SVSF | 0.72 | 0.011 | 2.59 |
Table 6. MSE of different strategies in the dynamic tracking experiment.
| Method | Double Leaf Rose | Triple Leaf Rose |
|---|---|---|
| IBUVS | 9.43 | 10.19 |
| OPHUVS | 7.32 | 7.74 |
| IPHUVS with KF | 6.28 | 6.61 |
| IPHUVS with KF-SVSF | 5.61 | 5.89 |

MSE is in pixels.

Share and Cite

MDPI and ACS Style

Gu, J.; Zhu, M.; Cao, L.; Li, A.; Wang, W.; Xu, Z. Improved Uncalibrated Visual Servo Strategy for Hyper-Redundant Manipulators in On-Orbit Automatic Assembly. Appl. Sci. 2020, 10, 6968. https://doi.org/10.3390/app10196968

