Article

Quaternion-Based Attitude Estimation of an Aircraft Model Using Computer Vision

by Pavithra Kasula, James F. Whidborne * and Zeeshan A. Rana
Centre for Aeronautics, Cranfield University, Cranfield MK43 0AL, UK
* Author to whom correspondence should be addressed.
Sensors 2024, 24(12), 3795; https://doi.org/10.3390/s24123795
Submission received: 5 April 2024 / Revised: 25 May 2024 / Accepted: 6 June 2024 / Published: 12 June 2024
(This article belongs to the Special Issue Sensors in Aircraft (Volume II))

Abstract: Investigating aircraft flight dynamics often requires dynamic wind tunnel testing. This paper proposes a non-contact, off-board instrumentation method using vision-based techniques. The method utilises a sequential process of Harris corner detection, Kanade–Lucas–Tomasi tracking, and quaternions to identify the Euler angles from a pair of cameras, one with a side view and the other with a top view. The method validation involves simulating a 3D CAD model for rotational motion with a single degree-of-freedom. The results are quantified numerically, and the proposed approach is also analysed analytically. This approach results in a 45.41% enhancement in accuracy over an earlier direction cosine matrix method. Specifically, the quaternion-based method achieves root mean square errors of 0.0101 rad/s, 0.0361 rad/s, and 0.0036 rad/s for the dynamic measurements of roll rate, pitch rate, and yaw rate, respectively. Notably, the method exhibits a 98.08% accuracy for the pitch rate. These results highlight the performance of quaternion-based attitude estimation in dynamic wind tunnel testing. Furthermore, an extended Kalman filter is applied to integrate the generated on-board instrumentation data (inertial measurement unit, potentiometer gimbal) and the results of the proposed vision-based method. The extended Kalman filter state estimation achieves root mean square errors of 0.0090 rad/s, 0.0262 rad/s, and 0.0034 rad/s for the dynamic measurements of roll rate, pitch rate, and yaw rate, respectively. This method exhibits an improved accuracy of 98.61% for the estimation of pitch rate, indicating its higher efficiency over the standalone implementation of the direction cosine method for dynamic wind tunnel testing.

1. Introduction

Flight dynamics models are essential for designing new aircraft, as they are used for assessing handling qualities, designing flight control systems, and conducting flight simulations. These tasks are crucially performed during the early stages of aircraft design. Additionally, the construction of sub-scale flight demonstrators offers an invaluable means to evaluate designs early in the design cycle, as well as to validate models and facilitate model development [1]. Furthermore, they can be used for model parameter identification. Approaches to flight dynamics identification encompass theoretical and numerical analyses, wind tunnel testing, and flight testing. In the initial design phase, theoretical parameter estimation is common but may have limited accuracy, necessitating wind tunnel testing. This method is comparatively lower in cost and requires less time to manufacture than a proof-of-concept sub-scale flying vehicle. Sub-scale models can also be tested in wind tunnels to extract aerodynamic and flight dynamics data.
While static wind tunnel testing suffices for identifying the main parameters such as lift and drag, non-linear flight regimes involving high angles of attack demand investigation into non-steady aerodynamics. Dynamic wind tunnel (DWT) testing serves this purpose [2]. Although dynamic wind tunnel testing is not yet widespread, advances in model construction methods such as 3D printing and instrumentation are making dynamic wind tunnel experiments more accessible and affordable, thus enriching the design process.
Several DWT testing techniques have been developed to determine the forces acting on the model, including captive tests (such as forced oscillation and rotary balance tests), single-degree-of-freedom (DOF) tests (like free-to-pitch or free-to-yaw, and free-to-roll tests), and free-flying tests. Captive tests aim to measure damping and rotary derivatives but involve a support mechanism that restricts free motion, differing from real flight conditions. One-DOF tests, on the other hand, are utilised to measure dynamic stability derivatives and free-motion modes, as well as evaluate unsteady aerodynamic effects on the model motion. These tests involve the model being supported with an initial condition allowing for free motion. Free-flying tests are further categorised into low-speed tests with a power system and high-speed tests without a power system [3,4].
The preferred choice for conducting free-flying testing is typically a 3-DOF rotational motion rig, enabling force measurement either via a balance positioned at the rig mount or at the attachment point of the model. At Cranfield University, the available DWT testing rig features an open-section tunnel with a 1.5 × 1.1 m working section and a maximum speed capability of 40 m/s, facilitating investigation into four degrees of freedom (roll, pitch, yaw, and heave) of motion. This setup allows for a roll and pitch motion range of ±30°, a yaw motion range of 360°, and a maximum translation motion range of 0.75 m in heave [5,6]. Figure 1 depicts the Cranfield University DWT rig, featuring a 1/12 scale model of the BAE Hawk, enabling 3-DOF motion with a bottom rig mount. Another configuration is the DWT testing rig at the University of Bristol, which has a maximum 5-DOF capability by incorporating a 2-DOF gimbal at the pendulum attachment and a 3-DOF gimbal mounted at the model's centre of gravity [7,8,9]. Another 4-DOF (pitch, roll, yaw, and heave) dynamic wind tunnel rig has also been developed [10].
On-board instrumentation plays a crucial role in aircraft model testing, with inertial measurement units (IMUs) being a popular choice for measuring rotation rates and displacement accelerations. IMUs, consisting of three gyroscopes and accelerometers mounted in orthogonal axes, provide measurements in all three dimensions. Despite their effectiveness, IMUs exhibit varying bias performances, with micro-electro-mechanical systems (MEMSs) and ring laser gyro (RLG)-based IMUs from Honeywell ranging from 0.25 deg/h to 7 deg/h and 0.25 deg/h to 0.0006 deg/h, respectively [11]. However, high-accuracy IMUs are often expensive and bulky, posing limitations for DWT testing models. Alternatively, low-cost MEMS sensors can be employed for scaled model testing, although with reduced accuracy due to inherent errors such as bias error, repeatability, stability, drift, sensor noise, misalignment, and latency. To mitigate these errors, sensor fusion techniques, for example using multiple IMUs and model-based filters such as the extended Kalman and Sage-Husa adaptive Kalman filters [12], are widely used. In contrast, off-board sensors such as high-speed cameras offer real-time data acquisition capabilities, presenting an advantage in dynamic wind tunnel testing for identifying flight dynamic models. However, their performance can be hindered by factors such as lighting and background settings. The continuous advances in computer vision techniques present an opportunity to enhance the accuracy of data obtained in dynamic wind tunnel experiments.
Computer vision techniques for attitude estimation often rely on visual markers placed on the model. Eliminating markers, however, allows the algorithm to be applied across diverse aircraft wing configurations, including high, low, and mid-wing designs. This versatility makes a markerless computer vision algorithm adaptable to various aircraft models without being limited to specific configurations.
Moreover, the absence of markers broadens the research application to various fields where computer vision is essential, such as air traffic control tower surveillance [13] and marine operations [14]. While traditional applications focus more on target positions rather than attitudes, this research lays a foundation for incorporating attitude estimation in such scenarios. Additionally, as the use of drones expands across sectors like rescue operations, surveillance, and agriculture, the need for drone identification and classification becomes crucial [15]. The findings of this research can potentially be effectively applied in state estimations for drones, as well as for advanced techniques aimed at accurately identifying, tracking, and estimating missile dynamics for military purposes.
In this paper, we propose a method for accurately acquiring dynamic wind tunnel data for a scaled rigid-body aircraft model without visual markers on the model. The method employs a sequential process involving Harris corner detection and Kanade–Lucas–Tomasi tracking. From this, an aircraft skeleton position is obtained. Then, the Euler angles and Euler rates may be estimated to capture the attitude of the aircraft model. Validation of this approach is conducted by simulating a 3D CAD model of a 1/12 sub-scale BAE Hawk fighter aircraft model using MATLAB Simscape. Note that an early version of this work was previously presented in [16]; this paper improves and extends the previous study in several key areas. Quaternions are used for the attitude estimation instead of the direction cosine matrix, improving the accuracy of the results. Notably, a detailed sensitivity analysis on the use of quaternions is conducted. The analysis identifies first-order changes in the estimated angles resulting from small changes in the identified position of the aircraft axes. Furthermore, this paper adds sensor fusion between on-board and off-board instrumentation by implementing an extended Kalman filter, as suggested in [16].
The contributions of this paper include the following: (i) A sequence of established computer vision techniques is proposed for solving a new problem. These include feature detection, specifically corner detection, and feature tracking to obtain an aircraft skeleton. (ii) A customised transformation method from 2D to 3D coordinates tailored for a stereo orthogonal setup is presented. Furthermore, the Euler angles that are well established in aerospace for attitude estimation are estimated from the aircraft skeleton points by employing quaternions and transforming them to Euler angles. (iii) A sensitivity analysis is performed; this determines first-order changes in the Euler angles resulting from alterations in the positioning of the aircraft skeleton points. This analysis serves to pinpoint the aircraft skeleton points that exert the most significant impact on the Euler angles, enhancing the accuracy and precision of the estimation process. (iv) Finally, by combining these advances, this interdisciplinary research proposes a complete framework for accurate attitude estimation in dynamic wind tunnel applications, with a focus on identifying and mitigating potential sources of error.
The rest of this paper is structured as follows. Section 2 discusses the related work pertinent to the innovation presented in this paper, namely the application of computer vision techniques for estimating the attitude of an aircraft model during dynamic wind tunnel testing. Section 3 discusses attitude estimation using our proposed method. Section 4 details the 3D CAD model simulation. Section 5 discusses the conducted sensitivity analysis. Section 6 elaborates on the extended Kalman filter utilised for sensor fusion, integrating data from both the off-board and on-board sensors. Section 7 discusses the results. Our conclusions are presented in Section 8.

2. Related Work

Various object detection techniques utilising laser and self-luminous markers [17], as well as multiple line marks on the sides of train models [18] and markers [19], have been employed for pose measurements in wind tunnels. Additionally, object-tracking techniques, including particle filtering [20], saliency combined particle filtering [21], and background subtraction [22], have been utilised for aerial UAV tracking.
Extensive research has been conducted on using photogrammetry for various applications, including obtaining aerodynamic parameters from wind tunnel experiments. At NASA's Langley Research Center, a digital photogrammetry technique has been employed to measure model deformation and sting bending using a single camera [23]. This method allows for the extraction of 2D or 3D spatial measurements, which are then transformed into engineering variables such as attitude and deformation. By leveraging the capabilities of computer vision to perform autonomous algorithms, the accuracy and speed of these measurements are significantly improved. Specifically, computer vision was utilised to monitor model deformation. The primary goal of this work was to enhance the deformation measurement capability, which also holds potential for measuring model attitude.
A binocular vision method using a photogrammetry model based on multi-layer refraction and laser strips has been proposed for measuring the position and attitude of high-speed rolling targets in wind tunnel experiments [24]. Laser-aided vision technology for high-precision pose measurement in wind tunnels has also been suggested for rolling targets, though improvements could include speed measurement and increased precision [17]. A monocular vision method using coloured images achieves accurate positioning through image differencing, accumulation, and marker position estimation, with a precision better than 0.19 mm in position and 0.18° in the pitch and yaw angles [25].
For supersonic wind tunnels, a pose measurement method employing ultrathin retro-reflection markers and spatial coding achieves high accuracy, with precision values of less than 0.16 mm, 0.132°, and 0.712° for position, pitch, and yaw, respectively [26]. Additionally, an attitude and position measurement system for slender models using two cameras, one positioned to the side and the other looking up the model, achieves a maximal error of less than 0.05° for attitude measurement [27].
A photogrammetry system using two video cameras has been proposed for measuring the attitude of models in hypersonic wind tunnels, achieving an accuracy of 0.1° for pitch, although these results have not been independently validated [28]. Another approach has been suggested using a single camera, offering a flexible position and attitude measurement method [29]. In this method, the focal length and camera orientation can be adjusted for various measurement conditions without needing to recalibrate the camera. This technique can achieve an accuracy of 0.2866 mm for displacement measurements of planar targets and 0.2615° for attitude measurements of planar objects.
A remote sensing system using a motion capture camera for the position measurement of a T-shaped object with visual markers has been proposed [30]. This method has been compared with the commonly used laser displacement sensor (LDS) method, and the results indicate that the motion capture method is equally accurate. Additionally, a pose measurement technique for an aircraft model in a wire-driven parallel suspension system (WDPSS) using monocular vision and visual markers, along with HALCON for data processing, has been suggested [31]. While this method is feasible, further work is needed to improve accuracy and achieve dynamic real-time measurement.
A method using two high-speed industrial cameras and high-contrast targets on an aircraft model surface has been proposed for the angle of attack measurement, with accuracy within 0.01° [32]. This method shows promise for effective measurement in high-speed wind tunnels. Another approach, employing monocular vision and visual markers on the model along with HALCON for data processing, has been suggested, providing accurate results with a relative error of less than 2% compared to three-axis turntable measurements [33]. However, there is room for enhancement in this method.
For spin tests in vertical wind tunnels, a method utilising two CCD cameras and visual markers achieves an accuracy of 0.1° for attitude angle measurement [34]. Another method based on stereo vision, involving the projection of collimated laser beams onto screens, demonstrates promising results with maximum deviations of less than 0.05° for pitch, roll, and yaw angles, with validation performed in a controlled laboratory environment [35].
The only existing work on estimating aircraft dynamics using computer vision without markers, conducted by [36], explores computer vision-based estimation of aircraft dynamics, employing optical flow data captured by a downward-looking camera and processed through an extended Kalman filter, complemented by Monte Carlo simulations for analysis. In contrast, our research adopts a novel approach utilising 3D CAD simulations to replicate real-world conditions and environments, facilitating the reproduction of real-world experiments, aiding in error identification, and allowing multiple degrees of freedom to be set for the model. MATLAB Simscape serves as a valuable tool for modelling, simulating, and visualising the CAD model. While [36] utilise Monte Carlo simulations primarily for statistical analysis and uncertainty assessment, our Simscape simulations concentrate on the dynamic behaviour and performance analysis of physical systems, enabling the examination of time-domain response, transient behaviour, steady-state characteristics, and control of physical systems. Simscape proves particularly beneficial for designing and testing control systems and assessing system performance under diverse conditions. Compared to the approach of [36], our method offers a cost-effective alternative before conducting dynamic wind tunnel testing, producing results comparable to real-world experimental outcomes.

3. Attitude Estimation

This section elaborates on the methodology employed for attitude estimation of an aircraft model using computer vision, without the need for visual markers on the model. The approach follows a systematic process that involves several key steps:
  • Feature identification: Initially, key features representing the aircraft’s skeleton are identified within image frames extracted from video footage. The video is captured using an orthogonal stereo camera setup, which provides both top and side views of the aircraft. This stereo vision is critical for accurate spatial analysis. Section 3.1 details feature identification and tracking in initial and consecutive frames.
  • Two-dimensional (2D) to 3D transformation: Once the key features are identified, their coordinates in the 2D-pixel coordinates are transformed into 3D coordinates. This transformation leverages the orthogonal views provided by the stereo camera setup to accurately reconstruct the spatial configuration of the identified points. Section 3.2 discusses the theory of the pinhole camera model to transform the 2D pixel coordinates to 3D coordinates. In addition, Section 3.3 details the tailored method for the transformation of 2D pixel coordinates to complete (X, Y, Z) 3D coordinates.
  • Attitude estimation: With the 3D coordinates of the key features obtained, the next step involves estimating the aircraft’s attitude. This is achieved by employing quaternions, a mathematical representation that facilitates smooth and continuous rotation calculations. The Euler angles, which describe the aircraft’s orientation, are derived from these quaternions. Section 3.4 and Section 3.4.1 describe the widely applied Euler angles for aerospace applications and our proposed method for estimating Euler angles from aircraft skeleton points by employing quaternions.
This methodology provides a robust and efficient approach to aircraft attitude estimation, eliminating the need for physical markers while leveraging computer vision techniques for accurate spatial analysis.

3.1. Feature Detection and Tracking

After acquiring the video and extracting initial frames from both top and side views, the region of interest (ROI) is selected, encompassing the wingspan and fuselage length. Subsequently, the focus shifts to detecting corner points within the image. Harris corner and edge detection, as outlined by Harris [37], is a widely recognised approach for corner detection. This method is employed in the feature detection stage of the process shown in Figure 2 to identify corner points on the aircraft model. The corner points are obtained in these ROIs using the MATLAB Computer Vision Toolbox detectHarrisFeatures function. The details and methodology of Harris corner detection can be found in [37].
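For illustration, a minimal MATLAB sketch of this detection step is given below; the variable names (vTop, roiTop) and the choice of the 20 strongest corners are illustrative assumptions, not values from the experiment.

```matlab
% Feature detection sketch: Harris corners within a user-selected ROI.
% Assumes vTop is a VideoReader for the top-view video and roiTop is an
% [x y width height] rectangle covering the wingspan and fuselage.
frame  = rgb2gray(readFrame(vTop));                 % first top-view frame
pts    = detectHarrisFeatures(frame, 'ROI', roiTop);
strong = selectStrongest(pts, 20);                  % keep the most distinctive corners
imshow(frame); hold on; plot(strong);               % visual check of detected corners
```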
The aircraft skeleton is defined by key corner points, including the nose tip, midpoint of the horizontal stabiliser, wingtips, and vertical stabiliser tip. The nose tip and midpoint of the horizontal stabiliser represent the fuselage length along the X-axis of the aircraft. The wingtips define the wingspan along the Y-axis, while the vertical stabiliser tip and an intersecting point from this tip to the fuselage length in the side view define the Z-axis.
These corner points, which include fuselage length and wingspan, are visually identifiable in the top view, while the fuselage length and vertical stabiliser length are easily seen in the side view. The remaining corner points are identified using aircraft geometry and the knowledge of the setup of the 1-DOF test as described in Section 4.
The corner points in the initial frame are used in the feature point tracking stage shown in Figure 2. The methodology adopted in this research involves the application of the Kanade–Lucas–Tomasi [38,39] tracking technique to monitor strategically selected points within the initial frame, which serve as pivotal landmarks on the model aircraft skeleton. These key points include the wing tip points and the nose tip in the top view, and both the tail tip and nose tip in the side view. Subsequent frames are then analysed to identify the remaining corner points of the aircraft skeleton by applying Harris corner detection together with aircraft geometry and insights gained from the experimental test procedures described in Section 4. The operation of the Kanade–Lucas–Tomasi tracking algorithm, which estimates the displacement between an initialised point and the next frame to track the point across frames, is detailed in [38,39]. This approach ensures precise and comprehensive tracking of the model aircraft movement, facilitating a thorough analysis of its behaviour and performance characteristics.
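A minimal sketch of this tracking step is shown below, using the Computer Vision Toolbox point tracker, which implements the Kanade–Lucas–Tomasi algorithm; the initialisation reuses the Harris corners from the sketch above, and the re-identification of lost points is only indicated by a comment.

```matlab
% Feature tracking sketch: KLT tracking of the skeleton landmark points.
tracker = vision.PointTracker('MaxBidirectionalError', 2);
initialize(tracker, strong.Location, frame);        % seed with initial corner locations
while hasFrame(vTop)
    nextFrame = rgb2gray(readFrame(vTop));
    [points, validity] = tracker(nextFrame);        % displaced locations in the new frame
    % Invalid points would be re-identified here via Harris detection,
    % aircraft geometry, and knowledge of the 1-DOF test setup (Section 4).
end
```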

3.2. Camera Parameters

This section describes the camera parameters based on the widely used pinhole camera model. The objective is to determine the location of the corner points within 3D space from their corresponding image points, which result from projecting the 3D points onto a plane, as illustrated in Figure 3. The transformation is described in detail here and serves as the penultimate stage of the process outlined in Figure 2.

3.2.1. Extrinsic Parameters of the Simscape Model

The rotation matrix, $R_T$, from body coordinates $(X_b, Y_b, Z_b)$ to camera coordinates $(X_c, Y_c, Z_c)$ in Simscape for the top view is given by the following:

$$R_T = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}.$$

Similarly, the rotation matrix, $R_S$, from body coordinates to camera coordinates in Simscape for the side view is as follows:

$$R_S = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix}.$$

The translation vectors for the two views are as follows:

$$t_T = \begin{bmatrix} 0 & 0 & 50\,\mathrm{cm} \end{bmatrix}^T,$$
$$t_S = \begin{bmatrix} 0 & 50\,\mathrm{cm} & 0 \end{bmatrix}^T.$$

3.2.2. Transformation from 3D to 2D

The extrinsic parameters of rotation and translation are multiplied with a 3D point to obtain the point in the camera coordinate system:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = \begin{bmatrix} R & t \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix},$$

where $R$ is the rotation matrix from body coordinates to camera coordinates and $t$ is the translation vector. Multiplying this by the camera's intrinsic parameters, the focal length, $f$, and principal point, $(c_x, c_y)$, we obtain the 3D point in homogeneous pixel coordinates:

$$\begin{bmatrix} u' \\ v' \\ w \end{bmatrix} = \begin{bmatrix} f & 0 & c_x \\ 0 & f & c_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix}.$$

The 2D pixel coordinates are then obtained by the perspective division:

$$u = \frac{u'}{w}, \qquad v = \frac{v'}{w}.$$
The camera projection parameters are chosen for this simulation as follows. The principal point is positioned at the centre of gravity of the model in the top and side views. By aligning the principal point with the centre of gravity of the aircraft model, a common reference point is established for both the top and side views. This alignment simplifies the geometric transformations required for 3D reconstruction, reducing parallax errors and enhancing the accuracy of depth estimation. Moreover, this configuration streamlines mathematical modelling and camera calibration processes. It also improves the interpretability and visualisation of the aircraft model structure and dimensions, facilitating more precise and consistent measurements. In addition, the focal length is set to 1, as it does not affect the camera projection matrix in this simulation context. This approach effectively removes any additional complexity that could arise from varying focal lengths or scaling factors, thus streamlining the validation process.
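A minimal sketch of this projection, under the stated assumptions ($f = 1$, principal point at the model's centre of gravity), is given below; P, R, and t denote a 3D point and the per-view extrinsics from Section 3.2.1, and the numeric intrinsics are illustrative.

```matlab
% Projection sketch: body-frame point to 2D pixel coordinates (Section 3.2.2).
f = 1;  cx = 0;  cy = 0;           % illustrative intrinsics; (cx, cy) at the CG
K  = [f 0 cx; 0 f cy; 0 0 1];      % intrinsic matrix
pc = R * P + t;                    % extrinsic transform into camera coordinates
ph = K * pc;                       % homogeneous pixel coordinates
uv = ph(1:2) / ph(3);              % perspective division gives (u, v)
```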

3.2.3. Transformation from 2D to 3D

Transforming a corner point on a 2D plane (identified in Section 3.1) to a 3D point is performed as follows. The pixel coordinates $(u, v)$ obtained from an image are transformed into image coordinates by:

$$X_I = d_{px}(u - c_x),$$
$$Y_I = d_{py}(v - c_y),$$

where $(c_x, c_y)$ is the principal point and $(d_{px}, d_{py})$ is the metres/pixel scaling pair, chosen to be $(1, 1)$ for simplicity. Since all the points lie on the image plane, the camera coordinates are obtained by:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = \begin{bmatrix} X_I \\ Y_I \\ f \end{bmatrix}.$$

The camera coordinates are transformed into 3D coordinates by:

$$\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} = \begin{bmatrix} R^T \\ t^T \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix},$$

where $R$ is the rotation matrix from body coordinates to camera coordinates, $t$ is the translation vector, and $(\cdot)^T$ denotes the transpose. These transformations are applied to both the top and side views.
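For illustration, a minimal sketch of this inverse mapping is given below; it applies the conventional inversion of the extrinsic transform, which for the identity-like rotations and axis-aligned translations of Section 3.2.1 is equivalent to the expression above.

```matlab
% Back-projection sketch: pixel coordinates to 3D coordinates (Section 3.2.3).
XI = 1 * (u - cx);                 % image coordinates, with (dpx, dpy) = (1, 1)
YI = 1 * (v - cy);
pc = [XI; YI; f];                  % camera coordinates on the image plane
pw = R' * (pc - t);                % conventional inverse of the extrinsic transform
```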

3.2.4. Camera Pose Estimation and Triangulation

The relative pose of the side camera with respect to the top camera can be estimated from the known extrinsic parameters of both cameras:

$$R_R = R_T R_S,$$
$$t_R = t_S - t_T.$$

Using these known parameters to triangulate the 3D points with the MATLAB triangulate function, which is based on the direct linear transformation (DLT) [40], induces errors in the identified 3D points owing to the method's sensitivity to noise.

3.3. 3D Aircraft Model Skeleton

The transformation of identified 2D pixel coordinates from the top and side views as described in Section 3.1 into 3D coordinates is achieved using the respective camera projection matrices as described in Section 3.2. This process yields accurate (X, Y) 3D coordinates from the top view and (X, Z) 3D coordinates from the side view. The complete 3D coordinates of the aircraft skeleton are then obtained by averaging the shared X-coordinate from both views, using the Y-coordinate from the top view and the Z-coordinate from the side view. Integrating corner points from both camera perspectives by utilising this tailored method for orthogonal stereo setup yields the complete aircraft skeleton, as depicted in Figure 4.
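A minimal sketch of this fusion step is given below; Xtop, Ytop, Xside, and Zside are illustrative names for the per-view reconstructed coordinates of the six skeleton points.

```matlab
% Orthogonal-view fusion sketch: build the complete 3x6 skeleton matrix P3.
X  = (Xtop + Xside) / 2;           % shared X-coordinate, averaged over both views
P3 = [X; Ytop; Zside];             % Y from the top view, Z from the side view
```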

3.4. Euler Angles

The final stage of the process shown in Figure 2 is to extract the Euler or Tait–Bryan angles (roll, pitch, yaw). The aircraft model is given the standard right-handed set of body axes used for aircraft dynamics [16], with the X-axis pointing forward, the Y-axis pointing starboard, and the Z-axis pointing downward. The rotation matrices for roll, pitch, and yaw are given next. The rotation matrix for a roll rotation around the X-axis through the angle $\phi$ is given by the following:

$$R_X = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & \sin\phi \\ 0 & -\sin\phi & \cos\phi \end{bmatrix}$$
The rotation matrix around the new Y-axis is a pitch rotation through the angle $\theta$ and is given by the following:

$$R_Y = \begin{bmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{bmatrix}$$
The final rotation is around the Earth-frame Z-axis through a yaw angle $\psi$, with the rotation matrix

$$R_Z = \begin{bmatrix} \cos\psi & \sin\psi & 0 \\ -\sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
The rotation matrix (known as the direction cosine matrix) for the rotation sequence around the X-Y-Z axes through the angles $\phi$, $\theta$, $\psi$ is given by the following:

$$R = \begin{bmatrix} \cos\theta\cos\psi & \cos\theta\sin\psi & -\sin\theta \\ \sin\phi\sin\theta\cos\psi - \cos\phi\sin\psi & \sin\phi\sin\theta\sin\psi + \cos\phi\cos\psi & \sin\phi\cos\theta \\ \cos\phi\sin\theta\cos\psi + \sin\phi\sin\psi & \cos\phi\sin\theta\sin\psi - \sin\phi\cos\psi & \cos\phi\cos\theta \end{bmatrix}$$
The Euler angles can be extracted from the DCM, $R$, and are given by the following:

$$\theta = -\arcsin R_{13},$$
$$\phi = \arctan\frac{R_{23}}{R_{33}},$$
$$\psi = \arctan\frac{R_{12}}{R_{11}}.$$
The Euler rates are computed to derive dynamic measurements for attitude estimation. These rates provide crucial insights into the aircraft dynamics, aiding in the accurate determination of its orientation over time. The Euler rates $(\dot\phi, \dot\theta, \dot\psi)$ from the body angular rates $(p, q, r)$ are given by the following:

$$\begin{bmatrix} \dot\phi \\ \dot\theta \\ \dot\psi \end{bmatrix} = \begin{bmatrix} 1 & \sin\phi\tan\theta & \cos\phi\tan\theta \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi\sec\theta & \cos\phi\sec\theta \end{bmatrix} \begin{bmatrix} p \\ q \\ r \end{bmatrix}$$
Since the body angular rate $p$ is unknown, the Euler rate $\dot\phi$ is obtained by backward-difference numerical differentiation as follows:

$$\dot\phi = \frac{\phi(t) - \phi(t-1)}{\Delta t}.$$
Although this method causes a time shift, it is a feasible solution for implementation in a real-time system. The angular rates θ ˙ and ψ ˙ are similarly obtained.
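As a compact illustration of the DCM angle extraction and the backward-difference rate described above, consider the sketch below; atan2 is used in place of arctan for quadrant safety, and phiHist and dt are illustrative names for the roll-angle history and the frame period.

```matlab
% Sketch: Euler angles from a DCM R, and a backward-difference roll rate.
theta = -asin(R(1,3));
phi   = atan2(R(2,3), R(3,3));     % atan2 preserves the quadrant
psi   = atan2(R(1,2), R(1,1));
phidot = [0, diff(phiHist)] / dt;  % backward difference over the angle history
```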

3.4.1. Euler Angles and Rates from Aircraft Skeleton

Let $P_0$ represent a matrix containing the initial skeleton points on the aircraft model before an Euler rotation, in 3D coordinates, and let $P_3$ represent the matrix containing the corresponding points on the aircraft model after an Euler rotation. These are the points obtained as described in Section 3.3. The matrix $P_3$ can be expanded as:

$$P_3 = \begin{bmatrix} X_3(1,1) & X_3(1,2) & \cdots & X_3(1,6) \\ Y_3(2,1) & Y_3(2,2) & \cdots & Y_3(2,6) \\ Z_3(3,1) & Z_3(3,2) & \cdots & Z_3(3,6) \end{bmatrix}$$
The X-axis vector of the aircraft model is at $P_3(i,j)$, where $i = 1, 2, 3$ and $j = 1, 2$ are the rows and columns of the matrix, respectively. The resultant vector is obtained by the following:

$$X(3) = \begin{bmatrix} X_3(1,2) - X_3(1,1) \\ Y_3(2,2) - Y_3(2,1) \\ Z_3(3,2) - Z_3(3,1) \end{bmatrix}$$

To obtain the X-axis vector with unity magnitude and without changing its direction, the unit-vector components ($i$, $j$, and $k$) are obtained by normalising the vector as follows:

$$\hat{X}(3) = \frac{X(3)}{\|X(3)\|}$$

where $\|X(3)\|$ denotes the Euclidean norm of $X(3)$.
Similarly, the Y-axis vector of the aircraft model is at $P_3(i,j)$, where $i = 1, 2, 3$ and $j = 3, 4$ are the rows and columns of the matrix, respectively. The resultant and normalised vectors are obtained by the following:

$$Y(3) = \begin{bmatrix} X_3(1,4) - X_3(1,3) \\ Y_3(2,4) - Y_3(2,3) \\ Z_3(3,4) - Z_3(3,3) \end{bmatrix}, \qquad \hat{Y}(3) = \frac{Y(3)}{\|Y(3)\|}.$$
Finally, the Z-axis vector of the aircraft model is at $P_3(i,j)$, where $i = 1, 2, 3$ and $j = 5, 6$ are the rows and columns of the matrix, respectively. The resultant and normalised vectors are obtained by the following:

$$Z(3) = \begin{bmatrix} X_3(1,6) - X_3(1,5) \\ Y_3(2,6) - Y_3(2,5) \\ Z_3(3,6) - Z_3(3,5) \end{bmatrix}, \qquad \hat{Z}(3) = \frac{Z(3)}{\|Z(3)\|}.$$
The rotation matrix for the orientation is given by the following:

$$Q_T(3) = \begin{bmatrix} \hat{X}(3) & \hat{Y}(3) & \hat{Z}(3) \end{bmatrix}$$

This matrix is used to extract the quaternion components $(q(3)_w, q(3)_x, q(3)_y, q(3)_z)$ of $Q_T(3)$ as follows:

$$q(3)_w = \frac{1}{2}\sqrt{1 + \frac{X_3(1,2) - X_3(1,1)}{\|X(3)\|} + \frac{Y_3(2,4) - Y_3(2,3)}{\|Y(3)\|} + \frac{Z_3(3,6) - Z_3(3,5)}{\|Z(3)\|}}$$

$$q(3)_x = \frac{1}{4\,q(3)_w}\left(\frac{Y_3(2,6) - Y_3(2,5)}{\|Z(3)\|} - \frac{Z_3(3,4) - Z_3(3,3)}{\|Y(3)\|}\right)$$

$$q(3)_y = \frac{1}{4\,q(3)_w}\left(\frac{Z_3(3,2) - Z_3(3,1)}{\|X(3)\|} - \frac{X_3(1,6) - X_3(1,5)}{\|Z(3)\|}\right)$$

$$q(3)_z = \frac{1}{4\,q(3)_w}\left(\frac{X_3(1,4) - X_3(1,3)}{\|Y(3)\|} - \frac{Y_3(2,2) - Y_3(2,1)}{\|X(3)\|}\right)$$
Since the aircraft model is trimmed in the initial frame and has no rotation, the initial quaternion matrix obtained is the identity matrix. Therefore, the quaternion matrix and the quaternion components are as follows:

$$Q_T(0) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

$$q(0)_w = 1, \quad q(0)_x = 0, \quad q(0)_y = 0, \quad q(0)_z = 0.$$
The relative orientation between the two frames is obtained from the quaternions by the following:

$$q(r) = q(3)\,q(0)^{-1}$$

where $q(\cdot)$ is the quaternion, including the scalar and vector components. The inverse of a quaternion $q$ is equal to its complex conjugate $q^*$ when the quaternion is normalised; that is, $|q|^2 = q_w^2 + q_x^2 + q_y^2 + q_z^2 = 1$. Therefore, $q(0)^{-1} = q(0)^*$, and the equation is modified as follows:

$$q(r) = q(3)\,q(0)^*$$

where the complex conjugate is $q(0)^* = q(0)_w - i\,q(0)_x - j\,q(0)_y - k\,q(0)_z$. By using the quaternion product rule, the relative orientation can be rewritten as follows:

$$q(r) = q(3)_w + \vec{q}(3) = q(3)$$
where $q(3)_w$ is the scalar component and $\vec{q}(3)$ is the vector component of the quaternion. The quaternion components $(q(3)_w, q(3)_x, q(3)_y, q(3)_z)$ are used to obtain the Euler angles $(\phi, \theta, \psi)$ using the following equations:

$$\phi = \arctan\frac{Q_T(3)_{2,3}}{Q_T(3)_{3,3}} = \arctan\left(\frac{2\,q(3)_y q(3)_z + 2\,q(3)_w q(3)_x}{q(3)_w^2 - q(3)_x^2 - q(3)_y^2 + q(3)_z^2}\right)$$

$$\theta = -\arcsin Q_T(3)_{1,3} = -\arcsin\left(\frac{2\,q(3)_x q(3)_z - 2\,q(3)_w q(3)_y}{q(3)_w^2 + q(3)_x^2 + q(3)_y^2 + q(3)_z^2}\right)$$

$$\psi = \arctan\frac{Q_T(3)_{1,2}}{Q_T(3)_{1,1}} = \arctan\left(\frac{2\,q(3)_x q(3)_y + 2\,q(3)_w q(3)_z}{q(3)_w^2 + q(3)_x^2 - q(3)_y^2 - q(3)_z^2}\right)$$
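A minimal sketch of the full skeleton-to-Euler-angle chain of this subsection is given below, assuming P3 is the 3 × 6 matrix defined in (23) with column pairs (1,2), (3,4), and (5,6) spanning the body X-, Y-, and Z-axes.

```matlab
% Sketch: Euler angles from the aircraft skeleton via quaternions (Section 3.4.1).
Xv = P3(:,2) - P3(:,1);  Xv = Xv / norm(Xv);   % unit X-axis vector
Yv = P3(:,4) - P3(:,3);  Yv = Yv / norm(Yv);   % unit Y-axis vector
Zv = P3(:,6) - P3(:,5);  Zv = Zv / norm(Zv);   % unit Z-axis vector
QT = [Xv, Yv, Zv];                             % orientation matrix, Eq. (30)
qw = 0.5 * sqrt(1 + QT(1,1) + QT(2,2) + QT(3,3));  % valid while 1 + trace > 0
qx = (QT(2,3) - QT(3,2)) / (4*qw);             % sign convention as in Eqs. (32)-(34)
qy = (QT(3,1) - QT(1,3)) / (4*qw);
qz = (QT(1,2) - QT(2,1)) / (4*qw);
phi   = atan2(2*(qy*qz + qw*qx), qw^2 - qx^2 - qy^2 + qz^2);
theta = -asin(2*(qx*qz - qw*qy));              % quaternion already normalised
psi   = atan2(2*(qx*qy + qw*qz), qw^2 + qx^2 - qy^2 - qz^2);
```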

4. 3D CAD Simulation

A 3D CAD rendition of a 1/12-scale BAE Hawk is crafted using SolidWorks (Student version 2021-22). This model is then imported into MATLAB Simscape (version R2023a) for modelling and simulating the aircraft motion. Within the Simscape environment, a model is constructed featuring gimbal joints to facilitate the Euler angle transformation of the CAD model, alongside a transform sensor to extract the rotation matrix. Subsequently, the angles $\phi$, $\theta$, and $\psi$ are applied to the Simscape model.
The presented experiment tests just a single degree-of-freedom, whereby the pitch angle undergoes sinusoidal variation and the roll and yaw are fixed at zero. This experiment demonstrates the method using a longitudinal motion rig; additional work is ongoing to develop a novel 3-DOF rig capable of testing both the short-period and phugoid modes [41,42].
Figure 5 and Figure 6 depict the applied pitch angle and rate. It is worth noting the presence of numerical rounding noise in the roll and yaw angles.
Realistic data from an on-board sensor comprising an IMU and potentiometer gimbal is generated to facilitate sensor fusion. This process is briefly outlined here. After acquiring the Euler rates as described in Equation (22), we can determine ( p , q , r ) by taking the inverse of Equation (21). Gyroscope measurements of body angular rates are typically subject to noise. However, in this simulation, ideal (noise-free) measurements of body angular rates ( p , q , r ) are utilised as ground truth data to validate the state estimates obtained from the EKF. Similarly, the ideal measurements of an accelerometer can be derived using the rotation matrix R as mentioned in Equation (17) from the Simscape transform sensor, as outlined below:
$$\begin{bmatrix} acc_x \\ acc_y \\ acc_z \end{bmatrix} = R^T \begin{bmatrix} 0 \\ 0 \\ 9.81 \end{bmatrix}$$
To introduce noise and uncertainty into the gyroscope, accelerometer, and potentiometer measurements, zero-mean Gaussian noise is added as unbiased noise, along with a linear drift rate to the gyroscope measurements. The unbiased noise is generated using the normrnd function. The linear drift rate represents the accumulated bias drift over the test duration.
The generated noisy data are filtered through a low-pass filter, effectively reducing noise by attenuating high-frequency components while preserving low-frequency signals. This process also serves to smooth out fluctuations or jitter arising from mechanical imperfections, resulting in a stable and consistent output signal.
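A minimal sketch of this data generation is shown below; sigma, driftRate, fpass, and fs are illustrative values rather than the ones used in the study, and lowpass is the Signal Processing Toolbox filter.

```matlab
% Sketch: corrupt an ideal gyroscope channel, then low-pass filter it.
% Assumes qIdeal is a row vector of ideal pitch-rate samples at rate fs.
t      = (0:numel(qIdeal)-1) / fs;                       % time vector
qNoisy = qIdeal + normrnd(0, sigma, size(qIdeal)) ...    % unbiased Gaussian noise
                + driftRate * t;                         % accumulated linear drift
qFilt  = lowpass(qNoisy, fpass, fs);                     % attenuate high frequencies
```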

5. Sensitivity Analysis

A sensitivity analysis is conducted to approximate the change in the Euler angle functions due to small perturbations in their input variables, using a first-order Taylor series. The analysis is performed by calculating the Jacobians $\partial\phi/\partial P_3$, $\partial\theta/\partial P_3$, and $\partial\psi/\partial P_3$ with respect to each of the six final-frame world coordinates $(X, Y, Z)$ of the aircraft skeleton obtained in Section 3.3. The Euler angle Equations (38)–(40) can be rewritten for the sensitivity analysis as follows:
$$\phi = \arctan\left(\frac{2\,\mathrm{num}_y\,\mathrm{num}_z + 8\,\mathrm{num}_x\,q(3)_w^2}{16\,q(3)_w^4 - \mathrm{num}_x^2 - \mathrm{num}_y^2 + \mathrm{num}_z^2}\right)$$

$$\theta = -\arcsin\left(\frac{2\,\mathrm{num}_x\,\mathrm{num}_z - 8\,\mathrm{num}_y\,q(3)_w^2}{16\,q(3)_w^4 + \mathrm{num}_x^2 + \mathrm{num}_y^2 + \mathrm{num}_z^2}\right)$$

$$\psi = \arctan\left(\frac{2\,\mathrm{num}_x\,\mathrm{num}_y + 8\,\mathrm{num}_z\,q(3)_w^2}{16\,q(3)_w^4 + \mathrm{num}_x^2 - \mathrm{num}_y^2 - \mathrm{num}_z^2}\right)$$

where $\mathrm{num}_x$, $\mathrm{num}_y$, and $\mathrm{num}_z$ are the numerators of the quaternion components $q(3)_x$, $q(3)_y$, and $q(3)_z$ given in Equations (32)–(34). The first-order Taylor series approximations are given as follows:

$$\phi(P_3 + \delta P_3) \approx \phi(P_3) + \delta P_3\,\frac{\partial \phi}{\partial P_3}$$

$$\theta(P_3 + \delta P_3) \approx \theta(P_3) + \delta P_3\,\frac{\partial \theta}{\partial P_3}$$

$$\psi(P_3 + \delta P_3) \approx \psi(P_3) + \delta P_3\,\frac{\partial \psi}{\partial P_3}$$
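The Jacobians can also be checked numerically; the sketch below uses a forward finite difference, where eulerFromSkeleton is a hypothetical helper returning one Euler angle for a given skeleton matrix, and h is an illustrative perturbation size.

```matlab
% Sketch: finite-difference approximation of one sensitivity Jacobian.
h = 1e-6;                                  % illustrative perturbation size
J = zeros(1, numel(P3));
for k = 1:numel(P3)
    Pp    = P3;
    Pp(k) = Pp(k) + h;                     % perturb one skeleton coordinate
    J(k)  = (eulerFromSkeleton(Pp) - eulerFromSkeleton(P3)) / h;
end
dPhi = J * dP3(:);                         % first-order change for perturbation dP3
```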

6. Extended Kalman Filter

An extended Kalman filter is employed for sensor fusion of accelerometer, gyroscope, potentiometer gimbal, and vision data. The EKF enables a comprehensive understanding of the system’s behaviour. Through a process of prediction and correction, the filter refines its estimates over time, providing a fused representation of the underlying system state. This approach not only enhances accuracy but also improves robustness in scenarios where individual sensors may be prone to noise, biases, or data loss. A standard approach is used as follows:
$$\hat{x}(k|k-1) = f\big(\hat{x}(k-1|k-1), u(k-1)\big) + w_k$$
$$P(k|k-1) = F_k\,P(k-1|k-1)\,F_k^T + Q(k-1)$$
$$z_k = h\big(\hat{x}(k|k-1)\big) + v_k$$
$$\tilde{y}_k = z_k - h\big(\hat{x}(k|k-1)\big)$$
$$S_k = H_k\,P(k|k-1)\,H_k^T + R_k$$
$$K_k = P(k|k-1)\,H_k^T\,S_k^{-1}$$
$$\hat{x}(k|k) = \hat{x}(k|k-1) + K_k\,\tilde{y}_k$$
$$P(k|k) = \big(I - K_k H_k\big)\,P(k|k-1)$$

where

$$w_k \sim \mathcal{N}\big(0, Q(k)\big), \qquad v_k \sim \mathcal{N}\big(0, R(k)\big),$$

the state transition matrix is given by the following:

$$F_k = \left.\frac{\partial f}{\partial x}\right|_{\hat{x}(k-1|k-1),\,u(k-1)}$$

and the observation transition matrix is given by the following:

$$H_k = \left.\frac{\partial h}{\partial x}\right|_{\hat{x}(k|k-1)}$$
Equations (47), (48) and (57) represent the discrete-time predicted state estimate, covariance estimate, and state transition matrix. Equations (49)–(54) and (58) represent the measurement model, innovation or measurement residual, innovation covariance, near-optimal Kalman gain, updated state estimate, updated covariance estimate, and observation transition matrix, respectively. A loosely coupled EKF is implemented, as this reduces the computational complexity. The state vector, $x$, comprises the Euler angles and the gyroscope biases:

$$x = \begin{bmatrix} \phi & \theta & \psi & \mathrm{bias}_x & \mathrm{bias}_y & \mathrm{bias}_z \end{bmatrix}^T$$
The state control vector, $u$, is the angular velocity from the gyroscope and is given as follows:

$$u = \begin{bmatrix} p & q & r \end{bmatrix}^T$$
Given the state control vector, the Euler rates can be obtained by modifying (21) as follows:

$$\begin{bmatrix} \dot\phi \\ \dot\theta \\ \dot\psi \end{bmatrix} = \begin{bmatrix} 1 & \sin\phi\tan\theta & \cos\phi\tan\theta \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi\sec\theta & \cos\phi\sec\theta \end{bmatrix} \begin{bmatrix} p - \mathrm{bias}_x \\ q - \mathrm{bias}_y \\ r - \mathrm{bias}_z \end{bmatrix}$$
The predicted state estimate (47) is obtained as follows:

$$\hat{x}(k|k-1) = \hat{x}(k-1|k-1) + \begin{bmatrix} \dot\phi & \dot\theta & \dot\psi & 0 & 0 & 0 \end{bmatrix}^T \Delta t$$
The state transition matrix given in (57) is obtained column-wise as follows:

$$F_k(:,1) = \begin{bmatrix} 1 - \Delta t\big(\cos\phi\tan\theta\,(\mathrm{bias}_y - q) - \sin\phi\tan\theta\,(\mathrm{bias}_z - r)\big) \\ \Delta t\big(\cos\phi\,(\mathrm{bias}_z - r) + \sin\phi\,(\mathrm{bias}_y - q)\big) \\ \Delta t\big(\sin\phi\,(\mathrm{bias}_z - r)/\cos\theta - \cos\phi\,(\mathrm{bias}_y - q)/\cos\theta\big) \\ 0 \\ 0 \\ 0 \end{bmatrix}$$

$$F_k(:,2) = \begin{bmatrix} -\Delta t\big(\cos\phi\,(\mathrm{bias}_z - r)(\tan^2\theta + 1) + \sin\phi\,(\mathrm{bias}_y - q)(\tan^2\theta + 1)\big) \\ 1 \\ -\Delta t\big(\cos\phi\sin\theta\,(\mathrm{bias}_z - r)/\cos^2\theta + \sin\phi\sin\theta\,(\mathrm{bias}_y - q)/\cos^2\theta\big) \\ 0 \\ 0 \\ 0 \end{bmatrix}$$

$$F_k(:,3{:}6) = \begin{bmatrix} 0 & -\Delta t & -\Delta t\sin\phi\tan\theta & -\Delta t\cos\phi\tan\theta \\ 0 & 0 & -\Delta t\cos\phi & \Delta t\sin\phi \\ 1 & 0 & -\Delta t\sin\phi/\cos\theta & -\Delta t\cos\phi/\cos\theta \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
The measurement model for sensor fusion is obtained as follows:

$$z_k = \begin{bmatrix} acc_x & acc_y & acc_z & \theta_{pot} & \phi_{cam} & \theta_{cam} & \psi_{cam} \end{bmatrix}^T$$
The observation model that maps the state vector to the measurement vector is obtained as follows:

$$h\big(\hat{x}(k|k-1)\big) = \begin{bmatrix} -\sin\theta & \cos\theta\sin\phi & \cos\phi\cos\theta & \theta & \phi & \theta & \psi \end{bmatrix}^T$$
The observation transition matrix given in (58) is obtained as follows:

$$H_k = \begin{bmatrix} 0 & -\cos\theta & 0 & 0 & 0 & 0 \\ \cos\phi\cos\theta & -\sin\phi\sin\theta & 0 & 0 & 0 & 0 \\ -\cos\theta\sin\phi & -\cos\phi\sin\theta & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 \\ 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 & 0 \end{bmatrix}$$
The measurement noise covariance matrix is obtained as follows:

$$R_k = \mathrm{diag}\big(r_{acc_x},\ r_{acc_y},\ r_{acc_z},\ r_{\theta_{pot}},\ r_{\phi_{cam}},\ r_{\theta_{cam}},\ r_{\psi_{cam}}\big)$$
The updated state estimate given in (53) is obtained as follows:

$$\hat{x}(k|k) = \begin{bmatrix} \phi_{\mathrm{updated}} & \theta_{\mathrm{updated}} & \psi_{\mathrm{updated}} & \mathrm{bias}_{x,\mathrm{updated}} & \mathrm{bias}_{y,\mathrm{updated}} & \mathrm{bias}_{z,\mathrm{updated}} \end{bmatrix}^T$$
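Putting the pieces together, one iteration of the loosely coupled EKF could look like the sketch below; stateTransition, stateJacobian, observationModel, and observationJacobian are hypothetical helpers implementing the expressions above.

```matlab
% Sketch: one predict-correct iteration of the loosely coupled EKF.
xPred  = stateTransition(xEst, u, dt);         % predicted state, Eq. (47)
Fk     = stateJacobian(xEst, u, dt);           % state transition matrix, Eq. (57)
PPred  = Fk * P * Fk' + Q;                     % predicted covariance, Eq. (48)
yTilde = z - observationModel(xPred);          % innovation, Eq. (50)
Hk     = observationJacobian(xPred);           % observation matrix, Eq. (58)
Sk     = Hk * PPred * Hk' + R;                 % innovation covariance, Eq. (51)
Kk     = (PPred * Hk') / Sk;                   % Kalman gain, Eq. (52)
xEst   = xPred + Kk * yTilde;                  % updated state, Eq. (53)
P      = (eye(6) - Kk * Hk) * PPred;           % updated covariance, Eq. (54)
```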

7. Results

First, we discuss the intermediate outcomes of identifying the aircraft skeleton from both side and top images, as the accuracy of attitude estimation heavily relies on the precise positioning of the corner points within the aircraft model skeleton.
The utilisation of Harris corner detection to identify corner points introduces varying errors in the pixel coordinates $(u, v)$, leading to inaccuracies in the Euler angle estimation. The errors derived from Harris detection, exemplified by a single frame, are presented in Table 1. Additionally, Figure 7 illustrates the corner points and aircraft model skeleton in the pixel coordinates, with green and red colours representing the initial frame and a consecutive frame, respectively. A notable limitation of applying corner detection algorithms to the current CAD model design and lighting setup is that the right-wing tip falls into a blind spot and remains invisible to the camera unless the aircraft model rolls by more than 5 degrees. Additionally, the left wing tip is generally obscured and only becomes visible when the model exhibits extensive motions across all three degrees of freedom.
The transformation of 2D pixel coordinates to 3D coordinates of the aircraft model skeleton, as described in Section 3.3, is illustrated in Figure 4. This figure demonstrates the accuracy of the customised transformation from 2D to 3D coordinates, specifically tailored for this orthogonal stereo setup, in successfully converting the aircraft skeleton points from the top and side camera views into a unified 3D world coordinate system. The rotation matrix obtained using (30) has a determinant of 1, indicating that the matrix represents a pure rotation without any scaling effects. Figure 8, Figure 9, Figure 10, Figure 11, Figure 12 and Figure 13 illustrate the $\phi$, $\theta$, $\psi$, $\dot\phi$, $\dot\theta$, and $\dot\psi$ results from our proposed vision method and compare them to the Simscape data. Table 2 provides a comprehensive numerical representation of accuracy, defined as the degree to which the estimated angles from our proposed method match the ground-truth values represented by the Simscape results, expressed as an error range. The following interpretations can be made from the quaternion results:
  • The results for the pitch angle ($\theta$) indicate no consistent bias towards underestimation or overestimation. However, the roll angle ($\phi$) results show a consistent bias towards underestimation, and the yaw angle ($\psi$) results show a consistent bias towards overestimation. Irrespective of the bias, the results follow the pattern of the Simscape model input, which is a sine wave.
  • The magnitude of the errors suggests that the quaternion method is accurate in estimating the angles, with $\phi$ and $\psi$ having relatively small errors and $\theta$ having relatively larger errors.
  • The estimates of $\dot\phi$, $\dot\theta$, and $\dot\psi$ do not exhibit a consistent bias towards underestimation or overestimation.
  • The error range of $\dot\theta$ is large, while $\dot\phi$ and $\dot\psi$ have small errors, mirroring the Euler angle results described in [16].
Even though quaternions offer accuracy and stability, small errors in their components can still lead to significant deviations in the Euler angles, especially near zero rotation angles. This is due to the sensitivity of the trigonometric functions involved in converting quaternions to Euler angles. The inaccuracies in identifying the Euler angles propagate to the Euler rates during backward differentiation. A significant error of −0.2806 rad/s in $\dot\theta$ occurs at the first time step, at $0 + \Delta t$, with the subsequent minimum error reducing to −0.0818 rad/s, suggesting that the primary error occurs initially.
By computing the root mean square error (RMSE) for each of the $\phi$, $\theta$, and $\psi$ estimates individually, and then aggregating these RMSE values, the percentage improvement of the quaternion method over the direction cosine matrix method employed in [16] is determined using the following equation:

$$\mathrm{percentage} = \frac{RMSE_{\mathrm{DCM}} - RMSE_{\mathrm{Quat}}}{RMSE_{\mathrm{DCM}}} \times 100\%$$
The results indicate a 45.41% improvement in the accuracy of the quaternion method for attitude estimation of the CAD model compared to using the direction cosine matrix method.
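For reference, this computation could be sketched as follows; eulerDCM, eulerQuat, and eulerTrue are illustrative arrays holding the stacked angle histories of the two estimators and the Simscape ground truth.

```matlab
% Sketch: aggregated RMSE and percentage improvement per the equation above.
rmse = @(est, ref) sqrt(mean((est - ref).^2, 'all'));
improvement = (rmse(eulerDCM, eulerTrue) - rmse(eulerQuat, eulerTrue)) ...
              / rmse(eulerDCM, eulerTrue) * 100;   % reported here as 45.41%
```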
A sensitivity analysis is performed to evaluate the system's robustness and its ability to maintain performance under varying errors in the identification of the aircraft skeleton points. This analysis aims to understand how errors in accurately identifying the aircraft skeleton corner points, as shown in Table 1, propagate to the Euler angle errors. Additionally, to further analyse the first-order changes in the Euler angles, random numbers between −1 and 1 are chosen as $\delta P_3$. Table 3 presents the obtained first-order changes of the functions. The results indicate the following:
  • The world coordinate points $Y_3(2,5)$, $Y_3(2,6)$, $Z_3(3,3)$, and $Z_3(3,4)$ have a high impact on the change in the $\phi$ angle. These points are the Y-coordinates of the tail length and the Z-coordinates of the wing tips. This is visualised in Figure 14, where dx31, ⋯, dy31, ⋯, dz31 represent $X_3(1,1)$, ⋯, $Y_3(2,1)$, ⋯, $Z_3(3,1)$, the aircraft skeleton points in consecutive frames as defined in (23).
  • Figure 15 indicates that the world coordinate points $X_3(1,5)$ and $X_3(1,6)$ have a significant impact on the change in the $\theta$ angle. These points are the X-coordinates of the tail length. The points $Z_3(3,1)$ and $Z_3(3,2)$, the Z-coordinates of the fuselage length, have a comparatively minor impact on the change in the $\theta$ angle.
  • Figure 16 shows that the points $X_3(1,3)$, $X_3(1,4)$, $Y_3(2,1)$, and $Y_3(2,2)$ have a high impact on the $\psi$ angle. These are the X-coordinates of the wing tips and the Y-coordinates of the fuselage length.
  • Any inaccuracies or disturbances in the identification of the $P_3$ points tend to amplify the roll and yaw angle errors, affecting the accuracy and stability of the $\phi$ and $\psi$ estimates.
  • The results obtained from the quaternion measurements were corroborated by the sensitivity analysis, affirming their reliability in obtaining the $\phi$, $\theta$, and $\psi$ angles. Additionally, the sensitivity analysis reveals that the pitch angle exhibits the highest susceptibility to noise or inaccuracies in the aircraft skeleton points, followed by the roll angle, while the yaw angle demonstrates relatively lower sensitivity.
Figure 17, Figure 18 and Figure 19 depict the results of the EKF state ($\phi$, $\theta$, $\psi$) estimation, respectively. The dynamic measurements ($\dot\phi$, $\dot\theta$, $\dot\psi$) obtained through backward differentiation of the EKF's state estimates are illustrated in Figure 20, Figure 21 and Figure 22, respectively. Table 4 provides a comprehensive numerical representation of the error ranges. Comparing the EKF results to the quaternion results in Table 2 indicates that the EKF estimation of the Euler angles is consistent with the expected error range from the quaternions. Comparing the EKF results with the Simscape results indicates the following:
  • The EKF fusion process effectively integrates information from the different sensors, yielding a comprehensive and reliable estimate of the system state.
  • The EKF is robust to the variations, noise, and uncertainties present in the gyroscope, accelerometer, and potentiometer data. This demonstrates the ability of the EKF to effectively handle the sensor measurements and provide state estimation even in challenging conditions.
  • The EKF results serve as validation for the sensor fusion algorithm implemented in the EKF, indicating that the fusion process is functioning correctly.

8. Conclusions

A computer vision technique is proposed for the attitude estimation of a scaled aircraft model in dynamic wind tunnel testing without the placement of any markers on the model. This implementation is novel and has not been previously reported. The method implements a sequential process of Harris corner detection, Kanade–Lucas–Tomasi tracking, and quaternions to identify the Euler angles. The Harris detection method introduces inaccuracies in identifying the corner points of the aircraft model skeleton, thereby affecting the estimation of the Euler angles. A sensitivity analysis of the proposed approach was conducted using a first-order Taylor series. The implementation of quaternions over the direction cosine matrix for estimating the Euler angles has shown a 45.41% improvement in accuracy. The quaternion method achieves RMSEs of 0.0101 rad/s, 0.0361 rad/s, and 0.0036 rad/s for the dynamic measurements of $\dot\phi$, $\dot\theta$, and $\dot\psi$, respectively. Additionally, this method exhibits an accuracy of 98.08% for the pitch rate ($\dot\theta$). Furthermore, by incorporating data from on-board sensors, an IMU and a potentiometer gimbal, and integrating them with the vision data using the extended Kalman filter, the method achieves RMSEs of 0.0090 rad/s, 0.0262 rad/s, and 0.0034 rad/s for the dynamic measurements of $\dot\phi$, $\dot\theta$, and $\dot\psi$, respectively, with an improved accuracy of 98.61% for the estimation of the pitch rate ($\dot\theta$).
Additional verification of the method through application to lateral and longitudinal angle changes will be performed, along with full integration with other measurement techniques. While the approach developed here works satisfactorily without markers, making this a challenging exercise, the inclusion of markers should improve accuracy and reliability. The system will also be integrated into a longitudinal 3-degree-of-freedom rig that is currently being developed [41,42].

Author Contributions

Conceptualisation, P.K. and J.F.W.; methodology, P.K. and J.F.W.; software, P.K.; validation, P.K.; formal analysis, P.K.; investigation, P.K.; resources, J.F.W. and Z.A.R.; writing—original draft preparation, P.K.; writing—review and editing, J.F.W.; supervision, J.F.W. and Z.A.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The MATLAB software utilised for processing both on-board and off-board data, as well as for attitude estimation, is available in the Cranfield Online Research Data repository (https://doi.org/10.17862/cranfield.rd.25700061).

Acknowledgments

I want to express my sincere gratitude to my fellow researcher, Ademayowa Ishola, for the Hawk CAD model and other support throughout this research.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
DWT	dynamic wind tunnel
DOF	degree-of-freedom
MEMS	micro-electro-mechanical system
RLG	ring laser gyro
UAV	unmanned aerial vehicle
IMU	inertial measurement unit
DCM	direction cosine matrix
CAD	computer-aided design
DLT	direct linear transformation
PnP	perspective-n-point
ROI	region of interest
EKF	extended Kalman filter

References

  1. Pontillo, A.; Yusuf, S.; Lopez, G.; Rennie, D.; Lone, M. Investigating pitching moment stall through dynamic wind tunnel test. J. Aerosp. Eng. 2020, 234, 267–279. [Google Scholar] [CrossRef]
  2. Rajamurthy, M.S. Generation of comprehensive longitudinal aerodynamic data using dynamic wind-tunnel simulation. J. Aircr. 1997, 34, 29–33. [Google Scholar] [CrossRef]
  3. Tortora, P.; Huang, M.; Wang, Z. A review of wind tunnel based virtual flight testing techniques for evaluation of flight control systems. Int. J. Aerosp. Eng. 2015, 2015, 672423. [Google Scholar] [CrossRef]
  4. Owens, B.; Brandon, J.; Croom, M.; Fremaux, M.; Heim, G.; Vicroy, D. Overview of dynamic test techniques for flight dynamics research at NASA LaRC. In Proceedings of the 25th AIAA Aerodynamic Measurement Technology and Ground Testing Conference, San Francisco, CA, USA, 5–8 June 2006. Number AIAA 2006-3146. [Google Scholar] [CrossRef]
  5. Carnduff, S.D.; Cooke, A.K. Formulation and System Identification of the Equations of Motion for a Dynamic Wind Tunnel Facility. In College of Aeronautics Report COA 0801; Cranfield University: Cranfield, UK, 2008. [Google Scholar]
  6. Carnduff, S.D.; Erbsloeh, S.D.; Cooke, A.K.; Cook, M.V. Characterizing stability and control of subscale aircraft from wind-tunnel dynamic motion. J. Aircr. 2009, 46, 137–147. [Google Scholar] [CrossRef]
  7. Pattinson, J.; Lowenberg, M.H.; Goman, M.G. Multi-degree-of-freedom wind-tunnel maneuver rig for dynamic simulation and aerodynamic model identification. J. Aircr. 2013, 50, 551–566. [Google Scholar] [CrossRef]
  8. Araujo-Estrada, S.A.; Gong, Z.; Lowenberg, M.H.; Neild, S.; Goman, M. Wind tunnel manoeuvre rig: A multi-DOF test platform for model aircraft. In Proceedings of the 54th AIAA Aerospace Sciences Meeting, San Diego, CA, USA, 4–8 January 2016. Number AIAA 2016-2119. [Google Scholar] [CrossRef]
  9. Navaratna, P.D.B.; Lowenberg, M.H.; Neild, S.A. Minimally constrained flight simulation in wind tunnel. J. Aircr. 2019, 56, 1353–1366. [Google Scholar] [CrossRef]
  10. Tai, S.; Wang, L.; Wang, Y.; Bu, C.; Yue, T. Flight dynamics modeling and aerodynamic parameter identification of four-degree-of-freedom virtual flight test. AIAA J. 2023, 61, 2652–2665. [Google Scholar] [CrossRef]
  11. Honeywell International Inc. Honeywell Aerospace. Available online: https://aerospace.honeywell.com/us/en/products-and-services/product/hardware-and-systems/sensors/inertial-measurement-units (accessed on 21 July 2023).
  12. Narasimhappa, M.; Mahindrakar, A.D.; Guizilini, V.C.; Terra, M.H.; Sabat, S.L. MEMS-based IMU drift minimization: Sage-Husa adaptive robust Kalman filtering. IEEE Sens. J. 2020, 20, 250–260. [Google Scholar] [CrossRef]
  13. Ruffner, J.W.; Deaver, D.M.; Henry, D.J. Requirements analysis for an air traffic control tower surface surveillance enhanced vision system. In Enhanced and Synthetic Vision 2003; International Society for Optics and Photonics; Verly, J.G., Ed.; SPIE: Bellingham, WA, USA, 2003; Volume 5081, pp. 124–135. [Google Scholar] [CrossRef]
  14. Bachoo, A.K.; le Roux, F.; Nicolls, F. An optical tracker for the maritime environment. In Signal Processing, Sensor Fusion, and Target Recognition XX; International Society for Optics and Photonics; Kadar, I., Ed.; SPIE: Bellingham, WA, USA, 2011; Volume 8050, pp. 571–580. [Google Scholar] [CrossRef]
  15. Valaboju, R.; Vaishnavi; Harshitha, C.; Kallam, A.R.; Babu, B.S. Drone detection and classification using computer vision. In Proceedings of the 2023 7th International Conference on Trends in Electronics and Informatics (ICOEI), Tirunelveli, India, 11–13 April 2023; pp. 1320–1328. [Google Scholar] [CrossRef]
  16. Kasula, P.; Whidborne, J.F.; Rana, Z.; Ishola, A. Motion estimation of an aircraft model using computer vision. In Proceedings of the AIAA SCITECH 2024 Forum, Orlando, FL, USA, 8–12 January 2024. Number AIAA 2024-2501. [Google Scholar] [CrossRef]
  17. Liu, W.; Ma, X.; Li, X.; Chen, L.; Zhang, Y.; Li, X.; Shang, Z.; Jia, Z. High-precision pose measurement method in wind tunnels based on laser-aided vision technology. Chin. J. Aeronaut. 2015, 28, 1121–1130. [Google Scholar] [CrossRef]
  18. Li, Z.; Yang, M.; Huang, S.; Zhou, D. A new moving model test method for the measurement of aerodynamic drag coefficient of high-speed trains based on machine vision. Proc. Inst. Mech. Eng. Part F J. Rail Rapid Transit 2018, 232, 1425–1436. [Google Scholar] [CrossRef]
  19. Wu, L.J.; Casciati, F.; Casciati, S. Dynamic testing of a laboratory model via vision-based sensing. Eng. Struct. 2014, 60, 113–125. [Google Scholar] [CrossRef]
  20. Ha, J.; Johnson, E.N.; Tannenbaum, A. Real-time visual tracking using geometric active contours and particle filters. J. Aerosp. Comput. Inf. Commun. 2008, 5, 361–379. [Google Scholar] [CrossRef]
  21. Yoo, S.; Kim, W.; Kim, C. Saliency combined particle filtering for aircraft tracking. J. Signal Process. Syst. 2014, 76, 19–31. [Google Scholar] [CrossRef]
  22. Zheng, M.; Wu, Z.; Bakhdavlatov, S.; Qu, J.; Li, H.; Yuan, J. Real-time aerial targets detection algorithm based on background subtraction. In Proceedings of the 2013 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, Kaohsiung, Taiwan, 29 October–1 November 2013; pp. 1–6. [Google Scholar] [CrossRef]
  23. Graves, S.S.; Burner, A.W. Development of an intelligent videogrammetric wind tunnel measurement system. In Optical Diagnostics for Fluids, Solids, and Combustion; International Society for Optics and Photonics; Mercer, C.R., Cha, S.S., Shen, G., Eds.; SPIE: Bellingham, WA, USA, 2004. [Google Scholar] [CrossRef]
  24. Ma, X.; Liu, W.; Chen, L.; Lu, J.; Liu, W.; Zhang, J.; Jia, Z.; Changyu, L.; Zaihua, Y.; Wangmin, Y. A photogrammetry pose measurement method for moving targets in a wind tunnel. In Proceedings of the IEEE International Instrumentation and Measurement Technology Conference, Taipei, Taiwan, 23–26 May 2016; pp. 1–5. [Google Scholar] [CrossRef]
  25. Liu, W.; Chen, L.; Ma, X.; Li, X.; Jia, Z. Monocular position and pose measurement method for high-speed targets based on colored images. Chin. J. Sci. Instrum. 2016, 37, 675–682. [Google Scholar]
  26. Jia, Z.; Ma, X.; Liu, W.; Lu, W.; Li, X.; Chen, L.; Wang, Z.; Cui, X. Pose measurement method and experiments for high-speed rolling targets in a wind tunnel. Sensors 2014, 14, 23933–23953. [Google Scholar] [CrossRef] [PubMed]
  27. Cheng, L.; Yang, Y.; Xue, B.; Zhou, F.; Bai, X. A position and attitude vision measurement system for wind tunnel slender model. In Proceedings of SPIE—The International Society for Optical Engineering; SPIE: Bellingham, WA, USA, 2014; Volume 9301. [Google Scholar] [CrossRef]
  28. Jones, T.W.; Lunsford, C.B. A photogrammetric system for model attitude measurement in hypersonic wind tunnels. In Proceedings of the 45th AIAA Aerospace Sciences Meeting and Exhibit, Reno, NV, USA, 8–11 January 2007. Number AIAA 2007-1164. [Google Scholar] [CrossRef]
  29. Liu, W.; Ma, X.; Chen, L.; Jia, Z.; Liu, W.; Li, X.; Zhang, J.; Lu, J. Remote-controlled flexible pose measurement system and method for a moving target in wind tunnel. Chin. J. Aeronaut. 2018, 31, 89–98. [Google Scholar] [CrossRef]
  30. Park, H.S.; Kim, J.Y.; Kim, J.G.; Choi, S.W.; Kim, Y. A new position measurement system using a motion-capture camera for wind tunnel tests. Sensors 2013, 13, 12329–12344. [Google Scholar] [CrossRef] [PubMed]
  31. Chen, Y.F.; Wu, L.N.; Yue, S.L.; Lin, Q. Pose measurement based on machine vision for the aircraft model in wire-driven parallel suspension system. In Proceedings of SPIE—The International Society for Optical Engineering; SPIE: Bellingham, WA, USA, 2013; Volume 8783. [Google Scholar] [CrossRef]
  32. Sun, Y.; Zhang, Z.; Huang, S.; Yu, B.; Wang, S. Vision measurement technology research for model angle of attack in wind tunnel tests. Acta Aeronaut. Astronaut. Sin. 2013, 34, 1–7. [Google Scholar]
  33. Chunping, Y.; Yixin, C.; Yifeng, C.; Liao-ni, W.; Qi, L. Posture measurement for the aircraft model based on the monocular-vision system. In Proceedings of the International Conference on Automatic Control and Artificial Intelligence (ACAI 2012), Xiamen, China, 3–5 March 2012; pp. 1580–1583. [Google Scholar] [CrossRef]
  34. Song, J.; Ma, J.; Jiang, M.; Jiang, K. Measurement based on machine vision for spin test in vertical wind tunnel. In Proceedings of the 2012 IEEE Symposium on Electrical and Electronics Engineering (EEESYM), Kuala Lumpur, Malaysia, 24–27 June 2012; pp. 251–254. [Google Scholar] [CrossRef]
  35. Chen, J.; Zhao, L. A stereo vision-based attitude measurement system for aircraft model in wind tunnel. In Proceedings of the 2010 International Conference on Computer Application and System Modeling (ICCASM 2010), Taiyuan, China, 22–24 October 2010; Volume 9, pp. 456–460. [Google Scholar] [CrossRef]
  36. Gurfil, P.; Rotstein, H. Computer vision-based estimation of aircraft dynamics. In Proceedings of the 40th IEEE Conference on Decision and Control, Orlando, FL, USA, 4–7 December 2001; pp. 3746–3751. [Google Scholar] [CrossRef]
  37. Harris, C.; Stephens, M. A combined corner and edge detector. In Proceedings of the Alvey Vision Conference, Manchester, UK, 1988; pp. 147–151. [Google Scholar] [CrossRef]
  38. Tomasi, C.; Kanade, T. Detection and Tracking of Point Features; Technical Report CMU-CS-91-132; Carnegie Mellon University: Pittsburgh, PA, USA, 1991. [Google Scholar]
  39. Shi, J.; Tomasi, C. Good features to track. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 21–23 June 1994; pp. 593–600. [Google Scholar] [CrossRef]
  40. Hartley, R.; Zisserman, A. Multiple View Geometry in Computer Vision; Cambridge University Press: Cambridge, UK, 2003. [Google Scholar]
  41. Ishola, A.; Whidborne, J.; Kasula, P. Control for novel 3-DOF flight testing in a wind tunnel. In Proceedings of the 14th UKACC International Conference on Control (CONTROL 2024), Winchester, UK, 10–12 April 2024. [Google Scholar] [CrossRef]
  42. Ishola, A.A. Development of an Aircraft-Manipulator System for Dynamic Wind Tunnel Tests. Ph.D. Thesis, Cranfield University, Bedford, UK, 2024. [Google Scholar]
Figure 1. 3-DOF aircraft model in DWT.
Figure 2. Flow chart showing the process for attitude estimation.
Figure 3. Pinhole camera [16] (reprinted with permission from the American Institute of Aeronautics and Astronautics, Inc.).
Figure 4. Aircraft skeleton in 3D. Initial frame shown in green, next frame shown in red.
Figure 5. Simscape Euler angles [16].
Figure 6. Simscape Euler rates [16].
Figure 7. Initial frame and consecutive frame [16] (reprinted with permission from the American Institute of Aeronautics and Astronautics, Inc.).
Figure 8. Euler angle, ϕ.
Figure 9. Euler rate, ϕ̇.
Figure 10. Euler angle, θ.
Figure 11. Euler rate, θ̇.
Figure 12. Euler angle, ψ.
Figure 13. Euler rate, ψ̇.
Figure 14. Sensitivity analysis: roll.
Figure 15. Sensitivity analysis: pitch.
Figure 16. Sensitivity analysis: yaw.
Figure 17. EKF Euler angle, ϕ.
Figure 18. EKF Euler angle, θ.
Figure 19. EKF Euler angle, ψ.
Figure 20. EKF Euler rate, ϕ̇.
Figure 21. EKF Euler rate, θ̇.
Figure 22. EKF Euler rate, ψ̇.
Table 1. Harris detection method errors.
Corner Points          X Pixel Error    Y Pixel Error
Side View Tail P1      0.880            1.312
Top View Wing P1       2.052            0.052
Top View Wing P2       1.060            0.583
Top View HS            2.052            ±1.052
Side View HS P1        1.229            −1.171
Side View HS P2        1.257            −1.047
Table 2. Error range of Euler angles and rates from quaternions.
Angles    Error Range (°)    Rates    Rate Error (rad/s)
ϕ         [−0.1548, 0]       ϕ̇        [−0.0578, 0.0307]
θ         [−0.9924, 0]       θ̇        [−0.2806, 0.1041]
ψ         [0, 0.0394]        ψ̇        [−0.0118, 0.0134]
Table 3. First-order change of the functions.
Euler Angles    Change in Angles (°)
ϕ               0.0194
θ               0.0569
ψ               0.0111
Table 4. Error range of EKF attitude estimation.
Angles    Error Range (°)      Rates    Rate Error (rad/s)
ϕ         [−0.0578, 0.0232]    ϕ̇        [−0.0321, 0.0309]
θ         [−0.9928, 0]         θ̇        [−0.2806, 0.0701]
ψ         [−0.0113, 0.0173]    ψ̇        [−0.0118, 0.0128]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
