Article

Estimation of Vehicle Attitude, Acceleration, and Angular Velocity Using Convolutional Neural Network and Dual Extended Kalman Filter

1 Department of Automotive Electronics & Control Engineering, Hanyang University, Seoul 04763, Korea
2 Autonomous Driving Center, Hyundai Motor Company R&D Division, Seoul 06182, Korea
* Author to whom correspondence should be addressed.
Sensors 2021, 21(4), 1282; https://doi.org/10.3390/s21041282
Submission received: 14 January 2021 / Revised: 5 February 2021 / Accepted: 8 February 2021 / Published: 11 February 2021
(This article belongs to the Special Issue Sensors Fusion for Vehicle Detection and Control)

Abstract

The acceleration of a vehicle is an important vehicle state. Vehicle acceleration is measured by an inertial measurement unit (IMU). However, gravity affects the IMU whenever the vehicle attitude changes, so the IMU produces an erroneous signal output; vehicle attitude information is therefore essential for obtaining correct acceleration information. This paper proposes a convolutional neural network (CNN) for attitude estimation. Using sequential data from a vehicle's chassis sensors, the roll and pitch angles of the vehicle can be estimated without high-cost sensors such as a global positioning system (GPS) or a six-dimensional IMU. This paper also proposes a dual extended Kalman filter (DEKF), which accurately estimates the acceleration and angular velocity based on the estimated roll and pitch information. The proposed method is validated with real-car experiment data and CarSim, a vehicle simulator. It accurately estimates the vehicle attitude with a limited sensor set, and the exact acceleration and angular velocity are estimated considering the roll and pitch angles, with a de-noising effect. In addition, the DEKF improves the modeling accuracy and can estimate the roll and pitch rates.

1. Introduction

In recent decades, vehicle controllers have been developed significantly for stability and user convenience. Typically, the electronic stability controller (ESC) and active roll stabilization (ARS) ensure vehicle stability in chassis control, while adaptive cruise control (ACC) and lane keeping systems (LKS) provide convenience in advanced driver assistance systems (ADAS). To improve the performance of these controllers, the vehicle states must be estimated with high accuracy. The state information most commonly required by the aforementioned controllers is acceleration and angular velocity. In a vehicle, the inertial measurement unit (IMU) measures the acceleration and angular velocity using inertial forces. However, if a transition in attitude (roll and pitch) occurs, the gravitational force is reflected in the sensor value. The IMU cannot distinguish between gravitational and inertial forces; therefore, a change in the attitude of the vehicle causes a fatal error in the IMU output.
Several studies have been conducted to overcome the errors in these accelerometers and to estimate the exact state of the vehicle. In [1,2], adaptive Kalman filters were designed to minimize the effect of accelerometer offset errors. In addition, the Kalman filter was used to estimate the accelerometer offset and vehicle velocity [3,4,5]. However, the accuracy of these vehicle state estimates was limited by the lack of vehicle attitude information.
Several studies have attempted to estimate the vehicle attitude accurately in various ways, using an IMU, GPS, or vehicle dynamics to estimate the roll and pitch angles. In addition, owing to recent developments in artificial intelligence and neural networks, data-driven estimators have been used for state estimation [6]. These estimators achieve high accuracy by training directly on data in situations where accurate mathematical modeling is difficult. The literature is reviewed in detail in Section 2.
This paper proposes a novel vehicle attitude estimator using a convolutional neural network (CNN), building on the advantages of data-driven estimators. Figure 1 shows the architecture of the proposed algorithm. First, we select features based on vehicle roll and pitch dynamics, an importance analysis, and sensor availability. Using the selected features sequentially [7], a CNN-based regression model is built from the three-dimensional IMU and vehicle chassis sensors that can estimate the vehicle roll and pitch angles without GPS. Moreover, a dual extended Kalman filter (DEKF) is designed to estimate the exact acceleration and angular velocity from the estimated roll and pitch angles while removing the effect of gravitational force. The proposed model is based on the sprung-mass six-degree-of-freedom (6-DOF) model and increases modeling accuracy by simultaneously estimating the tire cornering stiffness. The performance of the proposed algorithm was verified using real-car experiment data and MATLAB/Simulink with CarSim, a vehicle simulator. The CNN was trained and validated using data collected under various driving conditions with an experimental vehicle equipped with an RT-3002, a high-accuracy GPS-inertial navigation system (INS) from Oxford Technical Solutions Ltd. The performance of the DEKF was verified in a scenario involving attitude changes using CarSim and MATLAB/Simulink.
The remainder of this paper is organized as follows. Section 2 describes the related literature review. Section 3 explains the methodology including neural network and DEKF. Section 4 describes the results of performance verification. Section 5 summarizes the study and contributions.

2. Literature Review

Several studies have been conducted to estimate the vehicle attitude accurately in various ways. In [8], the authors proposed an observer that estimates a land vehicle's roll and pitch using an IMU and a kinematic model. An adaptive Kalman filter based on an IMU aided by vehicle dynamics was also proposed [9]. However, these methods have low accuracy in dynamic situations. Attempts were made to solve this problem in [10,11,12,13] based on the Kalman filter by compensating for the external acceleration that interferes with attitude estimation, but the accuracy remained limited.
Therefore, many researchers have proposed sensor fusion methods that combine the IMU with a global positioning system (GPS). Representative examples fuse IMU and GPS measurements with a Kalman filter [14,15,16] or a sliding mode observer [17]. These methods increase accuracy, but they require high-cost sensors such as GPS and are strongly affected by GPS performance.
Accordingly, vehicle attitude estimation studies were conducted without GPS, based on the characteristics of the vehicle dynamics. Using vehicle roll dynamics, a dynamic observer design based on a reliable rollover index [18] was proposed, as were a Kalman filter [19] and a robust observer [20] that estimate the roll angle. Alternatively, using lateral dynamics, methods for estimating the road bank angle [21,22] were proposed. However, these methods exhibited limited accuracy in transient situations. Methods for estimating both roll and pitch attitudes using a six-dimensional IMU and a bicycle model [23,24] were also studied. These methods exhibited high accuracy, but they have a short validation range and require hard-to-obtain data such as six-dimensional IMU measurements.
Several recent studies have also estimated the vehicle roll and pitch angles using neural networks. In [25,26,27], roll angle estimation methods using sensor fusion with a neural network and a Kalman filter were proposed. Furthermore, a vehicle roll and road bank angle estimation method based on a deep neural network was introduced [28]. However, the pitch attitude was not estimated, and six-dimensional IMU data such as the roll rate or vertical acceleration were required, which complicated the estimation. The literature review is summarized in Table 1.

3. Methodology

3.1. Data-Driven-Based Estimator (Neural Network) Design

3.1.1. Feature Selection

Feature selection is based on vehicle dynamics, sensor availability, and an attention mechanism. Because the purpose of the neural network is a regression model that calculates the roll and pitch angles, the primary feature candidates are selected from the vehicle's roll and pitch dynamics. The vehicle roll dynamics [29] can be written as:
$$\Sigma M_x = h_R m_s g \sin\phi - \frac{1}{2}k_s l_s^2 \sin\phi - \frac{1}{2}b_s l_s^2 \dot{\phi}\cos\phi + \Sigma F_y\left(h_{CG} - h_R\right) \quad (1)$$

$$\Sigma F_y = 2C_{\alpha f}\left(\delta - \frac{v_y + l_f\dot{\psi}}{v_x}\right) - 2C_{\alpha r}\left(\frac{v_y - l_r\dot{\psi}}{v_x}\right) + m_s g \sin\phi \quad (2)$$
where $M_x$ is the moment about the roll axis, $h_R$ is the height from the center of gravity to the roll center, $m_s$ is the vehicle sprung mass, $g$ is the gravitational acceleration, $\phi$ is the roll angle of the vehicle, $k_s$ is the suspension stiffness, $l_s$ is the wheel track length, $b_s$ is the suspension damping coefficient, and $h_{CG}$ is the height of the center of gravity from the ground.
The lateral force applied to the vehicle can be expressed by (2) based on the bicycle model. $C_{\alpha f}$ and $C_{\alpha r}$ are the cornering stiffnesses of the front and rear tires, respectively, $\delta$ is the steering angle, $v_x$ and $v_y$ are the velocities along the x and y axes, respectively, $l_f$ and $l_r$ are the distances from the center of gravity to the front and rear wheel axles, respectively, and $\dot{\psi}$ is the yaw rate.
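As a concrete reading of Eq. (2), the total lateral force can be computed from the steering angle, body velocities, yaw rate, and roll angle. All numeric parameter values in the sketch below are illustrative placeholders, not values from the paper:

```python
import numpy as np

def lateral_force(delta, vy, vx, yaw_rate, phi,
                  Caf=8.0e4, Car=9.0e4,   # cornering stiffness [N/rad] (illustrative)
                  lf=1.3, lr=1.5,         # CG-to-axle distances [m] (illustrative)
                  ms=1500.0, g=9.81):     # sprung mass [kg], gravity [m/s^2]
    """Total lateral force of the bicycle model plus the gravity term, Eq. (2)."""
    alpha_f = delta - (vy + lf * yaw_rate) / vx   # front tire slip angle
    alpha_r = -(vy - lr * yaw_rate) / vx          # rear tire slip angle
    return 2.0 * Caf * alpha_f + 2.0 * Car * alpha_r + ms * g * np.sin(phi)

# Straight-line driving on flat ground produces no lateral force
assert lateral_force(delta=0.0, vy=0.0, vx=30.0, yaw_rate=0.0, phi=0.0) == 0.0
```

On a banked road ($\phi > 0$) the gravity term alone yields a nonzero lateral force even with zero steering input, which is exactly the effect the attitude estimator must account for.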
The vehicle pitch dynamics can be written as [29]:
$$\Sigma M_y = -k_s l_f^2 \sin\theta - k_s l_r^2 \sin\theta - b_s l_f^2\dot{\theta}\cos\theta - b_s l_r^2\dot{\theta}\cos\theta - \left(\Sigma F_{drive} - \Sigma F_{brake}\right)h_{CG} \quad (3)$$

$$\Sigma F_{drive} = \frac{N_{tf}\eta_{tf}}{r_{eff}}T_e - \frac{(I_e + I_t)N_{tf}^2 + I_d N_f^2 + I_w}{r_{eff}^2}a_x \quad (4)$$

$$\Sigma F_{brake} = \frac{2 P_{brake} A_{caliper} \mu_{pad} R_b}{r_{eff}} \quad (5)$$
where $M_y$ is the moment about the pitch axis and $\theta$ is the pitch angle of the vehicle.
The traction force applied to the vehicle can be expressed by (4) based on the vehicle driveline dynamics. $N_{tf}$ is the combined gear ratio of the transmission and differential gears, $\eta_{tf}$ is the efficiency of the power transfer from the engine to the wheel axle, $r_{eff}$ is the effective tire radius, $T_e$ is the engine torque, $I_e$, $I_t$, $I_d$, and $I_w$ are the inertias of the engine, transmission, differential gear, and wheel, respectively, and $a_x$ is the longitudinal acceleration of the vehicle.
The brake force applied to the vehicle is given by (5). $P_{brake}$ is the master cylinder pressure, $A_{caliper}$ is the brake pad caliper area, $\mu_{pad}$ is the friction coefficient of the brake pad, and $R_b$ is the distance from the wheel center to the brake pad.
Based on (1)–(5), we keep only the variables, excluding static parameters such as the vehicle mass. As a result, a total of 13 feature candidates were selected, including the acceleration and yaw rate. Subsequently, seven features were chosen considering the availability of sensors in the experimental vehicle; Table 2 shows the selected features. Based on these, we conducted an importance analysis using the attention mechanism [30]; the results are presented in Appendix A. According to Appendix A, the final features are the same as those listed in Table 2.

3.1.2. Network Design

Before configuring the two-dimensional input to the CNN, we calculate $\phi_{static,pseudo}$ and $\theta_{static,pseudo}$, which reflect the static roll and pitch angles, from the feature sensor values. The pseudo values are written as:

$$\phi_{static,pseudo} = e^{-a \cdot V_x}\sin^{-1}\!\left(\frac{a_y}{g}\right) \quad (6)$$

$$\theta_{static,pseudo} = e^{-b \cdot V_x}\, e^{-c \cdot \theta_{throttle}}\, e^{-d \cdot P_{brake}}\sin^{-1}\!\left(\frac{a_x}{g}\right) \quad (7)$$
where $a$, $b$, $c$, and $d$ are constant design parameters. When the vehicle's wheel speed is zero, (6) and (7) approximate the road bank and slope angles, respectively; as the wheel speed increases, (6) and (7) decay toward zero.
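A minimal sketch of Eqs. (6) and (7), assuming the exponential gates decay with wheel speed (and, for pitch, with throttle and brake activity) as described above; the gains a–d are illustrative placeholders, not the paper's tuned values:

```python
import numpy as np

G = 9.81  # gravitational acceleration [m/s^2]

def pseudo_static_angles(ay, ax, vx, throttle, p_brake,
                         a=0.5, b=0.5, c=0.1, d=0.1):
    """Pseudo static roll/pitch angles of Eqs. (6)-(7).

    Near standstill the arcsin terms approximate the road bank/slope
    angle; the exponential gates drive both pseudo angles toward zero
    as wheel speed (and, for pitch, throttle/brake activity) grows.
    The gains a, b, c, d are illustrative placeholders.
    """
    phi = np.exp(-a * vx) * np.arcsin(np.clip(ay / G, -1.0, 1.0))
    theta = (np.exp(-b * vx) * np.exp(-c * throttle) * np.exp(-d * p_brake)
             * np.arcsin(np.clip(ax / G, -1.0, 1.0)))
    return phi, theta

# Stationary vehicle on a banked road: pseudo roll ~ bank angle (~10 deg here)
phi0, theta0 = pseudo_static_angles(ay=1.7, ax=0.0, vx=0.0,
                                    throttle=0.0, p_brake=0.0)
```

At higher wheel speeds the same lateral acceleration is dominated by cornering rather than bank, so the pseudo roll collapses toward zero, which is the intended gating behavior.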
The neural network architecture is shown in Figure 2. The network is composed of a CNN part and a fully connected layer (FCL) part in a parallel configuration. The input array is a multiplex of the sequential feature sensor data, including (6) and (7). The sequence covers 2 s sampled at 0.01 s intervals; therefore, the input array size is (200 × 9). The CNN part uses the entire input, i.e., all time-series data from the past 2 s, whereas the FCL part uses only the last row of the input array, i.e., only the current step. The CNN part is thus designed to estimate dynamic vehicle body changes while driving, and the FCL part is designed to estimate the roll and pitch angles in static scenarios. The CNN part comprises four convolution layers and two fully connected layers. The first layer converts the input matrix into a square matrix and shuffles the sensor placement order. The data then pass through three convolution layers, followed by one convolution layer with a large filter and wide strides that compresses the data. The result is then flattened into a fully connected layer whose output size is (256 × 1). The FCL part has four layers, and its final layer size is also (256 × 1). The two final layers are concatenated, and a regression model computes the two outputs using one fully connected layer. Hyperparameters of the neural network, such as the number of filters and the activation function, are described in Appendix B.
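The architecture described above can be sketched with the Keras functional API. The paper used TensorFlow 1.6; the modern tf.keras API is used here for readability, with layer sizes following Table A2, so treat this as an approximate reconstruction rather than the authors' exact implementation:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

SEQ_LEN, N_FEAT = 200, 9  # 2 s of data at 0.01 s: 7 sensor features + 2 pseudo angles

def build_attitude_net():
    inp = layers.Input(shape=(SEQ_LEN, N_FEAT))

    # CNN part: a fully connected layer expands each 9-feature row to 200
    # units, yielding a square (200 x 200 x 1) "image" with mixed sensor order
    x = layers.Dense(SEQ_LEN, activation=tf.nn.leaky_relu)(inp)
    x = layers.Reshape((SEQ_LEN, SEQ_LEN, 1))(x)
    for n_filters in (8, 16, 32):          # three 3x3 convolution layers
        x = layers.Conv2D(n_filters, 3, padding="same",
                          activation=tf.nn.leaky_relu)(x)
    # large filter with wide strides compresses 200x200 down to 40x40
    x = layers.Conv2D(32, 5, strides=5, padding="same",
                      activation=tf.nn.leaky_relu)(x)
    x = layers.Flatten()(x)
    cnn_out = layers.Dense(256, activation=tf.nn.leaky_relu)(x)

    # FCL part: only the current (last) time step of the sequence
    cur = inp[:, -1, :]
    y = layers.Dense(256, activation=tf.nn.leaky_relu)(cur)
    y = layers.Dense(1024, activation=tf.nn.leaky_relu)(y)
    y = layers.Dense(1024, activation=tf.nn.leaky_relu)(y)
    fcl_out = layers.Dense(256, activation=tf.nn.leaky_relu)(y)

    # concatenate both 256-unit heads and regress the two outputs [roll, pitch]
    merged = layers.Concatenate()([cnn_out, fcl_out])
    out = layers.Dense(2)(merged)
    return Model(inp, out)

model = build_attitude_net()
```

The parallel design mirrors the text: the convolutional branch sees the full 2 s window for dynamic transients, while the dense branch sees only the current sample for quasi-static attitude.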

3.2. Dual Extended Kalman Filter Design

3.2.1. State Space Model

The state space model is based on the vehicle 6-DOF sprung mass model [31,32] to express the six-dimensional acceleration and angular velocity. The dynamics are composed of the Euler rigid-body equations and the forces and moments about each axis, which are given by the respective dynamics [29]. Figure 3 shows the six-dimensional motion of the vehicle sprung mass. The forces and moments about each axis are given by:
$$\Sigma F_x = m_s\left(\dot{v}_x + v_z\dot{\theta} - v_y\dot{\psi} - h_R\ddot{\theta} + h_R\dot{\phi}\dot{\psi}\right) \quad (8)$$

$$\Sigma F_y = m_s\left(\dot{v}_y + v_x\dot{\psi} - v_z\dot{\phi} - h_R\ddot{\phi} + h_R\dot{\theta}\dot{\psi}\right) \quad (9)$$

$$\Sigma F_z = m_s\left(\dot{v}_z + v_y\dot{\phi} - v_x\dot{\theta} - h_R\dot{\phi}^2 - h_R\dot{\theta}^2\right) \quad (10)$$

$$\Sigma M_x = I_x\ddot{\phi} + (I_z - I_y)\dot{\theta}\dot{\psi} - m_s h_R\left(\dot{v}_y + v_x\dot{\psi} - v_z\dot{\phi} - h_R\ddot{\phi} + h_R\dot{\theta}\dot{\psi}\right) \quad (11)$$

$$\Sigma M_y = I_y\ddot{\theta} + (I_x - I_z)\dot{\phi}\dot{\psi} \quad (12)$$

$$\Sigma M_z = I_z\ddot{\psi} + (I_y - I_x)\dot{\phi}\dot{\theta} \quad (13)$$
where $\Sigma F_x$, $\Sigma F_y$, $\Sigma F_z$ are the sums of forces along the x, y, and z axes, respectively, $\Sigma M_x$, $\Sigma M_y$, $\Sigma M_z$ are the sums of moments about the x, y, and z axes, respectively, and $I_x$, $I_y$, $I_z$ are the moments of inertia about the x, y, and z axes, respectively. $\Sigma F_x$, $\Sigma F_y$, $\Sigma F_z$, $\Sigma M_x$, $\Sigma M_y$, and $\Sigma M_z$ can be derived from a vehicle model such as the bicycle model. Therefore, the nonlinear state space equation can be written as:
$$\dot{x}_1 = \frac{\dfrac{N_{tf}\eta_{tf}}{r_{eff}}u_1 - \Sigma F_{brake} - \dfrac{1}{2}\rho C_d A x_1^2 - m_s g \sin x_8}{m_s + \dfrac{(I_e + I_t)N_{tf}^2 + I_d N_f^2 + I_w}{r_{eff}^2}} - x_3 x_5 + x_2 x_6 + h_R \dot{x}_5 - h_R x_4 x_6 \quad (14)$$

$$\dot{x}_2 = \frac{1}{m_s}\left(\Sigma F_y + m_s g \sin x_7\right) - x_1 x_6 + x_3 x_4 + h_R \dot{x}_4 - h_R x_5 x_6 \quad (15)$$

$$\dot{x}_3 = \frac{1}{m_s}\left(k_s (l_f - l_r)\sin x_8 + b_s (l_f - l_r) x_5 \cos x_8\right) - x_2 x_4 + x_1 x_5 + h_R x_4^2 + h_R x_5^2 \quad (16)$$

$$\dot{x}_4 = \frac{1}{I_x + m_s h_R^2}\left(h_R m_s g \sin x_7 - \frac{1}{2}k_s l_s^2 \sin x_7 - \frac{1}{2}b_s l_s^2 x_4 \cos x_7 + \Sigma F_y (h_{CG} - h_R) - (I_z - I_y)x_5 x_6 + m_s h_R\left(\dot{x}_2 + x_1 x_6 - x_3 x_4 + h_R x_5 x_6\right)\right) \quad (17)$$

$$\dot{x}_5 = \frac{1}{I_y}\left(-(l_f^2 + l_r^2)k_s \sin x_8 - (l_f^2 + l_r^2)b_s x_5 \cos x_8 - (\Sigma F_{drive} - \Sigma F_{brake})h_{CG} - (I_x - I_z)x_4 x_6\right) \quad (18)$$

$$\dot{x}_6 = \frac{1}{I_z}\left(-\frac{2 l_f C_{\alpha f} - 2 l_r C_{\alpha r}}{x_1}x_2 - \frac{2 l_f^2 C_{\alpha f} + 2 l_r^2 C_{\alpha r}}{x_1}x_6 + 2 l_f C_{\alpha f} u_3 - (I_y - I_x)x_4 x_5\right) \quad (19)$$

$$\dot{x}_7 = x_4 \quad (20)$$

$$\dot{x}_8 = x_5 \quad (21)$$
$\Sigma F_{drive}$, $\Sigma F_{brake}$, and $\Sigma F_y$ have been described previously. The state vector is $x = [v_x\; v_y\; v_z\; \dot{\phi}\; \dot{\theta}\; \dot{\psi}\; \phi\; \theta]^T$ and the state input is $u = [T_e\; P_{brake}\; \delta]^T$. $\rho$ is the air density, $C_d$ is the drag coefficient, and $A$ is the frontal area of the vehicle.
The state outputs include the longitudinal acceleration, lateral acceleration, and yaw rate from the IMU, together with the roll and pitch angles from the neural network. Thus, the state output is $y = [a_x\; a_y\; \dot{\psi}\; \phi\; \theta]^T$, which can be written as:
$$a_x = \frac{\dfrac{N_{tf}\eta_{tf}}{r_{eff}}u_1 - \Sigma F_{brake} - \dfrac{1}{2}\rho C_d A x_1^2 - m_s g \sin x_8}{m_s + \dfrac{(I_e + I_t)N_{tf}^2 + I_d N_f^2 + I_w}{r_{eff}^2}} \quad (22)$$

$$a_y = \frac{1}{m_s}\left(\Sigma F_y + m_s g \sin x_7\right) \quad (23)$$

$$\dot{\psi} = x_6 \quad (24)$$

$$\phi = x_7 \quad (25)$$

$$\theta = x_8 \quad (26)$$

3.2.2. Observability Check

Before designing the estimator, the observability must be checked. The observability of a nonlinear state space model can be checked using Lie derivatives [33]. When the state space equation is expressed as $\dot{x} = f(x, u)$, $y = h(x)$, the Lie derivatives and the observability matrix can be written as:
$$L_f^0 h = h(x), \qquad L_f^{k+1} h = \frac{\partial L_f^k h}{\partial x} f = \nabla L_f^k h \cdot f \quad (27)$$

$$O = \frac{\partial}{\partial x}\begin{bmatrix} L_f^0 h \\ L_f^1 h \\ \vdots \\ L_f^{n-1} h \end{bmatrix}\Bigg|_{x = x_0} \quad (28)$$
where $O$ is the observability matrix. Using the rank of the observability matrix, the observability of the system can be checked locally. Checking the rank shows that the observability matrix has full rank whenever $v_x \neq 0$; therefore, the system is locally observable everywhere except at $v_x = 0$.
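The Lie-derivative construction in (27)–(28) can be automated symbolically. The following sketch runs the same procedure on a toy two-state pendulum system (not the 8-state vehicle model, which is too large to reproduce here) and checks the rank of the resulting observability matrix:

```python
import sympy as sp

# Toy nonlinear system to illustrate the Lie-derivative observability test:
# x1' = x2, x2' = -sin(x1), y = x1  (a pendulum; not the vehicle model itself)
x1, x2 = sp.symbols("x1 x2")
x = sp.Matrix([x1, x2])
f = sp.Matrix([x2, -sp.sin(x1)])
h = sp.Matrix([x1])

# Stack the gradients of L_f^0 h, L_f^1 h, ..., L_f^{n-1} h as in Eq. (28)
n = len(x)
L = h
rows = []
for _ in range(n):
    rows.append(L.jacobian(x))
    L = L.jacobian(x) * f      # next Lie derivative: L_f^{k+1} h = (dL/dx) f

O = sp.Matrix.vstack(*rows)
print(O.rank())  # -> 2 (full rank: the toy system is locally observable)
```

For the vehicle model, the same loop runs over the 8-state $f$ and 5-output $h$, and the rank is evaluated after substituting a numeric operating point $x_0$ with $v_x \neq 0$.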

3.2.3. Dual Extended Kalman Filter Module

Among the vehicle dynamics parameters, the cornering stiffness varies with conditions such as the vehicle load. To reduce modeling errors, the cornering stiffness should be estimated.
This study adopts the DEKF as an estimator to reduce the error and increase the modeling accuracy. Figure 4 shows the DEKF scheme. The state vector, state input, state output, and state space equation were discussed in Section 3.2.1. The DEKF module operates according to the following recursive algorithm [34]:
Parameter prediction:

$$\hat{x}_p^-(t) = \hat{x}_p(t-1) \quad (29)$$

$$P_p^-(t) = P_p(t-1) + Q_p \quad (30)$$

State prediction:

$$\hat{x}_s^-(t) = f\left(\hat{x}_s(t-1), u(t), \hat{x}_p^-(t)\right) \quad (31)$$

$$P_s^-(t) = F_s(t) P_s(t-1) F_s^T(t) + Q_s \quad (32)$$

State update:

$$K_s(t) = P_s^-(t) H_s^T(t)\left[H_s(t) P_s^-(t) H_s^T(t) + R_s\right]^{-1} \quad (33)$$

$$\hat{x}_s(t) = \hat{x}_s^-(t) + K_s(t)\left[y(t) - h\left(\hat{x}_s^-(t)\right)\right] \quad (34)$$

$$P_s(t) = P_s^-(t) - K_s(t) H_s(t) P_s^-(t) \quad (35)$$

Parameter update:

$$K_p(t) = P_p^-(t) H_p^T(t)\left[H_p(t) P_p^-(t) H_p^T(t) + R_p\right]^{-1} \quad (36)$$

$$\hat{x}_p(t) = \hat{x}_p^-(t) + K_p(t)\left[y(t) - h\left(\hat{x}_s^-(t)\right)\right] \quad (37)$$

$$P_p(t) = P_p^-(t) - K_p(t) H_p(t) P_p^-(t) \quad (38)$$
where the parameter vector is $\hat{x}_p = [C_{\alpha f}\; C_{\alpha r}]^T$ and the state vector is $\hat{x}_s = [v_x\; v_y\; v_z\; \dot{\phi}\; \dot{\theta}\; \dot{\psi}\; \phi\; \theta]^T$; $P_p$ and $P_s$ are the error covariance matrices for the parameters and states, respectively; $Q_p$ and $Q_s$ are the process noise covariance matrices for the parameter and state estimators, respectively; and $R_p$ and $R_s$ are the output noise covariance matrices for the parameter and state estimators, respectively. $R_p$ and $R_s$ are identical because both estimators share the same output $y$. $K_p$ and $K_s$ are the Kalman gain matrices for the parameter and state estimators, respectively. $F_s$ and $H_s$ are the Jacobian matrices of the state and output equations, respectively, expressed as follows:
$$F_s = \begin{bmatrix} \dfrac{\partial f_1}{\partial x_1} & \cdots & \dfrac{\partial f_1}{\partial x_8} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial f_8}{\partial x_1} & \cdots & \dfrac{\partial f_8}{\partial x_8} \end{bmatrix} \quad (39)$$

$$H_s = \begin{bmatrix} \dfrac{\partial h_1}{\partial x_1} & \cdots & \dfrac{\partial h_1}{\partial x_8} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial h_5}{\partial x_1} & \cdots & \dfrac{\partial h_5}{\partial x_8} \end{bmatrix} \quad (40)$$
In Equations (36) and (38), $H_p$ is the Jacobian matrix of the state output with respect to the parameters and can be expressed as:

$$H_p = \frac{\partial y}{\partial \hat{x}_p} = \begin{bmatrix} \dfrac{\partial a_x}{\partial C_{\alpha f}} & \dfrac{\partial a_x}{\partial C_{\alpha r}} \\ \vdots & \vdots \\ \dfrac{\partial \theta}{\partial C_{\alpha f}} & \dfrac{\partial \theta}{\partial C_{\alpha r}} \end{bmatrix} \quad (41)$$
$R_p$ and $R_s$ can be determined by engineering tuning based on the sensor noise. $Q_s$ and $Q_p$ can also be tuned by the engineer, but each element of $Q_p$ is tuned so that its square root is approximately 1% of the initial value of the corresponding parameter.
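The recursion (29)–(38) can be collected into a single generic step function. The sketch below is not the 8-state vehicle filter; it demonstrates the same dual structure on a toy scalar system $x^+ = a\,x + u$ with output $y = x$, where the unknown parameter $a$ plays the role the cornering stiffness plays in the paper:

```python
import numpy as np

def dekf_step(xs, Ps, xp, Pp, u, y, f, h, Fs, Hs, Hp, Qs, Qp, Rs, Rp):
    """One recursion of the dual EKF, mirroring Eqs. (29)-(38).

    xs/Ps: state estimate and covariance; xp/Pp: parameter estimate and
    covariance. f and h are the state transition and output functions;
    Fs, Hs, Hp return the corresponding Jacobians. Generic sketch: the
    8-state vehicle model itself is not reproduced here.
    """
    # Parameter prediction (parameters modeled as a random walk)
    xp_pred, Pp_pred = xp, Pp + Qp
    # State prediction using the predicted parameters
    xs_pred = f(xs, u, xp_pred)
    F = Fs(xs, u, xp_pred)
    Ps_pred = F @ Ps @ F.T + Qs
    # State update
    H = Hs(xs_pred, xp_pred)
    Ks = Ps_pred @ H.T @ np.linalg.inv(H @ Ps_pred @ H.T + Rs)
    innov = y - h(xs_pred, xp_pred)
    xs_new = xs_pred + Ks @ innov
    Ps_new = Ps_pred - Ks @ H @ Ps_pred
    # Parameter update, driven by the same output innovation
    Hpm = Hp(xs, u, xp_pred)
    Kp = Pp_pred @ Hpm.T @ np.linalg.inv(Hpm @ Pp_pred @ Hpm.T + Rp)
    xp_new = xp_pred + Kp @ innov
    Pp_new = Pp_pred - Kp @ Hpm @ Pp_pred
    return xs_new, Ps_new, xp_new, Pp_new

# Toy demonstration: x+ = a*x + u with y = x; jointly estimate x and a
a_true = 0.9
f  = lambda xs, u, xp: np.array([xp[0] * xs[0] + u])
h  = lambda xs, xp: xs.copy()
Fs = lambda xs, u, xp: np.array([[xp[0]]])
Hs = lambda xs, xp: np.eye(1)
Hp = lambda xs, u, xp: np.array([[xs[0]]])  # d(predicted y)/da = previous state

xs, Ps = np.zeros(1), np.eye(1)
xp, Pp = np.array([0.5]), np.eye(1)         # start with a poor guess of a
Qs = Qp = 1e-4 * np.eye(1)
Rs = Rp = 1e-2 * np.eye(1)
x_true = np.zeros(1)
for k in range(300):
    u = np.sin(0.05 * k)
    x_true = np.array([a_true * x_true[0] + u])
    xs, Ps, xp, Pp = dekf_step(xs, Ps, xp, Pp, u, x_true,
                               f, h, Fs, Hs, Hp, Qs, Qp, Rs, Rp)
# xp[0] converges toward the true parameter 0.9
```

Both filters consume the same innovation $y - h(\hat{x}_s^-)$, exactly as in (34) and (37); only the Jacobians differ, which is the structural point of the dual arrangement.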

4. Results and Analysis

This section presents and discusses the experimental verification of the algorithms described in Section 3. The neural network and the DEKF, designed in Section 3.1 and Section 3.2, respectively, are evaluated separately, and their performance is compared with that of commercial sensors.

4.1. Roll and Pitch Estimator (Neural Network)

4.1.1. Dataset

Sensor data from real-car experiments were used for training and validating the neural network. The input data set contained the car's chassis sensor signals, and the label data set of roll and pitch angles was obtained from the high-accuracy GPS-inertial navigation system (INS) RT-3002 (Oxford Technical Solutions Ltd., Bicester, UK). A total of 176,259 data sets were used for training, logged over about 30 min at 10 ms intervals in various situations, and offline validation was performed with the same vehicle in the scenarios shown in Table 3. The software was implemented in Python using the TensorFlow 1.6 framework.

4.1.2. Validation Result and Analysis

The estimation performance was validated using offline sensor data logged in the various cases shown in Table 3. The root mean square error (RMSE) was calculated against the RT-3002, which was treated as the reference, and compared with the datasheet of the SST810, a commercial inclinometer (Vigor Technology Co., Ltd., Xiamen, China). Figure 5, Figure 6, Figure 7 and Figure 8 show the scenario and roll/pitch estimation results for Cases 1–4, respectively.
Table 4 shows the accuracy from the SST810 datasheet and the RMSE of the estimation results for Cases 1–4. Figure 5, Figure 6, Figure 7 and Figure 8 show that the roll and pitch values between 0 and 2 s are fixed at zero. This is due to the structure described in Section 3.1.2, which requires 2 s of sequential data as input; thus, no output can be computed during the initial 2 s. The RMSE was therefore calculated over the time window excluding the initial 2 s.
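The warm-up handling can be expressed as a small helper; the sampling time and window length follow the 0.01 s / 2 s values stated above:

```python
import numpy as np

def rmse_after_warmup(est, ref, dt=0.01, warmup_s=2.0):
    """RMSE against a reference signal, skipping the initial warm-up
    window (here 2 s) in which the 200-sample input buffer is not yet
    full and the estimator output is held at zero."""
    k0 = int(round(warmup_s / dt))
    err = np.asarray(est, dtype=float)[k0:] - np.asarray(ref, dtype=float)[k0:]
    return float(np.sqrt(np.mean(err ** 2)))

# 2 s of zero output followed by a constant 0.1 deg error -> RMSE of 0.1 deg
est = [0.0] * 200 + [1.1] * 100
ref = [0.0] * 200 + [1.0] * 100
```

Without the warm-up cut, the 200 zero-valued samples would dilute the error statistic and make the estimator look artificially accurate.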
Case 1 shows that the roll RMSE is approximately 0.1° and the pitch RMSE is approximately 0.4°. Some offset error remained even though the vehicle speed was zero, i.e., a stationary scenario. The commercial sensor's error is 0.05° in the static scenario, so the estimation results show a larger error than the commercial sensor. The neural network estimator uses only the vehicle's chassis sensor data; thus, sensor performance has a significant impact on estimation performance. In particular, the IMU's characteristic bias error likely contributed to the offset errors in the estimation results.
Cases 2–4 include scenarios in which rapid changes in vehicle attitude occur in regions of acceleration or deceleration. The validation results show high estimation accuracy in these scenarios. In addition, a noise reduction effect relative to the RT-3002 is noticeable in the 15–60 s interval of Case 3; the neural network estimator therefore produces a stable output. In these dynamic scenarios, the estimator outperforms the commercial sensor.

4.2. Acceleration and Angular Velocity Estimator (DEKF)

4.2.1. Validation Environment

The acceleration and angular velocity estimator was validated by simulation using CarSim, a commercial vehicle simulation software package. To create an environment similar to an actual vehicle, Gaussian white noise was added to the sensor values, as shown in Table 5. To verify that the acceleration sensor errors due to roll and pitch are corrected, simulations were conducted in the scenarios shown in Table 6.

4.2.2. Validation Results and Analysis

The correction performance for $a_x$, $a_y$, and $\dot{\psi}$ was validated by comparing the IMU sensor values, DEKF estimates, and reference values, while the estimates of the roll and pitch rates, for which no sensors are available, were validated against the reference values only. The RMSE of the estimates was calculated with respect to the reference and compared with the datasheet of the SMI860, a commercial six-dimensional IMU from BOSCH Co., Ltd. In addition, to validate the effect of the DEKF's parameter estimator, the estimates of $C_{\alpha f}$ and $C_{\alpha r}$ were recorded for each case, and the results with and without cornering stiffness estimation were compared for Case 3. Figure 9, Figure 10, Figure 11 and Figure 12 show the simulation scenarios and DEKF results.
Table 7 shows the accuracy of the commercial sensor, obtained from its datasheet, and the RMSE of the estimates for Cases 1–3. In Case 1, the $a_y$ sensor value exhibited a fatal error due to the roll angle, and Case 2 exhibited considerably varying roll and pitch angles; therefore, the $a_x$ and $a_y$ sensor values may be inaccurate. Figure 9b and Figure 10b confirm that these sensor errors are successfully corrected, yielding estimates close to the reference. In addition, in normal driving scenarios, as in the 60–65 s interval of Case 3, an $a_x$ sensor error caused by pitch is observed, which is also successfully corrected. Furthermore, in all cases, the filtering effect that reduces the noise of the existing sensors can be seen in $a_x$, $a_y$, and $\dot{\psi}$. Compared with the commercial sensor, $a_x$ and $a_y$ show similar accuracy, whereas $\dot{\psi}$ shows significantly higher accuracy. Moreover, the commercial sensor remains subject to errors caused by the roll and pitch angles; the DEKF therefore achieves commercial-sensor-level or better accuracy even while correcting those errors.
The roll and pitch rates could be estimated because they are included in the state vector, even though their sensor values are not measured. The roll rate accuracy was occasionally lower than that of the commercial sensor, depending on the case. The pitch rate cannot be compared because the commercial sensor provides no such output, but the RMSE values confirm that the estimates are quite accurate.
Figure 9d, Figure 10d and Figure 11d show the estimated cornering stiffness. Figure 12 and Table 8 show the results with and without cornering stiffness estimation in Case 3, which can improve accuracy by approximately 3–5%, particularly with a greater effect on a y .

5. Conclusions and Future Work

In this paper, we proposed a CNN-based neural network to estimate the roll and pitch angles of a vehicle. A DEKF was used to correct the gravitational effect caused by the roll and pitch for estimating the exact acceleration and angular velocity.
By using the vehicle's chassis sensor data as a time series, the neural network could estimate the roll and pitch angles of the vehicle without a GPS or six-dimensional IMU. Based on the estimated roll and pitch angles, we designed an extended Kalman filter (EKF) using the 6-DOF vehicle model. A second EKF was designed to estimate the cornering stiffness, and the two EKFs together constitute the DEKF.
Using experimental data obtained using a real car, the proposed roll and pitch estimator was validated, and the DEKF was validated in the CarSim simulation environment. The roll and pitch estimator showed an improved performance compared to the commercial sensors in dynamic scenarios and also reduced the noise. However, the performance in static scenarios was weaker. The acceleration and angular velocity estimator could effectively correct the acceleration sensor error due to roll and pitch with a de-noising effect. In addition, the roll and pitch rates that could not be obtained from sensors could be estimated with significant accuracy. By comparing the results before and after including the cornering stiffness, we found that the accuracy is improved if the cornering stiffness is considered.
On the other hand, our work has limitations and open challenges. We plan to consider fusion with other algorithms to improve the attitude estimation performance in static scenarios. In addition, the proposed method has not been tested under changes in vehicle weight, so its performance under such changes also needs to be verified. Furthermore, the proposed algorithm is difficult to deploy as an embedded system in a vehicle because of the neural network's large size. Future work should enable the algorithm to run in real time in a vehicle through simplification and optimization.

Author Contributions

Conceptualization, M.O. and S.O.; methodology, M.O.; software, M.O.; validation, M.O. and S.O.; formal analysis, M.O. and S.O.; investigation, M.O.; resources, J.H.P.; data curation, S.O.; writing—original draft preparation, M.O.; writing—review and editing, S.O. and J.H.P.; visualization, M.O.; supervision, J.H.P.; project administration, S.O. and J.H.P.; funding acquisition, J.H.P. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Hyundai Motor Group Academy Industry Research Collaboration.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Restrictions apply to the availability of these data. Data was obtained from Hyundai Motor Company and are available with the permission of Autonomous Driving Center of Hyundai Motor Company R&D Division.

Acknowledgments

This work was supported by Autonomous Driving Center of Hyundai Motor Company R&D Division.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Feature Importance Analysis

Figure A1. Result of feature importance weight analysis.
Table A1. Feature name of each number.
Feature Number | Feature Name
1 | Ax
2 | Ay
3 | Yaw Rate
4 | Brake Pressure
5 | Steering Angle
6 | Engine Throttle
7 | Σ Wheel Speed (Vx)
All features have an importance weight higher than 0.8; therefore, feature reduction is not performed.

Appendix B. Hyperparameter of Neural Network

Table A2. Details of neural network.
Part | Layer | Name | Filter Size | Activation | Output Size
CNN part | 1-layer | Fully connected | – | Leaky ReLU | (200 × 200 × 1)
CNN part | 2-layer | Convolution layer | (3 × 3 × 1), 8 EA 1 | Leaky ReLU | (200 × 200 × 8)
CNN part | 3-layer | Convolution layer | (3 × 3 × 8), 16 EA 1 | Leaky ReLU | (200 × 200 × 16)
CNN part | 4-layer | Convolution layer | (3 × 3 × 16), 32 EA 1 | Leaky ReLU | (200 × 200 × 32)
CNN part | 5-layer | Convolution layer | (5 × 5 × 32), 32 EA 2 | Leaky ReLU | (40 × 40 × 32)
CNN part | 6-layer | Fully connected | – | Leaky ReLU | (256 × 1)
FCL part | 1-layer | Fully connected | – | Leaky ReLU | (256 × 1)
FCL part | 2-layer | Fully connected | – | Leaky ReLU | (1024 × 1)
FCL part | 3-layer | Fully connected | – | Leaky ReLU | (1024 × 1)
FCL part | 4-layer | Fully connected | – | Leaky ReLU | (256 × 1)
Final layer | – | Fully connected | – | – | (2 × 1)

1 1 × 1 stride was used. 2 5 × 5 strides were used.

References

1. Lee, H. Reliability indexed sensor fusion and its application to vehicle velocity estimation. J. Dyn. Sys. Meas. Control 2006, 128, 236–243.
2. Chu, L.; Shi, Y.; Zhang, Y.; Liu, H.; Xu, M. Vehicle lateral and longitudinal velocity estimation based on Adaptive Kalman Filter. In Proceedings of the 3rd International Conference on Advanced Computer Theory and Engineering (ICACTE), Chengdu, China, 20–22 August 2010.
3. Wu, L.-J. Experimental study on vehicle speed estimation using accelerometer and wheel speed measurements. In Proceedings of the 2011 Second International Conference on Mechanic Automation and Control Engineering, Inner Mongolia, China, 15–17 July 2011.
4. Klomp, M.; Gao, Y.; Bruzelius, F. Longitudinal velocity and road slope estimation in hybrid electric vehicles employing early detection of excessive wheel slip. Veh. Syst. Dyn. 2014, 52 (Suppl. S1), 172–188.
5. Song, C.K.; Uchanski, M.; Hedrick, J.K. Vehicle speed estimation using accelerometer and wheel speed measurements. In Proceedings of the SAE International Body Engineering Conference and Automotive & Transportation Technology Conference, Paris, France, 9–11 July 2002.
6. Jin, X.; Yin, G.; Chen, N. Advanced estimation techniques for vehicle system dynamic state: A survey. Sensors 2019, 19, 4289.
7. Zhang, X.; Zhao, J.; LeCun, Y. Character-level convolutional networks for text classification. arXiv 2015, arXiv:1509.01626.
8. Tseng, H.E.; Xu, L.; Hrovat, D. Estimation of land vehicle roll and pitch angles. Veh. Syst. Dyn. 2007, 45, 433–443.
9. Xiong, L.; Xia, X.; Lu, Y.; Liu, W.; Gao, L.; Song, S.; Han, Y.; Yu, Z. IMU-Based Automated Vehicle Slip Angle and Attitude Estimation Aided by Vehicle Dynamics. Sensors 2019, 19, 1930.
10. Lee, J.K.; Park, E.J.; Robinovitch, S.N. Estimation of attitude and external acceleration using inertial sensor measurement during various dynamic conditions. IEEE Trans. Instrum. Meas. 2012, 61, 2262–2273.
11. Suh, Y.-S.; Park, S.-K.; Kang, H.-J.; Ro, Y.-S. Attitude estimation adaptively compensating external acceleration. JSME Int. J. Ser. C Mech. Syst. Mach. Elem. Manuf. 2006, 49, 172–179.
12. Ahmed, H.; Tahir, M. Accurate attitude estimation of a moving land vehicle using low-cost MEMS IMU sensors. IEEE Trans. Intell. Transp. Syst. 2016, 18, 1723–1739.
13. Scholte, W.J.; Marco, V.R.; Nijmeijer, H. Experimental Validation of Vehicle Velocity, Attitude and IMU Bias Estimation. IFAC-PapersOnLine 2019, 52, 118–123.
14. Caron, F.; Duflos, E.; Pomorski, D.; Vanheeghe, P. GPS/IMU data fusion using multisensor Kalman filtering: Introduction of contextual aspects. Inf. Fusion 2006, 7, 221–230.
15. Bevly, D.M.; Ryu, J.; Gerdes, J.C. Integrating INS sensors with GPS measurements for continuous estimation of vehicle sideslip, roll, and tire cornering stiffness. IEEE Trans. Intell. Transp. Syst. 2006, 7, 483–493.
16. Ryu, J.; Rossetter, E.J.; Gerdes, J.C. Vehicle sideslip and roll parameter estimation using GPS. In Proceedings of the AVEC International Symposium on Advanced Vehicle Control, Hiroshima, Japan, 9–13 September 2002.
17. Ahmad, I.; Benallegue, A.; El Hadri, A. Sliding mode based attitude estimation for accelerated aerial vehicles using GPS/IMU measurements. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013.
18. Rajamani, R.; Piyabongkarn, D.; Tsourapas, V.; Lew, J.Y. Parameter and state estimation in vehicle roll dynamics. IEEE Trans. Intell. Transp. Syst. 2011, 12, 1558–1567.
19. Garcia Guzman, J.; Prieto Gonzalez, L.; Pajares Redondo, J.; Sanz Sanchez, S.; Boada, B.L. Design of Low-Cost Vehicle Roll Angle Estimator Based on Kalman Filters and an IoT Architecture. Sensors 2018, 18, 1800.
20. Park, M.; Yim, S. Design of Robust Observers for Active Roll Control. IEEE Access 2019, 7, 173034–173043.
21. Chung, T.; Yi, S.; Yi, K. Estimation of vehicle state and road bank angle for driver assistance systems. Int. J. Automot. Technol. 2007, 8, 111–117.
22. Tseng, H.E. Dynamic estimation of road bank angle. Veh. Syst. Dyn. 2001, 36, 307–328.
23. Oh, J.; Choi, S.B. Vehicle roll and pitch angle estimation using a cost-effective six-dimensional inertial measurement unit. Proc. Inst. Mech. Eng. Part D J. Automob. Eng. 2013, 227, 577–590.
24. Kamal Mazhar, M.; Khan, M.J.; Bhatti, A.I.; Naseer, N. A Novel Roll and Pitch Estimation Approach for a Ground Vehicle Stability Improvement Using a Low Cost IMU. Sensors 2020, 20, 340.
25. Vargas-Meléndez, L.; Boada, B.L.; Boada, M.J.L.; Gauchía, A.; Díaz, V. A sensor fusion method based on an integrated neural network and Kalman filter for vehicle roll angle estimation. Sensors 2016, 16, 1400.
26. Vargas-Melendez, L.; Boada, B.L.; Boada, M.J.L.; Gauchia, A.; Diaz, V. Sensor Fusion based on an integrated neural network and probability density function (PDF) dual Kalman filter for on-line estimation of vehicle parameters and states. Sensors 2017, 17, 987.
27. González, L.P.; Sánchez, S.S.; Garcia-Guzman, J.; Boada, M.J.L.; Boada, B.L. Simultaneous Estimation of Vehicle Roll and Sideslip Angles through a Deep Learning Approach. Sensors 2020, 20, 3679.
28. Taehui, L.; Sang Won, Y. Estimation of Vehicle Roll and Road Bank Angle based on Deep Neural Network. In Proceedings of the KSAE 2018 Annual Autumn Conference & Exhibition, Jeongseon, Korea, 14–17 November 2018.
29. Rajamani, R. Vehicle Dynamics and Control, 2nd ed.; Springer Science & Business Media: New York, NY, USA, 2011.
30. Challita, N.; Khalil, M.; Beauseroy, P. New feature selection method based on neural network and machine learning. In Proceedings of the 2016 IEEE International Multidisciplinary Conference on Engineering Technology (IMCET), Beirut, Lebanon, 2–4 November 2016.
31. Yang, S.; Kim, J. Validation of the 6-dof vehicle dynamics model and its related VBA program under the constant radius turn manoeuvre. Int. J. Automot. Technol. 2012, 13, 593–605.
32. Ray, L.R. Nonlinear tire force estimation and road friction identification: Simulation and experiments. Automatica 1997, 33, 1819–1833.
33. Lee, D.-H.; Kim, I.-K.; Huh, K.-S. Tire Lateral Force Estimation System Using Nonlinear Kalman Filter. Trans. Korean Soc. Automot. Eng. 2012, 20, 126–131.
34. Wenzel, T.A.; Burnham, K.; Blundell, M.; Williams, R. Dual extended Kalman filter for vehicle state and parameter estimation. Veh. Syst. Dyn. 2006, 44, 153–171.
Figure 1. Architecture overview.
Figure 2. Neural network architecture.
Figure 3. Vehicle sprung mass 6-degree-of-freedom motion.
Figure 4. Scheme of dual-extended Kalman filter (DEKF).
Figure 5. Validation results of Case 1. (a) Speed and steering profile. (b) Roll/pitch estimation results.
Figure 6. Validation results of Case 2. (a) Speed and steering profile. (b) Roll/pitch estimation results.
Figure 7. Validation results of Case 3. (a) Speed and steering profile. (b) Roll/pitch estimation results.
Figure 8. Validation results of Case 4. (a) Speed and steering profile. (b) Roll/pitch estimation results.
Figure 9. Simulation results of Case 1. (a) Roll/pitch profile. (b) a_x, a_y, and ψ̇ estimation results. (c) ϕ̇ and θ̇ estimation results. (d) C_αf and C_αr estimation results.
Figure 10. Simulation results of Case 2. (a) Roll/pitch profile. (b) a_x, a_y, and ψ̇ estimation results. (c) ϕ̇ and θ̇ estimation results. (d) C_αf and C_αr estimation results.
Figure 11. Simulation results of Case 3. (a) Roll/pitch profile. (b) a_x, a_y, and ψ̇ estimation results. (c) ϕ̇ and θ̇ estimation results. (d) C_αf and C_αr estimation results.
Figure 12. Effect of cornering stiffness estimation in Case 3.
Table 1. Summary of the literature review.

| References | Methodology | Model |
|---|---|---|
| [8] | Linear observer | IMU kinematic model |
| [9] | Kalman filter | IMU + vehicle dynamics (bicycle model and wheel model) |
| [10,11,12,13] | Kalman filter | IMU external acceleration model |
| [14,15,16] | Kalman filter | IMU + GPS model |
| [17] | Sliding mode observer | IMU + GPS model |
| [18] | Dynamic observer | Vehicle roll dynamics |
| [19] | Kalman filter | Vehicle roll dynamics |
| [20] | Dual Kalman filter | Vehicle roll dynamics |
| [21,22] | Linear observer | Vehicle lateral dynamics (bicycle model) |
| [23,24] | Linear observer, sliding mode observer | Vehicle lateral dynamics (bicycle model) + IMU |
| [25,26,27] | Kalman filter + neural network | Vehicle roll dynamics + fully connected layer |
| [28] | Neural network | Fully connected layer |
Table 2. Selected features considering sensor usability.

| Feature Name | Description |
|---|---|
| Ax | Vehicle longitudinal acceleration from IMU |
| Ay | Vehicle lateral acceleration from IMU |
| Yaw Rate | Vehicle yaw rate from IMU |
| Brake Pres | Brake pressure from the master cylinder |
| Str Angle | Steering wheel angle |
| Throttle | Engine throttle valve opening degree (0–1) |
| Σ WheelSpd (Vx) | Sum of the wheel rotation speeds |
Table 3. Roll/pitch estimator validation scenario.

| Case | Condition | Description |
|---|---|---|
| Case 1 | - | A stationary situation on the uphill |
| Case 2 | Acceleration ±0.5 g or higher; steering ±5 deg or higher | Rapid acceleration with steering on a downhill slope, then quick deceleration |
| Case 3 | Steering ±50 deg or higher; yaw rate ±30 deg/s or higher | Accelerate, turn, and decelerate with steering on a flat road |
| Case 4 | - | Common driving |
Table 4. Root mean square error (RMSE) calculation results of the roll/pitch estimator and accuracy from the SST810 datasheet.

| Case | Roll RMSE (deg) | Pitch RMSE (deg) |
|---|---|---|
| Case 1 | 0.1133 | 0.4188 |
| Case 2 | 0.0573 | 0.5422 |
| Case 3 | 0.1359 | 0.0958 |
| Case 4 | 0.5140 | 0.5283 |

Commercial sensor accuracy (deg) ¹: ≤±0.05 (static situation), ≤±0.5 (dynamic situation).

¹ From the SST810 inclinometer datasheet of Vigor Technology Co., Ltd.
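The RMSE values in Table 4 (and later in Table 7) are the usual root mean square error between an estimated and a reference signal. A minimal, self-contained sketch of the metric; the sample arrays below are hypothetical, not data from the paper:

```python
import math

def rmse(estimates, references):
    """Root mean square error between an estimated and a reference signal."""
    n = len(estimates)
    return math.sqrt(sum((e - r) ** 2 for e, r in zip(estimates, references)) / n)

# Hypothetical roll-angle samples in degrees (illustrative only):
est = [0.10, 0.30, -0.20, 0.40]
ref = [0.00, 0.20, -0.10, 0.50]
print(rmse(est, ref))  # ≈ 0.1 deg
```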
Table 5. Sensor configuration for simulation.

| Sensor | Noise | Unit |
|---|---|---|
| IMU (a_x, a_y) | 0.1 (RMS) + 10 (%) | m/s² |
| IMU (ψ̇) | 0.01 (RMS) + 10 (%) | rad/s |
| Steering angle | 0.05 (RMS) + 10 (%) | rad |
| Engine torque | 7 (RMS) | N·m |
| Brake pressure | 0.05 (RMS) | MPa |
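Each noise entry in Table 5 combines an absolute RMS term with a 10% signal-proportional term. The exact combination rule is not given in this excerpt, so the sketch below assumes the two terms add into the standard deviation of zero-mean Gaussian noise; the function and parameter names are illustrative, not from the paper:

```python
import random

def noisy_measurement(true_value, rms, proportional=0.0, rng=random):
    """Corrupt one sample with zero-mean Gaussian noise.

    sigma adds the absolute RMS term and the signal-proportional term;
    this combination rule is an assumption, not taken from the paper.
    """
    sigma = rms + proportional * abs(true_value)
    return true_value + rng.gauss(0.0, sigma)

# IMU lateral acceleration channel: 0.1 m/s^2 (RMS) + 10 (%) per Table 5
rng = random.Random(0)  # fixed seed for a reproducible simulation run
ay_meas = noisy_measurement(2.0, rms=0.1, proportional=0.10, rng=rng)
```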
Table 6. Acceleration/angular velocity estimator validation scenario.

| Case | Condition | Description |
|---|---|---|
| Case 1 | Roll ±30 deg or lower; pitch ±3 deg or lower | U-turn with 30 degrees of bank angle |
| Case 2 | Roll ±30 deg or lower; pitch ±10 deg or lower | Sharp turn at 30 degrees of bank angle after 10 degrees of uphill and downhill |
| Case 3 | - | Common driving |
Table 7. RMSE calculation results of the DEKF and accuracy obtained from the SMI860 datasheet.

| Signal | Case | DEKF (RMSE) | Commercial Sensor Accuracy ¹ |
|---|---|---|---|
| a_x (m/s²) | Case 1 | 0.4325 | ≤±0.5 |
| | Case 2 | 0.8075 | |
| | Case 3 | 0.3087 | |
| a_y (m/s²) | Case 1 | 0.8232 | ≤±0.5 |
| | Case 2 | 0.4204 | |
| | Case 3 | 0.5085 | |
| ψ̇ (deg/s) | Case 1 | 0.7391 | ≤±3 |
| | Case 2 | 0.3953 | |
| | Case 3 | 0.5844 | |
| ϕ̇ (deg/s) | Case 1 | 5.8499 | ≤±2 |
| | Case 2 | 5.1394 | |
| | Case 3 | 1.0542 | |
| θ̇ (deg/s) | Case 1 | 1.9251 | - ² |
| | Case 2 | 1.7475 | |
| | Case 3 | 0.6704 | |

¹ From the SMI860 IMU datasheet of BOSCH Co., Ltd. ² The SMI860 cannot sense the pitch rate.
Table 8. RMSE with and without cornering stiffness estimation in Case 3.

| Signal | Without Cornering Stiffness Estimation | With Cornering Stiffness Estimation |
|---|---|---|
| a_y (m/s²) | 0.5541 | 0.5085 |
| ψ̇ (deg/s) | 0.6131 | 0.5844 |
Ok, M.; Ok, S.; Park, J.H. Estimation of Vehicle Attitude, Acceleration, and Angular Velocity Using Convolutional Neural Network and Dual Extended Kalman Filter. Sensors 2021, 21, 1282. https://doi.org/10.3390/s21041282