Motion Constraints and Vanishing Point Aided Land Vehicle Navigation
Abstract
1. Introduction
2. Relative Attitude from Vanishing Point Coordinates
2.1. Coordinate Systems Definition
- Navigation frame (n-frame): for near-Earth navigation, it is defined as the local-level frame whose x, y, and z axes point east, north, and up, respectively.
- Vehicle motion frame (m-frame) [38]: the x-axis is parallel to the non-steered axle and points right, the y-axis points forward, and the z-axis points up. Both the y- and z-axes lie in the vehicle's vertical plane of symmetry. The origin, which is the measurement origin of the vehicle, lies at road height, midway between the wheels of the non-steered axle.
- IMU body frame (b-frame): the x-axis points right, the y-axis points forward, and the z-axis points up. The origin is the measurement origin of the IMU.
- Camera frame (c-frame): the camera looks forward, so the z-axis points forward, the x-axis points right, and the y-axis points down.
- Camera intermediate frame: obtained by rotating the camera frame 90° about its x-axis. It is introduced to derive the relationship between the vanishing point coordinates and the relative attitude of the camera.
- Road markings frame (r-frame): fixed to the road and rotating with the road terrain and direction. The vehicle is assumed to move on a road with parallel lane markings. In the r-frame, the x and y axes lie in the road plane, perpendicular to and along the lane markings, respectively; the z-axis is perpendicular to the road surface and points out of it.
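To make the frame conventions above concrete, the following sketch (ours, not from the paper; NumPy, using the coordinate-transformation convention for direction cosine matrices) builds the matrix that maps camera-frame coordinates into the camera intermediate frame via the 90° rotation about the x-axis:

```python
import numpy as np

def rot_x(theta):
    """Coordinate-transformation matrix for a frame rotation of
    theta (rad) about the x-axis (transpose of the rotation operator)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, c, s],
                     [0.0, -s, c]])

# Camera frame (c): x right, y down, z forward.
# Rotating 90 deg about x yields the intermediate frame:
# x right, y forward, z up -- the same layout as the b- and m-frames.
C_i_c = rot_x(np.deg2rad(90.0))

forward_c = np.array([0.0, 0.0, 1.0])  # optical axis in the c-frame
down_c = np.array([0.0, 1.0, 0.0])     # image-down direction in the c-frame

forward_i = C_i_c @ forward_c  # ~[0, 1, 0]: forward becomes +y
down_i = C_i_c @ down_c        # ~[0, 0, -1]: down becomes -z (z points up)
```

The check at the end confirms that the optical axis ends up along the intermediate frame's forward (+y) axis, which is what makes this frame convenient for relating image measurements to vehicle attitude.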
2.2. Camera Relative Attitude Derived from Vanishing Point Coordinates
2.3. Straight Lane Determination and Vanishing Point Detection
2.4. Measurement Uncertainty
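The core geometric idea of Section 2.2 is that the vanishing point of parallel lane markings encodes the camera's attitude relative to the lane direction. A simplified illustration under a pinhole model follows; the function name and parameters are ours, not the paper's:

```python
import numpy as np

def vp_to_relative_angles(u, v, fx, fy, cx, cy):
    """Relative yaw and pitch (rad) of the lane direction with respect
    to the camera optical axis, from the vanishing point pixel (u, v).
    Camera frame: x right, y down, z forward; pinhole model, no distortion."""
    # Back-project the vanishing point to a direction in the camera frame.
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    yaw = np.arctan2(d[0], d[2])     # positive: lane direction to the right
    pitch = np.arctan2(-d[1], d[2])  # positive: lane direction upward (y is down)
    return yaw, pitch

# When the VP sits at the principal point, the optical axis is aligned
# with the lane, so both relative angles vanish.
yaw0, pitch0 = vp_to_relative_angles(640.0, 360.0, fx=800.0, fy=800.0,
                                     cx=640.0, cy=360.0)
```

The measurement uncertainty treated in Section 2.4 then follows from propagating the pixel noise of (u, v) through this mapping, e.g. via its Jacobian.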
3. Sensor Fusion Process
- (1) The vehicle is moving on a road segment where the road boundaries or lane markings are parallel;
- (2) The camera has been calibrated, i.e., the Interior Orientation Parameters (IOPs) are known. The relative rotation between the IMU frame and the camera frame is also known.
3.1. Filter State
3.2. Motion Constraint Aiding
3.3. Vanishing Point Aiding
3.4. State Augmentation Treatment
- Clone the current value of the state into the augmented clone state;
- Set the augmented state covariance by copying the corresponding covariance blocks of the original state.
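The cloning steps can be sketched as follows (stochastic cloning in the spirit of the cited references; the function and index handling are illustrative, not the paper's exact formulation):

```python
import numpy as np

def clone_state(x, P, idx):
    """Augment the filter state with a copy of x[idx] (stochastic cloning).
    Immediately after cloning, the clone equals the original and is fully
    correlated with it, so the new covariance blocks are copies of the
    corresponding blocks of P."""
    idx = list(idx)
    n = len(x)
    x_aug = np.concatenate([x, x[idx]])
    P_aug = np.zeros((n + len(idx), n + len(idx)))
    P_aug[:n, :n] = P                    # original covariance
    P_aug[:n, n:] = P[:, idx]            # cross-covariance, original vs. clone
    P_aug[n:, :n] = P[idx, :]
    P_aug[n:, n:] = P[np.ix_(idx, idx)]  # clone covariance
    return x_aug, P_aug

# Toy 3-state example: clone the first two states.
x = np.array([0.1, 0.2, 0.3])
P = np.diag([1.0, 2.0, 3.0])
x_aug, P_aug = clone_state(x, P, [0, 1])
```

Full correlation between clone and original is what lets a later relative measurement (here, the VP-derived relative attitude) be processed consistently against the cloned epoch.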
4. Results
4.1. Simulation Results
4.2. Experimental Results
- (1) Free INS: only INS mechanization is performed to calculate the navigation states.
- (2) INS/NHC: the NHC is triggered every 1 s with a measurement noise of 0.1 m/s (1σ).
- (3) INS/NHC/VP: the vanishing-point-aiding module is added to the system. It is triggered by straight-lane-marking detection and a valid vanishing point measurement.
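As a rough illustration of the NHC measurement in configuration (2), the following full-state Kalman update treats the m-frame lateral and vertical velocities as zero-valued pseudo-measurements. This is a simplified sketch, not the paper's error-state formulation; all names are ours:

```python
import numpy as np

def nhc_update(x, P, C_m_n, vel_idx, sigma=0.1):
    """Kalman update with the non-holonomic constraint: the lateral (x)
    and vertical (z) velocities of the vehicle, expressed in the m-frame,
    are pseudo-measured as zero with noise sigma (m/s, 1-sigma).
    C_m_n rotates n-frame vectors into the m-frame; vel_idx indexes the
    n-frame velocity components within the state vector x."""
    n = len(x)
    H = np.zeros((2, n))
    H[:, vel_idx] = C_m_n[[0, 2], :]  # pick m-frame x and z velocity rows
    z = np.zeros(2)                   # no side slip, no vertical motion
    R = (sigma ** 2) * np.eye(2)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(n) - K @ H) @ P
    return x_new, P_new

# Toy example: state = n-frame velocity only, m-frame aligned with n-frame.
x = np.array([0.05, 5.0, 0.02])   # small spurious lateral/vertical velocity
P = 0.1 * np.eye(3)
x_new, P_new = nhc_update(x, P, np.eye(3), vel_idx=[0, 1, 2])
```

In the actual error-state filter the innovation is formed from the INS-indicated m-frame velocity rather than from the state directly; this full-state version only illustrates how the constraint suppresses lateral and vertical velocity error while leaving the forward velocity untouched.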
4.2.1. Results of Experiment #1: Path-0028
4.2.2. Results of Experiment #2: Path-0101
5. Discussion
5.1. Robustness of VP-Aiding Approach
- (1) No VP: only motion constraints are used to aid the INS;
- (2) VP: the proposed VP-aiding method is used without soft-fault detection;
- (3) VP & AIME: the proposed VP-aiding method is used together with soft-fault detection (AIME).
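The AIME-style test in configuration (3) averages innovations over a window before forming a chi-square statistic, which is what makes slowly growing (soft) faults detectable. A hedged sketch with our own variable names follows; the exact windowing and thresholds in the paper may differ:

```python
import numpy as np

def aime_statistic(innovations, S_list):
    """Averaged-innovation fault test statistic: average the innovations
    over a window, weighted by their covariances, and form a quadratic
    test value. A slowly growing measurement bias (a soft fault, e.g. a
    VP offset from an undetected curved lane) inflates the statistic,
    while zero-mean noise averages out."""
    S_inv = [np.linalg.inv(S) for S in S_list]
    A = sum(S_inv)                 # information of the averaged innovation
    b = sum(Si @ r for Si, r in zip(S_inv, innovations))
    mu = np.linalg.solve(A, b)     # covariance-weighted mean innovation
    return float(mu @ A @ mu)      # compare to a chi-square threshold

# Zero-mean innovations keep the statistic small; a constant bias does not.
rng = np.random.default_rng(0)
S_list = [np.eye(2)] * 20
clean = [rng.normal(0.0, 1.0, 2) for _ in range(20)]
biased = [r + np.array([2.0, 0.0]) for r in clean]
stat_clean = aime_statistic(clean, S_list)
stat_biased = aime_statistic(biased, S_list)
```

The statistic would be compared against a chi-square threshold with degrees of freedom equal to the innovation dimension; exceeding it flags the VP measurement stream as faulty, matching the "VP & AIME" rows in Table 4.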
5.2. A Complement to Point-Based VO-Aiding
6. Conclusions
Author Contributions
Acknowledgments
Conflicts of Interest
References
- Eckhoff, D.; Sofra, N.; German, R. A performance study of cooperative awareness in ETSI ITS G5 and IEEE WAVE. In Proceedings of the 2013 10th Annual Conference on Wireless On-Demand Network Systems and Services, WONS 2013, Banff, AB, Canada, 18–20 March 2013; pp. 196–200. [Google Scholar]
- European Telecommunications Standards Institute. Intelligent Transport Systems (ITS)—Vehicular Communications—Basic Set of Applications—Part 2: Specification of Cooperative Awareness Basic Service ETSI EN 302 637-2 V1.3.2; Technical Report; European Telecommunications Standards Institute: Sophia Antipolis, France, 2014. [Google Scholar]
- Petit, J.; Shladover, S.E. Potential Cyberattacks on Automated Vehicles. IEEE Trans. Intell. Transp. Syst. 2015, 16, 546–556. [Google Scholar] [CrossRef]
- Liu, Y.; Fu, Q.; Liu, Z.; Li, S. GNSS spoofing detection ability of a loosely coupled INS/GNSS integrated navigation system for two integrity monitoring methods. In Proceedings of the 2017 International Technical Meeting of the Institute of Navigation, ITM 2017, Monterey, CA, USA, 30 January–2 February 2017. [Google Scholar]
- Dissanayake, G.; Sukkarieh, S.; Nebot, E.; Durrant-Whyte, H. The aiding of a low-cost strapdown inertial measurement unit using vehicle model constraints for land vehicle applications. IEEE Trans. Robot. Autom. 2001, 17, 731–747. [Google Scholar] [CrossRef]
- Niu, X.; Nassar, S.; El-Sheimy, N. An Accurate Land-Vehicle MEMS IMU/GPS Navigation System Using 3D Auxiliary Velocity Updates. J. Inst. Navig. 2007, 54, 177–188. [Google Scholar] [CrossRef]
- Atia, M.M.; Liu, S.; Nematallah, H.; Karamat, T.B.; Noureldin, A. Integrated indoor navigation system for ground vehicles with automatic 3-D alignment and position initialization. IEEE Trans. Veh. Technol. 2015, 64, 1279–1292. [Google Scholar] [CrossRef]
- Yu, C.; El-Sheimy, N.; Lan, H.; Liu, Z. Map-Based Indoor Pedestrian Navigation Using an Auxiliary Particle Filter. Micromachines 2017, 8, 225. [Google Scholar] [CrossRef]
- Attia, M.; Moussa, A.; El-Sheimy, N. Bridging integrated GPS/INS systems with geospatial models for car navigation applications. In Proceedings of the 23rd International Technical Meeting of the Satellite Division of the Institute of Navigation (ION GNSS 2010), Portland, OR, USA, 21–24 September 2010; pp. 1697–1703. [Google Scholar]
- Tardif, J.P.; George, M.; Laverne, M.; Kelly, A.; Stentz, A. A new approach to vision-aided inertial navigation. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Taipei, Taiwan, 18–22 October 2010; pp. 4161–4168. [Google Scholar]
- Schmid, K.; Ruess, F.; Suppa, M.; Burschka, D. State estimation for highly dynamic flying systems using key frame odometry with varying time delays. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Vilamoura, Portugal, 7–12 October 2012; pp. 2997–3004. [Google Scholar]
- Veth, M.J. Fusion of Imaging and Inertial Sensors for Navigation. Ph.D. Thesis, Air Force Institute of Technology, Dayton, OH, USA, 2008. [Google Scholar]
- Mourikis, A.I.; Roumeliotis, S.I. A multi-state constraint Kalman filter for vision-aided inertial navigation. In Proceedings of the IEEE International Conference on Robotics and Automation, Roma, Italy, 10–14 April 2007; pp. 3565–3572. [Google Scholar]
- Leutenegger, S.; Lynen, S.; Bosse, M.; Siegwart, R.; Furgale, P. Keyframe-based visual-inertial odometry using nonlinear optimization. Int. J. Robot. Res. 2015, 34, 314–334. [Google Scholar] [CrossRef]
- Caprile, B.; Torre, V. Using vanishing points for camera calibration. Int. J. Comput. Vis. 1990, 4, 127–139. [Google Scholar] [CrossRef]
- Bazin, J.C.; Demonceaux, C.; Vasseur, P.; Kweon, I.S. Motion estimation by decoupling rotation and translation in catadioptric vision. Comput. Vis. Image Underst. 2010, 114, 254–273. [Google Scholar] [CrossRef]
- Keßler, C.; Ascher, C.; Flad, M.; Trommer, G.F. Multi-sensor indoor pedestrian navigation system with vision aiding. Gyroscopy Navig. 2012, 3, 79–90. [Google Scholar] [CrossRef]
- Ruotsalainen, L.; Kuusniemi, H.; Bhuiyan, M.Z.H.; Chen, L.; Chen, R. A two-dimensional pedestrian navigation solution aided with a visual gyroscope and a visual odometer. GPS Solut. 2013, 17, 575–586. [Google Scholar] [CrossRef]
- Camposeco, F.; Pollefeys, M. Using vanishing points to improve visual-inertial odometry. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 5219–5225. [Google Scholar]
- Williams, B.; Hudson, N.; Tweddle, B.; Brockers, R.; Matthies, L. Feature and pose constrained visual Aided Inertial Navigation for computationally constrained aerial vehicles. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 431–438. [Google Scholar]
- Hwangbo, M.; Kanade, T. Visual-inertial UAV attitude estimation using urban scene regularities. In Proceedings of the IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 2451–2458. [Google Scholar]
- Kim, S.B.; Bazin, J.C.; Lee, H.K.; Choi, K.H.; Park, S.Y. Ground vehicle navigation in harsh urban conditions by integrating inertial navigation system, global positioning system, odometer and vision data. IET Radar Sonar Navig. 2011, 5, 814. [Google Scholar] [CrossRef]
- Bazin, J.C.; Demonceaux, C.; Vasseur, P.; Kweon, I. Rotation estimation and vanishing point extraction by omnidirectional vision in urban environment. Int. J. Robot. Res. 2012, 31, 63–81. [Google Scholar] [CrossRef]
- Schwarze, T.; Lauer, M. Robust ground plane tracking in cluttered environments from egocentric stereo vision. In Proceedings of the IEEE International Conference on Robotics and Automation, Seattle, WA, USA, 26–30 May 2015; pp. 2442–2447. [Google Scholar]
- Lee, B.; Zhou, J.; Ye, M.; Guo, Y. A Novel Approach to Camera Calibration Method for Smart Phones. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Prague, Czech Republic, 12–19 July 2016; Volume XLI-B5, pp. 49–54. [Google Scholar]
- Seo, Y.W.; Rajkumar, R.R. Use of a Monocular Camera to Analyze a Ground Vehicle’s Lateral Movements for Reliable Autonomous City Driving. In Proceedings of the IEEE IROS Workshop on Planning, Perception and Navigation for Intelligent Vehicles, Seattle, WA, USA, 26–30 May 2013; pp. 197–203. [Google Scholar]
- Tao, Z.; Bonnifait, P.; Fremont, V.; Ibanez-Guzman, J. Lane marking aided vehicle localization. In Proceedings of the IEEE Conference on Intelligent Transportation Systems, ITSC, The Hague, The Netherlands, 6–9 October 2013; pp. 1509–1515. [Google Scholar]
- Cui, D.; Xue, J.; Zheng, N. Real-Time Global Localization of Robotic Cars in Lane Level via Lane Marking Detection and Shape Registration. IEEE Trans. Intell. Transp. Syst. 2016, 17, 1039–1050. [Google Scholar] [CrossRef]
- Liu, Z.; El-sheimy, N.; Yu, C.; Qin, Y. Vanishing Point/Vehicle Motion Constraints Aided Ground Vehicle Navigation. In Proceedings of the 2017 International Technical Meeting of the Institute of Navigation, Monterey, CA, USA, 30 January–2 February 2017; pp. 292–300. [Google Scholar]
- Chatzi, E.N.; Smyth, A.W. The unscented Kalman filter and particle filter methods for nonlinear structural system identification with non-collocated heterogeneous sensing. Struct. Control Health Monit. 2009, 16, 99–123. [Google Scholar] [CrossRef]
- Wendel, J.; Metzger, J.; Moenikes, R.; Maier, A.; Trommer, G.F. A Performance comparison of tightly coupled GPS/INS navigation systems based on extended and sigma point Kalman filters. Navig. J. Inst. Navig. 2006, 53, 21–31. [Google Scholar] [CrossRef]
- Shin, E.H.; El-Sheimy, N. Unscented Kalman Filter and Attitude Errors of Low-Cost Inertial Navigation Systems. Navigation 2007, 54, 1–9. [Google Scholar] [CrossRef]
- Eftekhar Azam, S.; Chatzi, E.; Papadimitriou, C. A dual Kalman filter approach for state estimation via output-only acceleration measurements. Mech. Syst. Signal Process. 2015, 60, 866–886. [Google Scholar] [CrossRef]
- Azam, S.E.; Chatzi, E.; Papadimitriou, C.; Smyth, A. Experimental validation of the Kalman-type filters for online and real-time state and input estimation. J. Vib. Control 2017, 23, 2494–2519. [Google Scholar] [CrossRef]
- Roumeliotis, S.I.; Burdick, J.W. Stochastic cloning: A generalized framework for processing relative state measurements. In Proceedings of the 2002 IEEE International Conference on Robotics and Automation, Washington, DC, USA, 11–15 May 2002; Volume 2, pp. 1788–1795. [Google Scholar]
- Mourikis, A.I.; Roumeliotis, S.I.; Burdick, J.W. SC-KF Mobile Robot Localization: A Stochastic-Cloning Kalman Filter for Processing Relative-State Measurements. IEEE Trans. Robot. 2007, 23, 717–730. [Google Scholar] [CrossRef]
- Popp, M.; Crocoll, P.; Ruppelt, J.; Trommer, G.F. A Novel Multi Image Based Navigation System to Aid Outdoor—Indoor Transition Flights of Micro Aerial Vehicles. In Proceedings of the 27th International Technical Meeting of the Satellite Division of the Institute of Navigation ION GNSS+, Tampa, FL, USA, 8–12 September 2014; pp. 1595–1608. [Google Scholar]
- Liu, Z.; El-Sheimy, N.; Qin, Y. Low-cost INS/Odometer Integration and Sensor-to-sensor Calibration for Land Vehicle Applications. In Proceedings of the IAG/CPGPS International Conference on GNSS+ (ICG+ 2016), Shanghai, China, 27–30 July 2016. [Google Scholar]
- Hartley, R.; Zisserman, A. Multiple View Geometry in Computer Vision; Cambridge University Press: Cambridge, UK, 2003. [Google Scholar]
- Bar Hillel, A.; Lerner, R.; Levi, D.; Raz, G. Recent progress in road and lane detection: A survey. Mach. Vis. Appl. 2014, 25, 727–745. [Google Scholar] [CrossRef]
- El-Sheimy, N. Inertial Surveying and INS/GPS Integration, Lecture Notes for ENGO 623 Course; Department of Geomatics Engineering, The University of Calgary: Calgary, AB, Canada, 2016. [Google Scholar]
- Qin, Y. Inertial Navigation, 2nd ed.; Press of Science: Beijing, China, 2014. [Google Scholar]
- Liu, Z.; El-Sheimy, N.; Qin, Y.; Yu, C.; Zhang, J. Partial State Feedback Correction for Smoothing Navigational Parameters. In China Satellite Navigation Conference (CSNC) 2016 Proceedings: Volume II, Changsha, China, 18–20 May 2016; Sun, J., Liu, J., Fan, S., Wang, F., Eds.; Springer Singapore: Singapore, 2016; pp. 461–472. [Google Scholar]
- Liu, Z.; Qin, Y.; Li, S.; Cui, X. A new IMU-based method for relative pose determination. In Proceedings of the 22nd Saint Petersburg International Conference on Integrated Navigation Systems, Saint Petersburg, Russia, 25–27 May 2015; pp. 425–428. [Google Scholar]
- Geiger, A.; Lenz, P.; Stiller, C.; Urtasun, R. Vision meets robotics: The KITTI dataset. Int. J. Robot. Res. 2013, 32, 1231–1237. [Google Scholar] [CrossRef]
- OXTS. RT3000 Brochure. Available online: https://www.oxts.com/app/uploads/2017/07/RT3000-brochure-170606.pdf (accessed on 25 December 2017).
- Groves, P.D. Principles of GNSS, Inertial, and Multisensor Integrated Navigation Systems; Artech House: Norwood, MA, USA, 2008. [Google Scholar]
| Sensors | Items | Values |
|---|---|---|
| Gyroscopes (200 Hz) | Bias | 36°/h (1σ) |
| | ARW | |
| Accelerometers (200 Hz) | Bias | 1 mg |
| | VRW | 0.05 m/s/√h |
| Camera (10 Hz) | IOPs | Same as in the experiment |
| | VP accuracy | 2 pixels (1σ) |
| Speed indicator (odometer) | Scale factor error | 0.001 (1σ) |
| | Noise standard deviation | 0.005 m/s (1σ) |
| Relative pose | IMU/vehicle misalignment | x-axis: , z-axis: (1σ) |
| | IMU/vehicle lever arm | 0.1 m in three directions (1σ) |
| | Camera/vehicle boresight error | for pitch (1σ) |
| GNSS (1 Hz, valid in first 80 s) | Position accuracy | 2 m (1σ) |
| | Velocity accuracy | 0.5 m/s (1σ) |
| | Free INS | | | INS/NHC | | | INS/NHC/VP | | |
|---|---|---|---|---|---|---|---|---|---|
| | Mean | Max | RMS | Mean | Max | RMS | Mean | Max | RMS |
| East position (m) | 0.918 | 4.337 | 1.731 | −0.893 | 2.097 | 1.086 | 0.701 | 1.582 | 0.840 |
| North position (m) | −4.337 | 13.23 | 5.670 | −1.327 | 3.313 | 1.616 | −0.839 | 2.139 | 1.026 |
| Height (m) | 0.063 | 0.558 | 0.282 | 0.091 | 0.566 | 0.267 | 0.072 | 0.547 | 0.271 |
| Pitch (°) | 0.096 | 0.216 | 0.108 | 0.012 | 0.103 | 0.038 | 0.009 | 0.103 | 0.038 |
| Roll (°) | −0.032 | 0.114 | 0.046 | −0.024 | 0.099 | 0.045 | −0.020 | 0.095 | 0.043 |
| Heading (°) | −0.241 | 0.409 | 0.254 | −0.248 | 0.423 | 0.262 | −0.189 | 0.423 | 0.206 |
| East velocity (m/s) | 0.089 | 0.221 | 0.124 | −0.044 | 0.132 | 0.069 | −0.031 | 0.122 | 0.059 |
| North velocity (m/s) | 0.315 | 0.824 | 0.372 | −0.080 | 0.171 | 0.091 | −0.055 | 0.132 | 0.072 |
| Up velocity (m/s) | 0.004 | 0.038 | 0.022 | 0.004 | 0.040 | 0.021 | 0.004 | 0.039 | 0.021 |
| | Free INS | | | INS/NHC | | | INS/NHC/VP | | |
|---|---|---|---|---|---|---|---|---|---|
| | Mean | Max | RMS | Mean | Max | RMS | Mean | Max | RMS |
| East position (m) | 2.768 | 7.076 | 3.561 | −1.970 | 4.336 | 2.355 | −1.877 | 4.1224 | 2.241 |
| North position (m) | −4.337 | 13.23 | 5.670 | −1.327 | 3.313 | 1.616 | −0.839 | 2.139 | 1.026 |
| Height (m) | 0.063 | 0.558 | 0.282 | 0.091 | 0.566 | 0.267 | 0.072 | 0.547 | 0.271 |
| Pitch (°) | 0.096 | 0.216 | 0.108 | 0.012 | 0.103 | 0.038 | 0.009 | 0.103 | 0.038 |
| Roll (°) | −0.032 | 0.114 | 0.046 | −0.024 | 0.099 | 0.045 | −0.020 | 0.095 | 0.043 |
| Heading (°) | −0.241 | 0.409 | 0.254 | −0.248 | 0.423 | 0.262 | −0.189 | 0.423 | 0.206 |
| East velocity (m/s) | 0.089 | 0.221 | 0.124 | −0.044 | 0.132 | 0.069 | −0.031 | 0.122 | 0.059 |
| North velocity (m/s) | 0.315 | 0.824 | 0.372 | −0.080 | 0.171 | 0.091 | −0.055 | 0.132 | 0.072 |
| Up velocity (m/s) | 0.004 | 0.038 | 0.022 | 0.004 | 0.040 | 0.021 | 0.004 | 0.039 | 0.021 |
| Trajectory No. | Curve Detected | No VP | | With VP | | VP & AIME | |
|---|---|---|---|---|---|---|---|
| | | Heading | z-Gyro Bias | Heading | z-Gyro Bias | Heading | z-Gyro Bias |
| #1: R = 3000 m | Success | 0.27° | 4.35°/h | 0.25° | 3.98°/h | – | – |
| #2: R = 4584 m | Failed | 0.27° | 4.67°/h | 14.18° | 128.99°/h | 0.34° | 4.20°/h |
| #3: R = 6000 m | Failed | 0.32° | 5.45°/h | 10.87° | 97.92°/h | 0.32° | 4.40°/h |
| #4: R = 8000 m | Failed | 0.28° | 4.72°/h | 8.12° | 73.54°/h | 0.31° | 3.99°/h |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Liu, Z.; El-Sheimy, N.; Yu, C.; Qin, Y. Motion Constraints and Vanishing Point Aided Land Vehicle Navigation. Micromachines 2018, 9, 249. https://doi.org/10.3390/mi9050249