A Novel Fault-Tolerant Navigation and Positioning Method with Stereo-Camera/Micro Electro Mechanical Systems Inertial Measurement Unit (MEMS-IMU) in Hostile Environment
Abstract
1. Introduction
- A MEMS-IMU-aided stereo VIO method is proposed within the framework. The MEMS-IMU pre-integration constraint from the prediction model is used to narrow the search range for candidate feature-point matching. The constraint is also used to initialize the pose for iterative optimization, helping avoid local optima, instead of adding MEMS-IMU measurement errors to a joint optimization.
- An adaptive method is introduced to adjust the measurement covariance according to the motion characteristics. In addition, a novel fault-tolerant mechanism decides whether the stereo VIO pose estimate is reliable by comparing it with the MEMS-IMU measurements.
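As an illustration of the first idea, the sketch below shows one common way an IMU-predicted pose can restrict stereo feature matching to a small search window around each landmark's predicted reprojection. This is a hedged, minimal example with hypothetical names (`project`, `match_with_imu_prior`, the `radius` gate), not the authors' implementation.

```python
import numpy as np

def project(K, T_cw, p_w):
    """Project world point p_w into a camera with intrinsics K and
    pose T_cw (4x4 homogeneous transform, world -> camera)."""
    p_c = T_cw[:3, :3] @ p_w + T_cw[:3, 3]
    uv = K @ (p_c / p_c[2])
    return uv[:2]

def match_with_imu_prior(K, T_pred, landmarks, keypoints, radius=15.0):
    """For each landmark, accept only the nearest candidate keypoint that
    lies inside a window around its reprojection under the IMU-predicted
    pose T_pred. Returns (landmark_index, keypoint_index) pairs."""
    matches = []
    for i, p_w in enumerate(landmarks):
        uv_hat = project(K, T_pred, p_w)
        d = np.linalg.norm(keypoints - uv_hat, axis=1)
        j = int(np.argmin(d))
        if d[j] < radius:  # gate candidates by predicted reprojection
            matches.append((i, j))
    return matches
```

Restricting the search region in this way both speeds up matching and rejects gross outliers before any optimization is run.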
2. Materials and Methods
2.1. Coordinates and Notations
2.2. Framework of the Fault-Tolerant Method with Stereo-Camera and MEMS-IMU
2.2.1. State Prediction with MEMS-IMU Measurements
2.2.2. An Improved Stereo VIO Method Aided by MEMS-IMU
2.2.3. Fault-Tolerant Adaptive Extended Kalman Filtering
1. Fault tolerance with dramatic-change detection
2. Covariance-adaptive filtering
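The two mechanisms named above can be sketched in a single EKF measurement update: a chi-square test on the innovation rejects VIO measurements that change dramatically relative to the IMU prediction, and otherwise the measurement covariance is inflated with the innovation magnitude. This is an illustrative sketch only; the gate threshold and inflation rule are assumptions, not the authors' parameters.

```python
import numpy as np

def fault_tolerant_update(x, P, z, H, R, gate=7.81, infl=1.0):
    """One adaptive EKF measurement update.
    - Reject the VIO measurement z if the squared Mahalanobis distance of
      the innovation exceeds `gate` (dramatic-change detection).
    - Otherwise scale R with the innovation magnitude (covariance adaptation).
    Returns (x, P, accepted)."""
    y = z - H @ x                            # innovation vs. predicted state
    S = H @ P @ H.T + R
    m2 = float(y.T @ np.linalg.solve(S, y))  # squared Mahalanobis distance
    if m2 > gate:                            # fault detected: skip the update
        return x, P, False
    R_adapt = R * (1.0 + infl * m2)          # inflate R for noisy measurements
    S = H @ P @ H.T + R_adapt
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new, True
```

When the VIO estimate is rejected, the filter simply propagates the IMU-predicted state, which matches the fault-tolerant behavior described in the abstract bullets.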
3. Results
3.1. Experiment Setup
3.1.1. Equipment
3.1.2. Experiment Environment Description
3.2. Experiment Results
3.2.1. Experiment I: In a Corridor
3.2.2. Experiment II: In a Tennis Court
3.3. Experimental Analysis
3.3.1. Accuracy Analysis
3.3.2. Inertial Aided Matching and Fault Tolerance Analysis
4. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
Length (m) | Proposed Error (m) | ORB-SLAM2 Error (m) | MSF-EKF Error (m) | VINS-Mono Error (m) |
---|---|---|---|---|
Experiment I: 108.8 | 0.43 (0.58 *) | 0.94 * | 16.57 (0.90 *) | 1.80 * |
Experiment II: 38 | 0.60 (0.53 *) | 0.75 * | 3.94 (0.60 *) | 0.88 (0.08 *) |
Yaw Angle Change (°) | Proposed Error (°) | ORB-SLAM2 Error (°) | MSF-EKF Error (°) | VINS-Mono Error (°) |
---|---|---|---|---|
Experiment I: 180 | 4.52 (2.90 *) | 3.13 * | 21.84 (3.10 *) | 3.00 * |
Experiment II: 90 | 0.19 (0.38 *) | 0.55 * | 1.21 (0.44 *) | 1.72 (0.56 *) |
Length (m) | Proposed Error (m, %) | ORB-SLAM2 Error (m, %) | MSF-EKF Error (m, %) | VINS-Mono Error (m, %) |
---|---|---|---|---|
Experiment I: 108.8 | 0.92, 0.8% | 194.3, 179.9% | 55.88, 51.4% | 67.36, 67.4% |
Experiment II: 38 | 1.89, 4.98% | 4.22, 11.1% | 13.3, 35.0% | 32.0, 84.2% |
Yaw Angle Change (°) | Proposed Error (°, %) | ORB-SLAM2 Error (°, %) | MSF-EKF Error (°, %) | VINS-Mono Error (°, %) |
---|---|---|---|---|
Experiment I: 180 | 1.8, 1% | 176.3, 97.9% | 68.5, 38.1% | 62.3, 34.6% |
Experiment II: 90 | 0.37, 0.4% | 22.17, 25% | 3.98, 4.04% | 5.95, 6.61% |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Yuan, C.; Lai, J.; Lyu, P.; Shi, P.; Zhao, W.; Huang, K. A Novel Fault-Tolerant Navigation and Positioning Method with Stereo-Camera/Micro Electro Mechanical Systems Inertial Measurement Unit (MEMS-IMU) in Hostile Environment. Micromachines 2018, 9, 626. https://doi.org/10.3390/mi9120626