REVIO: Range- and Event-Based Visual-Inertial Odometry for Bio-Inspired Sensors
Abstract
1. Introduction
- An event-based visual-inertial odometry (EVIO) algorithm is proposed to achieve localization under high-speed motion. It is evaluated on the publicly available Event Camera Dataset.
- A new visual-inertial odometry algorithm, REVIO, simultaneously fuses range and event measurements. It improves the accuracy and robustness of position estimation in typical high-dynamic scenes such as weak texture, fast motion, or drastic illumination changes, and is validated in handheld experiments.
- The REVIO algorithm is tested in a real-world environment and applied to the in-flight localization of a UAV.
2. Preliminaries
3. Range and Event-Based Visual-Inertial Odometry (REVIO)
3.1. Framework
3.2. EVIO Using Sliding Window Nonlinear Optimization
3.2.1. Front-End of Motion-Compensated Event Frames
- (1) Motion compensation (a minimal code sketch follows this list).
- (2) Feature extraction, prediction, and tracking.
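This outline omits the paper's front-end details, so below is a minimal sketch of the standard rotational motion-compensation step used in event-based front-ends: each event is warped to a common reference time using the IMU-predicted rotation, then accumulated into an event frame whose sharpened edges are suitable for feature extraction and tracking. The function names and the constant-angular-velocity assumption are ours, not the paper's.

```python
import numpy as np

def so3_exp(w):
    """Rodrigues' formula: rotation vector (3,) -> rotation matrix (3, 3)."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def motion_compensated_frame(events, t_ref, omega, K, shape):
    """Accumulate events into a frame at reference time t_ref, undoing the
    camera rotation between each event's timestamp and t_ref.

    events: iterable of (t, x, y, polarity); omega: mean angular velocity
    (rad/s, camera frame) over the window, assumed constant; K: 3x3 camera
    intrinsics; shape: (height, width) of the output frame.
    """
    K_inv = np.linalg.inv(K)
    frame = np.zeros(shape, dtype=np.float32)
    for t, x, y, _ in events:
        # Rotation from the event's timestamp to the reference time.
        R = so3_exp(omega * (t_ref - t))
        # Back-project the pixel, rotate, and re-project.
        p = K @ (R @ (K_inv @ np.array([x, y, 1.0])))
        u, v = int(round(p[0] / p[2])), int(round(p[1] / p[2]))
        if 0 <= v < shape[0] and 0 <= u < shape[1]:
            frame[v, u] += 1.0  # count compensated events per pixel
    return frame
```

Feature extraction and tracking can then operate on the compensated frame much as they would on a standard image.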
3.2.2. Back-End with Sliding Window Non-Linear Optimization
- (1) IMU pre-integration constraints.
- (2) Reprojection constraints.
- (3) Marginalization prior constraints (a generic form of the combined objective is sketched after this list).
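The three constraint types above are the standard ingredients of a VINS-Mono-style sliding-window solver. In our own notation (not necessarily the paper's exact formulation), the back-end minimizes

$$
\min_{\mathcal{X}}
\left\{
\left\| \mathbf{r}_p - \mathbf{H}_p \mathcal{X} \right\|^2
+ \sum_{k \in \mathcal{B}} \left\| \mathbf{r}_{\mathcal{B}}\!\left(\hat{\mathbf{z}}_{b_k b_{k+1}}, \mathcal{X}\right) \right\|_{\mathbf{P}_{b_k b_{k+1}}}^{2}
+ \sum_{(l,j) \in \mathcal{C}} \rho\!\left( \left\| \mathbf{r}_{\mathcal{C}}\!\left(\hat{\mathbf{z}}_{l}^{c_j}, \mathcal{X}\right) \right\|_{\mathbf{P}_{l}^{c_j}}^{2} \right)
\right\}
$$

over the window states $\mathcal{X}$ (keyframe poses, velocities, IMU biases, and feature inverse depths). The first term is the marginalization prior left by states removed from the window, the second sums IMU pre-integration residuals between consecutive keyframes in $\mathcal{B}$, and the third sums reprojection residuals over the feature observations $\mathcal{C}$ under a robust kernel $\rho$ (e.g., Huber), each weighted by its covariance $\mathbf{P}$.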
3.3. Fusing Range and Event for VIO
3.3.1. Front-End Correction with Range Sensors
3.3.2. Back-End of Adding Range Constraints
- (1) Ground constraints (an illustrative form is sketched after this list).
- (2) Generalized scenario constraints.
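This outline does not reproduce the paper's residuals, so the following is only an illustrative form of a ground constraint under a flat-ground assumption, in our own notation. If the ground is the plane $\mathbf{n}^{\top}\mathbf{x} = 0$ with normal $\mathbf{n}$, the body at position $\mathbf{p}_{b_k}$ with attitude $\mathbf{R}_{b_k}$ carries a rangefinder along body axis $\mathbf{e}_r$, and $\hat{d}_k$ is the measured range, then the predicted range along the sensor ray is $-\mathbf{n}^{\top}\mathbf{p}_{b_k} / (\mathbf{n}^{\top}\mathbf{R}_{b_k}\mathbf{e}_r)$, and the residual

$$
\mathbf{r}_{\mathcal{R}}\!\left(\hat{d}_k, \mathcal{X}\right)
= \hat{d}_k + \frac{\mathbf{n}^{\top}\mathbf{p}_{b_k}}{\mathbf{n}^{\top}\mathbf{R}_{b_k}\mathbf{e}_r}
$$

can be added to the sliding-window cost with the rangefinder's measurement covariance. Such a term anchors metric scale, which is the motivation for fusing range in the first place (cf. Delaune et al. on scale observability).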
4. Experiments
4.1. Dataset Experiments: Our EVIO versus Other Algorithms
4.2. Handheld Experiments: REVIO versus EVIO
4.3. Flight Experiments: REVIO versus VINS-Mono
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Bharatharaj, J.; Huang, L.; Al-Jumaily, A.M. Terrain perception using wearable parrot-inspired companion robot, KiliRo. Biomimetics 2022, 7, 81.
- Badue, C.; Guidolini, R.; Carneiro, R.V. Self-driving cars: A survey. Expert Syst. Appl. 2021, 165, 113836.
- Zhao, J.; Ji, S.; Cai, Z. Moving object detection and tracking by event frame from neuromorphic vision sensors. Biomimetics 2022, 7, 31.
- Scaramuzza, D.; Zhang, Z. Visual-inertial odometry of aerial robots. arXiv 2019, arXiv:1906.03289.
- Urzua, S.; Munguía, R.; Grau, A. Vision-based SLAM system for MAVs in GPS-denied environments. Int. J. Micro Air Veh. 2017, 9, 283–296.
- Leutenegger, S.; Lynen, S.; Bosse, M. Keyframe-based visual–inertial odometry using nonlinear optimization. Int. J. Robot. Res. 2015, 34, 314–334.
- Qin, T.; Li, P.; Shen, S. VINS-Mono: A robust and versatile monocular visual-inertial state estimator. IEEE Trans. Robot. 2018, 34, 1004–1020.
- Mur-Artal, R.; Tardós, J.D. ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras. IEEE Trans. Robot. 2017, 33, 1255–1262.
- Campos, C.; Elvira, R.; Rodríguez, J.J.G. ORB-SLAM3: An accurate open-source library for visual, visual–inertial, and multimap SLAM. IEEE Trans. Robot. 2021, 37, 1874–1890.
- Mourikis, A.I.; Roumeliotis, S.I. A multi-state constraint Kalman filter for vision-aided inertial navigation. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Rome, Italy, 10–14 April 2007; pp. 3565–3572.
- Bloesch, M.; Omari, S.; Hutter, M.; Siegwart, R. Robust visual inertial odometry using a direct EKF-based approach. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 298–304.
- Geneva, P.; Eckenhoff, K.; Lee, W. OpenVINS: A research platform for visual-inertial estimation. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 4666–4672.
- Wu, K.J.; Roumeliotis, S.I. Unobservable Directions of VINS under Special Motions; Department of Computer Science & Engineering, University of Minnesota: Minneapolis, MN, USA, 2016.
- Gallego, G.; Delbrück, T.; Orchard, G. Event-based vision: A survey. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 44, 154–180.
- Kim, H.; Handa, A.; Benosman, R. Simultaneous mosaicing and tracking with an event camera. In Proceedings of the British Machine Vision Conference (BMVC), Nottingham, UK, 1–5 September 2014.
- Weikersdorfer, D.; Hoffmann, R.; Conradt, J. Simultaneous localization and mapping for event-based vision systems. In International Conference on Computer Vision Systems; Springer: Berlin/Heidelberg, Germany, 2013; pp. 133–142.
- Censi, A.; Scaramuzza, D. Low-latency event-based visual odometry. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 703–710.
- Rebecq, H.; Horstschäfer, T.; Gallego, G. EVO: A geometric approach to event-based 6-DOF parallel tracking and mapping in real time. IEEE Robot. Autom. Lett. 2016, 2, 593–600.
- Rebecq, H.; Horstschaefer, T.; Scaramuzza, D. Real-time visual-inertial odometry for event cameras using keyframe-based nonlinear optimization. In Proceedings of the British Machine Vision Conference (BMVC), London, UK, 4–7 September 2017.
- Vidal, A.R.; Rebecq, H.; Horstschaefer, T. Ultimate SLAM? Combining events, images, and IMU for robust visual SLAM in HDR and high-speed scenarios. IEEE Robot. Autom. Lett. 2018, 3, 994–1001.
- Bayard, D.S.; Conway, D.T.; Brockers, R. Vision-based navigation for the NASA Mars helicopter. In Proceedings of the AIAA Scitech 2019 Forum, San Diego, CA, USA, 7–11 January 2019.
- Delaune, J.; Bayard, D.S.; Brockers, R. Range-visual-inertial odometry: Scale observability without excitation. IEEE Robot. Autom. Lett. 2021, 6, 2421–2428.
- Shen, S.; Michael, N.; Kumar, V. Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015.
- Mueggler, E.; Rebecq, H.; Gallego, G. The event-camera dataset and simulator: Event-based data for pose estimation, visual odometry, and SLAM. Int. J. Robot. Res. 2017, 36, 142–149.
- Geiger, A.; Lenz, P.; Urtasun, R. Are we ready for autonomous driving? The KITTI vision benchmark suite. In Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA, 16–21 June 2012.
Dataset experiments (Section 4.1): RMSE in meters of each method on Event Camera Dataset sequences (E: events; Fr: standard frames; I: IMU).

| Sequence | Max Speed (m/s) | Length (m) | Our EVIO (E + I) * | EVIO-KF [19] (E + I) | Ultimate_SLAM [20] (Fr + E + I) | VINS_Mono [10] (Fr + I) |
|---|---|---|---|---|---|---|
| poster_6dof | 3.370 | 61.143 | 0.147 | 1.036 | 0.161 | 0.290 |
| poster_translation | 3.207 | 49.265 | 0.074 | 0.231 | 0.055 | 0.133 |
| boxes_6dof | 4.014 | 69.852 | 0.143 | 0.910 | 0.230 | 0.163 |
| boxes_translation | 3.853 | 65.236 | 0.158 | 0.686 | 0.187 | 0.162 |
| hdr_poster | 2.774 | 55.437 | 0.322 | 0.322 | 0.373 | 0.342 |
| hdr_boxes | 3.136 | 55.088 | 0.212 | 0.597 | 0.234 | 0.249 |
Handheld experiments (Section 4.2): RMSE in meters of REVIO versus EVIO (R: range; E: events; I: IMU).

| Sequence | Max Speed (m/s) | Max Mean Optical Flow (pixel/s) | Length (m) | REVIO (R + E + I) * | EVIO (E + I) |
|---|---|---|---|---|---|
| 1 | 2.089 | 2210 | 31.52 | 0.111 | 0.105 |
| 2 | 2.349 | 1280 | 64.89 | 0.086 | 0.088 |
| 3 | 2.422 | 1740 | 55.43 | 0.094 | 0.109 |
| 4 | 3.489 | 1557 | 76.44 | 0.091 | 0.128 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).