Research on Environment Perception System of Quadruped Robots Based on LiDAR and Vision
Abstract
1. Introduction
1.1. Single Sensor Detection
1.2. Multi-Sensor Fusion
1.3. Deep Learning Method
2. System Overview
2.1. Hardware Architecture
2.2. Software Architecture
- A single sensor (LiDAR or RGB-D camera) is used for localization and mapping;
- A Kalman filter is used to fuse the data from the two sensors for localization and mapping (a minimal fusion sketch follows this list);
- Gesture recognition is achieved with a YOLOv5 network enhanced by adaptively spatial feature fusion (ASFF), enabling the quadruped robot to recognize basic instructions;
- The same multi-sensor fusion method is deployed on both the quadruped robot and a wheeled vehicle to analyze the additional environment-perception problems faced by legged robots.
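The Kalman-filter fusion named above is only summarized in this overview. As a minimal sketch (not the paper's implementation), the snippet below performs one covariance-weighted Kalman update that fuses a laser pose estimate with a visual one over an assumed (x, y, yaw) state; the function name `fuse_poses` and the example covariances are illustrative assumptions.

```python
import numpy as np

def fuse_poses(z_laser, R_laser, z_visual, R_visual):
    """Single Kalman update fusing two independent 2D pose
    estimates (x, y, yaw): the laser pose acts as the prior and
    the visual pose as the measurement. Illustrative sketch only;
    yaw wrap-around is ignored for brevity."""
    z_laser = np.asarray(z_laser, dtype=float)
    z_visual = np.asarray(z_visual, dtype=float)
    # Kalman gain: how far to move the prior toward the measurement.
    K = R_laser @ np.linalg.inv(R_laser + R_visual)
    z_fused = z_laser + K @ (z_visual - z_laser)
    P_fused = (np.eye(3) - K) @ R_laser
    return z_fused, P_fused

# Hypothetical example: laser SLAM is confident in x/y, visual SLAM in yaw.
R_laser = np.diag([0.01, 0.01, 0.10])
R_visual = np.diag([0.05, 0.05, 0.02])
pose, cov = fuse_poses([1.00, 2.00, 0.30], R_laser,
                       [1.04, 1.98, 0.25], R_visual)
print(pose)  # fused (x, y, yaw)
```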
3. Single Sensor SLAM
3.1. LiDAR SLAM
3.2. Visual SLAM
4. Multi-Sensor Fusion SLAM
4.1. Problems with Single Sensor SLAM
4.2. Fusion Method
4.3. Fusion Result
- When both laser and visual information are tracked normally, the fusion algorithm improves the accuracy of mapping.
- When visual tracking fails, the localization from laser SLAM can still provide continuous results.
- The two-dimensional laser compensates for the limited field of view of the depth camera, which enhances navigation safety (a minimal scan-merging sketch follows this list).
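To illustrate the last point, here is a hypothetical sketch of merging a narrow depth-camera-derived scan with a 360° 2D laser scan after both have been resampled onto a common angular grid; `merge_scans` and its per-beam-minimum rule are assumptions for illustration, not code from the paper.

```python
import numpy as np

def merge_scans(laser_ranges, depth_ranges, max_range=8.0):
    """Combine a full 2D laser scan with a narrower scan derived
    from the depth camera, both on the same angular grid. NaN/inf
    marks beams without a return; taking the per-beam minimum keeps
    the closest obstacle seen by either sensor, so the laser covers
    the camera's blind zones and the camera refines shared beams."""
    laser = np.nan_to_num(np.asarray(laser_ranges, dtype=float),
                          nan=max_range, posinf=max_range)
    depth = np.nan_to_num(np.asarray(depth_ranges, dtype=float),
                          nan=max_range, posinf=max_range)
    return np.minimum(laser, depth)
```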
5. Human–Machine Interaction
5.1. Improved YOLOv5 by Adding ASFF
5.2. Training Result
5.3. Test Result
6. Challenge of Environment Perception System for Legged Robots
- Oscillating body.
- Changing attitude (an attitude-compensation sketch follows this list).
- Non-smooth speed.
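As a sketch of why a changing attitude is a problem for a trunk-mounted 2D scanner, the hypothetical helper below re-projects scan endpoints through the roll/pitch reported by an IMU before they are inserted into the map; the function name, mounting height, and rotation order are illustrative assumptions, not details from the paper.

```python
import numpy as np

def compensate_attitude(ranges, angles, roll, pitch, sensor_height=0.4):
    """Re-project 2D LiDAR returns through the trunk's roll/pitch
    (e.g. from an onboard IMU) so points measured while the body is
    tilted land at their true horizontal position. Returns (x, y)
    in a gravity-aligned frame; beams the tilt drives into the
    ground could additionally be rejected using the z coordinate."""
    ranges = np.asarray(ranges, dtype=float)
    angles = np.asarray(angles, dtype=float)
    # Beam endpoints in the tilted scanner frame (scanner plane z = 0).
    pts = np.stack([ranges * np.cos(angles),
                    ranges * np.sin(angles),
                    np.zeros_like(ranges)])
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    world = Ry @ Rx @ pts                                    # level the scan
    world[2] += sensor_height                                # lift to mount height
    return world[0], world[1]
```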
7. Conclusions
- Incorporate an IMU or other positioning sensors into the multi-sensor fusion SLAM.
- Reduce the walking speed of the quadruped robot during map construction and implement intermittent stops to mitigate motion instability.
- Enhance the stability of the robot’s motion by improving gait planning methods and reducing shaking during movement. Additionally, incorporating cushioning materials at the foot end can help minimize ground impact while walking.
- Utilize mechanical anti-shake techniques and specialized sensors, such as gyroscopes and accelerometers, to detect robot movement and compensate for camera motion.
- Introduce filtering algorithms in the mapping algorithm to remove image noise.
- Apply digital video stabilization methods to estimate and smooth camera motion, filter out unwanted motion, and reconstruct a stable video (a minimal stabilization sketch follows this list).
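The final item is the classic estimate-smooth-reconstruct pipeline of digital video stabilization. Below is a minimal OpenCV sketch of that pipeline (corner tracking, rigid inter-frame transform estimation, moving-average smoothing of the camera path, re-warping); it is an illustrative implementation under the assumption of reasonably textured frames, not the method used in the paper.

```python
import cv2
import numpy as np

def stabilize(frames, radius=15):
    """Estimate per-frame rigid motion, low-pass filter the camera
    trajectory with a moving average, and warp each frame by the
    difference between the raw and smoothed paths. Assumes every
    frame has enough trackable corners (no None checks for brevity)."""
    transforms = []  # per-frame (dx, dy, dtheta)
    prev_gray = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    for frame in frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                     qualityLevel=0.01, minDistance=20)
        p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None)
        good = status.ravel() == 1
        m, _ = cv2.estimateAffinePartial2D(p0[good], p1[good])
        transforms.append((m[0, 2], m[1, 2], np.arctan2(m[1, 0], m[0, 0])))
        prev_gray = gray

    # Cumulative camera trajectory, then a moving-average low-pass filter.
    traj = np.cumsum(np.array(transforms), axis=0)
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    smooth = np.column_stack([np.convolve(traj[:, i], kernel, mode='same')
                              for i in range(3)])

    h, w = frames[0].shape[:2]
    out = [frames[0]]
    for (dx, dy, dth), frame in zip(np.array(transforms) + smooth - traj,
                                    frames[1:]):
        M = np.array([[np.cos(dth), -np.sin(dth), dx],
                      [np.sin(dth),  np.cos(dth), dy]])
        out.append(cv2.warpAffine(frame, M, (w, h)))
    return out
```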
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Platform | Quadruped Robot (Unitree Go1) | Vehicle (Nano Pro) |
| --- | --- | --- |
| LiDAR | 3i LiDAR Delta2A | 3i LiDAR Delta2A |
| Vision | Kinect V2 | Astra Pro |
| Controller | Jetson Nano | Jetson Nano |
| Algorithm | Same fusion algorithm on both platforms | Same fusion algorithm on both platforms |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Chen, G.; Hong, L. Research on Environment Perception System of Quadruped Robots Based on LiDAR and Vision. Drones 2023, 7, 329. https://doi.org/10.3390/drones7050329