Review of Vision-Based Environmental Perception for Lower-Limb Exoskeleton Robots
Abstract
1. Introduction
2. Visual Sensors and Control System
2.1. Visual Sensors
2.2. Control System
3. Key Technology Analysis
3.1. Related Datasets
3.2. Environment Classification
3.3. Stair and Ramp Detection
3.3.1. Line-Based Extraction Methods
3.3.2. Plane-Based Extraction Methods
3.3.3. Ramp Detection
3.4. Obstacle Detection
3.5. Environment-Oriented Adaptive Gait Planning
4. Prospects
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Qiu, S.; Pei, Z.; Wang, C.; Tang, Z. Systematic Review on Wearable Lower Extremity Robotic Exoskeletons for Assisted Locomotion. J. Bionic Eng. 2022, 20, 436–469. [Google Scholar] [CrossRef]
- Rupal, B.S.; Rafique, S.; Singla, A.; Singla, E.; Isaksson, M.; Virk, G.S. Lower-limb exoskeletons: Research trends and regulatory guidelines in medical and non-medical applications. Int. J. Adv. Robot. Syst. 2017, 14, 6. [Google Scholar] [CrossRef]
- Yang, Z.; Zhang, J.; Gui, L.; Zhang, Y.; Yang, X. Summarize on the Control Method of Exoskeleton Robot. J. Nav. Aviat. Univ. 2009, 24, 520–526. [Google Scholar]
- Kazerooni, H.; Racine, J.-L.; Huang, L.; Steger, R. On the Control of the Berkeley Lower Extremity Exoskeleton (BLEEX). In Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, 18–22 April 2005; pp. 4353–4360. [Google Scholar] [CrossRef]
- Whitney, D.E. Historical Perspective and State of the Art in Robot Force Control. Int. J. Robot. Res. 1987, 6, 3–14. [Google Scholar] [CrossRef]
- Kazerooni, H. Human/Robot Interaction via the Transfer of Power and Information Signals Part I Dynamics and Control Analysis. In Proceedings of the IEEE International Conference on Robotics and Automation, Scottsdale, AZ, USA, 14–19 May 1989; pp. 1632–1640. [Google Scholar] [CrossRef]
- Kazerooni, H. Human/robot interaction via the transfer of power and information signals. II. An experimental analysis. In Proceedings of the IEEE International Conference on Robotics and Automation, Scottsdale, AZ, USA, 14–19 May 1989; pp. 1641–1647. [Google Scholar] [CrossRef]
- Hayashibara, Y.; Tanie, K.; Arai, H. Design of a power assist system with consideration of actuator’s maximum torque. In Proceedings of the IEEE International Workshop on Robot and Human Communication, Tokyo, Japan, 5–7 July 1995; pp. 379–384. [Google Scholar] [CrossRef]
- Shen, C.; Pei, Z.; Chen, W.; Wang, J.; Zhang, J.; Chen, Z. Toward Generalization of sEMG-Based Pattern Recognition: A Novel Feature Extraction for Gesture Recognition. IEEE Trans. Instrum. Meas. 2022, 71, 2501412. [Google Scholar] [CrossRef]
- Shen, C.; Pei, Z.; Chen, W.; Li, Z.; Wang, J.; Zhang, J.; Chen, J. STMI: Stiffness Estimation Method Based on sEMG-Driven Model for Elbow Joint. IEEE Trans. Instrum. Meas. 2023, 72, 2526614. [Google Scholar] [CrossRef]
- Shen, C.; Pei, Z.; Chen, W.; Wang, J.; Wu, X.; Chen, J. Lower Limb Activity Recognition Based on sEMG Using Stacked Weighted Random Forest. IEEE Trans. Neural Syst. Rehabil. Eng. 2024, 32, 166–177. [Google Scholar] [CrossRef] [PubMed]
- Laschowski, B.; McNally, W.; Wong, A.; McPhee, J. Computer Vision and Deep Learning for Environment-Adaptive Control of Robotic Lower-Limb Exoskeletons. In Proceedings of the 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Virtual Event, 1–5 November 2021; pp. 4631–4635. [Google Scholar] [CrossRef]
- Khalili, M.; Ozgoli, S. Environment Recognition for Controlling Lower-Limb Exoskeletons, by Computer Vision and Deep Learning Algorithm. In Proceedings of the 2022 8th International Conference on Control, Instrumentation and Automation (ICCIA), Tehran, Iran, 2–3 March 2022; pp. 1–5. [Google Scholar] [CrossRef]
- Laschowski, B.; McNally, W.; Wong, A.; McPhee, J. Preliminary Design of an Environment Recognition System for Controlling Robotic Lower-Limb Prostheses and Exoskeletons. In Proceedings of the 2019 IEEE 16th International Conference on Rehabilitation Robotics (ICORR), Toronto, ON, Canada, 24–28 June 2019; pp. 868–873. [Google Scholar] [CrossRef]
- Laschowski, B.; McNally, W.; Wong, A.; McPhee, J. Environment Classification for Robotic Leg Prostheses and Exoskeletons Using Deep Convolutional Neural Networks. Front. Neurorobotics 2022, 15, 730965. [Google Scholar] [CrossRef] [PubMed]
- Hirai, K.; Hirose, M.; Haikawa, Y.; Takenaka, T. The development of Honda humanoid robot. In Proceedings of the 1998 IEEE International Conference on Robotics and Automation (Cat. No.98CH36146), Leuven, Belgium, 16–20 May 1998; pp. 1321–1326. [Google Scholar] [CrossRef]
- Huang, R.; Cheng, H.; Chen, Y.; Chen, Q.; Lin, X.; Qiu, J. Optimisation of Reference Gait Trajectory of a Lower Limb Exoskeleton. Int. J. Soc. Robot. 2016, 8, 223–235. [Google Scholar] [CrossRef]
- Strausser, K.A.; Swift, T.A.; Zoss, A.B.; Kazerooni, H.; Bennett, B.C. Mobile Exoskeleton for Spinal Cord Injury: Development and Testing. In Proceedings of the ASME 2011 Dynamic Systems and Control Conference and Bath/ASME Symposium on Fluid Power and Motion Control, Arlington, VA, USA, 31 October–2 November 2011; pp. 419–425. [Google Scholar] [CrossRef]
- Cao, J.; Xie, S.Q.; Das, R.; Zhu, G.L. Control strategies for effective robot assisted gait rehabilitation: The state of art and future prospects. Med. Eng. Phys. 2014, 36, 1555–1566. [Google Scholar] [CrossRef] [PubMed]
- Huo, W.; Mohammed, S.; Moreno, J.C.; Amirat, Y. Lower Limb Wearable Robots for Assistance and Rehabilitation: A State of the Art. IEEE Syst. J. 2016, 10, 1068–1081. [Google Scholar] [CrossRef]
- Huang, R.; Cheng, H.; Guo, H. Hierarchical learning control with physical human-exoskeleton interaction. Inf. Sci. 2018, 432, 584–595. [Google Scholar] [CrossRef]
- Kajita, S.; Kanehiro, F.; Kaneko, K.; Fujiwara, K.; Harada, K.; Yokoi, K.; Hirukawa, H. Biped walking pattern generation by using preview control of zero-moment point. In Proceedings of the 2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422), Taipei, Taiwan, 14–19 September 2003; pp. 1620–1626. [Google Scholar] [CrossRef]
- Vukobratovic, M.; Borovac, B. Zero-Moment Point—Thirty five years of its life. Int. J. Humanoid Robot. 2004, 1, 157–173. [Google Scholar] [CrossRef]
- Ijspeert, A.J. Central pattern generators for locomotion control in animals and robots: A review. Neural Netw. Off. J. Int. Neural Netw. Soc. 2008, 21, 642–653. [Google Scholar] [CrossRef] [PubMed]
- Vicon|Award Winning Motion Capture Systems. Available online: https://www.vicon.com/ (accessed on 3 April 2019).
- Noitom Motion Capture Systems. Available online: https://noitom.com/ (accessed on 1 September 2011).
- HTC Vive. Available online: https://www.vive.com/ (accessed on 2 March 1995).
- Miura, H.; Shimoyama, I. Dynamic Walk of a Biped. Int. J. Robot. Res. 1984, 3, 60–74. [Google Scholar] [CrossRef]
- Liu, C.; Chen, Q.; Wang, G. Adaptive walking control of quadruped robots based on central pattern generator (CPG) and reflex. J. Control Theory Appl. 2013, 11, 386–392. [Google Scholar] [CrossRef]
- Li, H. Design and Motion Optimization of Underwater Bionic Robot Based on CPG. Master’s Dissertation, Yanshan University, Qinhuangdao, China, 2023. [Google Scholar]
- Laschowski, B.; McNally, W.; Wong, A.; McPhee, J. Comparative Analysis of Environment Recognition Systems for Control of Lower-Limb Exoskeletons and Prostheses. In Proceedings of the 2020 8th IEEE RAS/EMBS International Conference for Biomedical Robotics and Biomechatronics (BioRob), New York, NY, USA, 29 November–1 December 2020; pp. 581–586. [Google Scholar] [CrossRef]
- Zeilig, G.; Weingarden, H.; Zwecker, M.; Dudkiewicz, I.; Bloch, A.; Esquenazi, A. Safety and tolerance of the ReWalk™ exoskeleton suit for ambulation by people with complete spinal cord injury: A pilot study. J. Spinal Cord Med. 2012, 35, 96–101. [Google Scholar] [CrossRef] [PubMed]
- Fineberg, D.B.; Asselin, P.; Harel, N.Y.; Agranova-Breyter, I.; Kornfeld, S.D.; Bauman, A.W.; Spungen, M.A. Vertical ground reaction force-based analysis of powered exoskeleton-assisted walking in persons with motor-complete paraplegia. J. Spinal Cord Med. 2013, 36, 313–321. [Google Scholar] [CrossRef] [PubMed]
- Esquenazi, A.; Talaty, M.; Packel, A.; Saulino, M. The ReWalk Powered Exoskeleton to Restore Ambulatory Function to Individuals with Thoracic-Level Motor-Complete Spinal Cord Injury. Am. J. Phys. Med. Rehabil. 2012, 91, 911–921. [Google Scholar] [CrossRef] [PubMed]
- Maeshima, S.; Osawa, A.; Nishio, D.; Hirano, Y.; Takeda, K.; Kigawa, H.; Sankai, Y. Efficacy of a hybrid assistive limb in post-stroke hemiplegic patients: A preliminary report. BMC Neurol. 2011, 11, 116. [Google Scholar] [CrossRef] [PubMed]
- Nilsson, A.; Vreede, K.S.; Häglund, V.; Kawamoto, H.; Sankai, Y.; Borg, J. Gait training early after stroke with a new exoskeleton–The hybrid assistive limb: A study of safety and feasibility. J. Neuroeng. Rehabil. 2014, 11, 92. [Google Scholar] [CrossRef] [PubMed]
- Sczesny-Kaiser, M.; Höffken, O.; Lissek, S.; Lenz, M.; Schlaffke, L.; Nicolas, V.; Meindl, R.; Aach, M.; Sankai, Y.; Schildhauer, T.A.; et al. Neurorehabilitation in Chronic Paraplegic Patients with the HAL® Exoskeleton–Preliminary Electrophysiological and fMRI Data of a Pilot Study. In Biosystems & Biorobotics; Pons, J., Torricelli, D., Pajaro, M., Eds.; Springer: Berlin, Germany, 2013; pp. 611–615. [Google Scholar] [CrossRef]
- Krausz, N.E.; Hargrove, L.J. A Survey of Teleceptive Sensing for Wearable Assistive Robotic Devices. Sensors 2019, 19, 5238. [Google Scholar] [CrossRef] [PubMed]
- Nelson, M.; MacIver, M. Sensory acquisition in active sensing systems. J. Comp. Physiol. A 2006, 192, 573–586. [Google Scholar] [CrossRef] [PubMed]
- Waltham, N. CCD and CMOS sensors. In Observing Photons in Space: A Guide to Experimental Space Astronomy; ISSI Scientific Report Series; Huber, M.C.E., Pauluhn, A., Culhane, J.L., Timothy, J.G., Wilhelm, K., Zehnder, A., Eds.; Springer: New York, NY, USA, 2013; pp. 423–442. [Google Scholar] [CrossRef]
- Zhu, X.; Li, Y.; Lu, H.; Zhang, H. Research on vision-based traversable region recognition for mobile robots. Appl. Res. Comput. 2012, 29, 2009–2013. [Google Scholar]
- Hall, D.S. High Definition Lidar System. U.S. Patent EP2041515A4, 11 November 2009. [Google Scholar]
- Intel RealSense Computer Vision—Depth and Tracking Cameras. Available online: https://www.intelrealsense.com/ (accessed on 18 December 2013).
- Kinect for Windows. Available online: http://www.k4w.cn/ (accessed on 24 May 2013).
- LiDAR Camera—Intel RealSense Depth and Tracking Cameras. Available online: https://www.intelrealsense.com/lidar-camera-l515/ (accessed on 18 December 2013).
- SPL6317/93|Philips. Available online: https://www.philips.com.cn/c-p/SPL6317_93/3000-series-full-hd-webcam (accessed on 3 November 1999).
- ZED Mini Stereo Camera|Stereolabs. Available online: https://store.stereolabs.com/products/zed-mini (accessed on 6 November 2002).
- Unitree 4D LiDAR L1—Believe in Light—Unitree. Available online: https://www.unitree.com/LiDAR/ (accessed on 15 April 1995).
- Depth Camera D435i—Intel RealSense Depth and Tracking Cameras. Available online: https://www.intelrealsense.com/depth-camera-d435i/ (accessed on 18 December 2013).
- Krausz, N.E.; Hargrove, L.J. Recognition of ascending stairs from 2D images for control of powered lower limb prostheses. In Proceedings of the 2015 7th International IEEE/EMBS Conference on Neural Engineering (NER), Montpellier, France, 22–24 April 2015; pp. 615–618. [Google Scholar] [CrossRef]
- Novo-Torres, L.; Ramirez-Paredes, J.-P.; Villarreal, D.J. Obstacle Recognition using Computer Vision and Convolutional Neural Networks for Powered Prosthetic Leg Applications. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 3360–3363. [Google Scholar] [CrossRef]
- Bao, W.; Villarreal, D.; Chiao, J.-C. Vision-Based Autonomous Walking in a Lower-Limb Powered Exoskeleton. In Proceedings of the 2020 IEEE 20th International Conference on Bioinformatics and Bioengineering (BIBE), Cincinnati, OH, USA, 26–28 October 2020; pp. 830–834. [Google Scholar] [CrossRef]
- Laschowski, B.; McNally, W.; Wong, A.; McPhee, J. ExoNet Database: Wearable Camera Images of Human Locomotion Environments. Front. Robot. AI 2020, 7, 562061. [Google Scholar] [CrossRef] [PubMed]
- Krausz, N.E.; Lenzi, T.; Hargrove, L.J. Depth Sensing for Improved Control of Lower Limb Prostheses. IEEE Trans. Biomed. Eng. 2015, 62, 2576–2587. [Google Scholar] [CrossRef]
- Khademi, G.; Simon, D. Convolutional Neural Networks for Environmentally Aware Locomotion Mode Recognition of Lower-Limb Amputees. In Proceedings of the ASME 2019 Dynamic Systems and Control Conference, Park City, UT, USA, 8–11 October 2019. [Google Scholar]
- Krausz, N.E.; Hu, B.H.; Hargrove, L.J. Subject- and Environment-Based Sensor Variability for Wearable Lower-Limb Assistive Devices. Sensors 2019, 19, 4887. [Google Scholar] [CrossRef] [PubMed]
- Zhang, K.; Wang, J.; Fu, C. Directional PointNet: 3D Environmental Classification for Wearable Robotics. arXiv 2019, arXiv:1903.06846. [Google Scholar]
- Ramanathan, M.; Luo, L.; Er, J.K.; Foo, M.J.; Chiam, C.H.; Li, L.; Yau, W.Y.; Ang, W.T. Visual Environment perception for obstacle detection and crossing of lower-limb exoskeletons. In Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 23–27 October 2022; pp. 12267–12274. [Google Scholar] [CrossRef]
- Massalin, Y.; Abdrakhmanova, M.; Varol, H.A. User-Independent Intent Recognition for Lower Limb Prostheses Using Depth Sensing. IEEE Trans. Biomed. Eng. 2018, 65, 1759–1770. [Google Scholar] [CrossRef]
- Zhang, K.; Xiong, C.; Zhang, W.; Liu, H.; Lai, D.; Rong, Y.; Fu, C. Environmental Features Recognition for Lower Limb Prostheses Toward Predictive Walking. IEEE Trans. Neural Syst. Rehabil. Eng. 2019, 27, 465–476. [Google Scholar] [CrossRef] [PubMed]
- Shi, C. Research and Implementation of a Lower-Limb Exoskeleton Robot Up and Down Stairs. Master’s Dissertation, University of Electronic Science and Technology of China, Chengdu, China, 23 May 2019. [Google Scholar]
- Embedded Systems Developer Kits & Modules from NVIDIA Jetson. Available online: https://www.nvidia.com/en-eu/autonomous-machines/embedded-systems/ (accessed on 20 April 1993).
- Raspberry Pi. Available online: https://www.raspberrypi.com/ (accessed on 15 September 2008).
- Atlas 200 DK AI Developer Kit—Huawei Enterprise. Available online: https://e.huawei.com/eu/products/computing/ascend/atlas-200 (accessed on 1 January 2000).
- STMicroelectronics. Available online: https://www.st.com/content/st_com/en.html (accessed on 8 February 1993).
- Arduino—Home. Available online: https://www.arduino.cc/ (accessed on 26 October 2005).
- Kurbis, A.G.; Laschowski, B.; Mihailidis, A. Stair Recognition for Robotic Exoskeleton Control using Computer Vision and Deep Learning. In Proceedings of the 2022 International Conference on Rehabilitation Robotics (ICORR), Rotterdam, The Netherlands, 25–29 July 2022; pp. 1–6. [Google Scholar] [CrossRef]
- Zhu, H. Research on Terrain Recognition of Flexible Exoskeleton Based on Computer Vision. Master’s Dissertation, Wuhan University of Technology, Wuhan, China, December 2020. [Google Scholar] [CrossRef]
- Patil, U.; Gujarathi, A.; Kulkarni, A.; Jain, A.; Malke, L.; Tekade, R.; Paigwar, K.; Chaturvedi, P. Deep Learning Based Stair Detection and Statistical Image Filtering for Autonomous Stair Climbing. In Proceedings of the 2019 Third IEEE International Conference on Robotic Computing (IRC), Naples, Italy, 25–27 February 2019; pp. 159–166. [Google Scholar] [CrossRef]
- Rekhawar, N.; Govindani, Y.; Rao, N. Deep Learning based Detection, Segmentation and Vision based Pose Estimation of Staircase. In Proceedings of the 2022 1st International Conference on the Paradigm Shifts in Communication, Embedded Systems, Machine Learning and Signal Processing (PCEMS), Nagpur, India, 6–7 May 2022; pp. 78–83. [Google Scholar] [CrossRef]
- Habib, A.; Islam, M.M.; Kabir, N.M.; Mredul, B.M.; Hasan, M. Staircase Detection to Guide Visually Impaired People: A Hybrid Approach. Rev. D’Intelligence Artif. 2019, 33, 327–334. [Google Scholar] [CrossRef]
- Wang, C.; Pei, Z.; Qiu, S.; Tang, Z. Deep learning-based ultra-fast stair detection. Sci. Rep. 2022, 12, 16124. [Google Scholar] [CrossRef] [PubMed]
- Wang, C.; Pei, Z.; Qiu, S.; Tang, Z. RGB-D-Based Stair Detection and Estimation Using Deep Learning. Sensors 2023, 23, 2175. [Google Scholar] [CrossRef] [PubMed]
- Wang, C.; Pei, Z.; Qiu, S.; Tang, Z. StairNetV3: Depth-aware stair modeling using deep learning. Vis. Comput. 2024. [Google Scholar] [CrossRef]
- Xue, Z. Research on the Method of Perceiving Traversable Area in Lower Limb Exoskeleton in Daily Life Environment. Master’s Dissertation, University of Electronic Science and Technology of China, Chengdu, China, June 2020. [Google Scholar] [CrossRef]
- Struebig, K.; Ganter, N.; Freiberg, L.; Lueth, T.C. Stair and Ramp Recognition for Powered Lower Limb Exoskeletons. In Proceedings of the 2021 IEEE International Conference on Robotics and Biomimetics (ROBIO), Sanya, China, 27–31 December 2021; pp. 1270–1276. [Google Scholar] [CrossRef]
- Miao, Y.; Wang, S.; Miao, Y.; An, M.; Wang, X. Stereo-based Terrain Parameters Estimation for Lower Limb Exoskeleton. In Proceedings of the 2021 IEEE 16th Conference on Industrial Electronics and Applications (ICIEA), Chengdu, China, 1–4 August 2021; pp. 1655–1660. [Google Scholar] [CrossRef]
- Everingham, M.; Eslami, S.M.A.; Van Gool, L.; Williams, C.K.I.; Winn, J.; Zisserman, A. The PASCAL Visual Object Classes Challenge: A Retrospective. Int. J. Comput. Vis. 2015, 111, 98–136. [Google Scholar] [CrossRef]
- Lin, T.; Maire, M.; Belongie, S.; Hays, J.; Perona, P.; Ramanan, D.; Dollár, P.; Zitnick, C.L. Microsoft COCO: Common Objects in Context. In Lecture Notes in Computer Science; Fleet, D., Pajdla, T., Schiele, B., Tuytelaars, T., Eds.; Springer: Cham, Switzerland, 2014; pp. 740–755. [Google Scholar] [CrossRef]
- Zhou, B.; Zhao, H.; Puig, X.; Fidler, S.; Barriuso, A.; Torralba, A. Scene Parsing through ADE20K Dataset. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 5122–5130. [Google Scholar] [CrossRef]
- Silberman, N.; Hoiem, D.; Kohli, P.; Fergus, R. Indoor Segmentation and Support Inference from RGBD Images. In Lecture Notes in Computer Science; Fitzgibbon, A., Lazebnik, S., Perona, P., Sato, Y., Schmid, C., Eds.; Springer: Berlin, Germany, 2012; pp. 746–760. [Google Scholar] [CrossRef]
- Song, S.; Lichtenberg, S.P.; Xiao, J. SUN RGB-D: A RGB-D scene understanding benchmark suite. In Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; pp. 567–576. [Google Scholar] [CrossRef]
- Ren, J. Research on Vision-Assisted Technology for Exoskeleton Robot. Master’s Dissertation, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China, June 2019. [Google Scholar]
- An, D.; Zhu, A.; Yue, X.; Dang, D.; Zhang, Y. Environmental obstacle detection and localization model for cable-driven exoskeleton. In Proceedings of the 2022 19th International Conference on Ubiquitous Robots (UR), Jeju, Republic of Korea, 4–6 July 2022; pp. 64–69. [Google Scholar] [CrossRef]
- Wang, C.; Pei, Z.; Qiu, S.; Tang, Z. Stair dataset. Mendeley Data 2023, V3. [Google Scholar] [CrossRef]
- Wang, C.; Pei, Z.; Qiu, S.; Tang, Z. Stair dataset with depth maps. Mendeley Data 2023, V2. [Google Scholar] [CrossRef]
- Wang, C.; Pei, Z.; Qiu, S.; Wang, Y.; Tang, Z. RGB-D stair dataset. Mendeley Data 2023, V1. [Google Scholar] [CrossRef]
- Tan, M.; Le, Q.V. EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks. arXiv 2019, arXiv:1905.11946. [Google Scholar]
- Szegedy, C.; Vanhoucke, V.; Ioffe, S.; Shlens, J.; Wojna, Z. Rethinking the Inception Architecture for Computer Vision. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 2818–2826. [Google Scholar] [CrossRef]
- Howard, A.G.; Zhu, M.; Chen, B.; Kalenichenko, D.; Wang, W.; Weyand, T.; Andreetto, M.; Adam, H. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. arXiv 2017, arXiv:1704.04861. [Google Scholar]
- Sandler, M.; Howard, A.; Zhu, M.; Zhmoginov, A.; Chen, L.-C. MobileNetV2: Inverted Residuals and Linear Bottlenecks. In Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 4510–4520. [Google Scholar] [CrossRef]
- Simonyan, K.; Zisserman, A. Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv 2014, arXiv:1409.1556. [Google Scholar]
- Chollet, F. Xception: Deep Learning with Depthwise Separable Convolutions. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 1800–1807. [Google Scholar] [CrossRef]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar] [CrossRef]
- Huang, G.; Liu, Z.; Maaten, L.V.D.; Weinberger, K.Q. Densely Connected Convolutional Networks. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 2261–2269. [Google Scholar] [CrossRef]
- Deng, J.; Dong, W.; Socher, R.; Li, L.-J.; Li, K.; Li, F.-F. ImageNet: A large-scale hierarchical image database. In Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009; pp. 248–255. [Google Scholar] [CrossRef]
- Diamantis, D.E.; Koutsiou, D.C.C.; Iakovidis, D.K. Staircase Detection Using a Lightweight Look-Behind Fully Convolutional Neural Network. In Communications in Computer and Information Science; Macintyre, J., Iliadis, L., Maglogiannis, I., Jayne, C., Eds.; Springer: Cham, Switzerland, 2019; pp. 522–532. [Google Scholar] [CrossRef]
- Charles, R.Q.; Su, H.; Kaichun, M.; Guibas, L.J. PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 77–85. [Google Scholar] [CrossRef]
- Shahrabadi, S.; Rodrigues, J.M.F.; du Buf, J.M.H. Detection of Indoor and Outdoor Stairs. In Lecture Notes in Computer Science; Sanches, J.M., Micó, L., Cardoso, J.S., Eds.; Springer: Berlin, Germany, 2013; pp. 847–854. [Google Scholar] [CrossRef]
- Wang, S.; Pan, H.; Zhang, C.; Tian, Y. RGB-D image-based detection of stairs, pedestrian crosswalks and traffic signs. J. Vis. Commun. Image Represent. 2014, 10, 263–272. [Google Scholar] [CrossRef]
- Huang, X.; Tang, Z. Staircase Detection Algorithm Based on Projection-Histogram. In Proceedings of the 2018 2nd IEEE Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC), Xi’an, China, 25–27 May 2018; pp. 1130–1133. [Google Scholar] [CrossRef]
- Vu, H.; Hoang, V.; Le, T.; Tran, T.; Nguyen, T.T. A projective chirp based stair representation and detection from monocular images and its application for the visually impaired. Pattern Recognit. Lett. 2020, 137, 17–26. [Google Scholar] [CrossRef]
- Hough, P.V.C. Method and Means for Recognizing Complex Patterns. U.S. Patent US3069654, 18 December 1962. [Google Scholar]
- Khaliluzzaman, M.; Deb, K.; Jo, K.-H. Stairways detection and distance estimation approach based on three connected point and triangular similarity. In Proceedings of the 2016 9th International Conference on Human System Interactions (HSI), Portsmouth, UK, 6–8 July 2016; pp. 330–336. [Google Scholar] [CrossRef]
- Khaliluzzaman, M.; Yakub, M.; Chakraborty, N. Comparative Analysis of Stairways Detection Based on RGB and RGB-D Image. In Proceedings of the 2018 International Conference on Innovations in Science, Engineering and Technology (ICISET), Chittagong, Bangladesh, 27–28 October 2018; pp. 519–524. [Google Scholar] [CrossRef]
- Platt, J. Sequential minimal optimization: A fast algorithm for training support vector machines. Adv. Kernel Methods-Support Vector Learn. 1998; MSR-TR-98-14. Available online: https://www.microsoft.com/en-us/research/publication/sequential-minimal-optimization-a-fast-algorithm-for-training-support-vector-machines (accessed on 9 October 2007).
- Redmon, J.; Farhadi, A. YOLOv3: An Incremental Improvement. arXiv 2018, arXiv:1804.02767. [Google Scholar]
- GitHub—Ultralytics/yolov5: YOLOv5. Available online: https://github.com/ultralytics/yolov5 (accessed on 9 October 2007).
- Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional Networks for Biomedical Image Segmentation. In Lecture Notes in Computer Science; Navab, N., Hornegger, J., Wells, W., Frangi, A., Eds.; Springer: Cham, Switzerland, 2015; pp. 234–241. [Google Scholar] [CrossRef]
- Oh, K.W.; Choi, K.S. Supervoxel-based Staircase Detection from Range Data. IEIE Trans. Smart Process. Comput. 2015, 4, 403–406. [Google Scholar] [CrossRef]
- Pérez-Yus, A.; López-Nicolás, G.; Guerrero, J.J. Detection and Modelling of Staircases Using a Wearable Depth Sensor. In Lecture Notes in Computer Science; Agapito, L., Bronstein, M., Rother, C., Eds.; Springer: Cham, Switzerland, 2015; pp. 449–463. [Google Scholar] [CrossRef]
- Ye, Y.; Wang, J. Stair area recognition in complex environment based on point cloud. J. Electron. Meas. Instrum. 2020, 34, 124–133. [Google Scholar] [CrossRef]
- Ciobanu, A.; Morar, A.; Moldoveanu, F.; Petrescu, L.; Ferche, O.; Moldoveanu, A. Real-Time Indoor Staircase Detection on Mobile Devices. In Proceedings of the 2017 21st International Conference on Control Systems and Computer Science (CSCS), Bucharest, Romania, 29–31 May 2017; pp. 287–293. [Google Scholar] [CrossRef]
- Holz, D.; Holzer, S.; Rusu, R.B.; Behnke, S. Real-Time Plane Segmentation Using RGB-D Cameras. In Lecture Notes in Computer Science; Röfer, T., Mayer, N.M., Savage, J., Saranlı, U., Eds.; Springer: Berlin, Germany, 2012; pp. 306–317. [Google Scholar] [CrossRef]
- Fischler, M.A.; Bolles, R.C. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 1981, 24, 381–395. [Google Scholar] [CrossRef]
- Canny, J. A Computational Approach to Edge Detection. IEEE Trans. Pattern Anal. Mach. Intell. 1986, PAMI-8, 679–698. [Google Scholar] [CrossRef]
- Mao, J.; Shi, S.; Wang, X.; Li, H. 3D Object Detection for Autonomous Driving: A Comprehensive Survey. arXiv 2022, arXiv:2206.09474. [Google Scholar] [CrossRef]
- Liu, D. Research on Multimodal Fusion-Based Control Strategy for Lower-Limb Exoskeleton Robot. Ph.D. Thesis, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China, June 2018. [Google Scholar]
- Hua, Y.; Zhang, H.; Li, Y.; Zhao, J.; Zhu, Y. Vision Assisted Control of Lower Extremity Exoskeleton for Obstacle Avoidance With Dynamic Constraint Based Piecewise Nonlinear MPC. IEEE Robot. Autom. Lett. 2022, 7, 12267–12274. [Google Scholar] [CrossRef]
- Castagno, J.; Atkins, E. Polylidar3D-Fast Polygon Extraction from 3D Data. Sensors 2020, 20, 4819. [Google Scholar] [CrossRef] [PubMed]
- Zeng, K.; Yan, Z.; Xu, D.; Peng, A. Online Gait Planning of Visual Lower Exoskeleton Down Stairs. Mach. Des. Manuf. 2022, 10, 46–50+55. [Google Scholar] [CrossRef]
- Gong, Q.; Zhao, J. Research on Gait of Exoskeleton Climbing Stairs Based on Environment Perception and Reconstruction. Control Eng. China 2022, 29, 1497–1504. [Google Scholar] [CrossRef]
- Xiang, S. Research and Implementation of Gait Planning Method for Walking Exoskeleton Ascend and Descend Stairs. Master’s Dissertation, University of Electronic Science and Technology of China, Chengdu, China, May 2020. [Google Scholar]
- Ijspeert, A.J.; Nakanishi, J.; Schaal, S. Trajectory formation for imitation with nonlinear dynamical systems. In Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems. Expanding the Societal Role of Robotics in the Next Millennium (Cat. No.01CH37180), Maui, HI, USA, 29 October–3 November 2001; pp. 752–757. [Google Scholar] [CrossRef]
- Liang, K.; Li, Z.; Chen, D.; Chen, X. Improved Artificial Potential Field for Unknown Narrow Environments. In Proceedings of the 2004 IEEE International Conference on Robotics and Biomimetics, Shenyang, China, 22–26 August 2004; pp. 688–692. [Google Scholar] [CrossRef]
- Zhang, B.; Chen, W.; Fei, M. An Optimized Method for Path Planning Based on Artificial Potential Field. In Proceedings of the Sixth International Conference on Intelligent Systems Design and Applications, Ji’an, China, 16–18 October 2006; pp. 35–39. [Google Scholar] [CrossRef]
- Hoffmann, H.; Pastor, P.; Park, D.-H.; Schaal, S. Biologically-inspired dynamical systems for movement generation: Automatic real-time goal adaptation and obstacle avoidance. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 2587–2592. [Google Scholar] [CrossRef]
- Yu, Z.; Yao, J. Gait Planning of Lower Extremity Exoskeleton Climbing Stair based on Online ZMP Correction. J. Mech. Transm. 2022, 44, 62–67. [Google Scholar] [CrossRef]
- Kooij, H.; Jacobs, R.; Koopman, B.; Helm, F.V.D. An alternative approach to synthesizing bipedal walking. Biol. Cybern. 2003, 88, 46–59. [Google Scholar] [CrossRef] [PubMed]
- Li, Z.; Zhao, K.; Zhang, L.; Wu, X.; Zhang, T.; Li, Q.; Li, X.; Su, C. Human-in-the-Loop Control of a Wearable Lower Limb Exoskeleton for Stable Dynamic Walking. IEEE/ASME Trans. Mechatronics 2021, 26, 2700–2711. [Google Scholar] [CrossRef]
- Lee, J.-T.; Kim, H.-U.; Lee, C.; Kim, C.-S. Semantic Line Detection and Its Applications. In Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; pp. 3249–3257. [Google Scholar] [CrossRef]
- Zhao, K.; Han, Q.; Zhang, C.-B.; Xu, J.; Cheng, M.-M. Deep Hough Transform for Semantic Line Detection. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 44, 4793–4806. [Google Scholar] [CrossRef] [PubMed]
- Zhou, Y.; Qi, H.; Ma, Y. End-to-End Wireframe Parsing. In Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Republic of Korea, 27 October–2 November 2019; pp. 962–971. [Google Scholar] [CrossRef]
- Zhang, H.; Luo, Y.; Qin, F.; He, Y.; Liu, X. ELSD: Efficient Line Segment Detector and Descriptor. In Proceedings of the 2021 IEEE/CVF International Conference on Computer Vision (ICCV), Montreal, QC, Canada, 10–17 October 2021; pp. 2949–2958. [Google Scholar] [CrossRef]
- Xue, N.; Wu, T.; Bai, S.; Wang, F.; Xia, G.-S.; Zhang, L.; Torr, P.H.S. Holistically-Attracted Wireframe Parsing. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 13–19 June 2020; pp. 2785–2794. [Google Scholar] [CrossRef]
- Dai, X.; Gong, H.; Wu, S.; Yuan, X.; Ma, Y. Fully convolutional line parsing. Neurocomputing 2022, 506, 1–11. [Google Scholar] [CrossRef]
- Qin, Z.; Wang, H.; Li, X. Ultra Fast Structure-Aware Deep Lane Detection. In Lecture Notes in Computer Science; Vedaldi, A., Bischof, H., Brox, T., Frahm, J.M., Eds.; Springer: Cham, Switzerland, 2020; pp. 276–291. [Google Scholar] [CrossRef]
| Installation Location | Advantages | Disadvantages | Suitable Devices |
|---|---|---|---|
| Head | View synchronized with the user’s gaze | Added head weight may cause discomfort; shaky images | Blind guidance equipment, upper-limb exoskeletons |
| Chest | Stable images; view synchronized with movement | Camera posture is easily affected by upper-body movements | Upper-limb exoskeletons, lower-limb exoskeletons |
| Waist | Most stable images; view synchronized with movement | Low mounting point limits the visual range | Lower-limb exoskeletons, lower-limb prosthetics |
| Lower limb | High accuracy in detecting specific terrains at close range | Restricts the user’s lower-body clothing; shaky images | Lower-limb prosthetics |
| Feet | High accuracy in detecting specific terrains at close range | Limited field of view; shaky images | Lower-limb prosthetics, smart shoes |
| Source | Sensor | Number | Resolution | Annotation | Classes | Purpose |
|---|---|---|---|---|---|---|
| ExoNet [53] | RGB | 922,790 | 1280 × 720 | Classification | 12 | Environment classification |
| Kurbis, A.G., et al. [67] | RGB | 51,500 | 1280 × 720 | Classification | 4 | Environment classification |
| Khalili, M., et al. [13] | RGB | 30,000 | 1280 × 720 | Classification | 3 | Environment classification |
| Laschowski, B., et al. [14] | RGB | 34,254 | 1280 × 720 | Classification | 3 | Environment classification |
| Zhang, K., et al. [57] | Depth | 4016 | 2048 points | Classification | 3 | Environment classification |
| Zhu, H. [68] | RGB-D | 7000 | 1280 × 720 | Classification | 7 | Environment classification |
| Patil, U., et al. [69] | RGB | 848 | 640 × 320 | 2D box | 1 | Stair detection |
| Rekhawar, N., et al. [70] | RGB | 848 | 640 × 320 | 2D box + stair-line mask | 1 | Stair detection |
| Habib, A., et al. [71] | RGB | 510 | 720 × 960 | 2D box | 2 | Stair detection |
| Wang, C., et al. [85] | RGB | 3094 | 512 × 512 | Stair-line ends | 2 | Stair detection |
| Wang, C., et al. [86] | RGB-D | 2996 | 512 × 512 | Stair-line ends | 2 | Stair detection |
| Wang, C., et al. [87] | RGB-D | 2986 | 512 × 512 | Stair-line ends + stair-step mask | 3 | Stair detection |
| Ren, J. [83] | RGB | 1449 | 640 × 480 | Segmentation mask | 13 | Obstacle detection |
| An, D., et al. [84] | RGB-D | 5000 | 256 × 256 | 2D box | 2 | Obstacle detection |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, C.; Pei, Z.; Fan, Y.; Qiu, S.; Tang, Z. Review of Vision-Based Environmental Perception for Lower-Limb Exoskeleton Robots. Biomimetics 2024, 9, 254. https://doi.org/10.3390/biomimetics9040254