State-of-the-Art Review on Wearable Obstacle Detection Systems Developed for Assistive Technologies and Footwear
Abstract
1. Introduction
Obstacle Detection Smart Shoe System
2. Sensor Technology in Obstacle Detection Systems
Sensor Fusion
- Information obtained through fusion has richer semantics and higher resolution than a single sensor measurement.
- Joint information from multiple sources reduces the ambiguity and uncertainty associated with the measured values.
- Multiple sensors together provide extended spatial and temporal coverage.
- Redundant information from several sensors scanning the same environment increases confidence in detections, and reliability improves because sufficient information remains available even after a partial sensor failure.
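The uncertainty-reduction point above can be illustrated with a minimal fusion sketch: two noisy range readings (say, ultrasonic and infrared) combined by inverse-variance weighting, where the fused variance is always below that of either sensor alone. The sensor values and variances are illustrative assumptions, not figures from the reviewed systems.

```python
# Minimal sketch of sensor fusion by inverse-variance weighting.
# Readings and variances below are illustrative assumptions.

def fuse_ranges(r1_m, var1, r2_m, var2):
    """Fuse two range estimates (metres) with known noise variances."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * r1_m + w2 * r2_m) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)  # always <= min(var1, var2): reduced uncertainty
    return fused, fused_var

ultrasonic = (1.02, 0.004)   # reading (m), variance (m^2)
infrared   = (0.98, 0.001)
d, v = fuse_ranges(*ultrasonic, *infrared)
print(f"fused distance: {d:.3f} m, variance: {v:.4f}")  # 0.988 m, 0.0008
```

The fused estimate leans toward the lower-variance sensor, which is the formal version of the "increased confidence through redundancy" argument above.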
3. Walking with Obstacle Detection
4. Obstacle Detection in Shoe-Based Assistive Devices
4.1. Sensors in Obstacle Detection Shoes
4.2. Additional Sensing Deployed
- Heat Sensing: An LM35 temperature sensor detects fire or hot objects near the user, and the information can be relayed using voice commands [33].
- Insect Detection: The movement of insects or reptiles inside a shoe triggers the shoe's infrared sensors, notifying the user via the Blynk app on a smartphone or through email [32]. This insect detection is active only when the shoes are not being worn.
- Location Tracking: Whether the location is tracked for safety or to find a missing person, the use of smartphones ensures reliability and portability. Google's Geolocation API uses Wi-Fi to determine the coordinates of the device, and with a request to Google Maps, the user's location can be mapped [32]. Bhongade, Girhay [33] reported a system that tracks location by sending an SMS code to receive the coordinates and then navigating to the location using Google Maps.
- Height Detection: The height of a distant object in the frontal plane can be detected with a sensor-integrated knee band [101].
- Text Detection: A Raspberry Pi with an 8 MP Sony IMX219 image sensor captures images of classroom tags and converts them to the desired form using OpenCV and Tesseract, allowing verification of the classroom number [105].
- Health Tracking: Daily activity such as the number of footsteps, distance travelled, and calories burned can be recorded for one week and accessed by the user [33].
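The health-tracking feature in the last point reduces to simple arithmetic on a step count. A hedged sketch follows; the stride length and per-step calorie factor are illustrative assumptions, since the cited system [33] does not publish its exact formulas.

```python
# Sketch of a daily-activity summary of the kind described above.
# stride_m and kcal_per_step are illustrative assumptions.

def daily_summary(steps, stride_m=0.7, kcal_per_step=0.04):
    """Estimate distance and calories from a raw step count."""
    distance_km = steps * stride_m / 1000.0
    calories = steps * kcal_per_step
    return {"steps": steps,
            "distance_km": round(distance_km, 2),
            "calories_kcal": round(calories, 1)}

print(daily_summary(8000))  # {'steps': 8000, 'distance_km': 5.6, 'calories_kcal': 320.0}
```

In a real shoe the step count itself would come from an inertial or pressure sensor; only the derived metrics are shown here.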
4.3. Microcontroller Unit
4.4. Feedback/Alerting Technique
4.5. Analysis of Obstacle Detection Techniques
4.6. Communication
4.7. Power Supply
4.8. Experimental Evaluation with Human Participants
5. Conclusions
- Design sensor systems that use multiple data sources to reliably detect obstacles, and only those obstacles that pose a hazard to the user, i.e., with few false positives.
- Implement advanced wearable sensors and fast processing boards on the shoe without compromising user comfort and ease of use.
- For prototype development, microcontrollers such as Arduino may be suitable, but real-world applications require smaller processors/boards with equivalent or more advanced processing capabilities.
- Ensure that the additional hardware and weight do not interfere with the gait and normal locomotion of the user.
- Examine temperature ranges for sensors (see Table 3) and other hardware components to determine the performance in extreme weather conditions.
- Evaluate prototypes in the real world to ensure comfort and acceptance for everyday use.
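One simple way to pursue the low-false-positive goal in the first recommendation is temporal voting: alert only when at least k of the last n range readings agree that an obstacle lies within the hazard distance. The sketch below is a generic illustration, not a method from any reviewed system; the thresholds are assumptions.

```python
# Sketch of a k-of-n voting filter to suppress spurious detections.
# hazard_m, n, and k are illustrative assumptions.

from collections import deque

class HazardFilter:
    def __init__(self, hazard_m=1.0, n=5, k=3):
        self.hazard_m = hazard_m
        self.window = deque(maxlen=n)  # rolling record of recent hits
        self.k = k

    def update(self, distance_m):
        """Feed one range reading; return True only for stable detections."""
        self.window.append(distance_m < self.hazard_m)
        return sum(self.window) >= self.k

f = HazardFilter()
readings = [0.8, 2.5, 0.9, 0.7, 0.6]   # one spurious long reading
print([f.update(r) for r in readings])  # [False, False, False, True, True]
```

A single outlier reading (2.5 m) no longer toggles the alert, at the cost of a short detection latency, which is the usual trade-off for such filters.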
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- WHO. Falls; WHO: Geneva, Switzerland, 16 January 2018. Available online: https://www.who.int/news-room/fact-sheets/detail/falls#:~:text=Falls%20are%20the%20second%20leading,greatest%20number%20of%20fatal%20falls (accessed on 10 October 2020).
- AIHW. Injury Expenditure in Australia 2015–16; AIHW: Darlinghurst, NSW, Australia, 2019. Available online: https://www.aihw.gov.au/reports/health-welfare-expenditure/injury-expenditure-in-australia-2015-16/contents/summary (accessed on 10 October 2020).
- Alexander, B.H.; Rivara, F.P.; Wolf, M.E. The cost and frequency of hospitalization for fall-related injuries in older adults. Am. J. Public Health 1992, 82, 1020–1023. [Google Scholar] [CrossRef] [Green Version]
- CDC. Keep on Your Feet—Preventing Older Adult Falls; CDC: Atlanta, GA, USA, 2020. Available online: https://www.cdc.gov/injury/features/older-adult-falls/index.html#:~:text=Every%20second%20of%20every%20day,particularly%20among%20the%20aging%20population (accessed on 10 October 2020).
- WHO. Blindness and Vision Impairment: Fact Sheet; WHO: Geneva, Switzerland, 14 October 2021. Available online: https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment (accessed on 3 October 2022).
- Schoene, D.; Heller, C.; Aung, Y.N.; Sieber, C.C.; Kemmler, W.; Freiberger, E. A systematic review on the influence of fear of falling on quality of life in older people: Is there a role for falls? Clin. Interv. Aging 2019, 14, 701. [Google Scholar] [CrossRef] [Green Version]
- Nagano, H.; Begg, R.K. Shoe-Insole Technology for Injury Prevention in Walking. Sensors 2018, 18, 1468. [Google Scholar] [CrossRef] [Green Version]
- Blake, A.J.; Morgan, K.; Bendall, M.J.; Dallosso, H.; Ebrahim, S.B.J.; Arie, T.H.D.; Fentem, P.H.; Bassey, E.J. Falls by elderly people at home: Prevalence and associated factors. Age Ageing 1988, 17, 365–372. [Google Scholar] [CrossRef]
- Nagano, H.; Begg, R.K.; Sparrow, W.A.; Taylor, S. Ageing and limb dominance effects on foot-ground clearance during treadmill and overground walking. Clin. Biomech. 2011, 26, 962–968. [Google Scholar] [CrossRef]
- Kocić, J.; Jovičić, N.; Drndarević, V. Sensors and Sensor Fusion in Autonomous Vehicles. In Proceedings of the 2018 26th Telecommunications Forum (TELFOR), Belgrade, Serbia, 20–21 November 2018. [Google Scholar]
- Niculescu, V.; Muller, H.; Ostovar, I.; Polonelli, T.; Magno, M.; Benini, L. Towards a Multi-Pixel Time-of-Flight Indoor Navigation System for Nano-Drone Applications. In Proceedings of the 2022 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Ottawa, ON, Canada, 16–19 May 2022. [Google Scholar]
- Marti, E.D.; de Miguel, M.A.; Garcia, F.; Perez, J. A Review of Sensor Technologies for Perception in Automated Driving. IEEE Intell. Transp. Syst. Mag. 2019, 11, 94–108. [Google Scholar] [CrossRef] [Green Version]
- Elmannai, W.; Elleithy, K. Sensor-Based Assistive Devices for Visually-Impaired People: Current Status, Challenges, and Future Directions. Sensors 2017, 17, 565. [Google Scholar] [CrossRef] [Green Version]
- Martinez, M.; Roitberg, A.; Koester, D.; Stiefelhagen, R.; Schauerte, B. Using Technology Developed for Autonomous Cars to Help Navigate Blind People. In Proceedings of the 2017 IEEE International Conference on Computer Vision Workshops (ICCVW), Venice, Italy, 22–29 October 2017. [Google Scholar]
- Digibarn. DigiBarn Weird Stuff: Puma RS Computer Tennis Shoes (Pedometer, 1980s); Digibarn: Boulder Creek, CA, USA, 2022; Available online: https://www.digibarn.com/collections/weirdstuff/computer-tennis-shoes/ (accessed on 15 December 2022).
- Wu, C.-J.; Tsun, F.-S.; Hsiang, S.-H.; Hsien, Y.-L. Electronic Pace and Distance Counting Shoe. U.S. Patent 4466204, 21 August 1984. [Google Scholar]
- Yang, S.; Li, Q. Inertial sensor-based methods in walking speed estimation: A systematic review. Sensors 2012, 12, 6102–6116. [Google Scholar] [CrossRef] [Green Version]
- Zhang, W.; Tomizuka, M.; Byl, N. A wireless human motion monitoring system for smart rehabilitation. J. Dyn. Syst. Meas. Control. 2016, 138, 111004. [Google Scholar] [CrossRef] [Green Version]
- Gokalgandhi, D.; Kamdar, L.; Shah, N.; Mehendale, N. A review of smart technologies embedded in shoes. J. Med. Syst. 2020, 44, 150. [Google Scholar] [CrossRef]
- Hegde, N.; Bries, M.; Sazonov, E. A Comparative Review of Footwear-Based Wearable Systems. Electronics 2016, 5, 48. [Google Scholar] [CrossRef]
- Kuriakose, B.; Shrestha, R.; Sandnes, F.E. Tools and Technologies for Blind and Visually Impaired Navigation Support: A Review. IETE Tech. Rev. 2020, 39, 3–18. [Google Scholar] [CrossRef]
- Hersh, M. Wearable Travel Aids for Blind and Partially Sighted People: A Review with a Focus on Design Issues. Sensors 2022, 22, 5454. [Google Scholar] [CrossRef]
- Agrawal, M.P.; Gupta, A.R. Smart Stick for the Blind and Visually Impaired People. In Proceedings of the 2018 Second International Conference on Inventive Communication and Computational Technologies (ICICCT), Coimbatore, India, 20–21 April 2018. [Google Scholar]
- Nada, A.; Mashelly, S.; Fakhr, M.A.; Seddik, A.F. Effective fast response smart stick for blind people. In Proceedings of the Second International Conference on Advances in Bioinformatics and Environmental Engineering–ICABEE, Rome, Italy, 18–19 April 2015. [Google Scholar]
- Bhuniya, A.; Laha, S.; Maity, D.K.; Sarkar, A.; Bhattacharyya, S. Smart Glass for Blind People. Amse J. Iieta 2017, 38, 102–110. [Google Scholar]
- Hossain, E.; Khan, R.; Ali, A. Design and data analysis for a belt-for-blind for visual impaired people. Int. J. Adv. Mechatron. Syst. 2011, 3, 384. [Google Scholar] [CrossRef] [Green Version]
- Hengle, A.; Kulkarni, A.; Bavadekar, N.; Kulkarni, N.; Udyawar, R. Smart Cap: A Deep Learning and IoT Based Assistant for the Visually Impaired. In Proceedings of the 2020 Third International Conference on Smart Systems and Inventive Technology (ICSSIT), Tirunelveli, India, 20–22 August 2020. [Google Scholar]
- Fallah, M.Y. Smart Bracelet Using Haptic System with Fuzzy Controller in Deminer Quad-rotor Robots. J. Instrum. Autom. Syst. 2017, 3, 22–30. [Google Scholar] [CrossRef]
- Srivastava, N.K.; Singh, S. Netra: Smart Hand Gloves Comprises Obstacle Detection, Object Identification & OCR Text to Speech Converter for Blinds. In Proceedings of the 2018 5th IEEE Uttar Pradesh Section International Conference on Electrical, Electronics and Computer Engineering (UPCON), Gorakhpur, India, 2–4 November 2018. [Google Scholar]
- Yang, C.M.; Jung, J.Y.; Kim, J.J. Development of obstacle detection shoes for visually impaired people. Sens. Mater. 2020, 32, 2227. [Google Scholar] [CrossRef]
- Patil, K.; Jawadwala, Q.; Shu, F.C. Design and Construction of Electronic Aid for Visually Impaired People. IEEE Trans. Human-Machine Syst. 2018, 48, 172–182. [Google Scholar] [CrossRef]
- Nanduri, S.; Umamaheswari, E.; Kishore, R.; Ajaykumar, M. Smart Bottine for autistic people. Mater. Today Proc. 2022, 62, 4788–4794. [Google Scholar] [CrossRef]
- Bhongade, P.; Girhay, S.; Sheikh, A.M.; Ghata, R.; Ambadkar, S.; Dusane, C. Internet of Things—Enabled Smart Shoes for Blind People. In Proceedings of the 2022 IEEE Delhi Section Conference, New Delhi, India, 11–13 February 2022. [Google Scholar]
- Wu, W.; Lei, N.; Tang, J. Smart Shoes for Obstacle Detection. In The 10th International Conference on Computer Engineering and Networks; Springer: Singapore, 2021; pp. 1319–1326. [Google Scholar]
- Kumar, P.; Inchara, K.M.; Lekhashree, S.; Likhith, C.N.; Pavan, U. Real Time Assistive Shoe for Visually Impaired People. In Proceedings of the 2021 6th International Conference for Convergence in Technology, I2CT 2021, Pune, India, 2–4 April 2021. [Google Scholar]
- Rao, S.; Singh, V.M. Computer vision and IoT based smart system for visually impaired people. In Proceedings of the Confluence 2021: 11th International Conference on Cloud Computing, Data Science and Engineering, Noida, India, 28–29 January 2021. [Google Scholar]
- Rahman, M.M.; Islam, M.M.; Ahmmed, S. “BlindShoe”: An electronic guidance system for the visually impaired people. J. Telecommun. Electron. Comput. Eng. 2019, 11, 49–54. [Google Scholar]
- Raja, L.; Santhosh, R. Experimental study on shoe based navigation system for the visually impaired. Mater. Today Proc. 2020, 45, 1713–1716. [Google Scholar] [CrossRef]
- Nair, D.; Aggarwal, J.K. Moving obstacle detection from a navigating robot. IEEE Trans. Robot. Autom. 1998, 14, 404–416. [Google Scholar] [CrossRef] [Green Version]
- Gray, K.W.; Baker, K. Obstacle Detection and Avoidance for an Autonomous farm Tractor; Citeseer: Princeton, NJ, USA, 2000. [Google Scholar]
- Prabhakar, G.; Kailath, B.; Natarajan, S.; Kumar, R. Obstacle detection and classification using deep learning for tracking in high-speed autonomous driving. In Proceedings of the 2017 IEEE Region 10 Symposium (TENSYMP), Kerala, India, 5–8 November 2017. [Google Scholar]
- Discant, A.; Rogozan, A.; Rusu, C.; Bensrhair, A. Sensors for obstacle detection—A survey. In Proceedings of the 2007 30th International Spring Seminar on Electronics Technology (ISSE), Cluj-Napoca, Romania, 9–13 May 2007. [Google Scholar]
- Mohammad, T. Using ultrasonic and infrared sensors for distance measurement. World Acad. Sci. Eng. Technol. 2009, 51, 293–299. [Google Scholar]
- Mustapha, B.; Zayegh, A.; Begg, R.K. Ultrasonic and infrared sensors performance in a wireless obstacle detection system. In Proceedings of the 2013 1st International Conference on Artificial Intelligence, Modelling and Simulation, Sabah, Malaysia, 3–5 December 2013. [Google Scholar]
- Azeta, J.; Bolu, C.; Hinvi, D.; Abioye, A.A. Obstacle detection using ultrasonic sensor for a mobile robot. IOP Conf. Series: Mater. Sci. Eng. 2019, 707, 012012. [Google Scholar] [CrossRef]
- Ruzaij, M.F.; Poonguzhali, S. Design and implementation of low cost intelligent wheelchair. In Proceedings of the 2012 International Conference on Recent Trends in Information Technology, Chennai, India, 19–21 April 2012. [Google Scholar]
- Adarsh, S.; Kaleemuddin, S.M.; Bose, D.; Ramachandran, K.I. Performance comparison of Infrared and Ultrasonic sensors for obstacles of different materials in vehicle/ robot navigation applications. IOP Conf. Ser. Mater. Sci. Eng. 2016, 149, 012141. [Google Scholar] [CrossRef] [Green Version]
- Yeong, D.J.; Velasco-Hernandez, G.; Barry, J.; Walsh, J. Sensor and sensor fusion technology in autonomous vehicles: A review. Sensors 2021, 21, 2140. [Google Scholar] [CrossRef]
- Han, J.; Kim, D.; Lee, M.; Sunwoo, M. Enhanced Road Boundary and Obstacle Detection Using a Downward-Looking LIDAR Sensor. IEEE Trans. Veh. Technol. 2012, 61, 971–985. [Google Scholar] [CrossRef]
- Baras, N.; Nantzios, G.; Ziouzios, D.; Dasygenis, M. Autonomous Obstacle Avoidance Vehicle Using LIDAR and an Embedded System. In Proceedings of the 2019 8th International Conference on Modern Circuits and Systems Technologies (MOCAST), Thessaloniki, Greece, 13–15 May 2019. [Google Scholar]
- Mazzari, V. What Is LiDAR Technology? Lidar 2019. Available online: https://blog.generationrobots.com/en/what-is-lidar-technology/ (accessed on 3 May 2021).
- Asvadi, A.; Premebida, C.; Peixoto, P.; Nunes, U. 3D Lidar-based static and moving obstacle detection in driving environments: An approach based on voxels and multi-region ground planes. Robot. Auton. Syst. 2016, 83, 299–311. [Google Scholar] [CrossRef]
- Fang, Z.; Zhao, S.; Wen, S.; Zhang, Y. A real-time 3d perception and reconstruction system based on a 2d laser scanner. J. Sensors 2018, 2018, 2937694. [Google Scholar]
- Jahromi, B.S.; Tulabandhula, T.; Cetin, S. Real-Time Hybrid Multi-Sensor Fusion Framework for Perception in Autonomous Vehicles. Sensors 2019, 19, 4357. [Google Scholar]
- Arnold, E.; Al-Jarrah, O.Y.; Dianati, M.; Fallah, S.; Oxtoby, D.; Mouzakitis, A. A survey on 3d object detection methods for autonomous driving applications. IEEE Trans. Intell. Transp. Syst. 2019, 20, 3782–3795. [Google Scholar] [CrossRef] [Green Version]
- Bhoi, A. Monocular depth estimation: A survey. arXiv 2019, arXiv:1901.09402. [Google Scholar]
- Yu, H.; Zhu, J.; Wang, Y.; Jia, W.; Sun, M.; Tang, Y. Obstacle classification and 3D measurement in unstructured environments based on ToF cameras. Sensors 2014, 14, 10753–10782. [Google Scholar] [CrossRef] [Green Version]
- Slamtech. RPLIDAR A2. Introduction and Datasheet 2016. Available online: https://www.slamtec.com/en/Lidar/A2Spec (accessed on 15 March 2021).
- Benewake. TFmini Infrared Module Specification. n.d. Available online: https://cdn.sparkfun.com/assets/5/e/4/7/b/benewake-tfmini-datasheet.pdf (accessed on 14 March 2021).
- IbeoAutomotiveSystems. ibeo LUX 4L/ibeo LUX 8L/ibeo LUX HD. 2017. Available online: https://cdn.www.ibeo-as.com/74ff7abf85e5f139f1f57020579ebf9d0436b25e/ibeo%2BLUX%2BFamily%2BData%2BSheet.pdf.pdf (accessed on 2 December 2022).
- AutonomousStuff. Velodyne LiDAR PUCK. 2015. Available online: https://www.amtechs.co.jp/product/VLP-16-Puck.pdf (accessed on 3 July 2021).
- AutonomousStuff. Delphi Electronically Scanning RADAR. n.d. Available online: https://hexagondownloads.blob.core.windows.net/public/AutonomouStuff/wp-content/uploads/2019/05/delphi-esr-whitelabel.pdf (accessed on 14 March 2021).
- Stanislas, L.; Peynot, T. Characterisation of the Delphi Electronically Scanning Radar for robotics applications. In Proceedings of the ICRA 2015, Seattle, WA, USA, 26–30 May 2015. [Google Scholar]
- Roboception. rc_visard 3D Stereo Sensor Assembly and Operating Manual. 2022. Available online: https://doc.rc-visard.com/latest/pdf/rc_visard_manual_en.pdf (accessed on 10 November 2022).
- Roboception. 3D Stereo Sensor. 2020. Available online: https://howtorobot.com/sites/default/files/2021-09/2020_Roboception_ProductSpecs.pdf (accessed on 10 November 2022).
- RaspberryPi. Camera Module. n.d. Available online: https://www.raspberrypi.org/documentation/hardware/camera/ (accessed on 20 March 2021).
- ifm. 3D Camera O3D303. 2015. Available online: https://www.ifm.com/au/en/product/O3D303 (accessed on 15 November 2022).
- Elecfreaks. Ultrasonic Ranging Module HC-SR04. n.d. Available online: https://cdn.sparkfun.com/datasheets/Sensors/Proximity/HCSR04.pdf (accessed on 10 March 2021).
- Sharp. Sharp GP2Y0A02YK0F Analog Distance Sensor. 2006. Available online: https://www.pololu.com/file/0J156/gp2y0a02yk_e.pdf (accessed on 14 March 2021).
- Wang, D.; Li, W.; Liu, X.; Li, N.; Zhang, C. UAV environmental perception and autonomous obstacle avoidance: A deep learning and depth camera combined solution. Comput. Electron. Agric. 2020, 175, 105523. [Google Scholar] [CrossRef]
- John, V.; Mita, S. RVNet: Deep Sensor Fusion of Monocular Camera and Radar for Image-Based Obstacle Detection in Challenging Environments. In Image and Video Technology; Springer International Publishing: Cham, Switzerland, 2019. [Google Scholar]
- Kato, T.; Ninomiya, Y.; Masaki, I. An obstacle detection method by fusion of radar and motion stereo. IEEE Trans. Intell. Transp. Syst. 2002, 3, 182–188. [Google Scholar] [CrossRef]
- Wu, S.; Decker, S.; Chang, P.; Camus, T.; Eledath, J. Collision Sensing by Stereo Vision and Radar Sensor Fusion. IEEE Trans. Intell. Transp. Syst. 2009, 10, 606–614. [Google Scholar]
- Shahdib, F.; Bhuiyan, M.W.U.; Hasan, M.K.; Mahmud, H. Obstacle detection and object size measurement for autonomous mobile robot using sensor. Int. J. Comput. Appl. 2013, 66, 28–33. [Google Scholar]
- Fung, M.L.; Chen, M.Z.Q.; Chen, Y.H. Sensor fusion: A review of methods and applications. In Proceedings of the 2017 29th Chinese Control And Decision Conference (CCDC), Chongqing, China, 28–30 May 2017. [Google Scholar]
- Elmenreich, W. An Introduction to Sensor Fusion; Vienna University of Technology: Vienna, Austria, 2002; Volume 502, pp. 1–28. [Google Scholar]
- Bhowmick, A.; Hazarika, S. An insight into assistive technology for the visually impaired and blind people: State-of-the-art and future trends. J. Multimodal User Interfaces 2017, 11, 149–172. [Google Scholar] [CrossRef]
- Shetara, J.; Majumder, S.; Acharjee, S.; Dhar, S. Smart Wrist Band for Obstacle Detection. In Proceedings of the International Conference on Materials, Electronics & Information Engineering, ICMEIE-2015, Rajshahi, Bangladesh, 27–28 February 2015. [Google Scholar]
- Busaeed, S.; Katib, I.; Albeshri, A.; Corchado, J.M.; Yigitcanlar, T.; Mehmood, R. LidSonic V2.0: A LiDAR and Deep-Learning-Based Green Assistive Edge Device to Enhance Mobility for the Visually Impaired. Sensors 2022, 22, 7435. [Google Scholar] [CrossRef]
- Al-Khalifa, S.; Al-Razgan, M. Ebsar: Indoor guidance for the visually impaired. Comput. Electr. Eng. 2016, 54, 26–39. [Google Scholar] [CrossRef]
- Schwarze, T.; Lauer, M.; Schwaab, M.; Romanovas, M.; Böhm, S.; Jürgensohn, T. A camera-based mobility aid for visually impaired people. KI-Künstliche Intell. 2016, 30, 29–36. [Google Scholar] [CrossRef] [Green Version]
- Khan, A.; Khan, A.; Waleed, M. Wearable navigation assistance system for the blind and visually impaired. In Proceedings of the 2018 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT), Sakhier, Bahrain, 18–20 November 2018. [Google Scholar]
- Katzschmann, R.K.; Araki, B.; Rus, D. Safe local navigation for visually impaired users with a time-of-flight and haptic feedback device. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 583–593. [Google Scholar] [CrossRef]
- Elgendy, M.; Sik-Lanyi, C.; Kelemen, A. A Novel Marker Detection System for People with Visual Impairment Using the Improved Tiny-YOLOv3 Model. Comput. Methods Programs Biomed. 2021, 205, 106112. [Google Scholar] [CrossRef]
- Kanwal, N.; Bostanci, E.; Currie, K.; Clark, A.F. A Navigation System for the Visually Impaired: A Fusion of Vision and Depth Sensor. Appl. Bionics Biomech. 2015, 2015, 479857. [Google Scholar] [CrossRef] [Green Version]
- Kuriakose, B.; Shrestha, R.; Sandnes, F.E. DeepNAVI: A deep learning based smartphone navigation assistant for people with visual impairments. Expert Syst. Appl. 2023, 212, 118720. [Google Scholar] [CrossRef]
- Mocanu, B.; Tapu, R.; Zaharia, T. When Ultrasonic Sensors and Computer Vision Join Forces for Efficient Obstacle Detection and Recognition. Sensors 2016, 16, 1807. [Google Scholar] [CrossRef] [Green Version]
- Chen, Z.; Liu, X.; Kojima, M.; Huang, Q.; Arai, T. A Wearable Navigation Device for Visually Impaired People Based on the Real-Time Semantic Visual SLAM System. Sensors 2021, 21, 1536. [Google Scholar] [CrossRef]
- Bai, J.; Liu, Z.; Lin, Y.; Li, Y.; Lian, S.; Liu, D. Wearable Travel Aid for Environment Perception and Navigation of Visually Impaired People. Electronics 2019, 8, 697. [Google Scholar] [CrossRef] [Green Version]
- Siddhartha, B.; Chavan, A.P.; Uma, B. An Electronic Smart Jacket for the Navigation of Visually Impaired Society. Mater. Today Proc. 2018, 5, 10665–10669. [Google Scholar] [CrossRef]
- Suman, S.; Mishra, S.; Sahoo, K.S.; Nayyar, A. Vision Navigator: A Smart and Intelligent Obstacle Recognition Model for Visually Impaired Users. Mob. Inf. Syst. 2022, 2022, 1–15. [Google Scholar] [CrossRef]
- Tudor, D.; Dobrescu, L.; Dobrescu, D. Ultrasonic electronic system for blind people navigation. In Proceedings of the 2015 E-Health and Bioengineering Conference (EHB), Iasi, Romania, 19–21 November 2015. [Google Scholar]
- Cardillo, E.; Di Mattia, V.; Manfredi, G.; Russo, P.; De Leo, A.; Caddemi, A.; Cerri, G. An electromagnetic sensor prototype to assist visually impaired and blind people in autonomous walking. IEEE Sens. J. 2018, 18, 2568–2576. [Google Scholar] [CrossRef]
- Nada, A.A.; Fakhr, M.A.; Seddik, A.F. Assistive infrared sensor based smart stick for blind people. In Proceedings of the 2015 Science and Information Conference (SAI), London, UK, 28–30 July 2015. [Google Scholar]
- Perumal, C.; Balamurugan, V.; Manickam, S.; Natarajamoorthy, M. Voice Navigation Based guiding Device for Visually Impaired People. In Proceedings of the 2021 International Conference on Artificial Intelligence and Smart Systems (ICAIS), Coimbatore, India, 25–27 March 2021. [Google Scholar]
- Barontini, F.; Catalano, M.G.; Pallottino, L.; Leporini, B.; Bianchi, M. Integrating Wearable Haptics and Obstacle Avoidance for the Visually Impaired in Indoor Navigation: A User-Centered Approach. IEEE Trans. Haptics 2021, 14, 109–122. [Google Scholar] [CrossRef]
- Ran, L.; Helal, S.; Moore, S. Drishti: An integrated indoor/outdoor blind navigation system and service. In Proceedings of the Second IEEE Annual Conference on Pervasive Computing and Communications, 2004, Orlando, FL, USA, 14–17 March 2004. [Google Scholar]
- Daou, R.A.Z.; Chehade, J.; Haydar, G.A.; Hayek, A.; Boercsoek, J.; Olmedo, J.J.S. Design and Implementation of Smart Shoes for Blind and Visually Impaired People for More Secure Movements. In Proceedings of the International Conference on Microelectronics, ICM, Aqaba, Jordan, 14–17 December 2020. [Google Scholar]
- Singh, V.; Sindhu, S.; Arora, R. BUZZFEET: Blind Man Shoes. In Proceedings of the International Conference on Machine Learning, Big Data, Cloud and Parallel Computing: Trends, Perspectives and Prospects, COMITCon 2019, Faridabad, India, 14–16 February 2019. [Google Scholar]
- Rakshith, M.N.; Sundar, D.R.S.; Shanmugasundaram, M. An efficient assistive system for the visually impaired. ARPN J. Eng. Appl. Sci. 2017, 12, 5574–5577. [Google Scholar]
- Mishra, A.R.; Pippal, S.K.; Asif; Kumar, A.; Singh, D.; Singh, A. Clear Vision—Obstacle detection using Bat Algorithm Optimization Technique. In Proceedings of the 2021 9th International Conference on Reliability, Infocom Technologies and Optimization (Trends and Future Directions), ICRITO 2021, Noida, India, 7–9 September 2021. [Google Scholar]
- Kamaruddin, F.S.; Mahmood, N.H.; Razak, M.A.A.; Zakaria, N.A. Smart Assistive Shoes with Internet of Things Implementation for Visually Impaired People. J. Physics: Conf. Ser. 2021, 2107, 012030. [Google Scholar] [CrossRef]
- Chava, T.; Srinivas, A.T.; Sai, A.L.; Rachapudi, V. IoT based Smart Shoe for the Blind. In Proceedings of the 6th International Conference on Inventive Computation Technologies, ICICT 2021, Coimbatore, India, 20–22 January 2021. [Google Scholar]
- Anisha, M.; Kirthika, S.; Harline, J.D.; Thenmozhi, P.; Rubala, R.; Pragathi, T.G.; Benisha, M.; Elliot, C.J. Low-cost smart shoe for visually impaired. In Proceedings of the 3rd International Conference on Intelligent Communication Technologies and Virtual Mobile Networks, ICICV 2021, Tirunelveli, India, 4–6 February 2021. [Google Scholar]
- Alzamil, M.; AlBugmi, R.; AlOtaibi, S.; AlAnazi, G.; AlZubaidi, L.; Bashar, A. COMPASS: IPS-based navigation system for visually impaired students. In Proceedings of the 2020 IEEE 9th International Conference on Communication Systems and Network Technologies, CSNT 2020, Gwalior, India, 10–12 April 2020. [Google Scholar]
- Tang, Y.; Peng, Z.; Li, C. An experimental study on the feasibility of fall prevention using a wearable K-band FMCW radar. In Proceedings of the 2017 United States National Committee of URSI National Radio Science Meeting, USNC-URSI NRSM 2017, Boulder, CO, USA, 4–7 January 2017. [Google Scholar]
- Tang, Y.; Peng, Z.; Ran, L.; Li, C. IPrevent: A novel wearable radio frequency range detector for fall prevention. In Proceedings of the RFIT 2016—2016 IEEE International Symposium on Radio-Frequency Integration Technology, Taipei, Taiwan, 24–26 August 2016. [Google Scholar]
- Parmar, V.S.; Inkoolu, K.S. Designing smart shoes for obstacle detection: Empowering visually challenged users through ICT. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer International Publishing: Cham, Switzerland, 2017; pp. 258–266. [Google Scholar]
- WHO. Ageing and Health; WHO: Geneva, Switzerland, 2018. Available online: https://www.who.int/news-room/fact-sheets/detail/ageing-and-health (accessed on 2 June 2021).
- Li, G.; Tian, Z.; Gao, G.; Zhang, L.; Fu, M.; Chen, Y. A Shoelace Antenna for the Application of Collision Avoidance for the Blind Person. IEEE Trans. Antennas Propag. 2017, 65, 4941–4946. [Google Scholar] [CrossRef] [Green Version]
- Argañarás, J.G.; Wong, Y.T.; Pendharkar, G.; Begg, R.K.; Karmakar, N.C. Obstacle detection with MIMO 60 GHz radar for fall prevention. In Proceedings of the Asia-Pacific Microwave Conference, Singapore, 10–13 December 2019. [Google Scholar]
- Lin, T.H.; Yang, C.Y.; Shih, W.P. Fall Prevention Shoes Using Camera-Based Line-Laser Obstacle Detection System. J. Healthc. Eng. 2017, 2017, 8264071. [Google Scholar] [CrossRef] [Green Version]
- Dharme, R.K.; Surywanshi, J.R.; Kunwar, H.C.; Palve, Y.H. Smart Shoe Provides Vision to Visionless Person. In ICT Systems and Sustainability. Lecture Notes in Networks and Systems; Springer: Singapore, 2022; Volume 321, pp. 131–137. [Google Scholar]
- Arduino. Arduino Nano. 2022. Available online: https://docs.arduino.cc/static/282dbc4f4a12cec8da26375adf8c210d/A000005-datasheet.pdf (accessed on 2 November 2022).
- RobotShop. Arduino Mega 2560 Datasheet. Available online: https://www.robotshop.com/media/files/PDF/ArduinoMega2560Datasheet.pdf (accessed on 10 October 2022).
- STMicroelectronics. STM32L432KB STM32L432KC. 2018. Available online: https://www.st.com/resource/en/datasheet/stm32l432kc.pdf (accessed on 5 November 2022).
- Pi, R. Raspberry Pi Zero 2 W. 2021. Available online: https://datasheets.raspberrypi.com/rpizero2/raspberry-pi-zero-2-w-product-brief.pdf (accessed on 3 November 2022).
- Coral. Coral Dev Board Datasheet. July 2020. Available online: https://coral.ai/static/files/Coral-Dev-Board-datasheet.pdf (accessed on 25 October 2022).
- Foundation, R.P. Raspberry Pi 4 Computer Model B. 2021. Available online: https://datasheets.raspberrypi.com/rpi4/raspberry-pi-4-product-brief.pdf (accessed on 25 October 2022).
- NVIDIA. Jetson Nano. Available online: https://developer.nvidia.com/embedded/jetson-nano (accessed on 25 October 2022).
- Coral. Coral Dev Board Mini Datasheet. 2020. Available online: https://coral.ai/docs/dev-board-mini/datasheet/ (accessed on 25 October 2022).
- Coral. Coral USB Accelerator Datasheet. 2019. Available online: https://coral.ai/docs/accelerator/datasheet/ (accessed on 25 October 2022).
- Kang, P.; Somtham, A. An Evaluation of Modern Accelerator-Based Edge Devices for Object Detection Applications. Mathematics 2022, 10, 4299. [Google Scholar] [CrossRef]
- Kovács, B.; Henriksen, A.D.; Stets, J.D.; Nalpantidis, L. Object Detection on TPU Accelerated Embedded Devices; Springer International Publishing: Cham, Switzerland, 2021. [Google Scholar]
- Park, K.; Jang, W.; Lee, W.; Nam, K.; Seong, K.; Chai, K.; Li, W.S. Real-time mask detection on google edge TPU. arXiv 2020, arXiv:2010.04427. [Google Scholar]
- Winzig, J.; Almanza, J.C.A.; Mendoza, M.G.; Schumann, T. Edge AI—Use Case on Google Coral Dev Board Mini. In Proceedings of the 2022 IET International Conference on Engineering Technologies and Applications (IET-ICETA), Changhua City, Taiwan, 26–27 October 2022. [Google Scholar]
- Abraham, G.; Nithya, M. Multi-Functional Personal Assistant Robot Using Raspberry Pi and Coral Accelerator. In Proceedings of the 2021 5th International Conference on Computing Methodologies and Communication (ICCMC), Erode, India, 29–31 March 2022. [Google Scholar]
- Melo, J.G.; Barros, E. An Embedded Monocular Vision Approach for Ground-Aware Objects Detection and Position Estimation. arXiv 2022, arXiv:2207.09851. [Google Scholar]
- Devi, Y.S.; Sathvik, S.; Ananya, P.; Tharuni, P.; Vamsi, N.N.K. Vision-Based Obstacle Detection and Collision Prevention in Self-Driving Cars. J. Phys. Conf. Ser. 2022, 2335, 012019. [Google Scholar] [CrossRef]
- Fang, R.; Cai, C. Computer vision based obstacle detection and target tracking for autonomous vehicles. MATEC Web Conf. 2021, 336, 07004. [Google Scholar] [CrossRef]
- Tawil, Y.; Hafez, A.H.A. Deep Learning Obstacle Detection and Avoidance for Powered Wheelchair. In Proceedings of the 2022 Innovations in Intelligent Systems and Applications Conference (ASYU), Biarritz, France, 7–9 September 2022. [Google Scholar]
- Farheen, N.; Jaman, G.G.; Schoen, M.P. Object Detection and Navigation Strategy for Obstacle Avoidance Applied to Autonomous Wheel Chair Driving. In Proceedings of the 2022 Intermountain Engineering, Technology and Computing (IETC), Orem, UT, USA, 14–15 May 2022. [Google Scholar]
- Bergengruen, L.; Duran, D.; Sotelo, R. Effatá: Obstacle Identification System to help the Blind in Urban Displacement. In Proceedings of the 2021 IEEE Global Humanitarian Technology Conference (GHTC), Seattle, WA, USA, 19–23 October 2021. [Google Scholar]
- Sherpa, T.T.; Kimura, A. Pedestrian Crossing Lights and Obstacles Detections for Visually Impaired Person. In Proceedings of the 2022 Nicograph International (NicoInt), Tokyo, Japan, 4–5 June 2022. [Google Scholar]
- Bouteraa, Y. Smart real time wearable navigation support system for BVIP. Alex. Eng. J. 2023, 62, 223–235. [Google Scholar] [CrossRef]
- Hsieh, I.-H.; Cheng, H.-C.; Ke, H.-H.; Chen, H.-C.; Wang, W.-J. A CNN-Based Wearable Assistive System for Visually Impaired People Walking Outdoors. Appl. Sci. 2021, 11, 10026. [Google Scholar] [CrossRef]
- Pascale, F.; Adinolfi, E.; Avagliano, M.; Giannella, V.; Salas, A. A Low Energy IoT Application Using Beacon for Indoor Localization. Appl. Sci. 2021, 11, 4902. [Google Scholar] [CrossRef]
- Ramalingam, M.; Chinnavan, E.; Puviarasi, R.; Yu, N.H. Assistive technology for harvesting footstep energy in IoT enabled Smart shoe for the visually impaired. In Proceedings of the 2021 International Conference on Software Engineering and Computer Systems and 4th International Conference on Computational Science and Information Management, ICSECS-ICOCSIM 2021, Yokohama, Japan, 24–26 August 2021. [Google Scholar]
Study | Obstacle Detection Sensor | Processing Board | Feedback | Functionality
---|---|---|---|---
Patil, Jawadwala, 2018 [31] (copyright authorized by IEEE) | Ultrasound | Customized microcontroller | Audio, Vibration |
Nanduri, Umamaheswari, 2022 [32] (copyright authorized by Elsevier) | Ultrasound | NodeMCU | Smartphone, Piezo buzzer |
Bhongade, Girhay, 2022 [33] (copyright authorized by IEEE) | Ultrasound | Arduino ATmega328 | Audio |
Wu, Lei, 2021 [34] (copyright authorized by Springer Nature) | Ultrasound | STM32L432KC | Vibration |
Pradeep Kumar, Inchara, 2021 [35] (copyright authorized by IEEE) | Ultrasound | Renesas microcontroller | Smartphone audio output |
Rao, Singh, 2021 [36] (copyright authorized by IEEE) | Ultrasound, Fisheye camera | Raspberry Pi Zero | Vibration |
Rahman, Islam, 2019 [37] | Ultrasound | Arduino Uno | Buzzer, Smartphone |
Raja and Santhosh, 2021 [38] (copyright authorized by Elsevier) | Ultrasound, Infrared | ARM Cortex-M3 LPC1768 | Buzzer |
Yang, Jung, 2020 [30] | Infrared | Arduino | Piezoelectric buzzer |
SENSOR | PERCEIVED ENERGY | ADVANTAGES | DISADVANTAGES
---|---|---|---
LIDAR (3D) | Laser signal [emitted] | |
RADAR | Millimeter-wave radio waves [emitted] | |
Camera | Visible light [ambient] | |
Ultrasound | Sound waves above 20 kHz [emitted] | |
Infrared | Infrared light [emitted] | |
SENSORS | RANGE | MASS | OPERATING TEMPERATURE | FIELD OF VIEW | ACCURACY | RESOLUTION | DIMENSION | POWERS | COST | OUTPUT |
---|---|---|---|---|---|---|---|---|---|---|
3D LIDAR [Velodyne VLP-16] | 30–100 m | 830 g | −10 °C to 60 °C | 360° horizontal, 30° vertical | ±3 cm | Angular resolution: 2° vertical, 0.1–0.4° horizontal | 103 mm diameter × 72 mm | Voltage: 9 V–18 V; Power consumption: 8 W | AUD 5000+ | • Up to 300,000 points per second • 100 Mbps Ethernet connection • UDP packets containing distances, calibrated reflectivities, rotation angles, synchronized time stamps (µs resolution)
2D LIDAR [RPLIDAR A2M8] | 0.15–12 m | 190 g | 0 °C to 40 °C | 360° | 1% of range (≤3 m), 2% of range (3–5 m), 2.5% of range (above 5 m) | Angular: 0.45°; Range: ≤1% of range (below 12 m), ≤2% of range (12–16 m) | 76 mm diameter × 41 mm | Voltage: 5 V; Current: 450–600 mA; Power consumption: 2.25–3 W | AUD 640 | 8000 points obtained at 10 Hz rotational speed
1D LIDAR [TF Mini LiDAR] | 0.3–12 m | 6.1 g | −20 °C to 60 °C | 2.3° | 1% (0.3–6 m), 2% (6–12 m) | 5 mm | 42 × 15 × 16 mm | Voltage: 4.5–6 V; Power consumption: 0.12 W | AUD 75.36 | Distance obtained at 100 Hz
Solid-state LiDAR [Ibeo LUX 4L] | Up to 50 m | 998.7 g | −40 °C to +85 °C | 110° (H) × 3.2° (V) | 10 cm | Angular resolution (H × V): 0.25° × 0.8°; Range resolution: 4 cm | 164.5 × 93.2 × 88 mm | Voltage: 9–27 V; Power consumption: 7 W | - | Distance and echo pulse width
Radar [Delphi ESR] | Long range: 1–174 m; Mid-range: 1–60 m; Velocity range: −100 m/s to +40 m/s; Lateral: −20 m/s to +20 m/s | 575 g | - | Horizontal: ±10° (long range), ±45° (mid-range); Vertical: 4.2° to 4.75° | Range: ≤ ±0.5 m (long), ≤ ±0.25 m (mid); Range rate: ≤ ±0.12 m/s; Azimuth angle centroid: ≤ ±0.3° (corner-reflector targets in long range), ≤ ±0.5° (other targets in long range), ≤ ±1.0° (mid-range) | Azimuth angle: <3.5° (long range), <12° (mid-range) | 173.7 × 90.2 × 49.2 mm | - | AUD 2500 | Estimated centroid of the detected object, including range to the centroid, bearing angle, longitudinal and lateral speeds, acceleration, and the power of the returned signal
Camera [Raspberry Pi Camera] | - | 3 g | −30 °C to 70 °C | 53.50 ± 0.13° (H), 41.41 ± 0.11° (V) | - | Still resolution: 5 MP; Sensor resolution: 2592 × 1944 pixels | 25 × 24 × 9 mm | Power consumption: 325 mW | AUD 25 | Images: 1080p @ 30 fps, 720p @ 60 fps, 480p @ 90 fps
Stereo camera [Roboception rc_visard 160] | Depth range: 0.5–3 m | 850 g | 0 °C to 50 °C | Horizontal 61°, Vertical 48° | Depth accuracy: 0.4–13 mm | Image resolution: 1280 × 960 pixels (1.2 MP); Lateral resolution: 0.5–2.8 mm; Depth resolution: 0.1–3.3 mm | 230 × 75 × 84 mm | Voltage: 18–30 V; Power consumption: 25 W | - | Right and left rectified images, depth image, and confidence image at 0.8–25 fps
TOF Camera [IFM O3D03] | 3–8 m | 766.95 g | −10 °C to +50 °C | 60° × 45° | - | Image resolution: 352 × 264 pixels | 72 × 65 × 82.6 mm | Voltage: 20.4–28.8 V; Current: <2400 mA; Power consumption: 10 W | AUD 3000 | 3D image data obtained at a reading rate of 25 Hz
Ultrasound [HC-SR04] | 2 cm–4 m | 8.5 g | −15 °C to 70 °C | 30° conical | 3 mm | - | 45 × 20 × 15 mm | Voltage: 5 V DC; Current: 15 mA | Under AUD 5 | Distance estimated from the time between transmitted (eight 40 kHz pulses) and received pulses
Infrared [Sharp GP2Y0A02YK0F] | 20–150 cm | 4.8 g | −10 °C to 60 °C | 10° | - | - | 44.5 × 18.9 × 21.6 mm | Voltage: 4.4–5.5 V; Current consumption: 33 mA | AUD 30 | Distance
Obstacle Detection Sensor | Number of Sensors | Device | Type of Obstacle | Mass | Range | Feedback | Cost | Reference |
---|---|---|---|---|---|---|---|---|
Ultrasound | 5 | Wearable jacket | Path obstacles | 3 g | 2 cm–400 cm | Buzzer, Vibrator | Low | [82] |
Ultrasound | 2 | Belt | Near object, distant object | Light | 2 cm–400 cm | Vibration | Low | [92]
Camera | 2 | Bicycle Helmet | Foreground object | Light | 10–20 m | Acoustic Feedback | Low | [81] |
Microwave radar | 1 | Cane | Floor/suspended obstacles | Light | 1 m–5 m | Acoustic, Vibration | Low | [93] |
Infrared sensor | 2 | Foldable stick | High-level/floor-level obstacles, staircases | Light | Up to 200 cm | Audio through earphone | Low | [94]
ToF Distance sensor | 7 | Belt | Low and High Obstacle | 8 g | 0–14 m | Vibration belt | High | [83] |
TFmini LiDAR and Ultrasound | 1 each | Smart glasses | Obstacles within 1.7 m height, descending stairs | Light | 10 cm–12 m (LiDAR), 2 cm–300 cm (ultrasound) | Buzzer, Audio | Low | [79]
Lidar and Web Camera | 1 each | Haptic Strap | Chair, person, bottle, bicycle | Light | Not given | Vibration and Audio | Low | [95] |
Asus Xtion Pro camera | 1 | Strap on chest, Handcuff | Path Obstacles | Medium | 0.8–3.5 m | Vibration | High | [96] |
Microsoft Kinect | 1 | Strap on neck | Path Obstacles | Medium | Not given | Audio | Medium | [85] |
Shoe Name | Application | Sensors | Processor Board | Additional Device in Overall System | Additional Sensing | Accuracy | Alerting Technique | Battery Life | Detection Range |
---|---|---|---|---|---|---|---|---|---|
Smart-Alert Walker [91] | Visually Impaired | Ultrasound on shoes (obstacle), Camera on the cane (obstacle), Water Sensor (Moisture) | Arduino | Smart-fold cane | Water sensing | 95.54% for common obstacles | Vibration Alert in leg, Audio output for the detected obstacles | - | - |
Smart Bottine [32] | Autistic People | Ultrasonic sensor HC-SR04 (obstacle), Infrared (insect), IMU MPU6050 (fall) | NodeMCU (ESP8266) | Smartphone | Insect/reptile detection, Fall detection, Location tracking | - | Smartphone, Piezo buzzer | - | -
Smart Shoe [33] | Visually Impaired | Ultrasonic sensor HC-SR04 (obstacle), Infrared sensor (pothole), Moisture sensor (water), Temperature sensor LM35 (fire) | Arduino Nano | Smartphone | Location tracking, Pothole detection, Hot-object detection, Slippery-surface detection, Health tracking | - | Voice sent to user's headphone | 3–4 h | 20 cm–4 m
Smart Shoe [34] | Visually Impaired | Ultrasonic sensor HC-SR04 (obstacle), Accelerometer ADXL335 (foot motion) | STM32L432KC | Smartphone | Gait sensing, Fall detection | - | Vibration motor | - | -
Smart Shoe System [36] | Visually Impaired | Ultrasonic sensor, Fisheye camera | Raspberry Pi Zero (streaming, actuation), Smartphone (detection) | Smartphone | Navigation | - | Vibration motor | - | -
Shoe System [38,100] | Visually Impaired | Ultrasonic (obstacle), Infrared sensor (obstacle), Force-sensitive resistor (shoe wearing) | ARM Cortex-M3 LPC1768 | - | Detects whether the shoe is worn by the user | - | Buzzer | - | -
Real Time Assistive Shoe [35] | Visually Impaired | Ultrasonic sensor (obstacle), Moisture sensor (soil moisture) | Renesas RL78/G13 | Smartphone | Moisture Detection, Navigation | - | Audio output | - | 2 cm–80 cm |
Clear Vision Smart Shoes [101] | Visually Impaired | Ultrasound HC-SR04 | Arduino Nano (ATmega328) | Knee band | Height detection | - | Vibration | - | -
Smart Assistive Shoe [102] | Visually Impaired | Ultrasound HC-SR04 | NodeMCU | Smartphone | Shoe position finder | - | Vibration | - | -
Smart Shoe [103] | Visually Impaired | Ultrasound | Arduino UNO | Smart glasses | - | - | Vibration | - | - |
Smart Shoe [104] | Visually Impaired | Ultrasound HC-SR04 | Arduino Nano | - | - | - | Buzzer | - | -
Obstacle Detection Shoe [30] | Visually Impaired | Infrared sensor (obstacle), Accelerometer (shoe direction) | Arduino 101 board | - | Shoe direction | - | Piezoelectric buzzer | - | -
Smart Shoe [98] | Visually Impaired | Ultrasound HC-SR04, Water sensor (wet), MPU6050 sensor (fall) | Arduino Mega | - | Wet detection, Fall detection | Overall accuracy: 95.33%; sensitivity: 98%; false detection rate: 5.3% | Audible notification and vibration motors | 2 h | 2 cm–300 cm
COMPASS [105] | Visually Impaired | Ultrasound (obstacle), Raspberry Pi camera (text) | ESP32 development board (shoe), Raspberry Pi (bracelet) | Smart bracelet, Smartphone | Text detection | - | Beeper | - | -
Blind Shoe [37] | Visually Impaired | Ultrasonic sensor (obstacle), Water-level sensor (water) | Arduino UNO | Smartphone | Slippery or wet surface | 97.33% | Buzzer, Audio | - | 2 cm–4 m, 15°
BUZZFEET [99] | Visually Impaired | Ultrasound (obstacle), Infrared (pit) | Arduino Lilypad | Audio processor module | Pit detection | - | Audio | - | - |
NavGuide [31] | Visually Impaired | Ultrasound(obstacle), Liquid Detector Sensors (wet floor) | Customized microcontroller | - | Wet floor Detection | - | Audio, Vibration | 850–1000 min | - |
Fall Prevention Shoes | Elderly | Line laser (obstacle), Camera (obstacle, gait) | - | - | Gait detection | - | Alarm message | - | 0.5–1 m |
IPrevent Shoes [106,107] | Senior People | Radar | Laptop | - | - | - | - | - | - |
Smart Shoes [108] | Visually challenged | Ultrasonic sensor | - | - | - | 89.5% | Tapping at the foot arch | 5 h | 0–2 m (regular) 0–1 m (crowd) |
Microcontroller Board | Description | Reference |
---|---|---|
Arduino | Controls, processes, and generates all inputs and outputs. It receives echo signals from the ultrasonic sensor, which trigger it to check whether an obstacle is present, and it generates an immediate alert using a buzzer. It also generates a caption for the image captured by a camera and converts that caption into speech played through an audio device. | [91]
Arduino Nano | Small, complete, breadboard-friendly board based on the ATmega328 CPU clocked at 16 MHz, with 2 KB SRAM, 32 KB flash, 22 digital I/O pins, 8 analog pins and a mini-USB port. | [33,101,104,114]
Arduino UNO | Equipped with the well-known ATmega328P and the ATmega16U2 processor. | [37,103]
Arduino 101 Board | To adjust the detection range of the sensor according to the walking direction, Arduino 101 boards, which contain Bluetooth, a six-axis accelerometer, and a gyroscope, were utilized. | [30]
Arduino Mega | ATmega2560-based board with 54 digital input/output pins (of which 15 can be used as PWM outputs), 16 analog inputs, 4 UARTs (hardware serial ports), a 16 MHz crystal oscillator, a USB connection, a power jack, an ICSP header, and a reset button. | [98,115]
Arduino LilyPad | ATmega328V-based, with 1 KB SRAM, 512 bytes EEPROM and an 8 MHz clock speed. The LilyPad is attached to an APR module, a GSM module and the sensors; it recognizes the instructions sent by the ultrasonic and IR sensors and acts accordingly. It operates on 5 V and is programmed using the Arduino IDE. | [99]
NodeMCU | 3.3 V operating voltage, 80 MHz clock speed, 4 MB flash memory, 64 KB RAM, 11 digital pins, 1 analog pin, and built-in Wi-Fi 802.11 b/g/n. The ESP8266 LX106 microcontroller on the NodeMCU receives data from the attached sensors, processes the data, and uses the uploaded code (e.g., the SSID and password of the Wi-Fi network) to communicate with a smartphone. | [32,102]
STM32L432KC | Ultra-low-power microcontroller based on the high-performance Arm® Cortex®-M4 32-bit RISC core operating at up to 80 MHz with a floating-point unit; 1.71 V to 3.6 V power supply. | [34,116]
Raspberry Pi Zero | The board incorporates a quad-core 64-bit Arm Cortex-A53 CPU clocked at 1 GHz, 512 MB LPDDR2, 2.4 GHz 802.11 b/g/n wireless LAN and Bluetooth 4.2. Streaming of sensor data and actuation on the shoe are performed by the Raspberry Pi, while obstacle detection is performed by a smartphone. | [36,117]
ARM Cortex-M3 LPC1768 | Flash memory of 512 KB, 64 KB data memory, a processor frequency of up to 100 MHz, 13 general-purpose input/output (GPIO) registers, 6 pulse-width modulation (PWM) pins, and an 8-channel 12-bit analog-to-digital converter (ADC). The ultrasonic and infrared data are fed to the ARM Cortex-M3 microcontroller, which determines whether an obstacle is present. | [38]
Renesas microcontroller | Low power consumption with a supply voltage of 1.6–5.5 V; the clock frequency can be varied from 32 MHz down to 32 kHz; 64 pins; includes code flash memory, a DMA controller, a high-speed on-chip oscillator, a serial interface, and data flash memory. | [35]
ESP32 Development Board | Development board incorporating both Wi-Fi and Bluetooth, which makes it a good choice for embedded-systems projects. It has a Tensilica Xtensa dual-core 32-bit LX6 microprocessor operating at either 160 or 240 MHz. | [105]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Joseph, A.M.; Kian, A.; Begg, R. State-of-the-Art Review on Wearable Obstacle Detection Systems Developed for Assistive Technologies and Footwear. Sensors 2023, 23, 2802. https://doi.org/10.3390/s23052802