Bioinspired Perception and Navigation of Service Robots in Indoor Environments: A Review
Abstract
:1. Introduction
2. Vision-Based Navigation
2.1. Optic Flow
2.2. SLAM
2.3. Landmark
2.4. Others
3. Remote Sensing Navigation
4. Tactile-Sensor-Based Navigation
5. Olfactory Navigation
6. Sound-Based Navigation
7. Inertial Navigation
8. Multimodal Navigation
9. Others
10. Discussion and Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
SLAM | Simultaneous localization and mapping system |
ACO | Ant colony optimization |
HSI | Hue, saturation, and intensity |
HiLAM | Hierarchical look-ahead trajectory model |
APF | Artificial potential field |
PF | Particle filter |
PSO | Particle swarm optimization |
GDM | Gas distribution mapping |
IMU | Inertial measurement unit |
EKF | Extended Kalman filter |
SLIC-SVM | Simple-linear-iterative-clustering-based support vector machine |
SL | Supervised learning |
DRL | Deep reinforcement learning |
RFID | Radiofrequency identification |
UWB | Ultrawideband |
MDP | Markov decision process |
AER | Address-event representation |
SNN | Spiking neural network |
CNN | Convolutional neural network |
MEMS | Micro-electromechanical systems |
References
- Fukuda, T.; Chen, F.; Shi, Q. Special feature on bio-inspired robotics. Appl. Sci. 2018, 8, 817. [Google Scholar] [CrossRef] [Green Version]
- Metka, B.; Franzius, M.; Bauer-Wersing, U. Bio-inspired visual self-localization in real world scenarios using Slow Feature Analysis. PLoS ONE 2018, 13, e0203994. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Pardo-Cabrera, J.; Rivero-Ortega, J.D.; Hurtado-López, J.; Ramírez-Moreno, D.F. Bio-inspired navigation and exploration system for a hexapod robotic platform. Eng. Res. Express 2022, 4, 025019. [Google Scholar] [CrossRef]
- Milford, M.; Schulz, R. Principles of goal-directed spatial robot navigation in biomimetic models. Philos. Trans. R. Soc. B Biol. Sci. 2014, 369, 20130484. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Maravall, D.; De Lope, J.; Fuentes, J.P. Navigation and self-semantic location of drones in indoor environments by combining the visual bug algorithm and entropy-based vision. Front. Neurorobot. 2017, 11, 46. [Google Scholar] [CrossRef] [Green Version]
- Rao, J.; Bian, H.; Xu, X.; Chen, J. Autonomous Visual Navigation System Based on a Single Camera for Floor-Sweeping Robot. Appl. Sci. 2023, 13, 1562. [Google Scholar] [CrossRef]
- Ayuso, F.; Botella, G.; García, C.; Prieto, M.; Tirado, F. GPU-based acceleration of bio-inspired motion estimation model. Concurr. Comput. 2013, 25, 1037–1056. [Google Scholar] [CrossRef]
- Gibaldi, A.; Vanegas, M.; Canessa, A.; Sabatini, S.P. A Portable Bio-Inspired Architecture for Efficient Robotic Vergence Control. Int. J. Comput. Vis. 2017, 121, 281–302. [Google Scholar] [CrossRef] [Green Version]
- Meyer, H.G.; Klimeck, D.; Paskarbeit, J.; Rückert, U.; Egelhaaf, M.; Porrmann, M.; Schneider, A. Resource-efficient bio-inspired visual processing on the hexapod walking robot HECTOR. PLoS ONE 2020, 15, e0230620. [Google Scholar] [CrossRef] [Green Version]
- De Oliveira, T.E.A.; Cretu, A.M.; Petriu, E.M. Multimodal bio-inspired tactile sensing module for surface characterization. Sensors 2017, 17, 1187. [Google Scholar] [CrossRef] [Green Version]
- Rao, A.; Elara, M.R.; Elangovan, K. Constrained VPH+: A local path planning algorithm for a bio-inspired crawling robot with customized ultrasonic scanning sensor. Robot. Biomim. 2016, 3, 12. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Ramezani Dooraki, A.; Lee, D.J. An End-to-End Deep Reinforcement Learning-Based Intelligent Agent Capable of Autonomous Exploration in Unknown Environments. Sensors 2018, 18, 3575. [Google Scholar] [CrossRef] [Green Version]
- Wang, Y.; Shao, B.; Zhang, C.; Zhao, J.; Cai, Z. REVIO: Range- and Event-Based Visual-Inertial Odometry for Bio-Inspired Sensors. Biomimetics 2022, 7, 169. [Google Scholar] [CrossRef] [PubMed]
- Luneckas, M.; Luneckas, T.; Udris, D.; Plonis, D.; Maskeliūnas, R.; Damaševičius, R. A hybrid tactile sensor-based obstacle overcoming method for hexapod walking robots. Intell. Serv. Robot. 2021, 14, 9–24. [Google Scholar] [CrossRef]
- Villarreal, B.L.; Olague, G.; Gordillo, J.L. Synthesis of odor tracking algorithms with genetic programming. Neurocomputing 2016, 175, 1019–1032. [Google Scholar] [CrossRef]
- Gay, S.; Le Run, K.; Pissaloux, E.; Romeo, K.; Lecomte, C. Towards a Predictive Bio-Inspired Navigation Model. Information 2021, 12, 100. [Google Scholar] [CrossRef]
- Roubieu, F.L.; Serres, J.R.; Colonnier, F.; Franceschini, N.; Viollet, S.; Ruffier, F. A biomimetic vision-based hovercraft accounts for bees’ complex behaviour in various corridors. Bioinspir. Biomim. 2014, 9, 36003. [Google Scholar] [CrossRef]
- Bertrand, O.J.N.; Lindemann, J.P.; Egelhaaf, M. A Bio-inspired Collision Avoidance Model Based on Spatial Information Derived from Motion Detectors Leads to Common Routes. PLoS Comput. Biol. 2015, 11, e1004339. [Google Scholar] [CrossRef] [Green Version]
- Yadipour, M.; Billah, M.A.; Faruque, I.A. Optic flow enrichment via Drosophila head and retina motions to support inflight position regulation. J. Theor. Biol. 2023, 562, 111416. [Google Scholar] [CrossRef]
- Hyslop, A.; Krapp, H.G.; Humbert, J.S. Control theoretic interpretation of directional motion preferences in optic flow processing interneurons. Biol. Cybern. 2010, 103, 353–364. [Google Scholar] [CrossRef]
- Liu, S.C.; Delbruck, T.; Indiveri, G.; Whatley, A.; Douglas, R.; Douglas, R. Event-Based Neuromorphic Systems; John Wiley & Sons, Incorporated: New York, NY, UK, 2015. [Google Scholar]
- Gallego, G.; Delbruck, T.; Orchard, G.; Bartolozzi, C.; Taba, B.; Censi, A.; Leutenegger, S.; Davison, A.J.; Conradt, J.; Daniilidis, K.; et al. Event-Based Vision: A Survey. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 44, 154–180. [Google Scholar] [CrossRef] [PubMed]
- Paredes-Valles, F.; Scheper, K.Y.W.; de Croon, G.C.H.E. Unsupervised Learning of a Hierarchical Spiking Neural Network for Optical Flow Estimation: From Events to Global Motion Perception. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 42, 2051–2064. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Xu, P.; Humbert, J.S.; Abshire, P. Analog VLSI Implementation of Wide-field Integration Methods. J. Intell. Robot. Syst. 2011, 64, 465–487. [Google Scholar] [CrossRef]
- Zhu, A.Z.; Yuan, L.; Chaney, K.; Daniilidis, K. EV-FlowNet: Self-Supervised Optical Flow Estimation for Event-based Cameras. arXiv 2018. [Google Scholar] [CrossRef]
- Ruffier, F.; Viollet, S.; Franceschini, N. Visual control of two aerial micro-robots by insect-based autopilots. Adv. Robot. 2004, 18, 771–786. [Google Scholar] [CrossRef]
- Li, J.; Lindemann, J.P.; Egelhaaf, M. Peripheral Processing Facilitates Optic Flow-Based Depth Perception. Front. Comput. Neurosci. 2016, 10, 111. [Google Scholar] [CrossRef] [Green Version]
- de Croon, G.C.H.E.; Dupeyroux, J.J.G.; de Wagter, C.; Chatterjee, A.; Olejnik, D.A.; Ruffier, F. Accommodating unobservability to control flight attitude with optic flow. Nature 2022, 610, 485–490. [Google Scholar] [CrossRef]
- Vanhoutte, E.; Mafrica, S.; Ruffier, F.; Bootsma, R.J.; Serres, J. Time-of-travel methods for measuring optical flow on board a micro flying robot. Sensors 2017, 17, 571. [Google Scholar] [CrossRef] [Green Version]
- Serres, J.R.; Ruffier, F. Optic flow-based collision-free strategies: From insects to robots. Arthropod Struct. Dev. 2017, 46, 703–717. [Google Scholar] [CrossRef]
- Igual, F.D.; Botella, G.; García, C.; Prieto, M.; Tirado, F. Robust motion estimation on a low-power multi-core DSP. EURASIP J. Adv. Signal Process. 2013, 2013, 99. [Google Scholar] [CrossRef] [Green Version]
- Gremillion, G.; Humbert, J.S.; Krapp, H.G. Bio-inspired modeling and implementation of the ocelli visual system of flying insects. Biol. Cybern. 2014, 108, 735–746. [Google Scholar] [CrossRef] [PubMed]
- Zufferey, J.C.; Klaptocz, A.; Beyeler, A.; Nicoud, J.D.; Floreano, D. A 10-gram vision-based flying robot. Adv. Robot. 2007, 21, 1671–1684. [Google Scholar] [CrossRef] [Green Version]
- Serres, J.; Dray, D.; Ruffier, F.; Franceschini, N. A vision-based autopilot for a miniature air vehicle: Joint speed control and lateral obstacle avoidance. Auton. Robot. 2008, 25, 103–122. [Google Scholar] [CrossRef] [Green Version]
- Serres, J.R.; Ruffier, F. Biomimetic Autopilot Based on Minimalistic Motion Vision for Navigating along Corridors Comprising U-shaped and S-shaped Turns. J. Bionics Eng. 2015, 12, 47–60. [Google Scholar] [CrossRef] [Green Version]
- Kobayashi, N.; Bando, M.; Hokamoto, S.; Kubo, D. Guidelines for practical navigation systems based on wide-field-integration of optic flow. Asian J. Control 2021, 23, 2381–2392. [Google Scholar] [CrossRef]
- Serres, J.; Ruffier, F.; Viollet, S.; Franceschini, N. Toward Optic Flow Regulation for Wall-Following and Centring Behaviours. Int. J. Adv. Robot. Syst. 2006, 3, 23. [Google Scholar] [CrossRef]
- McGuire, K.; de Croon, G.; De Wagter, C.; Tuyls, K.; Kappen, H. Efficient Optical Flow and Stereo Vision for Velocity Estimation and Obstacle Avoidance on an Autonomous Pocket Drone. IEEE Robot. Autom. Lett. 2017, 2, 1070–1076. [Google Scholar] [CrossRef] [Green Version]
- Mounir, A.; Rachid, L.; Ouardi, A.E.; Tajer, A. Workload Partitioning of a Bio-inspired Simultaneous Localization and Mapping Algorithm on an Embedded Architecture. Int. J. Adv. Comput. Sci. Appl. 2021, 12, 221–229. [Google Scholar] [CrossRef]
- Vidal, A.R.; Rebecq, H.; Horstschaefer, T.; Scaramuzza, D. Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High-Speed Scenarios. IEEE Robot. Autom. Lett. 2018, 3, 994–1001. [Google Scholar] [CrossRef] [Green Version]
- Ghosh, S.; Gallego, G. Multi-Event-Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion. Adv. Intell. Syst. 2022, 4, 2200221. [Google Scholar] [CrossRef]
- Gelen, A.G.; Atasoy, A. An Artificial Neural SLAM Framework for Event-Based Vision. IEEE Access 2023, 11, 58436–58450. [Google Scholar] [CrossRef]
- Pathmakumar, T.; Muthugala, M.A.V.J.; Samarakoon, S.M.B.P.; Gómez, B.F.; Elara, M.R. A Novel Path Planning Strategy for a Cleaning Audit Robot Using Geometrical Features and Swarm Algorithms. Sensors 2022, 22, 5317. [Google Scholar] [CrossRef] [PubMed]
- Nantogma, S.; Ran, W.; Liu, P.; Yu, Z.; Xu, Y. Immunized Token-Based Approach for Autonomous Deployment of Multiple Mobile Robots in Burnt Area. Remote Sens. 2021, 13, 4135. [Google Scholar] [CrossRef]
- Jacobson, A.; Chen, Z.; Milford, M. Autonomous Multisensor Calibration and Closed-loop Fusion for SLAM. J. Field Robot. 2015, 32, 85–122. [Google Scholar] [CrossRef]
- Wu, C.; Yu, S.; Chen, L.; Sun, R. An Environmental-Adaptability-Improved RatSLAM Method Based on a Biological Vision Model. Machines 2022, 10, 259. [Google Scholar] [CrossRef]
- Erdem, U.M.; Milford, M.J.; Hasselmo, M.E. A hierarchical model of goal directed navigation selects trajectories in a visual environment. Neurobiol. Learn. Mem. 2015, 117, 109–121. [Google Scholar] [CrossRef] [Green Version]
- Sadeghi Amjadi, A.; Raoufi, M.; Turgut, A.E. A self-adaptive landmark-based aggregation method for robot swarms. Adapt. Behav. 2022, 30, 223–236. [Google Scholar] [CrossRef]
- Yu, S.E.; Lee, C.; Kim, D. Analyzing the effect of landmark vectors in homing navigation. Adapt. Behav. 2012, 20, 337–359. [Google Scholar] [CrossRef]
- Yu, X.; Yu, H. A novel low-altitude reconnaissance strategy for smart UAVs: Active perception and chaotic navigation. Trans. Inst. Meas. Control 2011, 33, 610–630. [Google Scholar] [CrossRef]
- Mair, E.; Augustine, M.; Jäger, B.; Stelzer, A.; Brand, C.; Burschka, D.; Suppa, M. A biologically inspired navigation concept based on the Landmark-Tree map for efficient long-distance robot navigation. Adv. Robot. 2014, 28, 289–302. [Google Scholar] [CrossRef] [Green Version]
- Salih, T.A.; Ghazal, M.T.; Mohammed, Z.G. Development of a dynamic intelligent recognition system for a real-time tracking robot. IAES Int. J. Robot. Autom. 2021, 10, 161. [Google Scholar] [CrossRef]
- Cheng, Y.; Jiang, P.; Hu, Y.F. A biologically inspired intelligent environment architecture for mobile robot navigation. Int. J. Intell. Syst. Technol. Appl. 2012, 11, 138–156. [Google Scholar] [CrossRef]
- Li, H.; Wang, H.; Cui, L.; Li, J.; Wei, Q.; Xia, J. Design and Experiments of a Compact Self-Assembling Mobile Modular Robot with Joint Actuation and Onboard Visual-Based Perception. Appl. Sci. 2022, 12, 3050. [Google Scholar] [CrossRef]
- Mathai, N.J.; Zourntos, T.; Kundur, D. Vector Field Driven Design for Lightweight Signal Processing and Control Schemes for Autonomous Robotic Navigation. EURASIP J. Adv. Signal Process. 2009, 2009, 984752. [Google Scholar] [CrossRef] [Green Version]
- Boudra, S.; Berrached, N.E.; Dahane, A. Efficient and secure real-time mobile robots cooperation using visual servoing. Int. J. Electr. Comput. Eng. 2020, 10, 3022. [Google Scholar] [CrossRef]
- Ahmad, S.; Sunberg, Z.N.; Humbert, J.S. End-to-End Probabilistic Depth Perception and 3D Obstacle Avoidance using POMDP. J. Intell. Robot. Syst. 2021, 103, 33. [Google Scholar] [CrossRef]
- Nguyen, T.; Mann, G.K.I.; Gosine, R.G.; Vardy, A. Appearance-Based Visual-Teach-And-Repeat Navigation Technique for Micro Aerial Vehicle. J. Intell. Robot. Syst. 2016, 84, 217–240. [Google Scholar] [CrossRef]
- Sinha, A.; Tan, N.; Mohan, R.E. Terrain perception for a reconfigurable biomimetic robot using monocular vision. Robot. Biomim. 2014, 1, 1–23. [Google Scholar] [CrossRef] [Green Version]
- Montiel-Ross, O.; Sepúlveda, R.; Castillo, O.; Quiñones, J. Efficient Stereoscopic Video Matching and Map Reconstruction for a Wheeled Mobile Robot. Int. J. Adv. Robot. Syst. 2012, 9, 120. [Google Scholar] [CrossRef]
- Aznar, F.; Pujol, M.; Rizo, R.; Rizo, C. Modelling multi-rotor UAVs swarm deployment using virtual pheromones. PLoS ONE 2018, 13, e0190692. [Google Scholar] [CrossRef]
- Yang, J.; Wang, X.; Bauer, P. V-Shaped Formation Control for Robotic Swarms Constrained by Field of View. Appl. Sci. 2018, 8, 2120. [Google Scholar] [CrossRef] [Green Version]
- Ohradzansky, M.T.; Humbert, J.S. Lidar-Based Navigation of Subterranean Environments Using Bio-Inspired Wide-Field Integration of Nearness. Sensors 2022, 22, 849. [Google Scholar] [CrossRef]
- Lopes, L.; Bodo, B.; Rossi, C.; Henley, S.; Zibret, G.; Kot-Niewiadomska, A.; Correia, V. ROBOMINERS; developing a bio-inspired modular robot miner for difficult to access mineral deposits. Adv. Geosci. 2020, 54, 99–108. [Google Scholar] [CrossRef]
- Jiang, Y.; Peng, P.; Wang, L.; Wang, J.; Wu, J.; Liu, Y. LiDAR-Based Local Path Planning Method for Reactive Navigation in Underground Mines. Remote Sens. 2023, 15, 309. [Google Scholar] [CrossRef]
- Romeh, A.E.; Mirjalili, S. Multi-Robot Exploration of Unknown Space Using Combined Meta-Heuristic Salp Swarm Algorithm and Deterministic Coordinated Multi-Robot Exploration. Sensors 2023, 23, 2156. [Google Scholar] [CrossRef]
- Moreno, L.; Garrido, S.; Blanco, D. Mobile Robot Global Localization using an Evolutionary MAP Filter. J. Glob. Optim. 2007, 37, 381–403. [Google Scholar] [CrossRef]
- Le, A.V.; Prabakaran, V.; Sivanantham, V.; Mohan, R.E. Modified A-Star Algorithm for Efficient Coverage Path Planning in Tetris Inspired Self-Reconfigurable Robot with Integrated Laser Sensor. Sensors 2018, 18, 2585. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- García, R.M.; Prieto-Castrillo, F.; González, G.V.; Tejedor, J.P.; Corchado, J.M. Stochastic navigation in smart cities. Energies 2017, 10, 929. [Google Scholar] [CrossRef] [Green Version]
- Saez-Pons, J.; Alboul, L.; Penders, J.; Nomdedeu, L. Multi-robot team formation control in the GUARDIANS project. Ind. Robot 2010, 37, 372–383. [Google Scholar] [CrossRef] [Green Version]
- Martinez, E.; Laguna, G.; Murrieta-Cid, R.; Becerra, H.M.; Lopez-Padilla, R.; LaValle, S.M. A motion strategy for exploration driven by an automaton activating feedback-based controllers. Auton. Robot. 2019, 43, 1801–1825. [Google Scholar] [CrossRef]
- Arvin, F.; Espinosa, J.; Bird, B.; West, A.; Watson, S.; Lennox, B. Mona: An Affordable Open-Source Mobile Robot for Education and Research. J. Intell. Robot. Syst. 2019, 94, 761–775. [Google Scholar] [CrossRef] [Green Version]
- Tarapore, D.; Christensen, A.L.; Timmis, J. Generic, scalable and decentralized fault detection for robot swarms. PLoS ONE 2017, 12, e0182058. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Ordaz-Rivas, E.; Rodriguez-Liñan, A.; Torres-Treviño, L. Autonomous foraging with a pack of robots based on repulsion, attraction and influence. Auton. Robot. 2021, 45, 919–935. [Google Scholar] [CrossRef]
- Gia Luan, P.; Truong Thinh, N. Self-Organized Aggregation Behavior Based on Virtual Expectation of Individuals with Wave-Based Communication. Electronics 2023, 12, 2220. [Google Scholar] [CrossRef]
- Baker, C.J.; Smith, G.E.; Balleri, A.; Holderied, M.; Griffiths, H.D. Biomimetic Echolocation With Application to Radar and Sonar Sensing. Proc. IEEE 2014, 102, 447–458. [Google Scholar] [CrossRef] [Green Version]
- Ordaz-Rivas, E.; Rodriguez-Liñan, A.; Aguilera-Ruíz, M.; Torres-Treviño, L. Collective Tasks for a Flock of Robots Using Influence Factor. J. Intell. Robot. Syst. 2019, 94, 439–453. [Google Scholar] [CrossRef]
- Bouraine, S.; Azouaoui, O. Safe Motion Planning Based on a New Encoding Technique for Tree Expansion Using Particle Swarm Optimization. Robotica 2021, 39, 885–927. [Google Scholar] [CrossRef]
- Martinez, F.; Rendon, A. Autonomous Motion Planning for a Differential Robot using Particle Swarm Optimization. Int. J. Adv. Comput. Sci. Appl. 2023, 14. [Google Scholar] [CrossRef]
- Arena, E.; Arena, P.; Strauss, R.; Patané, L. Motor-Skill Learning in an Insect Inspired Neuro-Computational Control System. Front. Neurorobot. 2017, 11, 12. [Google Scholar] [CrossRef] [Green Version]
- Xu, P.; Liu, J.; Liu, X.; Wang, X.; Zheng, J.; Wang, S.; Chen, T.; Wang, H.; Wang, C.; Fu, X.; et al. A bio-inspired and self-powered triboelectric tactile sensor for underwater vehicle perception. NPJ Flex. Electron. 2022, 6, 25. [Google Scholar] [CrossRef]
- Mulvey, B.W.; Lalitharatne, T.D.; Nanayakkara, T. DeforMoBot: A Bio-Inspired Deformable Mobile Robot for Navigation among Obstacles. IEEE Robot. Autom. Lett. 2023, 8, 3827–3834. [Google Scholar] [CrossRef]
- Yu, Z.; Sadati, S.M.H.; Perera, S.; Hauser, H.; Childs, P.R.N.; Nanayakkara, T. Tapered whisker reservoir computing for real-time terrain identification-based navigation. Sci. Rep. 2023, 13, 5213. [Google Scholar] [CrossRef] [PubMed]
- Rausyan Fikri, M.; Wibowo Djamari, D. Palm-sized quadrotor source localization using modified bio-inspired algorithm in obstacle region. Int. J. Electr. Comput. Eng. 2022, 12, 3494. [Google Scholar] [CrossRef]
- Ojeda, P.; Monroy, J.; Gonzalez-Jimenez, J. A Simulation Framework for the Integration of Artificial Olfaction into Multi-Sensor Mobile Robots. Sensors 2021, 21, 2041. [Google Scholar] [CrossRef]
- Yamada, M.; Ohashi, H.; Hosoda, K.; Kurabayashi, D.; Shigaki, S. Multisensory-motor integration in olfactory navigation of silkmoth, Bombyx mori, using virtual reality system. eLife 2021, 10, e72001. [Google Scholar] [CrossRef] [PubMed]
- Martinez, D.; Rochel, O.; Hugues, E. A biomimetic robot for tracking specific odors in turbulent plumes. Auton. Robot. 2006, 20, 185–195. [Google Scholar] [CrossRef]
- Soegiarto, D.; Trilaksono, B.R.; Adiprawita, W.; Idris Hidayat, E.M.; Prabowo, Y.A. Combining SLAM, GDM, and Anemotaxis for Gas Source Localization in Unknown and GPS-denied Environments. Int. J. Electr. Eng. Inform. 2022, 14, 514–551. [Google Scholar] [CrossRef]
- Schillebeeckx, F.; De Mey, F.; Vanderelst, D.; Peremans, H. Biomimetic Sonar: Binaural 3D Localization using Artificial Bat Pinnae. Int. J. Robot. Res. 2011, 30, 975–987. [Google Scholar] [CrossRef]
- Steckel, J.; Peremans, H. BatSLAM: Simultaneous localization and mapping using biomimetic sonar. PLoS ONE 2013, 8, e54076. [Google Scholar] [CrossRef]
- Abbasi, A.; MahmoudZadeh, S.; Yazdani, A.; Moshayedi, A.J. Feasibility assessment of Kian-I mobile robot for autonomous navigation. Neural Comput. Appl. 2022, 34, 1199–1218. [Google Scholar] [CrossRef]
- Tidoni, E.; Gergondet, P.; Kheddar, A.; Aglioti, S.M. Audio-visual feedback improves the BCI performance in the navigational control of a humanoid robot. Front. Neurorobot. 2014, 8, 20. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Ghosh, S.; Panigrahi, P.K.; Parhi, D.R. Analysis of FPA and BA meta-heuristic controllers for optimal path planning of mobile robot in cluttered environment. IET Sci. Meas. Technol. 2017, 11, 817–828. [Google Scholar] [CrossRef]
- Anumula, J.; Neil, D.; Delbruck, T.; Liu, S.C. Feature Representations for Neuromorphic Audio Spike Streams. Front. Neurosci. 2018, 12, 23. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Glackin, B.; Wall, J.A.; McGinnity, T.M.; Maguire, L.P.; McDaid, L.J. A spiking neural network model of the medial superior olive using spike timing dependent plasticity for sound localization. Front. Comput. Neurosci. 2010, 4, 18. [Google Scholar] [CrossRef] [Green Version]
- Kanoulas, D.; Tsagarakis, N.G.; Vona, M. Curved patch mapping and tracking for irregular terrain modeling: Application to bipedal robot foot placement. Robot. Auton. Syst. 2019, 119, 13–30. [Google Scholar] [CrossRef] [Green Version]
- Sabiha, A.D.; Kamel, M.A.; Said, E.; Hussein, W.M. Real-time path planning for autonomous vehicle based on teaching–learning-based optimization. Intell. Serv. Robot. 2022, 15, 381–398. [Google Scholar] [CrossRef]
- Chen, C.P.; Chen, J.Y.; Huang, C.K.; Lu, J.C.; Lin, P.C. Sensor data fusion for body state estimation in a bipedal robot and its feedback control application for stable walking. Sensors 2015, 15, 4925–4946. [Google Scholar] [CrossRef] [Green Version]
- Tan, N.; Mohan, R.E.; Elangovan, K. Scorpio: A biomimetic reconfigurable rolling–crawling robot. Int. J. Adv. Robot. Syst. 2016, 13. [Google Scholar] [CrossRef]
- Yi, L.; Le, A.V.; Hoong, J.C.C.; Hayat, A.A.; Ramalingam, B.; Mohan, R.E.; Leong, K.; Elangovan, K.; Tran, M.; Bui, M.V.; et al. Multi-Objective Instantaneous Center of Rotation Optimization Using Sensors Feedback for Navigation in Self-Reconfigurable Pavement Sweeping Robot. Mathematics 2022, 10, 3169. [Google Scholar] [CrossRef]
- Duivon, A.; Kirsch, P.; Mauboussin, B.; Mougard, G.; Woszczyk, J.; Sanfilippo, F. The Redesigned Serpens, a Low-Cost, Highly Compliant Snake Robot. Robotics 2022, 11, 42. [Google Scholar] [CrossRef]
- Kim, J.Y.; Kashino, Z.; Colaco, T.; Nejat, G.; Benhabib, B. Design and implementation of a millirobot for swarm studies–mROBerTO. Robotica 2018, 36, 1591–1612. [Google Scholar] [CrossRef] [Green Version]
- Fiack, L.; Cuperlier, N.; Miramond, B. Embedded and real-time architecture for bio-inspired vision-based robot navigation. J.-Real-Time Image Process. 2015, 10, 699–722. [Google Scholar] [CrossRef]
- Hartbauer, M. Simplified bionic solutions: A simple bio-inspired vehicle collision detection system. Bioinspir. Biomim. 2017, 12, 026007. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Porod, W.; Werblin, F.; Chua, L.O.; Roska, T.; Rodriguez-VÁZquez, Á.; Roska, B.; Fay, P.; Bernstein, G.H.; Huang, Y.F.; Csurgay, Á.I. Bio-Inspired Nano-Sensor-Enhanced CNN Visual Computer. Ann. N. Y. Acad. Sci. 2004, 1013, 92–109. [Google Scholar] [CrossRef]
- Colomer, S.; Cuperlier, N.; Bresson, G.; Gaussier, P.; Romain, O. LPMP: A Bio-Inspired Model for Visual Localization in Challenging Environments. Front. Robot. AI 2022, 8, 703811. [Google Scholar] [CrossRef]
- Tejera, G.; Llofriu, M.; Barrera, A.; Weitzenfeld, A. Bio-Inspired Robotics: A Spatial Cognition Model integrating Place Cells, Grid Cells and Head Direction Cells. J. Intell. Robot. Syst. 2018, 91, 85–99. [Google Scholar] [CrossRef]
- Jauffret, A.; Cuperlier, N.; Tarroux, P.; Gaussier, P. From self-assessment to frustration, a small step toward autonomy in robotic navigation. Front. Neurorobot. 2013, 7, 16. [Google Scholar] [CrossRef] [Green Version]
- Suzuki, M.; Floreano, D. Enactive Robot Vision. Adapt. Behav. 2008, 16, 122–128. [Google Scholar] [CrossRef]
- Li, W.; Wu, D.; Zhou, Y.; Du, J. A bio-inspired method of autonomous positioning using spatial association based on place cells firing. Int. J. Adv. Robot. Syst. 2017, 14, 172988141772801. [Google Scholar] [CrossRef] [Green Version]
- Yu, N.; Liao, Y.; Yu, H.; Sie, O. Construction of the rat brain spatial cell firing model on a quadruped robot. CAAI Trans. Intell. Technol. 2022, 7, 732–743. [Google Scholar] [CrossRef]
- Kyriacou, T. Using an evolutionary algorithm to determine the parameters of a biologically inspired model of head direction cells. J. Comput. Neurosci. 2012, 32, 281–295. [Google Scholar] [CrossRef]
- Montiel, H.; Martínez, F.; Martínez, F. Parallel control model for navigation tasks on service robots. J. Phys. Conf. Ser. 2021, 2135, 12002. [Google Scholar] [CrossRef]
- Yoo, H.; Cha, G.; Oh, S. Deep ego-motion classifiers for compound eye cameras. Sensors 2019, 19, 5275. [Google Scholar] [CrossRef] [Green Version]
- Skatchkovsky, N.; Jang, H.; Simeone, O. Spiking Neural Networks-Part III: Neuromorphic Communications. IEEE Commun. Lett. 2021, 25, 1746–1750. [Google Scholar] [CrossRef]
- Miskowicz, M. Send-On-Delta Concept: An Event-Based Data Reporting Strategy. Sensors 2006, 6, 49–63. [Google Scholar] [CrossRef] [Green Version]
- Tayarani-Najaran, M.H.; Schmuker, M. Event-Based Sensing and Signal Processing in the Visual, Auditory, and Olfactory Domain: A Review. Front. Neural Circuits 2021, 15, 610446. [Google Scholar] [CrossRef] [PubMed]
- Cheng, G.; Dean-Leon, E.; Bergner, F.; Rogelio Guadarrama Olvera, J.; Leboutet, Q.; Mittendorfer, P. A Comprehensive Realization of Robot Skin: Sensors, Sensing, Control, and Applications. Proc. IEEE 2019, 107, 2034–2051. [Google Scholar] [CrossRef]
- Cyr, A.; Thériault, F. Bio-inspired visual attention process using spiking neural networks controlling a camera. Int. J. Comput. Vis. Robot. 2019, 9, 39–55. [Google Scholar] [CrossRef]
- Floreano, D.; Zufferey, J.C.; Nicoud, J.D. From Wheels to Wings with Evolutionary Spiking Circuits. Artif. Life 2005, 11, 121–138. [Google Scholar] [CrossRef] [Green Version]
- Alnajjar, F.; Bin Mohd Zin, I.; Murase, K. A Hierarchical Autonomous Robot Controller for Learning and Memory: Adaptation in a Dynamic Environment. Adapt. Behav. 2009, 17, 179–196. [Google Scholar] [CrossRef]
- Arena, P.; De Fiore, S.; Fortuna, L.; Frasca, M.; Patané, L.; Vagliasindi, G. Reactive navigation through multiscroll systems: From theory to real-time implementation. Auton. Robot. 2008, 25, 123–146. [Google Scholar] [CrossRef]
- Botella, G.; Martín H, J.A.; Santos, M.; Meyer-Baese, U. FPGA-based multimodal embedded sensor system integrating low- and mid-level vision. Sensors 2011, 11, 8164–8179. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Elouaret, T.; Colomer, S.; De Melo, F.; Cuperlier, N.; Romain, O.; Kessal, L.; Zuckerman, S. Implementation of a Bio-Inspired Neural Architecture for Autonomous Vehicles on a Multi-FPGA Platform. Sensors 2023, 23, 4631. [Google Scholar] [CrossRef] [PubMed]
- Sanket, N.J.; Singh, C.D.; Ganguly, K.; Fermuller, C.; Aloimonos, Y. GapFlyt: Active Vision Based Minimalist Structure-Less Gap Detection For Quadrotor Flight. IEEE Robot. Autom. Lett. 2018, 3, 2799–2806. [Google Scholar] [CrossRef] [Green Version]
- Luan, H.; Fu, Q.; Zhang, Y.; Hua, M.; Chen, S.; Yue, S. A Looming Spatial Localization Neural Network Inspired by MLG1 Neurons in the Crab Neohelice. Front. Neurosci. 2022, 15, 787256. [Google Scholar] [CrossRef] [PubMed]
- Wang, J.; Yan, R.; Tang, H. Multi-Scale Extension in an Entorhinal-Hippocampal Model for Cognitive Map Building. Front. Neurorobot. 2021, 14, 592057. [Google Scholar] [CrossRef] [PubMed]
- Barrera, A.; Cáceres, A.; Weitzenfeld, A.; Ramirez-Amaya, V. Comparative Experimental Studies on Spatial Memory and Learning in Rats and Robots. J. Intell. Robot. Syst. 2011, 63, 361–397. [Google Scholar] [CrossRef]
- Pang, L.; Zhang, Y.; Coleman, S.; Cao, H. Efficient Hybrid-Supervised Deep Reinforcement Learning for Person Following Robot. J. Intell. Robot. Syst. 2020, 97, 299–312. [Google Scholar] [CrossRef]
- Zhu, Y.; Luo, K.; Ma, C.; Liu, Q.; Jin, B. Superpixel Segmentation Based Synthetic Classifications with Clear Boundary Information for a Legged Robot. Sensors 2018, 18, 2808. [Google Scholar] [CrossRef] [Green Version]
- Arena, P.; Fortuna, L.; Lombardo, D.; Patané, L. Perception for Action: Dynamic Spatiotemporal Patterns Applied on a Roving Robot. Adapt. Behav. 2008, 16, 104–121. [Google Scholar] [CrossRef]
- Zeng, T.; Si, B. Cognitive mapping based on conjunctive representations of space and movement. Front. Neurorobot. 2017, 11, 61. [Google Scholar] [CrossRef] [Green Version]
- Shrivastava, R.; Kumar, P.; Tripathi, S.; Tiwari, V.; Rajput, D.S.; Gadekallu, T.R.; Suthar, B.; Singh, S.; Ra, I.H. A Novel Grid and Place Neuron’s Computational Modeling to Learn Spatial Semantics of an Environment. Appl. Sci. 2020, 10, 5147. [Google Scholar] [CrossRef]
- Kazmi, S.M.A.M.; Mertsching, B. Gist+RatSLAM: An Incremental Bio-inspired Place Recognition Front-End for RatSLAM. EAI Endorsed Trans. Creat. Technol. 2016, 3, 27–34. [Google Scholar] [CrossRef] [Green Version]
- Yu, F.; Shang, J.; Hu, Y.; Milford, M. NeuroSLAM: A brain-inspired SLAM system for 3D environments. Biol. Cybern. 2019, 113, 515–545. [Google Scholar] [CrossRef]
- Ni, J.; Wang, C.; Fan, X.; Yang, S.X. A Bioinspired Neural Model Based Extended Kalman Filter for Robot SLAM. Math. Probl. Eng. 2014, 2014, 905826. [Google Scholar] [CrossRef] [Green Version]
- Ramalingam, B.; Le, A.V.; Lin, Z.; Weng, Z.; Mohan, R.E.; Pookkuttath, S. Optimal selective floor cleaning using deep learning algorithms and reconfigurable robot hTetro. Sci. Rep. 2022, 12, 15938. [Google Scholar] [CrossRef] [PubMed]
- Tai, L.; Li, S.; Liu, M. Autonomous exploration of mobile robots through deep neural networks. Int. J. Adv. Robot. Syst. 2017, 14, 1–9. [Google Scholar] [CrossRef]
- Chatty, A.; Gaussier, P.; Hasnain, S.K.; Kallel, I.; Alimi, A.M. The effect of learning by imitation on a multi-robot system based on the coupling of low-level imitation strategy and online learning for cognitive map building. Adv. Robot. 2014, 28, 731–743. [Google Scholar] [CrossRef]
- Martín, F.; Ginés, J.; Rodríguez-Lera, F.J.; Guerrero-Higueras, A.M.; Matellán Olivera, V. Client-Server Approach for Managing Visual Attention, Integrated in a Cognitive Architecture for a Social Robot. Front. Neurorobot. 2021, 15, 630386. [Google Scholar] [CrossRef]
- Huang, W.; Tang, H.; Tian, B. Vision enhanced neuro-cognitive structure for robotic spatial cognition. Neurocomputing 2014, 129, 49–58. [Google Scholar] [CrossRef]
- Kulvicius, T.; Tamosiunaite, M.; Ainge, J.; Dudchenko, P.; Wörgötter, F. Odor supported place cell model and goal navigation in rodents. J. Comput. Neurosci. 2008, 25, 481–500. [Google Scholar] [CrossRef] [Green Version]
- Marques-Villarroya, S.; Castillo, J.C.; Gamboa-Montero, J.J.; Sevilla-Salcedo, J.; Salichs, M.A. A Bio-Inspired Endogenous Attention-Based Architecture for a Social Robot. Sensors 2022, 22, 5248. [Google Scholar] [CrossRef]
- Zhu, D.; Li, W.; Yan, M.; Yang, S.X. The Path Planning of AUV Based on D-S Information Fusion Map Building and Bio-Inspired Neural Network in Unknown Dynamic Environment. Int. J. Adv. Robot. Syst. 2014, 11, 34. [Google Scholar] [CrossRef] [Green Version]
- Zhang, X.; Ding, W.; Wang, Y.; Luo, Y.; Zhang, Z.; Xiao, J. Bio-Inspired Self-Organized Fission–Fusion Control Algorithm for UAV Swarm. Aerospace 2022, 9, 714. [Google Scholar] [CrossRef]
- Yin, X.H.; Yang, C.; Xiong, D. Bio-inspired neurodynamics-based cascade tracking control for automated guided vehicles. Int. J. Adv. Manuf. Technol. 2014, 74, 519–530. [Google Scholar] [CrossRef]
- Rozsypálek, Z.; Broughton, G.; Linder, P.; Rouček, T.; Blaha, J.; Mentzl, L.; Kusumam, K.; Krajník, T. Contrastive Learning for Image Registration in Visual Teach and Repeat Navigation. Sensors 2022, 22, 2975. [Google Scholar] [CrossRef] [PubMed]
- Dasgupta, S.; Goldschmidt, D.; Wörgötter, F.; Manoonpong, P. Distributed recurrent neural forward models with synaptic adaptation and CPG-based control for complex behaviors of walking robots. Front. Neurorobot. 2015, 9, 10. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Hodge, V.J.; Hawkins, R.; Alexander, R. Deep reinforcement learning for drone navigation using sensor data. Neural Comput. Appl. 2021, 33, 2015–2033. [Google Scholar] [CrossRef]
- Al-Muteb, K.; Faisal, M.; Emaduddin, M.; Arafah, M.; Alsulaiman, M.; Mekhtiche, M.; Hedjar, R.; Mathkoor, H.; Algabri, M.; Bencherif, M.A. An autonomous stereovision-based navigation system (ASNS) for mobile robots. Intell. Serv. Robot. 2016, 9, 187–205. [Google Scholar] [CrossRef]
- Lazreg, M.; Benamrane, N. Intelligent System for Robotic Navigation Using ANFIS and ACOr. Appl. Artif. Intell. 2019, 33, 399–419. [Google Scholar] [CrossRef]
- Chen, C.; Richardson, P. Mobile robot obstacle avoidance using short memory: A dynamic recurrent neuro-fuzzy approach. Trans. Inst. Meas. Control 2012, 34, 148–164. [Google Scholar] [CrossRef]
- Nadour, M.; Boumehraz, M.; Cherroun, L.; Puig, V. Hybrid Type-2 Fuzzy Logic Obstacle Avoidance System based on Horn-Schunck Method. Electroteh. Electron. Autom. 2019, 67, 45–51. [Google Scholar]
- Singh, M.K.; Parhi, D.R. Path optimisation of a mobile robot using an artificial neural network controller. Int. J. Syst. Sci. 2011, 42, 107–120. [Google Scholar] [CrossRef]
- Arena, P.; Fortuna, L.; Lombardo, D.; Patanè, L.; Velarde, M.G. The winnerless competition paradigm in cellular nonlinear networks: Models and applications. Int. J. Circuit Theory Appl. 2009, 37, 505–528. [Google Scholar] [CrossRef]
- Liu, C.; Yang, J.; An, K.; Chen, Q. Rhythmic-Reflex Hybrid Adaptive Walking Control of Biped Robot. J. Intell. Robot. Syst. 2019, 94, 603–619. [Google Scholar] [CrossRef] [Green Version]
- Pathmakumar, T.; Sivanantham, V.; Anantha Padmanabha, S.G.; Elara, M.R.; Tun, T.T. Towards an Optimal Footprint Based Area Coverage Strategy for a False-Ceiling Inspection Robot. Sensors 2021, 21, 5168. [Google Scholar] [CrossRef]
- Corrales-Paredes, A.; Malfaz, M.; Egido-García, V.; Salichs, M.A. Waymarking in Social Robots: Environment Signaling Using Human–Robot Interaction. Sensors 2021, 21, 8145. [Google Scholar] [CrossRef] [PubMed]
- Karagüzel, T.A.; Turgut, A.E.; Eiben, A.E.; Ferrante, E. Collective gradient perception with a flying robot swarm. Swarm Intell. 2023, 17, 117–146. [Google Scholar] [CrossRef]
- Le, A.V.; Apuroop, K.G.S.; Konduri, S.; Do, H.; Elara, M.R.; Xi, R.C.C.; Wen, R.Y.W.; Vu, M.B.; Duc, P.V.; Tran, M. Multirobot Formation with Sensor Fusion-Based Localization in Unknown Environment. Symmetry 2021, 13, 1788. [Google Scholar] [CrossRef]
- Zhu, H.; Liu, H.; Ataei, A.; Munk, Y.; Daniel, T.; Paschalidis, I.C. Learning from animals: How to Navigate Complex Terrains. PLoS Comput. Biol. 2020, 16, e1007452. [Google Scholar] [CrossRef]
- de Croon, G.C.H.E.; De Wagter, C.; Seidl, T. Enhancing optical-flow-based control by learning visual appearance cues for flying robots. Nat. Mach. Intell. 2021, 3, 33–41. [Google Scholar] [CrossRef]
Paper | Contribution | Sensors | Real Time | How to Achieve Real-Time Operation |
---|---|---|---|---|
[9] | Resource-efficient vision-based navigation | Optic flow | Real time | Optimized parallel processing on FPGA |
[18] | Extract the depth structure from the optic flow | Optic flow | N/A | |
[19] | A quantitative model of the optic flow | Optic flow | N/A | |
[31] | A robust gradient-based optical flow model | Optic flow | Real time | Lightweight operating system |
[29] | Time-of-travel methods | Optic flow | N/A | |
[20] | Lobula plate tangential cells | Optic flow | N/A | |
[7] | A gradient-based optical flow model | Optic flow | Real time | GPU speed |
[24] | A novel integrated, single-chip solution | Optic flow | Real time | Computation speed |
[23] | Hierarchical SNN | Optic flow, event-based camera | Real time | Large-scale SNNs |
[25] | Self-supervised optical flow | Optic flow, event-based camera | Real time | Self-supervised neural network |
[26] | Control systems based on insects’ visuomotor control systems | Optic flow | N/A | |
[38] | Highly efficient computer vision algorithm | Optic flow | N/A | |
[161] | An actor–critic learning algorithm | Optic flow | N/A | |
[17] | A miniature hovercraft | Optic flow | N/A | |
[30] | Feedback loops | Optic flow | N/A | |
[28] | Attitude can be extracted from the optic flow | Optic flow | N/A | |
[33] | Ultralight autonomous microfliers | Optic flow, gyroscopes, anemometer, Bluetooth | N/A | |
[34] | Visuomotor control system | Optic flow | N/A | |
[35] | Minimalistic motion vision | Optic flow | N/A | |
[162] | Learning process | Optic flow | N/A | |
[37] | Optic flow-based autopilot | Optic flow | N/A | |
[27] | Adaptive peripheral visual system | Optic flow | N/A | |
[36] | Wide-field integration of optic flow | Optic flow | N/A | |
[8] | A Portable bioinspired architecture | Visual | Real time | A minimal quantity of resources |
[48] | A self-adaptive landmark-based aggregation method | Landmark | N/A | An error threshold parameter |
[51] | Landmark-tree (LT) map | Landmark and omnidirectional camera | N/A | |
[49] | A landmark vector algorithm | Landmark-based | N/A | |
[5] | Entropy-based vision and visual topological maps | Landmark and entropy-based vision | Real time | A conventional bug algorithm/metric maps |
[50] | Pan–tilt-based visual sensing system | Landmark and visual sensor | Real time | Perception-motion dynamics loop |
[39] | A bioinspired SLAM algorithm | SLAM and monocular or stereovision systems | Real time | CPU-GPU architecture |
[47] | Hierarchical look-ahead trajectory model (HiLAM) | SLAM | Real time | RatSLAM |
[40] | State estimation pipeline | SLAM | Real time | Standard frames |
[41] | Refocused events fusion | SLAM | Real time | Throughput |
[42] | Artificial neural SLAM framework | SLAM, event-based camera | N/A | |
[45] | Movement-based autonomous calibration techniques | SLAM, cameras, sonar sensors, a laser, an RGB, and a range sensor | Real time | Online sensor fusion |
[43] | Dirt-sample-gathering strategy/ACO | SLAM | Real time | Dirt-gathering efficiency |
[44] | An decentralized approach | SLAM | N/A | |
[46] | Environmental-adaptability-improved RatSLAM | SLAM | N/A | |
[6] | Visual navigation | Depth camera | Real time | Superpixel image segment algorithm |
[57] | Generate robot actions | Depth camera | Real time | Partially observable Markov decision process |
[32] | Ocular visual system | Ocular sensor and Monte Carlo | N/A | |
[53] | Distributed wireless nodes | Wireless vision sensors and mosaic eyes | Real time | Image acquisition and processing module |
[54] | Parallel mechanism-based docking system | Camera infrared sensor | Real time | Stereo camera |
[55] | Signal processing and control architectures | Visual tool | Real time | Custom OpenGL application |
[56] | Mobile robots cooperation | Camera | Real time | Visual servoing |
[52] | Intelligent recognition system | Wireless camera | Real time | Path delineation method |
[58] | Visual-Teach-and Repeat | Visual servo | N/A | |
[59] | Reconfigurable biomimetic robot | Monocular vision | Real time | GPGPU coding |
[60] | Efficient stereoscopic video matching | Stereoscopic vision | N/A | |
[61] | Modeling multirotor UAVs swarm deployment | Visual | N/A | |
[62] | V-shaped formation control | Binocular | N/A |
Paper | Contribution | Sensors | Real Time | How to Achieve Real-Time Operation |
---|---|---|---|---|
[63] | Wide-field integration (WFI) framework | LiDAR-based | N/A | |
[64] | Modular robot | LiDAR-based | Real time | Robot framework |
[78] | Tree expansion using particle swarm optimization | LiDAR-based | Real time | World model update |
[65] | LiDAR-Based Local path planning | LiDAR-based | N/A | |
[66] | Metaheuristic salp swarm algorithm and deterministic coordinated multirobot exploration | LiDAR-based | Real time | Metaheuristic algorithms |
[70] | Social potential field framework | LiDAR-based | N/A | |
[68] | Modified A-star algorithm | LiDAR-based | Real time | Modified A-star algorithm |
[71] | Motion policy; a direct mapping from observation to control | Laser ranger and omnidirectional | N/A | |
[67] | Evolutive localization filter (ELF) | Laser ranger | N/A | |
[69] | Chemical signaling | Infrared sensors | N/A | |
[73] | Generic fault-detection system | Infrared proximity sensors | N/A | |
[75] | Self-organized aggregation behavior | Infrared sensors | N/A | |
[76] | Cognitive dynamic system | Radar and sonar | N/A | |
[77] | Steering a swarm of robots | Proximity, infrared, ultrasonic sensors, and light-dependent resistor | N/A | |
[74] | A behavioral algorithm | Ultrasonic and infrared sensors and light-dependent resistor | N/A | |
[72] | Robotic platform | Short-range infrared proximity sensors | N/A | |
[79] | Control mechanisms | Distance sensor | Real time | ESP32 microcontroller |
Paper | Contribution | Sensors | Real Time | How to Achieve Real-Time Operation |
---|---|---|---|---|
[14] | A hybrid obstacle-overcoming method | Tactile sensor-based | Real time | Signal is transferred on board |
[81] | Triboelectric nanogenerators | Tactile-sensor-based | Real time | Palm structure and triboelectric nanogenerator technology |
[82] | Feedback control | Tactile-sensor-based and whisker | Real time | Whisker feedback |
[83] | Terrain-recognition-based navigation | Tactile-sensor-based and whisker | Real time | On-board reservoir computing system |
[80] | Motor learning | Tactile-sensor-based | Real time | Nonlinear recurrent SNN |
[15] | Odor-tracking algorithms with genetic programming | Olfaction-sensor-based | N/A | |
[84] | Search based on the silkworm moth | Olfaction-sensor-based | N/A | |
[86] | Multisensory-motor integration | Olfaction-sensor-based and visual | N/A | |
[87] | Odor recognition system | Olfaction-sensor-based and an SNN | N/A | |
[85] | Gas dispersal and sensing alongside vision | Olfaction-sensor-based | N/A | |
[88] | FRONTIER-multicriteria decision-making and anemotaxis-GDM | Olfaction-sensor-based, gas distribution mapping (GDM), and anemotaxis/SLAM | N/A | |
[89] | Binaural sonar sensor | Sound-based/sonar | N/A | |
[92] | Audiovisual synchrony | Sound-based/Visual | N/A | |
[11] | Enhanced vector polar histogram algorithm | Sound-based, ultrasonic sensors | N/A | |
[91] | A two-wheeled mobile robot with B-spline curves and PSO | Sound-based, ultrasonic sensors/camera | N/A | |
[90] | Sonar-based spatial orientation | Sound-based, sonar | N/A | |
[93] | FPA and BA metaheuristic | Sound-based, ultrasonic sensors | Real time | Algorithm efficiency |
[95] | SNN-based model | Sound-based | N/A | |
[96] | Curved patch mapping and tracking | IMU and RGB-D | Real time | Parametrized patch models |
[99] | Reconfigurable rolling–crawling robot | IMU and visual sensor | Real time | Remote computer for vision processing and feedback |
[100] | 4WISD reconfigurable robot | IMU, Velodyne, and LiDARs, ultrasonic sensors, absolute encoder, wire encoder, and camera. | N/A | |
[102] | Millirobot with an open-source design | IMU and camera | N/A | |
[97] | Teaching–learning-based optimization and EKF | IMU, wheel odometry, and light detection and ranging (LiDAR) | Real time | Algorithm |
[101] | Multipurpose modular snake robot | IMU | Real time | Linear discriminant analysis |
[98] | Sensor data fusion algorithm | IMU, a 2-axis inclinometer, and joint encoders | Real time | Overall control strategy |
[158] | Environment signaling system | Radiofrequency identification (RFID) | N/A | |
[159] | Collective gradient perception | UWB, laser, and camera | N/A | |
[160] | Multirobot formation with sensor fusion-based localization | UWB position system, IMU, and wheel encoders | Real time | Sensor fusion |
Paper | Contribution | Sensors | Real Time | How to Achieve Real-Time Operation |
---|---|---|---|---|
[143] | A Bioinspired endogenous attention-based architecture | Virtual | Real time | Selective attention’s correction |
[110] | Spatial association | Virtual, neural network, place cells | Real time | Neural network |
[133] | Quadrant-based approach | Virtual, neural network, place cells | N/A | |
[2] | Slow feature analysis | Vision-based, place cells and head direction cells | N/A | |
[107] | Spatial cognition model | Vision-based, place cells, grid cells and head direction cells | N/A | |
[16] | Navigation inspired by mammalian navigation | Vision-based | Real time | Prediction-oriented estimations |
[132] | Cognitive mapping model | Virtual, neural network, head direction cells, conjunctive grid cells, SLAM | Real time | Conjunctive space-by-movement attractor network |
[112] | Biologically inspired model, evolutionary algorithm | Virtual, neural network, head direction cells | N/A | |
[106] | Log-polar max-pi (LPMP) | Virtual, neural network | Real time | Visuospatial pattern |
[103] | Embedded control system | Vision-based, Neural control layer | Real time | Neural control layer |
[104] | Collision detection | Virtual, Neural network | Real time | Collision-detector neuron in locusts |
[3] | Central pattern generators (CPGs) | Virtual, neural network | Real time | Pattern generators |
[10] | Tactile probe | Tactile Sensing | N/A | |
[134] | Place recognition, RatSLAM system | Virtual, Neural network | Real time | Self-organizing neural network |
[12] | Deep-reinforcement-learning-based intelligent agent | Virtual, neural network, infrared proximity sensor | Real time | Memory-based deep reinforcement learning |
[13] | Range and event-based visual-inertial odometry | Virtual, inertial odometry, range | Real time | Sensor |
[135] | NeuroSLAM | Virtual, neural network | Real time | Neural network |
[108] | Generic neural architecture | Virtual, neural network | Real time | Online detection algorithm |
[144] | Neural dynamics and map planning | Virtual, neural network | Real time | Neural network |
[139] | Learning by imitation leads | Virtual, cognitive map | Real time | Cognitive map |
[145] | Self-organized fission–fusion control | Multimodal | N/A | |
[146] | Neurodynamics-based cascade tracking control | Multimodal | N/A | |
[105] | Nanosensor-Enhanced CNN | Virtual, neural network | N/A | |
[131] | Dynamic spatiotemporal patterns | Virtual, neural network | N/A | |
[119] | Bioinspired visual attention process using SNNs | Virtual, neural network | Real time | Restrict the data flow |
[114] | CNN-based egomotion classification framework | Virtual, neural network, compound eye | N/A | |
[125] | Minimalist sensorimotor framework | Virtual, deep learning | Real time | Minimalist philosophy |
[141] | Vision-enhanced neurocognitive structure | Virtual, neural network | N/A | |
[147] | Contrastive learning | Virtual, neural network | Real time | High efficiency |
[122] | Multisensory integration | Virtual, distance sensor | Real time | FPGA architecture |
[123] | FPGA-based embedded sensor system | Virtual, optic flow | Real time | Processing speed |
[124] | Bioinspired neural architecture | Virtual, image | Real time | FPGAs |
[113] | Parallel control model | Virtual, image | Real time | Two loops form |
[148] | Distributed recurrent neural network | Leg, neural network | Real time | Adaptive locomotion |
[150] | Stereovision-based navigation system | Virtual, fuzzy logic | Real time | Algorithm |
[153] | Type-2 Fuzzy logic | Virtual, fuzzy logic | N/A | |
[149] | Generic navigation algorithm | Neural network, onboard sensor | Real time | Proximal policy optimization |
[151] | Intelligent system, ACO | Infrared sensor, fuzzy logic | N/A | |
[154] | Multilayer feed-forward neural network | Infrared sensor, ultrasonic, neural network | Real time | Neural controller |
[140] | Visual attention system | Virtual, cognitive architecture | N/A | |
[137] | Selective area-cleaning/spot-cleaning technique | Virtual, deep learning | Real time | SSD MobileNet |
[127] | Optimized dynamical model | Virtual, grid cells | Real time | Vision-assisted map correction mechanism |
[126] | Looming spatial localization neural network | Virtual, motion-sensitive neuron | N/A | |
[138] | A novel deep learning library | Virtual, RGB-D information, deep learning | Real time | Deep learning |
[120] | Vision-based microrobot | Virtual, adaptive spiking neurons | N/A | |
[109] | Enactive vision | Virtual, neural networks, computer vision | N/A | |
[155] | Winnerless competition paradigm | Neural networks, olfactory | N/A | |
[136] | A bioinspired-neural-model-based extended Kalman filter | Neural networks, SLAM | Real time | Neural dynamic model |
[142] | Odor-supported place cell model | RL, olfactory | N/A | |
[156] | Hybrid rhythmic–reflex control method | Neural network | Real time | ZMP-based feedback loop |
[128] | Spatial memory and learning | Visual, cognitive map | N/A | |
[152] | Dynamic recurrent neurofuzzy approach | Ultrasonic, learning | Real time | Fuzzy logic |
[130] | Simple-linear-iterative-clustering-based support vector machine (SLIC-SVM), simple-linear-iterative-clustering-based SegNet | Visual, SVM | Real time | Sensor |
[129] | Hybrid supervised deep reinforcement learning | Visual, RL, Markov decision process (MDP) | Real time | SL policy network training |
[157] | Optimal functional footprint approach | Visual, camera, beacon, UWB, encoder, motor, Wi-Fi | N/A | |
[121] | A hierarchical autonomous robot controller | Visual, infrared, sound, neural network | N/A | |
[111] | Brain spatial cell firing model | IMU, neural network | N/A |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, J.; Lin, S.; Liu, A. Bioinspired Perception and Navigation of Service Robots in Indoor Environments: A Review. Biomimetics 2023, 8, 350. https://doi.org/10.3390/biomimetics8040350
Wang J, Lin S, Liu A. Bioinspired Perception and Navigation of Service Robots in Indoor Environments: A Review. Biomimetics. 2023; 8(4):350. https://doi.org/10.3390/biomimetics8040350
Chicago/Turabian StyleWang, Jianguo, Shiwei Lin, and Ang Liu. 2023. "Bioinspired Perception and Navigation of Service Robots in Indoor Environments: A Review" Biomimetics 8, no. 4: 350. https://doi.org/10.3390/biomimetics8040350
APA StyleWang, J., Lin, S., & Liu, A. (2023). Bioinspired Perception and Navigation of Service Robots in Indoor Environments: A Review. Biomimetics, 8(4), 350. https://doi.org/10.3390/biomimetics8040350