A Bio-Inspired Probabilistic Neural Network Model for Noise-Resistant Collision Perception
Abstract
1. Introduction
2. Related Work
2.1. Probabilistic Spiking Neural Models
2.2. Bio-inspired Collision Perception Models
3. Model Description
3.1. Temporal Processing
3.2. E-I Layer with Probability of Signal Transmission
3.3. S Layer with Probability of Signal Interaction
3.4. LGMD Cell with Probability of Signal Integration
3.5. Setting the Network Parameters
4. Experimental Design and Performance Metrics
4.1. Evaluation Criteria
4.2. Setting the Experiments
5. Results and Analysis
1. Results in simple indoor scenarios;
2. Results in complex outdoor scenarios;
3. Results in the selection of probabilistic parameters;
4. Results of further investigations.
5.1. Results under Testing of Structured Indoor Scenes
5.2. Results under Testing of Complex Vehicle Scenes
5.3. Selection of Probabilistic Parameters
5.4. Further Investigations
5.4.1. Comparison with Engineering Techniques
5.4.2. Noise Injection between Network Layers
5.5. Further Discussion
6. Conclusions
- The comparative LGMD2 model detects collisions effectively in both structured indoor and complex vehicle collision scenarios, even providing early collision warnings. In the presence of noise, however, both the LGMD2 and LGMD1 models struggle to recognize potential collisions, whereas the proposed Prob-LGMD model consistently maintains stable collision detection and strong resistance to noise.
- Compared with traditional engineering denoising techniques such as Gaussian and median filtering, the Prob-LGMD model demonstrates superior noise resistance.
- When noise is injected into intermediate network layers to simulate the noise inherent in neural signal transmission and interaction, the proposed probabilistic model still recognizes impending collisions reliably and remains robust against such internal noise.
- Based on biological insights, it is imperative to transition from deterministic motion perception models to probabilistic ones. A wealth of modeling work in the field of dynamic vision systems could benefit from incorporating similar stochastic processes; a minimal sketch of such stochastic signal transmission follows this list.
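To make the stochastic element concrete, the sketch below gates the signals passed from one layer to the next with an element-wise Bernoulli transmission probability p, in the spirit of the model's probability of signal transmission (Section 3.2). The function name, array shapes, and NumPy formulation are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def probabilistic_transmit(signal, p, rng=None):
    """Pass each element of a layer's output with probability p, zero it otherwise.

    This mimics stochastic synaptic transmission between network layers:
    a lower p means more signals are dropped on the way to the next layer.
    """
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(signal.shape) < p  # independent Bernoulli(p) gate per element
    return signal * mask

# Example: gate a hypothetical excitation map with p = 0.6,
# one of the probability settings explored in the results table below.
excitation = np.abs(np.random.default_rng(0).standard_normal((120, 160)))
gated = probabilistic_transmit(excitation, p=0.6)
```

Under these assumptions, the same kind of gating could be applied at each inter-layer connection (E-I to S, S to the LGMD cell), with p chosen within the 0∼1 range listed in the parameter table below.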
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Abbreviation | Full Name |
|---|---|
| LGMD | lobula giant movement detector |
| Prob-LGMD | probabilistic LGMD |
| pSNN | probabilistic spiking neural network |
| pSNM | probabilistic spiking neural model |
| LPLC2 | lobula plate/lobula columnar type II neuron |
| MLG1 | monostratified lobula giant type I neuron |
| LPTC | lobula plate tangential cell |
| P, E, I, S | photoreceptor, excitation, inhibition, summation |
| DR | distinct ratio |
| PNR | pepper noise ratio |
| GNV | Gaussian noise variance |
| Parameter | Description | Value |
|---|---|---|
| | Persistent luminance change duration | 0∼2 |
| | Inhibition weight | 0.3 |
| | Threshold in S-layer processing | 30 |
| | Coefficient in S-layer processing | 4 |
| | Probability parameter | 0∼1 |
| | Spatial dimension of input stimuli | adaptable |
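For illustration only, the tabulated parameters might be collected into a single configuration object as below. The field names are invented stand-ins for the paper's own symbols (which did not survive extraction), and the defaults simply echo the tabulated values.

```python
from dataclasses import dataclass

@dataclass
class ProbLGMDConfig:
    # Field names are illustrative stand-ins for the paper's parameter symbols.
    luminance_persistence: int = 2         # persistent luminance change duration (range 0~2)
    inhibition_weight: float = 0.3         # inhibition weight
    s_layer_threshold: float = 30.0        # threshold in S-layer processing
    s_layer_coefficient: float = 4.0       # coefficient in S-layer processing
    transmission_probability: float = 0.6  # probability parameter p (range 0~1)
    frame_shape: tuple = (120, 160)        # spatial dimension of input stimuli (adaptable)
```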
| Dataset | Original, p = 0.5 | Original, p = 0.6 | Original, p = 0.7 | PNR = 0.01, p = 0.5 | PNR = 0.01, p = 0.6 | PNR = 0.01, p = 0.7 | GNV = 0.007, p = 0.5 | GNV = 0.007, p = 0.6 | GNV = 0.007, p = 0.7 |
|---|---|---|---|---|---|---|---|---|---|
| No.1 | 0.34 ± 0.01 | 0.44 ± 0.01 | 0.48 ± 0.00 | 0.28 ± 0.01 | 0.34 ± 0.01 | 0.31 | 0.15 ± 0.01 | 0.16 ± 0.01 | 0.17 ± 0.01 |
| No.2 | 0.31 ± 0.02 | 0.42 ± 0.01 | 0.47 ± 0.00 | 0.16 ± 0.02 | 0.20 ± 0.01 | 0.18 ± 0.00 | 0.28 ± 0.01 | 0.33 ± 0.01 | 0.31 ± 0.00 |
| No.3 | 0.32 ± 0.01 | 0.43 ± 0.02 | 0.48 ± 0.00 | 0.26 ± 0.02 | 0.32 ± 0.00 | 0.31 ± 0.00 | 0.15 ± 0.02 | 0.19 ± 0.01 | 0.18 ± 0.00 |
| No.4 | 0.31 ± 0.02 | 0.43 ± 0.01 | 0.47 ± 0.00 | 0.13 ± 0.02 | 0.18 ± 0.01 | 0.16 ± 0.01 | 0.26 ± 0.01 | 0.32 ± 0.01 | 0.31 ± 0.00 |
| No.5 | 0.31 ± 0.01 | 0.41 ± 0.01 | 0.47 ± 0.00 | 0.26 ± 0.01 | 0.33 ± 0.01 | 0.32 ± 0.00 | 0.13 ± 0.02 | 0.17 ± 0.01 | 0.16 ± 0.01 |
| No.6 | 0.26 ± 0.03 | 0.36 ± 0.01 | 0.40 ± 0.00 | 0.19 ± 0.03 | 0.25 ± 0.01 | 0.23 ± 0.00 | 0.16 ± 0.02 | 0.20 ± 0.02 | 0.19 ± 0.01 |
| No.7 | 0.46 ± 0.01 | 0.48 ± 0.00 | 0.46 ± 0.00 | 0.37 ± 0.00 | 0.34 ± 0.00 | 0.28 ± 0.00 | 0.33 ± 0.01 | 0.30 ± 0.00 | 0.24 ± 0.00 |
| No.8 | 0.38 ± 0.02 | 0.44 ± 0.00 | 0.44 ± 0.00 | 0.31 ± 0.02 | 0.33 ± 0.00 | 0.30 ± 0.00 | 0.28 ± 0.02 | 0.30 ± 0.00 | 0.27 ± 0.00 |
| No.9 | 0.24 ± 0.01 | 0.34 ± 0.02 | 0.38 ± 0.01 | 0.20 ± 0.02 | 0.24 ± 0.01 | 0.23 ± 0.01 | 0.17 ± 0.02 | 0.22 ± 0.02 | 0.23 ± 0.01 |
| No.10 | 0.45 ± 0.01 | 0.44 ± 0.00 | 0.41 ± 0.00 | 0.38 ± 0.01 | 0.34 ± 0.00 | 0.29 ± 0.00 | 0.35 ± 0.01 | 0.31 ± 0.00 | 0.25 ± 0.00 |
| No.11 | 0.38 ± 0.01 | 0.37 ± 0.00 | 0.30 ± 0.00 | 0.33 ± 0.01 | 0.29 ± 0.00 | 0.22 ± 0.00 | 0.35 ± 0.01 | 0.33 ± 0.00 | 0.26 ± 0.00 |
| No.12 | 0.21 ± 0.02 | 0.28 ± 0.02 | 0.27 ± 0.00 | 0.29 ± 0.01 | 0.39 ± 0.01 | 0.41 ± 0.00 | 0.25 ± 0.02 | 0.29 ± 0.01 | 0.28 ± 0.00 |
| No.13 | 0.39 ± 0.00 | 0.34 ± 0.00 | 0.28 ± 0.01 | 0.47 ± 0.00 | 0.44 ± 0.00 | 0.41 ± 0.00 | 0.40 ± 0.00 | 0.34 ± 0.00 | 0.28 ± 0.00 |
| No.14 | 0.28 ± 0.01 | 0.26 ± 0.00 | 0.19 ± 0.00 | 0.37 ± 0.02 | 0.38 ± 0.00 | 0.33 ± 0.00 | 0.33 ± 0.01 | 0.30 ± 0.00 | 0.23 ± 0.00 |
| No.15 | 0.16 ± 0.01 | 0.19 ± 0.01 | 0.21 ± 0.01 | 0.23 ± 0.02 | 0.35 ± 0.01 | 0.42 ± 0.01 | 0.18 ± 0.02 | 0.24 ± 0.02 | 0.26 ± 0.01 |
| No.16 | 0.25 ± 0.02 | 0.27 ± 0.01 | 0.23 ± 0.00 | 0.32 ± 0.01 | 0.38 ± 0.01 | 0.37 ± 0.00 | 0.27 ± 0.02 | 0.30 ± 0.01 | 0.27 ± 0.00 |
| No.17 | 0.24 ± 0.02 | 0.31 ± 0.01 | 0.31 ± 0.00 | 0.29 ± 0.02 | 0.38 ± 0.02 | 0.40 ± 0.00 | 0.24 ± 0.02 | 0.26 ± 0.01 | 0.24 ± 0.00 |
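As a reading aid for the noise levels in the table header, the hedged sketch below shows one common way to corrupt grayscale input frames with pepper noise at a given PNR and zero-mean Gaussian noise at a given variance (GNV). The function names and the [0, 1] intensity convention are assumptions, not the authors' exact preprocessing code.

```python
import numpy as np

def add_pepper_noise(frame, pnr, rng=None):
    """Set a fraction `pnr` of pixels in a [0, 1] grayscale frame to black."""
    rng = np.random.default_rng() if rng is None else rng
    noisy = frame.copy()
    noisy[rng.random(frame.shape) < pnr] = 0.0
    return noisy

def add_gaussian_noise(frame, variance, rng=None):
    """Add zero-mean Gaussian noise with the given variance, clipped to [0, 1]."""
    rng = np.random.default_rng() if rng is None else rng
    noisy = frame + rng.normal(0.0, np.sqrt(variance), size=frame.shape)
    return np.clip(noisy, 0.0, 1.0)

# Noise levels matching the table header above.
frame = np.random.default_rng(0).random((120, 160))   # placeholder input frame
pepper_frame = add_pepper_noise(frame, pnr=0.01)
gauss_frame = add_gaussian_noise(frame, variance=0.007)
```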
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).