Contingency Planning of Visual Contamination for Wheeled Mobile Robots with Chameleon-Inspired Visual System
Abstract
1. Introduction
2. Description of the Chameleon-Inspired Visual System for WMRs
3. Target Search Model in CIBNCM Mode for WMRs
- —describes the target search model in CIBNCM and BCM mode for WMRs and is represented by , respectively;
- —expresses the focal length of cameras;
- —is the obtained FOV in CIBNCM and BCM mode;
- —denotes the detected target characteristic using the selective attention algorithm.
- —represents the robot’s ROI;
- —is the horizontal and vertical FOV of each camera, ;
- —denotes the overlap angle, ;
- —is the rotation time of the cameras in the horizontal and vertical directions in CIBNCM and BCM mode, . (An illustrative sketch of these search-model quantities follows this list.)
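The list above enumerates the quantities that enter the target search model (per-camera FOV, overlap angle, rotation times). As a minimal, hedged illustration of how such quantities relate, the Python sketch below computes the angular coverage and duration of one scanning sweep in which adjacent camera orientations share a fixed overlap angle. All names, and the 3 x 2 scan pattern in the example, are assumptions for illustration, not the paper's implementation.

```python
from dataclasses import dataclass

@dataclass
class SearchModeConfig:
    """Illustrative container for the search-model quantities listed above (names are assumptions)."""
    fov_h_deg: float    # horizontal FOV of one camera
    fov_v_deg: float    # vertical FOV of one camera
    overlap_deg: float  # overlap angle between adjacent camera orientations
    step_time_s: float  # rotation time per horizontal/vertical step
    n_steps_h: int      # number of horizontal orientations in one sweep
    n_steps_v: int      # number of vertical orientations in one sweep

    def swept_fov_deg(self) -> tuple[float, float]:
        """Total angular coverage of one sweep.

        Adjacent orientations share `overlap_deg`, so each extra step adds
        only (FOV - overlap) of new coverage.
        """
        h = self.fov_h_deg + (self.n_steps_h - 1) * (self.fov_h_deg - self.overlap_deg)
        v = self.fov_v_deg + (self.n_steps_v - 1) * (self.fov_v_deg - self.overlap_deg)
        return h, v

    def sweep_time_s(self) -> float:
        """Time to visit every orientation once, ignoring image-processing time."""
        return self.n_steps_h * self.n_steps_v * self.step_time_s


# Example: 60 x 45 degree cameras, 10 degree overlap, 3 x 2 orientations per sweep.
cfg = SearchModeConfig(60.0, 45.0, 10.0, 0.4, 3, 2)
print(cfg.swept_fov_deg(), cfg.sweep_time_s())
```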
4. CBR-Based Contingency Planning of Chameleon-Inspired Visual Contamination for WMRs
- is the CBR-based contingency planning space of chameleon-inspired visual contamination for WMRs;
- represents the state space and corresponding action space of visual contamination for WMRs, respectively;
- denotes the CBR-based reasoning space, and expresses each step of the CBR-based reasoning process, respectively. (A minimal sketch of this reasoning cycle follows this list.)
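Subsections 4.1 to 4.4 below follow the standard CBR cycle: case representation, case reuse, case evaluation and revision, and case retention with case-base maintenance. The sketch below is one plausible, minimal reading of that cycle for visual-contamination cases; the class and field names, the similarity measure, and the retention threshold are assumptions for illustration, not the paper's algorithm.

```python
from dataclasses import dataclass
import math

@dataclass
class ContaminationCase:
    state: list[float]  # e.g. [FOV transparency, contamination centroid x, y, ...]
    action: str         # contingency action label, e.g. "switch_camera", "clean_lens"

class CaseBase:
    """Minimal CBR cycle: retrieve -> reuse -> (revise) -> retain."""

    def __init__(self, retain_threshold: float = 0.9):
        self.cases: list[ContaminationCase] = []
        self.retain_threshold = retain_threshold  # retain only sufficiently novel cases

    @staticmethod
    def similarity(a: list[float], b: list[float]) -> float:
        """Similarity in (0, 1]; 1 means identical states."""
        return 1.0 / (1.0 + math.dist(a, b))

    def retrieve(self, query: list[float]) -> ContaminationCase | None:
        """Return the stored case most similar to the query state."""
        if not self.cases:
            return None
        return max(self.cases, key=lambda c: self.similarity(c.state, query))

    def reuse(self, query: list[float]) -> str | None:
        """Copy the action of the most similar stored case (no adaptation here)."""
        best = self.retrieve(query)
        return best.action if best else None

    def retain(self, new_case: ContaminationCase) -> None:
        """Store the case only if it differs enough from what is already known."""
        best = self.retrieve(new_case.state)
        if best is None or self.similarity(best.state, new_case.state) < self.retain_threshold:
            self.cases.append(new_case)
```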
4.1. Case Representation of Visual Contamination
4.2. Case Reuse of Visual Contamination
4.3. Case Evaluation and Revision of Visual Contamination
4.4. Case Retention and Case-Base Maintenance of Visual Contamination
5. Perception Model in Chameleon-Inspired Visual Contamination of WMRs
- is the perception space in chameleon-inspired visual contamination for WMRs, consisting of the state space and action space of visual contamination;
- is the state space of chameleon-inspired visual contamination, where represents the transparency of the FOV, the centroid position of contamination, and the camera contamination topology, respectively. (A minimal sketch of this state space follows this list.)
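The sketch below shows one way such a contamination state could be encoded, assuming normalized image coordinates for the contamination centroid and a per-camera contamination flag for the topology of a two-camera system. The field names and the severity score are illustrative assumptions, not the paper's model.

```python
from dataclasses import dataclass

@dataclass
class ContaminationState:
    """One possible encoding of the contamination state space described above."""
    transparency: float            # fraction of the FOV still usable, 0.0-1.0
    centroid: tuple[float, float]  # contamination centroid in normalized image coordinates
    topology: tuple[bool, bool]    # which of the two cameras are contaminated (left, right)

    def severity(self) -> float:
        """Illustrative score: more opaque and more central contamination is worse."""
        cx, cy = self.centroid
        centrality = 1.0 - min(1.0, ((cx - 0.5) ** 2 + (cy - 0.5) ** 2) ** 0.5 / 0.5)
        return (1.0 - self.transparency) * (0.5 + 0.5 * centrality)


# Example: left camera 40% obscured near the image center, right camera clean.
s = ContaminationState(transparency=0.6, centroid=(0.45, 0.55), topology=(True, False))
print(round(s.severity(), 3))
```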
6. Experiments and Comparison
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Parameter | Description | Value |
|---|---|---|
|  | Longitudinal distance between and | 165 mm |
|  | Vertical distance between and | 180 mm |
|  | Horizontal distance between and | 90 mm |
|  | Vertical distance between and | 79 mm |
|  | Rotation angle of neck around axis of |  |
|  | Rotation angle of cameras around axis of |  |
|  | Rotation angle of cameras around axis of |  |
|  | Rotation angle of robot around axis of |  |
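The distances that survive in the extracted table can be grouped into a single configuration object, as in the sketch below. This is purely illustrative: the parameter symbols and the rotation-angle ranges are missing from the extracted table and are therefore left as unset placeholders, and all field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class VisualPlatformGeometry:
    """Structural parameters from the table above; angle ranges are not recoverable here."""
    longitudinal_offset_mm: float = 165.0  # "Longitudinal distance between ... and ..."
    vertical_offset_a_mm: float = 180.0    # first "Vertical distance between ... and ..."
    horizontal_offset_mm: float = 90.0     # "Horizontal distance between ... and ..."
    vertical_offset_b_mm: float = 79.0     # second "Vertical distance between ... and ..."
    neck_yaw_range_deg: tuple[float, float] | None = None     # value not given in the table
    camera_pan_range_deg: tuple[float, float] | None = None   # value not given in the table
    camera_tilt_range_deg: tuple[float, float] | None = None  # value not given in the table
    robot_yaw_range_deg: tuple[float, float] | None = None    # value not given in the table


geometry = VisualPlatformGeometry()
print(geometry.longitudinal_offset_mm, geometry.horizontal_offset_mm)
```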