AI Enabled IoRT Framework for Rodent Activity Monitoring in a False Ceiling Environment
Abstract
1. Introduction
- the system needs an adaptable teleoperated robot with robust locomotion, capable of operating in the dynamic, unstructured, and confined false-ceiling environment;
- the system needs a secure wireless communication framework for controlling the robot and collecting its visual feed; and
- the system needs a rodent activity detection algorithm that automatically detects rodents and rodent signs in the collected visual feed.
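The second and third requirements meet where detection results travel from the robot to the operator as structured messages over the wireless link. As an illustrative sketch only (the topic name and JSON payload schema below are assumptions, not taken from the paper), a single detection event serialized for publishing over the local MQTT broker might look like this:

```python
import json

# Hypothetical topic and payload schema for one rodent-activity detection
# event; the field names are illustrative assumptions.
DETECTION_TOPIC = "falcon/detections"

def make_detection_event(robot_id, label, confidence, bbox):
    """Serialize one detection as a JSON payload string."""
    return json.dumps({
        "robot_id": robot_id,      # which robot reported the event
        "label": label,            # e.g. "roof_rat", "droppings", "gnaw_markings"
        "confidence": confidence,  # detector score in [0, 1]
        "bbox": bbox,              # [x, y, width, height] in pixels
    })

payload = make_detection_event("falcon-01", "roof_rat", 0.93, [120, 64, 80, 40])
event = json.loads(payload)
print(event["label"], event["confidence"])
```

Keeping the payload self-describing like this lets the application layer log, display, or alert on events without knowing which detector produced them.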
2. Overview of Proposed System
2.1. IoRT Framework
Physical Layer
2.2. Network Layer
2.3. Processing Layer
Rodent Activity Detection Module
2.4. Application Layer
3. Results and Discussion
3.1. Validation of Falcon Robot Performance
3.2. Evaluation of the Rodent Activity Detection Module
3.3. Dataset Preparation and Annotations
3.4. Hardware Details
3.5. Training and Hyperparameter Tuning
3.6. Dataset Evaluation
3.7. Offline and Real-Time Tests
3.8. Evaluation with the Prototype Testbed
3.9. Experimental Comparisons
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Description | Specification |
|---|---|
| Dimensions (L × W × H) | 0.236 × 0.156 × 0.072 m |
| Weight (including battery) | 1.3 kg |
| Type of Locomotion Drive | Track |
| Top & Bottom Ground Clearance | 0.011 m, 0.011 m |
| Operating Speed | 0.1 m/s |
| Maximum Obstacle Height | 0.055 m |
| Operational Duration | 0.5–0.75 h |
| Battery | 3-cell lithium-ion |
| Operation Mode | Teleoperation (with integrated sensors to detect a fall and stop autonomously) |
| Communication Mode | Wi-Fi through local MQTT server |
| Camera Specifications (with on-board light source) | VGA 640 × 480, up to 30 fps, 60° view angle, 20 cm–infinity focusing range |
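The specification table bounds how far one charge can take the robot: at the stated 0.1 m/s operating speed over the 0.5–0.75 h operational duration, the robot can traverse roughly 180–270 m of false ceiling per battery cycle. A quick back-of-the-envelope check:

```python
# Values taken from the specification table above.
SPEED_M_PER_S = 0.1        # operating speed
DURATION_H = (0.5, 0.75)   # operational duration range per charge

# Distance covered per charge, in metres: speed * time
range_m = tuple(SPEED_M_PER_S * h * 3600 for h in DURATION_H)
print(range_m)  # ≈ (180.0, 270.0)
```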
| Class | Precision (%) | Recall (%) | F1 (%) | Accuracy (%) |
|---|---|---|---|---|
| Norway rat | 94.58 | 94.39 | 94.18 | 94.57 |
| Roof rat | 94.67 | 94.58 | 94.26 | 94.89 |
| House mouse | 95.10 | 95.05 | 95.18 | 94.94 |
| Gnaw markings | 92.76 | 92.54 | 92.89 | 92.98 |
| Droppings | 89.87 | 87.23 | 89.12 | 89.58 |
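The per-class F1 scores can be sanity-checked against precision and recall, since F1 is their harmonic mean, 2PR/(P + R). (The paper's reported F1 values may be averaged over test images rather than combined from the aggregate precision and recall, so they need not coincide exactly with this pointwise calculation.) A minimal check:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (both in percent)."""
    return 2 * precision * recall / (precision + recall)

# Norway rat row from the table: precision 94.58, recall 94.39.
# The pointwise harmonic mean lands close to, but not exactly at,
# the reported per-class F1.
print(round(f1_score(94.58, 94.39), 2))  # 94.48
```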
| Algorithm | Precision (%) | Recall (%) | F1 (%) | Accuracy (%) | Frames per Second |
|---|---|---|---|---|---|
| Faster R-CNN VGG16 | 91.22 | 90.17 | 88.22 | 89.59 | 6 |
| Faster R-CNN Inception v2 | 93.02 | 92.58 | 93.04 | 92.65 | 3 |
| YOLOv3 | 83.46 | 83.22 | 81.55 | 89.33 | 40 |
| Faster R-CNN ResNet 101 | 93.39 | 92.78 | 93.12 | 93.39 | 5 |
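The throughput column translates directly into a per-frame latency budget (assuming sequential inference): Faster R-CNN ResNet 101 at 5 fps spends about 200 ms on each frame, while YOLOv3 at 40 fps needs only 25 ms but gives up several points of precision and recall. This is the accuracy-versus-speed trade-off the comparison illustrates:

```python
def latency_ms(fps: float) -> float:
    """Per-frame inference time implied by a throughput figure, in milliseconds."""
    return 1000.0 / fps

# Two rows from the comparison table above.
for name, fps in [("Faster R-CNN ResNet 101", 5), ("YOLOv3", 40)]:
    print(f"{name}: {latency_ms(fps):.0f} ms/frame")
```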
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ramalingam, B.; Tun, T.; Mohan, R.E.; Gómez, B.F.; Cheng, R.; Balakrishnan, S.; Mohan Rayaguru, M.; Hayat, A.A. AI Enabled IoRT Framework for Rodent Activity Monitoring in a False Ceiling Environment. Sensors 2021, 21, 5326. https://doi.org/10.3390/s21165326