Revolutionizing Urban Pest Management with Sensor Fusion and Precision Fumigation Robotics
Abstract
1. Introduction
- The development of a precision fumigation robot for urban landscapes: To the authors' knowledge, no existing autonomous robot navigates urban environments to perform precision fumigation, making this an original contribution that addresses the need for precise, automated solutions in urban pest control.
- The development of a LiDAR-Vision-IMU fusion algorithm: Inspired by existing research [34,35], this contribution enhances traditional sensor fusion techniques to fit an autonomous fumigation robot, improving its ability to identify and map mosquito hotspots in real time. This adaptation enables more effective data collection and targeting of potential breeding hotspots.
2. Development of the Fumigation Robot
2.1. Setting Up the Navigation Stack
2.2. Training Hotspots Using YOLOv8
3. Mapping and Hotspot Identification
3.1. Autonomous Exploration of the Robot
3.2. Hotspot Identification Training
- The C3 module is replaced with the C2f module,
- The convolution layers numbered 10 and 14 in YOLOv5 are removed, and
- The bottleneck is modified: the first 1 × 1 convolution is replaced with a 3 × 3 convolution, and a decoupled head is used instead of the objectness branch [44].
3.3. Dynamic Map Update
Algorithm 1: Detection.py
- Initialize parameters and variables
- Define read_csv and write_csv functions
- Main program execution:
  - Start the RealSense pipeline
  - Initialize the ROS node and publishers
  - Initialize variables for pose tracking and detection
  - Start the main loop:
    - Obtain color and depth frames
    - Detect objects in the color frame using YOLO
    - Iterate over detected results:
      - Extract bounding-box coordinates and calculate the object center
      - Estimate the object depth
      - Transform the object pose to the map frame
      - Read existing data from the CSV file
      - Compare the current pose with existing entries: if a match is found, update the object's score; otherwise, append a new entry with the object's pose and prev_score
      - Write the updated data back to the CSV file
    - Visualize detected objects and scores on the color image and display the annotated image
  - Wait for user input to exit the program
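The CSV bookkeeping at the heart of Algorithm 1 — match a detected pose against stored hotspots and either raise the score or append a new row — can be sketched in plain Python. The matching radius, column layout, and initial score below are our assumptions for illustration, not values taken from the paper:

```python
import csv
import math
import os

MATCH_RADIUS_M = 0.5   # assumed: poses closer than this count as the same hotspot
PREV_SCORE = 1         # assumed initial score for a newly observed hotspot

def update_hotspots(csv_path, label, x, y):
    """Read the hotspot CSV, bump the score of a matching entry,
    or append a new one. Assumed columns: label, x, y, score."""
    rows = []
    if os.path.exists(csv_path):
        with open(csv_path, newline="") as f:
            rows = [r for r in csv.reader(f) if r]

    matched = False
    for r in rows:
        same_label = r[0] == label
        close = math.hypot(float(r[1]) - x, float(r[2]) - y) < MATCH_RADIUS_M
        if same_label and close:
            r[3] = str(int(r[3]) + 1)   # repeated detection: raise the score
            matched = True
            break
    if not matched:
        rows.append([label, str(x), str(y), str(PREV_SCORE)])

    with open(csv_path, "w", newline="") as f:
        csv.writer(f).writerows(rows)
    return rows
```

Seeing the same drain twice within half a metre raises its score; a pot detected elsewhere gets its own row. The real node would call this once per detection inside the main loop, after transforming the pose to the map frame.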
4. Results and Discussion
4.1. Performance Metric Analysis
4.2. Test Site Description
4.3. Hotspots Identification and Plotting on the Map
4.4. Comparison of Existing Robots with the Proposed Robot
4.5. Challenges, Possible Solutions, and Future Research Directions
- Navigation in complex environments: The differential drive platform suits urban environments with even, spacious surfaces. Tracked platforms handle varied terrain but are heavy and bulky, consume more energy, and require frequent recharging. A modular robot that can change its locomotion system to match the environment is another future research direction.
- Battery life is a major limitation restricting the robot’s operational time and necessitating frequent recharging. One possible solution is to deploy frequent, preferably wireless, charging stations that allow full autonomy.
- Renewable energy sources for charging robots: Research into renewable sources such as solar power could extend the operational time of robots and reduce their environmental impact, while more energy-efficient robot designs would improve operational longevity and reduce the need for frequent recharging.
- Varied environments: Different environments may require different fumigation strategies, such as discrete fumigating motion, continuous fumigation motion, and 360° fumigating motion. Using advanced machine learning techniques and identifying the environment, a decision can be made to opt for the required strategy.
- Navigating multiple floors in a building: This challenge can be addressed by integrating the robot’s control with the building management system, for example to call lifts and open doors. However, this requires planning from the initial design stage to make the building’s infrastructure robot-friendly.
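The strategy-selection idea in the varied-environments challenge above can be made concrete with a toy dispatcher. In practice the environment label would come from a learned classifier; every category name and mapping here is a hypothetical placeholder:

```python
def select_strategy(environment):
    """Map an environment label to a fumigation strategy.
    Labels and mapping are illustrative stand-ins for the
    machine-learning decision step proposed in the text."""
    mapping = {
        "narrow_corridor": "discrete",    # burst fumigation at each hotspot
        "open_ground": "continuous",      # fumigate while moving
        "enclosed_room": "360_degree",    # rotate in place to cover the room
    }
    return mapping.get(environment, "discrete")  # assumed conservative default
```

A learned model would replace the lookup table, but the interface — environment in, one of the three fumigating motions out — stays the same.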
4.6. Limitations
- The trained model may underperform in scenarios significantly different from its training environment, restricting its effectiveness in unfamiliar urban landscapes.
- YOLOv8 needs robust computational resources, which could limit the deployment of the robot in settings with limited processing capabilities.
- Optimizing for real-time performance compromises the detection accuracy, which is critical for precise localization of mosquito breeding sites.
- The model’s effectiveness decreases with smaller objects, which could be crucial in identifying less conspicuous breeding grounds.
- Vibrations during locomotion affect detection accuracy, leading to blurred images being sent for object recognition.
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
1. Dahmana, H.; Mediannikov, O. Mosquito-borne diseases emergence/resurgence and how to effectively control it biologically. Pathogens 2020, 9, 310.
2. Franklinos, L.H.; Jones, K.E.; Redding, D.W.; Abubakar, I. The effect of global change on mosquito-borne disease. Lancet Infect. Dis. 2019, 19, e302–e312.
3. Huang, Y.; Higgs, S.; Vanlandingham, D. Arbovirus-mosquito vector-host interactions and the impact on transmission and disease pathogenesis of arboviruses. Front. Microbiol. 2019, 10, 22.
4. Nguyen-Tien, T.; Lundkvist, Å.; Lindahl, J. Urban transmission of mosquito-borne flaviviruses–a review of the risk for humans in Vietnam. Infect. Ecol. Epidemiol. 2019, 9, 1660129.
5. Meng, S.; Delnat, V.; Stoks, R. Mosquito larvae that survive a heat spike are less sensitive to subsequent exposure to the pesticide chlorpyrifos. Environ. Pollut. 2020, 265, 114824.
6. Singapore Government. National Environment Agency. Available online: https://www.nea.gov.sg/ (accessed on 2 January 2022).
7. Dom, N.C.; Ahmad, A.H.; Ismail, R. Habitat characterization of Aedes sp. breeding in urban hotspot area. Procedia-Soc. Behav. Sci. 2013, 85, 100–109.
8. Wilke, A.B.; Vasquez, C.; Carvajal, A.; Moreno, M.; Fuller, D.O.; Cardenas, G.; Petrie, W.D.; Beier, J.C. Urbanization favors the proliferation of Aedes aegypti and Culex quinquefasciatus in urban areas of Miami-Dade County, Florida. Sci. Rep. 2021, 11, 22989.
9. Liew, C.; Soh, L.T.; Chen, I.; Ng, L.C. Public sentiments towards the use of Wolbachia-Aedes technology in Singapore. BMC Public Health 2021, 21, 1417.
10. Ahmed, T.; Hyder, M.Z.; Liaqat, I.; Scholz, M. Climatic conditions: Conventional and nanotechnology-based methods for the control of mosquito vectors causing human health issues. Int. J. Environ. Res. Public Health 2019, 16, 3165.
11. Wilke, A.B.; Vasquez, C.; Carvajal, A.; Moreno, M.; Petrie, W.D.; Beier, J.C. Evaluation of the effectiveness of BG-Sentinel and CDC light traps in assessing the abundance, richness, and community composition of mosquitoes in rural and natural areas. Parasites Vectors 2022, 15, 51.
12. Jhaiaun, P.; Panthawong, A.; Saeung, M.; Sumarnrote, A.; Kongmee, M.; Ngoen-Klan, R.; Chareonviriyaphap, T. Comparing light-emitting-diode light traps for catching Anopheles mosquitoes in a forest setting, Western Thailand. Insects 2021, 12, 1076.
13. Barrera, R. New tools for Aedes control: Mass trapping. Curr. Opin. Insect Sci. 2022, 52, 100942.
14. Ong, J.; Chong, C.-S.; Yap, G.; Lee, C.; Abdul Razak, M.A.; Chiang, S.; Ng, L.-C. Gravitrap deployment for adult Aedes aegypti surveillance and its impact on dengue cases. PLoS Negl. Trop. Dis. 2020, 14, e0008528.
15. Bertola, M.; Fornasiero, D.; Sgubin, S.; Mazzon, L.; Pombi, M.; Montarsi, F. Comparative efficacy of BG-Sentinel 2 and CDC-like mosquito traps for monitoring potential malaria vectors in Europe. Parasites Vectors 2022, 15, 160.
16. Namango, I.H.; Marshall, C.; Saddler, A.; Ross, A.; Kaftan, D.; Tenywa, F.; Makungwa, N.; Odufuwa, O.G.; Ligema, G.; Ngonyani, H. The Centres for Disease Control light trap (CDC-LT) and the human decoy trap (HDT) compared to the human landing catch (HLC) for measuring Anopheles biting in rural Tanzania. Malar. J. 2022, 21, 181.
17. Jaffal, A.; Fite, J.; Baldet, T.; Delaunay, P.; Jourdain, F.; Mora-Castillo, R.; Olive, M.-M.; Roiz, D. Current evidences of the efficacy of mosquito mass-trapping interventions to reduce Aedes aegypti and Aedes albopictus populations and Aedes-borne virus transmission. PLoS Negl. Trop. Dis. 2023, 17, e0011153.
18. Pan, C.-Y.; Cheng, L.; Liu, W.-L.; Su, M.P.; Ho, H.-P.; Liao, C.-H.; Chang, J.-H.; Yang, Y.-C.; Hsu, C.-C.; Huang, J.-J. Comparison of fan-traps and gravitraps for Aedes mosquito surveillance in Taiwan. Front. Public Health 2022, 10, 778736.
19. Singapore Government. National Environment Agency. Available online: https://www.nea.gov.sg/our-services/pest-control/fumigation (accessed on 21 July 2024).
20. Park, M.-G.; Choi, J.; Hong, Y.-S.; Park, C.G.; Kim, B.-G.; Lee, S.-Y.; Lim, H.-J.; Mo, H.-h.; Lim, E.; Cha, W. Negative effect of methyl bromide fumigation work on the central nervous system. PLoS ONE 2020, 15, e0236694.
21. Nelsen, J.A.; Yee, D.A. Mosquito larvicides disrupt behavior and survival rates of aquatic insect predators. Hydrobiologia 2022, 849, 4823–4835.
22. Bravo, D.T.; Lima, G.A.; Alves, W.A.L.; Colombo, V.P.; Djogbenou, L.; Pamboukian, S.V.D.; Quaresma, C.C.; de Araujo, S.A. Automatic detection of potential mosquito breeding sites from aerial images acquired by unmanned aerial vehicles. Comput. Environ. Urban Syst. 2021, 90, 101692.
23. Hanif, A.S.; Han, X.; Yu, S.-H. Independent control spraying system for UAV-based precise variable sprayer: A review. Drones 2022, 6, 383.
24. Oğuz-Ekim, P. TDOA based localization and its application to the initialization of LiDAR based autonomous robots. Robot. Auton. Syst. 2020, 131, 103590.
25. Nasir, F.E.; Tufail, M.; Haris, M.; Iqbal, J.; Khan, S.; Khan, M.T. Precision agricultural robotic sprayer with real-time tobacco recognition and spraying system based on deep learning. PLoS ONE 2023, 18, e0283801.
26. Sun, Q.; Chen, J.; Zhou, L.; Ding, S.; Han, S. A study on ice resistance prediction based on deep learning data generation method. Ocean Eng. 2024, 301, 117467.
27. Preethi, P.; Mamatha, H.R. Region-based convolutional neural network for segmenting text in epigraphical images. Artif. Intell. Appl. 2023, 1, 119–127.
28. Akande, T.O.; Alabi, O.O.; Ajagbe, S.A. A deep learning-based CAE approach for simulating 3D vehicle wheels under real-world conditions. Artif. Intell. Appl. 2022, 1–11.
29. Baltazar, A.R.; Santos, F.N.d.; Moreira, A.P.; Valente, A.; Cunha, J.B. Smarter robotic sprayer system for precision agriculture. Electronics 2021, 10, 2061.
30. Wang, B.; Yan, Y.; Lan, Y.; Wang, M.; Bian, Z. Accurate detection and precision spraying of corn and weeds using the improved YOLOv5 model. IEEE Access 2023, 11, 29868–29882.
31. Hu, C.; Xie, S.; Song, D.; Thomasson, J.A.; Hardin IV, R.G.; Bagavathiannan, M. Algorithm and system development for robotic micro-volume herbicide spray towards precision weed management. IEEE Robot. Autom. Lett. 2022, 7, 11633–11640.
32. Fan, X.; Chai, X.; Zhou, J.; Sun, T. Deep learning based weed detection and target spraying robot system at seedling stage of cotton field. Comput. Electron. Agric. 2023, 214, 108317.
33. Hassan, M.U.; Ullah, M.; Iqbal, J. Towards autonomy in agriculture: Design and prototyping of a robotic vehicle with seed selector. In Proceedings of the 2016 2nd International Conference on Robotics and Artificial Intelligence (ICRAI), Rawalpindi, Pakistan, 1–2 November 2016; pp. 37–44.
34. Zhang, C.; Lei, L.; Ma, X.; Zhou, R.; Shi, Z.; Guo, Z. Map construction based on LiDAR vision inertial multi-sensor fusion. World Electr. Veh. J. 2021, 12, 261.
35. Liu, Z.; Li, Z.; Liu, A.; Shao, K.; Guo, Q.; Wang, C. LVI-Fusion: A robust lidar-visual-inertial SLAM scheme. Remote Sens. 2024, 16, 1524.
36. Lee, J.; Hwang, S.; Kim, W.J.; Lee, S. SAM-Net: LiDAR depth inpainting for 3D static map generation. IEEE Trans. Intell. Transp. Syst. 2021, 23, 12213–12228.
37. Shan, T.; Englot, B.; Meyers, D.; Wang, W.; Ratti, C.; Rus, D. LIO-SAM: Tightly-coupled lidar inertial odometry via smoothing and mapping. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021; pp. 5135–5142.
38. Nourbakhsh, I.; Powers, R.; Birchfield, S. DERVISH an office-navigating robot. AI Mag. 1995, 16, 53.
39. Abiyev, R.H.; Günsel, I.; Akkaya, N.; Aytac, E.; Çağman, A.; Abizada, S. Robot soccer control using behaviour trees and fuzzy logic. Procedia Comput. Sci. 2016, 102, 477–484.
40. Girshick, R. Fast R-CNN. In Proceedings of the 2015 IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015; pp. 1440–1448.
41. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE Trans. Pattern Anal. Mach. Intell. 2016, 39, 1137–1149.
42. Yan, J.; Lei, Z.; Wen, L.; Li, S.Z. The fastest deformable part model for object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 2497–2504.
43. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
44. Talaat, F.M.; ZainEldin, H. An improved fire detection approach based on YOLO-v8 for smart cities. Neural Comput. Appl. 2023, 35, 20939–20954.
45. What is YOLOv8? The Ultimate Guide. Available online: https://blog.roboflow.com/whats-new-in-yolov8/#what-is-yolov8 (accessed on 10 July 2024).
46. Minakshi, M.; Bhuiyan, T.; Kariev, S.; Kaddumukasa, M.; Loum, D.; Stanley, N.B.; Chellappan, S.; Habomugisha, P.; Oguttu, D.W.; Jacob, B.G. High-accuracy detection of malaria mosquito habitats using drone-based multispectral imagery and Artificial Intelligence (AI) algorithms in an agro-village peri-urban pastureland intervention site (Akonyibedo) in Unyama SubCounty, Gulu District, Northern Uganda. J. Public Health Epidemiol. 2020, 12, 202–217.
47. Jeyabal, S.; Sachinthana, W.; Bhagya, S.; Samarakoon, P.; Elara, M.R.; Sheu, B.J. Hard-to-detect obstacle mapping by fusing LIDAR and depth camera. IEEE Sens. J. 2024, 24, 24690–24698.
Aspect | Existing Methods | Proposed Work
---|---|---
Technology Used | |
Primary Limitations | |
Efficiency | Varies significantly with manual fumigation and trap effectiveness | High efficiency due to automated, precise detection and fumigation of hotspots
Health Impact | Potential health risks due to chemical exposure | Reduced risk to human operators by automating chemical fumigation
Environmental Impact | Potential for chemical dispersal affecting non-target areas | Focused application of chemicals, reducing environmental footprint
Navigation and Mapping | Limited in non-open areas like indoor environments or dense urban settings | Advanced navigation using 3D-LiDAR and the LIO-SAM algorithm, improving accuracy in complex environments
Hotspot Identification | Relies heavily on manual inspection and stationary traps | Automated real-time identification and remapping, increasing responsiveness to changing conditions
Operational Strategy | Static, with periodic manual adjustments | Dynamic, with ongoing adjustments based on real-time data collection
Cost | Lower initial cost but higher over time due to labor and repeated interventions | Higher initial investment but lower ongoing costs due to automation
Adaptability | Limited adaptability to new breeding grounds without manual intervention | High adaptability with continuous learning and updating capabilities
Product | Specifications |
---|---|
Motors | Oriental Motor BLHM450KC-30
IMU | Vectornav VN-100 |
Voltage regulator | DDR-480C-24, DDR-240C-24 |
2D LiDAR | SICK TiM581-2050101 |
3D LiDAR | Hesai QT128 |
Depth camera | Intel RealSense D435i |
Industrial PC (IPC) | Nuvo-10108GC-RTX3080 |
Battery | 48 V, 25 Ah, Lithium Iron Phosphate |
Fogging unit | 10 L tank, 50-micron droplet size, and flow rate 330 mL/min |
Component | Detailed Specification |
---|---|
CPU | Intel i7 12th-Gen Core 65 W LGA1700 CPU |
RAM | 64 GB DDR5 4800 MHz |
Graphics | NVIDIA RTX 4080 16 GB |
SSD | NVMe SSD 2 TB Gen4 M.2 2280 |
Temperature | Rugged, −25 °C to 60 °C operation |
DC Input | 3-pin + 4-pin pluggable terminal block for 8 V to 48 V DC input with ignition control
Humidity | 10–90%, non-condensing
Vibration and Shock absorption | MIL-STD-810H, Method 514.8, Category 4 (with damping bracket) |
Type | Precision | Recall | F1 Score | mAP@0.5 (%) |
---|---|---|---|---|
All | 0.81 | 0.69 | 0.74 | 0.74 |
Cooler | 0.82 | 0.84 | 0.83 | 0.93 |
Drain | 0.95 | 0.91 | 0.93 | 0.94 |
Toilet | 0.87 | 0.88 | 0.88 | 0.94 |
Dustbin | 0.75 | 0.51 | 0.61 | 0.62 |
Pot | 0.99 | 1 | 0.99 | 0.99 |
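The F1 column in the table above is the harmonic mean of precision and recall, F1 = 2PR/(P + R). A quick check against the drain and cooler rows (P and R taken from the table; the helper name is ours):

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Drain row: P = 0.95, R = 0.91 -> F1 rounds to 0.93, matching the table
print(round(f1_score(0.95, 0.91), 2))
# Cooler row: P = 0.82, R = 0.84 -> F1 rounds to 0.83, matching the table
print(round(f1_score(0.82, 0.84), 2))
```

The same formula reproduces the F1 values in the model-comparison table below to within rounding.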
Model | Precision (P) | Recall (R) | F1 Score (F1) | mAP@0.5 (%) |
---|---|---|---|---|
Faster RCNN | 0.56 | 0.61 | 0.58 | 0.57 |
YOLOv5 | 0.73 | 0.60 | 0.65 | 0.69 |
YOLOv7 | 0.70 | 0.58 | 0.63 | 0.61 |
YOLOv8 (current model) | 0.81 | 0.71 | 0.75 | 0.74 |
Aspect | Continuous Fumigation | Precision Fumigation |
---|---|---|
Area Covered | Entire area (20 m²) | Identified hotspots (3 hotspots)
Robot Speed | 0.5 m/s | 0.5 m/s |
Fumigation Time per m² | 2 s | 5 s per hotspot
Total Fumigation Time | 40 s | 15 s |
Chemical Usage Rate | 330 mL/min | 330 mL/min |
Total Chemical Used | 220 mL | 82.5 mL
Chemical Savings | N/A | 62.5% (137.5 mL less than continuous) |
Environmental Impact | Higher due to full area coverage | Lower due to targeted application |
Efficiency | Lower, as it treats the entire area | Higher, with focused treatment of hotspots |
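The chemical figures in the table above follow directly from the stated rates: 330 mL/min over 40 s of continuous fumigation versus over 3 × 5 s of targeted fumigation. A small check (variable names are ours):

```python
FLOW_RATE_ML_PER_S = 330 / 60           # 330 mL/min -> 5.5 mL/s

continuous_time_s = 20 * 2              # 20 m² at 2 s per m² = 40 s
precision_time_s = 3 * 5                # 3 hotspots at 5 s each = 15 s

continuous_ml = FLOW_RATE_ML_PER_S * continuous_time_s   # 220.0 mL
precision_ml = FLOW_RATE_ML_PER_S * precision_time_s     # 82.5 mL

savings_ml = continuous_ml - precision_ml                # 137.5 mL
savings_pct = 100 * savings_ml / continuous_ml           # 62.5%
```

Both the 137.5 mL saving and the 62.5% figure quoted in the table fall out of this arithmetic.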
Aspect | Ref. [29] | Ref. [30] | Ref. [31] | Ref. [32] | Proposed Robot |
---|---|---|---|---|---|
Robot developed | |||||
AI Integration | SVM classifier for leaf density | Improved YOLOv5 model with attention mechanisms | Novel scene representation and motion planning | Deep learning model with CBAM and BiFPN | YOLOv8 for dynamic hotspot localization |
Urban Navigation (Complex Environment) | No (focused on agricultural fields) | No (focused on agricultural fields) | No (designed for early-stage crops) | No (agricultural fields only) | Yes (specifically designed for urban landscapes) |
Real-Time Operation | Yes, but limited to agricultural fields | Yes, real-time performance with 30 ms/frame detection speed | Yes, sub-centimeter precision in spraying | Yes, real-time detection and spraying | Yes, with real-time mapping and fumigation |
Mapping Capabilities | No | No | Limited | No | Dynamic mapping |
Precision Targeting | High precision in spraying based on leaf density | High precision for corn and weed identification | High precision for micro-volume spraying | High precision | High precision for mosquito hotspots
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Jeyabal, S.; Vikram, C.; Chittoor, P.K.; Elara, M.R. Revolutionizing Urban Pest Management with Sensor Fusion and Precision Fumigation Robotics. Appl. Sci. 2024, 14, 7382. https://doi.org/10.3390/app14167382