Intrarow Uncut Weed Detection Using You-Only-Look-Once Instance Segmentation for Orchard Plantations
Abstract
1. Introduction
2. Related Works
3. Materials and Methods
3.1. Image Acquisition
3.2. Image Dataset
3.3. Network Architecture
3.3.1. YOLOv5 Instance Segmentation
3.3.2. YOLOv8 Instance Segmentation
3.4. Systems for Network Training and Testing
3.5. Evaluation Metrics
3.5.1. Detection Accuracy
3.5.2. Number of Model Parameters
3.5.3. Computational Cost and Inference Time
4. Results
4.1. Training Details
4.2. Testing Results
5. Discussion
6. Conclusions
- The vision system can recognize uncut weeds within tree rows for robotic platforms. Recognition was based on YOLO instance segmentation, and several deep learning networks were evaluated to identify the most satisfactory model.
- In our experiments on orchard object recognition using instance segmentation, the YOLOv5-based models predicted faster but were less accurate. In contrast, the YOLOv8-based models achieved higher accuracy at speeds not significantly different from that of the most accurate YOLOv5 model.
- YOLOv8n-seg was the most efficient and accurate method for recognizing the weeds remaining within rows and the surrounding obstacles: within the YOLOv8 family, its lower computational cost makes it fast while remaining highly accurate. This model can therefore support intrarow weeding performed as a second operation after a riding mower, which cannot reach the weeds around tree trunks.
- This approach can be deployed on edge devices for in-field operation of autonomous robotic weeders; a minimal inference sketch using the Ultralytics API follows this list.
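As a minimal illustration of the last point, the sketch below loads a trained YOLOv8n-seg checkpoint with the Ultralytics Python API and runs instance segmentation on a single orchard image. The checkpoint path, image file, and confidence threshold are placeholders, not values taken from the paper.

```python
from ultralytics import YOLO

# Load a trained YOLOv8n-seg checkpoint (path is a placeholder).
model = YOLO("runs/segment/train/weights/best.pt")

# Run instance segmentation on one orchard image at the training resolution.
results = model.predict("orchard_row.jpg", imgsz=640, conf=0.25)

for r in results:
    if r.masks is None:
        continue  # no instances detected in this image
    for box in r.boxes:
        name = model.names[int(box.cls)]  # one of the three trained classes
        print(f"{name}: confidence {float(box.conf):.2f}")
    # r.masks.data holds one binary mask per instance (an N x H x W tensor)
    print(f"{len(r.masks)} instance masks in this frame")
```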
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
No. | Model | Image Size | Batch Size | Learning Rate | Momentum | Weight Decay | Epochs | Classes
---|---|---|---|---|---|---|---|---
1 | YOLOv5n-seg | 640 × 640 | 16 | 0.01 | 0.937 | 0.0005 | 200 | 3
2 | YOLOv5s-seg | 640 × 640 | 8 | 0.01 | 0.937 | 0.0005 | 200 | 3
3 | YOLOv8n-seg | 640 × 640 | 16 | 0.01 | 0.937 | 0.0005 | 200 | 3
4 | YOLOv8s-seg | 640 × 640 | 8 | 0.01 | 0.937 | 0.0005 | 200 | 3
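The hyperparameters above map directly onto the Ultralytics training API. The sketch below is a hedged reconstruction of the YOLOv8 runs (the YOLOv5 models would be trained from the separate YOLOv5 repository with equivalent settings); the dataset YAML path is a placeholder, not the authors' actual file.

```python
from ultralytics import YOLO

# Start from a COCO-pretrained segmentation checkpoint.
model = YOLO("yolov8n-seg.pt")

# Training configuration taken from the table above.
model.train(
    data="orchard_weeds.yaml",  # placeholder three-class dataset config
    imgsz=640,                  # 640 x 640 input images
    batch=16,                   # 16 for the nano variants, 8 for the small variants
    lr0=0.01,                   # initial learning rate
    momentum=0.937,             # SGD momentum
    weight_decay=0.0005,        # weight decay
    epochs=200,                 # training epochs
)
```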
No. | Model | Box Precision (%) | Box Recall (%) | Box mAP@0.5 (%) | Box mAP@0.5:0.95 (%) | Mask Precision (%) | Mask Recall (%) | Mask mAP@0.5 (%) | Mask mAP@0.5:0.95 (%)
---|---|---|---|---|---|---|---|---|---
1 | YOLOv5n-seg | 80.70 | 75.90 | 80.90 | 45.80 | 69.90 | 60.60 | 60.60 | 28.70
2 | YOLOv5s-seg | 83.50 | 78.20 | 82.10 | 50.00 | 76.40 | 67.40 | 69.70 | 33.40
3 | YOLOv8n-seg | 82.50 | 75.50 | 81.17 | 51.70 | 77.10 | 70.50 | 74.60 | 36.90
4 | YOLOv8s-seg | 81.20 | 78.10 | 82.40 | 53.90 | 79.50 | 71.50 | 76.70 | 39.30
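The box- and mask-level metrics above correspond to what the Ultralytics validation routine reports for segmentation models. A minimal sketch of retrieving them programmatically, assuming a placeholder checkpoint and dataset config:

```python
from ultralytics import YOLO

# Validate a trained segmentation model on the dataset's validation split.
model = YOLO("runs/segment/train/weights/best.pt")  # placeholder path
metrics = model.val(data="orchard_weeds.yaml", imgsz=640)

# Segmentation validation reports box metrics and mask metrics separately.
print(f"Box  mAP@0.5: {metrics.box.map50:.3f}  mAP@0.5:0.95: {metrics.box.map:.3f}")
print(f"Mask mAP@0.5: {metrics.seg.map50:.3f}  mAP@0.5:0.95: {metrics.seg.map:.3f}")
print(f"Mask precision: {metrics.seg.mp:.3f}  recall: {metrics.seg.mr:.3f}")
```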
No. | Model | Parameters (Millions) | GFLOPs | Inference Time (ms) | Mask mAP@0.5 (%) | Mask mAP@0.5:0.95 (%)
---|---|---|---|---|---|---
1 | YOLOv5n-seg | 1.88 | 6.70 | 25.90 | 60.60 | 28.70
2 | YOLOv5s-seg | 7.40 | 25.70 | 29.90 | 69.70 | 33.40
3 | YOLOv8n-seg | 3.26 | 12.10 | 36.92 | 74.60 | 36.90
4 | YOLOv8s-seg | 11.79 | 42.70 | 43.88 | 76.70 | 39.30
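Parameter counts and per-image inference times like those above can be reproduced from the torch module underlying each model plus simple wall-clock timing. The sketch below is one way to do it, not the authors' exact benchmarking protocol, and the test image is a placeholder:

```python
import time
from ultralytics import YOLO

model = YOLO("yolov8n-seg.pt")

# Parameter count in millions, read from the underlying torch module.
n_params = sum(p.numel() for p in model.model.parameters())
print(f"Parameters: {n_params / 1e6:.2f} M")

# model.info() also prints a layer/parameter/GFLOPs summary.
model.info()

# Rough inference time: average repeated predictions after a warm-up run.
img = "orchard_row.jpg"  # placeholder test image
model.predict(img, imgsz=640, verbose=False)  # warm-up
runs = 50
start = time.perf_counter()
for _ in range(runs):
    model.predict(img, imgsz=640, verbose=False)
elapsed_ms = (time.perf_counter() - start) / runs * 1e3
print(f"Mean inference time: {elapsed_ms:.1f} ms ({1e3 / elapsed_ms:.1f} FPS)")
```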
No. | Model | GeForce GTX 1050 Ti (FPS) | Jetson Xavier NX (FPS)
---|---|---|---
1 | YOLOv5n-seg | 48.30 | 23.75
2 | YOLOv5s-seg | 42.37 | 17.01
3 | YOLOv8n-seg | 40.48 | 16.67
4 | YOLOv8s-seg | 24.14 | 11.10
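The Jetson figures above reflect running the PyTorch models directly; on edge devices, throughput is often improved by exporting to TensorRT, which the Ultralytics API supports. A minimal sketch, assuming a placeholder checkpoint and FP16 precision (an assumed choice, not a setting reported in the paper):

```python
from ultralytics import YOLO

# Build a TensorRT engine on the target device (e.g., Jetson Xavier NX);
# engines are specific to the GPU they are compiled on.
model = YOLO("runs/segment/train/weights/best.pt")  # placeholder path
engine_path = model.export(format="engine", imgsz=640, half=True)  # FP16 assumed

# The exported engine loads and predicts just like the .pt checkpoint.
trt_model = YOLO(engine_path, task="segment")
results = trt_model.predict("orchard_row.jpg", imgsz=640)
```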