Object Detection for Yellow Maturing Citrus Fruits from Constrained or Biased UAV Images: Performance Comparison of Various Versions of YOLO Models
Abstract
1. Introduction
2. Materials and Methods
2.1. Acquisition of UAV Images
2.2. Dataset Construction
2.3. Execution Environment of Deep Learning
2.4. Overview of YOLOv5u, YOLOv8 and YOLOv9
2.5. Model Training Execution
2.6. Evaluation Metrics
2.7. Assessment of Constructed Models
3. Results and Discussion
3.1. Comparison of Models Trained Using Constrained Image Datasets
3.2. Comparison of Models Trained Using Biased Image Datasets
3.3. Comparison of Models Trained Using an Equal Number of Images Collected on Sunny and Cloudy Days
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Train or Test | Orchard Number | Acquisition Date of UAV Images | Number of Trees | Weather on Acquisition Date | Fruit Maturity on Acquisition Date | Harvesting Date |
---|---|---|---|---|---|---|
Train | 1 | 2020/10/28 | 40 | Cloudy | Completely Matured | 2020/11/06 |
Train | 1 | 2021/10/31 | 45 | Cloudy | Completely Matured | 2021/11/11 |
Test | 1 | 2022/10/27 | 15 | Cloudy | Completely Matured | 2022/11/08 |
Train | 2 | 2020/10/30 | 20 | Sunny | Completely Matured | 2020/11/04 |
Test | 2 | 2021/11/05 | 15 | Cloudy | Completely Matured | 2021/11/12 |
Train | 2 | 2022/10/27 | 20 | Cloudy | Completely Matured | 2022/11/02 |
Test | 5 | 2023/10/18 | 15 | Cloudy | Under-Matured | 2023/11/07 |
Test | 7 | 2023/10/25 | 10 | Sunny | Under-Matured | 2023/11/09 |
(a)

Dataset | Images (Train) | Images (Validation) | Images (Total) | Avg. Instances (Train) | Avg. Instances (Validation) | Avg. Instances (Total) |
---|---|---|---|---|---|---|
25-Image Dataset | 19 | 6 | 25 | 1887 | 547 | 2434 |
50-Image Dataset | 38 | 12 | 50 | 3774 | 1095 | 4869 |
75-Image Dataset | 57 | 18 | 75 | 5661 | 1642 | 7303 |
100-Image Dataset | 76 | 24 | 100 | 7548 | 2190 | 9738 |
125-Image Dataset | 100 | 25 | 125 | 9738 | 2434 | 12,172 |

(b)

Dataset | Images (Train) | Images (Validation) | Images (Total) | Avg. Instances (Train) | Avg. Instances (Validation) | Avg. Instances (Total) |
---|---|---|---|---|---|---|
Orchard 1 in 2020 Dataset | 128 | 32 | 160 | 10,819 | 2705 | 13,524 |
Orchard 1 in 2021 Dataset | 144 | 36 | 180 | 20,000 | 5000 | 25,000 |
Orchard 2 in 2020 Dataset | 64 | 16 | 80 | 3510 | 878 | 4388 |
Orchard 2 in 2022 Dataset | 64 | 16 | 80 | 4621 | 1155 | 5776 |

(c)

Dataset | Images (Train) | Images (Validation) | Images (Total) | Avg. Instances (Train) | Avg. Instances (Validation) | Avg. Instances (Total) |
---|---|---|---|---|---|---|
Approximately Equal Number of Images on Sunny and Cloudy Days | 164 | 41 | 205 | 13,248 | 3312 | 16,560 |
Model | YOLOv8m | YOLOv9c | YOLOv5mu |
---|---|---|---|
Parameters (Millions) 1 | 25.8 | 25.3 | 25.0 |
GFLOPs 1 | 78.7 | 102.3 | 64.0 |
Epochs | 400 | 400 | 400 |
Early Stopping | 50 | 50 | 50 |
Batch Size | 8 | 8 | 8 |
Image Size | 640 × 640 | 640 × 640 | 640 × 640 |
Optimizer | AdamW | AdamW | AdamW |
Learning Rate | 0.002 | 0.002 | 0.002 |
Momentum | 0.9 | 0.9 | 0.9 |
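For concreteness, the training settings above can be expressed as keyword arguments of the Ultralytics `YOLO.train()` API. This is a sketch under stated assumptions: the dataset YAML path is a hypothetical placeholder, and the actual call is therefore left commented out.

```python
# Training hyperparameters from the table above, expressed as Ultralytics
# YOLO.train() keyword arguments (dataset config path is hypothetical).
train_args = dict(
    data="citrus_uav.yaml",  # hypothetical dataset configuration file
    epochs=400,              # maximum training epochs
    patience=50,             # early stopping after 50 epochs without improvement
    batch=8,                 # batch size
    imgsz=640,               # 640 x 640 training resolution
    optimizer="AdamW",
    lr0=0.002,               # initial learning rate
    momentum=0.9,
)
# from ultralytics import YOLO
# YOLO("yolov8m.pt").train(**train_args)  # likewise "yolov9c.pt" / "yolov5mu.pt"
```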
Model | YOLOv8m | YOLOv9c | YOLOv5mu |
---|---|---|---|
Batch Size | 8 | 8 | 8 |
Image Size | 1200 × 1200 | 1200 × 1200 | 1200 × 1200 |
Confidence Score | 0.25 | 0.25 | 0.25 |
IoU | 0.50 | 0.50 | 0.50 |
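Likewise, the test conditions above map onto keyword arguments of the Ultralytics `YOLO.predict()` API; in this sketch the image directory and weights path are hypothetical, so the call itself is commented out.

```python
# Test-time detection settings from the table above, expressed as Ultralytics
# YOLO.predict() keyword arguments (source directory is hypothetical).
predict_args = dict(
    source="uav_test_images/",  # hypothetical directory of UAV frames
    batch=8,                    # batch size
    imgsz=1200,                 # 1200 x 1200 inference resolution
    conf=0.25,                  # confidence-score threshold
    iou=0.50,                   # IoU threshold for non-maximum suppression
)
# from ultralytics import YOLO
# YOLO("best.pt").predict(**predict_args)  # weights path is hypothetical
```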
Precision

Model | 25-Image ¹ | 50-Image | 75-Image | 100-Image | 125-Image |
---|---|---|---|---|---|
YOLOv8m | 78.6% ± 11.0% | 86.7% ± 2.3% | 85.4% ± 2.0% | 85.0% ± 1.6% | 87.6% ± 1.4% |
YOLOv9c | 80.9% ± 17.0% | 85.5% ± 3.7% | 84.5% ± 2.9% | 86.7% ± 1.6% | 87.7% ± 1.3% |
YOLOv5mu | 84.7% ± 1.6% | 85.3% ± 1.3% | 85.3% ± 2.3% | 85.9% ± 1.5% | 86.7% ± 1.6% |

Recall

Model | 25-Image | 50-Image | 75-Image | 100-Image | 125-Image |
---|---|---|---|---|---|
YOLOv8m | 72.3% ± 7.4% | 73.4% ± 2.2% | 73.6% ± 3.5% | 76.0% ± 3.5% | 78.1% ± 2.0% |
YOLOv9c | 73.5% ± 5.6% | 75.9% ± 2.2% | 75.3% ± 4.9% | 77.4% ± 4.6% | 80.0% ± 1.6% |
YOLOv5mu | 74.6% ± 4.3% | 76.2% ± 4.8% | 74.9% ± 3.4% | 77.5% ± 5.0% | 79.0% ± 0.9% |

F1-score

Model | 25-Image | 50-Image | 75-Image | 100-Image | 125-Image |
---|---|---|---|---|---|
YOLOv8m | 75.2% ± 8.7% | 79.5% ± 2.1% | 79.0% ± 2.4% | 80.2% ± 2.3% | 82.6% ± 1.5% |
YOLOv9c | 76.7% ± 11.3% | 80.4% ± 2.5% | 79.6% ± 3.1% | 81.7% ± 2.8% | 83.6% ± 1.0% |
YOLOv5mu | 79.3% ± 2.4% | 80.4% ± 3.0% | 79.7% ± 2.3% | 81.4% ± 2.7% | 82.7% ± 1.0% |

AP@50

Model | 25-Image | 50-Image | 75-Image | 100-Image | 125-Image |
---|---|---|---|---|---|
YOLOv8m | 80.3% ± 9.3% | 84.8% ± 2.3% | 84.7% ± 3.1% | 85.4% ± 2.4% | 88.0% ± 1.1% |
YOLOv9c | 81.6% ± 12.7% | 85.7% ± 2.1% | 85.2% ± 2.9% | 87.4% ± 2.9% | 89.0% ± 0.9% |
YOLOv5mu | 85.1% ± 2.5% | 85.9% ± 3.1% | 85.2% ± 2.4% | 86.7% ± 3.0% | 88.0% ± 1.0% |
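As a sanity check on the metric tables, F1 is the harmonic mean of precision and recall, F1 = 2PR/(P + R); the tabulated mean values are consistent with this up to small rounding and averaging effects. A minimal check:

```python
# F1 as the harmonic mean of precision and recall.
def f1_score(precision: float, recall: float) -> float:
    return 2 * precision * recall / (precision + recall)

# Example: YOLOv8m on the 125-Image dataset (P = 87.6%, R = 78.1%)
# reproduces the tabulated mean F1 of 82.6%.
print(round(f1_score(0.876, 0.781) * 100, 1))  # → 82.6
```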
(a) AP@50

Model | 25-Image ¹ | 50-Image | 75-Image | 100-Image | 125-Image |
---|---|---|---|---|---|
YOLOv8m | 83.9% ± 4.7% | 86.2% ± 1.0% | 85.8% ± 0.6% | 86.8% ± 0.4% | 88.5% ± 0.5% |
YOLOv9c | 84.4% ± 4.9% | 86.2% ± 1.0% | 85.4% ± 1.5% | 86.8% ± 0.8% | 88.7% ± 0.3% |
YOLOv5mu | 86.1% ± 0.8% | 86.0% ± 1.0% | 85.8% ± 0.6% | 87.3% ± 0.3% | 88.2% ± 0.7% |

(b) AP@50

Model | 25-Image | 50-Image | 75-Image | 100-Image | 125-Image |
---|---|---|---|---|---|
YOLOv8m | 81.1% ± 8.6% | 84.4% ± 0.8% | 84.7% ± 1.4% | 85.2% ± 0.7% | 87.6% ± 0.7% |
YOLOv9c | 81.8% ± 7.3% | 85.1% ± 1.3% | 85.0% ± 1.9% | 85.6% ± 1.0% | 87.6% ± 0.3% |
YOLOv5mu | 84.9% ± 1.1% | 85.1% ± 1.5% | 84.5% ± 0.8% | 85.8% ± 1.0% | 87.4% ± 0.3% |

(c) AP@50

Model | 25-Image | 50-Image | 75-Image | 100-Image | 125-Image |
---|---|---|---|---|---|
YOLOv8m | 71.1% ± 11.6% | 79.1% ± 2.4% | 77.5% ± 2.4% | 80.8% ± 2.4% | 80.0% ± 1.0% |
YOLOv9c | 74.0% ± 7.5% | 77.7% ± 1.2% | 78.0% ± 3.1% | 78.7% ± 1.6% | 81.5% ± 1.3% |
YOLOv5mu | 77.8% ± 3.5% | 78.0% ± 3.0% | 79.7% ± 2.1% | 80.2% ± 2.4% | 80.9% ± 2.2% |

(d) AP@50

Model | 25-Image | 50-Image | 75-Image | 100-Image | 125-Image |
---|---|---|---|---|---|
YOLOv8m | 76.5% ± 9.1% | 81.3% ± 1.6% | 80.2% ± 1.7% | 81.9% ± 1.4% | 83.2% ± 1.5% |
YOLOv9c | 77.0% ± 6.1% | 80.2% ± 1.8% | 79.7% ± 2.3% | 80.6% ± 0.8% | 84.2% ± 1.2% |
YOLOv5mu | 79.5% ± 1.8% | 80.4% ± 2.0% | 80.3% ± 1.5% | 81.4% ± 1.2% | 82.2% ± 2.8% |
Precision

Model | 125-Image ¹ | Orchard 1 in 2020 | Orchard 1 in 2021 | Orchard 2 in 2020 | Orchard 2 in 2022 |
---|---|---|---|---|---|
YOLOv8m | 87.6% ± 1.4% | 87.7% ± 1.5% | 86.1% ± 1.9% | 85.1% ± 2.3% | 83.5% ± 2.4% |
YOLOv9c | 87.7% ± 1.3% | 89.2% ± 1.1% | 87.4% ± 1.4% | 85.9% ± 3.1% | 88.0% ± 3.7% |
YOLOv5mu | 86.7% ± 1.6% | 87.8% ± 1.5% | 86.6% ± 1.5% | 87.7% ± 3.5% | 86.3% ± 5.3% |

Recall

Model | 125-Image | Orchard 1 in 2020 | Orchard 1 in 2021 | Orchard 2 in 2020 | Orchard 2 in 2022 |
---|---|---|---|---|---|
YOLOv8m | 78.1% ± 2.0% | 79.0% ± 1.6% | 76.0% ± 2.5% | 77.8% ± 2.8% | 77.2% ± 5.2% |
YOLOv9c | 80.0% ± 1.6% | 80.6% ± 1.6% | 77.7% ± 3.5% | 80.1% ± 3.0% | 78.4% ± 3.3% |
YOLOv5mu | 79.0% ± 0.9% | 79.8% ± 1.8% | 77.3% ± 2.6% | 78.5% ± 3.6% | 77.8% ± 2.8% |

F1-score

Model | 125-Image | Orchard 1 in 2020 | Orchard 1 in 2021 | Orchard 2 in 2020 | Orchard 2 in 2022 |
---|---|---|---|---|---|
YOLOv8m | 82.6% ± 1.5% | 83.1% ± 1.2% | 81.9% ± 3.6% | 81.3% ± 2.5% | 80.1% ± 3.2% |
YOLOv9c | 83.6% ± 1.0% | 84.7% ± 1.3% | 82.2% ± 2.1% | 82.9% ± 2.9% | 82.9% ± 3.2% |
YOLOv5mu | 82.7% ± 1.0% | 83.6% ± 1.1% | 81.7% ± 2.0% | 82.8% ± 3.1% | 81.8% ± 3.2% |

AP@50

Model | 125-Image | Orchard 1 in 2020 | Orchard 1 in 2021 | Orchard 2 in 2020 | Orchard 2 in 2022 |
---|---|---|---|---|---|
YOLOv8m | 88.0% ± 1.1% | 88.2% ± 1.0% | 86.6% ± 1.9% | 86.1% ± 2.1% | 85.9% ± 3.4% |
YOLOv9c | 89.0% ± 0.9% | 89.2% ± 1.0% | 87.7% ± 2.1% | 87.5% ± 3.1% | 87.6% ± 2.8% |
YOLOv5mu | 88.0% ± 1.0% | 88.6% ± 1.5% | 87.1% ± 2.1% | 87.1% ± 2.9% | 87.0% ± 3.7% |
(a) AP@50

Model | 125-Image ¹ | Orchard 1 in 2020 | Orchard 1 in 2021 | Orchard 2 in 2020 | Orchard 2 in 2022 |
---|---|---|---|---|---|
YOLOv8m | 88.5% ± 0.5% | 86.8% ± 0.9% | 87.6% ± 0.6% | 83.3% ± 0.9% | 86.6% ± 0.4% |
YOLOv9c | 88.7% ± 0.3% | 87.7% ± 0.8% | 87.9% ± 0.5% | 84.0% ± 1.2% | 87.6% ± 0.6% |
YOLOv5mu | 88.2% ± 0.7% | 86.5% ± 0.8% | 87.9% ± 0.5% | 84.3% ± 0.7% | 87.6% ± 0.6% |

(b) AP@50

Model | 125-Image | Orchard 1 in 2020 | Orchard 1 in 2021 | Orchard 2 in 2020 | Orchard 2 in 2022 |
---|---|---|---|---|---|
YOLOv8m | 87.6% ± 0.7% | 85.8% ± 0.9% | 86.6% ± 0.4% | 82.7% ± 0.5% | 84.1% ± 1.0% |
YOLOv9c | 87.6% ± 0.3% | 86.9% ± 1.0% | 87.2% ± 0.7% | 82.4% ± 2.0% | 85.1% ± 0.8% |
YOLOv5mu | 87.4% ± 0.3% | 85.7% ± 1.0% | 87.3% ± 0.6% | 83.4% ± 0.9% | 85.4% ± 0.7% |

(c) AP@50

Model | 125-Image | Orchard 1 in 2020 | Orchard 1 in 2021 | Orchard 2 in 2020 | Orchard 2 in 2022 |
---|---|---|---|---|---|
YOLOv8m | 80.0% ± 1.0% | 74.6% ± 1.8% | 76.9% ± 1.3% | 78.5% ± 2.5% | 77.9% ± 1.5% |
YOLOv9c | 81.5% ± 1.3% | 77.4% ± 1.4% | 79.6% ± 1.6% | 77.8% ± 2.2% | 78.2% ± 2.5% |
YOLOv5mu | 80.9% ± 2.2% | 75.5% ± 1.7% | 81.0% ± 0.5% | 80.8% ± 1.4% | 80.1% ± 1.2% |

(d) AP@50

Model | 125-Image | Orchard 1 in 2020 | Orchard 1 in 2021 | Orchard 2 in 2020 | Orchard 2 in 2022 |
---|---|---|---|---|---|
YOLOv8m | 83.2% ± 1.5% | 77.0% ± 1.4% | 77.4% ± 1.4% | 81.3% ± 0.5% | 80.0% ± 1.7% |
YOLOv9c | 84.2% ± 1.2% | 76.9% ± 2.1% | 77.0% ± 2.0% | 82.4% ± 0.8% | 78.5% ± 2.2% |
YOLOv5mu | 82.2% ± 2.8% | 75.3% ± 2.3% | 79.5% ± 0.6% | 83.6% ± 0.3% | 78.6% ± 1.2% |
Model | Precision (125-Image ¹) | Precision (Equal Images) | Recall (125-Image) | Recall (Equal Images) |
---|---|---|---|---|
YOLOv8m | 87.6% ± 1.4% | 86.9% ± 1.5% | 78.1% ± 2.0% | 79.1% ± 1.6% |
YOLOv9c | 87.7% ± 1.3% | 88.7% ± 0.9% | 80.0% ± 1.6% | 79.7% ± 1.4% |
YOLOv5mu | 86.7% ± 1.6% | 87.7% ± 2.1% | 79.0% ± 0.9% | 78.6% ± 0.8% |

Model | F1-score (125-Image) | F1-score (Equal Images) | AP@50 (125-Image) | AP@50 (Equal Images) |
---|---|---|---|---|
YOLOv8m | 82.6% ± 1.5% | 82.8% ± 1.5% | 88.0% ± 1.1% | 88.1% ± 1.4% |
YOLOv9c | 83.6% ± 1.0% | 84.0% ± 0.8% | 89.0% ± 0.9% | 89.1% ± 0.7% |
YOLOv5mu | 82.7% ± 1.0% | 82.9% ± 0.8% | 88.0% ± 1.0% | 88.3% ± 0.6% |
Model | Orchard 1 in 2022 (125-Image ¹) | Orchard 1 in 2022 (Equal Images) | Orchard 2 in 2021 (125-Image) | Orchard 2 in 2021 (Equal Images) |
---|---|---|---|---|
YOLOv8m | 88.5% ± 0.5% | 87.9% ± 0.9% | 87.6% ± 0.7% | 87.1% ± 0.9% |
YOLOv9c | 88.7% ± 0.3% | 88.3% ± 0.9% | 87.6% ± 0.3% | 87.4% ± 0.6% |
YOLOv5mu | 88.2% ± 0.7% | 88.4% ± 0.3% | 87.4% ± 0.3% | 87.4% ± 0.4% |

Model | Orchard 5 in 2023 (125-Image) | Orchard 5 in 2023 (Equal Images) | Orchard 7 in 2023 (125-Image) | Orchard 7 in 2023 (Equal Images) |
---|---|---|---|---|
YOLOv8m | 80.0% ± 1.0% | 81.2% ± 1.5% | 83.2% ± 1.5% | 83.5% ± 1.8% |
YOLOv9c | 81.5% ± 1.3% | 81.4% ± 1.2% | 84.2% ± 1.2% | 83.4% ± 1.3% |
YOLOv5mu | 80.9% ± 2.2% | 82.5% ± 0.5% | 82.2% ± 2.8% | 84.1% ± 1.0% |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Tanimoto, Y.; Zhang, Z.; Yoshida, S. Object Detection for Yellow Maturing Citrus Fruits from Constrained or Biased UAV Images: Performance Comparison of Various Versions of YOLO Models. AgriEngineering 2024, 6, 4308-4324. https://doi.org/10.3390/agriengineering6040243