AriAplBud: An Aerial Multi-Growth Stage Apple Flower Bud Dataset for Agricultural Object Detection Benchmarking
Abstract
1. Summary
- The influence of network size on YOLOv4 performance [15];
- YOLOv4 sensitivity analysis regarding white noise, motion blur, hue shift, saturation change, and intensity change in test images, as well as dataset size, annotation quality, negative sample presence, image sequence, and image distortion level in training datasets [18];
- YOLOv7 and YOLOv4 accuracy comparison under varying levels of training image annotation quality [17].
2. Dataset Creation
2.1. Orchard Location
2.2. Data Collection
2.3. Image Annotation
- Annotate an apple flower bud as a tip when the leaves of the flower bud are not open at all;
- Annotate an apple flower bud as a half-inch green when the leaves of the flower bud are open but the florets of the flower bud are not visible;
- Annotate an apple flower bud as a tight cluster when the florets of the flower bud are visible but show no pink color;
- Annotate an apple flower bud as a pink when the florets of the flower bud show pink color, or annotate an apple flower as a pink when the petals of the flower have not yet turned mostly white;
- Annotate an apple flower as a bloom when the petals of the flower are mostly white or at least one petal is attached to the receptacle of the flower;
- Annotate an apple flower bud as a petal fall when no petals are attached to the receptacles of any of its flowers, or annotate an apple flower as a petal fall when no petals are attached to the receptacle of the flower;
- For stages at or before tight cluster, include the entire apple flower bud in a bounding box;
- For stages at or after pink, exclude the apple flower bud leaves from a bounding box;
- For stages at or before tight cluster, each bounding box should contain only one apple flower bud;
- For bloom stage, each bounding box should contain only one apple flower;
- For pink and petal fall stages, annotate the flowers that belong to the same apple flower bud and are relatively close to each other in one bounding box; otherwise, annotate each flower with one bounding box;
- When an apple flower bud or apple flower is known to exist but its complete shape cannot be identified due to image blurriness or dense flower bud distribution, do not annotate it;
- When in doubt whether an apple bud is a vegetative bud or a flower bud, do not annotate it.
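The growth stage rules above form an ordered decision procedure: each stage is ruled out in turn by a visual observation. This can be sketched as a minimal Python function; the boolean observation names below are purely illustrative stand-ins for the annotator's visual judgments, not fields in the dataset:

```python
def growth_stage(leaves_open: bool, florets_visible: bool, florets_pink: bool,
                 petals_mostly_white: bool, petals_attached: bool) -> str:
    """Illustrative sketch of the annotation decision rules, checked in order."""
    if not leaves_open:
        return "tip"             # leaves not open at all
    if not florets_visible:
        return "half-inch green" # leaves open, florets not visible
    if not florets_pink:
        return "tight cluster"   # florets visible, no pink color
    if not petals_mostly_white:
        return "pink"            # pink florets, petals not yet mostly white
    if petals_attached:
        return "bloom"           # mostly white petals still attached
    return "petal fall"          # no petals attached to the receptacle
```

The ordering matters: each branch assumes the earlier stages were already ruled out, mirroring how the written rules progress from tip to petal fall.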
3. Dataset Description
- <object_class> <x_center> <y_center> <x_width> <y_height>,
- <object_class> is the zero-based index for apple flower bud growth stage, ranging from 0 to 5 as an integer and representing tip, half-inch green, tight cluster, pink, bloom, and petal fall, respectively;
- <x_center> is the bounding box center horizontal position relative to image width, ranging from 0 to 1 as a floating-point number;
- <y_center> is the bounding box center vertical position relative to image height, ranging from 0 to 1 as a floating-point number;
- <x_width> is the bounding box width relative to image width, ranging from 0 to 1 as a floating-point number;
- <y_height> is the bounding box height relative to image height, ranging from 0 to 1 as a floating-point number.
- 2 0.342578 0.346720 0.172656 0.305221
- 2 0.569531 0.209505 0.137500 0.255689
- 2 0.471094 0.212182 0.101563 0.210174
- 1 0.589453 0.896252 0.078906 0.089692
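The example label lines above can be decoded back into pixel coordinates with a few lines of Python. Since all five fields are normalized, only the image resolution is needed; the 1280 × 720 frame used in the example call is a placeholder, and the actual resolution should be read from the AriAplBud images themselves:

```python
# Class index -> growth stage name, per the dataset definition above
CLASS_NAMES = ["tip", "half-inch green", "tight cluster", "pink", "bloom", "petal fall"]

def yolo_to_pixels(line: str, img_w: int, img_h: int):
    """Convert one YOLO-format label line to (stage, x_min, y_min, x_max, y_max) in pixels."""
    cls, xc, yc, w, h = line.split()
    xc, yc, w, h = float(xc), float(yc), float(w), float(h)
    x_min = (xc - w / 2) * img_w
    y_min = (yc - h / 2) * img_h
    x_max = (xc + w / 2) * img_w
    y_max = (yc + h / 2) * img_h
    return CLASS_NAMES[int(cls)], x_min, y_min, x_max, y_max

# Example: the first label line above, assuming a hypothetical 1280 x 720 frame
stage, *box = yolo_to_pixels("2 0.342578 0.346720 0.172656 0.305221", 1280, 720)
print(stage, [round(v, 1) for v in box])
```

Note that the center/size representation means the box corners are derived, not stored, which is why annotation tools and detectors operating on resized images can share the same label files unchanged.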
4. Dataset Utilization
4.1. Preparation
4.2. Model Development
4.3. Results
5. Dataset Characteristics
5.1. Class Imbalance
5.2. Annotation per Image
5.3. Bounding Box Size
5.4. Annotation Error
5.5. Annotation Style Inconsistency
5.6. Image Similarity
5.7. Positive and Negative Sample Dissimilarity
5.8. Flight Height Inconsistency
5.9. Blurred Image
5.10. Artificial Object
Supplementary Materials
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Zou, Z.; Chen, K.; Shi, Z.; Guo, Y.; Ye, J. Object Detection in 20 Years: A Survey. Proc. IEEE 2023, 111, 257–276. [Google Scholar] [CrossRef]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet Classification with Deep Convolutional Neural Networks. Commun. ACM 2017, 60, 84–90. [Google Scholar] [CrossRef]
- Dhillon, A.; Verma, G.K. Convolutional Neural Network: A Review of Models, Methodologies and Applications to Object Detection. Prog. Artif. Intell. 2020, 9, 85–112. [Google Scholar] [CrossRef]
- Ramachandran, A.; Sangaiah, A.K. A Review on Object Detection in Unmanned Aerial Vehicle Surveillance. Int. J. Cogn. Comput. Eng. 2021, 2, 215–228. [Google Scholar] [CrossRef]
- Wolf, S.A.; Wood, S.D. Precision Farming: Environmental Legitimation, Commodification of Information, and Industrial Coordination. Rural Sociol. 1997, 62, 180–206. [Google Scholar] [CrossRef]
- Deng, J.; Dong, W.; Socher, R.; Li, L.-J.; Li, K.; Li, F.-F. ImageNet: A Large-Scale Hierarchical Image Database. In Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009; pp. 248–255. [Google Scholar] [CrossRef]
- Lin, T.-Y.; Maire, M.; Belongie, S.; Bourdev, L.; Girshick, R.; Hays, J.; Perona, P.; Ramanan, D.; Zitnick, C.L.; Dollár, P. Microsoft COCO: Common Objects in Context. In Proceedings of the 13th European Conference on Computer Vision (ECCV), Zurich, Switzerland, 6–12 September 2014; pp. 740–755. [Google Scholar]
- Everingham, M.; Van Gool, L.; Williams, C.K.I.; Winn, J.; Zisserman, A. The Pascal Visual Object Classes (VOC) Challenge. Int. J. Comput. Vis. 2010, 88, 303–338. [Google Scholar] [CrossRef]
- Zou, H.; Lu, H.; Li, Y.; Liu, L.; Cao, Z. Maize Tassels Detection: A Benchmark of the State of the Art. Plant Methods 2020, 16, 108. [Google Scholar] [CrossRef]
- Dang, F.; Chen, D.; Lu, Y.; Li, Z.; Zheng, Y. DeepCottonWeeds (DCW): A Novel Benchmark of YOLO Object Detectors for Weed Detection in Cotton Production Systems. In Proceedings of the 2022 ASABE Annual International Meeting, Houston, TX, USA, 17–20 July 2022; p. 2200214. [Google Scholar]
- Dang, F.; Chen, D.; Lu, Y.; Li, Z. YOLOWeeds: A Novel Benchmark of YOLO Object Detectors for Multi-Class Weed Detection in Cotton Production Systems. Comput. Electron. Agric. 2023, 205, 107655. [Google Scholar] [CrossRef]
- Hani, N.; Roy, P.; Isler, V. MinneApple: A Benchmark Dataset for Apple Detection and Segmentation. IEEE Robot. Autom. Lett. 2020, 5, 852–858. [Google Scholar] [CrossRef]
- Papagiannaki, K.; Lagouvardos, K.; Kotroni, V.; Papagiannakis, G. Agricultural Losses Related to Frost Events: Use of the 850 HPa Level Temperature as an Explanatory Variable of the Damage Cost. Nat. Hazards Earth Syst. Sci. 2014, 14, 2375–2386. [Google Scholar] [CrossRef]
- Snyder, R.L.; de Melo-Abreu, J.P. Frost Protection: Fundamentals, Practice, and Economics; Food and Agriculture Organization of the United Nations: Rome, Italy, 2005; Volume 1, ISBN 9251053286. [Google Scholar]
- Yuan, W.; Choi, D. UAV-Based Heating Requirement Determination for Frost Management in Apple Orchard. Remote Sens. 2021, 13, 273. [Google Scholar] [CrossRef]
- Ballard, J.K.; Proebsting, E.L. Frost and Frost Control in Washington Orchards; eb0634_1978; Washington State University Cooperative Extension: Pullman, WA, USA, 1978; Available online: https://content.libraries.wsu.edu/digital/collection/ext/id/17531 (accessed on 8 February 2024).
- Yuan, W. Accuracy Comparison of YOLOv7 and YOLOv4 Regarding Image Annotation Quality for Apple Flower Bud Classification. AgriEngineering 2023, 5, 413–424. [Google Scholar] [CrossRef]
- Yuan, W.; Choi, D.; Bolkas, D.; Heinemann, P.H.; He, L. Sensitivity Examination of YOLOv4 Regarding Test Image Distortion and Training Dataset Attribute for Apple Flower Bud Classification. Int. J. Remote Sens. 2022, 43, 3106–3130. [Google Scholar] [CrossRef]
- Yuan, W. Development of a UAV-Based Multi-Dimensional Mapping Framework for Precise Frost Management in Apple Orchards. Ph.D. Thesis, The Pennsylvania State University, University Park, PA, USA, 2022. [Google Scholar]
- Bochkovskiy, A.; Wang, C.-Y.; Liao, H.-Y.M. YOLOv4: Optimal Speed and Accuracy of Object Detection. arXiv 2020, arXiv:2004.10934. [Google Scholar]
- Wang, C.-Y.; Bochkovskiy, A.; Liao, H.-Y.M. YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors. arXiv 2022, arXiv:2207.02696. [Google Scholar]
- Ballard, J.K.; Proebsting, E.L.; Tukey, R.B. Apples: Critical Temperatures for Blossom Buds; 99900502315001842; Washington State University Extension: Pullman, WA, USA, 1971; Available online: https://rex.libraries.wsu.edu/esploro/outputs/report/Apples-critical-temperatures-for-blossom-buds/99900502315001842 (accessed on 8 February 2024).
- YOLO-Label. Available online: https://github.com/developer0hye/Yolo_Label (accessed on 8 February 2024).
- Jocher, G.; Chaurasia, A.; Qiu, J. YOLO by Ultralytics 2023. Available online: https://github.com/ultralytics/ultralytics (accessed on 8 February 2024).
- Li, Y.; Nie, J.; Chao, X. Do We Really Need Deep CNN for Plant Diseases Identification? Comput. Electron. Agric. 2020, 178, 105803. [Google Scholar] [CrossRef]
- Ba, L.J.; Caruana, R. Do Deep Nets Really Need to Be Deep? Adv. Neural Inf. Process. Syst. 2014, 3, 2654–2662. [Google Scholar]
- Crassweller, R. Home Orchards: Flowering Habits of Apples and Pears. Available online: https://extension.psu.edu/home-orchards-flowering-habits-of-apples-and-pears (accessed on 8 February 2024).
- Xianbao, C.; Guihua, Q.; Yu, J.; Zhaomin, Z. An Improved Small Object Detection Method Based on Yolo V3. Pattern Anal. Appl. 2021, 24, 1347–1355. [Google Scholar] [CrossRef]
- Benjumea, A.; Teeti, I.; Cuzzolin, F.; Bradley, A. YOLO-Z: Improving Small Object Detection in YOLOv5 for Autonomous Vehicles. arXiv 2021, arXiv:2112.11798. [Google Scholar]
- Wang, Z.Z.; Xie, K.; Zhang, X.Y.; Chen, H.Q.; Wen, C.; He, J.B. Small-Object Detection Based on YOLO and Dense Block via Image Super-Resolution. IEEE Access 2021, 9, 56416–56429. [Google Scholar] [CrossRef]
- Du, Z.; Yin, J.; Yang, J. Expanding Receptive Field YOLO for Small Object Detection. J. Phys. Conf. Ser. 2019, 1314, 012202. [Google Scholar] [CrossRef]
- He, X.; Cheng, R.; Zheng, Z.; Wang, Z. Small Object Detection in Traffic Scenes Based on Yolo-Mxanet. Sensors 2021, 21, 7422. [Google Scholar] [CrossRef] [PubMed]
- Qiu, Z.; Wang, S.; Zeng, Z.; Yu, D. Automatic Visual Defects Inspection of Wind Turbine Blades via YOLO-Based Small Object Detection Approach. J. Electron. Imaging 2019, 28, 043023. [Google Scholar] [CrossRef]
- Li, Y.; Li, S.; Du, H.; Chen, L.; Zhang, D.; Li, Y. YOLO-ACN: Focusing on Small Target and Occluded Object Detection. IEEE Access 2020, 8, 227288–227303. [Google Scholar] [CrossRef]
- Liu, M.; Wang, X.; Zhou, A.; Fu, X.; Ma, Y.; Piao, C. Uav-Yolo: Small Object Detection on Unmanned Aerial Vehicle Perspective. Sensors 2020, 20, 2238. [Google Scholar] [CrossRef]
- Kaufman, S.; Rosset, S.; Perlich, C.; Stitelman, O. Leakage in Data Mining: Formulation, Detection, and Avoidance. ACM Trans. Knowl. Discov. Data 2012, 6, 556–563. [Google Scholar] [CrossRef]
Annotation counts per growth stage and data collection date:

| Date | 19 April 2020 | 23 April 2020 | 28 April 2020 | 2 May 2020 | 7 May 2020 | 13 May 2020 | 16 May 2020 | 21 May 2020 | 25 September 2020 | Total |
|---|---|---|---|---|---|---|---|---|---|---|
| Tip | 146 | 342 | 15 | 1 | 0 | 0 | 1 | 0 | 0 | 505 |
| Half-inch green | 460 | 845 | 84 | 0 | 0 | 0 | 0 | 0 | 0 | 1389 |
| Tight cluster | 2479 | 10,842 | 5931 | 514 | 1 | 0 | 0 | 0 | 0 | 19,767 |
| Pink | 64 | 691 | 5898 | 10,796 | 4966 | 2899 | 622 | 30 | 0 | 25,966 |
| Bloom | 0 | 0 | 2 | 643 | 19,905 | 13,074 | 7928 | 2239 | 0 | 43,791 |
| Petal fall | 0 | 0 | 0 | 0 | 7 | 4376 | 7754 | 6912 | 0 | 19,049 |
| Total | 3149 | 12,720 | 11,930 | 11,954 | 24,879 | 20,349 | 16,305 | 9181 | 0 | 110,467 |
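The stage totals in the table above directly quantify the class imbalance discussed in Section 5.1. A quick sketch of the per-class shares and the ratio between the most and least frequent classes:

```python
# Annotation counts per growth stage, taken from the table above
counts = {
    "tip": 505,
    "half-inch green": 1389,
    "tight cluster": 19767,
    "pink": 25966,
    "bloom": 43791,
    "petal fall": 19049,
}

total = sum(counts.values())  # 110,467 annotations overall
for stage, n in counts.items():
    print(f"{stage:16s} {n:6d} ({100 * n / total:5.2f}%)")

# Ratio between the most frequent (bloom) and least frequent (tip) classes
print("imbalance ratio:", round(counts["bloom"] / counts["tip"], 1))
```

The bloom-to-tip ratio is roughly 87:1, which is the kind of skew that typically calls for class-aware sampling or loss weighting during training.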
YOLOv8 benchmark results on AriAplBud (AP50 per growth stage):

| Growth Stage | YOLOv8n AP50 | YOLOv8x AP50 |
|---|---|---|
| Tip | 0.524 | 0.579 |
| Half-inch green | 0.542 | 0.617 |
| Tight cluster | 0.886 | 0.872 |
| Pink | 0.772 | 0.765 |
| Bloom | 0.863 | 0.874 |
| Petal fall | 0.600 | 0.657 |
| mAP50 | 0.698 | 0.727 |
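The mAP50 row in the table above is simply the unweighted mean of the six per-class AP50 values, which can be verified directly:

```python
# Per-class AP50 values for both models, taken from the table above
# (order: tip, half-inch green, tight cluster, pink, bloom, petal fall)
ap50 = {
    "YOLOv8n": [0.524, 0.542, 0.886, 0.772, 0.863, 0.600],
    "YOLOv8x": [0.579, 0.617, 0.872, 0.765, 0.874, 0.657],
}

for model, aps in ap50.items():
    map50 = sum(aps) / len(aps)  # unweighted mean over classes
    print(f"{model}: mAP50 = {map50:.3f}")
```

Because the mean is unweighted, the rare tip and half-inch green classes pull mAP50 down despite contributing under 2% of the annotations, which is worth keeping in mind when comparing models on this dataset.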
© 2024 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Yuan, W. AriAplBud: An Aerial Multi-Growth Stage Apple Flower Bud Dataset for Agricultural Object Detection Benchmarking. Data 2024, 9, 36. https://doi.org/10.3390/data9020036