Combining Image Classification and Unmanned Aerial Vehicles to Estimate the State of Explorer Roses
Abstract
1. Introduction
2. Materials and Methods
2.1. Data Acquisition and Transmission Stage
2.1.1. System Implementation and Conditions
2.1.2. Dataset Retrieval Method
2.1.3. Roses Detection Model
2.1.4. Roses Tracking
2.2. Processing Stage
Models Training
3. Results
3.1. Rose Counting Based on the Crossline Method
3.2. Evaluation of Models
3.3. Evaluation of the Counting Method
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. Cheng, C.; Fu, J.; Su, H.; Ren, L. Recent Advancements in Agriculture Robots: Benefits and Challenges. Machines 2023, 11, 48.
2. Wani, M.A.; Din, A.; Nazki, I.T.; Rehman, T.U.; Al-Khayri, J.M.; Jain, S.M.; Lone, R.A.; Bhat, Z.A.; Mushtaq, M. Navigating the Future: Exploring Technological Advancements and Emerging Trends in the Sustainable Ornamental Industry. Front. Environ. Sci. 2023, 11, 1188643.
3. Verdonk, J.C.; van Ieperen, W.; Carvalho, D.R.A.; van Geest, G.; Schouten, R.E. Effect of Preharvest Conditions on Cut-Flower Quality. Front. Plant Sci. 2023, 14, 1281456.
4. Ramadhani, W.; Anshari, K.; Febtiningsih, P. The Implementation of Internet of Things-Based Automatic Plant Watering Equipment at Sun Flower Ornamental Plant Shop. AIP Conf. Proc. 2023, 2601, 020041.
5. Mahmud, M.S.; Zahid, A.; Das, A.K. Sensing and Automation Technologies for Ornamental Nursery Crop Production: Current Status and Future Prospects. Sensors 2023, 23, 1818.
6. Velusamy, P.; Rajendran, S.; Mahendran, R.K.; Naseer, S.; Shafiq, M.; Choi, J.-G. Unmanned Aerial Vehicles (UAV) in Precision Agriculture: Applications and Challenges. Energies 2021, 15, 217.
7. Darwin, B.; Dharmaraj, P.; Prince, S.; Popescu, D.E.; Hemanth, D.J. Recognition of Bloom/Yield in Crop Images Using Deep Learning Models for Smart Agriculture: A Review. Agronomy 2021, 11, 646.
8. Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A Compilation of UAV Applications for Precision Agriculture. Comput. Netw. 2020, 172, 107148.
9. Maraveas, C. Incorporating Artificial Intelligence Technology in Smart Greenhouses: Current State of the Art. Appl. Sci. 2023, 13, 14.
10. Egi, Y.; Hajyzadeh, M.; Eyceyurt, E. Drone-Computer Communication Based Tomato Generative Organ Counting Model Using YOLO V5 and Deep-Sort. Agriculture 2022, 12, 1290.
11. Babila, I.F.E.; Villasor, S.A.E.; Dela Cruz, J.C. Object Detection for Inventory Stock Counting Using YOLOv5. In Proceedings of the 2022 IEEE 18th International Colloquium on Signal Processing & Applications (CSPA), Selangor, Malaysia, 12 May 2022; pp. 304–309.
12. Jintasuttisak, T.; Edirisinghe, E.; Elbattay, A. Deep Neural Network Based Date Palm Tree Detection in Drone Imagery. Comput. Electron. Agric. 2022, 192, 106560.
13. Coutinho Meneguzzi, C.; Fernandes da Silva, G.; Regis Mauri, G.; Ribeiro de Mendonça, A.; Almeida de Barros Junior, A. Routing Model Applied to Forest Inventory Vehicles Planning. Comput. Electron. Agric. 2020, 175, 105544.
14. Wang, A.; Peng, T.; Cao, H.; Xu, Y.; Wei, X.; Cui, B. TIA-YOLOv5: An Improved YOLOv5 Network for Real-Time Detection of Crop and Weed in the Field. Front. Plant Sci. 2022, 13, 1091655.
15. Dorrer, M.G.; Popov, A.A.; Tolmacheva, A.E. Building an Artificial Vision System of an Agricultural Robot Based on the DarkNet System. IOP Conf. Ser. Earth Environ. Sci. 2020, 548, 32032.
16. Tian, M.; Liao, Z. Research on Flower Image Classification Method Based on YOLOv5. J. Phys. Conf. Ser. 2021, 2024, 12022.
17. Lin, P.; Li, D.; Jia, Y.; Chen, Y.; Huang, G.; Elkhouchlaa, H.; Yao, Z.; Zhou, Z.; Zhou, H.; Li, J.; et al. A Novel Approach for Estimating the Flowering Rate of Litchi Based on Deep Learning and UAV Images. Front. Plant Sci. 2022, 13, 966639.
18. Chen, Z.; Su, R.; Wang, Y.; Chen, G.; Wang, Z.; Yin, P.; Wang, J. Automatic Estimation of Apple Orchard Blooming Levels Using the Improved YOLOv5. Agronomy 2022, 12, 2483.
19. Feng, Z.; Guo, L.; Huang, D.; Li, R. Electrical Insulator Defects Detection Method Based on YOLOv5. In Proceedings of the 2021 IEEE 10th Data Driven Control and Learning Systems Conference (DDCLS), Suzhou, China, 14–16 May 2021; pp. 979–984.
20. Horvat, M.; Jelečević, L.; Gledec, G. A Comparative Study of YOLOv5 Models Performance for Image Localization and Classification. In Proceedings of the Central European Conference on Information and Intelligent Systems, Dubrovnik, Croatia, 21–23 September 2022; Faculty of Organization and Informatics Varazdin: Varazdin, Croatia, 2022; pp. 349–356.
21. Padilla, R.; Netto, S.L.; da Silva, E.A.B. A Survey on Performance Metrics for Object-Detection Algorithms. In Proceedings of the 2020 International Conference on Systems, Signals and Image Processing (IWSSIP), Niterói, Brazil, 1–3 July 2020; pp. 237–242.
22. Gunjal, P.R.; Gunjal, B.R.; Shinde, H.A.; Vanam, S.M.; Aher, S.S. Moving Object Tracking Using Kalman Filter. In Proceedings of the 2018 International Conference on Advances in Communication and Computing Technology (ICACCT), Sangamner, India, 8–9 February 2018; pp. 544–547.
23. Li, Y.; Ma, R.; Zhang, R.; Cheng, Y.; Dong, C. A Tea Buds Counting Method Based on YOLOv5 and Kalman Filter Tracking Algorithm. Plant Phenomics 2023, 5, 0030.
24. Oh, S.; Chang, A.; Ashapure, A.; Jung, J.; Dube, N.; Maeda, M.; Gonzalez, D.; Landivar, J. Plant Counting of Cotton from UAS Imagery Using Deep Learning-Based Object Detection Framework. Remote Sens. 2020, 12, 2981.
25. Hosseiny, B.; Rastiveis, H.; Homayouni, S. An Automated Framework for Plant Detection Based on Deep Simulated Learning from Drone Imagery. Remote Sens. 2020, 12, 3521.
26. Heylen, R.; van Mulders, P.; Gallace, N. Counting Strawberry Flowers on Drone Imagery with a Sequential Convolutional Neural Network. In Proceedings of the 2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS, Brussels, Belgium, 11–16 July 2021; pp. 4880–4883.
27. Wan Nurazwin Syazwani, R.; Muhammad Asraf, H.; Megat Syahirul Amin, M.A.; Nur Dalila, K.A. Automated Image Identification, Detection and Fruit Counting of Top-View Pineapple Crown Using Machine Learning. Alex. Eng. J. 2022, 61, 1265–1276.
28. Sun, X.; Li, Z.; Zhu, T.; Ni, C. Four-Dimension Deep Learning Method for Flower Quality Grading with Depth Information. Electronics 2021, 10, 2353.
29. Bozkurt, F. A Study on CNN Based Transfer Learning for Recognition of Flower Species. Eur. J. Sci. Technol. 2022, 32, 883–890.
| Parameter | Value |
|---|---|
| Drone | DJI Mavic Mini SE |
| Camera | 1/2.3-inch CMOS sensor, 12 MP |
| Camera field of view (FOV) | 83°, f/2.8 aperture with autofocus |
| Data acquisition | 2.7K video at 30 frames per second |
| Image acquisition schedule | 8:00 a.m. to 12:00 p.m. |
| Greenhouse temperature | 5.3 °C to 36.1 °C |
| Greenhouse relative humidity | 27.5% to 96.8% |
| Greenhouse dew point | 4.6 °C to 25 °C |
| Vapor pressure deficit (VPD) | 0.03 kPa to 4.23 kPa |
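The VPD range in the table is consistent with what the standard Tetens approximation yields from the listed temperature and humidity extremes. The paper does not state how VPD was derived, so the helper below is an illustrative sketch (the function name is ours):

```python
import math

def vpd_kpa(temp_c: float, rh_percent: float) -> float:
    """Vapor pressure deficit (kPa) from air temperature (deg C) and
    relative humidity (%), using the Tetens saturation-pressure formula."""
    # Saturation vapor pressure in kPa (Tetens approximation)
    es = 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))
    # Actual vapor pressure is the saturation pressure scaled by RH
    ea = es * rh_percent / 100.0
    return es - ea

# The hot/dry and cool/humid extremes from the table bracket the
# reported VPD range of roughly 0.03 to 4.23 kPa
vpd_high = vpd_kpa(36.1, 27.5)
vpd_low = vpd_kpa(5.3, 96.8)
```

Feeding in the tabulated extremes reproduces the reported range to within rounding, which suggests the VPD column was computed from the temperature and humidity logs rather than measured independently.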
| Item | Value |
|---|---|
| Number of videos taken | 42 |
| Distance travelled by the drone per video (m) | 33.9213 |
| Bed width (m) | 0.7 |
| Plants per bed | 388 |
| Approximate duration of each video | 1 min |
| Video resolution (pixels) | 2704 × 1520 |
| Images registered | 1250 |
| Images extracted from video | 2500 |
| Training dataset | 60% |
| Validation dataset | 20% |
| Test dataset | 20% |
| Image resolution (pixels) | 960 × 608 |
| Average number of open rosebuds per image | 10.2 |
| Average number of closed rosebuds per image | 9.3 |
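The 60/20/20 partition of the 2500 extracted frames can be reproduced with a simple shuffle-and-slice; the sketch below is illustrative only (function name, filenames, and seed are ours, not from the paper):

```python
import random

def split_dataset(items, train_frac=0.6, val_frac=0.2, seed=42):
    """Shuffle items reproducibly and slice into train/val/test partitions."""
    items = list(items)
    random.Random(seed).shuffle(items)
    n = len(items)
    n_train = int(n * train_frac)
    n_val = int(n * val_frac)
    # Remaining items (here 20%) become the test partition
    return items[:n_train], items[n_train:n_train + n_val], items[n_train + n_val:]

# Hypothetical frame names standing in for the 2500 extracted images
frames = [f"frame_{i:04d}.jpg" for i in range(2500)]
train_set, val_set, test_set = split_dataset(frames)
```

With 2500 images, this yields 1500/500/500 frames for training, validation, and test, matching the percentages in the table.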
| Model | Training Duration (h) | Parameters | Layers | GFLOPs |
|---|---|---|---|---|
| YOLOv5n | 9.084 | 1,761,871 | 157 | 4.1 |
| YOLOv5s | 11.511 | 701,519 | 157 | 15.8 |
| YOLOv5m | 4.311 | 208,569,775 | 212 | 47.9 |
| YOLOv5l | 3.885 | 46,113,663 | 267 | 107.7 |
| YOLOv5x | 3.942 | 86,180,143 | 322 | 203.8 |
| Model | P | R | mAP@0.5 | mAP@0.5:0.95 |
|---|---|---|---|---|
| **Open rosebuds** | | | | |
| YOLOv5n | 0.906 | 0.965 | 0.957 | 0.720 |
| YOLOv5s | 0.951 | 0.956 | 0.960 | 0.702 |
| YOLOv5m | 0.949 | 0.960 | 0.961 | 0.707 |
| YOLOv5l | 0.932 | 0.972 | 0.962 | 0.697 |
| YOLOv5x | 0.949 | 0.956 | 0.965 | 0.730 |
| **Closed rosebuds** | | | | |
| YOLOv5n | 0.870 | 0.802 | 0.868 | 0.540 |
| YOLOv5s | 0.900 | 0.778 | 0.875 | 0.555 |
| YOLOv5m | 0.877 | 0.837 | 0.886 | 0.559 |
| YOLOv5l | 0.845 | 0.827 | 0.903 | 0.562 |
| YOLOv5x | 0.883 | 0.813 | 0.889 | 0.568 |
| **All classes** | | | | |
| YOLOv5n | 0.888 | 0.883 | 0.913 | 0.630 |
| YOLOv5s | 0.924 | 0.867 | 0.920 | 0.643 |
| YOLOv5m | 0.905 | 0.904 | 0.924 | 0.630 |
| YOLOv5l | 0.897 | 0.893 | 0.923 | 0.630 |
| YOLOv5x | 0.917 | 0.885 | 0.941 | 0.632 |
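For reference, the precision and recall columns follow the standard object-detection definitions over true positives, false positives, and false negatives. A minimal sketch (the counts below are illustrative, not taken from the paper):

```python
def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Precision = TP / (TP + FP); Recall = TP / (TP + FN)."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# e.g. 90 correctly detected rosebuds, 10 spurious boxes, 5 missed buds
p, r = precision_recall(tp=90, fp=10, fn=5)
```

mAP@0.5 averages the per-class area under the precision-recall curve at an IoU threshold of 0.5, while mAP@0.5:0.95 averages over IoU thresholds from 0.5 to 0.95 in steps of 0.05, which is why the right-hand column is systematically lower.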
| Work | Topic | Proposed Method | Accuracy (%) |
|---|---|---|---|
| This work | Rosebud detection | YOLOv5 models | 94.10 |
| Hosseiny et al. [25] | Plant detection on agricultural land | Faster R-CNN | 86.00 |
| Egi et al. [10] | Detection and counting of tomato flowers and fruits | YOLOv5 | 92.00 |
| Syazwani et al. [27] | Detection and counting of pineapple crowns | ANN-GDX | 94.00 |
| Heylen et al. [26] | Counting of strawberry flowers | CNN | 90.00 |
| Sun et al. [28] | Classification of rosebud maturation grades | InceptionV3 | 98.00 |
| Bozkurt [29] | Flower species recognition via CNN-based transfer learning | InceptionResNetV2 | 92.25 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Herrera, D.; Escudero-Villa, P.; Cárdenas, E.; Ortiz, M.; Varela-Aldás, J. Combining Image Classification and Unmanned Aerial Vehicles to Estimate the State of Explorer Roses. AgriEngineering 2024, 6, 1008-1021. https://doi.org/10.3390/agriengineering6020058