WED-YOLO: A Detection Model for Safflower Under Complex Unstructured Environment
Abstract
1. Introduction
- (1) Because safflower size and shape vary significantly with shooting distance, the original loss function was replaced with WIoU, which is better suited to handling these scale variations.
- (2) To retain more detailed safflower feature information during upsampling, the upsample module in the neck network was replaced with the DySample module, enhancing the model's upsampling capability.
- (3) To better capture safflower features, the EMA module was integrated into the C2f module of the backbone network, improving the backbone's feature extraction ability for safflower targets.
- (4) A small-target detection layer was incorporated to improve the model's ability to extract and identify small safflower targets in the dataset, providing a precise detection method for safflower harvesting automation.
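As a concrete illustration of contribution (1), the sketch below implements the WIoU v1 formulation of Tong et al. (a distance-based focusing factor that scales the IoU loss). This is a minimal NumPy sketch, not the paper's code: variable names are my own, and the enclosing-box term, which the paper detaches from the gradient, is only noted in a comment since plain NumPy has no autograd.

```python
import numpy as np

def wiou_v1_loss(box, gt):
    """WIoU v1 sketch; boxes are (x1, y1, x2, y2)."""
    # Intersection and IoU
    ix1, iy1 = max(box[0], gt[0]), max(box[1], gt[1])
    ix2, iy2 = min(box[2], gt[2]), min(box[3], gt[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_b = (box[2] - box[0]) * (box[3] - box[1])
    area_g = (gt[2] - gt[0]) * (gt[3] - gt[1])
    iou = inter / (area_b + area_g - inter)
    l_iou = 1.0 - iou
    # Centre distance and smallest enclosing box (W_g, H_g);
    # in the paper, W_g^2 + H_g^2 is detached from the gradient.
    cx_b, cy_b = (box[0] + box[2]) / 2, (box[1] + box[3]) / 2
    cx_g, cy_g = (gt[0] + gt[2]) / 2, (gt[1] + gt[3]) / 2
    w_g = max(box[2], gt[2]) - min(box[0], gt[0])
    h_g = max(box[3], gt[3]) - min(box[1], gt[1])
    dist2 = (cx_b - cx_g) ** 2 + (cy_b - cy_g) ** 2
    r_wiou = np.exp(dist2 / (w_g ** 2 + h_g ** 2))
    # The focusing factor amplifies the loss for boxes whose
    # centres are far from the ground truth.
    return r_wiou * l_iou
```

For a perfectly aligned box the loss is zero; for two offset unit-overlap boxes the focusing factor pushes the loss above the plain 1 − IoU value, which is the behaviour that helps fit safflower targets of varying size.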
2. Materials and Methods
2.1. Material
2.1.1. Acquisition of Image Data
2.1.2. Augmentation and Processing of the Safflower Filaments Image Dataset
2.2. Methods
2.2.1. YOLOv8 Model
2.2.2. Improved YOLOv8 Model for Small Target Safflower Filaments Detection
- (1) Replace the CIoU loss of the original model with the WIoU loss function. WIoU assigns different weights to safflower samples of different sizes, improving bounding-box fitting and speeding up network convergence.
- (2) In the neck network, introduce the DySample module in place of the original Upsample module. DySample enhances upsampling for safflower targets in low-resolution images or far-field views, reducing the loss of safflower features during sampling.
- (3) Incorporate an efficient multi-scale attention (EMA) mechanism into the C2f module of the backbone network to improve the backbone's extraction of safflower features.
- (4) Add a small-object prediction head with an output size of 160 × 160 × 32 to enhance the detection of small safflower targets. The structure of the YOLOv8-WED model is shown in Figure 4.
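To make the geometry of improvement (4) concrete: assuming the usual 640 × 640 YOLOv8 input and the standard P2–P5 strides (these stride values are general YOLOv8 conventions, not figures taken from the paper), the added head operates on the stride-4 feature map, whose 160 × 160 size matches the prediction head described above. A small arithmetic sketch:

```python
def head_grid_sizes(input_size=640, strides=(4, 8, 16, 32)):
    """Feature-map side length per detection head: input_size / stride."""
    return [input_size // s for s in strides]

# stride 4 -> 160x160 (added small-target head);
# strides 8/16/32 -> the three original YOLOv8 heads (80, 40, 20)
print(head_grid_sizes())
```

Running this prints `[160, 80, 40, 20]`: the extra stride-4 grid gives the model many more, finer anchor positions for small safflower filaments.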
2.2.3. WIoU Loss Function
2.2.4. DySample
2.2.5. C2f-EMA Module
2.2.6. Small-Target Detection Layer
3. Results
3.1. Experimental Setup
3.1.1. Test Platform
3.1.2. Evaluation Indicators
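The indicators reported in the tables below (Precision, Recall, F1) follow their standard definitions; F1 is the harmonic mean of precision and recall. A minimal sketch (the helper name is mine, not from the paper):

```python
def f1_score(precision: float, recall: float) -> float:
    """F1: harmonic mean of precision and recall (both in %)."""
    return 2 * precision * recall / (precision + recall)

# e.g. the YOLOv8n baseline row of the ablation table:
# P = 90.25 %, R = 80.01 %  ->  F1 ≈ 84.82 %
```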
3.2. Experimental Results
3.2.1. Model Training
3.2.2. Ablation Experiment
4. Discussion
4.1. Comparison of Different Object Detection Methods
4.2. Comparison of Different Loss Functions
4.3. Visualization of Heatmap
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
| Weather Condition | Image Quantities at Close Distance (0.4–0.7 m) | Image Quantities at Far Distance (0.7–1 m) |
|---|---|---|
| Sunny | 204 | 169 |
| Overcast | 223 | 192 |
| Cloudy | 172 | 166 |
| Test No. | Base Model | WIoU | DySample | C2f-EMA | Small-Target Detection Layer | Precision/% | Recall/% | F1/% | mAP/% | Inference Time/ms |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | √ | - | - | - | - | 90.25 | 80.01 | 84.82 | 90.53 | 15.2 |
| 2 | √ | √ | - | - | - | 90.35 | 81.33 | 85.43 | 90.69 | 15.2 |
| 3 | √ | √ | √ | - | - | 90.89 | 82.98 | 86.58 | 91.46 | 12.2 |
| 4 | √ | √ | √ | √ | - | 91.93 | 84.85 | 88.24 | 93.18 | 12.6 |
| 5 | √ | √ | √ | √ | √ | 93.15 | 86.71 | 89.64 | 95.03 | 13.9 |
| Model | P/% | R/% | F1/% | AP (Opening Period)/% | AP (Flower-Shedding Period)/% | mAP/% | FPS/frame·s⁻¹ | Model Size/MB |
|---|---|---|---|---|---|---|---|---|
| Faster R-CNN | 78.96 | 82.32 | 76.50 | 82.92 | 77.34 | 81.97 | 23 | 322 |
| YOLOv5n | 89.92 | 76.86 | 82.50 | 90.17 | 90.19 | 90.18 | 69 | 4.1 |
| YOLOv7 | 89.89 | 79.94 | 83.33 | 90.11 | 90.23 | 90.17 | 47 | 74.8 |
| YOLOv8n | 90.25 | 80.01 | 84.82 | 90.40 | 90.66 | 90.53 | 66 | 6.2 |
| YOLOv10n | 89.98 | 80.04 | 83.42 | 90.13 | 90.28 | 90.21 | 80 | 3.5 |
| YOLOv8-WED | 93.15 | 86.70 | 89.64 | 95.25 | 94.82 | 95.03 | 72 | 6.4 |
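The mAP column in the table above is the mean of the two per-period AP values (equal weighting over the two classes, which the table's numbers are consistent with). A one-line check, with a helper name of my own:

```python
def mean_ap(ap_values):
    """mAP: arithmetic mean of per-class average precision (in %)."""
    return sum(ap_values) / len(ap_values)

# YOLOv8-WED row: (95.25 + 94.82) / 2 = 95.035 ≈ 95.03
# YOLOv8n row:    (90.40 + 90.66) / 2 = 90.53
```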
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhang, Z.; Wang, Y.; Xu, P.; Shi, R.; Xing, Z.; Li, J. WED-YOLO: A Detection Model for Safflower Under Complex Unstructured Environment. Agriculture 2025, 15, 205. https://doi.org/10.3390/agriculture15020205