Improved Faster Region-Based Convolutional Neural Networks (R-CNN) Model Based on Split Attention for the Detection of Safflower Filaments in Natural Environments
Abstract
1. Introduction
- (1) To address interference from the natural environment, the ResNeSt-101 backbone applies Split-Attention modules for effective multi-scale feature extraction. The strengthened local feature extraction improves detection accuracy and avoids missed and false detections (a minimal Split-Attention sketch follows this list).
- (2) Based on the sizes of safflower filaments, the anchor box sizes in the region proposal network were optimized with the PAM clustering algorithm. The anchor boxes match the filament sizes more closely, improving the detection of small targets (see the PAM sketch after this list).
- (3)
- ROI Align was used instead of ROI Pooling to reduce the feature error of safflower filaments caused by double quantization. The target bounding box was depicted more accurately, improving the detection accuracy.
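As a rough illustration of the Split-Attention mechanism referenced in (1), the PyTorch sketch below splits a feature map into radix groups, derives channel-wise attention weights from global context, and fuses the splits with a softmax over groups. It is a simplified stand-in (cardinality groups are omitted, and all shapes and names are illustrative), not the authors' ResNeSt-101 implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SplitAttention(nn.Module):
    """Simplified Split-Attention block (radix splits only, cardinality omitted)."""
    def __init__(self, channels: int, radix: int = 2, reduction: int = 4):
        super().__init__()
        self.radix = radix
        inter = max(channels // reduction, 8)
        # One 3x3 conv per radix split, implemented as a grouped conv.
        self.conv = nn.Conv2d(channels, channels * radix, 3, padding=1, groups=radix)
        self.bn = nn.BatchNorm2d(channels * radix)
        # Two-layer bottleneck that produces per-split attention logits.
        self.fc1 = nn.Conv2d(channels, inter, 1)
        self.fc2 = nn.Conv2d(inter, channels * radix, 1)

    def forward(self, x):
        b, c = x.shape[:2]
        feats = F.relu(self.bn(self.conv(x)))                    # (B, C*radix, H, W)
        splits = feats.view(b, self.radix, c, *feats.shape[2:])  # (B, radix, C, H, W)
        gap = splits.sum(dim=1).mean(dim=(2, 3), keepdim=True)   # global context (B, C, 1, 1)
        att = self.fc2(F.relu(self.fc1(gap)))                    # (B, C*radix, 1, 1)
        att = att.view(b, self.radix, c, 1, 1).softmax(dim=1)    # softmax across splits
        return (att * splits).sum(dim=1)                         # (B, C, H, W)

# Hypothetical usage on a feature map:
y = SplitAttention(64)(torch.randn(1, 64, 32, 32))
print(y.shape)  # torch.Size([1, 64, 32, 32])
```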
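For (2), anchor sizes are chosen by clustering the annotated filament boxes with PAM (k-medoids). The sketch below is a minimal PAM loop over (width, height) pairs under a 1 − IoU distance; the distance choice, k = 9 (the default 3 scales × 3 ratios in Faster R-CNN), and the synthetic data are assumptions rather than the authors' exact setup.

```python
import numpy as np

def iou_wh(box, medoid):
    """IoU of two (w, h) boxes assumed to share the same top-left corner."""
    inter = min(box[0], medoid[0]) * min(box[1], medoid[1])
    union = box[0] * box[1] + medoid[0] * medoid[1] - inter
    return inter / union

def pam_anchors(wh, k=9, iters=50, seed=0):
    """Cluster (w, h) pairs with a basic PAM loop under a 1 - IoU distance."""
    rng = np.random.default_rng(seed)
    medoids = wh[rng.choice(len(wh), k, replace=False)]
    for _ in range(iters):
        # Assign each box to its nearest medoid.
        d = np.array([[1 - iou_wh(b, m) for m in medoids] for b in wh])
        labels = d.argmin(axis=1)
        new = medoids.copy()
        for j in range(k):
            members = wh[labels == j]
            if len(members) == 0:
                continue
            # New medoid: the member minimizing total distance within its cluster.
            cost = [sum(1 - iou_wh(a, b) for a in members) for b in members]
            new[j] = members[int(np.argmin(cost))]
        if np.array_equal(new, medoids):
            break
        medoids = new
    return medoids[np.argsort(medoids[:, 0] * medoids[:, 1])]

# Hypothetical usage on annotated filament boxes (w, h) in pixels:
wh = np.abs(np.random.default_rng(1).normal(40, 15, size=(500, 2)))
print(pam_anchors(wh, k=9))  # nine anchor (w, h) pairs, sorted by area
```

Unlike k-means, PAM keeps each cluster center at an actual data point, so the resulting anchors are always real filament box sizes from the dataset.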
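For (3), torchvision exposes both pooling operators, so the substitution can be illustrated directly; the feature-map stride (16) and the proposal coordinates below are made up for the example.

```python
import torch
from torchvision.ops import roi_align, roi_pool

# One backbone feature map (stride-16) and one region proposal given in
# image coordinates; values here are illustrative only.
feat = torch.randn(1, 256, 50, 50)
rois = torch.tensor([[0, 123.4, 56.7, 301.2, 245.9]])  # (batch_idx, x1, y1, x2, y2)

# ROI Pooling snaps the box and each output bin to integer feature cells
# (two quantizations), which shifts small targets such as filaments.
pooled = roi_pool(feat, rois, output_size=(7, 7), spatial_scale=1 / 16)

# ROI Align keeps fractional coordinates and bilinearly samples each bin,
# so no quantization error is introduced.
aligned = roi_align(feat, rois, output_size=(7, 7), spatial_scale=1 / 16,
                    sampling_ratio=2, aligned=True)
print(pooled.shape, aligned.shape)  # both (1, 256, 7, 7)
```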
2. Materials
2.1. The Overall Structure of a Safflower-Picking Robot
2.2. Image Acquisition
2.3. Construction of the Safflower Image Dataset
3. Methods
3.1. Faster R-CNN Network
3.2. Improved Faster R-CNN
3.2.1. Backbone of Faster R-CNN-S101 Model
3.2.2. Optimizing the Size of Anchor Boxes
3.2.3. ROI Align
4. Network Training
4.1. Training Platform
4.2. Evaluation Indicators of the Network Model
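The result tables in Section 5 report Precision (P), Recall (R), F1, AP, and mAP. The paper's exact IoU and confidence thresholds are not reproduced here, but these are presumably the standard definitions, with TP, FP, and FN denoting true positives, false positives, and false negatives, and N the number of classes:

```latex
P = \frac{TP}{TP + FP}, \qquad
R = \frac{TP}{TP + FN}, \qquad
F_1 = \frac{2PR}{P + R}, \qquad
AP = \int_0^1 P(R)\,\mathrm{d}R, \qquad
mAP = \frac{1}{N}\sum_{i=1}^{N} AP_i
```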
5. Results and Discussion
5.1. Training Results
5.2. Comparison of Other Object Detection Models
5.3. Ablation Experiment
5.4. Confidence Test
5.5. Safflower Image Test for Detection in Natural Environments
5.5.1. Effect of Detection in Different Weather Conditions
5.5.2. Effect of Detection in Different Illumination Conditions
5.5.3. Effect of Detection in Different Occlusion Conditions
6. Conclusions and Future Work
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
**Comparison with other object detection models (Section 5.2).**

| Model | Precision/% | Recall/% | F1/% | AP/% (Opening Period) | AP/% (Flower-Shedding Period) | mAP/% |
|---|---|---|---|---|---|---|
| YOLOv3 | 89.70 | 84.49 | 87.32 | 91.49 | 86.52 | 89.00 |
| YOLOv4 | 89.49 | 77.75 | 78.57 | 88.25 | 82.82 | 85.54 |
| YOLOv5 | 89.92 | 76.86 | 82.54 | 89.40 | 86.46 | 87.93 |
| YOLOv6 | 90.31 | 80.25 | 83.36 | 90.12 | 89.91 | 90.02 |
| Faster R-CNN | 78.96 | 82.32 | 76.52 | 82.92 | 77.34 | 81.97 |
| Faster R-CNN-S101 | 89.94 | 87.81 | 89.57 | 93.19 | 89.93 | 91.49 |
**Ablation experiment (Section 5.3); √ marks an enabled module.**

| Test No. | ResNeSt-101 | ROI Align | Optimized Anchor Boxes | mAP/% |
|---|---|---|---|---|
| I | - | - | - | 81.97 |
| II | √ | - | - | 87.67 |
| III | √ | √ | - | 90.07 |
| IV | √ | √ | √ | 91.49 |
**Confidence test results on five sample images (Section 5.4).**

| Model | No. | Total Number | Detected Number | Missed Number |
|---|---|---|---|---|
| Faster R-CNN | 1 | 9 | 9 | 2 |
| Faster R-CNN | 2 | 10 | 7 | 0 |
| Faster R-CNN | 3 | 13 | 11 | 1 |
| Faster R-CNN | 4 | 14 | 10 | 2 |
| Faster R-CNN | 5 | 18 | 14 | 0 |
| Faster R-CNN-S101 | 1 | 9 | 9 | 0 |
| Faster R-CNN-S101 | 2 | 10 | 9 | 0 |
| Faster R-CNN-S101 | 3 | 13 | 13 | 1 |
| Faster R-CNN-S101 | 4 | 14 | 13 | 0 |
| Faster R-CNN-S101 | 5 | 18 | 17 | 1 |
**Detection performance under different natural conditions (Section 5.5).**

| Model | Scene | Condition | Precision/% | Recall/% | F1/% | mAP/% |
|---|---|---|---|---|---|---|
| Faster R-CNN | Weather | Sunny day | 78.84 | 74.17 | 76.71 | 81.92 |
| Faster R-CNN | Weather | Cloudy day | 78.86 | 74.06 | 76.52 | 81.81 |
| Faster R-CNN | Weather | Overcast day | 78.98 | 74.17 | 77.60 | 81.88 |
| Faster R-CNN | Illumination | Sunlight | 79.31 | 74.19 | 77.74 | 82.05 |
| Faster R-CNN | Illumination | Backlight | 79.15 | 74.08 | 77.69 | 81.90 |
| Faster R-CNN | Occlusion | Branch and leaf occlusion | 79.31 | 74.06 | 77.55 | 81.62 |
| Faster R-CNN | Occlusion | Dense occlusion | 79.24 | 74.29 | 77.47 | 81.48 |
| Faster R-CNN-S101 | Weather | Sunny day | 89.75 | 86.19 | 89.67 | 91.62 |
| Faster R-CNN-S101 | Weather | Cloudy day | 89.58 | 86.18 | 89.56 | 91.48 |
| Faster R-CNN-S101 | Weather | Overcast day | 89.61 | 86.09 | 89.61 | 91.56 |
| Faster R-CNN-S101 | Illumination | Sunlight | 89.68 | 86.53 | 89.69 | 91.69 |
| Faster R-CNN-S101 | Illumination | Backlight | 89.65 | 86.44 | 89.54 | 91.66 |
| Faster R-CNN-S101 | Occlusion | Branch and leaf occlusion | 89.59 | 86.43 | 89.60 | 91.53 |
| Faster R-CNN-S101 | Occlusion | Dense occlusion | 89.64 | 86.26 | 89.53 | 91.50 |