Streamlining YOLOv7 for Rapid and Accurate Detection of Rapeseed Varieties on Embedded Device
Abstract
1. Introduction
2. Dataset Creation
3. Model Design
3.1. Overview
3.2. Pruning Strategies
3.2.1. Pruning Process
3.2.2. Spatial Pruning
3.2.3. Channel Pruning
3.3. Model Structure and Training
3.3.1. Model Structure
3.3.2. Sparse Training
3.3.3. Loss Function
3.4. Prototype Implementation
4. Results
4.1. Metrics
4.2. Comparison of Different Target Detection Algorithms
4.3. Comparison of Different Branch Selections
4.4. Comparison of Different Channel Dimension Pruning Methods
4.4.1. Global Pruning
4.4.2. Fixed Ratio Layer-by-Layer Pruning
4.4.3. Custom Ratio Layer-by-Layer Pruning
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Variety | Registration Year | Origin | Color | Quantity |
---|---|---|---|---|
Yuyou-35 | 2019 | GH31 × GH33 | tawny | 345 |
Yuhuang-20 | 2022 | W01YA × L804 | tawny | 351 |
Deyou-5 | 2019 | K97A × Z68R | dark brown | 356 |
Yuyou-55 | 2021 | 17346 × 16110 | dark brown | 414 |
Yuyou-56 | 2022 | 17448A × 18007 | dark brown | 405 |
Hyperparameter | Value |
---|---|
weight coefficient a | 0.0001 |
weight coefficient b | 0.0001 |
initial learning rate | 0.01 |
batch size | 16 |
number of epochs | 400 |
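The weight coefficients a and b above are loss-weighting terms used during training (Sections 3.3.2 and 3.3.3); the remaining rows are standard training settings. As a minimal, hedged sketch only (not the authors' implementation), a network-slimming-style L1 penalty on BatchNorm scaling factors, the kind of term such a small coefficient typically scales during sparse training, could be written in PyTorch as follows; the function name `bn_sparsity_penalty` is hypothetical:

```python
import torch
import torch.nn as nn

def bn_sparsity_penalty(model: nn.Module, coeff: float = 1e-4) -> torch.Tensor:
    # Illustrative network-slimming-style penalty: the L1 norm of every
    # BatchNorm scaling factor, pushing unimportant channels toward zero
    # during sparse training. `coeff` plays the role of a small weight
    # such as the 0.0001 listed in the table above (an assumption).
    terms = [m.weight.abs().sum()
             for m in model.modules() if isinstance(m, nn.BatchNorm2d)]
    return coeff * torch.stack(terms).sum() if terms else torch.tensor(0.0)

# Usage sketch: total_loss = detection_loss + bn_sparsity_penalty(model)
```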
Models | mAP (%) | Inference Time (s) | Number of Parameters (M) |
---|---|---|---|
YOLOv7x | 95.70 | 7.56 | 70.84 |
YOLOv7-tiny | 96.15 | 0.60 | 6.02 |
YOLOv7 | 96.68 | 4.48 | 36.50 |
YOLOv7 (MB) | 96.66 | 4.52 | 37.05 |
Branch Selection Method | MB1 | MB2 | MB3 | MB4 | MB5 | mAP (%) |
---|---|---|---|---|---|---|
pruning strategy selection | b2 | b3 | b1 | b3 | b4 | 96.81 |
random selection 1 | b3 | b4 | b2 | b1 | b3 | 96.63 |
random selection 2 | b1 | b2 | b3 | b4 | b2 | 96.76 |
Pruning Ratios | mAP (%) | Inference Time (s) | Number of Parameters (M) |
---|---|---|---|
0.1 | 96.81 | 3.35 | 29.20 |
0.2 | 96.79 | 2.94 | 24.25 |
0.3 | 96.75 | 2.73 | 19.24 |
0.4 | 96.80 | 2.54 | 18.16 |
0.5 | 96.73 | 2.34 | 15.34 |
0.6 | 96.64 | 2.25 | 13.24 |
0.622 | 96.53 | 2.14 | 13.06 |
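For context on how a pruning ratio such as those in the table translates into a concrete channel selection, the sketch below ranks BatchNorm scaling factors under a single global magnitude threshold, in the spirit of the global pruning variant discussed in Section 4.4.1. It is an illustrative assumption rather than the authors' exact procedure, and `channels_to_keep` is a hypothetical helper name:

```python
import torch
import torch.nn as nn

def channels_to_keep(model: nn.Module, prune_ratio: float) -> dict:
    # Collect the absolute BatchNorm scaling factors of the whole network
    # and derive one global magnitude threshold for the requested ratio.
    gammas = torch.cat([m.weight.detach().abs().flatten()
                        for m in model.modules()
                        if isinstance(m, nn.BatchNorm2d)])
    k = int(prune_ratio * gammas.numel())
    threshold = torch.sort(gammas).values[k] if k > 0 else float("-inf")
    # Per layer, keep the channels whose scaling factor survives the threshold.
    # A real pipeline would also guard against emptying a layer entirely.
    keep = {}
    for name, m in model.named_modules():
        if isinstance(m, nn.BatchNorm2d):
            keep[name] = (m.weight.detach().abs() >= threshold).nonzero().flatten()
    return keep
```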
Pruning Scheme | mAP (%) | Inference Time (s) | Number of Parameters (M) |
---|---|---|---|
Custom1 | 96.89 | 1.18 | 9.19 |
Custom2 | 96.12 | 1.17 | 8.92 |
Custom3 | 96.77 | 1.15 | 8.77 |
Random | 95.94 | 1.33 | 13.15 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).