SN-CNN: A Lightweight and Accurate Line Extraction Algorithm for Seedling Navigation in Ridge-Planted Vegetables
Abstract
1. Introduction
2. Materials and Methods
2.1. Research Process and Methods of This Paper
2.2. Dataset Construction and Image Preprocessing
2.3. Improvement of YOLOv8n
2.4. Efficient C2f_UIB Block
2.5. Integrating the SimAM Attention Mechanism
2.6. Model Training and Evaluation
2.6.1. Hardware Platform and Hyperparameter Settings
2.6.2. Model Evaluation
3. Results
3.1. Comparison of SN-CNN and the Baseline’s Training Loss Functions
3.2. Comparison of SN-CNN Performance with Existing Algorithms
3.3. Ablation Experiment
3.4. Performance Evaluation of Navigation Line Fitting on Jetson AGX Xavier
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
Setting | Value
---|---
Initial learning rate (lr0) | 0.001667
Momentum | 0.9
Weight decay | 0.0005
Batch size | 32
Epochs | 200
Pre-trained weights | YOLOv8n.pt
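The hyperparameters above can be gathered into a single training configuration. Below is a minimal sketch assuming the Ultralytics YOLO Python API; the dataset file name `seedlings.yaml` is a placeholder, and whether the authors invoked training exactly this way is an assumption.

```python
# Hyperparameters mirroring the settings table above.
HYP = {
    "lr0": 0.001667,        # initial learning rate
    "momentum": 0.9,        # SGD momentum
    "weight_decay": 0.0005, # L2 regularization
    "batch": 32,            # batch size
    "epochs": 200,          # training epochs
}

def train_sn_cnn(data_yaml="seedlings.yaml"):
    """Launch training from the YOLOv8n pre-trained weights.

    Requires the `ultralytics` package; `data_yaml` is a hypothetical
    dataset description file, not one provided by the paper.
    """
    from ultralytics import YOLO
    model = YOLO("yolov8n.pt")  # pre-trained weights from the table
    return model.train(data=data_yaml, **HYP)
```

Keeping the settings in one dictionary makes it easy to reproduce or sweep individual values without touching the training call.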
Network | Parameters | FLOPs | P | R | mAP@0.5
---|---|---|---|---|---
YOLOv5s | 7.3 M | 17.5 G | 91.0% | 88.4% | 93.1%
YOLOv7-tiny | 6.35 M | 14.2 G | 91.2% | 89.1% | 93.4%
YOLOv8n | 3.31 M | 8.7 G | 88.4% | 89.3% | 92.6%
YOLOv8s | 11.4 M | 29.4 G | 91.5% | 89.2% | 93.5%
SN-CNN (ours) | 2.37 M | 6.7 G | 93.9% | 90.2% | 94.6%
Model | C2f_UIB | SimAM | Parameters | FLOPs | P | R | mAP@0.5
---|---|---|---|---|---|---|---
M1 | × | × | 3.31 M | 8.7 G | 88.4% | 89.3% | 92.6%
M2 | ✓ | × | 2.37 M | 6.7 G | 90.2% | 88.3% | 92.8%
M3 | × | ✓ | 3.31 M | 8.7 G | 92.6% | 90.5% | 93.7%
M4 | ✓ | ✓ | 2.37 M | 6.7 G | 93.9% | 90.2% | 94.6%
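Note that adding SimAM (M1 vs. M3, M2 vs. M4) changes neither the parameter count nor the FLOPs budget noticeably: SimAM weights each activation with a closed-form, parameter-free energy function. A NumPy sketch of the published SimAM formulation (not the authors' exact code) illustrates this; `lam` is the regularization constant from the original SimAM paper:

```python
import numpy as np

def simam(x, lam=1e-4):
    """Parameter-free SimAM attention over a (C, H, W) feature map.

    For each channel, every pixel's inverse energy is computed from the
    channel's spatial mean and variance; a sigmoid of that energy gates
    the input. No learnable weights are introduced.
    """
    c, h, w = x.shape
    n = h * w - 1
    mu = x.mean(axis=(1, 2), keepdims=True)          # per-channel spatial mean
    d = (x - mu) ** 2                                # squared deviation per pixel
    v = d.sum(axis=(1, 2), keepdims=True) / n        # per-channel variance estimate
    e_inv = d / (4.0 * (v + lam)) + 0.5              # inverse energy per pixel
    return x * (1.0 / (1.0 + np.exp(-e_inv)))        # sigmoid gating
```

Because the gate is derived entirely from the feature statistics, the module can be dropped into any backbone stage without growing the model, which is consistent with the identical parameter/FLOPs columns for M1/M3 and M2/M4 above.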
Method | Number of Images | RMSE (pixels) | Time (ms/image)
---|---|---|---
SN-CNN + LS | 100 | 6.8 | 22
SN-CNN + RANSAC | 100 | 5.7 | 25
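The two line-fitting back-ends compared above can be sketched as follows: ordinary least squares fits all detected seedling center points at once, while RANSAC repeatedly fits candidate point pairs and keeps the consensus set before refitting. This is a minimal NumPy illustration; the inlier threshold and iteration count are illustrative values, not the paper's.

```python
import numpy as np

def fit_line_ls(pts):
    """Least-squares fit y = m*x + b over (N, 2) points; returns (m, b)."""
    x, y = pts[:, 0], pts[:, 1]
    A = np.stack([x, np.ones_like(x)], axis=1)
    (m, b), *_ = np.linalg.lstsq(A, y, rcond=None)
    return m, b

def fit_line_ransac(pts, n_iter=200, tol=2.0, seed=0):
    """RANSAC line fit: sample point pairs, score inliers, refit the best set.

    `tol` is the inlier residual threshold in pixels (illustrative).
    """
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iter):
        i, j = rng.choice(len(pts), size=2, replace=False)
        p, q = pts[i], pts[j]
        if q[0] == p[0]:
            continue  # skip vertical candidate lines in this simple sketch
        m = (q[1] - p[1]) / (q[0] - p[0])
        b = p[1] - m * p[0]
        resid = np.abs(pts[:, 1] - (m * pts[:, 0] + b))
        inliers = resid < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_line_ls(pts[best_inliers])  # refit on the consensus set
```

RANSAC's robustness to outlier detections (e.g. a single mislocated seedling) is what the lower RMSE in the table reflects, at a small extra per-image cost.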
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhang, T.; Zhou, J.; Liu, W.; Yue, R.; Shi, J.; Zhou, C.; Hu, J. SN-CNN: A Lightweight and Accurate Line Extraction Algorithm for Seedling Navigation in Ridge-Planted Vegetables. Agriculture 2024, 14, 1446. https://doi.org/10.3390/agriculture14091446