Navigation of an Autonomous Spraying Robot for Orchard Operations Using LiDAR for Tree Trunk Detection
Abstract
1. Introduction
2. Materials and Methods
2.1. Experimental Prototype Vehicles and the Installation of Sensors
2.1.1. Vehicle
2.1.2. Pesticide Spraying System
2.1.3. LiDAR
2.2. Path Planning Algorithm
2.2.1. Density-Based Spatial Clustering of Applications with Noise (DBSCAN)
- Core points: If |Nε(p)| ≥ minPoints, then p is a core point, and all points in Nε(p), together with p, belong to the same cluster.
- Border points: Points with |Nε(p)| < minPoints that are directly reachable from a core point; they belong to the same cluster as that core point.
- Noise points: Points that do not belong to any cluster.
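As a sketch of these three rules, a minimal DBSCAN implementation could look like the following. This is not the authors' code; the brute-force neighborhood search and the parameter names `eps` and `min_points` are illustrative only.

```python
import numpy as np

def dbscan(points, eps, min_points):
    """Label 2D points by the core/border/noise rules above.

    points: (N, 2) array; eps: neighborhood radius; min_points: density
    threshold. Returns labels: -1 for noise, 0..k-1 for cluster indices.
    """
    n = len(points)
    # Pairwise distances give each epsilon-neighborhood N_eps(p).
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    neighbors = [np.flatnonzero(dists[i] <= eps) for i in range(n)]
    core = [len(nb) >= min_points for nb in neighbors]

    labels = np.full(n, -1)  # -1 = noise until reached from a core point
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or not core[i]:
            continue
        # Grow a new cluster outward from core point i.
        labels[i] = cluster
        seeds = list(neighbors[i])
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:        # unassigned: border or future core
                labels[j] = cluster
                if core[j]:            # only core points expand the frontier
                    seeds.extend(neighbors[j])
        cluster += 1
    return labels
```

For trunk detection, each returned cluster would correspond to one candidate tree trunk in the LiDAR scan, with stray returns left as noise.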
2.2.2. K-Means Clustering
2.2.3. RANSAC
2.3. Vehicle Guidance
2.4. Control System
2.4.1. Trajectory Tracking Control
2.4.2. Navigation Decision
2.4.3. Program Platform and GUI
3. Results
3.1. Planning Path Calibration on a Concrete Road
3.2. Operation Calibration on a Concrete Road
3.2.1. Calibration of the Curve Path
3.2.2. Calibration of Straight Maneuvers and U-Turns
3.3. Operation Calibration on Grass
3.4. Field Test in a Facilitated Artificial-Tree-Based Orchard
4. Discussion
4.1. Machine Learning System from Point Clouds
4.2. Prototype Testing under the Different Lighting Conditions
4.3. Prototype Testing on Concrete and Grass Using a Facilitated Artificial Tree Pattern
4.4. Prototype Testing on Grass Using a Conventional Tree Pattern
5. Conclusions
- The machine learning algorithms DBSCAN, K-means, and RANSAC were integrated to detect tree locations, divide them into left and right groups, and calculate the boundary lines from which the midline of the planned navigation path was derived.
- The pure pursuit tracking algorithm and incremental PID control were integrated to calculate the steering angles for navigation. The calculated steering-angle information was sent to the microcontroller, which controlled the stepper motor rotation to steer the vehicle along the navigation path in real time.
- In the concrete road calibration, the positional RMSEs of the curve and U-turn calibrations were 20.6 cm and 11.8 cm, respectively, indicating that the guidance system could calculate the path and control the vehicle safely based on the positions of the landmarks in real time.
- In the grass calibration, the positional RMSE of the right and left turns was 14.1 cm, demonstrating that the navigation system could operate properly on soil. In the facilitated artificial-tree-based orchard, the positional RMSE was 12.6 cm, and a U-turn was performed to steer the robot when applying pesticides in the joint-orchard system for our future research.
- In this study, an automatic navigation system for orchards was developed using only 2D LiDAR. Not only can the vehicle be driven under any lighting conditions, but the computational complexity is also reduced; thus, the system does not need to rely on high-performance computers.
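The steering pipeline summarized above can be sketched as follows. This is a hypothetical reconstruction, not the authors' implementation: the pure pursuit geometry and the velocity-form (incremental) PID are standard textbook formulations, and the wheelbase, gains, and look-ahead point are placeholders.

```python
import math

def pure_pursuit_steering(dx, dy, wheelbase):
    """Steering angle toward a look-ahead point (dx, dy) in the vehicle frame.

    Classic pure pursuit: delta = atan2(2 * L * sin(alpha), Ld), where alpha
    is the heading error to the look-ahead point and Ld is its distance.
    Assumes x is the vehicle's forward axis.
    """
    look_dist = math.hypot(dx, dy)
    alpha = math.atan2(dy, dx)
    return math.atan2(2.0 * wheelbase * math.sin(alpha), look_dist)

class IncrementalPID:
    """Incremental (velocity-form) PID: each step computes the *change* in
    command, u(k) = u(k-1) + Kp*(e_k - e_{k-1}) + Ki*e_k
                  + Kd*(e_k - 2*e_{k-1} + e_{k-2})."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.e1 = 0.0  # e(k-1)
        self.e2 = 0.0  # e(k-2)
        self.u = 0.0   # accumulated command

    def step(self, error):
        du = (self.kp * (error - self.e1)
              + self.ki * error
              + self.kd * (error - 2.0 * self.e1 + self.e2))
        self.e2, self.e1 = self.e1, error
        self.u += du
        return self.u
```

In such a loop, the pure pursuit angle serves as the setpoint for the steering actuator, and the incremental PID output (a command increment) maps naturally onto stepper motor steps.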
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Satterthwaite, D.; McGranahan, G.; Tacoli, C. Urbanization and its implications for food and farming. Philos. Trans. R. Soc. London. Ser. B Biol. Sci. 2010, 365, 2809–2820.
- Usman, M.; Sawaya, A.; Igarashi, M.; Gayman, J.J.; Dixit, R. Strained agricultural farming under the stress of youths’ career selection tendencies: A case study from Hokkaido (Japan). Humanit. Soc. Sci. Commun. 2021, 8, 19.
- Dang, N.T.; Luy, N.T. LiDAR-Based Online Navigation Algorithm for An Autonomous Agricultural Robot. J. Control. Eng. Appl. Inform. 2022, 24, 90–100.
- Bergerman, M.; Billingsley, J.; Reid, J.; van Henten, E. Robotics in agriculture and forestry. In Springer Handbook of Robotics; Springer: Berlin/Heidelberg, Germany, 2016; pp. 1463–1492.
- Mousazadeh, H. A technical review on navigation systems of agricultural autonomous off-road vehicles. J. Terramechanics 2013, 50, 211–232.
- Sun, H.; Slaughter, D.C.; Ruiz, M.P.; Gliever, C.; Upadhyaya, S.K.; Smith, R.F. RTK GPS mapping of transplanted row crops. Comput. Electron. Agric. 2010, 71, 32–37.
- Li, M.; Imou, K.; Wakabayashi, K.; Yokoyama, S. Review of research on agricultural vehicle autonomous guidance. Int. J. Agric. Biol. Eng. 2009, 2, 1–16.
- Subramanian, V.; Burks, T.F.; Arroyo, A.A. Development of machine vision and laser radar based autonomous vehicle guidance systems for citrus grove navigation. Comput. Electron. Agric. 2006, 53, 130–143.
- Takagaki, A.; Masuda, R.; Iida, M.; Suguri, M. Image Processing for Ridge/Furrow Discrimination for Autonomous Agricultural Vehicles Navigation. IFAC Proc. Vol. 2013, 46, 47–51.
- Chen, J.; Qiang, H.; Wu, J.; Xu, G.; Wang, Z.; Liu, X. Extracting the navigation path of a tomato-cucumber greenhouse robot based on a median point Hough transform. Comput. Electron. Agric. 2020, 174, 105472.
- Li, X.; Qiu, Q. Autonomous Navigation for Orchard Mobile Robots: A Rough Review. In Proceedings of the 2021 36th Youth Academic Annual Conference of Chinese Association of Automation (YAC), Nanchang, China, 28–30 May 2021; pp. 552–557.
- Takai, R.; Barawid, O.; Ishii, K.; Noguchi, N. Development of Crawler-Type Robot Tractor based on GPS and IMU. IFAC Proc. Vol. 2010, 43, 151–156.
- Shalal, N.; Low, T.; McCarthy, C.; Hancock, N. A review of autonomous navigation systems in agricultural environments. In Proceedings of the SEAg 2013: Innovative Agricultural Technologies for a Sustainable Future, Barton, Australia, 22–25 September 2013.
- Wang, X.; Pan, H.; Guo, K.; Yang, X.; Luo, S. The evolution of LiDAR and its application in high precision measurement. IOP Conf. Ser. Earth Environ. Sci. 2020, 502, 012008.
- Wang, Y.; Geng, C.; Zhu, G.; Shen, R.; Gu, H.; Liu, W. Information Perception Method for Fruit Trees Based on 2D LiDAR Sensor. Agriculture 2022, 12, 914.
- Wang, C.; Ji, M.; Wang, J.; Wen, W.; Li, T.; Sun, Y. An Improved DBSCAN Method for LiDAR Data Segmentation with Automatic Eps Estimation. Sensors 2019, 19, 172.
- Zhou, M.; Xia, J.; Yang, F.; Zheng, K.; Hu, M.; Li, D.; Zhang, S. Design and experiment of visual navigated UGV for orchard based on Hough matrix and RANSAC. Int. J. Agric. Biol. Eng. 2021, 14, 176–184.
- Fischler, M.A.; Bolles, R.C. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 1981, 24, 381–395.
- Barawid, O.C., Jr.; Mizushima, A.; Ishii, K.; Noguchi, N. Development of an autonomous navigation system using a two-dimensional laser scanner in an orchard application. Biosyst. Eng. 2007, 96, 139–149.
- Chen, J.; Qiang, H.; Wu, J.; Xu, G.; Wang, Z. Navigation path extraction for greenhouse cucumber-picking robots using the prediction-point Hough transform. Comput. Electron. Agric. 2021, 180, 105911.
- Pajares, G.; García-Santillán, I.; Campos, Y.; Montalvo, M.; Guerrero, J.M.; Emmi, L.; Romeo, J.; Guijarro, M.; Gonzalez-de-Santos, P. Machine-vision systems selection for agricultural vehicles: A guide. J. Imaging 2016, 2, 34.
- Akinlar, C.; Topal, C. EDLines: A real-time line segment detector with a false detection control. Pattern Recognit. Lett. 2011, 32, 1633–1642.
- Santos, L.C.; Aguiar, A.S.; Santos, F.N.; Valente, A.; Ventura, J.B.; Sousa, A.J. Navigation Stack for Robots Working in Steep Slope Vineyard. In Intelligent Systems and Applications; Arai, K., Kapoor, S., Bhatia, R., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 264–285.
- Shalal, N.; Low, T.; McCarthy, C.; Hancock, N. Orchard mapping and mobile robot localisation using on-board camera and laser scanner data fusion—Part B: Mapping and localisation. Comput. Electron. Agric. 2015, 119, 267–278.
- Jiang, S.; Wang, S.; Yi, Z.; Zhang, M.; Lv, X. Autonomous Navigation System of Greenhouse Mobile Robot Based on 3D Lidar and 2D Lidar SLAM. Front. Plant Sci. 2022, 13, 815218.
- Chen, H.; Liang, M.; Liu, W.; Wang, W.; Liu, P.X. An approach to boundary detection for 3D point clouds based on DBSCAN clustering. Pattern Recognit. 2022, 124, 108431.
- Troccoli, E.B.; Cerqueira, A.G.; Lemos, J.B.; Holz, M. K-means clustering using principal component analysis to automate label organization in multi-attribute seismic facies analysis. J. Appl. Geophys. 2022, 198, 104555.
- Borlea, I.-D.; Precup, R.-E.; Borlea, A.-B. Improvement of K-means Cluster Quality by Post Processing Resulted Clusters. Procedia Comput. Sci. 2022, 199, 63–70.
- Xu, B.; Jiang, W.; Shan, J.; Zhang, J.; Li, L. Investigation on the Weighted RANSAC Approaches for Building Roof Plane Segmentation from LiDAR Point Clouds. Remote Sens. 2016, 8, 5.
- Goodwin, G.C.; Graebe, S.F.; Salgado, M.E. Control System Design; Prentice Hall: Upper Saddle River, NJ, USA, 2001; Volume 240.
- Shen, L.; Liu, Z.; Zhang, Z.; Shi, X. Frame-level bit allocation based on incremental PID algorithm and frame complexity estimation. J. Vis. Commun. Image Represent. 2009, 20, 28–34.
- Jiang, A.; Noguchi, R.; Ahamed, T. Tree Trunk Recognition in Orchard Autonomous Operations under Different Light Conditions Using a Thermal Camera and Faster R-CNN. Sensors 2022, 22, 2065.
RMSE (cm)

| Site | Path | Calibration 1 | Calibration 2 | Calibration 3 | Average |
|---|---|---|---|---|---|
| Concrete road | Curve path (right) | 18.3 | 14.0 | 18.9 | 17.1 |
| Concrete road | Curve path (left) | 25.3 | 22.7 | 24.3 | 24.1 |
| Concrete road | Operational path (right) | 10.4 | 14.6 | 10.9 | 12.0 |
| Concrete road | Operational path (left) | 12.0 | 11.2 | 11.7 | 11.6 |
| Grass | Operational path (right) | 9.4 | 15.8 | 12.7 | 12.6 |
| Grass | Operational path (left) | 10.7 | 18.6 | 17.3 | 15.5 |
| Facilitated artificial-tree-based orchard | Operational path (right) | 11.4 | 14.5 | 15.5 | 13.8 |
| Facilitated artificial-tree-based orchard | Operational path (left) | 18.6 | 6.7 | 8.8 | 11.4 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Jiang, A.; Ahamed, T. Navigation of an Autonomous Spraying Robot for Orchard Operations Using LiDAR for Tree Trunk Detection. Sensors 2023, 23, 4808. https://doi.org/10.3390/s23104808