Computer Vision in Self-Steering Tractors
Abstract
1. Introduction
2. Evolution of Vision-Based Self-Steering Tractors
3. Safety Issues
4. Self-Steering Tractors’ System Architecture
4.1. Basic Modeling
4.2. Vision-Based Architecture
4.3. Path Tracking Control System
4.4. Basic Sensors
5. Vision-Based Navigation
5.1. Monocular Vision Methods
5.2. Binocular Vision Methods
Classification of Stereovision Methods
5.3. Multi-Vision Methods
6. Discussion
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Acknowledgments
Conflicts of Interest
References
- Saiz-Rubio, V.; Rovira-Más, F. From Smart Farming towards Agriculture 5.0: A Review on Crop Data Management. Agronomy 2020, 10, 207.
- Lokers, R.; Knapen, R.; Janssen, S.; van Randen, Y.; Jansen, J. Analysis of Big Data technologies for use in agro-environmental science. Environ. Model. Softw. 2016, 84, 494–504.
- De Clercq, M.; Vats, A.; Biel, A. Agriculture 4.0: The future of farming technology. In Proceedings of the World Government Summit, Dubai, United Arab Emirates, 11–13 February 2018; pp. 11–13.
- Martos, V.; Ahmad, A.; Cartujo, P.; Ordoñez, J. Ensuring Agricultural Sustainability through Remote Sensing in the Era of Agriculture 5.0. Appl. Sci. 2021, 11, 5911.
- Sparrow, R.; Howard, M. Robots in agriculture: Prospects, impacts, ethics, and policy. Precis. Agric. 2021, 22, 818–833.
- Aqeel-ur-Rehman; Abbasi, A.Z.; Islam, N.; Shaikh, Z.A. A review of wireless sensors and networks’ applications in agriculture. Comput. Stand. Interfaces 2014, 36, 263–270.
- Shanmugapriya, P.; Rathika, S.; Ramesh, T.; Janaki, P. Applications of Remote Sensing in Agriculture—A Review. Int. J. Curr. Microbiol. Appl. Sci. 2019, 8, 2270–2283.
- Fan, J.; Zhang, Y.; Wen, W.; Gu, S.; Lu, X.; Guo, X. The future of Internet of Things in agriculture: Plant high-throughput phenotypic platform. J. Clean. Prod. 2021, 280, 123651.
- Wolfert, S.; Ge, L.; Verdouw, C.; Bogaardt, M.-J. Big Data in Smart Farming—A review. Agric. Syst. 2017, 153, 69–80.
- Shalal, N.; Low, T.; McCarthy, C.; Hancock, N. A Review of Autonomous Navigation Systems in Agricultural Environments. In Proceedings of the Society for Engineering in Agriculture Conference: Innovative Agricultural Technologies for a Sustainable Future, Barton, WA, Australia, 22–25 September 2013.
- Rovira-Más, F.; Zhang, Q.; Reid, J.F.; Will, J.D. Machine Vision Based Automated Tractor Guidance. Int. J. Smart Eng. Syst. Des. 2003, 5, 467–480.
- Thomasson, J.A.; Baillie, C.P.; Antille, D.L.; Lobsey, C.R.; McCarthy, C.L. Autonomous Technologies in Agricultural Equipment: A Review of the State of the Art. In Proceedings of the 2019 Agricultural Equipment Technology Conference, Louisville, KY, USA, 11–13 February 2019; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2019; pp. 1–17.
- Baillie, C.P.; Lobsey, C.R.; Antille, D.L.; McCarthy, C.L.; Thomasson, J.A. A review of the state of the art in agricultural automation. Part III: Agricultural machinery navigation systems. In Proceedings of the 2018 Detroit, Michigan, 29 July–1 August 2018; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2018; p. 1.
- Schmidt, G.T. GPS Based Navigation Systems in Difficult Environments. Gyroscopy Navig. 2019, 10, 41–53.
- Wilson, J. Guidance of agricultural vehicles—A historical perspective. Comput. Electron. Agric. 2000, 25, 3–9.
- Reid, J.; Searcy, S. Vision-based guidance of an agriculture tractor. IEEE Control Syst. Mag. 1987, 7, 39–43.
- Reid, J.F.; Searcy, S.W. Automatic Tractor Guidance with Computer Vision. In SAE Technical Papers; SAE International: Warrendale, PA, USA, 1987.
- Billingsley, J.; Schoenfisch, M. Vision-guidance of agricultural vehicles. Auton. Robots 1995, 2, 65–76.
- Billingsley, J.; Schoenfisch, M. The successful development of a vision guidance system for agriculture. Comput. Electron. Agric. 1997, 16, 147–163.
- Pinto, F.A.C.; Reid, J.F. Heading angle and offset determination using principal component analysis. In Proceedings of the ASAE Paper, Disney’s Coronado Springs, Orlando, FL, USA, 12–16 July 1998; p. 983113.
- Benson, E.R.; Reid, J.F.; Zhang, Q.; Pinto, F.A.C. An adaptive fuzzy crop edge detection method for machine vision. In Proceedings of the 2000 ASAE Annual International Meeting, Technical Papers: Engineering Solutions for a New Century, Milwaukee, WI, USA, 9–12 July 2000; pp. 49085–49659.
- Benson, E.R.; Reid, J.F.; Zhang, Q. Development of an automated combine guidance system. In Proceedings of the 2000 ASAE Annual International Meeting, Technical Papers: Engineering Solutions for a New Century, Milwaukee, WI, USA, 9–12 July 2000; pp. 1–11.
- Benson, E.R.; Reid, J.F.; Zhang, Q. Machine Vision Based Steering System for Agricultural Combines. In Proceedings of the 2001 Sacramento, Sacramento, CA, USA, 29 July–1 August 2001; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2001.
- Benson, E.R.; Reid, J.F.; Zhang, Q. Machine vision-based guidance system for an agricultural small-grain harvester. Trans. ASAE 2003, 46, 1255–1264.
- Keicher, R.; Seufert, H. Automatic guidance for agricultural vehicles in Europe. Comput. Electron. Agric. 2000, 25, 169–194.
- Åstrand, B.; Baerveldt, A.-J. A vision based row-following system for agricultural field machinery. Mechatronics 2005, 15, 251–269.
- Søgaard, H.T.; Olsen, H.J. Crop row detection for cereal grain. In Precision Agriculture ’99; Sheffield Academic Press: Sheffield, UK, 1999; pp. 181–190. ISBN 1841270423.
- Láng, Z. Image processing based automatic steering control in plantation. VDI Ber. 1998, 1449, 93–98.
- Kise, M.; Zhang, Q.; Rovira Más, F. A Stereovision-Based Crop Row Detection Method for Tractor-automated Guidance. Biosyst. Eng. 2005, 90, 357–367.
- Tillett, N.D.; Hague, T. Computer-Vision-based Hoe Guidance for Cereals—An Initial Trial. J. Agric. Eng. Res. 1999, 74, 225–236.
- Hague, T.; Tillett, N.D. A bandpass filter-based approach to crop row location and tracking. Mechatronics 2001, 11, 1–12.
- Tillett, N.D.; Hague, T.; Miles, S.J. Inter-row vision guidance for mechanical weed control in sugar beet. Comput. Electron. Agric. 2002, 33, 163–177.
- Subramanian, V.; Burks, T.F.; Arroyo, A.A. Development of machine vision and laser radar based autonomous vehicle guidance systems for citrus grove navigation. Comput. Electron. Agric. 2006, 53, 130–143.
- Misao, Y.; Karahashi, M. An image processing based automatic steering rice transplanter (II). In Proceedings of the 2000 ASAE Annual International Meeting, Technical Papers: Engineering Solutions for a New Century, Milwaukee, WI, USA, 9–12 July 2000; pp. 1–5.
- Han, S.; Dickson, M.A.; Ni, B.; Reid, J.F.; Zhang, Q. A Robust Procedure to Obtain a Guidance Directrix for Vision-Based Vehicle Guidance Systems. In Automation Technology for Off-Road Equipment: Proceedings of the 2002 Conference, Chicago, IL, USA, 26–27 July 2002; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2013; p. 317.
- Okamoto, H.; Hamada, K.; Kataoka, T.; Terawaki, M.; Hata, S. Automatic Guidance System with Crop Row Sensor. In Automation Technology for Off-Road Equipment: Proceedings of the 2002 Conference, Chicago, IL, USA, 26–27 July 2002; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2013; p. 307.
- Fargnoli, M.; Lombardi, M. Safety Vision of Agricultural Tractors: An Engineering Perspective Based on Recent Studies (2009–2019). Safety 2019, 6, 1.
- Kumar, A.; Varghese, M.; Mohan, D. Equipment-related injuries in agriculture: An international perspective. Inj. Control Saf. Promot. 2000, 7, 175–186.
- Vallone, M.; Bono, F.; Quendler, E.; Febo, P.; Catania, P. Risk exposure to vibration and noise in the use of agricultural track-laying tractors. Ann. Agric. Environ. Med. 2016, 23, 591–597.
- Irwin, A.; Poots, J. Investigation of UK Farmer Go/No-Go Decisions in Response to Tractor-Based Risk Scenarios. J. Agromedicine 2018, 23, 154–165.
- Jamshidi, N.; Abdollahi, S.M.; Maleki, A. A survey on the actuating force on brake and clutch pedal controls in agricultural tractor in use in Iran. Polish Ann. Med. 2016, 23, 113–117.
- Fargnoli, M.; Lombardi, M.; Puri, D. Applying Hierarchical Task Analysis to Depict Human Safety Errors during Pesticide Use in Vineyard Cultivation. Agriculture 2019, 9, 158.
- Bo, H.; Liang, W.; Yuefeng, D.; Zhenghe, S.; Enrong, M.; Zhongxiang, Z. Design and Experiment on Integrated Proportional Control Valve of Automatic Steering System. IFAC Pap. 2018, 51, 389–396.
- Franceschetti, B.; Rondelli, V.; Ciuffoli, A. Comparing the influence of Roll-Over Protective Structure type on tractor lateral stability. Saf. Sci. 2019, 115, 42–50.
- Kaizu, Y.; Choi, J. Development of a Tractor Navigation System Using Augmented Reality. Eng. Agric. Environ. Food 2012, 5, 96–101.
- Ehlers, S.G.; Field, W.E.; Ess, D.R. Methods of Collecting and Analyzing Rearward Visibility Data for Agricultural Machinery: Hazard and/or Object Detectability. J. Agric. Saf. Health 2017, 23, 39–53.
- Liu, B.; Koc, A.B. Field Tests of a Tractor Rollover Detection and Emergency Notification System. J. Agric. Saf. Health 2015, 21, 113–127.
- Irwin, A.; Caruso, L.; Tone, I. Thinking Ahead of the Tractor: Driver Safety and Situation Awareness. J. Agromedicine 2019, 24, 288–297.
- Liu, B.; Liu, G.; Wu, X. Research on Machine Vision Based Agricultural Automatic Guidance Systems. In Computer and Computing Technologies in Agriculture; Springer: Boston, MA, USA, 2008; Volume I, pp. 659–666. ISBN 9780387772509.
- Lameski, P.; Zdravevski, E.; Kulakov, A. Review of Automated Weed Control Approaches: An Environmental Impact Perspective. In Communications in Computer and Information Science; Springer: Berlin/Heidelberg, Germany, 2018; Volume 940, pp. 132–147. ISBN 9783030008246.
- Rowduru, S.; Kumar, N.; Kumar, A. A critical review on automation of steering mechanism of load haul dump machine. Proc. Inst. Mech. Eng. Part I J. Syst. Control Eng. 2020, 234, 160–182.
- Eddine Hadji, S.; Kazi, S.; Howe Hing, T.; Mohamed Ali, M.S. A Review: Simultaneous Localization and Mapping Algorithms. J. Teknol. 2015, 73.
- Rodríguez Flórez, S.A.; Frémont, V.; Bonnifait, P.; Cherfaoui, V. Multi-modal object detection and localization for high integrity driving assistance. Mach. Vis. Appl. 2014, 25, 583–598.
- Jha, H.; Lodhi, V.; Chakravarty, D. Object Detection and Identification Using Vision and Radar Data Fusion System for Ground-Based Navigation. In Proceedings of the 2019 6th International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India, 7–8 March 2019; pp. 590–593.
- Karur, K.; Sharma, N.; Dharmatti, C.; Siegel, J.E. A Survey of Path Planning Algorithms for Mobile Robots. Vehicles 2021, 3, 448–468.
- Ge, J.; Pei, H.; Yao, D.; Zhang, Y. A robust path tracking algorithm for connected and automated vehicles under i-VICS. Transp. Res. Interdiscip. Perspect. 2021, 9, 100314.
- Zhang, S.; Wang, Y.; Zhu, Z.; Li, Z.; Du, Y.; Mao, E. Tractor path tracking control based on binocular vision. Inf. Process. Agric. 2018, 5, 422–432.
- Pajares, G.; García-Santillán, I.; Campos, Y.; Montalvo, M.; Guerrero, J.; Emmi, L.; Romeo, J.; Guijarro, M.; Gonzalez-de-Santos, P. Machine-Vision Systems Selection for Agricultural Vehicles: A Guide. J. Imaging 2016, 2, 34.
- Zhai, Z.; Zhu, Z.; Du, Y.; Song, Z.; Mao, E. Multi-crop-row detection algorithm based on binocular vision. Biosyst. Eng. 2016, 150, 89–103.
- Schouten, G.; Steckel, J. A Biomimetic Radar System for Autonomous Navigation. IEEE Trans. Robot. 2019, 35, 539–548.
- Wang, R.; Chen, L.; Wang, J.; Zhang, P.; Tan, Q.; Pan, D. Research on autonomous navigation of mobile robot based on multi ultrasonic sensor fusion. In Proceedings of the 2018 IEEE 4th Information Technology and Mechatronics Engineering Conference (ITOEC), Chongqing, China, 14–16 December 2018; pp. 720–725.
- Torii, T.; Takamizawa, A.; Okamoto, T.; Imou, K. Crop Row Tracking by an Autonomous Vehicle Using Machine Vision (part 2): Field test using an autonomous tractor. J. Jpn. Soc. Agric. Mach. 2000, 62, 37–42.
- Fehr, B.W.; Gerrish, J.B. Vision-guided row-crop follower. Appl. Eng. Agric. 1995, 11, 613–620.
- Gerrish, J.B.; Fehr, B.W.; Van Ee, G.R.; Welch, D.P. Self-steering tractor guided by computer-vision. Appl. Eng. Agric. 1997, 13, 559–563.
- Fitzpatrick, K.; Pahnos, D.; Pype, W.V. Robot windrower is first unmanned harvester. Ind. Robot. Int. J. 1997, 24, 342–348.
- Younse, P.; Burks, T. Intersection Detection and Navigation for an Autonomous Greenhouse Sprayer using Machine Vision. In Proceedings of the 2005 Tampa, Tampa, FL, USA, 17–20 July 2005; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2005; p. 1.
- Hague, T.; Tillett, N.D. Navigation and control of an autonomous horticultural robot. Mechatronics 1996, 6, 165–180.
- Royer, E.; Bom, J.; Dhome, M.; Thuilot, B.; Lhuillier, M.; Marmoiton, F. Outdoor autonomous navigation using monocular vision. In Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, AB, Canada, 2–6 August 2005; pp. 1253–1258.
- English, A.; Ross, P.; Ball, D.; Corke, P. Vision based guidance for robot navigation in agriculture. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 29 September 2014; pp. 1693–1698.
- da Silva, S.P.P.; Almeida, J.S.; Ohata, E.F.; Rodrigues, J.J.P.C.; de Albuquerque, V.H.C.; Reboucas Filho, P.P. Monocular Vision Aided Depth Map from RGB Images to Estimate of Localization and Support to Navigation of Mobile Robots. IEEE Sens. J. 2020, 20, 12040–12048.
- Ohno, T.; Ohya, A.; Yuta, S. Autonomous navigation for mobile robots referring pre-recorded image sequence. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS ’96, Osaka, Japan, 8 November 1996; Volume 2, pp. 672–679.
- Remazeilles, A.; Chaumette, F.; Gros, P. Robot motion control from a visual memory. In Proceedings of the IEEE International Conference on Robotics and Automation, New Orleans, LA, USA, 26 April–1 May 2004; Volume 5, pp. 4695–4700.
- Montalvo, M.; Pajares, G.; Guerrero, J.M.; Romeo, J.; Guijarro, M.; Ribeiro, A.; Ruz, J.J.; Cruz, J.M. Automatic detection of crop rows in maize fields with high weeds pressure. Expert Syst. Appl. 2012, 39, 11889–11897.
- Guerrero, J.M.; Guijarro, M.; Montalvo, M.; Romeo, J.; Emmi, L.; Ribeiro, A.; Pajares, G. Automatic expert system based on images for accuracy crop row detection in maize fields. Expert Syst. Appl. 2013, 40, 656–664.
- Kidono, K.; Miura, J.; Shirai, Y. Autonomous visual navigation of a mobile robot using a human-guided experience. Rob. Auton. Syst. 2002, 40, 121–130.
- Davison, A.J. Real-time simultaneous localisation and mapping with a single camera. In Proceedings of the Ninth IEEE International Conference on Computer Vision, Nice, France, 13–16 October 2003; Volume 2, pp. 1403–1410.
- Vrochidou, E.; Bazinas, C.; Manios, M.; Papakostas, G.A.; Pachidis, T.P.; Kaburlasos, V.G. Machine Vision for Ripeness Estimation in Viticulture Automation. Horticulturae 2021, 7, 282.
- Meng, Q.; Qiu, R.; He, J.; Zhang, M.; Ma, X.; Liu, G. Development of agricultural implement system based on machine vision and fuzzy control. Comput. Electron. Agric. 2015, 112, 128–138.
- Burgos-Artizzu, X.P.; Ribeiro, A.; Guijarro, M.; Pajares, G. Real-time image processing for crop/weed discrimination in maize fields. Comput. Electron. Agric. 2011, 75, 337–346.
- Jiang, G.; Wang, Z.; Liu, H. Automatic detection of crop rows based on multi-ROIs. Expert Syst. Appl. 2015, 42, 2429–2441.
- Fernandes, L.A.F.; Oliveira, M.M. Real-time line detection through an improved Hough transform voting scheme. Pattern Recognit. 2008, 41, 299–314.
- Leemans, V.; Destain, M.-F. Line cluster detection using a variant of the Hough transform for culture row localisation. Image Vis. Comput. 2006, 24, 541–550.
- Fontaine, V.; Crowe, T.G. Development of line-detection algorithms for local positioning in densely seeded crops. Can. Biosyst. Eng. 2006, 48, 19–29.
- Zhang, L.; Grift, T.E. A New Approach to Crop-Row Detection in Corn. In Proceedings of the 2010 Pittsburgh, Pittsburgh, PA, USA, 20–23 June 2010; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2010; p. 1.
- Li, Y.; Wang, X.; Liu, D. 3D Autonomous Navigation Line Extraction for Field Roads Based on Binocular Vision. J. Sens. 2019, 2019, 1–16.
- Zhang, Z.; Li, P.; Zhao, S.; Lv, Z.; Du, F.; An, Y. An Adaptive Vision Navigation Algorithm in Agricultural IoT System for Smart Agricultural Robots. Comput. Mater. Contin. 2020, 66, 1043–1056.
- Wang, Q.; Meng, Z.; Liu, H. Review on Application of Binocular Vision Technology in Field Obstacle Detection. IOP Conf. Ser. Mater. Sci. Eng. 2020, 806, 012025.
- Zhang, T.; Li, H.; Chen, D.; Huang, P.; Zhuang, X. Agricultural vehicle path tracking navigation system based on information fusion of multi-source sensor. Nongye Jixie Xuebao/Trans. Chin. Soc. Agric. Mach. 2015, 46, 37–42.
- Dairi, A.; Harrou, F.; Senouci, M.; Sun, Y. Unsupervised obstacle detection in driving environments using deep-learning-based stereovision. Rob. Auton. Syst. 2018, 100, 287–301.
- Ji, Y.; Li, S.; Peng, C.; Xu, H.; Cao, R.; Zhang, M. Obstacle detection and recognition in farmland based on fusion point cloud data. Comput. Electron. Agric. 2021, 189, 106409.
- Ann, N.Q.; Achmad, M.S.H.; Bayuaji, L.; Daud, M.R.; Pebrianti, D. Study on 3D scene reconstruction in robot navigation using stereo vision. In Proceedings of the 2016 IEEE International Conference on Automatic Control and Intelligent Systems (I2CACIS), Selangor, Malaysia, 22 October 2016; pp. 72–77.
- Song, D.; Jiang, Q.; Sun, W.; Yao, L. A Survey: Stereo Based Navigation for Mobile Binocular Robots. In Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2013; pp. 1035–1046. ISBN 9783642373732.
- Zhu, Z.X.; He, Y.; Zhai, Z.Q.; Liu, J.Y.; Mao, E.R. Research on Cotton Row Detection Algorithm Based on Binocular Vision. Appl. Mech. Mater. 2014, 670–671, 1222–1227.
- Herrera, P.J.; Pajares, G.; Guijarro, M.; Ruz, J.J.; Cruz, J.M. A Stereovision Matching Strategy for Images Captured with Fish-Eye Lenses in Forest Environments. Sensors 2011, 11, 1756–1783.
- Zhang, X.; Dai, H.; Sun, H.; Zheng, N. Algorithm and VLSI Architecture Co-Design on Efficient Semi-Global Stereo Matching. IEEE Trans. Circuits Syst. Video Technol. 2020, 30, 4390–4403.
- Kumari, D.; Kaur, K. A Survey on Stereo Matching Techniques for 3D Vision in Image Processing. Int. J. Eng. Manuf. 2016, 6, 40–49.
- Hamzah, R.A.; Ibrahim, H. Literature Survey on Stereo Vision Disparity Map Algorithms. J. Sens. 2016, 2016, 1–23.
- Luo, W.; Schwing, A.G.; Urtasun, R. Efficient Deep Learning for Stereo Matching. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 26 June–1 July 2016; pp. 5695–5703.
- Son, J.; Kim, S.; Sohn, K. A multi-vision sensor-based fast localization system with image matching for challenging outdoor environments. Expert Syst. Appl. 2015, 42, 8830–8839.
- Se, S.; Lowe, D.; Little, J. Mobile Robot Localization and Mapping with Uncertainty using Scale-Invariant Visual Landmarks. Int. J. Rob. Res. 2002, 21, 735–758.
- Guang-lin, H.; Li, L. The Multi-vision Method for Localization Using Modified Hough Transform. In Proceedings of the 2009 WRI World Congress on Computer Science and Information Engineering, Los Angeles, CA, USA, 31 March–2 April 2009; pp. 34–37.
- Rabab, S.; Badenhorst, P.; Chen, Y.-P.P.; Daetwyler, H.D. A template-free machine vision-based crop row detection algorithm. Precis. Agric. 2021, 22, 124–153.
- Zhang, Y.; Yang, H.; Liu, Y.; Yu, N.; Liu, X.; Pei, H. Camera Calibration Algorithm for Tractor Vision Navigation. In Proceedings of the 2020 3rd International Conference on E-Business, Information Management and Computer Science, Wuhan, China, 5–6 December 2020; ACM: New York, NY, USA, 2020; pp. 459–463.
- Corno, M.; Furioli, S.; Cesana, P.; Savaresi, S.M. Adaptive Ultrasound-Based Tractor Localization for Semi-Autonomous Vineyard Operations. Agronomy 2021, 11, 287.
- Radcliffe, J.; Cox, J.; Bulanon, D.M. Machine vision for orchard navigation. Comput. Ind. 2018, 98, 165–171.
Sensors | Object Detection | Localization and Mapping | Path Planning | Path Tracking and Steering Control
---|---|---|---|---
Optical sensors | Convolutional neural network (CNN) | Simultaneous localization and mapping (SLAM) | A* algorithm | Proportional integral derivative (PID)
Radar | Region-based fully convolutional network (R-FCN) | Dead reckoning | Dijkstra algorithm | Stanley controller
Laser scanner | Fast R-CNN | Curb localization | D* algorithm | Pure pursuit algorithm
Light detection and ranging (LiDAR) | Faster R-CNN | Visual object detection | Rapidly exploring random trees (RRT) | Fuzzy logic controller
Ultrasonic | Histogram of oriented gradients (HOG) | Particle filter | Genetic algorithm (GA) | Neural networks
Global positioning system (GPS) | Single shot detector (SSD) | Extended Kalman filter | Ant colony algorithm | H-infinity controller
Global navigation satellite system (GNSS) | You only look once (YOLO) | Covariance intersection | Firefly algorithm | Model predictive controller
Inertial measurement unit (IMU) | | | |
Odometry | | | |
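To make the path tracking and steering control column above concrete, here is a minimal sketch of the pure pursuit algorithm listed there: the controller steers the front wheels toward a look-ahead point on the reference path. The function name, pose convention, and parameter values are illustrative assumptions, not taken from any of the surveyed systems.

```python
import math

def pure_pursuit_steering(pose, target, wheelbase):
    """Pure pursuit steering law for a bicycle-model vehicle.

    pose: (x, y, heading) of the rear axle, heading in radians.
    target: (x, y) look-ahead point on the reference path.
    wheelbase: axle-to-axle distance in metres.
    Returns the front-wheel steering angle in radians
    (positive = turn left).
    """
    x, y, theta = pose
    dx = target[0] - x
    dy = target[1] - y
    # Bearing of the look-ahead point relative to the current heading.
    alpha = math.atan2(dy, dx) - theta
    ld = math.hypot(dx, dy)  # look-ahead distance
    # Steer along the arc through the target: curvature = 2 sin(alpha) / ld.
    return math.atan2(2.0 * wheelbase * math.sin(alpha), ld)
```

For example, with the look-ahead point dead ahead the commanded angle is zero, while a point offset to the left yields a positive (left) steering command; tuning the look-ahead distance trades tracking accuracy against oscillation.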
Ref. | Vision System | Visual Sensors | Additional Sensors | Image Processing Algorithm | Application | Performance |
---|---|---|---|---|---|---|
[62] | Monocular | SONY-CCD-TR55, Focal length 0.011 m | N/A | HSI to detect different colored crops, Horizontal Scanning Method and Least Squares Method to detect the boundary of crop row | Tested on a 4-Wheel drive tractor in a Komatsuna field on 3 rows 20 m long at 0.2 m intervals of 1.7 m and 0.7 m width | 0.024 m maximum error in the offset and 1.5° in the attitude angle for 0.25 m/s navigation speed |
[19] | Monocular | Creative Technology Ltd. ‘Video Blaster’ 3rd revision | N/A | Level-adjustment thresholding to discriminate rows from gaps, an averaging technique using a viewport to locate rows, regression analysis to fit the best line | Tested on a Case 7140 tractor model to target a white tape 15 mm wide on a 35-s run | 0.020 m accuracy at a speed of 1 m/s |
[24] | Monocular | Cohu 2100 monochromatic camera | Trimble 4400 RTK GPS | Histogram-based segmentation, line-by-line low-pass filtering, blob analysis | Tested on a Case 2188 combine harvester, in-lab and on a typical Illinois field with an average yield of 8.61 t/ha | 0.050 m overall accuracy in day and night navigation trials at a speed of 1.3 m/s |
[63] | Monocular | Metal-oxide semiconductor-type color camera Hitachi VK-C3400A | N/A | A contrast algorithm applied on a color histogram to detect rows, image scanning with intensity threshold to locate left edge of rows, and a color discrimination algorithm applied to color histograms, average color intensity calculation, image scanning to locate both edges of rows | Tested on a 15 kW General Electric ElecTrac lawn tractor, in-lab and on a Michigan State University corn farm for 1.5 m | 0.050 m best performance at testing speed of 0.16, 0.5 and 1.11 m/s |
[64] | Monocular | Three-element video camera Model WVD5100 by Panasonic, with a vertically polarizing filter in front of the lens | N/A | Calibration to determine target reference color from average color intensity of histograms for crop row detection, image scanning to locate the row | Tested on a J.I. Case-IH 7110 tractor, for a 125-m run by keeping a stable distance of 10 cm from the left side of a corn crop row in varying environmental illumination for 70 trials | 95% of the trials reported SD from the predetermined route identical to that of a human driver with speeds of 1.33 and 3.58 m/s |
[65] | Monocular | Sony CCD color camera | Radar, Laser, Encoders, Potentiometers, Limit switches, INS, GPS | Intensity segmentation and color segmentation to detect and track the crop cut line | Tested on standard self-propelled New Holland Model 2550 windrower over 5 km in El Centro, California | 95% of rows were segmented correctly at a speed of 1.94 m/s |
[36] | Monocular | CCD color camera | N/A | Ground coordinate image transformation, R and G histogram image intensity integration to extract crop areas | Tested on a weeding machine, Nichinoki Seiko Inc. NAK-5, and a tractor at a farm of Hokkaido University over 150 m | 0.030 m RMS offset error and 0.3° heading error between camera and crop row for speeds of up to 1.3 m/s |
[66] | Monocular | Sony FCB-EX7805 CCD camera | N/A | Path segmentation using color thresholding, image scanning to determine path edges, least squares and RANSAC to find best fit lines, intersection detection | Tested on Singh sprayer in a greenhouse for a tape path and a plant path of 61 cm width for both a straight path and a 90° turn | 0.010 m average error inside a straight plant path; 0.011 m before and 0.078 m after a 90° turn at a speed of 0.2 m/s |
[67] | Monocular | Not specified | Encoders | Infrared images to heighten contrast between plants and soil, amplitude threshold to identify the area of plants, Hough transform to determine position of rows | Tested on a commercially manufactured tractor for use on horticultural plots inside artificial crop rows | ±0.020 m peak offset error with a speed of 1.5 m/s |
[68] | Monocular | A camera equipped with a fish-eye lens with a 130° field of view | N/A | 3D map reconstruction, key frame selection, camera motion computation, Hierarchical bundle adjustment | Tested on an experimental electric vehicle, Cycab, on a 127 m trajectory in sunny and cloudy weather | 0.018 m mean of the lateral deviation in straight line, 0.161 m minimum deviation in curves |
[69] | Monocular | Microsoft LifeCam Cinema web camera | IMU (CH Robotics UM6), RTK-GPS/INS (Novatel FlexPack with Tactical Grade IMU) | Lens distortion and down-sample correction, image stabilization, warp of image into overhead view, estimation of dominant parallel texture, image skewing for heading correction, frame template generation | Tested on a John Deere Gator TE electric utility vehicle for spraying weeds in wheat and sorghum stubble fields in Emerald, Australia, during day and night | 0.034 m minimum RMS error for open-loop experiments, 0.028 m for closed-loop experiments |
[26] | Monocular | COHU CCD gray-scale camera with a near-infrared filter 780 nm and 8.5 mm focal length lens | N/A | Opening operation on image to get an intensity independent gray-level image, thresholding to derive a binary image, perspective transform of world to image coordinates, Hough transform to find crop boxes | Tested on an inter-row cultivator tractor and a mobile robot for 5.5 km in a sugar beet field | 0.027 m SD for the position of the tractor and 0.023 m for the robot |
[78] | Monocular | Color video camera | GPS receiver | Thresholding of H component histogram of HIS color model to discriminate plants, vertical projection method to detect number and position of crop rows, linear scanning method to detect crop lines | Tested on a tractor in a corn field in Shang Zhuang under varying illuminations | 0.027 m maximum average error at speeds of 0.6, 1.0 and 1.4 m/s |
[104] | Monocular | GoPro Hero 3+ with a near infrared filter at 750 nm | N/A | Lower part of image cropped, green plane extraction from cropped image, thresholding to extract path plane, filtering to remove noise, centroid of path plane determination | Tested on a GEARs Surface Mobility Platform in a laboratory setting and in a peach orchard | 0.023 m RMS in-lab, 0.021 m RMS in-field |
[16] | Monocular | Camera with a near infrared optical filter 850 nm and 100 nm bandwidth and an auto-iris lens | N/A | Image intensity distribution, thresholding based on distribution of image pixels, segmentation of guidance rows | Tested on a Nebraska Ford tractor in a cotton field with straight lines of Texas agricultural experiment station | 0.35 to 0.55 m heading error |
[17] | Monocular | MOS camera with near infrared Oriel model number 57700 850-nm narrow-band pass filter and a Computar APC 1:1.3, 25 mm auto-iris lens | N/A | Image subsampling to evaluate the intensity of sample size, Bayes thresholding and class mixture coefficient calculation, RLE to identify edge points of crop rows, heuristic detection algorithm to determine parameters of lines belonging to crop rows | Tested on a Ford 7710 tractor in cotton, sorghum and soybean crop rows | −0.65 m to 0.90 m heading error, 0.0 m to −0.27 m offset error, with speeds of 0.88, 1.55 and 2.13 m/s |
[30] | Monocular | Pulnix TM500 charge coupled device (CCD) video camera | N/A | Thresholding, features extraction | Tested on a tractor-mounted steerage hoe for cereals by Garford Farm Machinery at speeds up to 1.66 m/s | 0.013 m standard error in hoe position independent of speed |
[18] | Monocular | Video Blaster camera inference | N/A | Regression analysis to estimate the best-fit line through a row of blobs within a frame, thresholding for brightness adjustment | Tested on a Case Maxxum 100 horsepower tractor for 100 m in a cotton field | 0.5 m initial displacement error from row for speeds of 1 m/s and 5 m/s, settles at 0.050 m after 20 m |
[31] | Monocular | Standard CCD camera sensitive in the near-infrared | N/A | Bandpass filtering to extract image intensity due to crop rows | Tested on a mechanical hoe in winter wheat at Silsoe, UK | 0.0156 m positional error at a speed of 1.6 m/s |
[32] | Monocular | Standard mono-chrome CCD with a near infrared bandpass filter and 4.8 mm focal length | N/A | Saturation detection in histograms of pixel intensities, image thresholding, Kalman filtering to extract row features | Tested on a hoe of 6 m span standard design by Garford Farm Machinery Ltd. for a 110 m | 0.016 m SD in lateral error and ±0.010 m mean bias at 1.6 m/s |
[33] | Monocular | Sony FCB-EX780S “block” single CCD analog color video camera | Lidar SICK LMS 200, DGPS receiver | Segmentation algorithm, adaptive thresholding, morphological operations | Tested on a John Deere 6410 tractor in a citrus grove on a 22 m straight path and a 17 m curve | 0.028 m average error in curved path at a speed of 3.1 m/s |
[59] | Binocular | Bumblebee2 parallel binocular camera BB2-08S2C-38 (baseline 120 mm, focal length 3.8 mm, horizontal field of view 66°) | N/A | Improved 2G-R-B grayscale transformation, Harris corner point detection, rank NSAD region matching, RANSAC refining of disparity, location of 3D position, crop row classification, centerline pathway determination | Tested on a manual four-wheel trolley in cotton fields on cloudy and sunny days at speeds between 1.0 and 2.0 m/s | 92.78% detection rate for average deviation angle, 1.05° absolute average value, 3.66° average SD |
[29] | Binocular | STH-MD1 (VidereDesign, CA) stereo camera | RTK-GPS | Disparity map computing, 3D points reconstruction, C-to-V transformation, elevation map creation, median filtering, navigation point determination | Tested on a John Deere 7700 tractor in soya bean fields | 0.050 m RMS error of lateral deviation on straight and curved rows at speeds up to 3.0 m/s |
[86] | Binocular | Bumblebee2 binocular vision system Model BB2-03S2C-60 | N/A | SURF for feature extraction and matching to obtain feature pairs, confidence density image construction by integrating the enhanced elevation image and the corresponding binarized crop row image | Tested on a smart agricultural robot manufactured in Shanghai, China on S-type and O-type in-lab simulated crop plant leaves paths | 0.7° absolute mean of turning angle and 1.5° absolute SD for speed less than 0.5 m/s |
[57] | Binocular | Bumblebee 2 by Point Grey | N/A | Excess green minus excess red function to transform RGB to greyscale, smallest univalue segment assimilating nucleus (SUSAN) detector to detect contours of crop rows, stereo matching with Census transform to calculate disparity of corner points of crop rows | Tested on a Revo Leopard TG1254 tractor on a cotton-leaf row with a crop distance of 0.60 m, row spacing of 1.20 m and leaf heights from 0.25 m minimum to 0.50 m maximum, at 1.2 m/s | 0.95° mean absolute deviation of course angle / 1.26° SD, 0.040 m mean absolute deviation of lateral position / 0.049 m SD, 2.99° mean absolute deviation of front wheel angle / 0.036 m SD |
[85] | Binocular | RER-720P2 CAM-90 by RERVISION Technology Co., Ltd. (Shenzhen, China) | RTK-GPS | Threshold segmentation, RGB to HSV color space conversion, HSV channel segmentation, Otsu threshold segmentation of V component and morphological filtering, point operation of S and V components and Otsu threshold segmentation for shadow processing | Tested on an experimental autonomous carrier in a field road 1.2 m wide with a change in altitude and curvature | 0.031 m, 0.069 m and 0.105 m mean deviation for straight, multi-curvature and undulating roads, respectively, at 2 m/s |
[21] | Multi-vision | Two COHU 2100 series monochrome cameras with 800 nm narrow-band NIR filters | N/A | Full image processing, vertical transition image reduction and adaptive fuzzy linear regression | Tested on a Case 2188 Axial-Flow combine in a corn field | Not reported (the algorithm satisfactorily guided the combine) |
[22] | Multi-vision | Two COHU 2100 series monochrome cameras with 800 nm filters and 3 mm focal length lenses | AFS beacon GPS | Two-class K-means threshold segmentation, run-length encoding to simplify the segmented image, classification, transition detection, sequential linear regression, adaptive fuzzy evaluation | Tested on a Case 2188 Axial-Flow combine in a corn field | Not reported (the system successfully harvested corn autonomously at speeds of up to 2.66 m/s) |
[23] | Multi-vision | Three cameras | Trimble (Sunnyvale, CA) Ag122 beacon GPS, Trimble 4400 RTK GPS receiver | Adaptive segmentation based on row-by-row histograms of image intensity, low-pass filtering, blob analysis by run-length encoding, linear regression to track the inter-row center | Tested on a Case 2188 rotary combine in a 4.6 ha corn field, day and night, for 14 runs at speeds between 0.8 and 1.3 m/s | 0.006 m overall accuracy with 0.133 m SD, 0.003 m average daytime accuracy with 0.133 m SD, −0.024 m average nighttime accuracy with 0.129 m SD |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Vrochidou, E.; Oustadakis, D.; Kefalas, A.; Papakostas, G.A. Computer Vision in Self-Steering Tractors. Machines 2022, 10, 129. https://doi.org/10.3390/machines10020129