Localization and Mapping for Robots in Agriculture and Forestry: A Survey
Abstract
1. Introduction
2. The Localization and Mapping Problem
2.1. The SLAM Method
2.1.1. Solving the SLAM Problem
2.1.2. Mapping the Environment
2.2. The Visual Odometry Method
3. Methodology
- Application: Agricultural or forest application of the desired autonomous system.
- Localization approach: The methods and sensors used to localize the robot.
- Mapping approach: The methods and sensors used to map the environment.
- Scalability: Evaluation of the algorithm's capacity to handle large-scale paths.
- Availability: Evaluation of whether the algorithm can provide reliable localization right away, without the need to build a prior map of the environment.
- Description: Agricultural or forestry area where the data were collected, as well as sensor information.
- Large-scale: Whether or not the data were collected in large-scale environments and along large-scale paths.
- Long-term: Whether or not the data were collected in different seasons of the year and at different times of the day.
4. Localization and Mapping in Agriculture
5. Localization and Mapping in Forestry
6. Datasets for Localization and Mapping in Agriculture and Forestry
7. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Auat Cheein, F.A.; Carelli, R. Agricultural Robotics: Unmanned Robotic Service Units in Agricultural Tasks. IEEE Ind. Electron. Mag. 2013, 7, 48–58. [Google Scholar] [CrossRef]
- Skvortsov, E.; Skvortsova, E.; Sandu, I.; Iovlev, G. Transition of Agriculture to Digital, Intellectual and Robotics Technologies. Econ. Reg. 2018, 14, 1014–1028. [Google Scholar] [CrossRef]
- Billingsley, J.; Visala, A.; Dunn, M. Robotics in Agriculture and Forestry. In Springer Handbook of Robotics; Siciliano, B., Khatib, O., Eds.; Springer: Berlin/Heidelberg, Germany, 2008; pp. 1065–1077. [Google Scholar] [CrossRef] [Green Version]
- Roldán, J.J.; del Cerro, J.; Garzón-Ramos, D.; Garcia-Aunon, P.; Garzón, M.; de León, J.; Barrientos, A. Robots in Agriculture: State of Art and Practical Experiences. In Service Robots; InTech: London, UK, 2018. [Google Scholar] [CrossRef] [Green Version]
- Perez-Ruiz, M.; Upadhyaya, S. GNSS in Precision Agricultural Operations. In New Approach of Indoor and Outdoor Localization Systems; InTech: London, UK, 2012. [Google Scholar] [CrossRef] [Green Version]
- Guo, J.; Li, X.; Li, Z.; Hu, L.; Yang, G.; Zhao, C.; Fairbairn, D.; Watson, D.; Ge, M. Multi-GNSS precise point positioning for precision agriculture. Precis. Agric. 2018, 19, 895–911. [Google Scholar] [CrossRef] [Green Version]
- De Aguiar, A.S.P.; dos Santos, F.B.N.; dos Santos, L.C.F.; de Jesus Filipe, V.M.; de Sousa, A.J.M. Vineyard trunk detection using deep learning—An experimental device benchmark. Comput. Electron. Agric. 2020, 175, 105535. [Google Scholar] [CrossRef]
- Santos, L.C.; Aguiar, A.S.; Santos, F.N.; Valente, A.; Ventura, J.B.; Sousa, A.J. Navigation Stack for Robots Working in Steep Slope Vineyard. In Advances in Intelligent Systems and Computing; Springer International Publishing: New York, NY, USA, 2020; pp. 264–285. [Google Scholar] [CrossRef]
- Durrant-Whyte, H.; Bailey, T. Simultaneous localization and mapping: Part I. IEEE Robot. Autom. Mag. 2006, 13, 99–110. [Google Scholar] [CrossRef] [Green Version]
- Bailey, T.; Durrant-Whyte, H. Simultaneous localization and mapping (SLAM): Part II. IEEE Robot. Autom. Mag. 2006, 13, 108–117. [Google Scholar] [CrossRef] [Green Version]
- Cadena, C.; Carlone, L.; Carrillo, H.; Latif, Y.; Scaramuzza, D.; Neira, J.; Reid, I.; Leonard, J.J. Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-Perception Age. IEEE Trans. Robot. 2016, 32, 1309–1332. [Google Scholar] [CrossRef] [Green Version]
- Nister, D.; Naroditsky, O.; Bergen, J. Visual odometry. In Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2004, CVPR 2004, Washington, DC, USA, 27 June–2 July 2004. [Google Scholar] [CrossRef]
- Scaramuzza, D.; Fraundorfer, F. Visual Odometry [Tutorial]. IEEE Robot. Autom. Mag. 2011, 18, 80–92. [Google Scholar] [CrossRef]
- Kohlbrecher, S.; Meyer, J.; Graber, T.; Petersen, K.; Klingauf, U.; von Stryk, O. Hector Open Source Modules for Autonomous Mapping and Navigation with Rescue Robots. In RoboCup 2013: Robot World Cup XVII; Springer: Berlin/Heidelberg, Germany, 2014; pp. 624–631. [Google Scholar] [CrossRef] [Green Version]
- Grisetti, G.; Stachniss, C.; Burgard, W. Improved Techniques for Grid Mapping With Rao-Blackwellized Particle Filters. IEEE Trans. Robot. 2007, 23, 34–46. [Google Scholar] [CrossRef] [Green Version]
- Dong, J.; Burnham, J.G.; Boots, B.; Rains, G.; Dellaert, F. 4D crop monitoring: Spatio-temporal reconstruction for agriculture. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017. [Google Scholar] [CrossRef] [Green Version]
- Hess, W.; Kohler, D.; Rapp, H.; Andor, D. Real-time loop closure in 2D LIDAR SLAM. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016. [Google Scholar] [CrossRef]
- Williams, B.; Cummins, M.; Neira, J.; Newman, P.; Reid, I.; Tardós, J. A comparison of loop closing techniques in monocular SLAM. Robot. Auton. Syst. 2009, 57, 1188–1197. [Google Scholar] [CrossRef] [Green Version]
- Smith, R.C.; Cheeseman, P. On the Representation and Estimation of Spatial Uncertainty. Int. J. Robot. Res. 1986, 5, 56–68. [Google Scholar] [CrossRef]
- Durrant-Whyte, H. Uncertain geometry in robotics. IEEE J. Robot. Autom. 1988, 4, 23–31. [Google Scholar] [CrossRef]
- Smith, R.; Self, M.; Cheeseman, P. Estimating Uncertain Spatial Relationships in Robotics. In Autonomous Robot Vehicles; Springer: New York, NY, USA, 1990; pp. 167–193. [Google Scholar] [CrossRef] [Green Version]
- Leonard, J.J.; Durrant-Whyte, H.F. Simultaneous map building and localization for an autonomous mobile robot. In Proceedings of the IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS ’91), Osaka, Japan, 3–5 November 1991; Volume 3, pp. 1442–1447. [Google Scholar]
- Thrun, S. Simultaneous Localization and Mapping. In Robotics and Cognitive Approaches to Spatial Mapping; Springer: Berlin/Heidelberg, Germany, 2008; pp. 13–41. [Google Scholar] [CrossRef] [Green Version]
- Bailey, T.; Nieto, J.; Guivant, J.; Stevens, M.; Nebot, E. Consistency of the EKF-SLAM Algorithm. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006. [Google Scholar] [CrossRef] [Green Version]
- Paz, L.; Tardos, J.; Neira, J. Divide and Conquer: EKF SLAM in O(n). IEEE Trans. Robot. 2008, 24, 1107–1120. [Google Scholar] [CrossRef]
- Kalman, R.E. A New Approach to Linear Filtering and Prediction Problems. J. Basic Eng. 1960, 82, 35–45. [Google Scholar] [CrossRef] [Green Version]
- Pinies, P.; Tardos, J.D. Scalable SLAM building conditionally independent local maps. In Proceedings of the 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Diego, CA, USA, 29 October–2 November 2007. [Google Scholar] [CrossRef] [Green Version]
- Pinies, P.; Tardos, J. Large-Scale SLAM Building Conditionally Independent Local Maps: Application to Monocular Vision. IEEE Trans. Robot. 2008, 24, 1094–1106. [Google Scholar] [CrossRef]
- Maybeck, P. Stochastic Models, Estimation, and Control; Academic Press: New York, NY, USA; London, UK; Paris, France, 1982. [Google Scholar]
- Walter, M.R.; Eustice, R.M.; Leonard, J.J. Exactly Sparse Extended Information Filters for Feature-based SLAM. Int. J. Robot. Res. 2007, 26, 335–359. [Google Scholar] [CrossRef] [Green Version]
- Eustice, R.; Singh, H.; Leonard, J.; Walter, M.; Ballard, R. Visually Navigating the RMS Titanic with SLAM Information Filters. In Robotics: Science and Systems I; Robotics: Science and Systems Foundation; Massachusetts Institute of Technology: Cambridge, MA, USA, 2005. [Google Scholar] [CrossRef]
- Thrun, S.; Liu, Y. Multi-robot SLAM with Sparse Extended Information Filers. In Springer Tracts in Advanced Robotics; Springer: Berlin/Heidelberg, Germany, 2005; pp. 254–266. [Google Scholar] [CrossRef]
- Montemerlo, M.; Thrun, S.; Koller, D.; Wegbreit, B. FastSLAM: A factored solution to the simultaneous localization and mapping problem. In Proceedings of the AAAI National Conference on Artificial Intelligence/IAAI, Edmonton, AB, Canada, 28 July–2 August 2002; pp. 593–598. [Google Scholar]
- Montemerlo, M.; Thrun, S.; Koller, D.; Wegbreit, B. FastSLAM 2.0: An improved particle filtering algorithm for simultaneous localization and mapping that provably converges. In Proceedings of the IJCAI, Acapulco, Mexico, 9–15 August 2003; pp. 1151–1156. [Google Scholar]
- Grisetti, G.; Stachniss, C.; Burgard, W. Improving Grid-based SLAM with Rao-Blackwellized Particle Filters by Adaptive Proposals and Selective Resampling. In Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, 18–22 April 2005. [Google Scholar] [CrossRef]
- Lu, F.; Milios, E. Globally Consistent Range Scan Alignment for Environment Mapping. Auton. Robot. 1997, 4, 333–349. [Google Scholar] [CrossRef]
- Bresson, G.; Alsayed, Z.; Yu, L.; Glaser, S. Simultaneous Localization and Mapping: A Survey of Current Trends in Autonomous Driving. IEEE Trans. Intell. Veh. 2017, 2, 194–220. [Google Scholar] [CrossRef] [Green Version]
- Aguiar, A.S.; Santos, F.N.D.; Sousa, A.J.M.D.; Oliveira, P.M.; Santos, L.C. Visual Trunk Detection Using Transfer Learning and a Deep Learning-Based Coprocessor. IEEE Access 2020, 8, 77308–77320. [Google Scholar] [CrossRef]
- Bar-Shalom, Y.; Fortmann, T.E.; Cable, P.G. Tracking and Data Association. J. Acoust. Soc. Am. 1990, 87, 918–919. [Google Scholar] [CrossRef]
- Cox, I.J. A review of statistical data association techniques for motion correspondence. Int. J. Comput. Vis. 1993, 10, 53–66. [Google Scholar] [CrossRef]
- Montemerlo, M.; Thrun, S. Simultaneous localization and mapping with unknown data association using FastSLAM. In Proceedings of the 2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422), Taipei, Taiwan, 14–19 September 2003. [Google Scholar] [CrossRef] [Green Version]
- Hähnel, D.; Thrun, S.; Wegbreit, B.; Burgard, W. Towards Lazy Data Association in SLAM. In Springer Tracts in Advanced Robotics; Springer: Berlin/Heidelberg, Germany, 2005; pp. 421–431. [Google Scholar] [CrossRef] [Green Version]
- Neira, J.; Tardos, J. Data association in stochastic mapping using the joint compatibility test. IEEE Trans. Robot. Autom. 2001, 17, 890–897. [Google Scholar] [CrossRef] [Green Version]
- Thrun, S.; Burgard, W.; Fox, D. A Probabilistic Approach to Concurrent Mapping and Localization for Mobile Robots. Mach. Learn. 1998, 31, 29–53. [Google Scholar] [CrossRef] [Green Version]
- Zhou, W.; Cao, Z.; Dong, Y. Review of SLAM Data Association Study. In Proceedings of the 2016 International Conference on Sensor Network and Computer Engineering, Xi’an, China, 8–10 July 2016; Atlantis Press: Amsterdam, The Netherlands, 2016. [Google Scholar] [CrossRef] [Green Version]
- Tardós, J.D.; Neira, J.; Newman, P.M.; Leonard, J.J. Robust Mapping and Localization in Indoor Environments Using Sonar Data. Int. J. Robot. Res. 2002, 21, 311–330. [Google Scholar] [CrossRef] [Green Version]
- Shan, T.; Englot, B. LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018. [Google Scholar] [CrossRef]
- IEEE Standard for Robot Map Data Representation for Navigation. In Proceedings of the IROS2014 (IEEE/RSJ International Conference on Intelligent Robots and Systems) Workshop on “Standardized Knowledge Representation and Ontologies for Robotics and Automation”, Chicago, IL, USA, 14–18 September 2014; pp. 3–4. [CrossRef]
- Yi, C. Map Representation for Robots. Smart Comput. Rev. 2012. [Google Scholar] [CrossRef]
- Lowry, S.; Sunderhauf, N.; Newman, P.; Leonard, J.J.; Cox, D.; Corke, P.; Milford, M.J. Visual Place Recognition: A Survey. IEEE Trans. Robot. 2016, 32, 1–19. [Google Scholar] [CrossRef] [Green Version]
- Walter, M.; Hemachandra, S.; Homberg, B.; Tellex, S.; Teller, S. Learning Semantic Maps from Natural Language Descriptions. In Robotics: Science and Systems IX; Robotics: Science and Systems Foundation; Technische Universität Berlin: Berlin, Germany, 2013. [Google Scholar] [CrossRef]
- Vasudevan, S.; Gächter, S.; Nguyen, V.; Siegwart, R. Cognitive maps for mobile robots—an object based approach. Robot. Auton. Syst. 2007, 55, 359–371. [Google Scholar] [CrossRef] [Green Version]
- dos Santos, F.B.N.; Sobreira, H.M.P.; Campos, D.F.B.; dos Santos, R.M.P.M.; Moreira, A.P.G.M.; Contente, O.M.S. Towards a Reliable Monitoring Robot for Mountain Vineyards. In Proceedings of the 2015 IEEE International Conference on Autonomous Robot Systems and Competitions, Vila Real, Portugal, 8–10 April 2015. [Google Scholar] [CrossRef]
- Yousif, K.; Bab-Hadiashar, A.; Hoseinnezhad, R. An Overview to Visual Odometry and Visual SLAM: Applications to Mobile Robotics. Intell. Ind. Syst. 2015, 1, 289–311. [Google Scholar] [CrossRef]
- Fraundorfer, F.; Scaramuzza, D. Visual Odometry: Part II: Matching, Robustness, Optimization, and Applications. IEEE Robot. Autom. Mag. 2012, 19, 78–90. [Google Scholar] [CrossRef] [Green Version]
- Agarwal, S.; Snavely, N.; Seitz, S.M.; Szeliski, R. Bundle Adjustment in the Large. In Computer Vision—ECCV 2010; Daniilidis, K., Maragos, P., Paragios, N., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; pp. 29–42. [Google Scholar]
- Ziegler, J.; Bender, P.; Schreiber, M.; Lategahn, H.; Strauss, T.; Stiller, C.; Dang, T.; Franke, U.; Appenrodt, N.; Keller, C.G.; et al. Making Bertha Drive—An Autonomous Journey on a Historic Route. IEEE Intell. Transp. Syst. Mag. 2014, 6, 8–20. [Google Scholar] [CrossRef]
- Freitas, G.; Zhang, J.; Hamner, B.; Bergerman, M.; Kantor, G. A Low-Cost, Practical Localization System for Agricultural Vehicles. In Intelligent Robotics and Applications; Springer: Berlin/Heidelberg, Germany, 2012; pp. 365–375. [Google Scholar] [CrossRef] [Green Version]
- Libby, J.; Kantor, G. Deployment of a point and line feature localization system for an outdoor agriculture vehicle. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011. [Google Scholar] [CrossRef] [Green Version]
- Duarte, M.; dos Santos, F.N.; Sousa, A.; Morais, R. Agricultural Wireless Sensor Mapping for Robot Localization. In Advances in Intelligent Systems and Computing; Springer International Publishing: New York, NY, USA, 2015; pp. 359–370. [Google Scholar] [CrossRef]
- Zaman, S.; Comba, L.; Biglia, A.; Aimonino, D.R.; Barge, P.; Gay, P. Cost-effective visual odometry system for vehicle motion control in agricultural environments. Comput. Electron. Agric. 2019, 162, 82–94. [Google Scholar] [CrossRef]
- Habibie, N.; Nugraha, A.M.; Anshori, A.Z.; Masum, M.A.; Jatmiko, W. Fruit mapping mobile robot on simulated agricultural area in Gazebo simulator using simultaneous localization and mapping (SLAM). In Proceedings of the 2017 International Symposium on Micro-NanoMechatronics and Human Science (MHS), Nagoya, Japan, 3–6 December 2017. [Google Scholar] [CrossRef]
- Younse, P.; Burks, T. Greenhouse Robot Navigation Using KLT Feature Tracking for Visual Odometry. Agric. Eng. Int. CIGR J. 2007, IX, 62744503. [Google Scholar]
- Bayar, G.; Bergerman, M.; Koku, A.B.; İlhan Konukseven, E. Localization and control of an autonomous orchard vehicle. Comput. Electron. Agric. 2015, 115, 118–128. [Google Scholar] [CrossRef] [Green Version]
- Le, T.; Gjevestad, J.G.O.; From, P.J. Online 3D Mapping and Localization System for Agricultural Robots. IFAC-PapersOnLine 2019, 52, 167–172. [Google Scholar] [CrossRef]
- Cheein, F.A.; Steiner, G.; Paina, G.P.; Carelli, R. Optimized EIF-SLAM algorithm for precision agriculture mapping based on stems detection. Comput. Electron. Agric. 2011, 78, 195–207. [Google Scholar] [CrossRef]
- Chebrolu, N.; Lottes, P.; Labe, T.; Stachniss, C. Robot Localization Based on Aerial Images for Precision Agriculture Tasks in Crop Fields. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019. [Google Scholar] [CrossRef]
- Blok, P.M.; van Boheemen, K.; van Evert, F.K.; IJsselmuiden, J.; Kim, G.H. Robot navigation in orchards with localization based on Particle filter and Kalman filter. Comput. Electron. Agric. 2019, 157, 261–269. [Google Scholar] [CrossRef]
- Piyathilaka, L.; Munasinghe, R. Vision-only outdoor localization of two-wheel tractor for autonomous operation in agricultural fields. In Proceedings of the 2011 6th International Conference on Industrial and Information Systems, Kandy, Sri Lanka, 16–19 August 2011. [Google Scholar] [CrossRef]
- Iqbal, J.; Xu, R.; Sun, S.; Li, C. Simulation of an Autonomous Mobile Robot for LiDAR-Based In-Field Phenotyping and Navigation. Robotics 2020, 9, 46. [Google Scholar] [CrossRef]
- Bietresato, M.; Carabin, G.; D’Auria, D.; Gallo, R.; Ristorto, G.; Mazzetto, F.; Vidoni, R.; Gasparetto, A.; Scalera, L. A tracked mobile robotic lab for monitoring the plants volume and health. In Proceedings of the 2016 12th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA), Auckland, New Zealand, 29–31 August 2016. [Google Scholar] [CrossRef] [Green Version]
- Utstumo, T.; Urdal, F.; Brevik, A.; Dørum, J.; Netland, J.; Overskeid, Ø.; Berge, T.W.; Gravdahl, J.T. Robotic in-row weed control in vegetables. Comput. Electron. Agric. 2018, 154, 36–45. [Google Scholar] [CrossRef]
- Fountas, S.; Mylonas, N.; Malounas, I.; Rodias, E.; Santos, C.H.; Pekkeriet, E. Agricultural Robotics for Field Operations. Sensors 2020, 20, 2672. [Google Scholar] [CrossRef]
- Qian, C.; Liu, H.; Tang, J.; Chen, Y.; Kaartinen, H.; Kukko, A.; Zhu, L.; Liang, X.; Chen, L.; Hyyppä, J. An Integrated GNSS/INS/LiDAR-SLAM Positioning Method for Highly Accurate Forest Stem Mapping. Remote Sens. 2016, 9, 3. [Google Scholar] [CrossRef] [Green Version]
- Hussein, M.; Renner, M.; Iagnemma, K. Global Localization of Autonomous Robots in Forest Environments. Photogramm. Eng. Remote Sens. 2015, 81, 839–846. [Google Scholar] [CrossRef]
- Li, Q.; Nevalainen, P.; Queralta, J.P.; Heikkonen, J.; Westerlund, T. Localization in Unstructured Environments: Towards Autonomous Robots in Forests with Delaunay Triangulation. Remote Sens. 2020, 12, 1870. [Google Scholar] [CrossRef]
- Pierzchała, M.; Giguère, P.; Astrup, R. Mapping forests using an unmanned ground vehicle with 3D LiDAR and graph-SLAM. Comput. Electron. Agric. 2018, 145, 217–225. [Google Scholar] [CrossRef]
- Rossmann, D.I.J. Navigation of Mobile Robots in Natural Environments: Using Sensor Fusion in Forestry; Springer: Berlin/Heidelberg, Germany, 2013; pp. 43–52. [Google Scholar]
- Miettinen, M.; Ohman, M.; Visala, A.; Forsman, P. Simultaneous Localization and Mapping for Forest Harvesters. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, 10–14 April 2007. [Google Scholar] [CrossRef]
- Öhman, M.; Miettinen, M.; Kannas, K.; Jutila, J.; Visala, A.; Forsman, P. Tree Measurement and Simultaneous Localization and Mapping System for Forest Harvesters. In Springer Tracts in Advanced Robotics; Springer: Berlin/Heidelberg, Germany, 2008; pp. 369–378. [Google Scholar] [CrossRef] [Green Version]
- Hyyti, H.; Visala, A. Feature Based Modeling and Mapping of Tree Trunks and Natural Terrain Using 3D Laser Scanner Measurement System. IFAC Proc. Vol. 2013, 46, 248–255. [Google Scholar] [CrossRef] [Green Version]
- Hyyti, H.; Öhman, M.; Miettinen, M.; Visala, A. Heuristic correlation based laser odometry method for unconstructed environment. In Proceedings of the IASTED International Conference on Robotics and Applications, Cambridge, MA, USA, 2–4 November 2009; pp. 194–200. [Google Scholar]
- Tang, J.; Chen, Y.; Kukko, A.; Kaartinen, H.; Jaakkola, A.; Khoramshahi, E.; Hakala, T.; Hyyppä, J.; Holopainen, M.; Hyyppä, H. SLAM-Aided Stem Mapping for Forest Inventory with Small-Footprint Mobile LiDAR. Forests 2015, 6, 4588–4606. [Google Scholar] [CrossRef] [Green Version]
- Chebrolu, N.; Lottes, P.; Schaefer, A.; Winterhalter, W.; Burgard, W.; Stachniss, C. Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields. Int. J. Robot. Res. 2017, 36, 1045–1052. [Google Scholar] [CrossRef] [Green Version]
- Kragh, M.; Christiansen, P.; Laursen, M.; Larsen, M.; Steen, K.; Green, O.; Karstoft, H.; Jørgensen, R. FieldSAFE: Dataset for Obstacle Detection in Agriculture. Sensors 2017, 17, 2579. [Google Scholar] [CrossRef] [Green Version]
- Ali, I.; Durmush, A.; Suominen, O.; Yli-Hietanen, J.; Peltonen, S.; Collin, J.; Gotchev, A. FinnForest dataset: A forest landscape for visual SLAM. Robot. Auton. Syst. 2020, 132, 103610. [Google Scholar] [CrossRef]
- Pire, T.; Mujica, M.; Civera, J.; Kofman, E. The Rosario dataset: Multisensor data for localization and mapping in agricultural environments. Int. J. Robot. Res. 2019, 38, 633–641. [Google Scholar] [CrossRef] [Green Version]
- Reis, R.; dos Santos, F.N.; Santos, L. Forest Robot and Datasets for Biomass Collection. In Advances in Intelligent Systems and Computing; Springer International Publishing: New York, NY, USA, 2019; pp. 152–163. [Google Scholar] [CrossRef]
Ref. | Agricultural Application | Localization Approach | Mapping Approach | Tested in Real Scenario | Accuracy | Scalability | Availability |
---|---|---|---|---|---|---|---|
Freitas et al. [58,59] (2012) | Precision agriculture in tree fruit production | (2D) EKF-based. Wheel odometry and laser data. Uses point and line features to match with a previously built map. | Offline orchard metric mapping for artificial landmark detection. | Yes. Three experiments with more than 1.8 km each. | Low error for flat and dry terrains (0.2 m). High errors in steep terrains (up to 6 m). | Yes. Long-term experiments performed. | No. The method requires a previously built map. |
Duarte et al. [60] (2015) | Autonomous navigation on steep slope vineyards. | (2D) PF-based. Fusion of GPS, wheel odometry, and previously mapped landmarks. | Offline metric mapping. Use of wireless sensors to compute landmark locations. | Yes. However, tests were done in an urban environment. | Beacons mapped with 1.5 m of average error. Robot pose estimation not evaluated quantitatively. | Not tested. | No. No proper performance without a previously built metric map. |
Zaman et al. [61] (2019) | Precision agriculture. | (2D) VO algorithm based on a cross-correlation approach. | - | Yes. Tested on soil, grass, concrete, asphalt, and gravel terrains. | Normalized cumulative error of 0.08 mm for short paths. | No. System performance degrades as the path length increases. | Yes. No need for a first passage of the algorithm in the agricultural field. |
Habibie et al. [62] (2017) | Monitoring of ripe fruit. | (2D) Use of the state-of-the-art Gmapping [15] and Hector SLAM [14] approaches. | Combination of metric maps. Occupancy grid map generated by the SLAM approach, fused with tree/fruit detection. | No. Experiments only performed in simulation. | Localization not quantitatively evaluated. Accurate detection of simulated fruits and trees. | Not tested. | Not tested. |
Younse et al. [63] (2007) | Greenhouse spraying. | (2D) VO algorithm. Use of the Kanade–Lucas–Tomasi (KLT) feature tracker. | - | Yes. Short-term tests in indoor environments, and outdoors with different ground surfaces. | 12.4 cm translation error for a short path of 305 cm, and 8° orientation error for a 180° rotation. | Not tested. | In theory. |
Bayar et al. [64] (2015) | Autonomous navigation in orchards. | (2D) Fuses wheel odometry and laser range data. Assumes that the orchard rows' length is known. Localization only relative to the rows' lines. | - | Yes. Experiments performed in several orchard rows. | Low errors relative to the rows' lines. No quantitative average values provided. | Partially. Only under the assumption that the row length is known, and that the localization is relative to the rows' lines. | No. Requires the previously mentioned assumptions. |
Le et al. [65] (2018) | General agricultural tasks. | (3D) Localization based on non-linear optimization techniques. Uses the Levenberg–Marquardt algorithm. | 3D LiDAR mapping. Uses edge and planar features extracted directly from the point cloud. | Yes. Tested both in real and simulated scenarios. | Less than 2% translation error for a 500 m trajectory. | Yes. A successful long-term experiment was performed. Loop closure supported. | Yes. The system is able to perform online SLAM without any prior map. |
Cheein et al. [66] (2011) | Autonomous navigation in olive groves. | (2D) Extended IF-based SLAM. Uses a laser sensor and a monocular vision system. | Metric map composed of olive stems. Support Vector Machine used to detect stems, and laser data to map them. | Yes. Tests performed in a real olive grove. | Successful reconstruction of the entire olive grove. Consistent SLAM approach. Error does not exceed 0.5 m. | Yes. The method remains long-term consistent while operating in the entire olive grove. | Yes. The system is able to perform online SLAM without any prior map. |
Chebrolu et al. [67] (2019) | Precision agriculture in crop fields. | (2D) PF-based. Fuses wheel odometry with camera visual data. | Offline mapping. Metric-semantic map of landmarks built from aerial images. | Yes. Real experiments performed on a sugar beet field. | Maximum error of 17 cm on a >200 m path. | Yes. Experiments show good performance on long-term paths. | Partially. Only if a previously extracted aerial map is available. |
Blok et al. [68] (2019) | Autonomous navigation in orchards. | (2D) PF-based: uses a laser beam model. KF-based: uses a line-detection algorithm. | - | Yes. Tests on two real orchard paths. | Lateral deviation errors <10 cm and angular deviation <4°. | Yes. Successful tests on orchard paths of 100 m. | In theory. Does not need any prior mapping information. |
Piyathilaka et al. [69] (2011) | General agricultural tasks. | (2D) EKF-based. Fusion of a VO approach with a stereo vision range measurement system. | - | Yes. Experiments performed in a real outdoor environment. | Average error of 53.84 cm on the tested sequence. | No. High cumulative error for long paths. | In theory. Does not need any prior mapping information. |
Iqbal et al. [70] (2020) | Phenotyping, plant volume and canopy height measurement. | (2D) EKF-based. Fusion of 2D laser, IMU, and GPS data. | 2D point cloud map built by successive registration. | No. Experiments performed in the Gazebo simulator. | 2.25 cm error on a 32.5 m path, and 7.78 cm on a 38.8 m path. | Not tested. | Yes. Does not require prior map information. |
Bietresato et al. [71] (2016) | Volume reconstruction and mapping of vegetation. | (2D) Fusion of sonar, IMU, and RTK GPS. | Plant volume calculation using LiDAR sensors and optical data to obtain normalized difference vegetation index (NDVI) maps. | Yes. Mapping approach tested in indoor and outdoor environments. | Localization system not tested. | Not tested. | Yes. Does not require prior map information. |
Utstumo et al. [72] (2018) | In-row weed control. | (2D) EKF-based. Uses a forward-facing monocular camera and a GPS module. | Support Vector Machine (SVM) used to extract environment features and create a spray map. | No. Localization and mapping not tested. | Localization system not tested. | Not tested. | Not tested. |
Santos et al. [8] (2020) | Autonomous navigation on steep slope vineyards. | (2D) PF-based. Fuses wheel odometry with a vision system. | Metric map composed of high-level landmarks detected using Deep Learning techniques. | Yes. One experiment done on a real vineyard. | Average error of 10.12 cm over the tested sequence. | Not tested. | Yes. The approach does not need a previously built map. |
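
Several of the agricultural systems summarized above fuse wheel odometry with landmark observations through an EKF or particle filter. The sketch below is only a generic illustration of that EKF localization cycle, not the implementation of any surveyed work; the unicycle motion model, landmark coordinates, noise covariances, and variable names are assumptions made for the example.

```python
# Minimal sketch (illustrative assumptions only) of 2D EKF localization:
# wheel odometry drives the prediction step, a range-bearing observation of a
# known, previously mapped landmark drives the correction step.
import numpy as np

def ekf_predict(x, P, v, w, dt, Q):
    """Propagate pose x = [px, py, theta] with a unicycle odometry model."""
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    # Jacobian of the motion model with respect to the state.
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, landmark, R):
    """Correct the pose with a range-bearing measurement z of a known landmark."""
    dx, dy = landmark - x[:2]
    q = dx**2 + dy**2
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - x[2]])
    # Jacobian of the observation model with respect to the pose.
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0.0],
                  [ dy / q,          -dx / q,         -1.0]])
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi   # wrap the bearing residual
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(3) - K @ H) @ P

# Usage with made-up numbers: one odometry step, then one landmark correction.
x, P = np.zeros(3), np.eye(3) * 0.1
Q = np.diag([0.02, 0.02, 0.01])           # assumed odometry noise
R = np.diag([0.1, np.deg2rad(2.0)]) ** 2  # assumed range/bearing noise
x, P = ekf_predict(x, P, v=0.5, w=0.05, dt=0.1, Q=Q)
x, P = ekf_update(x, P, z=np.array([4.0, 0.3]), landmark=np.array([4.0, 1.0]), R=R)
```
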
Ref. | Forestry Application | Localization Approach | Mapping Approach | Tested in Real Scenario | Accuracy | Scalability | Availability |
---|---|---|---|---|---|---|---|
Qian et al. [74] (2016) | Accurate forest stem mapping. | (2D) Fusion of GNSS/INS with a scan-match-based approach solved using Improved Maximum Likelihood Estimation. | Metric occupancy grid map built from laser scan data. | Yes. Real field experiments performed. | Positioning accuracy of 13 cm for the field data sequence. | Yes. Successful results in a long-term 800 m sequence. | Yes. The algorithm does not require prior map information. |
Hussein et al. [75] (2015) | Autonomous navigation in forests. | (2D) Localization based on a scan-matching procedure. | Metric map of trees generated by on-board LiDAR sensors. Map matching with a global map generated from aerial orthoimagery. | Yes. Experiments performed in a real forest. | Average error of <2 m for the robot pose. | Partially. Long-term experiments performed, but with considerable errors. | No. Requires map generation from aerial images. |
Li et al. [76] (2020) | Autonomous harvesting and transportation. | (2D) Map matching localization approach based on Delaunay triangulation. | 3D LiDAR-based stem mapping. | Yes. Real experiment using a forestry dataset. | Location accuracy of 12 cm on the tested sequence. | Yes. Successful results in a long-term path (200 m). | No. Requires a previously built stem map. |
Pierzchała et al. [77] (2018) | 3D forest mapping. | (3D) Graph-based SLAM. Uses the Levenberg–Marquardt method. | 3D point cloud map generated using LiDAR odometry, with graph optimization through loop closure detection. | Yes. Data recorded by the authors' robot in a forest. | SLAM system provides tree positioning accuracy with a mean error of 4.76 cm. | Yes. Successful long-term real experiments on a 130.7 m sequence. | Yes. The method performs online SLAM without the need of a prior map. |
Rossmann et al. [78] (2013) | Autonomous navigation in forests. | (2D) PF-based. Fusion of a laser sensor and a GPS. | Offline generation of a forest tree map. | Yes. However, no demonstration of results available. | The authors measure the localization error at sampled points in time, reporting a mean error of 0.55 m. | Not tested. | Not tested. |
Miettinen et al. [79,80] (2007) | Forest harvesting. | (2D) Feature-based SLAM. Computed using laser odometry. | Metric feature map, built by fusing laser data and GPS information. | Yes. Experiments in a real outdoor environment. | Not tested, due to the unavailability of ground truth. | Not tested. | Not tested. |
Hyyti et al. [81,82] (2013) | Stem diameter measurement. | (3D) Laser odometry approach, fused with an IMU. | Metric landmark map composed of stem detections and ground estimation. | Yes. Long-term real experiment performed (260 m). | 7.1 m error for a 260 m path. | No. High localization errors reported for a long-term path. | Yes. Does not need prior map information. |
Tang et al. [83] (2015) | Biomass estimation for forest inventory. | (2D) Scan-matching-based SLAM. Uses the Improved Maximum Likelihood Estimation algorithm. | Metric occupancy grid map built using laser data. | Yes. Long-term real experiment performed (300 m). | Obtained positioning error <32 cm in the real-world experiment. | Yes. Successful performance in the long-term experiment. | Yes. Does not need prior map information. |
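
Many of the forestry approaches above localize by matching the current laser scan (or the detected stem centres) against a reference map, using maximum-likelihood or graph-based registration. The snippet below is a generic, minimal 2D point-to-point ICP alignment between two sets of stem centres, given purely to illustrate the scan-matching idea; the surveyed works use more elaborate variants, and the synthetic stem coordinates, noise-free scan, and iteration count here are assumptions.

```python
# Minimal illustrative sketch of 2D point-to-point ICP between stem centres
# (not the algorithm of any surveyed work; data and parameters are assumed).
import numpy as np

def icp_2d(source, target, iterations=20):
    """Align `source` (N x 2) to `target` (M x 2); returns rotation R (2 x 2) and translation t (2,)."""
    R, t = np.eye(2), np.zeros(2)
    src = source.copy()
    for _ in range(iterations):
        # Nearest-neighbour data association between the current scan and the reference map.
        dists = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        matched = target[np.argmin(dists, axis=1)]
        # Closed-form rigid transform via SVD of the cross-covariance (Kabsch).
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R_step = Vt.T @ U.T
        if np.linalg.det(R_step) < 0:           # guard against a reflection solution
            Vt[-1, :] *= -1
            R_step = Vt.T @ U.T
        t_step = mu_t - R_step @ mu_s
        src = src @ R_step.T + t_step           # apply the incremental correction
        R, t = R_step @ R, R_step @ t + t_step  # accumulate the total transform
    return R, t

# Usage with synthetic stem centres (assumed data): recover a small rotation and offset.
rng = np.random.default_rng(0)
stems = rng.uniform(0.0, 20.0, size=(30, 2))     # reference stem map
theta = np.deg2rad(5.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
scan = stems @ R_true.T + np.array([0.4, -0.2])  # new scan of the same stems
R_est, t_est = icp_2d(scan, stems)               # estimated transform mapping the scan onto the map
```
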
Ref. | Description | Large-Scale | Long-Term |
---|---|---|---|
Chebrolu et al. [84] (2017) | Agricultural dataset for plant classification and robot navigation on sugar beet fields. Provides data from an RGB-D camera, 3D LiDAR sensors, GPS, and wheel odometry. All the sensors are calibrated extrinsically and intrinsically. | Yes. | Yes. Recorded over a period of three months, on average three times a week. |
Kragh et al. [85] (2017) | Raw sensor data from sensors mounted on a tractor in a grass mowing scenario. Includes stereo camera, thermal camera, web camera, 360° camera, LiDAR, and radar. Precise vehicle localization obtained from the fusion of IMU and GNSS. | Yes. Data recorded on a large 2 ha field. | No. The dataset spans approximately 2 h, all recorded on the same day. |
Ali et al. [86] (2020) | Dataset for visual SLAM in forests. The vehicle is equipped with four RGB cameras, an IMU, and a GNSS receiver. Sensor data are calibrated and synchronized. | Yes. The distance travelled ranges from 1.3 km to 6.48 km. | Yes. Data recorded in summer and winter conditions, and at different times of the day. |
Pire et al. [87] (2019) | Dataset with six sequences in soybean fields. Considers harsh conditions such as repetitive scenes, reflections, rough terrain, etc. Contains data from wheel odometry, IMU, stereo camera, and GPS-RTK. | Yes. Total trajectory length of around 2.3 km. | Partially. Data recorded on two separate days, but at the same time of year and of day. |
Reis et al. [88] (2019) | Dataset containing data from different sensors, such as a 3D laser, thermal camera, inertial units, GNSS, and an RGB camera, in forest environments. | Yes. Data recorded in three different large-scale forests. | No information. |