Multimodal Mobile Robotic Dataset for a Typical Mediterranean Greenhouse: The GREENBOT Dataset
Abstract
1. Introduction
2. Related Work
2.1. Synthetic Datasets
2.2. Real Datasets
2.2.1. Outdoors
Dataset | Real/Synthetic | Indoor/Outdoor | Environment | Main Task | RGB | Depth | GPS | IMU | LiDAR | Ground Truth | Publicly Available |
---|---|---|---|---|---|---|---|---|---|---|---|
[25] | Synthetic | Indoor | Room | SLAM | Yes | Yes | No | No | No | No | No |
[26] | Synthetic | Outdoor | Urban | Semantic segmentation for navigation | Yes | Yes | No | No | No | Pose | Yes |
[27] | Synthetic | Outdoor | Mountain | SLAM | No | No | Yes | Yes | Yes | Pose | Yes |
[28] | Real | Outdoor | Open field | SLAM | Yes | Yes | No | No | No | Semantic label | No |
[30] | Synthetic | Outdoor | Various open fields | Semantic segmentation for recognition | Yes | Yes | No | No | No | No | Yes |
[7] | Real | Outdoor | Urban | SLAM | Yes | No | Yes | No | Yes | Pose | Yes |
[6] | Real | Outdoor | Urban | SLAM | Yes | Yes | Yes | Yes | Yes | Pose | Yes |
[31] | Real | Outdoor | Urban | SLAM | Yes | No | Yes | No | Yes | Pose | Yes |
[32] | Real | Outdoor | Urban | SLAM | No | No | Yes | Yes | Yes | Pose | Yes |
[33] | Real | Outdoor | Lakes and rivers | Water-level measurement | No | No | Yes | Yes | Yes | Pose | No |
[34] | Real | Outdoor | Rivers | SLAM | Yes | No | Yes | Yes | No | No | Yes |
[35] | Real | Outdoor | Various open fields | Semantic segmentation for navigation | Yes | Yes | No | No | No | Pose | Yes |
[36] | Real | Outdoor | Open field | Pest detection | Yes | Yes | No | No | No | No | No |
[8] | Real | Outdoor | Soybean field | SLAM | Yes | Yes | Yes | Yes | No | Pose | Yes |
[9] | Real | Outdoor | Sugar beet field | Weeding | Yes | Yes | Yes | Yes | Yes | Pose | Yes |
[10] | Real | Outdoor | Apple field | SLAM | Yes | Yes | Yes | No | Yes | Pose | Yes |
[37] | Real | Outdoor | Building | SLAM | No | No | Yes | Yes | Yes | Pose | Yes |
[38] | Real | Indoor | House | SLAM | Yes | No | No | No | No | Pose | No |
[39] | Real | Indoor | Laboratory | SLAM | Yes | Yes | No | No | No | Pose | Yes |
[40] | Real | Outdoor | Laboratory-scale greenhouse | Navigation techniques | Yes | No | No | No | No | No | No |
[41] | Real | Indoor/Outdoor | Mountain and building | SLAM | Yes | No | Yes | Yes | No | Pose | No |
GREENBOT | Real | Indoor | Mediterranean greenhouse | SLAM | Yes | Yes | No | Yes | Yes | Pose | Yes |
- Urban environments
- Open field environments
2.2.2. Indoors
- Building environments
- Agriculture environments
3. Platform Description
- Bumblebee stereo camera BB2-08S2 (Santa Cruz, CA, USA) (https://www.flir.com/support/products/bumblebee2-firewire/ accessed on 10 March 2024): It has two lenses that capture synchronized stereo images. It was connected to the PC through a FireWire IEEE 1394 interface, providing a data transmission speed of 800 Mb/s. The stored images have a resolution of 1032 × 776 pixels. The horizontal field of view is 97° and the vertical field of view is 66°, within a range of 0.3 m to 20 m. Recording was performed at 10 Hz, out of a maximum frame rate of 20 fps. The camera was mounted on the front of the platform, so the captured images are well suited for obstacle identification by a robot. The collected images were stored in .bag format (the ROS storage format).
- Velodyne VLP16 LiDAR (San Jose, CA, USA) (https://velodynelidar.com/products/puck/ accessed on 10 March 2024): The Velodyne VLP16 has a maximum range of 100 m and a 360° horizontal and 30° vertical field of view. It was mounted at the top of the supporting structure so that the whole field of view was unobstructed, except by the operator. Data were recorded at a frequency of 10 Hz.
- Ouster OS0 LiDAR (San Francisco, CA, USA) (https://ouster.com/products/hardware/os0-lidar-sensor accessed on 10 March 2024): Data were also collected with the Ouster OS0, a 32-channel LiDAR with a 360° horizontal and 90° vertical field of view and a range of 0.3 m to 50 m. The data rate was left at the default value of 10 Hz, and its onboard 6-axis IMU was also recorded. This sensor was mounted on the structure 20 cm below the Velodyne, whose mount partially obstructs its view, leaving an effective horizontal field of view of 275°. The point cloud outputs of both LiDARs were stored in the same .bag files as the stereo camera data and can be accessed with the standard ROS tools (a minimal access sketch follows below).
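All sensor streams are distributed as ROS 1 .bag files. As a minimal sketch of how such files can be inspected, the snippet below uses the standard rosbag Python API; the file and topic names are assumptions for illustration and should be checked with `rosbag info` on the actual files.

```python
# Minimal sketch for inspecting a dataset .bag file with the ROS 1 Python API.
# The file name and topic names are assumptions for illustration; list the
# actual topics with `rosbag info <file>.bag` first.
import rosbag

with rosbag.Bag('2022_10_05.bag') as bag:
    # Print every topic with its message type and message count.
    for topic, info in bag.get_type_and_topic_info().topics.items():
        print(topic, info.msg_type, info.message_count)

    # Iterate over LiDAR point clouds (sensor_msgs/PointCloud2).
    for topic, msg, t in bag.read_messages(topics=['/velodyne_points']):
        print('cloud at', t.to_sec(), 'with', msg.width * msg.height, 'points')
        break  # just the first message, for this sketch
```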
3.1. Computing Platform
3.2. Calibration
3.2.1. Intrinsic Parameters
3.2.2. Extrinsic Parameters
- Stereo camera: x = right; y = down; z = forward.
- LiDARs: x = forward; y = left; z = up (the fixed rotation relating the two conventions is sketched below).
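Given these two axis conventions, the fixed rotation that re-expresses a camera-frame point in a LiDAR frame follows directly from matching axes: LiDAR forward is the camera's z, LiDAR left is the camera's −x, and LiDAR up is the camera's −y. The snippet below is a minimal numerical sketch of this axis permutation only; the translation between the sensors is omitted and would come from the calibrated extrinsic parameters.

```python
# Minimal sketch: fixed rotation mapping the stereo camera axis convention
# (x = right, y = down, z = forward) to the LiDAR convention
# (x = forward, y = left, z = up). The inter-sensor translation is omitted;
# it would come from the calibrated extrinsic parameters.
import numpy as np

# Rows express the LiDAR axes in camera coordinates:
#   LiDAR x (forward) =  camera z
#   LiDAR y (left)    = -camera x
#   LiDAR z (up)      = -camera y
R_lidar_cam = np.array([
    [0.0,  0.0, 1.0],
    [-1.0, 0.0, 0.0],
    [0.0, -1.0, 0.0],
])

p_cam = np.array([0.5, -0.2, 3.0])   # a point expressed in the camera frame
p_lidar = R_lidar_cam @ p_cam        # the same point in the LiDAR frame
print(p_lidar)                       # [ 3.  -0.5  0.2]
```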
4. GREENBOT Dataset
4.1. Greenhouse and Crop
4.2. Data Acquisition
4.3. Data Structure
5. SLAM Suitability Assessment
5.1. Mapping: Velodyne VLP16
5.2. Mapping: Ouster OS0
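The references cite the evo package (Grupp) for trajectory evaluation alongside the MOLA SLAM framework. As a hedged illustration of how an estimated trajectory could be compared against a reference with evo's Python API, the sketch below computes the absolute pose error; the file names and the TUM trajectory format are assumptions, not the paper's actual evaluation pipeline.

```python
# Minimal sketch of trajectory evaluation with the evo package
# (https://github.com/MichaelGrupp/evo). File names and the TUM format
# are assumptions for illustration.
from evo.core import metrics, sync
from evo.tools import file_interface

traj_ref = file_interface.read_tum_trajectory_file("reference.tum")
traj_est = file_interface.read_tum_trajectory_file("estimate.tum")

# Associate poses by timestamp, then align the estimate to the reference.
traj_ref, traj_est = sync.associate_trajectories(traj_ref, traj_est)
traj_est.align(traj_ref)

# Absolute pose error on the translation part.
ape = metrics.APE(metrics.PoseRelation.translation_part)
ape.process_data((traj_ref, traj_est))
print("APE RMSE [m]:", ape.get_statistic(metrics.StatisticsType.rmse))
```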
6. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
CUDA | Compute unified device architecture |
DDR4 | Double data rate type four synchronous dynamic random-access memory |
GPS | Global positioning system |
Hum | Relative humidity [%] |
IMU | Inertial measurement unit |
Ir | Irradiance [W/m2] |
LiDAR | Light detection and ranging |
MOLA | Modular Optimization framework for Localisation and mApping |
MRPT | Mobile robot programming toolkit |
RAM | Random access memory |
RGB | Red, green, and blue |
RGB-D | Red, green, and blue, and depth (color + depth channels) |
ROS | Robot operating system |
SLAM | Simultaneous localization and mapping |
Temp | Temperature [°C] |
UAL | University of Almería |
References
- Fan, M.; Shen, J.; Yuan, L.; Jiang, R.; Chen, X.; Davies, W.J.; Zhang, F. Improving crop productivity and resource use efficiency to ensure food security and environmental quality in China. J. Exp. Bot. 2012, 63, 13–24. [Google Scholar] [CrossRef]
- Evans, R.G.; Sadler, E.J. Methods and technologies to improve efficiency of water use. Water Resour. Res. 2008, 44, 1–15. [Google Scholar] [CrossRef]
- Bindraban, P.S.; Dimkpa, C.; Nagarajan, L.; Roy, A.; Rabbinge, R. Revisiting fertilisers and fertilisation strategies for improved nutrient uptake by plants. Biol. Fertil. Soils 2015, 51, 897–911. [Google Scholar] [CrossRef]
- Bechar, A.; Vigneault, C. Agricultural robots for field operations: Concepts and components. Biosyst. Eng. 2016, 149, 94–111. [Google Scholar] [CrossRef]
- Yan, Y.; Zhang, B.; Zhou, J.; Zhang, Y.; Liu, X. Real-Time Localization and Mapping Utilizing Multi-Sensor Fusion and Visual–IMU–Wheel Odometry for Agricultural Robots in Unstructured, Dynamic and GPS-Denied Greenhouse Environments. Agronomy 2022, 12, 1740. [Google Scholar] [CrossRef]
- Geiger, A.; Lenz, P.; Stiller, C.; Urtasun, R. Vision meets robotics: The kitti dataset. Int. J. Robot. Res. 2013, 32, 1231–1237. [Google Scholar] [CrossRef]
- Maddern, W.; Pascoe, G.; Linegar, C.; Newman, P. 1 year, 1000 km: The oxford robotcar dataset. Int. J. Robot. Res. 2017, 36, 3–15. [Google Scholar] [CrossRef]
- Pire, T.; Mujica, M.; Civera, J.; Kofman, E. The Rosario dataset: Multisensor data for localization and mapping in agricultural environments. Int. J. Robot. Res. 2019, 38, 633–641. [Google Scholar] [CrossRef]
- Chebrolu, N.; Lottes, P.; Schaefer, A.; Winterhalter, W.; Burgard, W.; Stachniss, C. Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields. Int. J. Robot. Res. 2017, 36, 1045–1052. [Google Scholar] [CrossRef]
- Marzoa Tanco, M.; Trinidad Barnech, G.; Andrade, F.; Baliosian, J.; Llofriu, M.; Di Martino, J.; Tejera, G. Magro dataset: A dataset for simultaneous localization and mapping in agricultural environments. Int. J. Robot. Res. 2023. [Google Scholar] [CrossRef]
- Rijswick, V. World Vegetable Map 2018. Global Trade Still Fruitful; Technical Report; RaboResearch Food & Agribusiness: Utrecht, The Netherlands, 2018. [Google Scholar]
- Hick, G.W. World Greenhouse Vegetable Statistics—2019 Updates; Technical Report; Cuesta Roble Greenhouse Vegetable Consulting/HortiDaily: Columbus, OH, USA, 2019. [Google Scholar]
- Sanchez-Hermosilla, J.; Rodriguez, F.; Gonzalez, R.; Luis, J.; Berenguel, M. A Mechatronic Description of an Autonomous Mobile Robot for Agricultural Tasks in Greenhouses. In Mobile Robots Navigation; InTech: London, UK, 2010. [Google Scholar] [CrossRef]
- Sánchez-Hermosilla, J.; González, R.; Rodríguez, F.; Donaire, J. Mechatronic Description of a Laser Autoguided Vehicle for Greenhouse Operations. Sensors 2013, 13, 769–784. [Google Scholar] [CrossRef]
- CAGPDR. Síntesis de la Campaña de Hortícolas protegidos de Almería-Campaña 2020/21. Observatorio de la Consejería de Agricultura, Ganadería, Pesca y Desarrollo Sostenible de la Junta de Andalucía de Precios y Mercados del Sector Hortícolas Protegidos; CAGPDR: Sevilla, Spain, 2021. [Google Scholar]
- Kondo, N.; Monta, M.; Noguchi, N. Agricultural Robots: Mechanisms and Practice; Kyoto University Press: Kyoto, Japan, 2011; p. 348. [Google Scholar]
- Rodríguez, F.; Moreno, J.C.; Sánchez, J.A.; Berenguel, M. Grasping in agriculture: State-of-the-art and main characteristics. In Grasping in Robotics. Mechanisms and Machine Science; Carbone, G., Ed.; Springer: London, UK, 2013; Volume 10, pp. 385–409. [Google Scholar] [CrossRef]
- Sánchez-Molina, J.; Rodríguez, F.; Moreno, J.; Sánchez-Hermosilla, J.; Giménez, A. Robotics in greenhouses. Scoping review. Comput. Electron. Agric. 2024, 219, 108750. [Google Scholar] [CrossRef]
- Mandow, A.; Gomez-de Gabriel, J.; Martinez, J.L.; Munoz, V.F.; Ollero, A.; Garcia-Cerezo, A. The autonomous mobile robot AURORA for greenhouse operation. IEEE Robot. Autom. Mag. 1996, 3, 18–28. [Google Scholar] [CrossRef]
- Feng, Q.; Zou, W.; Fan, P.; Zhang, C.; Wang, X. Design and test of robotic harvesting system for cherry tomato. Int. J. Agric. Biol. Eng. 2018, 11, 96–100. [Google Scholar] [CrossRef]
- Acaccia, G.; Michelini, R.; Molfino, R.; Razzoli, R. Mobile robots in greenhouse cultivation: Inspection and treatment of plants. In Proceedings of the Memories, 1st International Workshop on Advances in Services Robotics, Bardolino, Italy, 15 March 2003. [Google Scholar]
- Kurata, K. Cultivation of grafted vegetables II. Development of grafting robots in Japan. HortScience 1994, 29, 240–244. [Google Scholar] [CrossRef]
- González Sánchez, R.; Rodríguez Díaz, F.; Sánchez-Hermosilla López, J.; García Donaire, J. Navigation techniques for mobile robots in greenhouses. Appl. Eng. Agric. 2009, 25, 153–165. [Google Scholar] [CrossRef]
- López-Gázquez, A.; Mañas-Alvarez, F.J.; Moreno, J.C.; Cañadas-Aránega, F. Navigation of a Differential Robot for Transporting Tasks in Mediterranean Greenhouses. In Proceedings of the International Symposium on New Technologies for Sustainable Greenhouse Systems (GreenSys), Cancún, México, 22 October 2023. [Google Scholar]
- Li, W.; Saeedi, S.; McCormac, J.; Clark, R.; Tzoumanikas, D.; Ye, Q.; Huang, Y.; Tang, R.; Leutenegger, S. Interiornet: Mega-scale multi-sensor photo-realistic indoor scenes dataset. arXiv 2018, arXiv:1809.00716. [Google Scholar]
- Ros, G.; Sellart, L.; Materzynska, J.; Vazquez, D.; Lopez, A.M. The synthia dataset: A large collection of synthetic images for semantic segmentation of urban scenes. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 3234–3243. [Google Scholar]
- Giubilato, R.; Stürzl, W.; Wedler, A.; Triebel, R. Challenges of slam in extremely unstructured environments: The dlr planetary stereo, solid-state lidar, inertial dataset. IEEE Robot. Autom. Lett. 2022, 7, 8721–8728. [Google Scholar] [CrossRef]
- Yang, Y.; Tang, D.; Wang, D.; Song, W.; Wang, J.; Fu, M. Multi-camera visual SLAM for off-road navigation. Robot. Auton. Syst. 2020, 128, 103505. [Google Scholar] [CrossRef]
- Sinha, R.; Quirós, J.J.; Sankaran, S.; Khot, L.R. High resolution aerial photogrammetry based 3D mapping of fruit crop canopies for precision inputs management. Inf. Process. Agric. 2022, 9, 11–23. [Google Scholar] [CrossRef]
- Lu, Y.; Young, S. A survey of public datasets for computer vision tasks in precision agriculture. Comput. Electron. Agric. 2020, 178, 105760. [Google Scholar] [CrossRef]
- Majdik, A.L.; Till, C.; Scaramuzza, D. The Zurich urban micro aerial vehicle dataset. Int. J. Robot. Res. 2017, 36, 269–273. [Google Scholar] [CrossRef]
- Blanco-Claraco, J.L.; Moreno-Duenas, F.A.; González-Jiménez, J. The Málaga urban dataset: High-rate stereo and LiDAR in a realistic urban scenario. Int. J. Robot. Res. 2014, 33, 207–214. [Google Scholar] [CrossRef]
- Bandini, F.; Jakobsen, J.; Olesen, D.; Reyna-Gutierrez, J.A.; Bauer-Gottwein, P. Measuring water level in rivers and lakes from lightweight Unmanned Aerial Vehicles. J. Hydrol. 2017, 548, 237–250. [Google Scholar] [CrossRef]
- Miller, M.; Chung, S.J.; Hutchinson, S. The visual–inertial canoe dataset. Int. J. Robot. Res. 2018, 37, 13–20. [Google Scholar] [CrossRef]
- de Silva, R.; Cielniak, G.; Gao, J. Towards agricultural autonomy: Crop row detection under varying field conditions using deep learning. arXiv 2021, arXiv:2109.08247. [Google Scholar]
- Liu, L.; Wang, R.; Xie, C.; Yang, P.; Wang, F.; Sudirman, S.; Liu, W. PestNet: An end-to-end deep learning approach for large-scale multi-class pest detection and classification. IEEE Access 2019, 7, 45301–45312. [Google Scholar] [CrossRef]
- Karam, S.; Nex, F.; Chidura, B.T.; Kerle, N. Microdrone-based indoor mapping with graph slam. Drones 2022, 6, 352. [Google Scholar] [CrossRef]
- Kirsanov, P.; Gaskarov, A.; Konokhov, F.; Sofiiuk, K.; Vorontsova, A.; Slinko, I.; Zhukov, D.; Bykov, S.; Barinova, O.; Konushin, A. Discoman: Dataset of indoor scenes for odometry, mapping and navigation. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Venetian Macao, Macau, China, 3–8 November 2019; pp. 2470–2477. [Google Scholar]
- Lee, T.J.; Kim, C.H.; Cho, D.I.D. A monocular vision sensor-based efficient SLAM method for indoor service robots. IEEE Trans. Ind. Electron. 2018, 66, 318–328. [Google Scholar] [CrossRef]
- Martin, J.; Ansuategi, A.; Maurtua, I.; Gutierrez, A.; Obregón, D.; Casquero, O.; Marcos, M. A generic ROS-based control architecture for pest inspection and treatment in greenhouses using a mobile manipulator. IEEE Access 2021, 9, 94981–94995. [Google Scholar] [CrossRef]
- Brostow, G.J.; Fauqueur, J.; Cipolla, R. Semantic object classes in video: A high-definition ground truth database. Pattern Recognit. Lett. 2009, 30, 88–97. [Google Scholar] [CrossRef]
- Duggal, V.; Sukhwani, M.; Bipin, K.; Reddy, G.S.; Krishna, K.M. Plantation monitoring and yield estimation using autonomous quadcopter for precision agriculture. In Proceedings of the 2016 IEEE international conference on robotics and automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 5121–5127. [Google Scholar]
- Blanco, J.L.; Moreno, F.A.; Gonzalez, J. A collection of outdoor robotic datasets with centimeter-accuracy ground truth. Auton. Robot. 2009, 27, 327–351. [Google Scholar]
- Wang, Y.; Chen, X.; Liu, P. Statistical multipath model based on experimental GNSS data in static urban canyon environment. Sensors 2018, 18, 1149. [Google Scholar] [CrossRef]
- Ko, M.H.; Ryuh, B.S.; Kim, K.C.; Suprem, A.; Mahalik, N.P. Autonomous greenhouse mobile robot driving strategies from system integration perspective: Review and application. IEEE/ASME Trans. Mechatron. 2014, 20, 1705–1716. [Google Scholar] [CrossRef]
- Matsuzaki, S.; Masuzawa, H.; Miura, J.; Oishi, S. 3D semantic mapping in greenhouses for agricultural mobile robots with robust object recognition using robots’ trajectory. In Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Miyazaki, Japan, 7–10 October 2018; pp. 357–362. [Google Scholar]
- Marks, E.; Bömer, J.; Magistri, F.; Sah, A.; Behley, J.; Stachniss, C. BonnBeetClouds3D: A Dataset Towards Point Cloud-based Organ-level Phenotyping of Sugar Beet Plants under Field Conditions. arXiv 2023, arXiv:2312.14706. [Google Scholar]
- Mañas-Álvarez, F.J.; Guinaldo, M.; Dormido, R.; Dormido, S. Robotic Park. Multi-Agent Platform for Teaching Control and Robotics. IEEE Access 2023, 11, 34899–34911. [Google Scholar] [CrossRef]
- Hernández, J.; Bonachela, S.; Granados, M.R.; López, J.C.; Magán, J.J.; Montero, J.I. Microclimate and agronomical effects of internal impermeable screens in an unheated Mediterranean greenhouse. Biosyst. Eng. 2017, 163, 66–77. [Google Scholar] [CrossRef]
- Blanco-Claraco, J.L.; Koukis, N.; Laux, H.; Alice, N.; Briales, J.; Monroy, J.; jotaraul; Tarifa, M.J.; Sahdev, R.; Fernandez-Moral, E.; et al. MRPT/mrpt. Release of v2.11.7; CERN: Genève, Switzerland, 2024. [Google Scholar] [CrossRef]
- Grupp, M. evo: Python Package for the Evaluation of Odometry and SLAM. 2017. Available online: https://github.com/MichaelGrupp/evo (accessed on 10 March 2024).
- Blanco-Claraco, J.L. A Modular Optimization Framework for Localization and Mapping. In Proceedings of the Robotics: Science and Systems, Freiburg im Breisgau, Germany, 22–26 June 2019; pp. 1–15. [Google Scholar]
- Kittas, C.; Katsoulas, N.; Bartzanas, T.; Bakker, J.C. 4. Greenhouse climate control and energy use. In Good Agricultural Practices for Greenhouse Vegetable Crops; ISHS/FAO/NCARE: Rome, Italy, 2013; pp. 63–95. [Google Scholar]
Sequence | Length [m] | Duration [s] | Section | Greenhouse Conditions | Description |
---|---|---|---|---|---|
2022_10_05 | 459.25 | 696 | B | Temp: 27.01 °C; Hum: 62.72%; Ir: 106.7 W/m2 | Sunny, morning; plant height 0.96 m |
2022_10_14 | 457.36 | 701 | B | Temp: 23.78 °C; Hum: 63.12%; Ir: 79.0 W/m2 | Cloudy, morning; plant height 1.05 m |
2022_10_19 | 1321.21 | 1432 | A and B | Temp: 25.21 °C; Hum: 75.83%; Ir: 87.9 W/m2 | Sunny, morning; plant height 1.12 m |
2022_10_26 | 1432.08 | 1463 | A and B | Temp: 23.22 °C; Hum: 60.09%; Ir: 66.4 W/m2 | Cloudy, morning; plant height 1.35 m |
2022_11_02 | 1233.87 | 1486 | A and B | Temp: 15.84 °C; Hum: 72.35%; Ir: 70.7 W/m2 | Cloudy, morning; plant height 1.41 m |
2022_11_09 | 1293.29 | 1532 | A and B | Temp: 16.23 °C; Hum: 62.7%; Ir: 82.7 W/m2 | Cloudy, morning; plant height 1.53 m |
2022_11_19 | 1332.58 | 1752 | A and B | Temp: 17.45 °C; Hum: 62.7%; Ir: 64.36 W/m2 | Sunny, morning; plant height 1.60 m |
2022_11_23 | 1428.45 | 1692 | A and B | Temp: 15.28 °C; Hum: 62.7%; Ir: 69.4 W/m2 | Cloudy, morning; plant height 1.73 m |
2022_11_30 | 1440.32 | 1730 | A and B | Temp: 16.05 °C; Hum: 62.7%; Ir: 74.4 W/m2 | Cloudy, morning; plant height 1.85 m |
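A quick sanity check can be derived directly from the table above: the average traversal speed of each sequence is its path length divided by its duration. The sketch below reproduces that calculation from the tabulated values.

```python
# Average traversal speed per sequence, computed from the path length [m]
# and duration [s] listed in the sequence table above.
sequences = {
    "2022_10_05": (459.25, 696),
    "2022_10_14": (457.36, 701),
    "2022_10_19": (1321.21, 1432),
    "2022_10_26": (1432.08, 1463),
    "2022_11_02": (1233.87, 1486),
    "2022_11_09": (1293.29, 1532),
    "2022_11_19": (1332.58, 1752),
    "2022_11_23": (1428.45, 1692),
    "2022_11_30": (1440.32, 1730),
}

for name, (length_m, duration_s) in sequences.items():
    print(f"{name}: {length_m / duration_s:.2f} m/s")
# All sequences were recorded at roughly 0.65-0.98 m/s, i.e., slow walking pace.
```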
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).