Assessing the Performance of RGB-D Sensors for 3D Fruit Crop Canopy Characterization under Different Operating and Lighting Conditions
Abstract
1. Introduction
2. Materials and Methods
2.1. Experimental Set-Up
2.2. Evaluation Parameters
2.2.1. Resolution
2.2.2. Accuracy
2.2.3. Repeatability
2.2.4. Penetrability
2.2.5. Colour and NIR
2.2.6. Statistical Analysis
3. Results
3.1. Experiment I: Effect of Daylight Illuminance
3.1.1. Point Cloud Resolution
3.1.2. Point Cloud Accuracy
3.1.3. Repeatability of the Measurements
3.1.4. Spectral Results: Colour and NIR
3.2. Experiment II: Effect of the Distance from the Target
4. Discussion
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Henry, P.; Krainin, M.; Herbst, E.; Ren, X.; Fox, D. RGB-D mapping: Using Kinect-style depth cameras for dense 3D modeling of indoor environments. Int. J. Rob. Res. 2012, 31, 647–663.
- Sanz, R.; Llorens, J.; Escolà, A.; Arnó, J.; Planas, S.; Román, C.; Rosell-Polo, J.R. LIDAR and non-LIDAR-based canopy parameters to estimate the leaf area in fruit trees and vineyard. Agric. For. Meteorol. 2018, 260, 229–239.
- Gené-Mola, J.; Sanz-Cortiella, R.; Rosell-Polo, J.R.; Morros, J.-R.; Ruiz-Hidalgo, J.; Vilaplana, V.; Gregorio, E. Fruit detection and 3D location using instance segmentation neural networks and structure-from-motion photogrammetry. Comput. Electron. Agric. 2020, 169.
- Sarbolandi, H.; Lefloch, D.; Kolb, A. Kinect range sensing: Structured-light versus Time-of-Flight Kinect. Comput. Vis. Image Underst. 2015.
- Dal Mutto, C.; Zanuttigh, P.; Cortelazzo, G. Time-of-Flight Cameras and Microsoft Kinect™; Springer Science & Business Media: New York, NY, USA, 2012.
- Giancola, S.; Valenti, M.; Sala, R. A survey on 3D cameras: Metrological comparison of time-of-flight, structured-light and active stereoscopy technologies. In Springer Briefs in Computer Science; Springer: Berlin/Heidelberg, Germany, 2018.
- Gené-Mola, J.; Vilaplana, V.; Rosell-Polo, J.R.; Morros, J.R.; Ruiz-Hidalgo, J.; Gregorio, E. Multi-modal deep learning for Fuji apple detection using RGB-D cameras and their radiometric capabilities. Comput. Electron. Agric. 2019, 162, 689–698.
- Nguyen, T.T.; Vandevoorde, K.; Wouters, N.; Kayacan, E.; De Baerdemaeker, J.G.; Saeys, W. Detection of red and bicoloured apples on tree with an RGB-D camera. Biosyst. Eng. 2016, 146, 33–44.
- Nissimov, S.; Goldberger, J.; Alchanatis, V. Obstacle detection in a greenhouse environment using the Kinect sensor. Comput. Electron. Agric. 2015, 113, 104–115.
- Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V. Development and field evaluation of a strawberry harvesting robot with a cable-driven gripper. Comput. Electron. Agric. 2019, 157, 392–402.
- Andújar, D.; Dorado, J.; Fernández-Quintanilla, C.; Ribeiro, A. An approach to the use of depth cameras for weed volume estimation. Sensors 2016, 16, 972.
- Gai, J.; Tang, L.; Steward, B.L. Automated crop plant detection based on the fusion of color and depth images for robotic weed control. J. Field Robot. 2020, 37, 35–52.
- Chéné, Y.; Rousseau, D.; Lucidarme, P.; Bertheloot, J.; Caffier, V.; Morel, P.; Belin, É.; Chapeau-Blondeau, F. On the use of depth camera for 3D phenotyping of entire plants. Comput. Electron. Agric. 2012, 82, 122–127.
- Xia, C.; Wang, L.; Chung, B.K.; Lee, J.M. In situ 3D segmentation of individual plant leaves using a RGB-D camera for agricultural automation. Sensors 2015, 15, 20463–20479.
- Li, D.; Xu, L.; Tan, C.; Goodman, E.D.; Fu, D.; Xin, L. Digitization and visualization of greenhouse tomato plants in indoor environments. Sensors 2015, 15, 4019–4051.
- Nock, C.; Taugourdeau, O.; Delagrange, S.; Messier, C. Assessing the potential of low-cost 3D cameras for the rapid measurement of plant woody structure. Sensors 2013, 13, 16216–16233.
- Paulus, S.; Behmann, J.; Mahlein, A.K.; Plümer, L.; Kuhlmann, H. Low-cost 3D systems: Suitable tools for plant phenotyping. Sensors 2014, 14, 3001–3018.
- Azzari, G.; Goulden, M.L.; Rusu, R.B. Rapid characterization of vegetation structure with a Microsoft Kinect sensor. Sensors 2013, 13, 2384–2398.
- Rosell-Polo, J.R.; Cheein, F.A.; Gregorio, E.; Andújar, D.; Puigdomènech, L.; Masip, J.; Escolà, A. Advances in Structured Light Sensors Applications in Precision Agriculture and Livestock Farming. Adv. Agron. 2015, 133, 71–112.
- Andújar, D.; Fernández-Quintanilla, C.; Dorado, J. Matching the best viewing angle in depth cameras for biomass estimation based on poplar seedling geometry. Sensors 2015, 15, 12999–13011.
- Andújar, D.; Ribeiro, A.; Fernández-Quintanilla, C.; Dorado, J. Using depth cameras to extract structural parameters to assess the growth state and yield of cauliflower crops. Comput. Electron. Agric. 2016, 122, 67–73.
- Vázquez-Arellano, M.; Griepentrog, H.W.; Reiser, D.; Paraforos, D.S. 3-D Imaging Systems for Agricultural Applications—A Review. Sensors 2016, 16, 618.
- Hämmerle, M.; Höfle, B. Direct derivation of maize plant and crop height from low-cost time-of-flight camera measurements. Plant Methods 2016, 12.
- Vázquez-Arellano, M.; Paraforos, D.S.; Reiser, D.; Garrido-Izard, M.; Griepentrog, H.W. Determination of stem position and height of reconstructed maize plants using a time-of-flight camera. Comput. Electron. Agric. 2018, 154, 276–288.
- Bao, Y.; Tang, L.; Srinivasan, S.; Schnable, P.S. Field-based architectural traits characterisation of maize plant using time-of-flight 3D imaging. Biosyst. Eng. 2019, 178, 86–101.
- Vázquez-Arellano, M.; Reiser, D.; Paraforos, D.S.; Garrido-Izard, M.; Edgar, M.; Burce, C.; Griepentrog, H.W. 3-D reconstruction of maize plants using a time-of-flight camera. Comput. Electron. Agric. 2018, 145, 235–247.
- Rosell-Polo, J.R.; Gregorio, E.; Gene, J.; Llorens, J.; Torrent, X.; Arnó, J.; Escolà, A. Kinect v2 Sensor-based Mobile Terrestrial Laser Scanner for Agricultural Outdoor Applications. IEEE/ASME Trans. Mechatron. 2017, 22, 2420–2427.
- Bengochea-Guevara, J.M.; Andújar, D.; Sanchez-Sardana, F.L.; Cantuña, K.; Ribeiro, A. A low-cost approach to automatically obtain accurate 3D models of woody crops. Sensors 2018, 18, 30.
- Andújar, D.; Dorado, J.; Bengochea-Guevara, J.M.; Conesa-Muñoz, J.; Fernández-Quintanilla, C.; Ribeiro, Á. Influence of Wind Speed on RGB-D Images in Tree Plantations. Sensors 2017, 17, 914.
- Zhang, J.; He, L.; Karkee, M.; Zhang, Q.; Zhang, X.; Gao, Z. Branch detection for apple trees trained in fruiting wall architecture using depth features and Regions-Convolutional Neural Network (R-CNN). Comput. Electron. Agric. 2018, 155, 386–393.
- Milella, A.; Marani, R.; Petitti, A.; Reina, G. In-field high throughput grapevine phenotyping with a consumer-grade depth camera. Comput. Electron. Agric. 2019, 156, 293–306.
- Dong, W.; Roy, P.; Isler, V. Semantic Mapping for Orchard Environments by Merging Two-Sides Reconstructions of Tree Rows. J. Field Robot. 2018, 37, 97–121.
- Vit, A.; Shani, G. Comparing RGB-D Sensors for Close Range Outdoor Agricultural Phenotyping. Sensors 2018, 18, 4413.
- Gené-Mola, J.; Llorens, J.; Rosell-Polo, J.R.; Gregorio, E.; Arnó, J.; Solanelles-Batlle, F.; Martinez-Casasnovas, J.A.; Escolà, A. KEvOr dataset. Zenodo 2020.
- Gené-Mola, J.; Llorens, J.; Rosell-Polo, J.R.; Gregorio, E.; Arnó, J.; Solanelles-Batlle, F.; Martinez-Casasnovas, J.A.; Escolà, A. Matlab implementation to evaluate RGB-D sensor performance in orchard environments. GitHub Repos. 2020, in press.
- Gené-Mola, J.; Gregorio, E.; Guevara, J.; Auat, F.; Sanz-Cortiella, R.; Escolà, A.; Llorens, J.; Morros, J.-R.; Ruiz-Hidalgo, J.; Vilaplana, V.; et al. Fruit detection in an apple orchard using a mobile terrestrial laser scanner. Biosyst. Eng. 2019, 187, 171–184.
- Rodríguez-Gonzálvez, P.; Gonzalez-Aguilera, D.; González-Jorge, H.; Hernández-López, D. Low-Cost Reflectance-Based Method for the Radiometric Calibration of Kinect 2. IEEE Sens. J. 2016, 16, 1975–1985.
- Rosell, J.R.; Sanz, R. A review of methods and applications of the geometric characterization of tree crops in agricultural activities. Comput. Electron. Agric. 2012, 81, 124–141.
- Pfeiffer, S.A.; Guevara, J.; Cheein, F.A.; Sanz, R. Mechatronic terrestrial LiDAR for canopy porosity and crown surface estimation. Comput. Electron. Agric. 2018, 146, 104–113.
- Méndez Perez, R.; Cheein, F.A.; Rosell-Polo, J.R. Flexible system of multiple RGB-D sensors for measuring and classifying fruits in agri-food industry. Comput. Electron. Agric. 2017, 139, 231–242.
- Nguyen, T.T.; Slaughter, D.C.; Max, N.; Maloof, J.N.; Sinha, N. Structured light-based 3D reconstruction system for plants. Sensors 2015, 15, 18587–18612.
- Payne, A.; Walsh, K.; Subedi, P.; Jarvis, D. Estimating mango crop yield using image analysis using fruit at “stone hardening” stage and night time imaging. Comput. Electron. Agric. 2014, 100, 160–167.
- Li, N.; Zhang, X.; Zhang, C.; Ge, L.; He, Y.; Wu, X. Review of machine-vision-based plant detection technologies for robotic weeding. In Proceedings of the 2019 IEEE International Conference on Robotics and Biomimetics (ROBIO), Dali, China, 6–8 December 2019; pp. 2370–2377.
- Arad, B.; Balendonck, J.; Barth, R.; Ben-Shahar, O.; Edan, Y.; Hellström, T.; Hemming, J.; Kurtser, P.; Ringdahl, O.; Tielen, T.; et al. Development of a sweet pepper harvesting robot. J. Field Robot. 2020, 37, 1027–1039.
- Gongal, A.; Silwal, A.; Amatya, S.; Karkee, M.; Zhang, Q.; Lewis, K. Apple crop-load estimation with over-the-row machine vision system. Comput. Electron. Agric. 2016, 120, 26–35.
- Halmetschlager-Funek, G.; Suchi, M.; Kampel, M.; Vincze, M. An empirical evaluation of ten depth cameras: Bias, precision, lateral noise, different lighting conditions and materials, and multiple sensor setups in indoor environments. IEEE Robot. Autom. Mag. 2019, 26, 67–77.
- Kuan, Y.W.; Ee, N.O.; Wei, L.S. Comparative study of Intel R200, Kinect v2, and PrimeSense RGB-D sensors performance outdoors. IEEE Sens. J. 2019, 19, 8741–8750.
| Specification | Value |
|---|---|
| RGB frame resolution | 1920 × 1080 pixels |
| RGB frame rate | 30 Hz |
| Infrared/depth frame resolution | 512 × 424 pixels |
| Infrared/depth frame rate | 30 Hz |
| Infrared/depth field of view | 70° horizontal × 60° vertical |
| Depth range | 0.5–4.5 m |
| Parameter | Metric | Units |
|---|---|---|
| Resolution | Number of points | points |
| | Point cloud density | points m⁻³ |
| Accuracy | | mm |
| | | mm |
| Repeatability | | mm |
| | | mm |
| Penetrability | Points distribution in depth | % |
| | Mean point cloud depth | m |
| | Standard deviation | m |
| Colour | Mean hue (H) | % |
| | Mean saturation (S) | % |
| | Mean brightness (V) | % |
| NIR | Mean NIR intensity | DN |
| | Mean NIRc intensity | DN m² |
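Several of the metrics listed above (point cloud density, mean depth, HSV colour means, and NIR intensities) follow directly from a registered RGB-D point cloud. The sketch below is a minimal illustration, not the authors' MATLAB implementation; the function name, the array layout, and the use of a distance-squared correction for the NIRc metric (suggested by its DN m² units) are assumptions.

```python
import numpy as np

def canopy_metrics(points, colors_hsv, nir, roi_volume_m3):
    """Illustrative computation of point-cloud canopy metrics.

    points         : (N, 3) XYZ coordinates in metres, sensor at the origin
    colors_hsv     : (N, 3) per-point HSV values in [0, 1]
    nir            : (N,) per-point NIR intensities in digital numbers (DN)
    roi_volume_m3  : volume of the region of interest in cubic metres
    """
    depth = points[:, 2]                    # along-axis depth of each point (m)
    sq_range = np.sum(points**2, axis=1)    # squared sensor-to-point range (m^2)
    return {
        "n_points": len(points),
        "density_points_m3": len(points) / roi_volume_m3,
        "mean_depth_m": depth.mean(),
        "std_depth_m": depth.std(),
        "mean_hue_pct": 100.0 * colors_hsv[:, 0].mean(),
        "mean_saturation_pct": 100.0 * colors_hsv[:, 1].mean(),
        "mean_brightness_pct": 100.0 * colors_hsv[:, 2].mean(),
        "mean_nir_dn": nir.mean(),
        # Range-corrected NIR (assumption): intensity times squared range,
        # compensating the inverse-square fall-off of the active IR signal.
        "mean_nirc_dn_m2": (nir * sq_range).mean(),
    }
```

The density metric depends on how the region of interest is bounded (here, a caller-supplied volume), so values are only comparable between scans that use the same bounding convention.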
| | K2S1 | K2S2 |
|---|---|---|
| Min. illuminance to measure [lx] | 2.3 | 1 |
| Max. illuminance to measure [lx] | 42,100 | 54,100 |
| [mm] | 3.7 a | 6.1 a |
| [mm] | 7.3 a | 4.4 a |
| [mm] | 6.6 | 7.5 |
| [mm] | 9.5 | 6.0 |
| Class | Lux Range [lx] | Precision Error K2S1 [mm] | Precision Error K2S2 [mm] | Precision Error K2S3 [mm] | Std. Dev. K2S1 [mm] | Std. Dev. K2S2 [mm] | Std. Dev. K2S3 [mm] |
|---|---|---|---|---|---|---|---|
| 1 | 0 to 250 | 8.54 a | 4.62 b | 7.86 a | 15.90 α | 11.82 β | 15.80 α |
| 2 | 250 to 1000 | 10.76 a | 7.64 b | 6.74 c | 13.53 β | 13.49 β | 19.35 α |
| 3 | 1000 to 4000 | 11.76 a | 7.89 b | 12.50 a | 20.84 β | 16.38 β | 24.78 α |
| 4 | 4000 to 16,000 | 13.43 a | 8.61 b | 14.14 a | 26.17 β | 23.28 γ | 30.18 α |
| 5 | 16,000 to 64,000 | 25.86 a | 9.88 b | -- | 54.72 α | 31.75 β | -- |
| Parameter | K2S4 (2.5 m) | K2S5 (1.5 m) | Difference |
|---|---|---|---|
| Point cloud density [points m⁻³] | 215 × 10³ | 646 × 10³ | 200.5% |
| Penetrability [m] | 0.922 | 0.772 | 16.3% |
| Hue (H) [%] | 46.5 | 45.9 | 1.3% |
| Saturation (S) [%] | 25.5 | 27.6 | 8.2% |
| Brightness (V) [%] | 41.1 | 53.6 | 30.4% |
| Mean NIR intensity [DN] | 1205 | 3996 | 231.6% |
| Mean NIRc intensity [DN m²] | 6526 | 6404 | 1.9% |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Gené-Mola, J.; Llorens, J.; Rosell-Polo, J.R.; Gregorio, E.; Arnó, J.; Solanelles, F.; Martínez-Casasnovas, J.A.; Escolà, A. Assessing the Performance of RGB-D Sensors for 3D Fruit Crop Canopy Characterization under Different Operating and Lighting Conditions. Sensors 2020, 20, 7072. https://doi.org/10.3390/s20247072