3D Camera and Single-Point Laser Sensor Integration for Apple Localization in Spindle-Type Orchard Systems
Abstract
1. Introduction
2. Related Work
2.1. RGB-D Cameras for Apple Localization for Harvesting Systems
2.2. Integrated Sensor Systems for Apple Harvesting
- The RealSense D455f color frame was integrated with a single-point laser range sensor to form a new low-cost, high-accuracy depth-sensing system that provides accurate localization coordinates of apples in a spindle-type orchard for the development of a robotic harvesting arm.
- The 3D camera integrated with the single-point laser range finder was used under different light levels to increase the accuracy of apple localization and to analyze how well the system overcomes illumination effects during the day (a geometric sketch of the camera-laser aiming idea follows below).
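As a rough illustration of the geometry involved (not the authors' implementation), the sketch below back-projects a detected apple's pixel center into camera coordinates using assumed pinhole intrinsics and computes the pan/tilt angles at which two servo motors could aim the single-point laser at that apple. All numeric values (intrinsics, laser offset, pixel location, depth) are hypothetical placeholders.

```python
import math

# Illustrative pinhole intrinsics for a 1280x720 RealSense D455f color frame
# (hypothetical values, not calibration results from the paper).
FX, FY = 640.0, 640.0   # focal lengths in pixels
CX, CY = 640.0, 360.0   # principal point in pixels

def pixel_to_camera_xyz(u, v, depth_m):
    """Back-project a pixel (u, v) with metric depth into camera-frame (X, Y, Z) in meters."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return x, y, depth_m

def pan_tilt_to_target(x, y, z, laser_offset=(0.0, -0.05, 0.0)):
    """Pan/tilt angles (degrees) that aim a laser mounted at `laser_offset` (a hypothetical
    5 cm offset from the camera origin) toward a target at camera-frame (x, y, z)."""
    dx, dy, dz = x - laser_offset[0], y - laser_offset[1], z - laser_offset[2]
    pan = math.degrees(math.atan2(dx, dz))                    # left/right servo angle
    tilt = math.degrees(math.atan2(-dy, math.hypot(dx, dz)))  # up/down servo angle
    return pan, tilt

# Example: apple center detected at pixel (700, 300) with a camera depth of 1.2 m.
x, y, z = pixel_to_camera_xyz(700, 300, 1.2)
pan, tilt = pan_tilt_to_target(x, y, z)
print(f"target ({x:.3f}, {y:.3f}, {z:.3f}) m -> pan {pan:.2f} deg, tilt {tilt:.2f} deg")
# Once the servos point the laser at the apple, the single-point range reading can
# replace the camera depth to refine the Z coordinate.
```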
3. Materials and Methods
3.1. Data Preparation for Apple Detection
3.2. Development of an Integrated Sensor System
3.3. Calibration of the 3D Camera and Laser Range Finder
3.4. Setup of Indoor and Outdoor Experiments for Obtaining Static Depth Values
4. Results
4.1. Indoor Experimental Results
4.2. Results of the Outdoor Experiment
5. Discussion
5.1. Deep Learning-Based EfficientDet Detection Network
5.2. Integrated Sensing System
5.3. Application Environment
6. Conclusions
- The EfficientDet deep learning-based detection network, with a mAP@0.5 of 0.775, accurately detected apples under different light conditions in RealSense D455f images from spindle-type orchard datasets.
- The developed integrated sensing system, which combines a RealSense D455f 3D camera with a single-point laser range finder mounted on two servo motors, provided depth values accurate to within ±2 cm, compared with the positional information from the 3D camera alone.
- The integrated sensing system was evaluated under different light conditions. Under spindle-type orchard conditions, the RMSE of the depth values at different times of day and under different environmental conditions ranged from 3.91 to 8.36 cm for the RGB-D camera and from 1.26 to 2.13 cm for the integrated sensing system.
- The developed low-cost, high-accuracy integrated system can be incorporated into robotic systems to localize apples under outdoor static conditions for harvesting at tree locations in spindle-type orchards.
- The apple localization coordinates (X, Y, and Z) can be obtained from the proposed integrated system and transferred to the robotic arm through a calibration process between the robotic arm and the vision system, as sketched after this list.
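A minimal sketch of the coordinate hand-off described in the last point, assuming a hand-eye calibration has already produced a camera-to-robot-base homogeneous transform; the matrix entries below are placeholders, not calibration results from the paper.

```python
import numpy as np

# Hypothetical camera-to-robot-base homogeneous transform obtained from hand-eye
# calibration; the rotation and translation values are illustrative only.
T_BASE_FROM_CAM = np.array([
    [0.0,  0.0, 1.0, 0.40],
    [-1.0, 0.0, 0.0, 0.05],
    [0.0, -1.0, 0.0, 0.60],
    [0.0,  0.0, 0.0, 1.00],
])

def camera_to_base(xyz_cam):
    """Map an apple's (X, Y, Z) from the camera frame to the robot-base frame."""
    p = np.append(np.asarray(xyz_cam, dtype=float), 1.0)  # homogeneous point
    return (T_BASE_FROM_CAM @ p)[:3]

# Example: apple localized at (0.11, -0.06, 1.20) m in the camera frame.
print(camera_to_base([0.11, -0.06, 1.20]))
```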
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Kang, H.; Zhou, H.; Wang, X.; Chen, C. Real-time fruit recognition and grasping estimation for robotic apple harvesting. Sensors 2020, 20, 5670.
- Maheswari, P.; Raja, P.; Apolo-Apolo, O.E.; Pérez-Ruiz, M. Intelligent fruit yield estimation for orchards using deep learning based semantic segmentation techniques—A review. Front. Plant Sci. 2021, 12, 2603.
- Wang, W.; Hu, T.; Gu, J. Edge-cloud cooperation driven self-adaptive exception control method for the smart factory. Adv. Eng. Inform. 2022, 51, 101493.
- Gené-Mola, J.; Vilaplana, V.; Rosell-Polo, J.R.; Morros, J.R.; Ruiz-Hidalgo, J.; Gregorio, E. Multi-modal deep learning for fuji apple detection using RGB-d cameras and their radiometric capabilities. Comput. Electron. Agric. 2019, 162, 689–698.
- Gongal, A.; Silwal, A.; Amatya, S.; Karkee, M.; Zhang, Q.; Lewis, K. Apple crop-load estimation with over-the-row machine vision system. Comput. Electron. Agric. 2016, 120, 26–35.
- Gongal, A.; Karkee, M.; Amatya, S. Apple fruit size estimation using a 3D machine vision system. Inf. Process. Agric. 2018, 5, 498–503.
- Zhang, Z.; Igathinathane, C.; Li, J.; Cen, H.; Lu, Y.; Flores, P. Technology progress in mechanical harvest of fresh market apples. Comput. Electron. Agric. 2020, 175, 105606.
- Kang, H.; Chen, C. Fast implementation of real-time fruit detection in apple orchards using deep learning. Comput. Electron. Agric. 2020, 168, 105108.
- Chu, P.; Li, Z.; Lammers, K.; Lu, R.; Liu, X. Deep learning-based apple detection using a suppression mask R-CNN. Pattern Recognit. Lett. 2021, 147, 206–211.
- Koutsos, A.; Tuohy, K.M.; Lovegrove, J.A. Apples and cardiovascular health—Is the gut microbiota a core consideration? Nutrients 2015, 7, 3959–3998.
- Jia, W.; Zhang, Y.; Lian, J.; Zheng, Y.; Zhao, D.; Li, C. Apple harvesting robot under information technology: A review. Int. J. Adv. Robot. Syst. 2020, 17, 1729881420925310.
- Pourdarbani, R.; Sabzi, S.; Hernández-Hernández, M.; Hernández-Hernández, J.L.; García-Mateos, G.; Kalantari, D.; Molina-Martínez, J.M. Comparison of different classifiers and the majority voting rule for the detection of plum fruits in garden conditions. Remote Sens. 2019, 11, 2546.
- Silwal, A.; Davidson, J.R.; Karkee, M.; Mo, C.; Zhang, Q.; Lewis, K. Design, integration, and field evaluation of a robotic apple harvester. J. Field Robot. 2017, 34, 1140–1159.
- Zhang, K.; Lammers, K.; Chu, P.; Li, Z.; Lu, R. System design and control of an apple harvesting robot. Mechatronics 2021, 79, 102644.
- Gongal, A.; Amatya, S.; Karkee, M.; Zhang, Q.; Lewis, K. Sensors and systems for fruit detection and localization: A review. Comput. Electron. Agric. 2015, 116, 8–19.
- Kuang, H.; Liu, C.; Chan, L.L.H.; Yan, H. Multi-class fruit detection based on image region selection and improved object proposals. Neurocomputing 2018, 283, 241–255.
- Zhang, C.; Liu, T.; Xiao, J.; Lam, K.M.; Wang, Q. Boosting object detectors via strong-classification weak-localization pretraining in remote sensing imagery. IEEE Trans. Instrum. Meas. 2023, 72, 1–20, Art. no. 5026520.
- Osipov, A.; Pleshakova, E.; Bykov, A.; Kuzichkin, O.; Surzhik, D.; Suvorov, S.; Gataullin, S. Machine learning methods based on geophysical monitoring data in low time delay mode for drilling optimization. IEEE Access 2023, 11, 60349–60364.
- Zong, Z.; Song, G.; Liu, Y. DETRs with collaborative hybrid assignments training. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Paris, France, 2–6 October 2023; pp. 6748–6758.
- Liu, S.; Ren, T.; Chen, J.; Zeng, Z.; Zhang, H.; Li, F.; Li, H.; Huang, J.; Su, H.; Zhu, J.; et al. Detection transformer with stable matching. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Paris, France, 2–6 October 2023; pp. 6491–6500.
- Huang, Z.; Zhang, P.; Liu, R.; Li, D. Immature apple detection method based on improved Yolov3. IECE Trans. Internet Things 2021, 1, 9–13.
- Yan, B.; Fan, P.; Lei, X.; Liu, Z.; Yang, F. A real-time apple targets detection method for picking robot based on improved YOLOv5. Remote Sens. 2021, 13, 1619.
- Ma, L.; Zhao, L.; Wang, Z.; Zhang, J.; Chen, G. Detection and counting of small target apples under complicated environments by using improved YOLOv7-tiny. Agronomy 2023, 13, 1419.
- Fu, L.; Majeed, Y.; Zhang, X.; Karkee, M.; Zhang, Q. Faster R–CNN–based apple detection in dense-foliage fruiting-wall trees using RGB and depth features for robotic harvesting. Biosyst. Eng. 2020, 197, 245–256.
- Abeyrathna, R.M.; Nakaguchi, V.M.; Minn, A.; Ahamed, T. Recognition and counting of apples in a dynamic state using a 3D camera and deep learning algorithms for robotic harvesting systems. Sensors 2023, 23, 3810.
- Bargoti, S.; Underwood, J.P. Image segmentation for fruit detection and yield estimation in apple orchards. J. Field Robot. 2017, 34, 1039–1060.
- Linker, R. A procedure for estimating the number of green mature apples in nighttime orchard images using light distribution and its application to yield estimation. Precis. Agric. 2017, 18, 59–75.
- Gao, F.; Fu, L.; Zhang, X.; Majeed, Y.; Li, R.; Karkee, M.; Zhang, Q. Multi-class fruit-on-plant detection for apple in snap system using faster R-CNN. Comput. Electron. Agric. 2020, 176, 105634.
- Sa, I.; Ge, Z.; Dayoub, F.; Upcroft, B.; Perez, T.; McCool, C. DeepFruits: A fruit detection system using deep neural networks. Sensors 2016, 16, 1222.
- Bulanon, D.M.; Burks, T.F.; Alchanatis, V. Study on temporal variation in citrus canopy using thermal imaging for citrus fruit detection. Biosyst. Eng. 2008, 101, 161–171.
- Feng, J.; Zeng, L.; He, L. Apple fruit recognition algorithm based on multi-spectral dynamic image analysis. Sensors 2019, 19, 949.
- Sanz, R.; Llorens, J.; Escolà, A.; Arnó, J.; Planas, S.; Román, C.; Rosell-Polo, J.R. LIDAR and non-LIDAR-based canopy parameters to estimate the leaf area in fruit trees and vineyard. Agric. For. Meteorol. 2018, 260–261, 229–239.
- Robin, C.; Lacroix, S. Multi-robot target detection and tracking: Taxonomy and survey. Auton. Robots 2016, 40, 729–760.
- Ji, W.; Meng, X.; Qian, Z.; Xu, B.; Zhao, D. Branch localization method based on the skeleton feature extraction and stereo matching for apple harvesting robot. Int. J. Adv. Robot. Syst. 2017, 14, 1729881417705276.
- Nguyen, T.T.; Vandevoorde, K.; Wouters, N.; Kayacan, E.; De Baerdemaeker, J.G.; Saeys, W. Detection of red and bicolored apples on tree with an RGB-D camera. Biosyst. Eng. 2016, 146, 33–44.
- Liu, Y.; Jiang, J.; Sun, J.; Bai, L.; Wang, Q. A survey of depth estimation based on computer vision. In Proceedings of the 2020 IEEE Fifth International Conference on Data Science in Cyberspace (DSC), Hong Kong, China, 27–29 July 2020; pp. 135–141.
- Luhmann, T.; Fraser, C.; Maas, H. Sensor modelling and camera calibration for close-range photogrammetry. J. Photogramm. Remote Sens. 2016, 115, 37–46.
- Zhong, H.; Wang, H.; Wu, Z.; Zhang, C.; Zheng, Y.; Tang, T. A survey of LiDAR and camera fusion enhancement. Procedia Comput. Sci. 2020, 183, 579–588.
- Tadic, V. Intel RealSense D400 Series Product Family Datasheet; Document Number: 337029-005; New Technologies Group, Intel Corporation: Santa Clara, CA, USA, 2019.
- Tadic, V.; Toth, A.; Vizvari, Z.; Klincsik, M.; Sari, Z.; Sarcevic, P.; Sarosi, J.; Biro, I. Perspectives of RealSense and ZED depth sensors for robotic vision applications. Machines 2022, 10, 183.
- Grunnet-Jepsen, A.; Sweetser, J.N.; Woodfill, J. Best-Known-Methods for Tuning Intel® RealSense™ D400 Depth Cameras for Best Performance; Revision 1.9; New Technologies Group, Intel Corporation: Santa Clara, CA, USA, 2018.
- Wang, X.; Kang, H.; Zhou, H.; Au, W.; Chen, C. Geometry-aware fruit grasping estimation for robotic harvesting in apple orchards. Comput. Electron. Agric. 2022, 193, 106716.
- Kang, H.; Wang, X.; Chen, C. Accurate fruit localisation using high resolution LiDAR-camera fusion and instance segmentation. Comput. Electron. Agric. 2022, 203, 107450.
- Zhang, K.; Lammers, K.; Chu, P.; Li, Z.; Lu, R. An automated apple harvesting robot—From system design to field evaluation. J. Field Robot. 2023.
- Zhang, K.; Chu, P.; Lammers, K.; Li, Z.; Lu, R. Active laser-camera scanning for high-precision fruit localization in robotic harvesting: System design and calibration. Horticulturae 2024, 10, 40.
- Abeyrathna, R.M.; Nakaguchi, V.M.; Ahamed, T. Localization of apples at the dynamic stage for robotic arm operations based on EfficientDet and CenterNet detection neural networks. In Proceedings of the Joint Conference of Agricultural and Environmental Engineering Related Societies, Tsukuba, Japan, 8 September 2023.
- Krakhmalev, O.; Gataullin, S.; Boltachev, E.; Korchagin, S.; Blagoveshchensky, I.; Liang, K. Robotic complex for harvesting apple crops. Robotics 2022, 11, 77.
- Xiong, Z.; Feng, Q.; Li, T.; Xie, F.; Liu, C.; Liu, L.; Guo, X.; Zhao, C. Dual-manipulator optimal design for apple robotic harvesting. Agronomy 2022, 12, 3128.
| Detection network | mAP@0.5 | Precision | Recall |
|---|---|---|---|
| EfficientDet [25] | 0.775 | 0.950 | 0.950 |
| YOLOv4 [25] | 0.840 | 0.840 | 0.790 |
| YOLOv5 [25] | 0.861 | 0.874 | 0.783 |
| YOLOv7 [25] | 0.905 | 0.892 | 0.828 |
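For context, the precision, recall, and mAP@0.5 values in the table above follow the standard object-detection definitions; the formulas below are a reminder of those conventions, not an excerpt from the paper.

$$
\text{Precision} = \frac{TP}{TP + FP}, \qquad
\text{Recall} = \frac{TP}{TP + FN}, \qquad
\text{mAP@0.5} = \frac{1}{C}\sum_{c=1}^{C} AP_c \Big|_{IoU \ge 0.5}
$$

Here TP, FP, and FN are counted at an intersection-over-union (IoU) threshold of 0.5, and $AP_c$ is the area under the precision-recall curve for class $c$; with a single apple class ($C = 1$), mAP@0.5 equals the apple AP.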
| Condition (time, JST) | Light intensity (lx) | 3D camera RMSE (cm) | Integrated sensing system RMSE (cm) |
|---|---|---|---|
| Indoor | 476~600 | 3.91 | 1.62 |
| Outdoor, cloudy day, 10 a.m. | 1963~2000 ×10 | 6.52 | 1.82 |
| Outdoor, cloudy day, 11 a.m. | 3470~3600 ×10 | 6.24 | 1.26 |
| Outdoor, cloudy day, 3 p.m. | 4519~4700 ×10 | 6.67 | 2.13 |
| Outdoor, sunny day, 10 a.m. | 7710~7900 ×10 | 6.66 | 1.56 |
| Outdoor, sunny day, 11:30 a.m. | 8800~8900 ×10 | 8.05 | 1.46 |
| Outdoor, sunny day, 1:30 p.m. | 1023~1100 ×100 | 8.36 | 1.39 |
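The RMSE values in the table are presumably computed with the standard definition, shown here only for reference (the paper's ground-truth measurement protocol is described in Section 3.4):

$$
\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(d_i - \hat{d}_i\right)^2}
$$

where $d_i$ is the depth reported by the sensor for the $i$-th apple, $\hat{d}_i$ is the corresponding reference distance, and $n$ is the number of measurements.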
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Abeyrathna, R.M.R.D.; Nakaguchi, V.M.; Liu, Z.; Sampurno, R.M.; Ahamed, T. 3D Camera and Single-Point Laser Sensor Integration for Apple Localization in Spindle-Type Orchard Systems. Sensors 2024, 24, 3753. https://doi.org/10.3390/s24123753