A Comprehensive Review of the Research of the “Eye–Brain–Hand” Harvesting System in Smart Agriculture
Abstract
1. Introduction
2. Intelligent Harvesting “Eye” System
2.1. Perception Hardware System
2.1.1. Object Perception Based on Binocular Vision
2.1.2. Target Perception Based on Multi-Sensor Combination
2.2. Target Perception Methods
2.2.1. Image Preprocessing Methods
2.2.2. Perception Methods Based on Target Features
2.2.3. Feature Fusion-Based Perception Methods
2.2.4. Perception Methods Based on Deep Learning
Stage | Model | Applied Crops | Total | Training Set | Testing Set | Detection Speed | Other Indicators | Feature | Ref.
---|---|---|---|---|---|---|---|---|---
One-stage | YOLOv3 | Litchi | 545 | - | - | 26 ms | mAP: 96.43% | Faster detection than Faster R-CNN and SSD, enabling real-time detection | [77]
| Light-YOLOv3 | Green mango | 500 | - | - | 192 fps (5.21 ms) | FLOPs: 10.12 BN; volume: 44 MB; F1-score: 97.7% | Solves the lack of location and semantic information in YOLOv3 prediction feature maps and runs five times faster | [80]
| YOLOv3-tiny | Tomato | - | 5500 | - | 25 fps (40 ms) | F1-score: 91.92% | Adapts to detection in complex environments and to embedded devices | [81]
| YOLOv3-DPN | Cherry tomato | 1825 | 1460 | 365 | 58 ms | Precision (light changes): 93.54%; precision (fruit shading): 94.59%; F1-score: 94.18% | Extracts richer semantic features of small targets and reduces information loss during propagation | [82]
| R-YOLO | Strawberry | 2000 | 1900 | 100 | 56 ms | Precision: 94.43%; recall: 93.46% | Detection is 3.6 times faster than YOLOv3, with good real-time performance | [79]
| DSE-YOLO | Strawberry | 21,921 | 14,614 | 7307 | 18.2 fps (55 ms) | mAP: 86.58%; F1-score: 81.59% | Better detection of small fruits and more accurate differentiation of fruit growth stages | [83]
| YOLOv4 | Kiwifruit | 1160 | 928 | 232 | 25.5 ms | mAP: 91.9% | A more detailed classification of the dataset improves YOLOv4 detection | [78]
| YOLOv4-CBAM | Sweet pepper | - | - | 100 | - | Positioning accuracy: 89.55%; F1-score: 91.84% | Higher F1-score than YOLOv4 | [47]
| DeepSORT-YOLOv4 | Strawberry | - | - | - | - | Cluster picking success rate: 62.4% | Increased the cluster picking success rate by 36.8%, to 62.4% | [56]
| YOLOv4 | Banana | 1164 | 835 | 120 (Vs: 209) | 171 ms | Detection rate: 99.29%; AP: 0.9995 | - | [76]
| Improved-YOLOv4 | Plum | 1890 | 1512 | 378 | 42.55 fps (23.5 ms) | mAP: 88.56% | 77.85% smaller model and 112% faster detection than YOLOv4 | [85]
| Improved-YOLOv4-tiny | Green pepper | 1500 | 1355 | 145 | 89 fps (11.24 ms) | AP: 95.11%; precision: 96.91%; recall: 93.85% | Ensures real-time operation and effectively improves detection of difficult green pepper samples | [84]
| Improved-YOLOv5s | Apple | 1214 | 1014 | 200 | 66.7 fps (15 ms) | Recall: 91.48%; precision: 83.83%; mAP: 86.75%; F1-score: 87.49% | Effectively identifies apples obscured by leaves and branches | [86]
| SSD | Mushroom | 4300 | 4000 | 300 | - | F1-score: 0.951 | - | [87]
Two-stage | Faster R-CNN | Apple | 800 | 560 | 120 (Vs: 120) | 181 ms | AP: 0.893 | The VGG16 foreground-RGB model reaches an AP of 0.893, allowing near-real-time monitoring | [55]
| Mask R-CNN | Tomato | - | - | 500 | 456 ms | IoU: 0.916 | Segmentation accuracy is effectively improved by training on RGB-D-I fused images | [54]
| Mask R-CNN | Strawberry | 2000 | 1900 | 100 | 8 fps (125 ms) | MIoU: 89.85%; AP: 95.78%; recall: 95.41% | - | [74]
| Mask R-CNN | Citrus | - | 1000 | - | - | mAP (fruits): 88.15%; mAP (branches): 96.27% | Detects citrus fruits and branches simultaneously, enabling picking-path planning and reasonable obstacle avoidance | [75]
| FCN | Guava | 437 | 350 | 87 | 565 ms | Mean accuracy: 0.893; IoU: 0.806 | - | [12]
| R-FCN | Apple, orange, banana | 160,000 | 80,000 | 40,000 (Vs: 40,000) | - | Accuracy (apple): 97.66%; (orange): 96.50%; (banana): 82.30% | Better robustness in real-world engineering | [88]
| DeepLabv3 | Litchi | - | - | 90 | 464 ms | Precision: 83.33%; IoU: 79.46% | - | [89]
| DeepLabV3+ | Litchi | 65,625 | 50,000 | 15,625 | - | MIoU: 0.765 | MIoU improves by 0.144 over the original DeepLabV3+, with stronger robustness and higher detection accuracy | [90]
| DASNet | - | 1277 | 567 | 560 (Vs: 150) | 477 ms | Precision: 0.88; F1-score: 0.871; recall: 0.868; IoU: 0.862 | - | [52]
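The table above reports the standard detection metrics — precision, recall, F1-score, and IoU — used to compare these models. As an illustrative sketch (not code from any cited work), the metrics can be computed from true/false positive counts and box coordinates as follows; the box format `(x1, y1, x2, y2)` is an assumption for the example:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)  # overlap area, 0 if disjoint
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def precision_recall_f1(tp, fp, fn):
    """Detection metrics from true-positive, false-positive, false-negative counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Example: two 10x10 boxes overlapping in a 5x5 region -> IoU = 25/175
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))
```

A detection is usually counted as a true positive when its IoU with a ground-truth box exceeds a threshold (commonly 0.5); mAP then averages precision over recall levels and over classes.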
3. Intelligent Harvesting “Brain” System
3.1. Spatial Partitioning and Task Allocation
3.1.1. Single Mechanical Arm Harvesting
3.1.2. Multi-Mechanical Arm Harvesting
3.2. Obstacle Avoidance Strategies
3.2.1. Passive Obstacle Avoidance Strategies
3.2.2. Active Obstacle Avoidance Strategies
3.3. Path-Planning Techniques
3.3.1. Classic Path-Planning Algorithms
3.3.2. Machine Learning-Based Path-Planning Algorithms
3.3.3. Deep Learning-Based Path-Planning Algorithms
3.3.4. Optimization Algorithm-Based Path-Planning Strategies
3.4. Control Methods
3.4.1. Classical and Modern Control Methods
3.4.2. Intelligent Control Methods
4. Intelligent Picking “Hand” System
4.1. End-Effector Modes of Operation
4.1.1. Negative-Pressure Adsorption End-Effectors
4.1.2. Shearing-Style End-Effectors
4.1.3. Cavity Retrieval End-Effectors
4.1.4. Flexible Gripping End-Effectors
4.2. Overview of Harvesting Effect Evaluation Indicators
5. Challenges and Prospects
5.1. Challenges
5.1.1. Picking Multi-Species, Multi-Form Fruits and Vegetables Is More Difficult
5.1.2. Difficulty of Picking in Complex Environments
5.1.3. High Real-Time Requirements
5.1.5. Little Research on Walking Platforms and Navigation
5.1.6. The Working Height of Picking Robots Is Generally Limited
5.1.6. High Costs
5.2. Prospects
5.2.1. Modular Harvesting Robot
5.2.2. Sensor Fusion and Algorithm Optimization
5.2.3. Strengthening Research on Walking Platforms and Navigation Algorithms
5.2.4. The Development of Picking Drones
5.2.5. Multi-Robot Collaborative Operation
5.2.6. Reducing Costs
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Gil, G.; Casagrande, D.E.; Cortés, L.P.; Verschae, R. Why the Low Adoption of Robotics in the Farms? Challenges for the Establishment of Commercial Agricultural Robots. Smart Agric. Technol. 2023, 3, 100069. [Google Scholar] [CrossRef]
- Suresh Kumar, M.; Mohan, S. Selective Fruit Harvesting: Research, Trends and Developments towards Fruit Detection and Localization—A Review. Proc. Inst. Mech. Eng. C J. Mech. Eng. Sci. 2023, 237, 1405–1444. [Google Scholar] [CrossRef]
- Rose, D.C.; Bhattacharya, M. Adoption of Autonomous Robots in the Soft Fruit Sector: Grower Perspectives in the UK. Smart Agric. Technol. 2023, 3, 100118. [Google Scholar] [CrossRef]
- Tang, Q.; Luo, Y.W.; Wu, X.D. Research on the Evaluation Method of Agricultural Intelligent Robot Design Solutions. PLoS ONE 2023, 18, e0281554. [Google Scholar] [CrossRef] [PubMed]
- Kuta, Ł.; Li, Z.; Stopa, R.; Komarnicki, P.; Słupska, M. The Influence of Manual Harvesting on the Quality of Picked Apples and the Picker’s Muscle Load. Comput. Electron. Agric. 2020, 175, 105511. [Google Scholar] [CrossRef]
- Liu, J.; Peng, Y.; Faheem, M. Experimental and Theoretical Analysis of Fruit Plucking Patterns for Robotic Tomato Harvesting. Comput. Electron. Agric. 2020, 173, 105330. [Google Scholar] [CrossRef]
- Xiong, Y.; Ge, Y.; From, P.J. Push and Drag: An Active Obstacle Separation Method for Fruit Harvesting Robots. In Proceedings of the IEEE International Conference on Robotics and Automation, Paris, France, 31 May–31 August 2020. [Google Scholar]
- Li, K.; Qi, Y. Motion Planning of Robot Manipulator for Cucumber Picking. In Proceedings of the 2018 3rd International Conference on Robotics and Automation Engineering, ICRAE 2018, Guangzhou, China, 17–19 November 2018. [Google Scholar]
- Wu, Z.; Du, H. Artificial Intelligence in Agricultural Picking Robot Displacement Trajectory Tracking Control Algorithm. Wirel. Commun. Mob. Comput. 2022, 2022, 3105909. [Google Scholar] [CrossRef]
- Wang, Z.; Xun, Y.; Wang, Y.; Yang, Q. Review of Smart Robots for Fruit and Vegetable Picking in Agriculture. Int. J. Agric. Biol. Eng. 2022, 15, 33–54. [Google Scholar] [CrossRef]
- Li, Y.; Feng, Q.; Li, T.; Xie, F.; Liu, C.; Xiong, Z. Advance of Target Visual Information Acquisition Technology for Fresh Fruit Robotic Harvesting: A Review. Agronomy 2022, 12, 1336. [Google Scholar] [CrossRef]
- Lin, G.; Tang, Y.; Zou, X.; Xiong, J.; Li, J. Guava Detection and Pose Estimation Using a Low-Cost RGB-D Sensor in the Field. Sensors 2019, 19, 428. [Google Scholar] [CrossRef]
- Zheng, C.; Chen, P.; Pang, J.; Yang, X.; Chen, C.; Tu, S.; Xue, Y. A Mango Picking Vision Algorithm on Instance Segmentation and Key Point Detection from RGB Images in an Open Orchard. Biosyst. Eng. 2021, 206, 32–54. [Google Scholar] [CrossRef]
- Garillos-Manliguez, C.A.; Chiang, J.Y. Multimodal Deep Learning and Visible-Light and Hyperspectral Imaging for Fruit Maturity Estimation. Sensors 2021, 21, 1288. [Google Scholar] [CrossRef]
- Xu, N.; Song, Y.; Meng, Q. Application RFID and Wi-FI Technology in Design of IOT Sensor Terminal. In Proceedings of the Journal of Physics: Conference Series, Chongqing, China, 28–30 May 2021; Volume 1982. [Google Scholar]
- Chen, M.; Tang, Y.; Zou, X.; Huang, Z.; Zhou, H.; Chen, S. 3D Global Mapping of Large-Scale Unstructured Orchard Integrating Eye-in-Hand Stereo Vision and SLAM. Comput. Electron. Agric. 2021, 187, 106237. [Google Scholar] [CrossRef]
- Chen, M.; Tang, Y.; Zou, X.; Huang, K.; Huang, Z.; Zhou, H.; Wang, C.; Lian, G. Three-Dimensional Perception of Orchard Banana Central Stock Enhanced by Adaptive Multi-Vision Technology. Comput. Electron. Agric. 2020, 174, 105508. [Google Scholar] [CrossRef]
- Mahanti, N.K.; Pandiselvam, R.; Kothakota, A.; Ishwarya, S.P.; Chakraborty, S.K.; Kumar, M.; Cozzolino, D. Emerging Non-Destructive Imaging Techniques for Fruit Damage Detection: Image Processing and Analysis. Trends Food Sci. Technol. 2022, 120, 418–438. [Google Scholar] [CrossRef]
- Koirala, A.; Walsh, K.B.; Wang, Z.; McCarthy, C. Deep Learning—Method Overview and Review of Use for Fruit Detection and Yield Estimation. Comput. Electron. Agric. 2019, 162, 219–234. [Google Scholar] [CrossRef]
- Tang, Y.; Chen, M.; Wang, C.; Luo, L.; Li, J.; Lian, G.; Zou, X. Recognition and Localization Methods for Vision-Based Fruit Picking Robots: A Review. Front. Plant Sci. 2020, 11, 510. [Google Scholar] [CrossRef]
- Mohd Ali, M.; Hashim, N.; Abd Aziz, S.; Lasekan, O. Utilisation of Deep Learning with Multimodal Data Fusion for Determination of Pineapple Quality Using Thermal Imaging. Agronomy 2023, 13, 401. [Google Scholar] [CrossRef]
- Yang, F.; Ma, Z.; Xie, M. Image Classification with Superpixels and Feature Fusion Method. J. Electron. Sci. Technol. 2021, 19, 100096. [Google Scholar] [CrossRef]
- Shivendra; Chiranjeevi, K.; Tripathi, M.K. Detection of Fruits Image Applying Decision Tree Classifier Techniques. In Lecture Notes on Data Engineering and Communications Technologies; Springer Nature: Singapore, 2023; Volume 142. [Google Scholar]
- Zhang, C.; Wang, H.; Fu, L.H.; Pei, Y.H.; Lan, C.Y.; Hou, H.Y.; Song, H. Three-Dimensional Continuous Picking Path Planning Based on Ant Colony Optimization Algorithm. PLoS ONE 2023, 18, e0282334. [Google Scholar] [CrossRef]
- He, Z.; Ma, L.; Wang, Y.; Wei, Y.; Ding, X.; Li, K.; Cui, Y. Double-Arm Cooperation and Implementing for Harvesting Kiwifruit. Agriculture 2022, 12, 1763. [Google Scholar] [CrossRef]
- Yang, C.; Liu, Y.; Wang, Y.; Xiong, L.; Xu, H.; Zhao, W. Research and Experiment on Recognition and Location System for Citrus Picking Robot in Natural Environment. Nongye Jixie Xuebao/Trans. Chin. Soc. Agric. Mach. 2019, 50, 14–22. [Google Scholar] [CrossRef]
- Peng, H.; Shao, Y.; Chen, K.; Deng, Y.; Xue, C. Research on Multi-Class Fruits Recognition Based on Machine Vision and SVM. IFAC-PapersOnLine 2018, 51, 817–821. [Google Scholar] [CrossRef]
- Udhaya, K.; Miruthula, R.; Pavithra, G.; Revathi, R.; Suganya, M. FPGA-Based Hardware Acceleration for Fruit Recognition Using SVM. Ir. Interdiscip. J. Sci. Res. 2022, 06, 22–29. [Google Scholar] [CrossRef]
- Xu, L.; Cao, M.; Song, B. A New Approach to Smooth Path Planning of Mobile Robot Based on Quartic Bezier Transition Curve and Improved PSO Algorithm. Neurocomputing 2022, 473, 98–106. [Google Scholar] [CrossRef]
- Guo, Y.; Wang, W.; Wu, S. Modeling Method of Mobile Robot Workspace. IEEE 2017, 2146–2150. [Google Scholar]
- Chen, W.; Xu, T.; Liu, J.; Wang, M.; Zhao, D. Picking Robot Visual Servo Control Based on Modified Fuzzy Neural Network Sliding Mode Algorithms. Electronics 2019, 8, 605. [Google Scholar] [CrossRef]
- Dai, Y.; Zhang, R.; Ma, L. Path Planning and Tracking Control of Picking Robot Based on Improved A* Algorithm. J. Chin. Agric. Mech. 2022, 43, 138. [Google Scholar] [CrossRef]
- Ma, Y.; Zhang, W.; Qureshi, W.S.; Gao, C.; Zhang, C.; Li, W. Autonomous Navigation for a Wolfberry Picking Robot Using Visual Cues and Fuzzy Control. Inf. Process. Agric. 2021, 8, 15–26. [Google Scholar] [CrossRef]
- Zhang, F.; Chen, Z.; Wang, Y.; Bao, R.; Chen, X.; Fu, S.; Tian, M.; Zhang, Y. Research on Flexible End-Effectors with Humanoid Grasp Function for Small Spherical Fruit Picking. Agriculture 2023, 13, 123. [Google Scholar] [CrossRef]
- Xu, L.; Liu, X.; Zhang, K.; Xing, J.; Yuan, Q.; Chen, J.; Duan, Z.; Ma, S.; Yu, C. Design and Test of End-Effector for Navel Orange Picking Robot. Nongye Gongcheng Xuebao/Trans. Chin. Soc. Agric. Eng. 2018, 34, 53–61. [Google Scholar] [CrossRef]
- Guo, T.; Zheng, Y.; Bo, W.; Liu, J.; Pi, J.; Chen, W.; Deng, J. Research on the Bionic Flexible End-Effector Based on Tomato Harvesting. J. Sens. 2022, 2022, 1–14. [Google Scholar] [CrossRef]
- Gharakhani, H.; Thomasson, J.A.; Lu, Y. An End-Effector for Robotic Cotton Harvesting. Smart Agric. Technol. 2022, 2, 100043. [Google Scholar] [CrossRef]
- Xiao, X.; Wang, Y.; Jiang, Y. End-Effectors Developed for Citrus and Other Spherical Crops. Appl. Sci. 2022, 12, 7945. [Google Scholar] [CrossRef]
- Hu, G.; Chen, C.; Chen, J.; Sun, L.; Sugirbay, A.; Chen, Y.; Jin, H.; Zhang, S.; Bu, L. Simplified 4-DOF Manipulator for Rapid Robotic Apple Harvesting. Comput. Electron. Agric. 2022, 199, 107177. [Google Scholar] [CrossRef]
- Chen, M.; Chen, F.; Zhou, W.; Zuo, R. Design of Flexible Spherical Fruit and Vegetable Picking End-Effector Based on Vision Recognition. In Proceedings of the Journal of Physics: Conference Series; IOP Publishing: Bristol, UK, 2022; Volume 2246. [Google Scholar]
- Gao, J.; Zhang, F.; Zhang, J.; Yuan, T.; Yin, J.; Guo, H.; Yang, C. Development and Evaluation of a Pneumatic Finger-like End-Effector for Cherry Tomato Harvesting Robot in Greenhouse. Comput. Electron. Agric. 2022, 197, 106879. [Google Scholar] [CrossRef]
- Lu, W.; Wang, P.; Du, X.; Ma, Z. Design and Experiment of a Multi-Knuckle End-Effector for Tomato Picking Robot; American Society of Agricultural and Biological Engineers: St. Joseph Charter Township, MI, USA, 2022. [Google Scholar]
- Oliveira, F.; Tinoco, V.; Magalhaes, S.; Santos, F.N.; Silva, M.F. End-Effectors for Harvesting Manipulators: State of the Art Review. In Proceedings of the 2022 IEEE International Conference on Autonomous Robot Systems and Competitions, ICARSC 2022, Santa Maria da Feira, Portugal, 29–30 April 2022. [Google Scholar]
- Yang, Q.; Luo, S.; Chang, C.; Xun, Y.; Bao, G. Segmentation Algorithm for Hangzhou White Chrysanthemums Based on Least Squares Support Vector Machine. Int. J. Agric. Biol. Eng. 2019, 12, 127–134. [Google Scholar] [CrossRef]
- Du, X.; Yang, X.; Ji, J.; Jin, X.; Chen, L. Design and Test of a Pineapple Picking End-Effector. Appl. Eng. Agric. 2019, 35, 1045–1055. [Google Scholar] [CrossRef]
- Lin, G.; Tang, Y.; Zou, X.; Xiong, J.; Fang, Y. Color-, Depth-, and Shape-Based 3D Fruit Detection. Precis. Agric. 2020, 21, 1–17. [Google Scholar] [CrossRef]
- Ning, Z.; Luo, L.; Ding, X.M.; Dong, Z.; Yang, B.; Cai, J.; Chen, W.; Lu, Q. Recognition of Sweet Peppers and Planning the Robotic Picking Sequence in High-Density Orchards. Comput. Electron. Agric. 2022, 196, 106878. [Google Scholar] [CrossRef]
- Mu, L.; Cui, G.; Liu, Y.; Cui, Y.; Fu, L.; Gejima, Y. Design and Simulation of an Integrated End-Effector for Picking Kiwifruit by Robot. Inf. Process. Agric. 2020, 7, 58–71. [Google Scholar] [CrossRef]
- Zhang, Q.; Liu, F.; Jiang, X.; Xiong, Z.; Xu, C. Motion Planning Method and Experiments of Tomato Bunch Harvesting Manipulator. Nongye Gongcheng Xuebao/Trans. Chin. Soc. Agric. Eng. 2021, 37, 149–156. [Google Scholar]
- Yu, F.; Zhou, C.; Yang, X.; Guo, Z.; Chen, C. Design and Experiment of Tomato Picking Robot in Solar Greenhouse. Nongye Jixie Xuebao/Trans. Chin. Soc. Agric. Mach. 2022, 53, 41–49. [Google Scholar] [CrossRef]
- Li, T.; Qiu, Q.; Zhao, C.; Xie, F. Task Planning of Multi-Arm Harvesting Robots for High-Density Dwarf Orchards. Nongye Gongcheng Xuebao/Trans. Chin. Soc. Agric. Eng. 2021, 37, 1–10. [Google Scholar] [CrossRef]
- Kang, H.; Zhou, H.; Chen, C. Visual Perception and Modeling for Autonomous Apple Harvesting. IEEE Access 2020, 8, 62151–62163. [Google Scholar] [CrossRef]
- Sarabu, H.; Ahlin, K.; Hu, A.P. Graph-Based Cooperative Robot Path Planning in Agricultural Environments. In Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM, Hong Kong, China, 8–12 July 2019; Volume 2019. [Google Scholar]
- Gong, L.; Wang, W.; Wang, T.; Liu, C. Robotic Harvesting of the Occluded Fruits with a Precise Shape and Position Reconstruction Approach. J. Field Robot. 2022, 39, 69–84. [Google Scholar] [CrossRef]
- Fu, L.; Majeed, Y.; Zhang, X.; Karkee, M.; Zhang, Q. Faster R–CNN–Based Apple Detection in Dense-Foliage Fruiting-Wall Trees Using RGB and Depth Features for Robotic Harvesting. Biosyst. Eng. 2020, 197, 245–256. [Google Scholar] [CrossRef]
- Xiong, Y.; Ge, Y.; From, P.J. An Improved Obstacle Separation Method Using Deep Learning for Object Detection and Tracking in a Hybrid Visual Control Loop for Fruit Picking in Clusters. Comput. Electron. Agric. 2021, 191, 106508. [Google Scholar] [CrossRef]
- Lv, J.; Wang, Y.; Xu, L.; Gu, Y.; Zou, L.; Yang, B.; Ma, Z. A Method to Obtain the Near-Large Fruit from Apple Image in Orchard for Single-Arm Apple Harvesting Robot. Sci. Hortic. 2019, 257, 108758. [Google Scholar] [CrossRef]
- Wang, L.; Li, H.R.; Zhou, K.; Mu, B. Design of Binocular Vision System for Fruit and Vegetable Picking Based on Embedded Arm. Guangdianzi Jiguang/J. Optoelectron. Laser 2020, 31, 71–80. [Google Scholar] [CrossRef]
- Yu, X.; Fan, Z.; Wang, X.; Wan, H.; Wang, P.; Zeng, X.; Jia, F. A Lab-Customized Autonomous Humanoid Apple Harvesting Robot. Comput. Electr. Eng. 2021, 96, 107459. [Google Scholar] [CrossRef]
- Zhou, T.; Zhang, D.; Zhou, M.; Xi, H.; Chen, X. System Design of Tomatoes Harvesting Robot Based on Binocular Vision. In Proceedings of the 2018 Chinese Automation Congress, CAC 2018, Xi’an, China, 30 November 2019. [Google Scholar]
- Jin, Z.; Sun, W.; Zhang, J.; Shen, C.; Zhang, H.; Han, S. Intelligent Tomato Picking Robot System Based on Multimodal Depth Feature Analysis Method. In Proceedings of the IOP Conference Series: Earth and Environmental Science; IOP Publishing: Bristol, UK, 2020; Volume 440. [Google Scholar]
- Ye, L.; Duan, J.; Yang, Z.; Zou, X.; Chen, M.; Zhang, S. Collision-Free Motion Planning for the Litchi-Picking Robot. Comput. Electron. Agric. 2021, 185, 106151. [Google Scholar] [CrossRef]
- Oktarina, Y.; Dewi, T.; Risma, P.; Nawawi, M. Tomato Harvesting Arm Robot Manipulator; A Pilot Project. In Proceedings of the Journal of Physics: Conference Series, South Sumatera, Indonesia, 9–10 October 2020; Volume 1500. [Google Scholar]
- Feng, Q.; Zou, W.; Fan, P.; Zhang, C.; Wang, X. Design and Test of Robotic Harvesting System for Cherry Tomato. Int. J. Agric. Biol. Eng. 2018, 11, 96–100. [Google Scholar] [CrossRef]
- Sepulveda, D.; Fernandez, R.; Navas, E.; Armada, M.; Gonzalez-De-Santos, P. Robotic Aubergine Harvesting Using Dual-Arm Manipulation. IEEE Access 2020, 8, 121889–121904. [Google Scholar] [CrossRef]
- Feng, Q.; Chen, J.; Zhang, M.; Wang, X. Design and Test of Harvesting Robot for Table-Top Cultivated Strawberry. In Proceedings of the WRC SARA 2019—World Robot Conference Symposium on Advanced Robotics and Automation 2019, Beijing, China, 21–22 August 2019. [Google Scholar]
- Xiong, Y.; Ge, Y.; Grimstad, L.; From, P.J. An Autonomous Strawberry-harvesting Robot: Design, Development, Integration, and Field Evaluation. J. Field Robot. 2020, 37, 202–224. [Google Scholar] [CrossRef]
- Zhuang, J.; Hou, C.; Tang, Y.; He, Y.; Guo, Q.; Zhong, Z.; Luo, S. Computer Vision-Based Localisation of Picking Points for Automatic Litchi Harvesting Applications towards Natural Scenarios. Biosyst. Eng. 2019, 187, 1–20. [Google Scholar] [CrossRef]
- Septiarini, A.; Hamdani, H.; Hatta, H.R.; Anwar, K. Automatic Image Segmentation of Oil Palm Fruits by Applying the Contour-Based Approach. Sci. Hortic. 2020, 261, 108939. [Google Scholar] [CrossRef]
- Mao, S.; Li, Y.; Ma, Y.; Zhang, B.; Zhou, J.; Kai, W. Automatic Cucumber Recognition Algorithm for Harvesting Robots in the Natural Environment Using Deep Learning and Multi-Feature Fusion. Comput. Electron. Agric. 2020, 170, 105254. [Google Scholar] [CrossRef]
- Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V. Development and Field Evaluation of a Strawberry Harvesting Robot with a Cable-Driven Gripper. Comput. Electron. Agric. 2019, 157, 392–402. [Google Scholar] [CrossRef]
- Liu, D.; Shen, J.; Yang, H.; Niu, Q.; Guo, Q. Recognition and Localization of Actinidia Arguta Based on Image Recognition. EURASIP J. Image Video Process. 2019, 2019, 21. [Google Scholar] [CrossRef]
- Kurpaska, S.; Bielecki, A.; Sobol, Z.; Bielecka, M.; Habrat, M.; Śmigielski, P. The Concept of the Constructional Solution of the Working Section of a Robot for Harvesting Strawberries. Sensors 2021, 21, 3933. [Google Scholar] [CrossRef]
- Yu, Y.; Zhang, K.; Yang, L.; Zhang, D. Fruit Detection for Strawberry Harvesting Robot in Non-Structural Environment Based on Mask-RCNN. Comput. Electron. Agric. 2019, 163, 104846. [Google Scholar] [CrossRef]
- Yang, C.H.; Xiong, L.Y.; Wang, Z.; Wang, Y.; Shi, G.; Kuremot, T.; Zhao, W.H.; Yang, Y. Integrated Detection of Citrus Fruits and Branches Using a Convolutional Neural Network. Comput. Electron. Agric. 2020, 174, 105469. [Google Scholar] [CrossRef]
- Fu, L.; Duan, J.; Zou, X.; Lin, J.; Zhao, L.; Li, J.; Yang, Z. Fast and Accurate Detection of Banana Fruits in Complex Background Orchards. IEEE Access 2020, 8, 196835–196846. [Google Scholar] [CrossRef]
- Liang, C.; Xiong, J.; Zheng, Z.; Zhong, Z.; Li, Z.; Chen, S.; Yang, Z. A Visual Detection Method for Nighttime Litchi Fruits and Fruiting Stems. Comput. Electron. Agric. 2020, 169, 105192. [Google Scholar] [CrossRef]
- Suo, R.; Gao, F.; Zhou, Z.; Fu, L.; Song, Z.; Dhupia, J.; Li, R.; Cui, Y. Improved Multi-Classes Kiwifruit Detection in Orchard to Avoid Collisions during Robotic Picking. Comput. Electron. Agric. 2021, 182, 106052. [Google Scholar] [CrossRef]
- Yu, Y.; Zhang, K.; Liu, H.; Yang, L.; Zhang, D. Real-Time Visual Localization of the Picking Points for a Ridge-Planting Strawberry Harvesting Robot. IEEE Access 2020, 8, 116556–116568. [Google Scholar] [CrossRef]
- Xu, Z.F.; Jia, R.S.; Sun, H.M.; Liu, Q.M.; Cui, Z. Light-YOLOv3: Fast Method for Detecting Green Mangoes in Complex Scenes Using Picking Robots. Appl. Intell. 2020, 50, 4670–4687. [Google Scholar] [CrossRef]
- Xu, Z.F.; Jia, R.S.; Liu, Y.B.; Zhao, C.Y.; Sun, H.M. Fast Method of Detecting Tomatoes in a Complex Scene for Picking Robots. IEEE Access 2020, 8, 55289–55299. [Google Scholar] [CrossRef]
- Chen, J.; Wang, Z.; Wu, J.; Hu, Q.; Zhao, C.; Tan, C.; Teng, L.; Luo, T. An Improved Yolov3 Based on Dual Path Network for Cherry Tomatoes Detection. J. Food Process. Eng. 2021, 44, e13803. [Google Scholar] [CrossRef]
- Wang, Y.; Yan, G.; Meng, Q.; Yao, T.; Han, J.; Zhang, B. DSE-YOLO: Detail Semantics Enhancement YOLO for Multi-Stage Strawberry Detection. Comput. Electron. Agric. 2022, 198, 107057. [Google Scholar] [CrossRef]
- Li, X.; Pan, J.; Xie, F.; Zeng, J.; Li, Q.; Huang, X.; Liu, D.; Wang, X. Fast and Accurate Green Pepper Detection in Complex Backgrounds via an Improved Yolov4-Tiny Model. Comput. Electron. Agric. 2021, 191, 106503. [Google Scholar] [CrossRef]
- Wang, L.; Zhao, Y.; Liu, S.; Li, Y.; Chen, S.; Lan, Y. Precision Detection of Dense Plums in Orchards Using the Improved YOLOv4 Model. Front. Plant Sci. 2022, 13, 839269. [Google Scholar] [CrossRef] [PubMed]
- Yan, B.; Fan, P.; Lei, X.; Liu, Z.; Yang, F. A Real-Time Apple Targets Detection Method for Picking Robot Based on Improved YOLOv5. Remote Sens. 2021, 13, 1619. [Google Scholar] [CrossRef]
- Qian, Y.; Jiacheng, R.; Pengbo, W.; Zhan, Y.; Changxing, G. Real-Time Detection and Localization Using SSD Method for Oyster Mushroom Picking Robot. In Proceedings of the 2020 IEEE International Conference on Real-Time Computing and Robotics, RCAR 2020, Asahikawa, Japan, 28–29 September 2020. [Google Scholar]
- Liu, J.; Zhao, M.; Guo, X. A Fruit Detection Algorithm Based on R-FCN in Natural Scene. In Proceedings of the 32nd Chinese Control and Decision Conference, CCDC 2020, Hefei, China, 22–24 August 2020. [Google Scholar]
- Li, J.; Tang, Y.; Zou, X.; Lin, G.; Wang, H. Detection of Fruit-Bearing Branches and Localization of Litchi Clusters for Vision-Based Harvesting Robots. IEEE Access 2020, 8, 117746–117758. [Google Scholar] [CrossRef]
- Peng, H.; Xue, C.; Shao, Y.; Chen, K.; Xiong, J.; Xie, Z.; Zhang, L. Semantic Segmentation of Litchi Branches Using Deeplabv3+ Model. IEEE Access 2020, 8, 164546–164555. [Google Scholar] [CrossRef]
- Xiong, Y.; Ge, Y.; From, P.J. An Obstacle Separation Method for Robotic Picking of Fruits in Clusters. Comput. Electron. Agric. 2020, 175, 105397. [Google Scholar] [CrossRef]
- Mghames, S.; Hanheide, M.; Ghalamzan, E.A. Interactive Movement Primitives: Planning to Push Occluding Pieces for Fruit Picking. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Las Vegas, NV, USA, 24 October 2020–24 January 2021. [Google Scholar]
- Cao, X.; Yan, H.; Huang, Z.; Ai, S.; Xu, Y.; Fu, R.; Zou, X. A Multi-Objective Particle Swarm Optimization for Trajectory Planning of Fruit Picking Manipulator. Agronomy 2021, 11, 2286. [Google Scholar] [CrossRef]
- Chen, J.; Qiang, H.; Wu, J.; Xu, G.; Wang, Z. Navigation Path Extraction for Greenhouse Cucumber-Picking Robots Using the Prediction-Point Hough Transform. Comput. Electron. Agric. 2021, 180, 105911. [Google Scholar] [CrossRef]
- Colucci, G.; Botta, A.; Tagliavini, L.; Cavallone, P.; Baglieri, L.; Quaglia, G. Kinematic Modeling and Motion Planning of the Mobile Manipulator Agri.Q for Precision Agriculture. Machines 2022, 10, 321. [Google Scholar] [CrossRef]
- Le Flécher, E.; Durand-Petiteville, A.; Cadenat, V.; Sentenac, T. Visual Predictive Control of Robotic Arms with Overlapping Workspace. In Proceedings of the ICINCO 2019—Proceedings of the 16th International Conference on Informatics in Control, Automation and Robotics, Prague, Czech Republic, 29–31 July 2019; 2019; Volume 1, pp. 130–137. [Google Scholar]
- Ji, W.; Zhang, J.; Xu, B.; Tang, C.; Zhao, D. Grasping Mode Analysis and Adaptive Impedance Control for Apple Harvesting Robotic Grippers. Comput. Electron. Agric. 2021, 186, 106210. [Google Scholar] [CrossRef]
- Wang, Y.; Zhang, H.; Wang, L.; Li, G.; Zhang, Y.; Liu, X. Development of Control System for Cotton Picking Test Bench Based on Fuzzy PID Control. Nongye Gongcheng Xuebao/Trans. Chin. Soc. Agric. Eng. 2018, 34, 23–32. [Google Scholar] [CrossRef]
- Ramin Shamshiri, R.; Weltzien, C.; Hameed, I.A.; Yule, I.J.; Grift, T.; Balasundram, S.; Pitonakova, L.; Ahmad, D.; Chowdhary, G. Research and Development in Agricultural Robotics: A Perspective of Digital Farming. Int. J. Agric. Biol. Eng. 2018, 11, 1–11. [Google Scholar] [CrossRef]
- Navas, E.; Fernández, R.; Sepúlveda, D.; Armada, M.; Gonzalez-de-Santos, P. Soft Grippers for Automatic Crop Harvesting: A Review. Sensors 2021, 21, 2689. [Google Scholar] [CrossRef]
- Liu, J.; Yuan, Y.; Gao, Y.; Tang, S.; Li, Z. Virtual Model of Grip-and-Cut Picking for Simulation of Vibration and Falling of Grape Clusters. Trans. ASABE 2019, 62, 603–614. [Google Scholar] [CrossRef]
- Wei, J.; Yi, D.; Bo, X.; Guangyu, C.; Dean, Z. Adaptive Variable Parameter Impedance Control for Apple Harvesting Robot Compliant Picking. Complexity 2020, 2020, 1–15. [Google Scholar] [CrossRef]
- Miao, Y.; Zheng, J. Optimization Design of Compliant Constant-Force Mechanism for Apple Picking Actuator. Comput. Electron. Agric. 2020, 170, 105232. [Google Scholar] [CrossRef]
- Liu, C.H.; Chiu, C.H.; Chen, T.L.; Pai, T.Y.; Chen, Y.; Hsu, M.C. A Soft Robotic Gripper Module with 3d Printed Compliant Fingers for Grasping Fruits. In Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM, Auckland, New Zealand, 9–12 July 2018; Volume 2018. [Google Scholar]
- Pi, J.; Liu, J.; Zhou, K.; Qian, M. An Octopus-Inspired Bionic Flexible Gripper for Apple Grasping. Agriculture 2021, 11, 1014. [Google Scholar] [CrossRef]
- Hohimer, C.J.; Wang, H.; Bhusal, S.; Miller, J.; Mo, C.; Karkee, M. Design and Field Evaluation of a Robotic Apple Harvesting System with a 3D-Printed Soft-Robotic End-Effector. Trans. ASABE 2019, 62, 405–414. [Google Scholar] [CrossRef]
- Vu, Q.; Ronzhin, A. Models and algorithms for design robotic gripper for agricultural products. Comptes Rendus De L’Academie Bulg. Des Sci. 2020, 73, 103–110. [Google Scholar]
- Chen, Z.; Yang, M.; Li, Y.; Yang, L. Design and Experiment of Tomato Picking End-Effector Based on Non-Destructive Pneumatic Clamping Control. Nongye Gongcheng Xuebao/Trans. Chin. Soc. Agric. Eng. 2021, 37, 27–35. [Google Scholar] [CrossRef]
- Yung, I.; Maccarana, Y.; Maroni, G.; Previdi, F. Partially Structured Robotic Picking for Automation of Tomato Transplantation. In Proceedings of the 2019 IEEE International Conference on Mechatronics, ICM 2019, Ilmenau, Germany, 18–20 March 2019. [Google Scholar]
- Zhang, J.; Lai, S.; Yu, H.; Wang, E.; Wang, X.; Zhu, Z. Fruit Classification Utilizing a Robotic Gripper with Integrated Sensors and Adaptive Grasping. Math. Probl. Eng. 2021, 2021, 1–15. [Google Scholar] [CrossRef]
- Habegger, R.; Bergamo, E.; Schwab, W.; Berninger, T.; Rixen, D. Impact of Intensive Modification of Sweet Pepper Plants on Performance of End Effectors for Autonomous Harvesting. Eur. J. Hortic. Sci. 2021, 86, 354–359. [Google Scholar] [CrossRef]
- De Preter, A.; Anthonis, J.; De Baerdemaeker, J. Development of a Robot for Harvesting Strawberries. IFAC-PapersOnLine 2018, 51, 14–19. [Google Scholar] [CrossRef]
- Li, Z.; Miao, F.; Yang, Z.; Wang, H. An Anthropometric Study for the Anthropomorphic Design of Tomato-Harvesting Robots. Comput. Electron. Agric. 2019, 163, 104881. [Google Scholar] [CrossRef]
- Li, Z.; Miao, F.; Yang, Z.; Chai, P.; Yang, S. Factors Affecting Human Hand Grasp Type in Tomato Fruit-Picking: A Statistical Investigation for Ergonomic Development of Harvesting Robot. Comput. Electron. Agric. 2019, 157, 90–97. [Google Scholar] [CrossRef]
- Hou, Z.; Li, Z.; Fadiji, T.; Fu, J. Soft Grasping Mechanism of Human Fingers for Tomato-Picking Bionic Robots. Comput. Electron. Agric. 2021, 182, 106010. [Google Scholar] [CrossRef]
- Öz, E.; Jakob, M. Ergonomic Evaluation of Simulated Apple Hand Harvesting by Using 3D Motion Analysis. Ege Üniversitesi Ziraat Fakültesi Derg. 2020, 57, 249–256. [Google Scholar] [CrossRef]
- Liu, X.; Xu, H.; Chen, F. Research on Vision and Trajectory Planning System for Tomato Picking Robots. In Proceedings of the 2020 5th International Conference on Mechanical, Control and Computer Engineering, ICMCCE 2020, Harbin, China, 25–27 December 2020. [Google Scholar]
- Zhang, L.; Jia, J.; Gui, G.; Hao, X.; Gao, W.; Wang, M. Deep Learning Based Improved Classification System for Designing Tomato Harvesting Robot. IEEE Access 2018, 6, 67940–67950. [Google Scholar] [CrossRef]
- Horng, G.J.; Liu, M.X.; Chen, C.C. The Smart Image Recognition Mechanism for Crop Harvesting System in Intelligent Agriculture. IEEE Sens. J. 2020, 20, 2766–2781. [Google Scholar] [CrossRef]
- Xiong, Y.; From, P.J.; Isler, V. Design and Evaluation of a Novel Cable-Driven Gripper with Perception Capabilities for Strawberry Picking Robots. In Proceedings of the IEEE International Conference on Robotics and Automation, Brisbane, QLD, Australia, 21–25 May 2018. [Google Scholar]
- Zhong, Z.; Xiong, J.; Zheng, Z.; Liu, B.; Liao, S.; Huo, Z.; Yang, Z. A Method for Litchi Picking Points Calculation in Natural Environment Based on Main Fruit Bearing Branch Detection. Comput. Electron. Agric. 2021, 189, 106398. [Google Scholar] [CrossRef]
Active/Passive Vision | Type | Advantages | Disadvantages | Representative Cameras
---|---|---|---|---
Active Vision | Structured light | More mature and easier to miniaturize; low power consumption; can be used at night; high accuracy and resolution within a certain range | Easily disturbed by ambient light; accuracy deteriorates as the detection distance increases | RealSense D435i, Kinect v1, OAK-D-Pro
Active Vision | TOF | Long detection distance; less interference from ambient light | High equipment requirements; high power consumption; low edge accuracy; lower frame rate and resolution | Kinect v2, PMD CamCube 3.0
Passive Vision | - | Low hardware requirements and low cost; suitable for both indoor and outdoor use | Very sensitive to ambient light; unsuitable for monotonous scenes that lack texture; more complex calculations | Digital cameras, thermal cameras, multispectral cameras
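The passive binocular cameras in the table above recover depth by triangulating the disparity between rectified left and right views. A minimal sketch of that relation, with illustrative focal length and baseline values not tied to any camera listed:

```python
def disparity_to_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulate depth from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Illustrative values: 700 px focal length, 60 mm baseline, 35 px disparity
depth_m = disparity_to_depth(700.0, 0.060, 35.0)  # 1.2 m
```

The formula also explains the passive-vision disadvantage in the table: depth resolution degrades quadratically with distance, because a one-pixel disparity error at small disparities shifts Z much more than at large ones.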
Applied Crops | Perception | Sensors | Characteristic | Structure | Effect | Ref.
---|---|---|---|---|---|---
Red tomato, green tomato | Monocular RGB + ultrasonic | Pi camera (mobile) + HC-SR04 ultrasonic sensor (mobile) | Simple method, low cost, and adaptable to the limited computing resources of microcontrollers | - | Picking time: red 4.932 s, green 5.276 s | [63]
Cherry tomato | Monocular RGB + laser | FL3-U RGB camera + LY-LDS-61 laser sensor | Simple structure, accurate distance measurement, avoids obstacle obstruction | - | Harvesting success rate: 83% | [64,65]
Eggplant | Monocular RGB + depth camera (TOF) | Prosilica GC2450C RGB camera + Mesa SwissRanger SR4000 depth camera | High precision and sensitivity | - | Sensing time: 0.81 s | [65]
Strawberry | Far-field monocular RGB + close-up monocular RGB | 8 mm lens, 1280 × 976 px, 1/3-inch CCD telephoto camera (fixed) + 5 mm lens, 640 × 480 px, 1/2-inch CCD close-up camera (mobile) | Global and local image information can be obtained simultaneously and possible occlusions avoided, but it takes longer | - | Harvesting success rate: 84%; average harvest time: 10.7 s | [66]
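Combinations such as the monocular RGB + ultrasonic tomato system above pair a bearing derived from the image with a range from the rangefinder. A minimal sketch of that fusion, assuming a simple linear field-of-view model; the function name and all parameter values are illustrative, not from the cited systems:

```python
import math

def fuse_bearing_and_range(pixel_x, image_width, hfov_deg, range_m):
    """Fuse a camera detection (bearing) with an ultrasonic range reading.

    Assumes the pixel column maps proportionally to an angle within
    +/- hfov/2 of the optical axis. Returns a 2D (lateral, forward)
    position in meters relative to the sensor.
    """
    offset = (pixel_x - image_width / 2) / (image_width / 2)  # -1 .. 1
    bearing = math.radians(offset * hfov_deg / 2)
    return range_m * math.sin(bearing), range_m * math.cos(bearing)

# A detection dead-center in a 640 px image at 1.0 m lies straight ahead
lateral, forward = fuse_bearing_and_range(320, 640, 60.0, 1.0)
```

The appeal noted in the table is visible here: neither sensor alone localizes the fruit in 2D, but one multiplication and two trigonometric calls suffice on a microcontroller once both readings are available.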
Segmentation Technique | Applied Crops | Description | Advantages | Disadvantages | Applicable Environment | Examples | Ref.
---|---|---|---|---|---|---|---
Color threshold | Strawberry, cherry tomato, litchi | One or more thresholds partition the grayscale histogram; pixels whose grayscale values fall in the same class belong to the same object | Most commonly used; simple, fast, and computationally efficient | Cannot effectively segment targets with small grayscale differences or overlap; more sensitive to noise | When the difference between image background and target features is obvious | Otsu, K-means clustering, maximum entropy method | [64,68,71]
Edge detection | Oil palm fruit | Grayscale generally changes sharply at region boundaries, so these distinct edges are used to segment the image | Fast retrieval and good detection of different image edges | More sensitive to noise; trade-off between noise immunity and detection accuracy | Low-noise images with large edge-feature differences between regions | Canny, Sobel, Roberts, Prewitt, Laplacian | [69]
Region growing | Eggplant, kiwifruit, chili, guava | Divides the image into segmentation regions according to a similarity criterion | Good region characteristics; yields connected regions, overcoming the discontinuous segmentation produced by other methods | Prone to over-segmentation | When a definite structural division of the area is required | Meyer watershed, Adams seeded region growing, Gonzalez region split-and-merge | [46,65,72]
Graph theory | - | Removes specific edges to divide the graph into several subgraphs, achieving segmentation | Suitable for a wide range of target shapes | Longer running time | - | Graph Cuts, GrabCut, Random Walk | -
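As an illustration of the color-threshold row, Otsu's method chooses the threshold that maximizes the between-class variance of the grayscale histogram. A self-contained sketch on a toy bimodal "image" (the pixel values are illustrative; real pipelines would operate on a camera frame):

```python
def otsu_threshold(gray):
    """Return the Otsu threshold for a flat list of 8-bit grayscale values.

    Scans every candidate threshold t, splitting pixels into a background
    class (<= t) and foreground class (> t), and keeps the t that
    maximizes the between-class variance w_bg * w_fg * (mu_bg - mu_fg)^2.
    """
    hist = [0] * 256
    for v in gray:
        hist[v] += 1
    total = len(gray)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_bg, w_bg = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_bg += hist[t]
        if w_bg == 0:
            continue            # no background pixels yet
        w_fg = total - w_bg
        if w_fg == 0:
            break               # no foreground pixels left
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy image: dark background (20) and bright fruit pixels (200)
t = otsu_threshold([20] * 50 + [200] * 50)
```

This also makes the table's caveat concrete: when the two histogram modes overlap heavily (small grayscale difference between fruit and leaves), no threshold yields a large between-class variance, and the split becomes unreliable.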
Applied Crops | Classification Type | Mechanical Arm | Feature | Sketch Map | Ref.
---|---|---|---|---|---
Tomato | - | Single | Sieves out invalid subspace, solves the difficult-return problem, improves work efficiency | - | [49]
Strawberry | Regional independence | Double | Sufficient safety distance to avoid collision of the two robotic arms | - | [67]
Strawberry | Regional independence | Double | Picking areas are completely independent, fully solving the collision problem | - | [91]
Apple | Regional sharing | Quadruple | Effectively solves the problems of double picking and missed picking | - | [51]
Eggplant | Regional sharing | Double | Avoids collisions between robotic arms, effectively shortening picking time | - | [65]
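The "regional independence" strategy in the table above avoids arm collisions by giving each arm its own exclusive workspace. A hypothetical sketch of such a target assignment with a safety margin around the boundary; the function, coordinate convention, and all values are illustrative, not taken from the cited systems:

```python
def assign_targets(targets, boundary_x, margin=0.05):
    """Assign detected fruit positions (x, y) to a left or right arm.

    Each arm only receives targets strictly on its own side of the
    boundary; targets inside the safety margin are deferred, so the two
    arms never work close enough to each other to collide.
    """
    left, right, deferred = [], [], []
    for x, y in targets:
        if x < boundary_x - margin:
            left.append((x, y))
        elif x > boundary_x + margin:
            right.append((x, y))
        else:
            deferred.append((x, y))
    return left, right, deferred

# Three detections in normalized workspace coordinates
left, right, deferred = assign_targets(
    [(0.10, 0.20), (0.90, 0.30), (0.50, 0.50)], boundary_x=0.5
)
```

The deferred list mirrors the trade-off the table implies: strict regional independence guarantees collision-free operation but can leave boundary fruit unpicked, which is exactly what the "regional sharing" schemes address.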
Control Type | Control Method | Applied Crops | Mechanical Arm | Advantages | Disadvantages | Ref.
---|---|---|---|---|---|---
Classic Control | PID | Strawberry | Single | Simple to implement; easy to adapt; fast response; good stability | Sensitive to noise; difficult parameter tuning; cannot handle nonlinear or time-varying systems | [71]
Classic Control | PID | Eggplant | Double | Simple to implement; easy to adapt; fast response; good stability | Sensitive to noise; difficult parameter tuning; cannot handle nonlinear or time-varying systems | [65]
Modern Control | NMPC | - | Double | Wide applicability; robust; optimizable for multiple objectives; can handle constraints | Large computational load; difficult parameter tuning; high impact of model error; poor stability | [96]
Modern Control | Impedance control | Apple | Single | Wide adaptability; high robustness; high control accuracy; flexible interaction possible | Large computational load; difficult parameter tuning; high sensor requirements; limited stability | [97]
Modern Control | SMC | Famous tea | Single | Robust; rapid response | High-frequency oscillation (chattering); complex nonlinear design | [56]
Intelligent Control | Fuzzy control | Wolfberry | - | Robust; widely adaptable; adjustable control effect; flexible knowledge representation | Large computational load; difficult parameter tuning; unstable control effect | [33]
Intelligent Control | Fuzzy PID control | Wool | - | Robust; flexible fuzzy rules; easy to operate | Computationally complex; poor interpretability; difficult parameter selection | [98]
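The classic PID entries above compute a control signal from proportional, integral, and derivative terms of the tracking error. A minimal discrete-time sketch driving a toy first-order joint model; the gains and the plant are illustrative, not from the cited strawberry or eggplant systems:

```python
class PID:
    """Textbook discrete PID controller (illustrative gains used below)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        # No derivative kick on the very first sample
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a toy joint model (pure integrator plant) toward a 1.0 rad setpoint
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
pos = 0.0
for _ in range(3000):                     # 30 s of simulated time
    pos += pid.update(1.0, pos) * 0.01    # plant: velocity command integrated
```

The table's caveats show up directly in this structure: the three fixed gains must be re-tuned whenever the plant changes, and the derivative term amplifies measurement noise, which is why the modern and intelligent controllers in the lower rows trade simplicity for robustness.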
Figure | Applied Crops | Advantages | Improvements | Gripper Size | Recognition Accuracy | Picking Success Rate | Picking Time | Ref.
---|---|---|---|---|---|---|---|---
a | Apple | Minimized damage rate | Picking speed and accuracy | Diameter: 10.5 cm | - | - | 10.3 s | [10]
b | Apple | Minimized damage rate, less costly | Picking rate | - | 91% | 85% | 12 s | [99]
c | Tomato | Simple structure, less costly | Positioning speed, picking speed | Diameter: 9 cm | - | - | 9.6 s | [10]
d | Hangzhou white chrysanthemum | High recognition rate | Picking efficiency and accuracy | Diameter: 3 cm | 90% | 80% | 12.5 s | [44]
Figure | Applied Crops | Advantages | Improvements | Gripper Size | Recognition Accuracy | Picking Success Rate | Picking Time | Ref.
---|---|---|---|---|---|---|---|---
a | Strawberry | Accurately separates obstacles | Targeting accuracy, picking rate | Maximum clamping diameter: 60 mm; open diameter: 45 mm | - | 96.8% | 10.6 s | [71]
b | Litchi | Non-destructive picking | Picking rate | - | - | - | - | [62]
c | Cherry tomato | Stable clamping, low fruit-drop rate | Picking success rate | - | - | 83% | 8 s | [64]
d | Tomato | Fast picking rate | Recognition accuracy | - | - | - | 9.676 s | [63]
e | Grape | Small size, flexible | Robustness | Gripping shaft length: 30 mm | - | - | 9.6 s | [101]
Figure | Applied Crops | Advantages | Improvements | Gripper Size | Recognition Accuracy | Picking Success Rate | Picking Time | Ref.
---|---|---|---|---|---|---|---|---
a | Pineapple | Minimized damage rate, high picking success rate | Picking rate | Cylindrical radius: 100 mm; blade diameter: 130 mm | 95% | 80% | 14.9 s | [45]
b | Mulberry | Accurate separation of obstacles | Positioning accuracy | Maximum clamping diameter: 40 mm; open diameter: 25 mm | - | - | 10.6 s | [100]
c | Apple | Minimized damage rate | Picking rate and accuracy | - | - | - | 7.81 s | [102]
d | Apple | Minimized damage rate | Picking rate | Maximum diameter: 14 cm | 91% | 82% | 9.8 s | [103]
Fruit | Gripper Category | Recognition Rate | Recognition Accuracy | Average Picking Time | Picking Success Rate | Ref. | Year
---|---|---|---|---|---|---|---
Apples | Flexible grasping | - | 82.5% | 14.6 s | 72% | [59] | 2021
Apples | Flexible grasping | 0.012 s | - | - | 100% | [105] | 2021
Apples | Flexible grasping | - | - | 25.5 s | 96.67% | [93] | 2021
Apples | Flexible grasping | - | - | 7.3 s | 67% | [106] | 2019
Apples | Shearing-style | 0.015 s | - | - | - | [86] | 2021
Apples | Shearing-style | 0.181 s | 89% | - | - | [55] | 2020
Apples | Shearing-style | 0.235 s | 87.1% | 7 s | - | [52] | 2020
Tomatoes | Shearing-style | - | 92.8% | - | 73.04% | [54] | 2021
Tomatoes | Shearing-style | - | - | 9.676 s | - | [63] | 2019
Tomatoes | Shearing-style | 0.021 s | 94% | - | 100% | [117] | 2020
Tomatoes | Shearing-style | 0.096 s | - | - | 91.9% | [118] | 2018
Tomatoes | Shearing-style | - | 91.92% | - | - | [81] | 2020
Tomatoes | Flexible grasping | 0.016 s | - | 8 s | - | [113] | 2021
Tomatoes | Flexible grasping | - | 98% | - | - | [110] | 2021
Tomatoes | Flexible grasping | - | 89% | - | - | [119] | 2020
Strawberries | Cavity retrieval | 0.136 s | - | 6.1 s | 97.1% | [67] | 2019
Strawberries | Cavity retrieval | - | - | 10.6 s | 96.8% | [71] | 2019
Strawberries | Flexible grasping | 0.086 s | 93.1% | 4 s | - | [112] | 2018
Strawberries | Flexible grasping | - | - | 9.05 s | 96.8% | [73] | 2021
Strawberries | Flexible grasping | 0.049 s | - | 10.62 s | 96.77% | [120] | 2018
Strawberries | Flexible grasping | - | 86.58% | - | - | [83] | 2022
Strawberries | Flexible grasping | 0.062 s | 95.78% | - | - | [74] | 2019
Strawberries | Shearing-style | - | 94.43% | - | 84.35% | [79] | 2020
Strawberries | Shearing-style | - | - | 10.7 s | 84% | [66] | 2019
Sweet peppers | Shearing-style | - | 91.84% | - | 90.04% | [47] | 2020
Sweet peppers | Shearing-style | - | 96.91% | - | - | [84] | 2021
Sweet peppers | Shearing-style | 1.41 s | 86.4% | - | - | [46] | 2020
Litchi fruits | Shearing-style | 0.154 s | 93.5% | - | - | [121] | 2021
Litchi fruits | Shearing-style | 0.464 s | 83.33% | - | - | [89] | 2020
Litchi fruits | Shearing-style | - | 96.78% | - | - | [77] | 2020
Cherry tomatoes | Flexible grasping | - | - | 6.4 s | 84% | [41] | 2022
Cherry tomatoes | Shearing-style | - | - | 8 s | 83% | [64] | 2018
Cherry tomatoes | Shearing-style | - | - | 12.51 s | 99.81% | [49] | 2021
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ji, W.; Huang, X.; Wang, S.; He, X. A Comprehensive Review of the Research of the “Eye–Brain–Hand” Harvesting System in Smart Agriculture. Agronomy 2023, 13, 2237. https://doi.org/10.3390/agronomy13092237