A Review of Perception Technologies for Berry Fruit-Picking Robots: Advantages, Disadvantages, Challenges, and Prospects
Abstract
1. Introduction
2. Mechanism of Berry Fruit-Picking Robots
2.1. Principle
- Approaching the object: With the end-effector open, the robot system moves it close to the object.
- Coming into contact: The end-effector comes into contact with the object to be gripped.
- Increasing gripping force: The end-effector applies gripping force to take hold of the object.
- Securing the object: Once the gripping force is sufficient to hold the object securely, the force stops increasing.
- Lifting the object: While maintaining its grip, the end-effector, together with the robot system, moves the object to the desired position.
- Releasing the object: The end-effector releases its grip, separating from the gripped object.
- Monitoring the grasp: Throughout this sequence, sensors monitor the gripping process to judge the end-effector’s contact with the gripped object and the effectiveness of the grip (a minimal control-loop sketch follows this list).
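As a minimal illustration of this gripping sequence, the sketch below implements the approach–contact–grip–lift–release cycle as a force-feedback loop in Python. The `SimulatedGripper` class, the `move_to` stub, and the force threshold are hypothetical stand-ins introduced here for illustration; they do not correspond to any specific system cited in this review.

```python
import time

class SimulatedGripper:
    """Stand-in for an end-effector driver; contact force rises as the fingers close."""
    def __init__(self):
        self.force = 0.0
    def open(self):
        self.force = 0.0
    def close_step(self):
        self.force += 0.2            # each small closing step raises the contact force
    def read_force(self) -> float:
        return self.force            # tactile/force sensor reading in newtons

def move_to(pose):
    """Stand-in for manipulator motion; a real system would call its motion planner."""
    print(f"moving to {pose}")

GRIP_FORCE_TARGET = 2.0              # placeholder threshold [N]; tuned per fruit in practice

def pick_cycle(gripper, approach_pose, drop_pose):
    gripper.open()
    move_to(approach_pose)                           # approach the object with the gripper open
    while gripper.read_force() < GRIP_FORCE_TARGET:  # grasp monitoring runs inside this loop
        gripper.close_step()                         # increase grip force until the fruit is held
        time.sleep(0.01)
    move_to(drop_pose)                               # lift and transfer the secured fruit
    gripper.open()                                   # release the fruit

pick_cycle(SimulatedGripper(), approach_pose="above_fruit", drop_pose="collection_box")
```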
2.2. Picking and Gripping
2.3. Structures
2.3.1. Mobile Platform
2.3.2. Manipulator
2.3.3. End-Effector
3. An Overview of Perception Technologies for Berry Fruit-Picking Robots
- Visual Perception
- Tactile Perception
- Distance Measurement
- Switching Sensors
4. Methods and Analysis
4.1. Visual Perception
4.1.1. Methods
4.1.2. Analysis
4.2. Tactile Perception
4.2.1. Methods
4.2.2. Analysis
4.3. Distance Measurement
4.3.1. Methods
4.3.2. Analysis
4.4. Switching Sensors
4.4.1. Methods
4.4.2. Analysis
5. Discussion and Prospect
5.1. The Technical Characteristics of the Perception Technology
- Visual perception: The visual sensors exhibit high resolution and can accurately identify the fruit’s location, size, and ripeness. They facilitate real-time monitoring of fruit location and status, enabling the robot to adjust its actions accordingly. Additionally, visual perception can automatically distinguish between different types of fruits, ensuring the accuracy of the robot’s picking operations. However, one of the primary challenges in practical visual perception is maintaining accuracy across different scenes and conditions [180]. It is, therefore, necessary to adapt visual perception to the varying light and background conditions of different environments to improve the reliability and accuracy of fruit-picking tasks.
- Tactile perception: Sensing the fruit’s shape, hardness, and surface properties enables the robot to grasp it precisely. The tactile sensor delivers real-time data feedback, assisting the robot in adjusting its picking actions to enhance efficiency and quality. Additionally, the tactile sensor detects the pressure exerted by the robot upon contact with the fruit, preventing excessive force that could potentially damage it. Furthermore, the tactile sensor adapts to the varying characteristics of different fruit types and sizes, ensuring precise and reliable harvesting.
- Distance measurement: The distance measurement sensor accurately gauges the distance between the robot and the target fruit, ensuring the fruit can be grasped without damage. Additionally, the sensor monitors real-time changes in distance, enabling prompt adjustments to the robot’s movements for a stable picking process. The sensor also detects distances to obstacles, allowing the robot to avoid collisions and ensuring the safety of the robot and its environment. Furthermore, it aids the robot in accurately locating fruit, thereby enhancing the efficiency and precision of the picking process.
- Switching sensors: Firstly, they enable control of the robot’s actions based on detected state information, such as initiating or halting the movement of the manipulator and directing the robot’s movement. Secondly, they provide precise control over the robot’s motions. The switch sensors monitor the status of various robot components, including the opening and closing of the robotic manipulator and overall movement. They can also detect contact between the robot and obstacles, immediately stopping movement upon collision to prevent damage to both the robot and its surroundings. Furthermore, the switch sensors assess the operational status of various components, facilitating timely fault detection, diagnosis, and repair. A minimal sketch combining distance measurement and switch sensing during an approach move follows this list.
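As a concrete illustration of how distance measurement and switching sensors complement each other, the sketch below advances the end-effector while the range reading is above a stand-off distance and aborts immediately if a contact (limit) switch trips. The `SimulatedApproach` class and all numeric values are hypothetical placeholders, not parameters of any cited robot.

```python
import random

STANDOFF_DISTANCE_M = 0.05   # stop advancing once within 5 cm of the fruit (placeholder)
STEP_M = 0.01                # simulated advance per control step

class SimulatedApproach:
    """Stand-ins for a range sensor, a contact (limit) switch, and the drive axis."""
    def __init__(self, start_distance_m=0.30):
        self.distance_m = start_distance_m
    def read_distance_m(self) -> float:
        return self.distance_m               # ultrasonic/infrared/LiDAR range reading
    def contact_switch_pressed(self) -> bool:
        return random.random() < 0.01        # rare unexpected contact event
    def advance_step(self):
        self.distance_m -= STEP_M            # end-effector moves toward the fruit

def approach_target(rig: SimulatedApproach) -> bool:
    """Advance toward the fruit; abort if the collision switch trips first."""
    while rig.read_distance_m() > STANDOFF_DISTANCE_M:
        if rig.contact_switch_pressed():
            return False                     # halt immediately on unexpected contact
        rig.advance_step()
    return True                              # within gripping range without a collision

print("reached stand-off" if approach_target(SimulatedApproach()) else "aborted on contact")
```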
5.2. Advanced Techniques for Berry Fruit Picking
5.3. Challenges and Future Opportunities
- Accuracy: The accuracy of a robot in berry fruit picking hinges on its ability to precisely identify and locate fruits, assess their ripeness, and execute picking actions with minimal damage or errors. Reliable perception technology is essential for correctly identifying different types of fruits. However, current technology often fails to achieve this desired level of accuracy. Challenges such as visual perception errors caused by lighting conditions, occlusion, wind effects, and background interference contribute to omissions and misdetections in optical perception systems. As a result, ensuring consistent accuracy throughout the berry fruit-picking process remains a challenge.
- Adaptability: Adaptability in robotics refers to the ability of a robot to adjust its operations and techniques in response to varying conditions and situations. This includes accommodating different types of berries, adapting to changes in weather conditions, navigating different soil types, and handling various crop layouts. The goal is to optimize efficiency and effectiveness in harvesting. During berry picking, environmental factors like light, humidity, and temperature can affect sensor performance, impacting perception capabilities. It is essential for perception technology to be adaptable in natural environments, enabling robots to perform tasks such as obstacle avoidance, object recognition, and precise object separation. These tasks can be challenging due to the unpredictable characteristics of natural environments.
- Durability: Durability in robotics refers to a robot’s ability to withstand wear, corrosion, and physical damage in its operational environment. This includes resisting high temperatures, humidity, dust, and the vibrations and shocks encountered during berry picking. Ensuring sensors remain stable and functional over extended periods is crucial for reliable performance in challenging conditions.
- Data processing: Data processing in robotics involves the collection, analysis, and utilization of sensor data. Sensors gather extensive information during operations, such as the location, ripeness, and size of berries. Real-time processing is essential to adjust the robot’s movements promptly. Numerical techniques applied to the acquired sensory data may also be susceptible to internal bias [184]. Additionally, data may contain noise or errors, requiring cleaning and correction. Integration of data from various sensors is vital for comprehensive analysis; a minimal filtering and fusion sketch follows this list. Thus, data processing poses a major challenge in perception technology.
- Low cost: Low cost refers to cost-effectiveness that reduces research and development expenses, making the technology affordable and accessible for farms of all sizes. The sensor is a crucial component of the perception system. However, high-quality sensors with solid performance remain expensive, and they may become faulty or damaged during use, requiring maintenance or replacement. Therefore, it is necessary to reduce the cost of sensors while ensuring their quality and performance.
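To make the data-processing challenge concrete, the sketch below shows two routine steps under illustrative assumptions: smoothing a noisy range-sensor stream with a moving average, and fusing two independent distance estimates by inverse-variance weighting. The sample values and variances are placeholders, not measurements from any cited system.

```python
from collections import deque

def moving_average(window_size: int):
    """Return an update function that smooths a noisy sensor stream with a sliding window."""
    window = deque(maxlen=window_size)
    def update(sample: float) -> float:
        window.append(sample)
        return sum(window) / len(window)
    return update

def fuse_two_estimates(x1: float, var1: float, x2: float, var2: float) -> float:
    """Inverse-variance weighted fusion of two independent estimates of the same quantity."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    return (w1 * x1 + w2 * x2) / (w1 + w2)

# Placeholder ultrasonic readings in metres; 0.90 is a noise spike to be smoothed out.
smooth = moving_average(window_size=5)
readings = [0.32, 0.29, 0.35, 0.30, 0.31, 0.90, 0.30]
filtered = [smooth(r) for r in readings]

# Combine the smoothed ultrasonic estimate with a (placeholder) stereo-vision depth estimate.
fused = fuse_two_estimates(x1=filtered[-1], var1=0.01, x2=0.305, var2=0.002)
print(f"filtered: {filtered[-1]:.3f} m, fused: {fused:.3f} m")
```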
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
- Yang, L.; Sun, G. Research progress of berry and berry juice. Food Res. Dev. 2008, 5, 183–188. [Google Scholar] [CrossRef]
- Kähkönen, M.P.; Hopia, A.I.; Vuorela, H.J.; Rauha, J.-P.; Pihlaja, K.; Kujala, T.S.; Heinonen, M. Antioxidant Activity of Plant Extracts Containing Phenolic Compounds. J. Agric. Food Chem. 1999, 47, 3954–3962. [Google Scholar] [CrossRef] [PubMed]
- Hertog, M.G.L.; Feskens, E.J.M.; Kromhout, D.; Hertog, M.G.L.; Hollman, P.C.H.; Hertog, M.G.L.; Katan, M.B. Dietary Antioxidant Flavonoids and Risk of Coronary Heart Disease: The Zutphen Elderly Study. Lancet 1993, 342, 1007–1011. [Google Scholar] [CrossRef] [PubMed]
- Li, J.; Gao, C.; Xiao, B. Wild Fruit Development and Comprehensive Utilization, 1st ed.; Scientific and Technical Documentation Press: Beijing, China, 1998; ISBN 7-5023-0922-5. [Google Scholar]
- Du, G.; Yao, F.; Cao, J.; Li, Y.; Li, F.; Jia, Q. Design of the artificial-assisted single-drive device for picking multi-fruit strawberries of ridge cultivation. J. Mach. Des. 2020, 37, 19–23. [Google Scholar] [CrossRef]
- Yuan, P.; Zhu, X.; You, J.; Han, C.; Zhang, X.; Guo, H. Development of crankshaft vibration threshing and harvesting equipment for wine grape. Trans. CSAE 2020, 36, 67–74. [Google Scholar] [CrossRef]
- Tai, K.; El-Sayed, A.-R.; Shahriari, M.; Biglarbegian, M.; Mahmud, S. State of the Art Robotic Grippers and Applications. Robotics 2016, 5, 11. [Google Scholar] [CrossRef]
- Brown, E.; Rodenberg, N.; Amend, J.; Mozeika, A.; Steltz, E.; Zakin, M.R.; Lipson, H.; Jaeger, H.M. Universal Robotic Gripper Based on the Jamming of Granular Material. Proc. Natl. Acad. Sci. USA 2010, 107, 18809–18814. [Google Scholar] [CrossRef]
- Muscato, G.; Prestifilippo, M.; Abbate, N.; Rizzuto, I. A Prototype of an Orange Picking Robot: Past History, the New Robot and Experimental Results. Ind. Robot 2005, 32, 128–138. [Google Scholar] [CrossRef]
- Clement, R.G.E.; Bugler, K.E.; Oliver, C.W. Bionic Prosthetic Hands: A Review of Present Technology and Future Aspirations. Surgeon 2011, 9, 336–340. [Google Scholar] [CrossRef] [PubMed]
- Pettersson, A.; Davis, S.; Gray, J.O.; Dodd, T.J.; Ohlsson, T. Design of a Magnetorheological Robot Gripper for Handling of Delicate Food Products with Varying Shapes. J. Food Eng. 2010, 98, 332–338. [Google Scholar] [CrossRef]
- Huang, S.; Wang, B.; Zhao, Z.; Wang, L.; Weng, L. Recognition of Magnetostrictive Tactile Sensor Array Applied to Manipulator. Trans. China Electrotech. Soc. 2021, 36, 1416–1424. [Google Scholar] [CrossRef]
- Iñiguez-Moreno, M.; González-González, R.B.; Flores-Contreras, E.A.; Araújo, R.G.; Chen, W.N.; Alfaro-Ponce, M.; Iqbal, H.M.N.; Melchor-Martínez, E.M.; Parra-Saldívar, R. Nano and Technological Frontiers as a Sustainable Platform for Postharvest Preservation of Berry Fruits. Foods 2023, 12, 3159. [Google Scholar] [CrossRef] [PubMed]
- Zhang, B.; Xie, Y.; Zhou, J.; Wang, K.; Zhang, Z. State-of-the-Art Robotic Grippers, Grasping and Control Strategies, as Well as Their Applications in Agricultural Robots: A Review. Comput. Electron. Agric. 2020, 177, 105694. [Google Scholar] [CrossRef]
- Jin, Y.; Liu, J.; Wang, J.; Xu, Z.; Yuan, Y. Far-near Combined Positioning of Picking-Point Based on Depth Data Features for Horizontal-Trellis Cultivated Grape. Comput. Electron. Agric. 2022, 194, 106791. [Google Scholar] [CrossRef]
- Ma, L.; He, Z.; Zhu, Y.; Jia, L.; Wang, Y.; Ding, X.; Cui, Y. A Method of Grasping Detection for Kiwifruit Harvesting Robot Based on Deep Learning. Agronomy 2022, 12, 3096. [Google Scholar] [CrossRef]
- Parsa, S.; Debnath, B.; Khan, M.A.; Amir, G.E. Modular Autonomous Strawberry Picking Robotic System. J. Field Robot. 2023, 1–21. [Google Scholar] [CrossRef]
- Tang, Q.; Liang, J.; Zhu, F. A Comparative Review on Multi-Modal Sensors Fusion Based on Deep Learning. Signal Process. 2023, 213, 109165. [Google Scholar] [CrossRef]
- Chlingaryan, A.; Sukkarieh, S.; Whelan, B. Machine Learning Approaches for Crop Yield Prediction and Nitrogen Status Estimation in Precision Agriculture: A Review. Comput. Electron. Agric. 2018, 151, 61–69. [Google Scholar] [CrossRef]
- Van Henten, E.J.; Schenk, E.J.; Van Willigenburg, L.G.; Meuleman, J.; Barreiro, P. Collision-Free Inverse Kinematics of the Redundant Seven-Link Manipulator Used in a Cucumber Picking Robot. Biosyst. Eng. 2010, 106, 112–124. [Google Scholar] [CrossRef]
- Ting, K.C.; Giacomelli, G.A.; Shen, S.J.; Kabala, W.P. Robot Workcell for Transplanting of Seedlings Part II—End-effector Development. Trans. ASAE 1990, 33, 1013–1017. [Google Scholar] [CrossRef]
- Aqeel-ur-Rehman; Abbasi, A.Z.; Islam, N.; Shaikh, Z.A. A Review of Wireless Sensors and Networks’ Applications in Agriculture. Comput. Stand. Interfaces 2014, 36, 263–270. [Google Scholar] [CrossRef]
- Bac, C.W.; van Henten, E.J.; Hemming, J.; Edan, Y. Harvesting Robots for High-value Crops: State-of-the-art Review and Challenges Ahead. J. Field Robot. 2014, 31, 888–911. [Google Scholar] [CrossRef]
- Weiss, U.; Biber, P. Plant Detection and Mapping for Agricultural Robots Using a 3D LIDAR Sensor. Robot. Auton. Syst. 2011, 59, 265–273. [Google Scholar] [CrossRef]
- Fantoni, G.; Gabelloni, D.; Tilli, J. Concept Design of New Grippers Using Abstraction and Analogy. Proc. Inst. Mech. Eng. Part B: J. Eng. Manuf. 2013, 227, 1521–1532. [Google Scholar] [CrossRef]
- Fantoni, G.; Santochi, M.; Dini, G.; Tracht, K.; Scholz-Reiter, B.; Fleischer, J.; Kristoffer Lien, T.; Seliger, G.; Reinhart, G.; Franke, J.; et al. Grasping Devices and Methods in Automated Production Processes. CIRP Ann. 2014, 63, 679–701. [Google Scholar] [CrossRef]
- Williamson, J.G.; Cline, W.O. Mechanized Harvest of Southern Highbush Blueberries for the Fresh Market: An Introduction and Overview of the Workshop Proceedings. HortTechnology 2013, 23, 416–418. [Google Scholar] [CrossRef]
- S, V.R.; Parsa, S.; Parsons, S.; E, A.G. Peduncle Gripping and Cutting Force for Strawberry Harvesting Robotic End-Effector Design. In Proceedings of the 2022 4th International Conference on Control and Robotics (ICCR), Guangzhou, China, 2–4 December 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 59–64. [Google Scholar]
- Hayashi, S.; Shigematsu, K.; Yamamoto, S.; Kobayashi, K.; Kohno, Y.; Kamata, J.; Kurita, M. Evaluation of a Strawberry-Harvesting Robot in a Field Test. Biosyst. Eng. 2010, 105, 160–171. [Google Scholar] [CrossRef]
- Aliasgarian, S.; Ghassemzadeh, H.R.; Moghaddam, M.; Ghaffari, H. Mechanical Damage Of Strawberry During Harvest And Postharvest Operations. Acta Technol. Agric. 2015, 18, 1–5. [Google Scholar] [CrossRef]
- Kurpaska, S.; Sobol, Z.; Pedryc, N.; Hebda, T.; Nawara, P. Analysis of the Pneumatic System Parameters of the Suction Cup Integrated with the Head for Harvesting Strawberry Fruit. Sensors 2020, 20, 4389. [Google Scholar] [CrossRef] [PubMed]
- Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V. Development and Field Evaluation of a Strawberry Harvesting Robot with a Cable-Driven Gripper. Comput. Electron. Agric. 2019, 157, 392–402. [Google Scholar] [CrossRef]
- Nevliudov, I.; Novoselov, S.; Sychova, O.; Tesliuk, S. Development of the Architecture of the Base Platform Agricultural Robot for Determining the Trajectory Using the Method of Visual Odometry. In Proceedings of the 2021 IEEE XVIIth International Conference on the Perspective Technologies and Methods in MEMS Design (MEMSTECH), Polyana (Zakarpattya), Ukraine, 12–16 May 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 64–68. [Google Scholar]
- Wang, J.; Luo, Z.; Wang, Y.; Yang, B.; Assadian, F. Coordination Control of Differential Drive Assist Steering and Vehicle Stability Control for Four-Wheel-Independent-Drive EV. IEEE Trans. Veh. Technol. 2018, 67, 11453–11467. [Google Scholar] [CrossRef]
- De Santiago, J.; Bernhoff, H.; Ekergård, B.; Eriksson, S.; Ferhatovic, S.; Waters, R.; Leijon, M. Electrical Motor Drivelines in Commercial All-Electric Vehicles: A Review. IEEE Trans. Veh. Technol. 2012, 61, 475–484. [Google Scholar] [CrossRef]
- Wu, X.; Xu, M.; Wang, L. Differential Speed Steering Control for Four-Wheel Independent Driving Electric Vehicle. In Proceedings of the 2013 IEEE International Symposium on Industrial Electronics, Taipei, Taiwan, 28–31 May 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 1–6. [Google Scholar]
- Veneri, M.; Massaro, M. The Effect of Ackermann Steering on the Performance of Race Cars. Veh. Syst. Dyn. 2021, 59, 907–927. [Google Scholar] [CrossRef]
- Xu, T.; Ma, S.; Xu, H.; Mo, S.; Li, Y. Application of Ackermann Steering in Obstacle Crossing Platform of Six-Wheeled Robots. In Proceedings of the 2023 2nd International Symposium on Control Engineering and Robotics (ISCER), Hangzhou, China, 17–19 October 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 239–243. [Google Scholar]
- Simionescu, P.A.; Beale, D. Optimum Synthesis of the Four-Bar Function Generator in Its Symmetric Embodiment: The Ackermann Steering Linkage. Mech. Mach. Theory 2002, 37, 1487–1504. [Google Scholar] [CrossRef]
- Zhao, J.-S.; Liu, X.; Feng, Z.-J.; Dai, J.S. Design of an Ackermann-Type Steering Mechanism. Proc. Inst. Mech. Eng. Part C: J. Mech. Eng. Sci. 2013, 227, 2549–2562. [Google Scholar] [CrossRef]
- Kuslits, M.; Bestle, D. Modelling and Control of a New Differential Steering Concept. Veh. Syst. Dyn. 2019, 57, 520–542. [Google Scholar] [CrossRef]
- Gfrerrer, A. Geometry and Kinematics of the Mecanum Wheel. Comput. Aided Geom. Des. 2008, 25, 784–791. [Google Scholar] [CrossRef]
- Dickerson, S.L.; Lapin, B.D. Control of an Omni-Directional Robotic Vehicle with Mecanum Wheels. In Proceedings of the NTC ’91—National Telesystems Conference Proceedings, Atlanta, GA, USA, 26–27 March 1991; IEEE: Atlanta, GA, USA, 1991; pp. 323–328. [Google Scholar]
- Hryniewicz, P.; Gwiazda, A.; Banaś, W.; Sękala, A.; Foit, K. Modelling of a Mecanum Wheel Taking into Account the Geometry of Road Rollers. IOP Conf. Ser. Mater. Sci. Eng. 2017, 227, 012060. [Google Scholar] [CrossRef]
- Ben-Tzvi, P.; Saab, W. A Hybrid Tracked-Wheeled Multi-Directional Mobile Robot. J. Mech. Robot. 2019, 11, 041008. [Google Scholar] [CrossRef]
- Sun, Y.; Xu, L.; Jing, B.; Chai, X.; Li, Y. Development of a Four-Point Adjustable Lifting Crawler Chassis and Experiments in a Combine Harvester. Comput. Electron. Agric. 2020, 173, 105416. [Google Scholar] [CrossRef]
- Tinoco, V.; Silva, M.F.; Santos, F.N.; Valente, A.; Rocha, L.F.; Magalhães, S.A.; Santos, L.C. An Overview of Pruning and Harvesting Manipulators. IR 2022, 49, 688–695. [Google Scholar] [CrossRef]
- Lu, J.; Zou, T.; Jiang, X. A Neural Network Based Approach to Inverse Kinematics Problem for General Six-Axis Robots. Sensors 2022, 22, 8909. [Google Scholar] [CrossRef] [PubMed]
- Boryga, M.; Kołodziej, P.; Graboś, A.; Gołacki, K. Mapping Accuracy of Trajectories of Manipulator Motion. ITM Web Conf. 2018, 21, 00009. [Google Scholar] [CrossRef]
- Kucuk, S.; Bingul, Z. Robot Kinematics: Forward and Inverse Kinematics. In Industrial Robotics: Theory, Modelling and Control; Cubero, S., Ed.; Pro Literatur Verlag: Berlin, Germany; ARS: Linz, Austria, 2006; ISBN 978-3-86611-285-8. [Google Scholar]
- Ames, B.; Morgan, J.; Konidaris, G. IKFlow: Generating Diverse Inverse Kinematics Solutions. IEEE Robot. Autom. Lett. 2022, 7, 7177–7184. [Google Scholar] [CrossRef]
- Fang, G.; Tian, Y.; Yang, Z.-X.; Geraedts, J.M.P.; Wang, C.C.L. Efficient Jacobian-Based Inverse Kinematics with Sim-to-Real Transfer of Soft Robots by Learning. IEEE/ASME Trans. Mechatron. 2022, 27, 5296–5306. [Google Scholar] [CrossRef]
- Marconi, G.M.; Camoriano, R.; Rosasco, L.; Ciliberto, C. Structured Prediction for CRiSP Inverse Kinematics Learning with Misspecified Robot Models. IEEE Robot. Autom. Lett. 2021, 6, 5650–5657. [Google Scholar] [CrossRef]
- Zhao, G.; Jiang, D.; Liu, X.; Tong, X.; Sun, Y.; Tao, B.; Kong, J.; Yun, J.; Liu, Y.; Fang, Z. A Tandem Robotic Arm Inverse Kinematic Solution Based on an Improved Particle Swarm Algorithm. Front. Bioeng. Biotechnol. 2022, 10, 832829. [Google Scholar] [CrossRef] [PubMed]
- Pfeiffer, F.; Johanni, R. A Concept for Manipulator Trajectory Planning. IEEE J. Robot. Autom. 1987, 3, 115–123. [Google Scholar] [CrossRef]
- Luh, J.; Walker, M.; Paul, R. Resolved-Acceleration Control of Mechanical Manipulators. IEEE Trans. Automat. Contr. 1980, 25, 468–474. [Google Scholar] [CrossRef]
- Meng, F.; Li, J.; Zhang, Y.; Qi, S.; Tang, Y. Transforming Unmanned Pineapple Picking with Spatio-Temporal Convolutional Neural Networks. Comput. Electron. Agric. 2023, 214, 108298. [Google Scholar] [CrossRef]
- Dimeas, F.; Sako, D.V.; Moulianitis, V.C.; Aspragathos, N.A. Design and Fuzzy Control of a Robotic Gripper for Efficient Strawberry Harvesting. Robotica 2014, 33, 1085–1098. [Google Scholar] [CrossRef]
- Gunderman, A.; Collins, J.; Myers, A.; Threlfall, R.; Chen, Y. Tendon-Driven Soft Robotic Gripper for Blackberry Harvesting. IEEE Robot. Autom. Lett. 2022, 7, 2652–2659. [Google Scholar] [CrossRef]
- Zhang, Y.; Zhang, K.; Yang, L.; Zhang, D.; Cui, T.; Yu, Y.; Liu, H. Design and Simulation Experiment of Ridge Planting Strawberry Picking Manipulator. Comput. Electron. Agric. 2023, 208, 107690. [Google Scholar] [CrossRef]
- Feng, Q.; Chen, J.; Zhang, M.; Wang, X. Design and Test of Harvesting Robot for Table-Top Cultivated Strawberry. In Proceedings of the 2019 WRC Symposium on Advanced Robotics and Automation (WRC SARA), Beijing, China, 21–22 August 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 80–85. [Google Scholar]
- Williams, H.; Ting, C.; Nejati, M.; Jones, M.H.; Penhall, N.; Lim, J.; Seabright, M.; Bell, J.; Ahn, H.S.; Scarfe, A.; et al. Improvements to and Large-scale Evaluation of a Robotic Kiwifruit Harvester. J. Field Robot. 2020, 37, 187–201. [Google Scholar] [CrossRef]
- Carbone, G. (Ed.) Grasping in Robotics; Mechanisms and Machine Science; Springer: London, UK, 2013; Volume 10, ISBN 978-1-4471-4663-6. [Google Scholar]
- Li, H.; Gu, Z.; He, D.; Wang, X.; Huang, J.; Mo, Y.; Li, P.; Huang, Z.; Wu, F. A Lightweight Improved YOLOv5s Model and Its Deployment for Detecting Pitaya Fruits in Daytime and Nighttime Light-Supplement Environments. Comput. Electron. Agric. 2024, 220, 108914. [Google Scholar] [CrossRef]
- Bello, R.-W.; Oladipo, M.A. Mask YOLOv7-Based Drone Vision System for Automated Cattle Detection and Counting. AIA 2024, 2, 129–139. [Google Scholar] [CrossRef]
- Chen, L.; Li, S.; Bai, Q.; Yang, J.; Jiang, S.; Miao, Y. Review of Image Classification Algorithms Based on Convolutional Neural Networks. Remote Sens. 2021, 13, 4712. [Google Scholar] [CrossRef]
- Russakovsky, O.; Deng, J.; Su, H.; Krause, J.; Satheesh, S.; Ma, S.; Huang, Z.; Karpathy, A.; Khosla, A.; Bernstein, M.; et al. ImageNet Large Scale Visual Recognition Challenge. Int. J. Comput. Vis. 2015, 115, 211–252. [Google Scholar] [CrossRef]
- Hafiz, A.M.; Bhat, G.M. A Survey on Instance Segmentation: State of the Art. Int. J. Multimed. Info. Retr. 2020, 9, 171–189. [Google Scholar] [CrossRef]
- Kim, D.-H.; Lu, N.; Ghaffari, R.; Kim, Y.-S.; Lee, S.P.; Xu, L.; Wu, J.; Kim, R.-H.; Song, J.; Liu, Z.; et al. Materials for Multifunctional Balloon Catheters with Capabilities in Cardiac Electrophysiological Mapping and Ablation Therapy. Nat. Mater. 2011, 10, 316–323. [Google Scholar] [CrossRef] [PubMed]
- Lee, M.H.; Nicholls, H.R. Review Article Tactile Sensing for Mechatronics—A State of the Art Survey. Mechatronics 1999, 9, 1–31. [Google Scholar] [CrossRef]
- Qu, J.; Mao, B.; Li, Z.; Xu, Y.; Zhou, K.; Cao, X.; Fan, Q.; Xu, M.; Liang, B.; Liu, H.; et al. Recent Progress in Advanced Tactile Sensing Technologies for Soft Grippers. Adv. Funct. Mater. 2023, 33, 2306249. [Google Scholar] [CrossRef]
- Dargahi, J.; Najarian, S. Advances in Tactile Sensors Design/Manufacturing and Its Impact on Robotics Applications—A Review. Ind. Robot 2005, 32, 268–281. [Google Scholar] [CrossRef]
- Sam, R.; Nefti, S. Design and Development of Flexible Robotic Gripper for Handling Food Products. In Proceedings of the 2008 10th International Conference on Control, Automation, Robotics and Vision, Hanoi, Vietnam, 17–20 December 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 1684–1689. [Google Scholar]
- Shi, J.; Wang, L.; Dai, Z.; Zhao, L.; Du, M.; Li, H.; Fang, Y. Multiscale Hierarchical Design of a Flexible Piezoresistive Pressure Sensor with High Sensitivity and Wide Linearity Range. Small 2018, 14, 1800819. [Google Scholar] [CrossRef] [PubMed]
- Cao, M.; Su, J.; Fan, S.; Qiu, H.; Su, D.; Li, L. Wearable Piezoresistive Pressure Sensors Based on 3D Graphene. Chem. Eng. J. 2021, 406, 126777. [Google Scholar] [CrossRef]
- Yang, T.; Deng, W.; Chu, X.; Wang, X.; Hu, Y.; Fan, X.; Song, J.; Gao, Y.; Zhang, B.; Tian, G.; et al. Hierarchically Microstructure-Bioinspired Flexible Piezoresistive Bioelectronics. ACS Nano 2021, 15, 11555–11563. [Google Scholar] [CrossRef] [PubMed]
- Hwang, J.; Kim, Y.; Yang, H.; Oh, J.H. Fabrication of Hierarchically Porous Structured PDMS Composites and Their Application as a Flexible Capacitive Pressure Sensor. Compos. Part B Eng. 2021, 211, 108607. [Google Scholar] [CrossRef]
- Yang, J.C.; Kim, J.-O.; Oh, J.; Kwon, S.Y.; Sim, J.Y.; Kim, D.W.; Choi, H.B.; Park, S. Microstructured Porous Pyramid-Based Ultrahigh Sensitive Pressure Sensor Insensitive to Strain and Temperature. ACS Appl. Mater. Interfaces 2019, 11, 19472–19480. [Google Scholar] [CrossRef] [PubMed]
- Yang, J.; Luo, S.; Zhou, X.; Li, J.; Fu, J.; Yang, W.; Wei, D. Flexible, Tunable, and Ultrasensitive Capacitive Pressure Sensor with Microconformal Graphene Electrodes. ACS Appl. Mater. Interfaces 2019, 11, 14997–15006. [Google Scholar] [CrossRef] [PubMed]
- Lin, W.; Wang, B.; Peng, G.; Shan, Y.; Hu, H.; Yang, Z. Skin-Inspired Piezoelectric Tactile Sensor Array with Crosstalk-Free Row+Column Electrodes for Spatiotemporally Distinguishing Diverse Stimuli. Adv. Sci. 2021, 8, 2002817. [Google Scholar] [CrossRef] [PubMed]
- Peng, Y.; Que, M.; Lee, H.E.; Bao, R.; Wang, X.; Lu, J.; Yuan, Z.; Li, X.; Tao, J.; Sun, J.; et al. Achieving High-Resolution Pressure Mapping via Flexible GaN/ZnO Nanowire LEDs Array by Piezo-Phototronic Effect. Nano Energy 2019, 58, 633–640. [Google Scholar] [CrossRef]
- Wang, X.; Zhang, H.; Yu, R.; Dong, L.; Peng, D.; Zhang, A.; Zhang, Y.; Liu, H.; Pan, C.; Wang, Z.L. Dynamic Pressure Mapping of Personalized Handwriting by a Flexible Sensor Matrix Based on the Mechanoluminescence Process. Adv. Mater. 2015, 27, 2324–2331. [Google Scholar] [CrossRef] [PubMed]
- Wang, X.; Zhang, H.; Dong, L.; Han, X.; Du, W.; Zhai, J.; Pan, C.; Wang, Z.L. Self-Powered High-Resolution and Pressure-Sensitive Triboelectric Sensor Matrix for Real-Time Tactile Mapping. Adv. Mater. 2016, 28, 2896–2903. [Google Scholar] [CrossRef] [PubMed]
- Wang, L.; Liu, Y.; Liu, Q.; Zhu, Y.; Wang, H.; Xie, Z.; Yu, X.; Zi, Y. A Metal-Electrode-Free, Fully Integrated, Soft Triboelectric Sensor Array for Self-Powered Tactile Sensing. Microsyst. Nanoeng. 2020, 6, 59. [Google Scholar] [CrossRef]
- Wang, X.; Zhang, Y.; Zhang, X.; Huo, Z.; Li, X.; Que, M.; Peng, Z.; Wang, H.; Pan, C. A Highly Stretchable Transparent Self-Powered Triboelectric Tactile Sensor with Metallized Nanofibers for Wearable Electronics. Adv. Mater. 2018, 30, 1706738. [Google Scholar] [CrossRef] [PubMed]
- Pang, C. A Flexible and Highly Sensitive Strain-Gauge Sensor Using Reversible Interlocking of Nanofibres. Nat. Mater. 2012, 11, 795–801. [Google Scholar] [CrossRef] [PubMed]
- Tiwana, M.I.; Redmond, S.J.; Lovell, N.H. A Review of Tactile Sensing Technologies with Applications in Biomedical Engineering. Sens. Actuators A Phys. 2012, 179, 17–31. [Google Scholar] [CrossRef]
- Duan, Y.; He, S.; Wu, J.; Su, B.; Wang, Y. Recent Progress in Flexible Pressure Sensor Arrays. Nanomaterials 2022, 12, 2495. [Google Scholar] [CrossRef] [PubMed]
- Gao, Y.; Xiao, T.; Li, Q.; Chen, Y.; Qiu, X.; Liu, J.; Bian, Y.; Xuan, F. Flexible Microstructured Pressure Sensors: Design, Fabrication and Applications. Nanotechnology 2022, 33, 322002. [Google Scholar] [CrossRef] [PubMed]
- Mishra, R.B.; El-Atab, N.; Hussain, A.M.; Hussain, M.M. Recent Progress on Flexible Capacitive Pressure Sensors: From Design and Materials to Applications. Adv. Mater. Technol. 2021, 6, 2001023. [Google Scholar] [CrossRef]
- Hammock, M.L.; Chortos, A.; Tee, B.C.-K.; Tok, J.B.-H.; Bao, Z. 25th Anniversary Article: The Evolution of Electronic Skin (E-Skin): A Brief History, Design Considerations, and Recent Progress. Adv. Mater. 2013, 25, 5997–6038. [Google Scholar] [CrossRef] [PubMed]
- Peng, Y.; Yang, N.; Xu, Q.; Dai, Y.; Wang, Z. Recent Advances in Flexible Tactile Sensors for Intelligent Systems. Sensors 2021, 21, 5392. [Google Scholar] [CrossRef] [PubMed]
- Zhou, K.; Zhao, Y.; Sun, X.; Yuan, Z.; Zheng, G.; Dai, K.; Mi, L.; Pan, C.; Liu, C.; Shen, C. Ultra-Stretchable Triboelectric Nanogenerator as High-Sensitive and Self-Powered Electronic Skins for Energy Harvesting and Tactile Sensing. Nano Energy 2020, 70, 104546. [Google Scholar] [CrossRef]
- Benet, G.; Blanes, F.; Simó, J.E.; Pérez, P. Using Infrared Sensors for Distance Measurement in Mobile Robots. Robot. Auton. Syst. 2002, 40, 255–266. [Google Scholar] [CrossRef]
- Abbas, I.; Liu, J.; Faheem, M.; Noor, R.S.; Shaikh, S.A.; Solangi, K.A.; Raza, S.M. Different Sensor Based Intelligent Spraying Systems in Agriculture. Sens. Actuators A Phys. 2020, 316, 112265. [Google Scholar] [CrossRef]
- Hauptmann, P.; Hoppe, N.; Püttmer, A. Application of Ultrasonic Sensors in the Process Industry. Meas. Sci. Technol. 2002, 13, R73–R83. [Google Scholar] [CrossRef]
- Jiang, Q.; Zhang, M.; Xu, B. Application of Ultrasonic Technology in Postharvested Fruits and Vegetables Storage: A Review. Ultrason. Sonochem. 2020, 69, 105261. [Google Scholar] [CrossRef] [PubMed]
- Li, N.; Ho, C.P.; Xue, J.; Lim, L.W.; Chen, G.; Fu, Y.H.; Lee, L.Y.T. A Progress Review on Solid-State LiDAR and Nano-photonics-Based LiDAR Sensors. Laser Photonics Rev. 2022, 16, 2100511. [Google Scholar] [CrossRef]
- Wang, J.; Chortos, A. Control Strategies for Soft Robot Systems. Adv. Intell. Syst. 2022, 4, 2100165. [Google Scholar] [CrossRef]
- Li, P.; Liu, X. Common Sensors in Industrial Robots: A Review. J. Phys. Conf. Ser. 2019, 1267, 012036. [Google Scholar] [CrossRef]
- Apneseth, C.; Dzung, D.; Kjesbu, S.; Scheible, G.; Zimmermann, W. Wireless—Introducing Wireless Proximity Switches. Sens. Rev. 2003, 23, 116–122. [Google Scholar] [CrossRef]
- Monkman, G.J.; Hesse, S.; Steinmann, R.; Schunk, H. Robot Grippers, 1st ed.; Wiley: Hoboken, NJ, USA, 2006; ISBN 978-3-527-40619-7. [Google Scholar]
- Pallay, M.; Miles, R.N.; Towfighian, S. A Tunable Electrostatic MEMS Pressure Switch. IEEE Trans. Ind. Electron. 2020, 67, 9833–9840. [Google Scholar] [CrossRef]
- An, Q.; Wang, K.; Li, Z.; Song, C.; Tang, X.; Song, J. Real-Time Monitoring Method of Strawberry Fruit Growth State Based on YOLO Improved Model. IEEE Access 2022, 10, 124363–124372. [Google Scholar] [CrossRef]
- Chen, J.; Wang, Z.; Wu, J.; Hu, Q.; Zhao, C.; Tan, C.; Teng, L.; Luo, T. An Improved Yolov3 Based on Dual Path Network for Cherry Tomatoes Detection. J Food Process Eng. 2021, 44, e13803. [Google Scholar] [CrossRef]
- Gai, R.; Chen, N.; Yuan, H. A Detection Algorithm for Cherry Fruits Based on the Improved YOLO-v4 Model. Neural Comput. Appl. 2023, 35, 13895–13906. [Google Scholar] [CrossRef]
- Yang, W.; Ma, X.; Hu, W.; Tang, P. Lightweight Blueberry Fruit Recognition Based on Multi-Scale and Attention Fusion NCBAM. Agronomy 2022, 12, 2354. [Google Scholar] [CrossRef]
- Fan, Y.; Zhang, S.; Feng, K.; Qian, K.; Wang, Y.; Qin, S. Strawberry Maturity Recognition Algorithm Combining Dark Channel Enhancement and YOLOv5. Sensors 2022, 22, 419. [Google Scholar] [CrossRef] [PubMed]
- Habaragamuwa, H.; Ogawa, Y.; Suzuki, T.; Shiigi, T.; Ono, M.; Kondo, N. Detecting Greenhouse Strawberries (Mature and Immature), Using Deep Convolutional Neural Network. Eng. Agric. Environ. Food 2018, 11, 127–138. [Google Scholar] [CrossRef]
- Liu, G.; Nouaze, J.C.; Touko Mbouembe, P.L.; Kim, J.H. YOLO-Tomato: A Robust Algorithm for Tomato Detection Based on YOLOv3. Sensors 2020, 20, 2145. [Google Scholar] [CrossRef] [PubMed]
- Lawal, M.O. Tomato Detection Based on Modified YOLOv3 Framework. Sci. Rep. 2021, 11, 1447. [Google Scholar] [CrossRef] [PubMed]
- Yu, Y.; Zhang, K.; Liu, H.; Yang, L.; Zhang, D. Real-Time Visual Localization of the Picking Points for a Ridge-Planting Strawberry Harvesting Robot. IEEE Access 2020, 8, 116556–116568. [Google Scholar] [CrossRef]
- Fu, L.; Feng, Y.; Wu, J.; Liu, Z.; Gao, F.; Majeed, Y.; Al-Mallahi, A.; Zhang, Q.; Li, R.; Cui, Y. Fast and Accurate Detection of Kiwifruit in Orchard Using Improved YOLOv3-Tiny Model. Precis. Agric. 2021, 22, 754–776. [Google Scholar] [CrossRef]
- Zabawa, L.; Kicherer, A.; Klingbeil, L.; Milioto, A.; Topfer, R.; Kuhlmann, H.; Roscher, R. Detection of Single Grapevine Berries in Images Using Fully Convolutional Neural Networks. In Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Long Beach, CA, USA, 16–17 June 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 2571–2579. [Google Scholar]
- Rong, Q.; Hu, C.; Hu, X.; Xu, M. Picking Point Recognition for Ripe Tomatoes Using Semantic Segmentation and Morphological Processing. Comput. Electron. Agric. 2023, 210, 107923. [Google Scholar] [CrossRef]
- Tang, C.; Chen, D.; Wang, X.; Ni, X.; Liu, Y.; Liu, Y.; Mao, X.; Wang, S. A Fine Recognition Method of Strawberry Ripeness Combining Mask R-CNN and Region Segmentation. Front. Plant Sci. 2023, 14, 1211830. [Google Scholar] [CrossRef]
- Peng, Y.; Wang, A.; Liu, J.; Faheem, M. A Comparative Study of Semantic Segmentation Models for Identification of Grape with Different Varieties. Agriculture 2021, 11, 997. [Google Scholar] [CrossRef]
- Ilyas, T.; Umraiz, M.; Khan, A.; Kim, H. DAM: Hierarchical Adaptive Feature Selection Using Convolution Encoder Decoder Network for Strawberry Segmentation. Front. Plant Sci. 2021, 12, 591333. [Google Scholar] [CrossRef] [PubMed]
- Roscher, R.; Herzog, K.; Kunkel, A.; Kicherer, A.; Töpfer, R.; Förstner, W. Automated Image Analysis Framework for High-Throughput Determination of Grapevine Berry Sizes Using Conditional Random Fields. Comput. Electron. Agric. 2014, 100, 148–158. [Google Scholar] [CrossRef]
- Milella, A. In-Field High Throughput Grapevine Phenotyping with a Consumer-Grade Depth Camera. Comput. Electron. Agric. 2019, 156, 293–306. [Google Scholar] [CrossRef]
- Wang, C.; Yang, G.; Huang, Y.; Liu, Y.; Zhang, Y. A Transformer-Based Mask R-CNN for Tomato Detection and Segmentation. IFS 2023, 44, 8585–8595. [Google Scholar] [CrossRef]
- Lei, H.; Huang, K.; Jiao, Z.; Tang, Y.; Zhong, Z.; Cai, Y. Bayberry Segmentation in a Complex Environment Based on a Multi-Module Convolutional Neural Network. Appl. Soft Comput. 2022, 119, 108556. [Google Scholar] [CrossRef]
- Zabawa, L.; Kicherer, A.; Klingbeil, L.; Töpfer, R.; Kuhlmann, H.; Roscher, R. Counting of Grapevine Berries in Images via Semantic Segmentation Using Convolutional Neural Networks. ISPRS J. Photogramm. Remote Sens. 2020, 164, 73–83. [Google Scholar] [CrossRef]
- Gonzalez, S.; Arellano, C.; Tapia, J.E. Deepblueberry: Quantification of Blueberries in the Wild Using Instance Segmentation. IEEE Access 2019, 7, 105776–105788. [Google Scholar] [CrossRef]
- Ni, X.; Li, C.; Jiang, H.; Takeda, F. Deep Learning Image Segmentation and Extraction of Blueberry Fruit Traits Associated with Harvestability and Yield. Hortic. Res. 2020, 7, 110. [Google Scholar] [CrossRef] [PubMed]
- Chen, Y.; Li, X.; Jia, M.; Li, J.; Hu, T.; Luo, J. Instance Segmentation and Number Counting of Grape Berry Images Based on Deep Learning. Appl. Sci. 2023, 13, 6751. [Google Scholar] [CrossRef]
- Wang, Y.; Lv, J.; Xu, L.; Gu, Y.; Zou, L.; Ma, Z. A Segmentation Method for Waxberry Image under Orchard Environment. Sci. Hortic. 2020, 266, 109309. [Google Scholar] [CrossRef]
- Luo, L.; Liu, W.; Lu, Q.; Wang, J.; Wen, W.; Yan, D.; Tang, Y. Grape Berry Detection and Size Measurement Based on Edge Image Processing and Geometric Morphology. Machines 2021, 9, 233. [Google Scholar] [CrossRef]
- Cai, C.; Tan, J.; Zhang, P.; Ye, Y.; Zhang, J. Determining Strawberries’ Varying Maturity Levels by Utilizing Image Segmentation Methods of Improved DeepLabV3+. Agronomy 2022, 12, 1875. [Google Scholar] [CrossRef]
- Xu, P.; Fang, N.; Liu, N.; Lin, F.; Yang, S.; Ning, J. Visual Recognition of Cherry Tomatoes in Plant Factory Based on Improved Deep Instance Segmentation. Comput. Electron. Agric. 2022, 197, 106991. [Google Scholar] [CrossRef]
- Pérez-Borrero, I.; Marín-Santos, D.; Gegúndez-Arias, M.E.; Cortés-Ancos, E. A Fast and Accurate Deep Learning Method for Strawberry Instance Segmentation. Comput. Electron. Agric. 2020, 178, 105736. [Google Scholar] [CrossRef]
- Hu, H.; Kaizu, Y.; Zhang, H.; Xu, Y.; Imou, K.; Li, M.; Huang, J.; Dai, S. Recognition and Localization of Strawberries from 3D Binocular Cameras for a Strawberry Picking Robot Using Coupled YOLO/Mask R-CNN. Int. J. Agric. Biol. Eng. 2022, 15, 175–179. [Google Scholar] [CrossRef]
- Afzaal, U.; Bhattarai, B.; Pandeya, Y.R.; Lee, J. An Instance Segmentation Model for Strawberry Diseases Based on Mask R-CNN. Sensors 2021, 21, 6565. [Google Scholar] [CrossRef] [PubMed]
- Mo, Y.; Wu, Y.; Yang, X.; Liu, F.; Liao, Y. Review the State-of-the-Art Technologies of Semantic Segmentation Based on Deep Learning. Neurocomputing 2022, 493, 626–646. [Google Scholar] [CrossRef]
- Gu, W.; Bai, S.; Kong, L. A Review on 2D Instance Segmentation Based on Deep Neural Networks. Image Vis. Comput. 2022, 120, 104401. [Google Scholar] [CrossRef]
- Li, Y.; Chen, Y.; Li, Y. Pre-Charged Pneumatic Soft Gripper with Closed-Loop Control. IEEE Robot. Autom. Lett. 2019, 4, 1402–1408. [Google Scholar] [CrossRef]
- Ruotolo, W.; Brouwer, D.; Cutkosky, M.R. From Grasping to Manipulation with Gecko-Inspired Adhesives on a Multifinger Gripper. Sci. Robot. 2021, 6, eabi9773. [Google Scholar] [CrossRef] [PubMed]
- Visentin, F.; Castellini, F.; Muradore, R. A Soft, Sensorized Gripper for Delicate Harvesting of Small Fruits. Comput. Electron. Agric. 2023, 213, 108202. [Google Scholar] [CrossRef]
- Jin, L.; Wang, Z.; Tian, S.; Feng, J.; An, C.; Xu, H. Grasping Perception and Prediction Model of Kiwifruit Firmness Based on Flexible Sensing Claw. Comput. Electron. Agric. 2023, 215, 108389. [Google Scholar] [CrossRef]
- Lehnert, C.; McCool, C.; Sa, I.; Perez, T. Performance Improvements of a Sweet Pepper Harvesting Robot in Protected Cropping Environments. J. Field Robot. 2020, 37, 1197–1223. [Google Scholar] [CrossRef]
- Min, Y.; Kim, Y.; Jin, H.; Kim, H.J. Intelligent Gripper Systems Using Air Gap-Controlled Bimodal Tactile Sensors for Deformable Object Classification. Adv. Intell. Syst. 2023, 5, 2300317. [Google Scholar] [CrossRef]
- Shih, B.; Christianson, C.; Gillespie, K.; Lee, S.; Mayeda, J.; Huo, Z.; Tolley, M.T. Design Considerations for 3D Printed, Soft, Multimaterial Resistive Sensors for Soft Robotics. Front. Robot. AI 2019, 6, 30. [Google Scholar] [CrossRef]
- Yoder, Z.; Macari, D.; Kleinwaks, G.; Schmidt, I.; Acome, E.; Keplinger, C. A Soft, Fast and Versatile Electrohydraulic Gripper with Capacitive Object Size Detection. Adv. Funct. Mater. 2023, 33, 2209080. [Google Scholar] [CrossRef]
- Hu, Z.; Chu, Z.; Chen, G.; Cui, J. Design of Capacitive Pressure Sensors Integrated with Anisotropic Wedge Microstructure-Based Dielectric Layer. IEEE Sens. J. 2023, 23, 21040–21049. [Google Scholar] [CrossRef]
- Fastier-Wooller, J.W.; Vu, T.-H.; Nguyen, H.; Nguyen, H.-Q.; Rybachuk, M.; Zhu, Y.; Dao, D.V.; Dau, V.T. Multimodal Fibrous Static and Dynamic Tactile Sensor. ACS Appl. Mater. Interfaces 2022, 14, 27317–27327. [Google Scholar] [CrossRef] [PubMed]
- Qiu, Y.; Sun, S.; Wang, X.; Shi, K.; Wang, Z.; Ma, X.; Zhang, W.; Bao, G.; Tian, Y.; Zhang, Z.; et al. Nondestructive Identification of Softness via Bioinspired Multisensory Electronic Skins Integrated on a Robotic Hand. NPJ Flex. Electron. 2022, 6, 45. [Google Scholar] [CrossRef]
- Chen, S.; Pang, Y.; Yuan, H.; Tan, X.; Cao, C. Smart Soft Actuators and Grippers Enabled by Self-Powered Tribo-Skins. Adv. Mater. Technol. 2020, 5, 1901075. [Google Scholar] [CrossRef]
- Li, N.; Yin, Z.; Zhang, W.; Xing, C.; Peng, T.; Meng, B.; Yang, J.; Peng, Z. A Triboelectric-Inductive Hybrid Tactile Sensor for Highly Accurate Object Recognition. Nano Energy 2022, 96, 107063. [Google Scholar] [CrossRef]
- Xu, J.; Xie, Z.; Yue, H.; Lu, Y.; Yang, F. A Triboelectric Multifunctional Sensor Based on the Controlled Buckling Structure for Motion Monitoring and Bionic Tactile of Soft Robots. Nano Energy 2022, 104, 107845. [Google Scholar] [CrossRef]
- Amjadi, M.; Kyung, K.; Park, I.; Sitti, M. Stretchable, Skin-Mountable, and Wearable Strain Sensors and Their Potential Applications: A Review. Adv. Funct. Mater. 2016, 26, 1678–1698. [Google Scholar] [CrossRef]
- Trung, T.Q.; Lee, N. Flexible and Stretchable Physical Sensor Integrated Platforms for Wearable Human-Activity Monitoringand Personal Healthcare. Adv. Mater. 2016, 28, 4338–4372. [Google Scholar] [CrossRef] [PubMed]
- Li, J.; Fang, L.; Sun, B.; Li, X.; Kang, S.H. Review—Recent Progress in Flexible and Stretchable Piezoresistive Sensors and Their Applications. J. Electrochem. Soc. 2020, 167, 037561. [Google Scholar] [CrossRef]
- Hannigan, B.C.; Cuthbert, T.J.; Geng, W.; Tavassolian, M.; Menon, C. Understanding the Impact of Machine Learning Models on the Performance of Different Flexible Strain Sensor Modalities. Front. Mater. 2021, 8, 639823. [Google Scholar] [CrossRef]
- Kim, K.; Kim, J.; Jiang, X.; Kim, T. Static Force Measurement Using Piezoelectric Sensors. J. Sens. 2021, 2021, 1–8. [Google Scholar] [CrossRef]
- Wu, C.; Wang, A.C.; Ding, W.; Guo, H.; Wang, Z.L. Triboelectric Nanogenerator: A Foundation of the Energy for the New Era. Adv. Energy Mater. 2019, 9, 1802906. [Google Scholar] [CrossRef]
- Song, C.; Wang, K.; Wang, C.; Tian, Y.; Wei, X.; Li, C.; An, Q.; Song, J. TDPPL-Net: A Lightweight Real-Time Tomato Detection and Picking Point Localization Model for Harvesting Robots. IEEE Access 2023, 11, 37650–37664. [Google Scholar] [CrossRef]
- Zhu, Y.; Zhang, T.; Liu, L.; Liu, P.; Li, X. Fast Location of Table Grapes Picking Point Based on Infrared Tube. Inventions 2022, 7, 27. [Google Scholar] [CrossRef]
- Mejia, G.; Montes De Oca, A.; Flores, G. Strawberry Localization in a Ridge Planting with an Autonomous Rover. Eng. Appl. Artif. Intell. 2023, 119, 105810. [Google Scholar] [CrossRef]
- Gao, J.; Zhang, F.; Zhang, J.; Yuan, T.; Yin, J.; Guo, H.; Yang, C. Development and Evaluation of a Pneumatic Finger-like End-Effector for Cherry Tomato Harvesting Robot in Greenhouse. Comput. Electron. Agric. 2022, 197, 106879. [Google Scholar] [CrossRef]
- Ren, G.; Wu, H.; Bao, A.; Lin, T.; Ting, K.-C.; Ying, Y. Mobile Robotics Platform for Strawberry Temporal–Spatial Yield Monitoring within Precision Indoor Farming Systems. Front. Plant Sci. 2023, 14, 1162435. [Google Scholar] [CrossRef] [PubMed]
- Rapado-Rincón, D.; Van Henten, E.J.; Kootstra, G. Development and Evaluation of Automated Localisation and Reconstruction of All Fruits on Tomato Plants in a Greenhouse Based on Multi-View Perception and 3D Multi-Object Tracking. Biosyst. Eng. 2023, 231, 78–91. [Google Scholar] [CrossRef]
- Miao, Z.; Yu, X.; Li, N.; Zhang, Z.; He, C.; Li, Z.; Deng, C.; Sun, T. Efficient Tomato Harvesting Robot Based on Image Processing and Deep Learning. Precis. Agric 2023, 24, 254–287. [Google Scholar] [CrossRef]
- Kishore, C.S.; Qamar, U.Z.; Arnold, W.S.; Percival, D.C. Detecting Weed and Bare-Spot in Wild Blueberry Using Ultrasonic Sensor Technology. In Proceedings of the 2009 ASABE Annual International Meeting, Reno, NV, USA, 21–24 June 2009; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2009; p. 096879. [Google Scholar]
- Martínez-Guanter, J.; Garrido-Izard, M.; Valero, C.; Slaughter, D.; Pérez-Ruiz, M. Optical Sensing to Determine Tomato Plant Spacing for Precise Agrochemical Application: Two Scenarios. Sensors 2017, 17, 1096. [Google Scholar] [CrossRef] [PubMed]
- Wang, D.; Dong, Y.; Lian, J.; Gu, D. Adaptive End-effector Pose Control for Tomato Harvesting Robots. J. Field Robot. 2023, 40, 535–551. [Google Scholar] [CrossRef]
- Popa, D.; Udrea, F. Towards Integrated Mid-Infrared Gas Sensors. Sensors 2019, 19, 2076. [Google Scholar] [CrossRef] [PubMed]
- Tang, Y.; Qi, S.; Zhu, L.; Zhuo, X.; Zhang, Y.; Meng, F. Obstacle Avoidance Motion in Mobile Robotics. J. Syst. Simul. 2024, 36, 1. [Google Scholar]
- Dalla Mura, M.; Prasad, S.; Pacifici, F.; Gamba, P.; Chanussot, J.; Benediktsson, J.A. Challenges and Opportunities of Multimodality and Data Fusion in Remote Sensing. Proc. IEEE 2015, 103, 1585–1601. [Google Scholar] [CrossRef]
- Navas, E.; Shamshiri, R.R.; Dworak, V.; Weltzien, C.; Fernández, R. Soft Gripper for Small Fruits Harvesting and Pick and Place Operations. Front. Robot. AI 2024, 10, 1330496. [Google Scholar] [CrossRef]
- Li, Z.; Yuan, X.; Yang, Z. Design, Simulation, and Experiment for the End Effector of a Spherical Fruit Picking Robot. Int. J. Adv. Robot. Syst. 2023, 20, 17298806231213442. [Google Scholar] [CrossRef]
- Mu, L.; Cui, G.; Liu, Y.; Cui, Y.; Fu, L.; Gejima, Y. Design and Simulation of an Integrated End-Effector for Picking Kiwifruit by Robot. Inf. Process. Agric. 2020, 7, 58–71. [Google Scholar] [CrossRef]
- Fu, M.; Guo, S.; Chen, A.; Cheng, R.; Cui, X. Design and Experimentation of Multi-Fruit Envelope-Cutting Kiwifruit Picking Robot. Front. Plant Sci. 2024, 15, 1338050. [Google Scholar] [CrossRef] [PubMed]
- Kumar Uppalapati, N.; Walt, B.; Havens, A.; Mahdian, A.; Chowdhary, G.; Krishnan, G. A Berry Picking Robot with A Hybrid Soft-Rigid Arm: Design and Task Space Control. In Proceedings of the Robotics: Science and Systems XVI, Virtual, 12–16 July 2020; Robotics: Science and Systems Foundation: Delft, The Netherlands, 2020. [Google Scholar]
- Chappell, D.; Bello, F.; Kormushev, P.; Rojas, N. The Hydra Hand: A Mode-Switching Underactuated Gripper with Precision and Power Grasping Modes. IEEE Robot. Autom. Lett. 2023, 8, 7599–7606. [Google Scholar] [CrossRef]
- Zhang, B.; Zhou, J.; Meng, Y.; Zhang, N.; Gu, B.; Yan, Z.; Idris, S.I. Comparative Study of Mechanical Damage Caused by a Two-Finger Tomato Gripper with Different Robotic Grasping Patterns for Harvesting Robots. Biosyst. Eng. 2018, 171, 245–257. [Google Scholar] [CrossRef]
- Zheng, Y.; Pi, J.; Guo, T.; Xu, L.; Liu, J.; Kong, J. Design and Simulation of a Gripper Structure of Cluster Tomato Based on Manual Picking Behavior. Front. Plant Sci. 2022, 13, 974456. [Google Scholar] [CrossRef] [PubMed]
- Gao, J.; Zhang, F.; Zhang, J.; Guo, H.; Gao, J. Picking Patterns Evaluation for Cherry Tomato Robotic Harvesting End-Effector Design. Biosyst. Eng. 2024, 239, 1–12. [Google Scholar] [CrossRef]
- Bac, C.W.; Hemming, J.; Van Tuijl, B.A.J.; Barth, R.; Wais, E.; Van Henten, E.J. Performance Evaluation of a Harvesting Robot for Sweet Pepper. J. Field Robot. 2017, 34, 1123–1139. [Google Scholar] [CrossRef]
- Wu, B.; Jiang, T.; Yu, Z.; Zhou, Q.; Jiao, J.; Jin, M.L. Proximity Sensing Electronic Skin: Principles, Characteristics, and Applications. Adv. Sci. 2024, 11, 2308560. [Google Scholar] [CrossRef] [PubMed]
- Mokayed, H.; Quan, T.Z.; Alkhaled, L.; Sivakumar, V. Real-Time Human Detection and Counting System Using Deep Learning Computer Vision Techniques. AIA 2022, 1, 221–229. [Google Scholar] [CrossRef]
- Junge, K.; Pires, C.; Hughes, J. Lab2Field Transfer of a Robotic Raspberry Harvester Enabled by a Soft Sensorized Physical Twin. Commun. Eng. 2023, 2, 40. [Google Scholar] [CrossRef]
- Jones, T.J.; Jambon-Puillet, E.; Marthelot, J.; Brun, P.-T. Bubble casting soft robotics. Nature 2021, 599, 229–233. [Google Scholar] [CrossRef]
- Acome, E.; Mitchell, S.K.; Morrissey, T.G.; Emmett, M.B.; Benjamin, C.; King, M.; Radakovitz, M.; Keplinger, C. Hydraulically Amplified Self-Healing Electrostatic Actuators with Muscle-like Performance. Science 2018, 359, 61–65. [Google Scholar] [CrossRef] [PubMed]
- Rebahi, Y.; Gharra, M.; Rizzi, L.; Zournatzis, I. Combining Computer Vision, Artificial Intelligence and 3D Printing in Wheelchair Design Customization: The Kyklos 4.0 Approach. AIA 2023. [Google Scholar] [CrossRef]
| Type of Tactile Sensors | Materials | Detection Limit | Sensitivity | Range | Response Time | Stability (Cycles) | Ref. |
|---|---|---|---|---|---|---|---|
| Piezoresistive | Patterned graphene/PDMS | 5 Pa | 1.2 kPa−1 | 0~25 kPa | — | 1000 | [74] |
| | 3D graphene | 1 Pa | 0.152 kPa−1 | 0~27 kPa | 96 ms | 9000 | [75] |
| | Polyaniline/polyvinylidene | — | 53 kPa−1 | 5.2~98.7 kPa | 38 ms | 50,000 | [76] |
| Capacitive | Silver/PDMS | 10 Pa | 0.18 kPa−1 | 0~400 kPa | 100 ms | 10,000 | [77] |
| | Silicon/polystyrene | 0.14 Pa | 44.5 kPa−1 | 0~100 Pa | 9 ms | 5000 | [78] |
| | Adhesive/graphene | 1 mg | 3.19 kPa−1 | 0~4 kPa | 30 ms | 500 | [79] |
| Piezoelectric | PDMS/silver paste/PVDF | — | 7.7 mV kPa−1 | — | 10 ms | 80,000 | [80] |
| | Flexible GaN/ZnO NWs | — | — | — | 180 ms | 4000 | [81] |
| | ZnS:Mn particles | — | 2.2 cps kPa−1 | 0.6~50 MPa | 10 ms | 10,000 | [82] |
| Triboelectric | PDMS/Ag nanofibers | — | — | — | 70 ms | 2800 | [83] |
| | PET/PDMS/Ag electrodes | — | 0.06 kPa−1 | 1 kPa | 70 ms | 10,000 | [84] |
| | Ecoflex and PVA/PEI | — | 0.063 V kPa−1 | 5~50 kPa | — | 2250 | [85] |
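The sensitivity values in the table above are typically quoted as the relative change of the transduced quantity (resistance for piezoresistive, capacitance for capacitive sensors) per unit pressure. Assuming an approximately linear response within the listed range, a raw reading can be converted back to contact pressure as sketched below; the numbers used are placeholders rather than values taken from the cited sensors.

```python
def pressure_from_relative_change(delta_x_over_x0: float,
                                  sensitivity_per_kpa: float,
                                  range_kpa: tuple) -> float:
    """Estimate pressure (kPa) from a relative signal change, assuming a linear response.

    Sensitivity follows the common definition S = (ΔX/X0) / ΔP, where X is the sensor's
    resistance (piezoresistive) or capacitance (capacitive).
    """
    pressure_kpa = delta_x_over_x0 / sensitivity_per_kpa
    lo, hi = range_kpa
    if not (lo <= pressure_kpa <= hi):
        raise ValueError("reading falls outside the sensor's linear range")
    return pressure_kpa

# Placeholder example: a 24% relative change on a sensor with S = 1.2 kPa^-1 and a 0-25 kPa range.
print(pressure_from_relative_change(0.24, 1.2, (0.0, 25.0)))   # -> 0.2 kPa
```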
| Methods | Object | Advantage | Disadvantage | Ref. |
|---|---|---|---|---|
| Object Detection | Strawberry | This method detects mature and immature strawberries in greenhouse images, yielding highly accurate test results. | The ripe category struggles with obscured strawberries, while the immature category faces confusion issues. | [104] |
| | Cherry tomato | This method uses an enhanced YOLOv3 algorithm for cherry tomato detection, achieving a precision of 94.29%. | Identifying heavily shaded fruits was difficult, and ripeness was not integrated into the detection of fruits at various growth stages. | [105] |
| | Cherry | This cherry detection method is based on an improved YOLOv4 model. It accurately detects ripe, semi-ripe, and unripe cherry fruits. | The speed of object detection is relatively slow for cherries. | [106] |
| | Blueberry | A YOLOv5 network model that can detect blueberries improved accuracy by 2.4% over the original YOLOv5 network. | The method has more network parameters, and further research is needed to improve its detection ability. | [107] |
| | Strawberry | YOLOv5 combined with dark channel enhancement improves strawberry fruit-picking accuracy and robustness in complex environments. | This method does not use different image enhancement methods for the lighting conditions of different periods. | [108] |
| | Strawberry | A strawberry growth detection algorithm improves the precision and accuracy of fruit growth state monitoring in complex environments. | Deep learning-based strawberry growth state monitoring is server-dependent and still has limitations. | [109] |
| | Tomato | The YOLO-Tomato models effectively detect tomatoes in complex environmental conditions, and their performance is excellent. | The method is designed to enumerate fruit, and the YOLOv3 model performs poorly in detecting small fruits. | [110] |
| | Tomato | The YOLO-Tomato model effectively detects tomatoes in complex environments, outperforming state-of-the-art methods. | The model is poorly adapted to different conditions and may lose semantic information under heavily occluded conditions. | [111] |
| | Strawberry | The R-YOLO model improves localization precision, increasing strawberry harvesting robots’ harvest rate and real-time performance. | Detection accuracy under different environmental conditions is not discussed, and a comparative analysis with existing methods is lacking. | [112] |
| | Kiwifruit | The improved DY3TNet model accurately detects kiwifruits in orchards with minimal data weight. | The method mentions the effect of flash on kiwifruit image detection, but no statistical tests were performed. | [113] |
| Semantic Segmentation | Grape | This CNN accurately detects grape berries for yield estimation and prediction in viticulture. | It shows notably worse accuracy for the edge class, at 41.7%. | [114] |
| | Tomato | This method accurately segments tomatoes, with intersection and pixel accuracies of 82.5% and 89.79%, respectively. | The method may produce burrs when extracting tomato stems and calyxes, which can affect the identification of the final picking points. | [115] |
| | Strawberry | This method accurately assesses strawberry maturity in challenging fields with 93.7% accuracy. | Many strawberries in the transition maturity stage may cause classification confusion. | [116] |
| | Grape | DeepLabv3+ combined with transfer learning is more suitable for accurately segmenting grape clusters, with better performance. | The method mentions increased accuracy with the HE image enhancement method but lacks details on statistical tests. | [117] |
| | Strawberry | The hierarchical adaptive feature fusion method significantly improves real-time strawberry segmentation compared to existing methods. | This methodology does not provide detailed comparative results or discussion with other related methods. | [118] |
| | Grape | This method proposes a conditional random field-based approach to identify grapes with adaptivity and multi-feature fusion. | The study conducted experiments with artificially selected reference berries, which may introduce a subjective factor into identification. | [119] |
| | Grape | The method was developed to efficiently estimate canopy volume, detect clusters, and count grapevines in the field. | The present method is a more reliable option for sensitivity to light conditions, and the RGB-D sensor’s data quality is better. | [120] |
| | Tomato | A transformer-based Mask R-CNN model accurately identifies tomato varieties and ripening stages. | The method does not explore tomato detection in various occluded regions, raising practical applicability concerns. | [121] |
| | Bayberry | A CNN-based model for bayberry segmentation in complex environments that resists the limitations of light variations and occlusion. | Higher false and missed segmentation rates and poorer segmentation of small objects in fruit clusters. | [122] |
| | Grape | This method can detect and mask single berry objects with a semantic segmentation network, using a class edge to separate single objects. | The training of this method is time and computationally intensive and has limited adaptability to different training systems. | [123] |
| Instance Segmentation | Blueberry | The ResNet50 backbone with Mask R-CNN accurately quantifies wild blueberries using high-definition images. | Many experiments are required to select the best hyper-parameters, and this method has a slower recognition speed. | [124] |
| | Blueberry | This method effectively detects and segments blueberry fruits, extracting traits related to yield and monitoring fruit development. | This method has a slower recognition speed. | [125] |
| | Grape | The enhanced model achieved high detection accuracy and robust generalization across diverse varieties in complex growth environments. | Detection accuracy is limited under varying light conditions and occlusion, while the model’s detection speed requires further improvement. | [126] |
| | Waxberry | The waxberry identification network achieved high precision and demonstrated strong robustness to occlusion. | It did not include immature waxberries for fruit counting and yield estimation, and it does not solve the problems caused by fruit stacking. | [127] |
| | Grape | This method is versatile for grape berry counting and size detection, enabling precise discernment of berry features. | Errors in segmentation may result in the misclassification of non-berry contours as a group. | [128] |
| | Strawberry | The improved DeepLabV3+ model accurately segments strawberries of different maturities, reducing the influence of environmental factors. | The method was not tested on strawberries from different locations, and its recognition speed is slow. | [129] |
| | Cherry tomato | Using bimodal eigenmaps and a balanced multitask loss, this model enhances stem segmentation accuracy in cherry tomato picking. | The present model faces challenges with color and shape similarity, loss of stem features, and category differences. | [130] |
| | Strawberry | A method for strawberry segmentation demonstrated efficiency in a natural system and created a database with images and entries. | As new methods emerge, rigorous performance comparison becomes impractical due to long computational times. | [131] |
| | Strawberry | The method identified and localized strawberries and provided location information. | The model has some drawbacks regarding error detection and speed in the recognition process. | [132] |
| | Strawberry | A Mask R-CNN model detects strawberry diseases, supporting plant disease detection. | The model’s accuracy is low, and the recognition time is long. | [133] |
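Most of the object-detection entries above are YOLO variants. As a point of reference for how such a detector is typically invoked, the sketch below uses the off-the-shelf Ultralytics YOLO interface with a generic pretrained model; the image path and confidence threshold are placeholders, and a berry-specific detector would have to be trained on annotated fruit images as the cited works do.

```python
# Requires the `ultralytics` package (pip install ultralytics); the weights here are generic,
# not the berry-specific models of the cited studies.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")                      # small pretrained detector as a starting point

# Placeholder image path; on a picking robot this frame would come from the camera stream.
results = model("strawberry_row.jpg", conf=0.25)

for result in results:
    for box in result.boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()   # bounding box in pixel coordinates
        label = model.names[int(box.cls)]       # predicted class name
        score = float(box.conf)                 # detection confidence
        print(f"{label}: {score:.2f} at ({x1:.0f}, {y1:.0f})-({x2:.0f}, {y2:.0f})")
```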
Methods | Object | Advantage | Disadvantage | Ref. |
---|---|---|---|---|
Piezoresistive | Tomato | Servo motors control the bending angle and speed of the soft gripper, with tactile sensors for tomato gripping. | The sensor is non-stretchable, which influences the flexibility of the gripper. | [136] |
Piezoresistive | Grape | An anthropomorphic end-effector combines the adhesion principle and a multi-contact design with piezoresistive tactile sensors. | This method has some drawbacks in maximizing the contact area and lacks a direct measurement and adjustment process. | [137]
Piezoresistive | Strawberry | A soft, sensitized gripper is introduced with a robotic system for picking small fruits like strawberries. | The normal force range of the piezoresistive tactile sensors in the gripper jaws is unsuitable for handling fragile berries. | [138]
Piezoresistive | Blackberry | A tendon-driven gripper for automated blackberry harvesting, incorporating a flexible resistive force sensor that provides force feedback. | A complex calibration process and poor immunity to signal interference. | [59]
Piezoresistive | Kiwifruit | This methodology enables detecting and classifying kiwifruit according to fruit hardness, utilizing data from force and bending sensors. | The gripper may damage the fruit during gripping, and the perception function may be limited by the small number of sensors. | [139]
Piezoresistive | Sweet pepper | The end-effector can successfully pick the pepper, and the vacuum pressure sensor provides feedback that the suction cup has grabbed it. | Vacuum pressure sensors are less sensitive at low pressures and are highly influenced by temperature. | [140]
Piezoresistive | Tomato | The method enables the grasping of tomatoes while combining tactile perception information and algorithms to classify size and ripeness. | The method has not been evaluated in practical scenarios, raising concerns about reliability and robustness. | [141]
Piezoresistive | Strawberry | A piezoresistive tactile sensor manufactured by a 3D printer with no additional modifications can be used with a soft robot to grasp strawberries. | The print material's increased resistance lowers sensor sensitivity. Material deformation causes readings to drift and oscillate. | [142]
Capacitive | Tomato | Electrohydraulic bending actuators with tactile sensors for real-time gripping detection and fruit size estimation. | The manual fabrication process and multi-component design represent drawbacks. | [143] |
Capacitive | Strawberry | Capacitive sensors integrated into robotic gripper jaws enable gentle picking of strawberries, providing highly sensitive data output. | Increasing pressure raises compressive stiffness, impairing sensor performance and limiting the linear measurement range. | [144]
Piezoelectric | Tomato | Piezoelectric sensors can measure the end-effector’s initial contact with the tomato, signal oscillations, and vibrations. | This approach does not solve the problems of poor surface contact and adhesion that affect pressure sensors. | [145] |
Piezoelectric | Tomato | The bionic manipulator features adaptive gripping capabilities, using piezoelectric and strain sensors to measure the softness of tomatoes. | Without a sensor module, the robot cannot adjust the contact force, and a single piezoelectric element is less accurate at recognizing softness. | [146]
Triboelectric | Tomato | A three-finger actuator and a triboelectric tactile sensor have been designed to monitor and clamp tomatoes precisely. | Sensor performance is strongly influenced by external factors, and the sensor adapts poorly to changes in the shape of the gripped object. | [147]
Triboelectric | Kiwifruit | A hybrid sensor integrates a triboelectric perception unit and an inductive sensor. When combined with vision, it accurately detects kiwifruit. | The method has some limitations when dealing with complex input signals, including the inability to recognize other features of the object. | [148]
Triboelectric | Tomato | The end-effector fingers are equipped with TENG sensors, featuring a simple structure, high sensitivity, and durability for harvesting tomatoes. | Drawbacks include non-linearity, creep issues, and susceptibility to external light sources and magnetic fields. | [149]
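The tactile entries above all close the same loop: a contact sensor reports force while the gripper closes, and closure stops once the grip is secure but before the damage threshold is reached. Below is a minimal control-loop sketch of that idea; the read_adc, close_step, and open_fully interfaces, the linear calibration gain, and the force thresholds are hypothetical assumptions rather than details taken from any cited system.

```python
# Minimal grip-force control sketch for a soft gripper with a piezoresistive tactile
# sensor read through an ADC. All hardware interfaces here are hypothetical.
import time

FORCE_TARGET_N = 1.5   # assumed secure-grip force for a soft berry
FORCE_LIMIT_N = 2.0    # assumed damage threshold

def adc_to_force(adc_counts: int) -> float:
    """Convert raw ADC counts to newtons using an assumed linear calibration."""
    return 0.004 * adc_counts  # hypothetical calibration gain (N per count)

def grip_until_secure(read_adc, close_step, open_fully, timeout_s: float = 3.0) -> bool:
    """Close the gripper in small steps until the target force is reached.

    read_adc, close_step, and open_fully are hypothetical callables supplied by
    the robot's hardware layer. Returns True if a secure grip was achieved.
    """
    start = time.monotonic()
    while time.monotonic() - start < timeout_s:
        force = adc_to_force(read_adc())
        if force >= FORCE_LIMIT_N:     # abort before the fruit is damaged
            open_fully()
            return False
        if force >= FORCE_TARGET_N:    # grip is secure; stop increasing force
            return True
        close_step()                   # close a little further and re-measure
        time.sleep(0.02)
    open_fully()                       # timed out without reaching target force
    return False
```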
Methods | Object | Advantage | Disadvantage | Ref. |
---|---|---|---|---|
Infrared | Strawberry | The manipulator uses three internal infrared sensors to optimally identify and position the object. | Occlusion may cause the infrared sensors to position the object incorrectly, and unsuccessful cutting may go undetected by the sensor. | [32]
Infrared | Tomato | The camera consists of a pair of stereo infrared sensors. The 3D coordinates of the object tomato in the camera coordinate system were measured. | Environmental, background, lighting, occlusion, and overlap issues can make detecting and locating object tomatoes difficult. | [156]
Infrared | Grape | The infrared sensor detects and synchronizes the position of the grapes for harvesting. | Infrared sensors have limited detection range and narrow transmission and reception angles. | [157]
Infrared | Strawberry | RGB-D cameras can estimate the position of strawberries in three-dimensional space by combining infrared with depth information. | When calculating the strawberry’s position, there is an error due to the offset of the image center from the camera’s mounting position. | [158]
Infrared | Cherry tomato | RGB-D cameras provide high precision and accuracy in locating cherry tomatoes by acquiring multimodal images. | Positioning becomes more complicated when the camera is in front of the fruit. | [159]
LiDAR | Strawberry | The finger is equipped with a pair of LiDAR sensors; when the fruit stem blocks the laser beam, the control module operates to cut off the stem. | LiDAR sensors have errors, and the sensor’s small detection distance makes it unable to adapt to narrow and low-channel environments. | [160] |
LiDAR | Strawberry | Fusion of laser sensors with monocular cameras enables accurate navigation and improves system fault tolerance. | LiDAR sensors do not provide semantic information for scene recovery in complex environments, and accuracy is limited by light changes. | [161]
LiDAR | Tomato | Using LiDAR sensors, accurate representation and positioning of crops such as tomatoes can be achieved under occlusion. | Tracking systems may suffer from errors, switching between objects, or missed and false detections. | [162]
LiDAR | Tomato | Using LiDAR, the mobile platform is provided with environmental awareness and navigation capabilities for autonomous movement. | LiDAR data processing requires conversion and fusion with other data, and LiDAR has difficulty dealing with background interference. | [163]
Ultrasonic | Blueberry | Ultrasonic sensors can detect weeds and determine a plant’s height by measuring the distance between it and the sensor. | Ultrasonic sensors may have some errors in measuring crop height, and the input voltage and the characteristics of the sensor itself limit its output. | [164] |
Multimodal | Tomato | Infrared sensors measure plant position, LiDAR acquires 3D information about the plants, and RGB-D cameras validate LiDAR results. | Occlusion limits the infrared sensor’s field of view and light signal, and lighting and incident light affect camera image quality. | [165] |
Multimodal | Tomato | Using LiDAR and infrared cameras, it is possible to navigate, build maps, detect tomatoes, and locate the center point of a tomato. | Outdoor tomato recognition with LiDAR and the depth camera is affected by light. | [166]
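Many of the infrared and RGB-D entries above localize a fruit by back-projecting its detected pixel and measured depth into the camera coordinate frame. The sketch below shows the standard pinhole-camera computation; the intrinsics and the example pixel/depth values are placeholders, not figures from any cited system.

```python
# Minimal sketch: back-project a detected fruit pixel with measured depth into a
# 3D point in the camera frame using the standard pinhole model. The intrinsics
# (fx, fy, cx, cy) normally come from the RGB-D camera's calibration.
def pixel_to_camera_point(u: float, v: float, depth_m: float,
                          fx: float, fy: float, cx: float, cy: float):
    """Return the (x, y, z) camera-frame coordinates, in meters."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    z = depth_m
    return x, y, z

# Example with assumed intrinsics and a fruit detected at pixel (412, 236), 0.48 m away.
print(pixel_to_camera_point(412, 236, 0.48, fx=615.0, fy=615.0, cx=320.0, cy=240.0))
```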
Methods | Object | Advantage | Disadvantage | Ref. |
---|---|---|---|---|
Photoelectric | Kiwifruit | The position data from the switch sensor is transmitted to the stepper motor, which drives the gripping mechanism to clamp the fruit. | A discrepancy may arise between the finger and the kiwifruit because the switch’s mounting position restricts its light path. | [169]
Photoelectric | Grape | The photoelectric switching sensor, integrated into the gripper, provides feedback that enables precise gripping of the fruit. | Errors in the photoelectric switching sensor may result from occlusion and stacking of the fruit. | [170]
Proximity | Kiwifruit | The robot equipped with a Hall sensor can discern alterations in the magnetic field and thereby determine the position of the kiwifruit. | Hall sensor detection may not be sufficiently sensitive because the kiwifruit itself is non-magnetic. | [171]
Proximity | Kiwifruit | Hall sensors discern alterations in magnetic fields and regulate the direction and number of steps stepper motors undertake. | When catching fruit, kiwifruit-picking robots tend to inhale surrounding debris, such as branches and leaves. | [172]
Proximity | Tomato | Magnetic proximity switches are used for end-effector control, enabling the gripper’s tomato-picking position to be controlled. | Proximity switches are difficult to apply in cluttered, messy, and fragile plant environments. | [173]
Pressure | Grape | The fingers flexibly clamp the grapes by feeding the contact-force information back to the actuator. | Due to the hydraulic brake’s low control accuracy, the fruit may be damaged during the clamping process. | [174]
Pressure | Tomato | The gripper, driven by a servo motor, measures and precisely controls the end-effector’s grip on the tomato using a resistance strain gauge. | Temperature variations and electromagnetic interference can affect resistance strain gauges, leading to errors in the results. | [175]
Pressure | Tomato | The gripper has a membrane pressure switch that stops the finger from closing when the pressure reaches a minimum destructive value. | The pressure switch’s lack of flexibility can easily damage the tomato skin. | [176]
Pressure | Cherry tomato | The pressure switch controls the picking force of the end-effector based on a prior measurement of the force applied by a human finger. | The pressure switch is not mounted on the end-effector, and individual differences between tomatoes may lead to fruit damage. | [177]
Pressure | Sweet pepper | Vacuum switching sensors detect whether the sweet pepper has been successfully gripped, and harvesting operations continue after successful gripping. | Vacuum switch sensors are more costly and have longer response times. | [178]
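The switching-sensor schemes above share a simple pattern: the actuator advances until a binary switch (photoelectric, proximity, pressure, or vacuum) trips, and the grip is then held or the next harvesting step triggered. A minimal sketch of that logic follows; read_switch, step_motor_close, and stop_motor are hypothetical hardware interfaces, and the travel limit is an assumed value.

```python
# Minimal sketch of switch-based grip confirmation: step the gripper closed until the
# binary switch trips, then hold. All hardware interfaces here are hypothetical.
MAX_CLOSE_STEPS = 200  # assumed mechanical travel limit of the fingers

def close_until_switch(read_switch, step_motor_close, stop_motor) -> bool:
    """Advance the gripper stepper until the switch trips; return True if grip confirmed."""
    for _ in range(MAX_CLOSE_STEPS):
        if read_switch():        # switch tripped: contact/pressure threshold reached
            stop_motor()
            return True          # grip confirmed; harvesting sequence can continue
        step_motor_close()       # one more step toward the fruit
    stop_motor()
    return False                 # no contact detected within the travel limit
```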