The Expanding Role of Artificial Intelligence in Collaborative Robots for Industrial Applications: A Systematic Review of Recent Works
Abstract
1. Introduction
- Coexistence: there is no common workspace, yet humans and robots coexist side by side without a physical barrier.
- Sequential collaboration: human and robot activity occurs in a shared workspace, but their movements are sequential; they do not work on a component simultaneously.
- Cooperation: a robot and a person work simultaneously on the same component while both are in motion.
- Responsive collaboration: the robot responds in real time to the human worker’s movements.
- What have researchers found in the literature on the expanding role of AI on cobots?
- Will the implementation of AI on cobots be able to reduce previous concerns about industrial applications and contribute to better performance?
- to research the key distinctions between robots and cobots
- to research the common characteristics and capacities of robots
- to discuss the various levels of industrial situations including robots, the role of AI, and collaboration
2. Methodology
3. Findings
3.1. Cobots
- How cobots acquire knowledge and abilities with little or no task-specific coding (a minimal sketch follows this list)
- How cobots replicate user perception and motor control to carry out a physical task and reach objectives
- How cobots with enhanced mobility complete a difficult task in a wide-open area
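To ground the first point, the sketch below shows one way a motion policy can be learned from demonstrated state-action pairs rather than hand-coded. It is a minimal illustration assuming synthetic demonstrations of a proportional reaching behavior and the scikit-learn library; it is not a method taken from the reviewed works.

```python
# Behavioral-cloning sketch: a policy is fitted to demonstrated
# state-action pairs instead of being explicitly programmed.
# The demonstrations are synthetic (a proportional reaching rule).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# State = (current joint angle, target angle); action = joint velocity.
states = rng.uniform(-np.pi, np.pi, size=(2000, 2))
actions = 0.5 * (states[:, 1] - states[:, 0])  # demonstrated velocities

policy = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
policy.fit(states, actions)

# At run time, the fitted policy generalizes to unseen targets.
print(policy.predict([[0.1, 1.2]]))  # velocity command toward the target
```

Nothing here is specific to one task: changing the demonstrations changes the learned skill, which is the sense in which such cobots need little or no task-specific coding.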
3.1.1. Difference between the Robot and Cobot
3.1.2. Advantages of Cobots
3.1.3. Disadvantages of Cobots
3.2. Artificial Intelligence
3.3. Analysis
3.3.1. Non-Collaborative Workspace-Type Robots
3.3.2. Collaborative Workspace-Type Robots
3.3.3. Industrial Robots Employing Machine Learning
3.3.4. Cobot-Related Works without AI
3.3.5. Vision Systems in Cobots
4. Discussion
5. Recommendations and Future Directions
5.1. Advanced Autonomous Algorithms
5.2. Safety Devices
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Hentout, A.; Aouache, M.; Maoudj, A.; Akli, I. Human–robot interaction in industrial collaborative robotics: A literature review of the decade 2008–2017. Adv. Robot. 2019, 33, 764–799.
- “Demystifying Collaborative Industrial Robots”, International Federation of Robotics, Frankfurt, Germany. Available online: https://web.archive.org/web/20190823143255/https://ifr.org/downloads/papers/IFR_Demystifying_Collaborative_Robots.pdf (accessed on 6 October 2022).
- Li, Y.; Sena, A.; Wang, Z.; Xing, X.; Babič, J.; van Asseldonk, E.; Burdet, E. A review on interaction control for contact robots through intent detection. Prog. Biomed. Eng. 2022, 4, 032004.
- Mukherjee, D.; Gupta, K.; Chang, L.H.; Najjaran, H. A Survey of Robot Learning Strategies for Human-Robot Collaboration in Industrial Settings. Robot. Comput. Integr. Manuf. 2021, 73, 102231.
- Service Robots. Available online: https://ifr.org/service-robots (accessed on 7 October 2022).
- Gualtieri, L.; Rauch, E.; Vidoni, R. Emerging research fields in safety and ergonomics in industrial collaborative robotics: A systematic literature review. Robot. Comput. Integr. Manuf. 2021, 67, 101998.
- Mobile Robots Improve Patient Care, Employee Satisfaction, Safety, Productivity and More. Available online: https://aethon.com/mobile-robots-for-healthcare/ (accessed on 9 October 2022).
- Maadi, M.; Khorshidi, H.A.; Aickelin, U. A Review on Human–AI Interaction in Machine Learning and Insights for Medical Applications. Int. J. Environ. Res. Public Health 2021, 18, 2121.
- Matheson, E.; Minto, R.; Zampieri, E.G.G.; Faccio, M.; Rosati, G. Human-robot collaboration in manufacturing applications: A review. Robotics 2019, 8, 100.
- Faccio, M.; Granata, I.; Menini, A.; Milanese, M.; Rossato, C.; Bottin, M.; Minto, R.; Pluchino, P.; Gamberini, L.; Boschetti, G.; et al. Human factors in cobot era: A review of modern production systems features. J. Intell. Manuf. 2022, 34, 85–106.
- Castillo, J.F.; Hamilton Ortiz, H.; Díaz Velásquez, F.M.; Saavedra, F.D. COBOTS in Industry 4.0: Safe and Efficient Interaction. In Collaborative and Humanoid Robots; IntechOpen: London, UK, 2021; p. 13.
- Collaborative Robots Market Size, Share & Trends Analysis Report by Payload Capacity, by Application (Assembly, Handling, Packaging, Quality Testing), by Vertical, by Region, and Segment Forecasts, 2022–2030. Available online: https://www.grandviewresearch.com/industry-analysis/collaborative-robots-market (accessed on 19 November 2022).
- Semeraro, F.; Griffiths, A.; Cangelosi, A. Human–robot collaboration and machine learning: A systematic review of recent research. Robot. Comput. Integr. Manuf. 2022, 79, 102432.
- Proia, S.; Carli, R.; Cavone, G.; Dotoli, M. A Literature Review on Control Techniques for Collaborative Robotics in Industrial Applications. IEEE Int. Conf. Autom. Sci. Eng. 2021, 2021, 591–596.
- Lins, R.G.; Givigi, S.N. Cooperative Robotics and Machine Learning for Smart Manufacturing: Platform Design and Trends within the Context of Industrial Internet of Things. IEEE Access 2021, 9, 95444–95455.
- Spezialetti, M.; Placidi, G.; Rossi, S. Emotion Recognition for Human-Robot Interaction: Recent Advances and Future Perspectives. Front. Robot. AI 2020, 7, 1–11.
- Vicentini, F. Collaborative Robotics: A Survey. J. Mech. Des. Trans. ASME 2020, 143, 1–20.
- Robots and Humans Can Work Together with New ISO Guidance. 2016. Available online: https://www.iso.org/news/2016/03/Ref2057.html (accessed on 5 October 2022).
- Robots and Robotic Devices—Safety Requirements for Industrial Robots—Part 2: Robot Systems and Integration. 2011. Available online: https://www.iso.org/obp/ui/#iso:std:iso:10218:-2:ed-1:v1:en (accessed on 5 October 2022).
- Pierson, H.A.; Gashler, M.S. Deep learning in robotics: A review of recent research. Adv. Robot. 2017, 31, 821–835.
- Mobile Cobots. Available online: https://www.dimalog.com/mobile-cobots/ (accessed on 9 November 2022).
- Cohen, Y.; Shoval, S.; Faccio, M.; Minto, R. Deploying cobots in collaborative systems: Major considerations and productivity analysis. Int. J. Prod. Res. 2021, 60, 1815–1831.
- Bi, Z.M.; Luo, M.; Miao, Z.; Zhang, B.; Zhang, W.J.; Wang, L. Safety assurance mechanisms of collaborative robotic systems in manufacturing. Robot. Comput. Integr. Manuf. 2020, 67, 102022.
- Kashish, G.; Homayoun, N. Curriculum-Based Deep Reinforcement Learning for Adaptive Robotics: A Mini-Review. Int. J. Robot. Eng. 2021, 6, 31.
- Knudsen, M.; Kaivo-Oja, J. Collaborative Robots: Frontiers of Current Literature. J. Intell. Syst. Theory Appl. 2020, 3, 13–20.
- Eyam, T. Emotion-Driven Human-Cobot Interaction Based on EEG in Industrial Applications. Master’s Thesis, Tampere University, Tampere, Finland, 2019.
- Nurhanim, K.; Elamvazuthi, I.; Izhar, L.I.; Capi, G.; Su, S. EMG Signals Classification on Human Activity Recognition using Machine Learning Algorithm. In Proceedings of the 2021 8th NAFOSTED Conference on Information and Computer Science (NICS), Hanoi, Vietnam, 21–22 December 2021; pp. 369–373.
- Reddy, K.V.V.; Elamvazuthi, I.; Aziz, A.A.; Paramasivam, S.; Chua, H.N.; Pranavanand, S. Heart Disease Risk Prediction Using Machine Learning Classifiers with Attribute Evaluators. Appl. Sci. 2021, 11, 8352.
- Ganesan, T.; Elamvazuthi, I.; Vasant, P. Solving Engineering Optimization Problems with the Karush-Kuhn-Tucker Hopfield Neural Networks. Int. Rev. Mech. Eng. 2011, 5, 1333–1339.
- Ali, Z.; Elamvazuthi, I.; Alsulaiman, M.; Muhammad, G. Detection of Voice Pathology using Fractal Dimension in a Multiresolution Analysis of Normal and Disordered Speech Signals. J. Med. Syst. 2016, 40, 20.
- Sathyan, A.; Cohen, K.; Ma, O. Comparison Between Genetic Fuzzy Methodology and Q-Learning for Collaborative Control Design. Int. J. Artif. Intell. Appl. 2019, 10, 1–15.
- Vasant, P.; Ganesan, T.; Elamvazuthi, I.; Webb, J.F. Interactive fuzzy programming for the production planning: The case of textile firm. Int. Rev. Model. Simul. 2011, 4, 961–970.
- Reddy, K.V.V.; Elamvazuthi, I.; Aziz, A.A.; Paramasivam, S.; Chua, H.N.; Pranavanand, S. Prediction of Heart Disease Risk Using Machine Learning with Correlation-based Feature Selection and Optimization Techniques. In Proceedings of the 2021 7th International Conference on Signal Processing and Communication (ICSC), Noida, India, 25–27 November 2021; pp. 228–233.
- Sharon, H.; Elamvazuthi, I.; Lu, C.; Parasuraman, S.; Natarajan, E. Development of Rheumatoid Arthritis Classification from Electronic Image Sensor Using Ensemble Method. Sensors 2019, 20, 167.
- Roveda, L.; Maskani, J.; Franceschi, P.; Abdi, A.; Braghin, F.; Tosatti, L.M.; Pedrocchi, N. Model-Based Reinforcement Learning Variable Impedance Control for Human-Robot Collaboration. J. Intell. Robot. Syst. Theory Appl. 2020, 100, 417–433.
- Ali, Z.; Alsulaiman, M.; Elamvazuthi, I.; Muhammad, G.; Mesallam, T.A.; Farahat, M.; Malki, K.H. Voice pathology detection based on the modified voice contour and SVM. Biol. Inspired Cogn. Archit. 2016, 15, 10–18.
- Ananias, E.; Gaspar, P.D. A Low-Cost Collaborative Robot for Science and Education Purposes to Foster the Industry 4.0 Implementation. Appl. Syst. Innov. 2022, 5, 72.
- Gupta, R.; Elamvazuthi, I.; Dass, S.C.; Faye, I.; Vasant, P.; George, J.; Izza, F. Curvelet based automatic segmentation of supraspinatus tendon from ultrasound image: A focused assistive diagnostic method. Biomed. Eng. Online 2014, 13, 157.
- Reddy, K.V.V.; Elamvazuthi, I.; Aziz, A.A.; Paramasivam, S.; Chua, H.N. Heart Disease Risk Prediction using Machine Learning with Principal Component Analysis. In Proceedings of the 2020 8th International Conference on Intelligent and Advanced Systems (ICIAS), Kuching, Malaysia, 13–15 July 2021; pp. 1–6.
- Rahim, K.N.K.A.; Elamvazuthi, I.; Izhar, L.I.; Capi, G. Classification of human daily activities using ensemble methods based on smartphone inertial sensors. Sensors 2018, 18, 4132.
- Ali, Z.; Alsulaiman, M.; Muhammad, G.; Elamvazuthi, I.; Mesallam, T.A. Vocal fold disorder detection based on continuous speech by using MFCC and GMM. In Proceedings of the 2013 7th IEEE GCC Conference and Exhibition (GCC), Doha, Qatar, 17–20 November 2013; pp. 292–297.
- Kolbinger, F.R.; Leger, S.; Carstens, M.; Rinner, F.M.; Krell, S.; Chernykh, A.; Nielen, T.P.; Bodenstedt, S.; Welsch, T.; Kirchberg, J.; et al. Artificial Intelligence for context-aware surgical guidance in complex robot-assisted oncological procedures: An exploratory feasibility study. medRxiv 2022.
- Reddy, K.V.V.; Elamvazuthi, I.; Aziz, A.A.; Paramasivam, S.; Chua, H.N.; Pranavanand, S. Rotation Forest Ensemble Classifier to Improve the Cardiovascular Disease Risk Prediction Accuracy. In Proceedings of the 2021 8th NAFOSTED Conference on Information and Computer Science (NICS), Hanoi, Vietnam, 21–22 December 2021; pp. 404–409.
- Galin, R.; Meshcheryakov, R. Collaborative robots: Development of robotic perception system, safety issues, and integration of AI to imitate human behavior. Smart Innov. Syst. Technol. 2021, 187, 175–185.
- Ibarz, J.; Tan, J.; Finn, C.; Kalakrishnan, M.; Pastor, P.; Levine, S. How to train your robot with deep reinforcement learning: Lessons we have learned. Int. J. Rob. Res. 2021, 40, 698–721.
- Kragic, D.; Gustafson, J.; Karaoguz, H.; Jensfelt, P.; Krug, R. Interactive, collaborative robots: Challenges and opportunities. IJCAI Int. Jt. Conf. Artif. Intell. 2018, 2018, 18–25.
- Bagheri, E.; De Winter, J.; Vanderborght, B. Transparent Interaction Based Learning for Human-Robot Collaboration. Front. Robot. AI 2022, 9, 1–9.
- Amarillo, A.; Sanchez, E.; Caceres, J.; Oñativia, J. Collaborative Human–Robot Interaction Interface: Development for a Spinal Surgery Robotic Assistant. Int. J. Soc. Robot. 2021, 13, 1473–1484.
- Nicora, M.L.; Andre, E.; Berkmans, D.; Carissoli, C.; D’Orazio, T.; Fave, A.D.; Gebhard, P.; Marani, R.; Mira, R.M.; Negri, L.; et al. A human-driven control architecture for promoting good mental health in collaborative robot scenarios. In Proceedings of the 2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), Vancouver, BC, Canada, 8–12 August 2021; pp. 285–291.
- Oliff, H.; Liu, Y.; Kumar, M.; Williams, M.; Ryan, M. Reinforcement learning for facilitating human-robot-interaction in manufacturing. J. Manuf. Syst. 2020, 56, 326–340.
- Story, M.; Webb, P.; Fletcher, S.R.; Tang, G.; Jaksic, C.; Carberry, J. Do Speed and Proximity Affect Human-Robot Collaboration with an Industrial Robot Arm? Int. J. Soc. Robot. 2022, 14, 1087–1102.
- Zhang, R.; Lv, Q.; Li, J.; Bao, J.; Liu, T.; Liu, S. A reinforcement learning method for human-robot collaboration in assembly tasks. Robot. Comput. Integr. Manuf. 2021, 73, 102227.
- Silva, G.; Rekik, K.; Kanso, A.; Schnitman, L. Multi-perspective human robot interaction through an augmented video interface supported by deep learning. In Proceedings of the 2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Napoli, Italy, 29 August–2 September 2022; pp. 1168–1173.
- Buerkle, A.; Eaton, W.; Lohse, N.; Bamber, T.; Ferreira, P. EEG based arm movement intention recognition towards enhanced safety in symbiotic Human-Robot Collaboration. Robot. Comput. Integr. Manuf. 2021, 70, 102137.
- De Winter, J.; De Beir, A.; El Makrini, I.; Van de Perre, G.; Nowé, A.; Vanderborght, B. Accelerating interactive reinforcement learning by human advice for an assembly task by a cobot. Robotics 2019, 8, 104.
- Ghadirzadeh, A.; Chen, X.; Yin, W.; Yi, Z.; Bjorkman, M.; Kragic, D. Human-Centered Collaborative Robots with Deep Reinforcement Learning. IEEE Robot. Autom. Lett. 2020, 6, 566–571.
- Akkaladevi, S.C.; Plasch, M.; Pichler, A.; Ikeda, M. Towards reinforcement based learning of an assembly process for human robot collaboration. Procedia Manuf. 2019, 38, 1491–1498.
- Heo, Y.J.; Kim, D.; Lee, W.; Kim, H.; Park, J.; Chung, W.K. Collision detection for industrial collaborative robots: A deep learning approach. IEEE Robot. Autom. Lett. 2019, 4, 740–746.
- Gomes, N.M.; Martins, F.N.; Lima, J.; Wörtche, H. Reinforcement Learning for Collaborative Robots Pick-and-Place Applications: A Case Study. Automation 2022, 3, 223–241.
- Chen, X.; Wang, N.; Cheng, H.; Yang, C. Neural Learning Enhanced Variable Admittance Control for Human-Robot Collaboration. IEEE Access 2020, 8, 25727–25737.
- Qureshi, H.; Nakamura, Y.; Yoshikawa, Y.; Ishiguro, H. Intrinsically motivated reinforcement learning for human–robot interaction in the real-world. Neural Netw. 2018, 107, 23–33.
- Wang, P.; Liu, H.; Wang, L.; Gao, R.X. Deep learning-based human motion recognition for predictive context-aware human-robot collaboration. CIRP Ann. 2018, 67, 17–20.
- Lv, Q.; Zhang, R.; Liu, T.; Zheng, P.; Jiang, Y.; Li, J.; Bao, J.; Xiao, L. A strategy transfer approach for intelligent human-robot collaborative assembly. Comput. Ind. Eng. 2022, 168, 108047.
- Weiss, A.; Wortmeier, A.K.; Kubicek, B. Cobots in Industry 4.0: A Roadmap for Future Practice Studies on Human-Robot Collaboration. IEEE Trans. Hum.-Mach. Syst. 2021, 51, 335–345.
- Sasagawa, A.; Fujimoto, K.; Sakaino, S.; Tsuji, T. Imitation learning based on bilateral control for human-robot cooperation. IEEE Robot. Autom. Lett. 2020, 5, 6169–6176.
- Lu, W.; Hu, Z.; Pan, J. Human-Robot Collaboration using Variable Admittance Control and Human Intention Prediction. IEEE Int. Conf. Autom. Sci. Eng. 2020, 2020, 1116–1121.
- Karn, A.L.; Sengan, S.; Kotecha, K.; Pustokhina, I.V.; Pustokhin, D.A.; Subramaniyaswamy, V.; Buddhi, D. ICACIA: An Intelligent Context-Aware framework for COBOT in defense industry using ontological and deep learning models. Robot. Auton. Syst. 2022, 157, 104234.
- De Miguel Lazaro, O.; Mohammed, W.M.; Ferrer, B.R.; Bejarano, R.; Lastra, J.L.M. An approach for adapting a cobot workstation to human operator within a deep learning camera. IEEE Int. Conf. Ind. Inform. 2019, 2019, 789–794.
- Mohammed, A.; Wang, L. Advanced human-robot collaborative assembly using electroencephalogram signals of human brains. Procedia CIRP 2020, 93, 1200–1205.
- Wang, J.; Pradhan, M.R.; Gunasekaran, N. Machine learning-based human-robot interaction in ITS. Inf. Process. Manag. 2021, 59, 102750.
- Aliev, K.; Antonelli, D. Proposal of a monitoring system for collaborative robots to predict outages and to assess reliability factors exploiting machine learning. Appl. Sci. 2021, 11, 1621.
- Malik, A.; Brem, A. Digital twins for collaborative robots: A case study in human-robot interaction. Robot. Comput. Integr. Manuf. 2020, 68, 102092.
- Walker, T.; Ahlin, K.J.; Joffe, B.P. Robotic Rehang with Machine Vision. In Proceedings of the 2021 ASABE Annual International Virtual Meeting, Virtual, 12–16 July 2021.
- Edmonds, M.; Gao, F.; Liu, H.; Xie, X.; Qi, S.; Rothrock, B.; Zhu, Y.; Wu, Y.N.; Lu, H.; Zhu, S.-C. A tale of two explanations: Enhancing human trust by explaining robot behavior. Sci. Robot. 2019, 37, 1–14.
- Grigore, L.S.; Priescu, I.; Joita, D.; Oncioiu, I. The Integration of Collaborative Robot Systems and Their Environmental Impacts. Processes 2020, 8, 494.
- Bader, K.B.; Hendley, S.A.; Bollen, V. Assessment of Collaborative Robot (Cobot)-Assisted Histotripsy for Venous Clot Ablation. IEEE Trans. Biomed. Eng. 2020, 68, 1220–1228.
- Eyam, A.; Mohammed, W.M.; Lastra, J.L.M. Emotion-driven analysis and control of human-robot interactions in collaborative applications. Sensors 2021, 21, 4626.
- Yang, C.; Wu, H.; Li, Z.; He, W.; Wang, N.; Su, C.-Y. Mind Control of a Robotic Arm With Visual Fusion Technology. IEEE Trans. Ind. Inform. 2018, 14, 3822–3830.
- Cunha, A.; Ferreira, F.; Sousa, E.; Louro, L.; Vicente, P.; Monteiro, S.; Erlhagen, W.; Bicho, E. Towards collaborative robots as intelligent co-workers in human-robot joint tasks: What to do and who does it? In Proceedings of the 52nd International Symposium on Robotics, Online, 9–10 December 2020; pp. 141–148.
- Ivorra, E.; Ortega, M.; Alcaniz, M.; Garcia-Aracil, N. Multimodal Computer Vision Framework for Human Assistive Robotics. In Proceedings of the 2018 Workshop on Metrology for Industry 4.0 and IoT, Brescia, Italy, 16–18 April 2018; pp. 18–22.
- Pagani, R.; Nuzzi, C.; Ghidelli, M.; Borboni, A.; Lancini, M.; Legnani, G. Cobot user frame calibration: Evaluation and comparison between positioning repeatability performances achieved by traditional and vision-based methods. Robotics 2021, 10, 45.
- Martínez-Franco, J.C.; Álvarez-Martínez, D. Machine Vision for Collaborative Robotics Using Synthetic Data-Driven Learning. In Service Oriented, Holonic and Multi-Agent Manufacturing Systems for Industry of the Future; Springer: Cham, Switzerland, 2021; pp. 69–81.
- How Vision Cobots Are Reshaping Factory Automation. Available online: https://www.lanner-america.com/blog/vision-cobots-reshaping-factory-automation/ (accessed on 24 November 2022).
- Xu, L.; Li, G.; Song, P.; Shao, W. Vision-Based Intelligent Perceiving and Planning System of a 7-DoF Collaborative Robot. Comput. Intell. Neurosci. 2021, 2021, 5810371.
- Jia, F.; Jebelli, A.; Ma, Y.; Ahmad, R. An Intelligent Manufacturing Approach Based on a Novel Deep Learning Method for Automatic Machine and Working Status Recognition. Appl. Sci. 2022, 12, 5697.
- Xiong, R.; Zhang, S.; Gan, Z.; Qi, Z.; Liu, M.; Xu, X.; Wang, Q.; Zhang, J.; Li, F.; Chen, X. A novel 3D-vision–based collaborative robot as a scope holding system for port surgery: A technical feasibility study. Neurosurg. Focus 2022, 52, E13.
- Comari, S.; Di Leva, R.; Carricato, M.; Badini, S.; Carapia, A.; Collepalumbo, G.; Gentili, A.; Mazzotti, C.; Staglianò, K.; Rea, D. Mobile cobots for autonomous raw-material feeding of automatic packaging machines. J. Manuf. Syst. 2022, 64, 211–224.
- Zhou, Z.; Li, L.; Fürsterling, A.; Durocher, H.J.; Mouridsen, J.; Zhang, X. Learning-based object detection and localization for a mobile robot manipulator in SME production. Robot. Comput. Integr. Manuf. 2021, 73, 102229.
- Zaki, M.A.; Fathy, A.M.M.; Carnevale, M.; Giberti, H. Application of Realtime Robotics platform to execute unstructured industrial tasks involving industrial robots, cobots, and human operators. Procedia Comput. Sci. 2022, 200, 1359–1367.
- Židek, K.; Piteľ, J.; Balog, M.; Hošovský, A.; Hladký, V.; Lazorík, P.; Iakovets, A.; Demčák, J. CNN training using 3D virtual models for assisted assembly with mixed reality and collaborative robots. Appl. Sci. 2021, 11, 4269.
- Olesen, S.; Gergaly, B.B.; Ryberg, E.A.; Thomsen, M.R.; Chrysostomou, D. A collaborative robot cell for random bin-picking based on deep learning policies and a multi-gripper switching strategy. Procedia Manuf. 2020, 51, 3–10.
- Amin, M.; Rezayati, M.; Van De Venn, H.W. A Mixed-Perception Approach for Safe Human–Robot Collaboration in Industrial Automation. Sensors 2020, 20, 6347.
- Bejarano, R.; Ferrer, B.R.; Mohammed, W.M.; Lastra, J.L.M. Implementing a human-robot collaborative assembly workstation. IEEE Int. Conf. Ind. Inform. 2019, 2019, 557–564.
Characteristics | Conventional Robots | Cobots |
---|---|---|
Role | Substituting human employee | Aiding human employee |
Human collaboration | Coding used to specify motions, positions, and grips | Recognizes gestures and voice commands and predicts operator movements |
Workstation | Robot and operator workstations are typically fenced | A shared workstation without a fence |
Reprogramming | Rarely required | Frequently required |
Movement speed | Fast | Slow |
Handling payloads | Capable of carrying large payloads | Cannot handle large payloads |
Capability to work in a dynamic environment with moving objects | Restricted | Yes |
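The speed and payload rows reflect power-and-force limiting: before any contact, a cobot's speed must be bounded by the energy a human body region can safely absorb. The sketch below illustrates that relation in the spirit of ISO/TS 15066; the mass and energy values are illustrative placeholders, not normative limits.

```python
# Illustrative power-and-force-limiting calculation: the permissible
# relative speed before contact follows from the allowed transfer
# energy E and the two-body reduced mass. Numeric values are placeholders.
import math

def reduced_mass(m_human_region: float, m_robot_effective: float) -> float:
    """Two-body reduced mass mu = (1/mH + 1/mR)^-1, in kg."""
    return 1.0 / (1.0 / m_human_region + 1.0 / m_robot_effective)

def max_relative_speed(energy_limit_j: float, mu: float) -> float:
    """From E = 0.5 * mu * v^2: permissible speed v = sqrt(2E/mu), in m/s."""
    return math.sqrt(2.0 * energy_limit_j / mu)

mu = reduced_mass(m_human_region=0.6, m_robot_effective=15.0)
print(f"v_max = {max_relative_speed(energy_limit_j=0.49, mu=mu):.2f} m/s")
```

Because permissible speeds computed this way are low, cobots trade speed and payload for the ability to share an unfenced workstation, exactly as the table summarizes.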
No. | Author(s) and Year | Robot Type | Sensing/Simulation Tool | Task Type | Technique | Remarks |
---|---|---|---|---|---|---|
1. | Bagheri et al. [47], 2022 | Franka robotic arm | T-GUI | Assemble toys | Interactive reinforcement learning | The experiment was carried out online, and the cobot’s potential behaviors across all circumstances were recorded. The human was not permitted to assist during the learning process, which relied on the cobot’s own answers. |
2. | Amarillo et al. [48], 2021 | Stäubli TX40 | Optoforce FT sensor | Spinal surgery | Control algorithms | Safety issues were not considered; instead, the robot’s joint accelerations and velocities were restricted to limited ranges. |
3. | Nicora et al. [49], 2021 | Virtual robot | Azure Kinect cameras | Predicting mental health conditions | Machine learning | The experiments were carried out in simulation. No real robot was utilized to perform the collaborative tasks. |
4. | Oliff et al. [50], 2020 | - | Deeplearning4j (DL4J) | Pick and place, move, scrap, and manipulate products | Deep Q-learning networks (DQN) | The model that determines the robot’s behavior was validated through simulation only. Safety issues regarding HRI were not addressed. |
5. | Story et al. [51], 2022 | UR5 | Microsoft Kinect v2 vision | Assembly task | Linear mixed-effects model | The research found correlations between two important robot characteristics, speed and proximity, and psychological measures originally created for manufacturing applications with higher levels of automation rather than for collaborative work. |
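Several rows above rest on Q-learning variants, such as the deep Q-network of Oliff et al. [50]. The toy sketch below uses tabular Q-learning, a deliberately simpler stand-in for a DQN, on an invented pick-and-place line to show the value-update rule these methods share.

```python
# Tabular Q-learning on a toy pick-and-place line: states, actions, and
# rewards are invented for illustration; a DQN replaces the table Q with
# a neural network but keeps the same update target.
import numpy as np

rng = np.random.default_rng(1)
N_POS = 5                      # belt positions; item starts at 0, bin at 4
Q = np.zeros((N_POS, 2))       # actions: 0 = move right, 1 = place
alpha, gamma, eps = 0.1, 0.95, 0.2

for _ in range(3000):
    s = 0
    for _ in range(20):
        a = int(rng.integers(2)) if rng.random() < eps else int(Q[s].argmax())
        if a == 1:                                   # "place" the item
            r, s2, done = (1.0 if s == N_POS - 1 else -1.0), s, True
        else:                                        # move toward the bin
            r, s2, done = -0.05, min(s + 1, N_POS - 1), False
        Q[s, a] += alpha * (r + gamma * (0.0 if done else Q[s2].max()) - Q[s, a])
        s = s2
        if done:
            break

print(Q.argmax(axis=1))  # learned policy: move, move, move, move, place
```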
No. | Author(s) and Year | Robot Type | Sensing/Simulation Tool | Task Type | Technique | Remarks |
---|---|---|---|---|---|---|
1. | Zhang et al. [52], 2022 | UR5 robot | Deep image sensor | Simulated alternator assembly | Reinforcement learning | The overall completion time was influenced by several factors, including product features and process modifications. How to calculate and adjust the operating time and resource utilization during collaborative learning in real time was not investigated. |
2. | Silva et al. [53], 2022 | Baxter mobile base | 2D cameras, 1280 × 720 at 30 FPS | Homograph pixel mapping | Deep learning (Scaled-YOLOv4) | When the robot moved at significant velocity, a timing discrepancy between the robot’s placement in the camera view and its overlaid position led the two to cover separate portions of the video frame. |
3. | Buerkle et al. [54], 2021 | UR10 | mobile EEG Epoc+ | Assembly tasks | Long short-term memory recurrent neural network | During the pre-movement period, the EEG data from multiple subjects often showed strong comparable patterns that were consistent, such as a decrease in amplitude and a variation in frequency. |
4. | De Winter et al. [55], 2019 | - | GUI | Cranfield assembly task | Interactive reinforcement learning | The assembly was not carried out in real time; the participant’s knowledge was represented as a consequence graph. The type of robot was not specified. |
5. | Ghadirzadeh et al. [56], 2021 | ABB YuMi robot | Rokoko motion capture suit | Pick, place, and packing | Graph convolutional networks, recurrent Q-learning | Unwanted delays were reduced but the safety issues were not addressed in the work. |
6. | Akkaladevi et al. [57], 2019 | UR10 with SCHUNK 2-finger parallel gripper | RGBD and 3D sensors | Assembly task | Reinforcement learning | In order to understand how things are put together, the robotic system actively suggested a series of appropriate actions based on the current situation. |
7. | Heo et al. [58], 2019 | Indy-7 | Force-sensitive resistor | Collision detection | Deep learning (1-D CNN) | The proposed deep neural network was largely insensitive to model uncertainty and sensor noise. |
8. | Gomes et al. [59], 2022 | UR3 | RGBD camera | Pick and place | Reinforcement learning (CNN) | The drawback of this model is the lengthy training process, which required several hours of setup before use. The model has restricted flexibility, as it excluded gripper rotation and adversarial geometry. |
9. | Chen et al. [60], 2020 | Robotic arm | Force sensor | Sawing wooden piece | Neural learning | The EMG signals employed in this work tracked levels of muscle activation; muscle fatigue was not taken into account. Safety concerns during the collaborative task were not discussed. |
10. | Qureshi et al. [61], 2018 | Aldebaran’s Pepper | 2D camera, 3D sensor, and FSR touch sensor | Societal interaction skills (handshake, eye contact, smile) | Reinforcement learning (DNN) | The existing system performed only a few actions and had no memory. Therefore, the robot was not able to remember the actions executed by people and could not recognize them. |
11. | Wang et al. [62], 2018 | - | - | Engine assembly | DCNN, AlexNet | A collaborative experiment was not discussed clearly. The type of robot and sensors used for the assembly task were not specified. |
12. | Q. Lv et al. [63], 2022 | Industrial robotic arm | Intel RealSense depth camera (D435) and GUI | Lithium battery assembly | Reinforcement learning | The system required extensive coding for assembly tasks. Safety measures were not addressed clearly. |
13. | Weiss et al. [64], 2021 | UR10 | - | Assembling combustion engine, polishing molds | Interactive learning | The task assigned to the robot during assembly was tightening the screw, and safety precautions were not discussed. |
14. | Sasagawa et al. [65], 2020 | Master and slave robots | Touch USP haptic device | Handling of objects | Long short-term memory model | Using the suggested technique, the robot proved competent at carrying out tasks in response to changes in objects and settings. |
15. | Lu et al. [66], 2020 | Franka Emika robot with 7 DoF | Joint torque sensors | Handling of objects | Long short-term memory model, Q-learning | The findings demonstrate that the suggested methodology performs well in predicting human intentions and that the controller obtains the least jerky trajectory with the least contact force. |
16. | Karn et al. [67], 2022 | Hexahedral robot | RGB camera | Defense | Long short-term memory model | The architecture proved efficient in terms of the time required for individuals to converse with one another. |
17. | De Miguel Lazaro et al. [68], 2019 | YuMi IRB 14000 robot | AWS DeepLens camera, Apache MXNet | Identifying human operator | Deep learning (CNN) | The developed model was not tested for the assembly process to determine the algorithm’s performance level. |
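One recurring technique in the table is worth unpacking: collision detection from proprioceptive signals, as in Heo et al. [58], treats windows of joint-torque measurements as one-dimensional signals for a CNN classifier. The sketch below shows that signal-shape idea with PyTorch; the layer sizes and synthetic data are illustrative assumptions, not the published architecture.

```python
# Sketch of a 1-D CNN over sliding windows of joint-torque signals.
# Layer sizes and data are illustrative, not the model of Heo et al. [58].
import torch
import torch.nn as nn

class CollisionNet(nn.Module):
    def __init__(self, n_joints: int = 6, window: int = 100):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_joints, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(32, 2)   # classes: no-contact vs. collision

    def forward(self, x):              # x: (batch, n_joints, window)
        return self.head(self.features(x).squeeze(-1))

model = CollisionNet()
torque_windows = torch.randn(8, 6, 100)   # eight synthetic torque windows
print(model(torque_windows).shape)        # torch.Size([8, 2])
```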
No. | Author(s) and Year | Robot Type | Sensing/Simulation Tool | Task Type | Workspace Type | Technique | Remarks |
---|---|---|---|---|---|---|---|
1. | Mohammed et al. [69], 2020 | ABB IRB 120 | RobotStudio | Assembly tasks | Collaboration | Machine learning | Outside interference was not prevented during the practice session. Changes in brain activity throughout the day were not considered. |
2. | Wang et al. [70], 2022 | Industrial robot | Smart sensors and camera | Traffic monitoring | HRI | Machine learning | The researchers took into account social, technical, and economic aspects of safety but not other human elements. Robot abilities were not discussed. |
3. | Aliev et al. [71], 2021 | UR3 | Sensors, real-time data exchange | Predict outages and safe stops | Online monitoring (AutoML) | Machine learning | The research was not carried out in various working environments, and the human factors were not clearly reviewed. |
4. | Malik et al. [72], 2021 | UR-5 e-series | Tecnomatix process simulation, CAD, proximity sensor | Assembly, pick, and place tasks | Sequential | Machine learning | The physical robot performed the tasks without a worker. The collaborative tasks were explained with the help of digital twins only. |
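Row 3 predicts outages and safe stops from real-time cobot telemetry; Aliev et al. [71] did this with AutoML over UR3 data. As a hedged stand-in, the sketch below trains a plain random-forest classifier on synthetic telemetry with an invented failure rule, just to show the supervised-learning shape of the problem.

```python
# Outage/safe-stop prediction sketch: synthetic telemetry and an invented
# label rule stand in for the AutoML pipeline of Aliev et al. [71].
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
# Features: [mean joint current (A), controller temperature (C), speed (rad/s)]
X = rng.normal(loc=[2.0, 40.0, 1.0], scale=[0.5, 5.0, 0.3], size=(1000, 3))
# Invented rule: high current together with high temperature precedes a stop.
y = ((X[:, 0] > 2.4) & (X[:, 1] > 43.0)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```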
No. | Author(s) and Year | Robot Type | Sensing/Simulation Tool | Task Type | Workspace Type | Technique | Remarks |
---|---|---|---|---|---|---|---|
1. | Walker et al. [73], 2021 | UR5 | RealSense D435 RGB-D camera | Shackling chickens | Collaborative | Machine vision | Robot learning methods and abilities were examined without considering the implications for contemporary production plants. Moreover, no additional human aspects were examined. |
2. | Edmonds et al. [74], 2019 | Baxter robot | Tactile glove with force sensors, Generalized Earley Parser | Medicine bottle cap opening | Human explanations | - | The robot learning techniques and safety issues were not discussed. |
3. | Grigore et al. [75], 2020 | Firefighting robot, hexacopter, HIRRUS V1 | Electro-optical/infrared cameras | Disaster and recovery tasks | UAV-UGV collaborative | - | The operating scope of the robots in the article did not involve AI techniques. |
4. | Bader et al. [76], 2021 | UR5e | Transducer, GUI | Histotripsy ablation system | Collaborative | - | The low resolution of the passive cavitation images employed in this study was a drawback. AI techniques were not addressed. |
5. | Eyam et al. [77], 2021 | ABB YuMi robot | EEG Epoc+ headset | Box alignment task | Collaborative | Human profiling | The work did not address the internal effects of stress, which may produce unstable robot reactions. |
6. | Yang et al. [78], 2018 | Baxter robot | Bumblebee2 camera | Object picking task | Collaborative | Least squares method | The experiments were validated with only two healthy subjects. |
7. | Cunha et al. [79], 2020 | Articulated robotic arm with 7 DoF | Vision system, Rethink Robotics Sawyer | Pipe joining task | Collaborative | Dynamic neural fields | The conceptual framework suggested in this study was validated in a real cooperative assembly activity and is flexible enough to accommodate unforeseen events. |
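Row 6 names a least-squares method for mapping what a camera sees to where the arm should move. The sketch below fits an affine pixel-to-workspace map from a few calibration pairs with ordinary least squares; the point pairs are invented, and this is only loosely analogous to the fusion step in Yang et al. [78].

```python
# Least-squares calibration sketch: fit an affine map from image pixels
# to robot workspace coordinates. The calibration pairs are invented.
import numpy as np

# Calibration pairs: pixel (u, v) -> workspace (x, y) in metres.
pix = np.array([[100, 120], [400, 110], [120, 380], [420, 400]], float)
xy = np.array([[0.10, 0.05], [0.40, 0.05], [0.10, 0.35], [0.42, 0.37]])

A = np.hstack([pix, np.ones((4, 1))])        # design matrix [u v 1]
M, *_ = np.linalg.lstsq(A, xy, rcond=None)   # 3x2 affine parameters

def pixel_to_workspace(u: float, v: float) -> np.ndarray:
    return np.array([u, v, 1.0]) @ M

print(pixel_to_workspace(250, 250))          # estimated target (x, y)
```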
No. | Author(s) and Year | Robot Type | Sensing Tool | Task Type | Workspace Type | Technique | Remarks |
---|---|---|---|---|---|---|---|
1. | Xu et al. [84], 2021 | Seven-DoF manipulator and three-finger robot hand | Two 3D RGB cameras, one depth camera, eye tracker | Hand tracking, environment perceiving, grasping, and trajectory planning | Collaborative | Convolutional neural network | The experiments demonstrate a decrease in planning time and path length, as well as in posture error, suggesting a more accurate and efficient planning process. |
2. | Jia et al. [85], 2022 | DOBOT CR5 manipulator | Webcam with 1920 × 1080 pixels | Text recognition, working status recognition | Autonomous | Siamese region proposal network, convolutional recurrent neural network | Compared to broad object recognition, text detection and recognition are much more susceptible to image quality. Lettering may also appear blurry when an image is taken due to camera movement. |
3. | Xiong et al. [86], 2022 | UR5 | 3D camera, Basler acA2440-20gm GigE (Basler AG) | Port surgery | Collaborative | Machine vision | Throughout simulated port surgery, the cobot effectively served as a reliable scope-holding system, delivering a steady and optimized surgical view. |
4. | Comari et al. [87], 2022 | LBR iiwa with seven degrees of freedom | Laser pointer, monochrome 2D camera | Raw material feeding | Collaborative | Computer vision | The suggested robotic device can autonomously load raw ingredients into a tea-packaging machine while operating safely in the same space as human workers. |
5. | Zhou et al. [88], 2022 | UR5e, Robotiq 2F-85 gripper | PMD 3D camera and See3Cam 2D camera | Printing and cutting of nametags, plug-in charging | Collaborative | Point-voxel region-based CNN (PV-RCNN) | Presented a broad robotic method using a mobile manipulator that was outfitted with cameras and an adaptable gripper for automatic nametag manufacture and plug-in charging in SMEs. |
6. | Ahmed Zaki et al. [89], 2022 | RV-2FRB Mitsubishi industrial robot and Mitsubishi Assista cobot | Intel Realsense D435 3D stereo cameras | Industrial tasks | Collaborative | Computer vision | The implemented system, which was built on dynamic road-map technology, enables run-time trajectory planning for collision prevention between robots and human workers. |
7. | Židek et al. [90], 2021 | ABB YuMi | Dual 4K e-con, Cognex 7200, and MS HoloLens cameras | Assembly process | Collaborative | Deep learning (CNN) | The paper introduces a CNN training approach for bringing deep learning into the assisted assembly operation. |
8. | Olesen et al. [91], 2020 | UR5 manipulator, Schunk WSG 50-110 gripper | Intel RealSense D415 camera, URG-04LX-UG01 scanner | Mobile phone assembly | Collaborative | CNN, YOLOv3 network | The suggested method deals with the assembly of sample phone prototypes without engaging actual manufacturing procedures. However, the overall success rate achieved was only 47%. |
9. | Amin et al. [92], 2020 | Franka Emika robot | Two Kinect V2 cameras | Human action recognition, contact detection | Collaborative | 3D-CNN and 1-D CNN | The human action recognition system achieved 99.7% accuracy in an HRC environment using the 3D-CNN algorithm, and 96% accuracy in physical perception using the 1D-CNN. |
10. | Bejarano et al. [93], 2019 | A 7 DoF dual arm ABB YuMi robot | Cognex AE3 camera | Assembling a product box | Collaborative | Machine vision | The design, development, and validation of the assembly process and workstation are shown. |
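Behind most rows of this table sits the same geometry: a detector yields a pixel, a depth camera yields a range, and the pick point must land in the robot base frame. The sketch below shows that pinhole back-projection and frame transform; the intrinsics and the hand-eye transform are placeholder values, not parameters from any cited system.

```python
# Vision-to-pick geometry sketch: back-project a detected pixel with its
# depth to a camera-frame 3D point, then map it into the robot base frame.
# Intrinsics and the extrinsic transform are placeholders.
import numpy as np

fx, fy, cx, cy = 615.0, 615.0, 320.0, 240.0   # placeholder intrinsics

def deproject(u: float, v: float, depth_m: float) -> np.ndarray:
    """Pixel (u, v) at depth z -> camera-frame point (x, y, z)."""
    return np.array([(u - cx) * depth_m / fx, (v - cy) * depth_m / fy, depth_m])

# Placeholder hand-eye calibration: camera pose in the robot base frame.
T_base_cam = np.eye(4)
T_base_cam[:3, 3] = [0.5, 0.0, 0.8]           # camera 0.5 m ahead, 0.8 m up

p_cam = deproject(350.0, 260.0, 0.62)         # detector pixel + depth reading
p_base = (T_base_cam @ np.append(p_cam, 1.0))[:3]
print(p_base)                                 # pick target in the base frame
```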