Recent Advancements in Augmented Reality for Robotic Applications: A Survey
Abstract
1. Introduction
2. Augmented Reality and Robotics
2.1. Augmented Reality
2.2. Robotics Applications
2.3. Robotic Applications with Augmented Reality
3. Augmented Reality for Medical Robotic Applications
3.1. Preoperative and Surgical Task Planning
Application | References | Robot Platform | AR Medium | Detailed Contents
---|---|---|---|---
Surgical task planning | Samei et al. [52] | da Vinci | Console | Prostatectomy
Surgical task planning | Zhang et al. [53] | KUKA KR 6 R900 | Projector | Spinal
Surgical task planning | Jiang et al. [15] | Custom | Monitor | Dental
Surgical task planning | Fu et al. [54] | KUKA LWR IV+ | OST-HMD | PCNL
Surgical task planning | Ho et al. [55] | Custom | OST-HMD | Spinal
Robot system setup | Fotouhi et al. [56] | KUKA iiwa 7 R800 | OST-HMD | MIS
Robot system setup | Żelechowski et al. [57] | KUKA LBR iiwa Med | OST-HMD | MIS
Robot system setup | Wörn et al. [59] | da Vinci | Projector | Laparoscopy
Robot system setup | Weede et al. [60] | KUKA LWR IV+ | Projector | MIS
Robot system setup | Fu et al. [62] | KUKA LWR IV+ | OST-HMD | MIS
Needle insertion planning | Vörös et al. [64] | KUKA LWR IV+ | OST-HMD | Spinal
Needle insertion planning | Qian et al. [21] | da Vinci | OST-HMD | MIS
Needle insertion planning | Ferraguti et al. [65] | KUKA LWR IV+ | OST-HMD | PCNL
Needle insertion planning | Wen et al. [66] | Custom | Projector | RF Ablation
Needle insertion planning | Boles et al. [67] | Custom | OST-HMD | Spinal
3.2. Image-Guided Robotic Surgery
Application | References | Platform | AR Medium | Detailed Contents
---|---|---|---|---
Head-and-Neck | Chen et al. [71] | da Vinci | HRSV | Anatomy |
Head-and-Neck | Chan et al. [48] | da Vinci | HRSV | Anatomy |
Hepatic | Pessaux et al. [73] | da Vinci | HRSV | Anatomy |
Hepatic | Marques et al. [74] | da Vinci | OST-HMD | Anatomy |
Thyroid | Lee et al. [75] | da Vinci | HRSV | Anatomy |
Colorectal | Shen et al. [76] | Custom | OST-HMD | Anatomy |
Colorectal | Kalia et al. [77] | da Vinci | HRSV | US Scans |
Colorectal | Porpiglia et al. [78] | da Vinci | HRSV | Anatomy |
Urology | Piana et al. [79] | da Vinci | HRSV | Anatomy |
Urology | Edgcumbe et al. [80] | da Vinci | HRSV | Instruments |
Abdominal Cavity | Qian et al. [45] | da Vinci | OST-HMD | Instruments |
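Whether the medium is an HRSV console or an OST-HMD, the overlays in the table above share a projection step: once the anatomy model is registered, each 3D point must be mapped into the 2D viewer image. A minimal sketch under the standard pinhole camera model follows; the intrinsic matrix `K` and the identity camera pose are illustrative assumptions, not calibration values from any cited system:

```python
import numpy as np

def project_points(pts_world, T_cam_world, K):
    """Project 3D points (world frame) to image pixels: p ~ K [R | t] P.
    pts_world: (N, 3); T_cam_world: 4x4 world-to-camera transform; K: 3x3."""
    pts_h = np.hstack([pts_world, np.ones((len(pts_world), 1))])
    pts_cam = (T_cam_world @ pts_h.T).T[:, :3]   # world frame -> camera frame
    uvw = (K @ pts_cam.T).T                      # perspective projection
    return uvw[:, :2] / uvw[:, 2:3]              # normalize by depth

# Example: a point 1 m straight ahead lands on the principal point
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
T = np.eye(4)  # camera frame coincides with the world frame here
uv = project_points(np.array([[0.0, 0.0, 1.0]]), T, K)
```

A stereo viewer would repeat this projection once per eye, with each eye's calibrated intrinsics and the stereo baseline folded into `T_cam_world`.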
3.3. Surgical Training and Simulation
3.4. Telesurgery
4. Augmented Reality for Industrial Robotic Applications
4.1. Human–Robot Interaction and Collaboration
4.2. Path Planning and Task Allocation
Category | Reference | Method | Robot | Medium | AR Content
---|---|---|---|---|---
Intuitive robot programming | Ong et al. [119] | Piecewise linear parameterization algorithm for generation of 3D path curve from data points | Scorbot ER VII | HMD | Virtual robot, workspace’s physical entity, and probe marker |
Occlusion removal and path planning | Young et al. [121] | Coordinate mapping between robot and tablet and rapidly exploring random tree for path planning | Industrial robot | Tablet PC | Virtual robot and planned path |
Semi-automatic offline task programming | Solyman et al. [122] | Stereo matching algorithms for overlaying of virtual graphics on real scenes and interactive robot path planning | 6-DoF robot arm | Tablet PC | 2D workspace boundary, rendering robot path, and exception notification |
Welding task optimization | Tavares et al. [123] | Laser scanner TCP calibration and genetic algorithm for robot trajectory optimization | Welding robot | MediaLas ILP 622 projector | Location where the operator should place and tack weld the metal parts |
Inferred goals communication | Mullen et al. [124] | AR-based passive visualization and haptic wristband-based active prompts | 7-DoF robot arm | HoloLens | Robot motion goal and text alert |
Grasping task planning | Weisz et al. [125] | OpenRAVE for motion planning and RANSAC method for target object localization | BarrettHand gripper | Tablet PC | Scene point cloud, selection button, and object model |
Navigation trajectory decision | Chadalavada et al. [126] | Eye-gaze-based human-to-robot intention transference | AGV system | Optoma X320UST projector | Line (robot path) and arrow (robot direction) |
Multi-robot task planning | Li et al. [127] | AR for robot teleoperation, reinforcement learning for robot motion planning | Universal Robot 5 | HoloLens 2 | Task video, control button, and virtual robot model |
Adaptive HRC task allocation | Zheng et al. [128] | Human action recognition, object 6-DoF estimation, 3D point cloud segmentation and knowledge graph for task strategy generation | Universal Robot 5 | HoloLens 2 | Robot states and task instruction |
Task allocation under uncertainties | Zheng et al. [129] | Knowledge-graph-based task planning, human digital twin modeling, robot learning from demonstration | Universal Robot 5 | HoloLens 2 | Virtual robot model and overlapped distance |
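Several entries above (e.g., the Young et al. row) rely on sampling-based motion planning. As a hedged illustration only, the following is a minimal 2D rapidly exploring random tree (RRT); the workspace bounds, goal bias, and circular obstacle are assumptions for the example, and the collision check tests sampled points only (a full planner would also check the connecting edges):

```python
import math
import random

def rrt(start, goal, is_free, step=0.5, goal_tol=0.5, max_iters=5000, seed=0):
    """Minimal 2D RRT. is_free(p) -> True when point p is collision-free.
    Returns a start->goal list of waypoints, or None on failure."""
    rng = random.Random(seed)
    nodes, parent = [start], {0: None}
    for _ in range(max_iters):
        # Goal-biased sampling: head for the goal 10% of the time
        sample = goal if rng.random() < 0.1 else (rng.uniform(0, 10), rng.uniform(0, 10))
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        near = nodes[i]
        d = math.dist(near, sample)
        if d == 0:
            continue
        # Steer a fixed step from the nearest node toward the sample
        new = (near[0] + step * (sample[0] - near[0]) / d,
               near[1] + step * (sample[1] - near[1]) / d)
        if not is_free(new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < goal_tol:
            path, j = [], len(nodes) - 1
            while j is not None:        # walk parents back to the start
                path.append(nodes[j])
                j = parent[j]
            return path[::-1]
    return None

# Example: plan around a circular obstacle of radius 1.5 centred at (5, 5)
def free(p):
    return math.dist(p, (5.0, 5.0)) > 1.5

path = rrt((1.0, 1.0), (9.0, 9.0), free)
```

An AR client would then render the returned waypoint list as the planned-path overlay (the "planned path" content in the table).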
4.3. Training and Simulation
4.4. Teleoperation Control/Assistance
Category | Reference | Robot | Medium | AR Content
---|---|---|---|---
Industrial robot teleoperation | Solanes et al. [141] | 6R industrial robot | HoloLens | Virtual work environment |
Bimanual robot teleoperation | Garcia et al. [142] | Two industrial robots | HoloLens | Working environment simulation |
Industrial robot teleoperation with orientation and speed requirements | Pan et al. [143] | ABB IRB120 | PC screen | Robot work cell |
Mobile manipulator teleoperation | Su et al. [144] | Mobile manipulator | HTC Vive | Virtual work scenarios |
Spraying task assistance | Elsdon and Demiris [145] | Handheld spraying robot | HoloLens | Virtual processing paths and menu |
Comparison of haptic and AR cues for assisting teleoperation | Lin et al. [147] | KINOVA Gen 3 | HTC Vive | Robot workspace |
Robotic welding with AR and haptic assistance | Su et al. [148] | UR5 | HTC Vive HMD | Virtual scene |
Object manipulation | Frank et al. [149] | Collaborative robots | Tablets | Virtual objects
Industrial robot teleoperation | Su et al. [150] | Industrial robot | HTablet | Virtual robot workspace |
Collision-aware telemanipulation | Piyavichayanon et al. [151] | 7-DoF manipulator | Mobile phone | Virtual robot workspace
5. Discussion
5.1. Limitations and Challenges
5.2. Future Perspectives
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
Abbreviations
AR | Augmented reality |
BCI | Brain–computer interface |
CT | Computerized tomography |
DoFs | Degrees of freedom |
EE | End effector |
FoV | Field of view |
GUI | Graphical user interface |
HMD | Head-mounted display |
HMI | Human–machine interface |
HRC | Human–robot collaboration |
HRI | Human–robot interaction |
HRSV | High-resolution stereo viewers |
MR | Mixed reality |
MRI | Magnetic resonance imaging |
MIS | Minimally invasive surgery |
OST-HMD | Optical see-through head-mounted display |
OR | Operating room |
PCNL | Percutaneous nephrolithotomy |
RAS | Robot-assisted surgery |
RA-MIS | Robot-assisted minimally invasive surgery |
THP | Total hip replacement |
UI | User interface |
References
- Dupont, P.E.; Nelson, B.J.; Goldfarb, M.; Hannaford, B.; Menciassi, A.; O’Malley, M.K.; Simaan, N.; Valdastri, P.; Yang, G.Z. A decade retrospective of medical robotics research from 2010 to 2020. Sci. Robot. 2021, 6, eabi8017.
- Saeidi, H.; Opfermann, J.D.; Kam, M.; Wei, S.; Léonard, S.; Hsieh, M.H.; Kang, J.U.; Krieger, A. Autonomous robotic laparoscopic surgery for intestinal anastomosis. Sci. Robot. 2022, 7, eabj2908.
- Casalino, A.; Bazzi, D.; Zanchettin, A.M.; Rocco, P. Optimal proactive path planning for collaborative robots in industrial contexts. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 6540–6546.
- Zhao, J.; Giammarino, A.; Lamon, E.; Gandarias, J.M.; De Momi, E.; Ajoudani, A. A hybrid learning and optimization framework to achieve physically interactive tasks with mobile manipulators. IEEE Robot. Autom. Lett. 2022, 7, 8036–8043.
- Fu, J.; Poletti, M.; Liu, Q.; Iovene, E.; Su, H.; Ferrigno, G.; De Momi, E. Teleoperation Control of an Underactuated Bionic Hand: Comparison between Wearable and Vision-Tracking-Based Methods. Robotics 2022, 11, 61.
- Attanasio, A.; Scaglioni, B.; De Momi, E.; Fiorini, P.; Valdastri, P. Autonomy in surgical robotics. Annu. Rev. Control Robot. Auton. Syst. 2021, 4, 651–679.
- Faoro, G.; Maglio, S.; Pane, S.; Iacovacci, V.; Menciassi, A. An Artificial Intelligence-Aided Robotic Platform for Ultrasound-Guided Transcarotid Revascularization. IEEE Robot. Autom. Lett. 2023, 8, 2349–2356.
- Iovene, E.; Casella, A.; Iordache, A.V.; Fu, J.; Pessina, F.; Riva, M.; Ferrigno, G.; De Momi, E. Towards Exoscope Automation in Neurosurgery: A Markerless Visual-Servoing Approach. IEEE Trans. Med. Robot. Bionics 2023, 5, 411–420.
- Zheng, P.; Wang, H.; Sang, Z.; Zhong, R.Y.; Liu, Y.; Liu, C.; Mubarok, K.; Yu, S.; Xu, X. Smart manufacturing systems for Industry 4.0: Conceptual framework, scenarios, and future perspectives. Front. Mech. Eng. 2018, 13, 137–150.
- Young, S.N.; Peschel, J.M. Review of human-machine interfaces for small unmanned systems with robotic manipulators. IEEE Trans. Hum.-Mach. Syst. 2020, 50, 131–143.
- Guo, L.; Lu, Z.; Yao, L. Human-machine interaction sensing technology based on hand gesture recognition: A review. IEEE Trans. Hum.-Mach. Syst. 2021, 51, 300–309.
- Aronson, R.M.; Santini, T.; Kübler, T.C.; Kasneci, E.; Srinivasa, S.; Admoni, H. Eye-hand behavior in human-robot shared manipulation. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, USA, 5–8 March 2018; pp. 4–13.
- Palumbo, M.C.; Saitta, S.; Schiariti, M.; Sbarra, M.C.; Turconi, E.; Raccuia, G.; Fu, J.; Dallolio, V.; Ferroli, P.; Votta, E.; et al. Mixed Reality and Deep Learning for External Ventricular Drainage Placement: A Fast and Automatic Workflow for Emergency Treatments. In Proceedings of the Medical Image Computing and Computer Assisted Intervention—MICCAI 2022: 25th International Conference, Singapore, 18–22 September 2022; Proceedings, Part VII. Springer: Cham, Switzerland, 2022; pp. 147–156.
- Gadre, S.Y.; Rosen, E.; Chien, G.; Phillips, E.; Tellex, S.; Konidaris, G. End-user robot programming using mixed reality. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 2707–2713.
- Jiang, J.; Guo, Y.; Huang, Z.; Zhang, Y.; Wu, D.; Liu, Y. Adjacent surface trajectory planning of robot-assisted tooth preparation based on augmented reality. Eng. Sci. Technol. Int. J. 2022, 27, 101001.
- Bianchi, L.; Chessa, F.; Angiolini, A.; Cercenelli, L.; Lodi, S.; Bortolani, B.; Molinaroli, E.; Casablanca, C.; Droghetti, M.; Gaudiano, C.; et al. The use of augmented reality to guide the intraoperative frozen section during robot-assisted radical prostatectomy. Eur. Urol. 2021, 80, 480–488.
- Richter, F.; Zhang, Y.; Zhi, Y.; Orosco, R.K.; Yip, M.C. Augmented reality predictive displays to help mitigate the effects of delayed telesurgery. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 444–450.
- Wang, Z.; Bai, X.; Zhang, S.; Billinghurst, M.; He, W.; Wang, P.; Lan, W.; Min, H.; Chen, Y. A comprehensive review of augmented reality-based instruction in manual assembly, training and repair. Robot. Comput.-Integr. Manuf. 2022, 78, 102407.
- Wang, X.V.; Wang, L.; Lei, M.; Zhao, Y. Closed-loop augmented reality towards accurate human-robot collaboration. CIRP Ann. 2020, 69, 425–428.
- Mourtzis, D.; Siatras, V.; Zogopoulos, V. Augmented reality visualization of production scheduling and monitoring. Procedia CIRP 2020, 88, 151–156.
- Qian, L.; Deguet, A.; Wang, Z.; Liu, Y.H.; Kazanzides, P. Augmented reality assisted instrument insertion and tool manipulation for the first assistant in robotic surgery. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 5173–5179.
- Bertolo, R.; Hung, A.; Porpiglia, F.; Bove, P.; Schleicher, M.; Dasgupta, P. Systematic review of augmented reality in urological interventions: The evidences of an impact on surgical outcomes are yet to come. World J. Urol. 2020, 38, 2167–2176.
- Makhataeva, Z.; Varol, H.A. Augmented reality for robotics: A review. Robotics 2020, 9, 21.
- Suzuki, R.; Karim, A.; Xia, T.; Hedayati, H.; Marquardt, N. Augmented reality and robotics: A survey and taxonomy for ar-enhanced human-robot interaction and robotic interfaces. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 29 April–5 May 2022; pp. 1–33.
- Ungureanu, D.; Bogo, F.; Galliani, S.; Sama, P.; Duan, X.; Meekhof, C.; Stühmer, J.; Cashman, T.J.; Tekin, B.; Schönberger, J.L.; et al. Hololens 2 research mode as a tool for computer vision research. arXiv 2020, arXiv:2008.11239.
- Milgram, P.; Kishino, F. A taxonomy of mixed reality visual displays. IEICE Trans. Inf. Syst. 1994, 77, 1321–1329.
- Dwivedi, Y.K.; Hughes, L.; Baabdullah, A.M.; Ribeiro-Navarrete, S.; Giannakis, M.; Al-Debei, M.M.; Dennehy, D.; Metri, B.; Buhalis, D.; Cheung, C.M.; et al. Metaverse beyond the hype: Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. Int. J. Inf. Manag. 2022, 66, 102542.
- Dargan, S.; Bansal, S.; Kumar, M.; Mittal, A.; Kumar, K. Augmented Reality: A Comprehensive Review. Arch. Comput. Methods Eng. 2023, 30, 1057–1080.
- Azuma, R.T. A survey of augmented reality. Presence Teleoper. Virtual Environ. 1997, 6, 355–385.
- Speicher, M.; Hall, B.D.; Nebeling, M. What is mixed reality? In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–15.
- Qian, L.; Wu, J.Y.; DiMaio, S.P.; Navab, N.; Kazanzides, P. A review of augmented reality in robotic-assisted surgery. IEEE Trans. Med. Robot. Bionics 2019, 2, 1–16.
- Carbone, M.; Cutolo, F.; Condino, S.; Cercenelli, L.; D’Amato, R.; Badiali, G.; Ferrari, V. Architecture of a hybrid video/optical see-through head-mounted display-based augmented reality surgical navigation platform. Information 2022, 13, 81.
- Lin, G.; Panigrahi, T.; Womack, J.; Ponda, D.J.; Kotipalli, P.; Starner, T. Comparing order picking guidance with Microsoft hololens, magic leap, google glass xe and paper. In Proceedings of the 22nd International Workshop on Mobile Computing Systems and Applications, Virtual, 24–26 February 2021; pp. 133–139.
- Andrews, C.M.; Henry, A.B.; Soriano, I.M.; Southworth, M.K.; Silva, J.R. Registration techniques for clinical applications of three-dimensional augmented reality devices. IEEE J. Transl. Eng. Health Med. 2020, 9, 4900214.
- Michalos, G.; Makris, S.; Papakostas, N.; Mourtzis, D.; Chryssolouris, G. Automotive assembly technologies review: Challenges and outlook for a flexible and adaptive approach. CIRP J. Manuf. Sci. Technol. 2010, 2, 81–91.
- Lin, J.C. The role of robotic surgical system in the management of vascular disease. Ann. Vasc. Surg. 2013, 27, 976–983.
- Okamura, A.M.; Matarić, M.J.; Christensen, H.I. Medical and health-care robotics. IEEE Robot. Autom. Mag. 2010, 17, 26–37.
- Hägele, M.; Nilsson, K.; Pires, J.N.; Bischoff, R. Industrial robotics. In Springer Handbook of Robotics; Springer: Cham, Switzerland, 2016; pp. 1385–1422.
- Yang, G.Z.; Cambias, J.; Cleary, K.; Daimler, E.; Drake, J.; Dupont, P.E.; Hata, N.; Kazanzides, P.; Martel, S.; Patel, R.V.; et al. Medical robotics—Regulatory, ethical, and legal considerations for increasing levels of autonomy. Sci. Robot. 2017, 2, eaam8638.
- Galin, R.; Meshcheryakov, R. Collaborative robots: Development of robotic perception system, safety issues, and integration of ai to imitate human behavior. In Proceedings of the 15th International Conference on Electromechanics and Robotics “Zavalishin’s Readings” ER (ZR) 2020, Ufa, Russia, 15–18 April 2020; Springer: Singapore, 2021; pp. 175–185.
- Raj, M.; Seamans, R. Primer on artificial intelligence and robotics. J. Organ. Des. 2019, 8, 1–14.
- Bandari, V.K.; Schmidt, O.G. System-Engineered Miniaturized Robots: From Structure to Intelligence. Adv. Intell. Syst. 2021, 3, 2000284.
- Frijns, H.A.; Schürer, O.; Koeszegi, S.T. Communication models in human–robot interaction: An asymmetric MODel of ALterity in human–robot interaction (AMODAL-HRI). Int. J. Soc. Robot. 2023, 15, 473–500.
- Porpiglia, F.; Checcucci, E.; Amparore, D.; Manfredi, M.; Massa, F.; Piazzolla, P.; Manfrin, D.; Piana, A.; Tota, D.; Bollito, E.; et al. Three-dimensional elastic augmented-reality robot-assisted radical prostatectomy using hyperaccuracy three-dimensional reconstruction technology: A step further in the identification of capsular involvement. Eur. Urol. 2019, 76, 505–514.
- Qian, L.; Deguet, A.; Kazanzides, P. ARssist: Augmented reality on a head-mounted display for the first assistant in robotic surgery. Healthc. Technol. Lett. 2018, 5, 194–200.
- Ong, S.; Nee, A.; Yew, A.; Thanigaivel, N. AR-assisted robot welding programming. Adv. Manuf. 2020, 8, 40–48.
- Ostanin, M.; Mikhel, S.; Evlampiev, A.; Skvortsova, V.; Klimchik, A. Human-robot interaction for robotic manipulator programming in Mixed Reality. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 2805–2811.
- Chan, J.Y.; Holsinger, F.C.; Liu, S.; Sorger, J.M.; Azizian, M.; Tsang, R.K. Augmented reality for image guidance in transoral robotic surgery. J. Robot. Surg. 2020, 14, 579–583.
- Rosenberg, L.B. Virtual fixtures: Perceptual tools for telerobotic manipulation. In Proceedings of the IEEE Virtual Reality Annual International Symposium, Seattle, WA, USA, 18–22 September 1993; pp. 76–82.
- Kwoh, Y.S.; Hou, J.; Jonckheere, E.A.; Hayati, S. A robot with improved absolute positioning accuracy for CT guided stereotactic brain surgery. IEEE Trans. Biomed. Eng. 1988, 35, 153–160.
- Anderson, J.; Chui, C.K.; Cai, Y.; Wang, Y.; Li, Z.; Ma, X.; Nowinski, W.; Solaiyappan, M.; Murphy, K.; Gailloud, P.; et al. Virtual reality training in interventional radiology: The Johns Hopkins and Kent Ridge digital laboratory experience. Semin. Interv. Radiol. 2002, 19, 179–186.
- Samei, G.; Tsang, K.; Kesch, C.; Lobo, J.; Hor, S.; Mohareri, O.; Chang, S.; Goldenberg, S.L.; Black, P.C.; Salcudean, S. A partial augmented reality system with live ultrasound and registered preoperative MRI for guiding robot-assisted radical prostatectomy. Med. Image Anal. 2020, 60, 101588.
- Zhang, F.; Chen, L.; Miao, W.; Sun, L. Research on accuracy of augmented reality surgical navigation system based on multi-view virtual and real registration technology. IEEE Access 2020, 8, 122511–122528.
- Fu, J.; Matteo, P.; Palumbo, M.C.; Iovene, E.; Rota, A.; Riggio, D.; Ilaria, B.; Redaelli, A.C.L.; Ferrigno, G.; De Momi, E.; et al. Augmented Reality and Shared Control Framework for Robot-Assisted Percutaneous Nephrolithotomy. In Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA) Workshop—New Evolutions in Surgical Robotics: Embracing Multimodal Imaging Guidance, Intelligence, and Bio-Inspired Mechanisms, London, UK, 29 May–2 June 2023; pp. 1–2.
- Ho, T.H.; Song, K.T. Supervised control for robot-assisted surgery using augmented reality. In Proceedings of the 2020 20th International Conference on Control, Automation and Systems (ICCAS), Busan, Republic of Korea, 13–16 October 2020; pp. 329–334.
- Fotouhi, J.; Song, T.; Mehrfard, A.; Taylor, G.; Wang, Q.; Xian, F.; Martin-Gomez, A.; Fuerst, B.; Armand, M.; Unberath, M.; et al. Reflective-ar display: An interaction methodology for virtual-to-real alignment in medical robotics. IEEE Robot. Autom. Lett. 2020, 5, 2722–2729.
- Żelechowski, M.; Karnam, M.; Faludi, B.; Gerig, N.; Rauter, G.; Cattin, P.C. Patient positioning by visualising surgical robot rotational workspace in augmented reality. Comput. Methods Biomech. Biomed. Eng. Imaging Vis. 2022, 10, 451–457.
- Żelechowski, M.; Faludi, B.; Karnam, M.; Gerig, N.; Rauter, G.; Cattin, P.C. Automatic patient positioning based on robot rotational workspace for extended reality. Int. J. Comput. Assist. Radiol. Surg. 2023, 1–9.
- Wörn, H.; Weede, O. Optimizing the setup configuration for manual and robotic assisted minimally invasive surgery. In Proceedings of the World Congress on Medical Physics and Biomedical Engineering, Munich, Germany, 7–12 September 2009; Vol. 25/6 Surgery, Minimal Invasive Interventions, Endoscopy and Image Guided Therapy. Springer: Berlin/Heidelberg, Germany, 2009; pp. 55–58.
- Weede, O.; Wünscher, J.; Kenngott, H.; Müller-Stich, B.; Wörn, H. Knowledge-based planning of port positions for minimally invasive surgery. In Proceedings of the 2013 IEEE Conference on Cybernetics and Intelligent Systems (CIS), Manila, Philippines, 12–15 November 2013; pp. 12–17. [Google Scholar]
- Unberath, M.; Fotouhi, J.; Hajek, J.; Maier, A.; Osgood, G.; Taylor, R.; Armand, M.; Navab, N. Augmented reality-based feedback for technician-in-the-loop C-arm repositioning. Healthc. Technol. Lett. 2018, 5, 143–147. [Google Scholar] [CrossRef] [PubMed]
- Fu, J.; Palumbo, M.C.; Iovene, E.; Liu, Q.; Burzo, I.; Redaelli, A.; Ferrigno, G.; De Momi, E. Augmented Reality-Assisted Robot Learning Framework for Minimally Invasive Surgery Task. In Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA), London, UK, 29 May–2 June 2023; pp. 11647–11653. [Google Scholar]
- Giannone, F.; Felli, E.; Cherkaoui, Z.; Mascagni, P.; Pessaux, P. Augmented reality and image-guided robotic liver surgery. Cancers 2021, 13, 6268. [Google Scholar] [CrossRef] [PubMed]
- Vörös, V.; Li, R.; Davoodi, A.; Wybaillie, G.; Vander Poorten, E.; Niu, K. An Augmented Reality-Based Interaction Scheme for Robotic Pedicle Screw Placement. J. Imaging 2022, 8, 273. [Google Scholar] [CrossRef]
- Ferraguti, F.; Minelli, M.; Farsoni, S.; Bazzani, S.; Bonfè, M.; Vandanjon, A.; Puliatti, S.; Bianchi, G.; Secchi, C. Augmented reality and robotic-assistance for percutaneous nephrolithotomy. IEEE Robot. Autom. Lett. 2020, 5, 4556–4563. [Google Scholar] [CrossRef]
- Wen, R.; Chui, C.K.; Ong, S.H.; Lim, K.B.; Chang, S.K.Y. Projection-based visual guidance for robot-aided RF needle insertion. Int. J. Comput. Assist. Radiol. Surg. 2013, 8, 1015–1025. [Google Scholar] [CrossRef] [PubMed]
- Boles, M.; Fu, J.; Iovene, E.; Francesco, C.; Ferrigno, G.; De Momi, E. Augmented Reality and Robotic Navigation System for Spinal Surgery. In Proceedings of the 11th Joint Workshop on New Technologies for Computer/Robot Assisted Surgery, Napoli, Italy, 25–27 April 2022; pp. 96–97. [Google Scholar]
- Fotouhi, J.; Mehrfard, A.; Song, T.; Johnson, A.; Osgood, G.; Unberath, M.; Armand, M.; Navab, N. Development and pre-clinical analysis of spatiotemporal-aware augmented reality in orthopedic interventions. IEEE Trans. Med. Imaging 2020, 40, 765–778. [Google Scholar] [CrossRef]
- Andress, S.; Johnson, A.; Unberath, M.; Winkler, A.F.; Yu, K.; Fotouhi, J.; Weidert, S.; Osgood, G.; Navab, N. On-the-fly augmented reality for orthopedic surgery using a multimodal fiducial. J. Med. Imaging 2018, 5, 021209. [Google Scholar] [CrossRef]
- Carl, B.; Bopp, M.; Saß, B.; Voellger, B.; Nimsky, C. Implementation of augmented reality support in spine surgery. Eur. Spine J. 2019, 28, 1697–1711. [Google Scholar] [CrossRef] [PubMed]
- Chen, W.; Kalia, M.; Zeng, Q.; Pang, E.H.; Bagherinasab, R.; Milner, T.D.; Sabiq, F.; Prisman, E.; Salcudean, S.E. Towards transcervical ultrasound image guidance for transoral robotic surgery. Int. J. Comput. Assist. Radiol. Surg. 2023, 18, 1061–1068. [Google Scholar] [CrossRef] [PubMed]
- Rahman, R.; Wood, M.E.; Qian, L.; Price, C.L.; Johnson, A.A.; Osgood, G.M. Head-mounted display use in surgery: A systematic review. Surg. Innov. 2020, 27, 88–100. [Google Scholar] [CrossRef] [PubMed]
- Pessaux, P.; Diana, M.; Soler, L.; Piardi, T.; Mutter, D.; Marescaux, J. Towards cybernetic surgery: Robotic and augmented reality-assisted liver segmentectomy. Langenbeck’s Arch. Surg. 2015, 400, 381–385. [Google Scholar] [CrossRef]
- Marques, B.; Plantefève, R.; Roy, F.; Haouchine, N.; Jeanvoine, E.; Peterlik, I.; Cotin, S. Framework for augmented reality in Minimally Invasive laparoscopic surgery. In Proceedings of the 2015 17th International Conference on E-Health Networking, Application & Services (HealthCom), Boston, MA, USA, 14–17 October 2015; pp. 22–27. [Google Scholar]
- Lee, D.; Kong, H.J.; Kim, D.; Yi, J.W.; Chai, Y.J.; Lee, K.E.; Kim, H.C. Preliminary study on application of augmented reality visualization in robotic thyroid surgery. Ann. Surg. Treat. Res. 2018, 95, 297–302. [Google Scholar] [CrossRef]
- Shen, J.; Zemiti, N.; Taoum, C.; Aiche, G.; Dillenseger, J.L.; Rouanet, P.; Poignet, P. Transrectal ultrasound image-based real-time augmented reality guidance in robot-assisted laparoscopic rectal surgery: A proof-of-concept study. Int. J. Comput. Assist. Radiol. Surg. 2020, 15, 531–543. [Google Scholar] [CrossRef] [Green Version]
- Kalia, M.; Avinash, A.; Navab, N.; Salcudean, S. Preclinical evaluation of a markerless, real-time, augmented reality guidance system for robot-assisted radical prostatectomy. Int. J. Comput. Assist. Radiol. Surg. 2021, 16, 1181–1188. [Google Scholar] [CrossRef]
- Porpiglia, F.; Checcucci, E.; Amparore, D.; Piana, A.; Piramide, F.; Volpi, G.; De Cillis, S.; Manfredi, M.; Fiori, C.; Piazzolla, P.; et al. PD63-12 extracapsular extension on neurovascular bundles during robot-assisted radical prostatectomy precisely localized by 3D automatic augmented-reality rendering. J. Urol. 2020, 203, e1297. [Google Scholar] [CrossRef]
- Piana, A.; Gallioli, A.; Amparore, D.; Diana, P.; Territo, A.; Campi, R.; Gaya, J.M.; Guirado, L.; Checcucci, E.; Bellin, A.; et al. Three-dimensional Augmented Reality–guided Robotic-assisted Kidney Transplantation: Breaking the Limit of Atheromatic Plaques. Eur. Urol. 2022, 82, 419–426. [Google Scholar] [CrossRef]
- Edgcumbe, P.; Singla, R.; Pratt, P.; Schneider, C.; Nguan, C.; Rohling, R. Augmented reality imaging for robot-assisted partial nephrectomy surgery. In Proceedings of the Medical Imaging and Augmented Reality: 7th International Conference, MIAR 2016, Bern, Switzerland, 24–26 August 2016; Proceedings 7. Springer: Cham, Switzerland, 2016; pp. 139–150. [Google Scholar]
- Peden, R.G.; Mercer, R.; Tatham, A.J. The use of head-mounted display eyeglasses for teaching surgical skills: A prospective randomised study. Int. J. Surg. 2016, 34, 169–173. [Google Scholar] [CrossRef]
- Long, Y.; Cao, J.; Deguet, A.; Taylor, R.H.; Dou, Q. Integrating artificial intelligence and augmented reality in robotic surgery: An initial dvrk study using a surgical education scenario. In Proceedings of the 2022 International Symposium on Medical Robotics (ISMR), Atlanta, GA, USA, 13–15 April 2022; pp. 1–8. [Google Scholar]
- Rewkowski, N.; State, A.; Fuchs, H. Small Marker Tracking with Low-Cost, Unsynchronized, Movable Consumer Cameras for Augmented Reality Surgical Training. In Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Recife, Brazil, 9–13 November 2020; pp. 90–95. [Google Scholar]
- Barresi, G.; Olivieri, E.; Caldwell, D.G.; Mattos, L.S. Brain-controlled AR feedback design for user’s training in surgical HRI. In Proceedings of the 2015 IEEE International Conference on Systems, Man, and Cybernetics, Hong Kong, China, 9–12 October 2015; pp. 1116–1121. [Google Scholar]
- Wang, Y.; Zeng, H.; Song, A.; Xu, B.; Li, H.; Zhu, L.; Wen, P.; Liu, J. Robotic arm control using hybrid brain-machine interface and augmented reality feedback. In Proceedings of the 2017 8th International IEEE/EMBS Conference on Neural Engineering (NER), Shanghai, China, 25–28 May 2017; pp. 411–414. [Google Scholar]
- Zeng, H.; Wang, Y.; Wu, C.; Song, A.; Liu, J.; Ji, P.; Xu, B.; Zhu, L.; Li, H.; Wen, P. Closed-loop hybrid gaze brain-machine interface based robotic arm control with augmented reality feedback. Front. Neurorobot. 2017, 11, 60. [Google Scholar] [CrossRef] [Green Version]
- Gras, G.; Yang, G.Z. Context-aware modeling for augmented reality display behaviour. IEEE Robot. Autom. Lett. 2019, 4, 562–569. [Google Scholar] [CrossRef]
- Condino, S.; Viglialoro, R.M.; Fani, S.; Bianchi, M.; Morelli, L.; Ferrari, M.; Bicchi, A.; Ferrari, V. Tactile augmented reality for arteries palpation in open surgery training. In Proceedings of the Medical Imaging and Augmented Reality: 7th International Conference, MIAR 2016, Bern, Switzerland, 24–26 August 2016; Proceedings 7. Springer: Cham, Switzerland, 2016; pp. 186–197. [Google Scholar]
- Jørgensen, M.K.; Kraus, M. Real-time augmented reality for robotic-assisted surgery. In Proceedings of the 3rd AAU Workshop on Human-Centered Robotics, Aalborg Universitetsforlag, Aalborg, Denmark, 30 October 2014; pp. 19–23.
- Si, W.X.; Liao, X.Y.; Qian, Y.L.; Sun, H.T.; Chen, X.D.; Wang, Q.; Heng, P.A. Assessing performance of augmented reality-based neurosurgical training. Vis. Comput. Ind. Biomed. Art 2019, 2, 1–10.
- Su, H.; Yang, C.; Ferrigno, G.; De Momi, E. Improved human-robot collaborative control of redundant robot for teleoperated minimally invasive surgery. IEEE Robot. Autom. Lett. 2019, 4, 1447–1453.
- Dinh, A.; Yin, A.L.; Estrin, D.; Greenwald, P.; Fortenko, A. Augmented Reality in Real-time Telemedicine and Telementoring: Scoping Review. JMIR mHealth uHealth 2023, 11, e45464.
- Lin, Z.; Zhang, T.; Sun, Z.; Gao, H.; Ai, X.; Chen, W.; Yang, G.Z.; Gao, A. Robotic Telepresence Based on Augmented Reality and Human Motion Mapping for Interventional Medicine. IEEE Trans. Med. Robot. Bionics 2022, 4, 935–944.
- Gasques, D.; Johnson, J.G.; Sharkey, T.; Feng, Y.; Wang, R.; Xu, Z.R.; Zavala, E.; Zhang, Y.; Xie, W.; Zhang, X.; et al. ARTEMIS: A collaborative mixed-reality system for immersive surgical telementoring. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; pp. 1–14.
- Black, D.; Salcudean, S. Human-as-a-robot performance in mixed reality teleultrasound. Int. J. Comput. Assist. Radiol. Surg. 2023, 1–8.
- Qian, L.; Zhang, X.; Deguet, A.; Kazanzides, P. Aramis: Augmented reality assistance for minimally invasive surgery using a head-mounted display. In Proceedings of the Medical Image Computing and Computer Assisted Intervention—MICCAI 2019: 22nd International Conference, Shenzhen, China, 13–17 October 2019; Proceedings, Part V 22. Springer: Cham, Switzerland, 2019; pp. 74–82.
- Huang, T.; Li, R.; Li, Y.; Zhang, X.; Liao, H. Augmented reality-based autostereoscopic surgical visualization system for telesurgery. Int. J. Comput. Assist. Radiol. Surg. 2021, 16, 1985–1997.
- Lin, Z.; Gao, A.; Ai, X.; Gao, H.; Fu, Y.; Chen, W.; Yang, G.Z. ARei: Augmented-reality-assisted touchless teleoperated robot for endoluminal intervention. IEEE/ASME Trans. Mechatron. 2021, 27, 3144–3154.
- Fu, Y.; Lin, W.; Yu, X.; Rodríguez-Andina, J.J.; Gao, H. Robot-Assisted Teleoperation Ultrasound System Based on Fusion of Augmented Reality and Predictive Force. IEEE Trans. Ind. Electron. 2022, 70, 7449–7456.
- Ma, X.; Song, C.; Qian, L.; Liu, W.; Chiu, P.W.; Li, Z. Augmented reality-assisted autonomous view adjustment of a 6-DOF robotic stereo flexible endoscope. IEEE Trans. Med. Robot. Bionics 2022, 4, 356–367.
- Bonne, S.; Panitch, W.; Dharmarajan, K.; Srinivas, K.; Kincade, J.L.; Low, T.; Knoth, B.; Cowan, C.; Fer, D.; Thananjeyan, B.; et al. A Digital Twin Framework for Telesurgery in the Presence of Varying Network Quality of Service. In Proceedings of the 2022 IEEE 18th International Conference on Automation Science and Engineering (CASE), Mexico City, Mexico, 20–24 August 2022; pp. 1325–1332.
- Gonzalez, G.; Balakuntala, M.; Agarwal, M.; Low, T.; Knoth, B.; Kirkpatrick, A.W.; McKee, J.; Hager, G.; Aggarwal, V.; Xue, Y.; et al. ASAP: A Semi-Autonomous Precise System for Telesurgery during Communication Delays. IEEE Trans. Med. Robot. Bionics 2023, 5, 66–78.
- Acemoglu, A.; Krieglstein, J.; Caldwell, D.G.; Mora, F.; Guastini, L.; Trimarchi, M.; Vinciguerra, A.; Carobbio, A.L.C.; Hysenbelli, J.; Delsanto, M.; et al. 5G robotic telesurgery: Remote transoral laser microsurgeries on a cadaver. IEEE Trans. Med. Robot. Bionics 2020, 2, 511–518.
- Wang, L. A futuristic perspective on human-centric assembly. J. Manuf. Syst. 2022, 62, 199–201.
- Wang, L.; Gao, R.; Váncza, J.; Krüger, J.; Wang, X.V.; Makris, S.; Chryssolouris, G. Symbiotic human-robot collaborative assembly. CIRP Ann. 2019, 68, 701–726.
- Li, S.; Zheng, P.; Liu, S.; Wang, Z.; Wang, X.V.; Zheng, L.; Wang, L. Proactive human–robot collaboration: Mutual-cognitive, predictable, and self-organising perspectives. Robot. Comput.-Integr. Manuf. 2023, 81, 102510.
- Ji, Z.; Liu, Q.; Xu, W.; Yao, B.; Liu, J.; Zhou, Z. A closed-loop brain-computer interface with augmented reality feedback for industrial human-robot collaboration. Int. J. Adv. Manuf. Technol. 2021, 124, 3083–3098.
- Sanna, A.; Manuri, F.; Fiorenza, J.; De Pace, F. BARI: An Affordable Brain-Augmented Reality Interface to Support Human–Robot Collaboration in Assembly Tasks. Information 2022, 13, 460.
- Choi, S.H.; Park, K.B.; Roh, D.H.; Lee, J.Y.; Mohammed, M.; Ghasemi, Y.; Jeong, H. An integrated mixed reality system for safety-aware human-robot collaboration using deep learning and digital twin generation. Robot. Comput.-Integr. Manuf. 2022, 73, 102258.
- Umbrico, A.; Orlandini, A.; Cesta, A.; Faroni, M.; Beschi, M.; Pedrocchi, N.; Scala, A.; Tavormina, P.; Koukas, S.; Zalonis, A.; et al. Design of advanced human–robot collaborative cells for personalized human–robot collaborations. Appl. Sci. 2022, 12, 6839.
- Aivaliotis, S.; Lotsaris, K.; Gkournelos, C.; Fourtakas, N.; Koukas, S.; Kousi, N.; Makris, S. An augmented reality software suite enabling seamless human robot interaction. Int. J. Comput. Integr. Manuf. 2023, 36, 3–29.
- Szczurek, K.A.; Prades, R.M.; Matheson, E.; Rodriguez-Nogueira, J.; Di Castro, M. Multimodal multi-user mixed reality human–robot interface for remote operations in hazardous environments. IEEE Access 2023, 11, 17305–17333.
- Hietanen, A.; Pieters, R.; Lanz, M.; Latokartano, J.; Kämäräinen, J.K. AR-based interaction for human-robot collaborative manufacturing. Robot. Comput.-Integr. Manuf. 2020, 63, 101891.
- Chan, W.P.; Hanks, G.; Sakr, M.; Zhang, H.; Zuo, T.; Van der Loos, H.M.; Croft, E. Design and evaluation of an augmented reality head-mounted display interface for human robot teams collaborating in physically shared manufacturing tasks. ACM Trans. Hum.-Robot. Interact. (THRI) 2022, 11, 1–19.
- Moya, A.; Bastida, L.; Aguirrezabal, P.; Pantano, M.; Abril-Jiménez, P. Augmented Reality for Supporting Workers in Human–Robot Collaboration. Multimodal Technol. Interact. 2023, 7, 40.
- Liu, C.; Zhang, Z.; Tang, D.; Nie, Q.; Zhang, L.; Song, J. A mixed perception-based human-robot collaborative maintenance approach driven by augmented reality and online deep reinforcement learning. Robot. Comput.-Integr. Manuf. 2023, 83, 102568.
- Wang, B.; Hu, S.J.; Sun, L.; Freiheit, T. Intelligent welding system technologies: State-of-the-art review and perspectives. J. Manuf. Syst. 2020, 56, 373–391.
- Li, S.; Zheng, P.; Fan, J.; Wang, L. Toward proactive human–robot collaborative assembly: A multimodal transfer-learning-enabled action prediction approach. IEEE Trans. Ind. Electron. 2021, 69, 8579–8588.
- Ong, S.; Chong, J.; Nee, A. A novel AR-based robot programming and path planning methodology. Robot. Comput.-Integr. Manuf. 2010, 26, 240–249.
- Fang, H.; Ong, S.; Nee, A. Interactive robot trajectory planning and simulation using augmented reality. Robot. Comput.-Integr. Manuf. 2012, 28, 227–237.
- Young, K.Y.; Cheng, S.L.; Ko, C.H.; Su, Y.H.; Liu, Q.F. A novel teaching and training system for industrial applications based on augmented reality. J. Chin. Inst. Eng. 2020, 43, 796–806.
- Solyman, A.E.; Ibrahem, K.M.; Atia, M.R.; Saleh, H.I.; Roman, M.R. Perceptive augmented reality-based interface for robot task planning and visualization. Int. J. Innov. Comput. Inf. Control 2020, 16, 1769–1785.
- Tavares, P.; Costa, C.M.; Rocha, L.; Malaca, P.; Costa, P.; Moreira, A.P.; Sousa, A.; Veiga, G. Collaborative welding system using BIM for robotic reprogramming and spatial augmented reality. Autom. Constr. 2019, 106, 102825.
- Mullen, J.F.; Mosier, J.; Chakrabarti, S.; Chen, A.; White, T.; Losey, D.P. Communicating inferred goals with passive augmented reality and active haptic feedback. IEEE Robot. Autom. Lett. 2021, 6, 8522–8529.
- Weisz, J.; Allen, P.K.; Barszap, A.G.; Joshi, S.S. Assistive grasping with an augmented reality user interface. Int. J. Robot. Res. 2017, 36, 543–562.
- Chadalavada, R.T.; Andreasson, H.; Schindler, M.; Palm, R.; Lilienthal, A.J. Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human–robot interaction. Robot. Comput.-Integr. Manuf. 2020, 61, 101830.
- Li, C.; Zheng, P.; Li, S.; Pang, Y.; Lee, C.K. AR-assisted digital twin-enabled robot collaborative manufacturing system with human-in-the-loop. Robot. Comput.-Integr. Manuf. 2022, 76, 102321.
- Zheng, P.; Li, S.; Xia, L.; Wang, L.; Nassehi, A. A visual reasoning-based approach for mutual-cognitive human-robot collaboration. CIRP Ann. 2022, 71, 377–380.
- Zheng, P.; Li, S.; Fan, J.; Li, C.; Wang, L. A collaborative intelligence-based approach for handling human-robot collaboration uncertainties. CIRP Ann. 2023, 72, 1–4.
- Li, S.; Zheng, P.; Pang, S.; Wang, X.V.; Wang, L. Self-organising multiple human–robot collaboration: A temporal subgraph reasoning-based method. J. Manuf. Syst. 2023, 68, 304–312.
- Sievers, T.S.; Schmitt, B.; Rückert, P.; Petersen, M.; Tracht, K. Concept of a Mixed-Reality Learning Environment for Collaborative Robotics. Procedia Manuf. 2020, 45, 19–24.
- Leutert, F.; Schilling, K. Projector-based Augmented Reality support for shop-floor programming of industrial robot milling operations. IEEE Int. Conf. Control Autom. ICCA 2022, 2022, 418–423.
- Wassermann, J.; Vick, A.; Krüger, J. Intuitive robot programming through environment perception, augmented reality simulation and automated program verification. Procedia CIRP 2018, 76, 161–166.
- Ong, S.K.; Yew, A.W.; Thanigaivel, N.K.; Nee, A.Y. Augmented reality-assisted robot programming system for industrial applications. Robot. Comput.-Integr. Manuf. 2020, 61, 101820.
- Hernandez, J.D.; Sobti, S.; Sciola, A.; Moll, M.; Kavraki, L.E. Increasing robot autonomy via motion planning and an augmented reality interface. IEEE Robot. Autom. Lett. 2020, 5, 1017–1023.
- Quintero, C.P.; Li, S.; Pan, M.K.; Chan, W.P.; Loos, H.F.M.V.D.; Croft, E. Robot Programming Through Augmented Trajectories in Augmented Reality. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Madrid, Spain, 1–5 October 2018; pp. 1838–1844.
- Rosen, E.; Whitney, D.; Phillips, E.; Chien, G.; Tompkin, J.; Konidaris, G.; Tellex, S. Communicating and controlling robot arm motion intent through mixed-reality head-mounted displays. Int. J. Robot. Res. 2019, 38, 1513–1526.
- Diehl, M.; Plopski, A.; Kato, H.; Ramirez-Amaro, K. Augmented Reality interface to verify Robot Learning. In Proceedings of the 29th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2020, Naples, Italy, 31 August–4 September 2020; pp. 378–383.
- Bates, T.; Ramirez-Amaro, K.; Inamura, T.; Cheng, G. On-line simultaneous learning and recognition of everyday activities from virtual reality performances. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 3510–3515.
- Luebbers, M.B.; Brooks, C.; Mueller, C.L.; Szafir, D.; Hayes, B. ARC-LfD: Using Augmented Reality for Interactive Long-Term Robot Skill Maintenance via Constrained Learning from Demonstration. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; Volume 2021, pp. 909–916.
- Solanes, J.E.; Muñoz, A.; Gracia, L.; Martí, A.; Girbés-Juan, V.; Tornero, J. Teleoperation of industrial robot manipulators based on augmented reality. Int. J. Adv. Manuf. Technol. 2020, 111, 1077–1097.
- García, A.; Solanes, J.E.; Muñoz, A.; Gracia, L.; Tornero, J. Augmented Reality-Based Interface for Bimanual Robot Teleoperation. Appl. Sci. 2022, 12, 4379.
- Pan, Y.; Chen, C.; Li, D.; Zhao, Z.; Hong, J. Augmented reality-based robot teleoperation system using RGB-D imaging and attitude teaching device. Robot. Comput.-Integr. Manuf. 2021, 71, 102167.
- Su, Y.; Chen, X.; Zhou, T.; Pretty, C.; Chase, G. Mixed reality-integrated 3D/2D vision mapping for intuitive teleoperation of mobile manipulator. Robot. Comput.-Integr. Manuf. 2022, 77, 102332.
- Elsdon, J.; Demiris, Y. Augmented reality for feedback in a shared control spraying task. In Proceedings of the IEEE International Conference on Robotics and Automation, Brisbane, QLD, Australia, 21–25 May 2018; pp. 1939–1946.
- Wonsick, M.; Keleștemur, T.; Alt, S.; Padır, T. Telemanipulation via virtual reality interfaces with enhanced environment models. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 2999–3004.
- Lin, T.C.; Krishnan, A.U.; Li, Z. Comparison of Haptic and Augmented Reality Visual Cues for Assisting Tele-manipulation. In Proceedings of the IEEE International Conference on Robotics and Automation, Philadelphia, PA, USA, 23–27 May 2022; pp. 9309–9316.
- Su, Y.P.; Chen, X.Q.; Zhou, T.; Pretty, C.; Chase, J.G. Mixed Reality-Enhanced Intuitive Teleoperation with Hybrid Virtual Fixtures for Intelligent Robotic Welding. Appl. Sci. 2021, 11, 11280.
- Frank, J.A.; Moorhead, M.; Kapila, V. Realizing mixed-reality environments with tablets for intuitive human-robot collaboration for object manipulation tasks. In Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN 2016, New York, NY, USA, 26–31 August 2016; pp. 302–307.
- Su, Y.H.; Chen, C.Y.; Cheng, S.L.; Ko, C.H.; Young, K.Y. Development of a 3D AR-Based Interface for Industrial Robot Manipulators. In Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2018, Miyazaki, Japan, 7–10 October 2018; pp. 1809–1814.
- Piyavichayanon, C.; Koga, M.; Hayashi, E.; Chumkamon, S. Collision-Aware AR Telemanipulation Using Depth Mesh. In Proceedings of the 2022 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Sapporo, Japan, 11–15 July 2022; Volume 2022, pp. 386–392.
- Li, J.R.; Fu, J.L.; Wu, S.C.; Wang, Q.H. An active and passive combined gravity compensation approach for a hybrid force feedback device. Proc. Inst. Mech. Eng. Part J. Mech. Eng. Sci. 2021, 235, 4368–4381.
- Enayati, N.; De Momi, E.; Ferrigno, G. Haptics in robot-assisted surgery: Challenges and benefits. IEEE Rev. Biomed. Eng. 2016, 9, 49–65.
Platform | Field of View (degrees) | Per-Eye Resolution (pixels) | Tracking Type (DoF) | Eye Tracking | Latency (ms) |
---|---|---|---|---|---|
Google Glass | - | 640 × 360 | 3 Non-positional | No | - |
HoloLens 1 | 34 | 1268 × 720 | 6 Inside-out | No | 16 |
Magic Leap 1 | 50 | 1280 × 960 | 6 Inside-out | Yes | 8 |
HoloLens 2 | 52 | 2048 × 1080 | 6 Inside-out | Yes | 16 |
Magic Leap 2 | 70 | 1440 × 1760 | 6 Inside-out | Yes | 8 |
Meta Quest Pro | 96 | 1800 × 1920 | 6 Inside-out | Yes | 10 |
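The headset comparison above can also be treated as structured data when shortlisting an AR medium for a robotic application. The following is a minimal sketch, not from the survey: the `Headset` record, field names, and the default selection thresholds (field of view, eye tracking, latency) are illustrative assumptions.

```python
# Encode the headset comparison table as records and filter candidates by
# minimum requirements, skipping fields the table leaves unspecified.
from dataclasses import dataclass
from typing import Optional, Tuple, List

@dataclass
class Headset:
    name: str
    fov_deg: Optional[float]          # diagonal field of view; None if unspecified
    resolution: Tuple[int, int]       # per-eye pixels (width, height)
    eye_tracking: bool
    latency_ms: Optional[float]       # None if unspecified

HEADSETS = [
    Headset("Google Glass", None, (640, 360), False, None),
    Headset("HoloLens 1", 34, (1268, 720), False, 16),
    Headset("Magic Leap 1", 50, (1280, 960), True, 8),
    Headset("HoloLens 2", 52, (2048, 1080), True, 16),
    Headset("Magic Leap 2", 70, (1440, 1760), True, 8),
    Headset("Meta Quest Pro", 96, (1800, 1920), True, 10),
]

def candidates(min_fov: float = 50,
               need_eye_tracking: bool = True,
               max_latency_ms: float = 20) -> List[str]:
    """Return headset names meeting the requirements; unspecified latency passes."""
    return [h.name for h in HEADSETS
            if h.fov_deg is not None and h.fov_deg >= min_fov
            and (h.eye_tracking or not need_eye_tracking)
            and (h.latency_ms is None or h.latency_ms <= max_latency_ms)]

print(candidates())
# ['Magic Leap 1', 'HoloLens 2', 'Magic Leap 2', 'Meta Quest Pro']
```

The thresholds mirror a common requirement pattern in the surveyed systems (eye-tracked OST-HMDs with usable field of view and low display latency); adjust them per application.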
Application | References | Platform | AR Medium | Assessment |
---|---|---|---|---|
General Skills | Rewkowski et al. [83] | Custom | OST-HMD | None |
General Skills | Long et al. [82] | da Vinci | HRSV | Quantitative |
Supervision | Jørgensen et al. [89] | da Vinci | HRSV | Qualitative |
Safety | Barresi et al. [84] | Custom | OST-HMD | Qualitative and Quantitative |
Rehab | Wang et al. [85] | Custom | Screen | Quantitative |
Rehab | Zeng et al. [86] | Custom | Screen | Quantitative |
Injury Detection | Gras et al. [87] | Simulator | Display | Quantitative User Study |
Preoperative Training | Si et al. [90] | Simulator | OST-HMD | Questionnaire |
Haptic | Condino et al. [88] | Custom | Custom | Questionnaire |
Application | Reference | Robot Platform | AR Medium | Detailed Contents |
---|---|---|---|---|
Remote visualization | Lin et al. [93] | KUKA LBR Med | OST-HMD | Interventions |
Remote visualization | Gasques et al. [94] | Custom | OST-HMD | Trauma |
Remote visualization | Black et al. [95] | Custom | OST-HMD | Ultrasound |
Remote visualization | Qian et al. [96] | da Vinci | OST-HMD | MIS |
Remote visualization | Huang et al. [97] | KUKA LBR iiwa | Monitor | Percutaneous |
Teleoperation control | Lin et al. [98] | Custom | OST-HMD | Endoluminal |
Teleoperation control | Fu et al. [99] | Universal Robot 5 | Monitor | Ultrasound |
Teleoperation control | Ho et al. [55] | da Vinci | Projector | Laparoscopy |
Teleoperation control | Ma et al. [100] | da Vinci and Custom | OST-HMD | Laparoscopy |
Latency and motion prediction | Richter et al. [17] | da Vinci | Console | MIS |
Latency and motion prediction | Bonne et al. [101] | da Vinci | OST-HMD | MIS |
Latency and motion prediction | Gonzalez et al. [102] | da Vinci | Monitor | MIS |
Latency and motion prediction | Fu et al. [99] | Universal Robot 5 | Monitor | Ultrasound |
Category | Reference | Robot | Medium | AR Content |
---|---|---|---|---|
Accurate robot control | Wang et al. [19] | KUKA KR6 R700 | HoloLens | Move cube, rotate cube, and create waypoint |
Interactive path planning | Ji et al. [107] | ABB IRB1200 | HoloLens | Robot trajectory point, user interface |
Pick-and-place selection | Sanna et al. [108] | COMAU e.DO manipulator | HoloLens 2 | Explore and select NeuroTags |
Safety-aware HRC | Choi et al. [109] | Universal Robot 3 | HoloLens 2 | Synchronized digital twin, safety information, motion preview |
User-aware control | Umbrico et al. [110] | Universal Robot 10e | HoloLens 2 headset and Samsung Galaxy S4 tablet | 3D model, arrow control guidance, task instruction, operator feedback |
Mobile HRI | Aivaliotis et al. [111] | Mobile robot platform | HoloLens 2 | Programming interface, production status, safety zones, and recovery instructions |
Multi-user robot teleoperation | Szczurek et al. [112] | CERNBot | HoloLens 2 | Video, 3D point cloud, and audio feedback |
Engine assembly task | Hietanen et al. [113] | Universal Robot 5 | HoloLens, LCD projector | Danger zone, changed region, robot status, and control buttons |
Carbon-fiber-reinforced polymer fabrication | Chan et al. [114] | KUKA IIWA LBR14 | HoloLens | Virtual robot, workpiece model, and robot trajectory |
Gearbox assembly task | Moya et al. [115] | Universal Robot 10 | Tablet | 3D model animation, audio, PDF file, image, and video |
Maintenance manipulation | Liu et al. [116] | AE AIR4–560 | HoloLens 2 | Text, 3D model, execute task, remote expert |
Category | Reference | Robot | Medium | AR Content |
---|---|---|---|---|
Human–robot collaboration learning environment | Sievers et al. [131] | Collaborative robots | HMD/Tablet | Experimental modular assembly plant |
Industrial robot milling | Leutert and Schilling [132] | Industrial robot | Projector | Virtual processing paths and menu |
Workspace simulation and program verification | Wassermann et al. [133] | KUKA KR6 | Tablet | 2D image, 3D point cloud |
Robot work cell simulation | Ong et al. [134] | Industrial robot | Oculus Rift DK2 HMD | Work cell, 3D points and paths |
High-level augmented reality specifications | Hernandez et al. [135] | Fetch | HMD | Virtual objects |
Augmented trajectories simulation | Quintero et al. [136] | Barrett WAM | HoloLens | Robot-augmented trajectories |
Robot motion intent visualization | Rosen et al. [137] | Baxter | HoloLens | Sequences of robot movement |
Robot trajectory simulation | Gadre et al. [14] | Baxter | HoloLens | Motion preview |
Robot learning verification | Diehl et al. [138] | UR5e | HoloLens/Tablet | Virtual robot, target objects |
Robot learned skills modification | Luebbers et al. [140] | Sawyer | HoloLens | Virtual robot and tasks |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Fu, J.; Rota, A.; Li, S.; Zhao, J.; Liu, Q.; Iovene, E.; Ferrigno, G.; De Momi, E. Recent Advancements in Augmented Reality for Robotic Applications: A Survey. Actuators 2023, 12, 323. https://doi.org/10.3390/act12080323