Integrating Virtual, Mixed, and Augmented Reality to Human–Robot Interaction Applications Using Game Engines: A Brief Review of Accessible Software Tools and Frameworks
Abstract
1. Introduction
1.1. Related Surveys
1.2. Motivation and Contributions
2. Background and Article Organization
2.1. Game Engines
2.1.1. Unity
2.1.2. Unreal Engine
2.1.3. Paper Organization
3. Methodology
3.1. Definition of Research Questions
1. RQ1: Which types of XR applications or systems using real robots have been developed with game engines?
2. RQ2: Which software frameworks or tools are commonly used, or have been proposed, to build these applications?
3. RQ3: Which hardware and software limitations are reported by the authors of these applications?
3.2. Study Selection, Quality Assessment, and Data Extraction
1. The article focuses only on simulation (i.e., the robot is only displayed on a monitor), or there is no interaction between the user and a physical robot.
2. The article is an earlier version of a more recent study or system by the same authors proposing the same software architecture or framework.
3. The article is a short paper, review article, thesis, technical report, or book chapter.
4. The article is a duplicate retrieved from more than one database.
5. The article does not propose a solution using the Unreal Engine or the Unity engine.
1. Is the article published in a journal indexed in the Web of Science?
2. Was the article presented at a conference ranked in the top twenty of the Google Scholar lists for Robotics, Engineering and Computer Science, Human–Computer Interaction, or Manufacturing and Machines?
3. Are the methods, hardware, and software architecture clearly described?
3.3. Limitations of the Study
4. Software Tools for Developing XR Applications in HRI
4.1. Communication Tools
4.1.1. Robotic Middleware and Bridging Mechanism
4.1.2. Multiplayer Networking
4.2. Interaction Tools
5. Applications in Human–Robot Interaction
5.1. Social Robotics
5.2. End User Programming of Industrial Robots
5.3. Teleoperation
5.4. Human–Robot Collaboration
6. Challenges and Future Opportunities
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Kwok, A.O.; Koh, S.G. COVID-19 and extended reality (XR). Curr. Issues Tour. 2021, 24, 1935–1940. [Google Scholar] [CrossRef]
- Doolani, S.; Wessels, C.; Kanal, V.; Sevastopoulos, C.; Jaiswal, A.; Nambiappan, H.; Makedon, F. A review of extended reality (XR) technologies for manufacturing training. Technologies 2020, 8, 77. [Google Scholar] [CrossRef]
- Bogaerts, B.; Sels, S.; Vanlanduit, S.; Penne, R. Connecting the CoppeliaSim robotics simulator to virtual reality. SoftwareX 2020, 11, 100426. [Google Scholar] [CrossRef]
- Topini, A.; Sansom, W.; Secciani, N.; Bartalucci, L.; Ridolfi, A.; Allotta, B. Variable admittance control of a hand exoskeleton for virtual reality-based rehabilitation tasks. Front. Neurorobot. 2021, 15, 188. [Google Scholar] [CrossRef] [PubMed]
- Nguyen, V.T.; Dang, T. Setting up Virtual Reality and Augmented Reality Learning Environment in Unity. In Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Recife, Brazil, 9–13 November 2017; pp. 315–320. [Google Scholar] [CrossRef]
- Morse, C. Gaming Engines: Unity, Unreal, and Interactive 3D Spaces; Taylor & Francis: London, UK, 2021. [Google Scholar]
- Bartneck, C.; Soucy, M.; Fleuret, K.; Sandoval, E.B. The robot engine—Making the unity 3D game engine work for HRI. In Proceedings of the 2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Kobe, Japan, 31 August–4 September 2015; IEEE: Piscataway Township, NJ, USA, 2015; pp. 431–437. [Google Scholar]
- De Melo, M.S.P.; da Silva Neto, J.G.; Da Silva, P.J.L.; Teixeira, J.M.X.N.; Teichrieb, V. Analysis and comparison of robotics 3d simulators. In Proceedings of the 2019 21st Symposium on Virtual and Augmented Reality (SVR), Rio de Janeiro, Brazil, 28–31 October 2019; IEEE: Piscataway Township, NJ, USA, 2019; pp. 242–251. [Google Scholar]
- Eswaran, M.; Gulivindala, A.K.; Inkulu, A.K.; Bahubalendruni, M. Augmented reality-based guidance in product assembly and maintenance/repair perspective: A state of the art review on challenges and opportunities. Expert Syst. Appl. 2023, 213, 1–18. [Google Scholar] [CrossRef]
- Zhang, W.; Wang, Z. Theory and Practice of VR/AR in K-12 Science Education—A Systematic Review. Sustainability 2021, 13, 12646. [Google Scholar] [CrossRef]
- Ho, P.T.; Albajez, J.A.; Santolaria, J.; Yagüe-Fabra, J.A. Study of Augmented Reality Based Manufacturing for Further Integration of Quality Control 4.0: A Systematic Literature Review. Appl. Sci. 2022, 12, 1961. [Google Scholar] [CrossRef]
- Boboc, R.G.; Gîrbacia, F.; Butilă, E.V. The application of augmented reality in the automotive industry: A systematic literature review. Appl. Sci. 2020, 10, 4259. [Google Scholar] [CrossRef]
- Dianatfar, M.; Latokartano, J.; Lanz, M. Review on existing VR/AR solutions in human–robot collaboration. Procedia CIRP 2021, 97, 407–411. [Google Scholar] [CrossRef]
- Suzuki, R.; Karim, A.; Xia, T.; Hedayati, H.; Marquardt, N. Augmented Reality and Robotics: A Survey and Taxonomy for AR-enhanced Human-Robot Interaction and Robotic Interfaces. In Proceedings of the CHI Conference on Human Factors in Computing Systems 2022, New Orleans, LA, USA, 30 April–5 May 2022; pp. 1–33. [Google Scholar]
- Costa, G.d.M.; Petry, M.R.; Moreira, A.P. Augmented Reality for Human–Robot Collaboration and Cooperation in Industrial Applications: A Systematic Literature Review. Sensors 2022, 22, 2725. [Google Scholar] [CrossRef]
- Xie, J.; Liu, S.; Wang, X. Framework for a closed-loop cooperative human Cyber-Physical System for the mining industry driven by VR and AR: MHCPS. Comput. Ind. Eng. 2022, 168, 108050. [Google Scholar] [CrossRef]
- Sonkoly, B.; Haja, D.; Németh, B.; Szalay, M.; Czentye, J.; Szabó, R.; Ullah, R.; Kim, B.S.; Toka, L. Scalable edge cloud platforms for IoT services. J. Netw. Comput. Appl. 2020, 170, 102785. [Google Scholar] [CrossRef]
- Zanero, S. Cyber-Physical Systems. Computer 2017, 50, 14–16. [Google Scholar] [CrossRef]
- Tao, F.; Qi, Q.; Wang, L.; Nee, A. Digital Twins and Cyber–Physical Systems toward Smart Manufacturing and Industry 4.0: Correlation and Comparison. Engineering 2019, 5, 653–661. [Google Scholar] [CrossRef]
- Maruyama, T.; Ueshiba, T.; Tada, M.; Toda, H.; Endo, Y.; Domae, Y.; Nakabo, Y.; Mori, T.; Suita, K. Digital Twin-Driven Human Robot Collaboration Using a Digital Human. Sensors 2021, 21, 8266. [Google Scholar] [CrossRef] [PubMed]
- van der Aalst, W.M.; Hinz, O.; Weinhardt, C. Resilient digital twins. Bus. Inf. Syst. Eng. 2021, 63, 615–619. [Google Scholar] [CrossRef]
- Sepasgozar, S.M. Differentiating digital twin from digital shadow: Elucidating a paradigm shift to expedite a smart, sustainable built environment. Buildings 2021, 11, 151. [Google Scholar] [CrossRef]
- Zhou, J.; Zhou, Y.; Wang, B.; Zang, J. Human–cyber–physical systems (HCPSs) in the context of new-generation intelligent manufacturing. Engineering 2019, 5, 624–636. [Google Scholar] [CrossRef]
- Coronado, E.; Kiyokawa, T.; Ricardez, G.A.G.; Ramirez-Alpizar, I.G.; Venture, G.; Yamanobe, N. Evaluating quality in human-robot interaction: A systematic search and classification of performance and human-centered factors, measures and metrics towards an industry 5.0. J. Manuf. Syst. 2022, 63, 392–410. [Google Scholar] [CrossRef]
- Huang, S.; Wang, B.; Li, X.; Zheng, P.; Mourtzis, D.; Wang, L. Industry 5.0 and Society 5.0—Comparison, complementation and co-evolution. J. Manuf. Syst. 2022, 64, 424–428. [Google Scholar] [CrossRef]
- Eberly, D. 3D Game Engine Design: A Practical Approach to Real-Time Computer Graphics; CRC Press: Boca Raton, FL, USA, 2006. [Google Scholar]
- Gregory, J. Game Engine Architecture; AK Peters/CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
- Dickson, P.E.; Block, J.E.; Echevarria, G.N.; Keenan, K.C. An experience-based comparison of unity and unreal for a stand-alone 3D game development course. In Proceedings of the 2017 ACM Conference on Innovation and Technology in Computer Science Education, Bologna, Italy, 3–5 July 2017; pp. 70–75. [Google Scholar]
- Juliani, A.; Berges, V.P.; Teng, E.; Cohen, A.; Harper, J.; Elion, C.; Goy, C.; Gao, Y.; Henry, H.; Mattar, M.; et al. Unity: A general platform for intelligent agents. arXiv 2018, arXiv:1809.02627. [Google Scholar]
- Tricco, A.C.; Lillie, E.; Zarin, W.; O’Brien, K.K.; Colquhoun, H.; Levac, D.; Moher, D.; Peters, M.D.; Horsley, T.; Weeks, L.; et al. PRISMA extension for scoping reviews (PRISMA-ScR): Checklist and explanation. Ann. Intern. Med. 2018, 169, 467–473. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Budgen, D.; Brereton, P. Performing systematic literature reviews in software engineering. In Proceedings of the International Conference on Software Engineering; ACM: New York, NY, USA, 2006; pp. 1051–1052. [Google Scholar] [CrossRef]
- Kitchenham, B. Procedures for Performing Systematic Reviews; Technical Report; Keele University: Newcastle, UK, 2004. [Google Scholar]
- Petersen, K.; Vakkalanka, S.; Kuzniarz, L. Guidelines for conducting systematic mapping studies in software engineering: An update. Inf. Softw. Technol. 2015, 64, 1–18. [Google Scholar] [CrossRef]
- Keele, S. Guidelines for Performing Systematic Literature Reviews in Software Engineering; Technical Report, EBSE Technical Report; Elsevier: Amsterdam, The Netherlands, 2007. [Google Scholar]
- Wohlin, C.; Runeson, P.; Neto, P.A.d.M.S.; Engström, E.; do Carmo Machado, I.; De Almeida, E.S. On the reliability of mapping studies in software engineering. J. Syst. Softw. 2013, 86, 2594–2610. [Google Scholar] [CrossRef] [Green Version]
- Rosbridge Suite. 2022. Available online: http://wiki.ros.org/rosbridge/_suite (accessed on 10 October 2022).
- Inamura, T.; Mizuchi, Y. SIGVerse: A cloud-based VR platform for research on multimodal human-robot interaction. Front. Robot. AI 2021, 8, 549360. [Google Scholar] [CrossRef] [PubMed]
- ROS Sharp. 2022. Available online: https://github.com/siemens/ros-sharp (accessed on 10 October 2022).
- ROS TCP Connector. 2022. Available online: https://github.com/Unity-Technologies/ROS-TCP-Connector (accessed on 10 October 2022).
- Babaians, E.; Tamiz, M.; Sarfi, Y.; Mogoei, A.; Mehrabi, E. ROS2Unity3D: A high-performance plugin to interface ROS with the Unity3D engine. In Proceedings of the 2018 9th Conference on Artificial Intelligence and Robotics and 2nd Asia-Pacific International Symposium, Kish Island, Iran, 10 December 2018; IEEE: Piscataway Township, NJ, USA, 2018; pp. 59–64. [Google Scholar]
- Coronado, E.; Venture, G. Towards IoT-Aided Human–Robot Interaction Using NEP and ROS: A Platform-Independent, Accessible and Distributed Approach. Sensors 2020, 20, 1500. [Google Scholar] [CrossRef] [Green Version]
- ZeroMQ Socket Api. 2022. Available online: https://zeromq.org/socket-api/ (accessed on 10 October 2022).
- Photon Engine. 2022. Available online: https://www.photonengine.com/ (accessed on 10 October 2022).
- Mirror Networking. 2022. Available online: https://mirror-networking.gitbook.io/docs/ (accessed on 10 October 2022).
- Netcode for GameObjects. 2022. Available online: https://docs-multiplayer.unity3d.com/ (accessed on 10 October 2022).
- Dimitropoulos, N.; Togias, T.; Michalos, G.; Makris, S. Operator support in human–robot collaborative environments using AI enhanced wearable devices. Procedia Cirp 2021, 97, 464–469. [Google Scholar] [CrossRef]
- Togias, T.; Gkournelos, C.; Angelakis, P.; Michalos, G.; Makris, S. Virtual reality environment for industrial robot control and path design. Procedia CIRP 2021, 100, 133–138. [Google Scholar] [CrossRef]
- Gao, Y.; Huang, C.M. PATI: A projection-based augmented table-top interface for robot programming. In Proceedings of the 24th International Conference on Intelligent User Interfaces, Marina del Ray, CA, USA, 17–20 March 2019; pp. 345–355. [Google Scholar]
- TouchScript. 2022. Available online: https://github.com/TouchScript/TouchScript (accessed on 10 October 2022).
- Aldoma, A.; Marton, Z.C.; Tombari, F.; Wohlkinger, W.; Potthast, C.; Zeisl, B.; Rusu, R.B.; Gedikli, S.; Vincze, M. Tutorial: Point cloud library: Three-dimensional object recognition and 6 dof pose estimation. IEEE Robot. Autom. Mag. 2012, 19, 80–91. [Google Scholar] [CrossRef]
- Fischler, M.A.; Bolles, R.C. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 1981, 24, 381–395. [Google Scholar] [CrossRef]
- Zhou, T.; Zhu, Q.; Du, J. Intuitive robot teleoperation for civil engineering operations with virtual reality and deep learning scene reconstruction. Adv. Eng. Inform. 2020, 46, 101170. [Google Scholar] [CrossRef]
- PointNet. 2022. Available online: https://github.com/charlesq34/pointnet (accessed on 10 October 2022).
- Vuforia Engine Package Unity. 2022. Available online: https://library.vuforia.com/getting-started/vuforia-engine-package-unity (accessed on 10 October 2022).
- Chacko, S.M.; Granado, A.; Kapila, V. An augmented reality framework for robotic tool-path teaching. Procedia CIRP 2020, 93, 1218–1223. [Google Scholar] [CrossRef]
- Google ARCore. 2022. Available online: https://developers.google.com/ar (accessed on 10 October 2022).
- Mixed Reality Toolkit. 2022. Available online: https://github.com/microsoft/MixedRealityToolkit-Unity (accessed on 10 October 2022).
- Lotsaris, K.; Gkournelos, C.; Fousekis, N.; Kousi, N.; Makris, S. AR based robot programming using teaching by demonstration techniques. Procedia CIRP 2021, 97, 459–463. [Google Scholar] [CrossRef]
- Zxing. 2022. Available online: https://github.com/zxing/zxing (accessed on 10 October 2022).
- Botev, J.; Rodríguez Lera, F.J. Immersive robotic telepresence for remote educational scenarios. Sustainability 2021, 13, 4717. [Google Scholar] [CrossRef]
- NuiTrack. 2022. Available online: https://nuitrack.com/ (accessed on 1 January 2023).
- Bradski, G.; Kaehler, A. OpenCV. Dr. Dobb’s J. Softw. Tools 2000, 3, 120. [Google Scholar]
- Find Object 2D ROS package. 2022. Available online: http://wiki.ros.org/find_object_2d (accessed on 10 October 2022).
- Moveit. Motion Planning Framework. 2020. Available online: https://moveit.ros.org/ (accessed on 10 October 2022).
- IAI Kinect. 2022. Available online: https://github.com/code-iai/iai_kinect2 (accessed on 1 January 2023).
- RobCog-IAI. 2022. Available online: https://github.com/robcog-iai (accessed on 1 January 2023).
- Newton VR. 2022. Available online: https://assetstore.unity.com/packages/tools/newtonvr-75712 (accessed on 10 October 2022).
- Li, R.; van Almkerk, M.; van Waveren, S.; Carter, E.; Leite, I. Comparing human-robot proxemics between virtual reality and the real world. In Proceedings of the 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019; IEEE: Piscataway Township, NJ, USA, 2019; pp. 431–439. [Google Scholar]
- Alonso, R.; Bonini, A.; Reforgiato Recupero, D.; Spano, L.D. Exploiting virtual reality and the robot operating system to remote-control a humanoid robot. Multimed. Tools Appl. 2022, 81, 15565–15592. [Google Scholar] [CrossRef]
- Shariati, A.; Shahab, M.; Meghdari, A.; Amoozandeh Nobaveh, A.; Rafatnejad, R.; Mozafari, B. Virtual reality social robot platform: A case study on Arash social robot. In Proceedings of the International Conference on Social Robotics; Springer: Berlin/Heidelberg, Germany, 2018; pp. 551–560. [Google Scholar]
- Pot, E.; Monceaux, J.; Gelin, R.; Maisonnier, B. Choregraphe: A graphical tool for humanoid robot programming. In Proceedings of the RO-MAN 2009-The 18th IEEE International Symposium on Robot and Human Interactive Communication, New Delhi, India, 14–18 October 2009; IEEE: Piscataway Township, NJ, USA, 2009; pp. 46–51. [Google Scholar]
- Coronado, E.; Mastrogiovanni, F.; Indurkhya, B.; Venture, G. Visual programming environments for end-user development of intelligent and social robots, a systematic review. J. Comput. Lang. 2020, 58, 100970. [Google Scholar] [CrossRef]
- Cao, Y.; Wang, T.; Qian, X.; Rao, P.S.; Wadhawan, M.; Huo, K.; Ramani, K. GhostAR: A time-space editor for embodied authoring of human-robot collaborative task with augmented reality. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, New Orleans, LA, USA, 20–23 October 2019; pp. 521–534. [Google Scholar]
- Ostanin, M.; Mikhel, S.; Evlampiev, A.; Skvortsova, V.; Klimchik, A. Human-robot interaction for robotic manipulator programming in Mixed Reality. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; IEEE: Piscataway Township, NJ, USA, 2020; pp. 2805–2811. [Google Scholar]
- Soares, I.; Petry, M.; Moreira, A.P. Programming Robots by Demonstration Using Augmented Reality. Sensors 2021, 21, 5976. [Google Scholar] [CrossRef]
- Karan, M.S.; Berkman, M.İ.; Çatak, G. Smartphone as a Paired Game Input Device: An Application on HoloLens Head Mounted Augmented Reality System. In Game+ Design Education; Springer: Berlin/Heidelberg, Germany, 2021; pp. 265–277. [Google Scholar]
- Mahmood, F.; Mahmood, E.; Dorfman, R.G.; Mitchell, J.; Mahmood, F.U.; Jones, S.B.; Matyal, R. Augmented reality and ultrasound education: Initial experience. J. Cardiothorac. Vasc. Anesth. 2018, 32, 1363–1367. [Google Scholar] [CrossRef]
- Tian-Han, G.; Qiao-Yu, T.; Shuo, Z. The virtual museum based on HoloLens and vuforia. In Proceedings of the 2018 4th Annual International Conference on Network and Information Systems for Computers (ICNISC), Wuhan, China, 20–22 April 2018; IEEE: Piscataway Township, NJ, USA, 2018; pp. 382–386. [Google Scholar]
- Solanes, J.E.; Muñoz, A.; Gracia, L.; Martí, A.; Girbés-Juan, V.; Tornero, J. Teleoperation of industrial robot manipulators based on augmented reality. Int. J. Adv. Manuf. Technol. 2020, 111, 1077–1097. [Google Scholar] [CrossRef]
- Naceri, A.; Mazzanti, D.; Bimbo, J.; Tefera, Y.T.; Prattichizzo, D.; Caldwell, D.G.; Mattos, L.S.; Deshpande, N. The Vicarios Virtual Reality Interface for Remote Robotic Teleoperation. J. Intell. Robot. Syst. 2021, 101, 1–16. [Google Scholar] [CrossRef]
- Su, Y.; Chen, X.; Zhou, T.; Pretty, C.; Chase, G. Mixed reality-integrated 3D/2D vision mapping for intuitive teleoperation of mobile manipulator. Robot. Comput. Integr. Manuf. 2022, 77, 102332. [Google Scholar] [CrossRef]
- Whitney, D.; Rosen, E.; Ullman, D.; Phillips, E.; Tellex, S. ROS Reality: A virtual reality framework using consumer-grade hardware for ROS-enabled robots. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; IEEE: Piscataway Township, NJ, USA, 2018; pp. 1–9. [Google Scholar]
- Palmarini, R.; del Amo, I.F.; Bertolino, G.; Dini, G.; Erkoyuncu, J.A.; Roy, R.; Farnsworth, M. Designing an AR interface to improve trust in Human-Robots collaboration. Procedia CIRP 2018, 70, 350–355. [Google Scholar] [CrossRef]
- Wang, X.V.; Wang, L.; Lei, M.; Zhao, Y. Closed-loop augmented reality towards accurate human-robot collaboration. CIRP Ann. 2020, 69, 425–428. [Google Scholar] [CrossRef]
- Mahadevan, K.; Sousa, M.; Tang, A.; Grossman, T. “grip-that-there”: An investigation of explicit and implicit task allocation techniques for human-robot collaboration. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; pp. 1–14. [Google Scholar]
- Chandan, K.; Kudalkar, V.; Li, X.; Zhang, S. ARROCH: Augmented reality for robots collaborating with a human. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; IEEE: Piscataway Township, NJ, USA, 2021; pp. 3787–3793. [Google Scholar]
- Weber, D.; Kasneci, E.; Zell, A. Exploiting Augmented Reality for Extrinsic Robot Calibration and Eye-based Human-Robot Collaboration. In Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction, Sapporo, Hokkaido, Japan, 7–10 March 2022; pp. 284–293. [Google Scholar]
- Liu, S.; Wang, X.V.; Wang, L. Digital twin-enabled advance execution for human-robot collaborative assembly. CIRP Ann. 2022, 71, 25–28. [Google Scholar] [CrossRef]
- Tuli, T.B.; Manns, M.; Zeller, S. Human motion quality and accuracy measuring method for human–robot physical interactions. Intell. Serv. Robot. 2022, 15, 1–10. [Google Scholar] [CrossRef]
- Aivaliotis, S.; Lotsaris, K.; Gkournelos, C.; Fourtakas, N.; Koukas, S.; Kousi, N.; Makris, S. An augmented reality software suite enabling seamless human robot interaction. Int. J. Comput. Integr. Manuf. 2022, 35, 1–27. [Google Scholar] [CrossRef]
- Lotsaris, K.; Fousekis, N.; Koukas, S.; Aivaliotis, S.; Kousi, N.; Michalos, G.; Makris, S. Augmented reality (ar) based framework for supporting human workers in flexible manufacturing. Procedia CIRP 2021, 96, 301–306. [Google Scholar] [CrossRef]
- Malik, A.A.; Brem, A. Digital twins for collaborative robots: A case study in human-robot interaction. Robot. Comput. Integr. Manuf. 2021, 68, 102092. [Google Scholar] [CrossRef]
- Wang, L.; Liu, S.; Cooper, C.; Wang, X.V.; Gao, R.X. Function block-based human-robot collaborative assembly driven by brainwaves. CIRP Ann. 2021, 70, 5–8. [Google Scholar] [CrossRef]
- Rebenitsch, L.; Owen, C. Estimating cybersickness from virtual reality applications. Virtual Real. 2021, 25, 165–174. [Google Scholar] [CrossRef]
- Vosniakos, G.C.; Ouillon, L.; Matsas, E. Exploration of two safety strategies in human-robot collaborative manufacturing using Virtual Reality. Procedia Manuf. 2019, 38, 524–531. [Google Scholar] [CrossRef]
- Montalvo, W.; Bonilla-Vasconez, P.; Altamirano, S.; Garcia, C.A.; Garcia, M.V. Industrial Control Robot Based on Augmented Reality and IoT Protocol. In Proceedings of the International Conference on Augmented Reality, Virtual Reality and Computer Graphics, Virtual Event, 7–10 September 2020; Springer: Berlin/Heidelberg, Germany, 2020; pp. 345–363. [Google Scholar]
- Botev, J.; Rodríguez Lera, F.J. Immersive Telepresence Framework for Remote Educational Scenarios. In Proceedings of the International Conference on Human-Computer Interaction; Springer: Berlin/Heidelberg, Germany, 2020; pp. 373–390. [Google Scholar]
Reference | Year | Topics Focused on Each Paper | Differences with This Article
---|---|---|---
[13] | 2021 | |
[15] | 2022 | |
[14] | 2022 | |
Database | Search String |
---|---|
IEEE Xplore | (Unity OR “Unreal engine”) AND Robot AND Human AND (Interaction OR Collaboration) AND (“Virtual reality” OR “Augmented reality” OR “Mixed reality”) |
ACM Digital Library | [[All: unity] OR [All: “unreal engine”]] AND [Title: robot] AND [Abstract: human] AND [[Abstract: interaction] OR [Abstract: collaboration]] |
Springer Link | (Unity OR “Unreal engine”) AND Robot AND Human AND (Interaction OR Collaboration) AND (“Virtual reality” OR “Augmented reality” OR “Mixed reality”) NOT Medicine NOT Surgery AND (Social OR Industry OR Manufacturing), Content type: Article |
Science Direct | All: (Unity OR “Unreal engine”), Title and abstract: Robot AND Human AND (Interaction OR Collaboration OR Control), Subject areas: Engineering, Article type: Research articles |
Web of Science | Robot AND Human AND (Interaction OR Collaboration) AND (“Virtual reality” OR “Augmented reality” OR “Mixed reality”), Citation topics: Robotics, and Human-Computer Interaction, Document Types: Article |
Database | Search Results | Articles Included (Step 1) | Articles Excluded (Step 2 and QA) | Articles Finally Included
---|---|---|---|---
IEEE Xplore | 26 | 13 | 9 | 4
ACM Digital Library | 29 | 6 | 1 | 5
Springer Link | 137 | 11 | 8 | 3
Science Direct | 99 | 16 | 8 | 8
Web of Science | 117 | 23 | 18 | 4
Total | 408 | 69 | 44 | 24
Tool | Programming Languages | General Description and Main Capabilities/Functionalities |
---|---|---|
TouchScript | C# | An open-source framework that enables the use of basic gestures (e.g., press, release, tap, long press, flick, pinch/scale/rotate) in devices with touch input.
Point Cloud Library (PCL) | C++ | An open-source library for 2D/3D image and point cloud processing. Some of its main modules include visualization, segmentation, and registration of point clouds. |
PointNet | Python | A deep net architecture and library written on top of TensorFlow. It can be used to reason about 3D geometric data (e.g., point clouds and meshes). Its functionalities include part segmentation, object classification, and scene semantic parsing. |
Vuforia Engine SDK | C++, Objective-C++, Java, and C# | A framework for building Augmented Reality applications in Android, iOS, and Windows applications able to be executed on mobile devices and AR glasses. It provides free, academic, and premium plans. |
Google’s ARCore | Kotlin/Java, Unity/C#, Unreal/C++ | A framework for building Augmented Reality applications in Android, iOS, Unreal, and Unity. It uses motion tracking, environmental understanding, and light estimation to integrate virtual models with the real world.
Mixed Reality Toolkit | C# | A cross-platform framework providing a set of components and features that accelerate Mixed and Augmented Reality application development. Its functionalities include eye and hand tracking, spatial awareness, speech dictation, and XR device and game control management.
ZXing | Java | An open-source library for 1D/2D barcode image processing. Its functionalities include analysis and extraction of information from images containing barcodes or QR codes.
NuiTrack | C++, C#, and Python | A 3D body tracking middleware. This framework can interpret depth maps as 3D point clouds, detect floor planes and background objects, detect and track persons, perform skeleton and hand tracking, and perform face analysis. |
OpenCV | C++/C, Python, and Java | A cross-platform and general-purpose real-time computer vision library. Its functionalities include image acquisition from RGB cameras, image segmentation and filtering, object recognition, camera calibration, and changing image color spaces, among many others. |
Find_Object_2D | C++ | A ROS package with a Qt interface that implements different feature detectors and descriptors for calculating the 3D position of objects using OpenCV.
MoveIt | C++ and Python | A robot motion planning framework providing motion planning, manipulation, collision checking, 3D perception, kinematics, control, and navigation.
IAI Kinect | C++ | Collection of tools for enabling the use of Kinect with ROS using libfreenect2 (a driver for Kinect) |
RobCog-IAI | C++ | Provides Unreal Engine plugins for building robotics applications.
Newton VR | C# | A VR interaction system that enables picking up, throwing, and using objects with tracked controllers.
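Many of the systems surveyed in the application tables below connect a game engine to ROS through the Rosbridge Suite or ROS#, which exchange JSON-encoded operations over a WebSocket. The following is a minimal illustrative sketch of that wire protocol, not code from any surveyed system; the topic name and payload are hypothetical, and a real client would send these strings over a WebSocket connection to a running rosbridge server.

```python
import json


def advertise(topic, msg_type):
    # rosbridge 'advertise' op: declare intent to publish on a topic.
    return json.dumps({"op": "advertise", "topic": topic, "type": msg_type})


def publish(topic, msg):
    # rosbridge 'publish' op: carry one message payload to the bridge.
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})


def subscribe(topic, msg_type):
    # rosbridge 'subscribe' op: ask the bridge to forward a topic's messages.
    return json.dumps({"op": "subscribe", "topic": topic, "type": msg_type})


# Hypothetical gripper command, for illustration only.
frame = publish("/gripper/cmd", {"data": 0.5})
```

ROS# wraps this same JSON protocol in C# so Unity scripts can publish and subscribe without hand-building messages; the ROS-TCP-Connector instead uses its own TCP serialization.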
Year | Reference | XR Type | Robots | Devices | Communication Tools | Interaction Tools
---|---|---|---|---|---|---
2019 | [68] | VR | Pepper | HTC VIVE | TCP/IP | NAO SDK |
2021 | [60] | VR | QTRobot | Oculus Rift, iPhone | ROS and ROS bridge | NuiTrack, Find_Object_2D
2022 | [69] | VR | NAO | Oculus Rift | ROS, ROS# | NAO SDK |
Year | Reference | XR Type | Robots | Devices | Communication Tools | Interaction Tools
---|---|---|---|---|---|---
2019 | [48] | AR | UR5 | Kinect 2, projector | ROS, ROS# | TouchScript, Point Cloud Library |
2019 | [73] | AR | GripperBot, CamBot, ArmBot | Oculus Rift and IR-LED sensors, ZED stereo camera | ROS, ROS# | N.A.
2020 | [74] | MR | KUKA iiwa, UR10e | HoloLens | ROS (no details on how Unity and ROS are connected) | MoveIt, Mixed Reality Toolkit |
2020 | [55] | MR/AR | OpenManipulator-X | Smartphone | Bluetooth | Google’s ARCore |
2021 | [75] | AR | UR5 and ABB IRB 2600 | HoloLens 2 | ROS, ROS#, TCP/IP sockets | N.A. |
2021 | [58] | AR | UR10 | HoloLens | ROS, ROS# | ZXing, Mixed Reality Toolkit, MoveIt |
Year | Reference | XR Type | Robots | Devices | Communication Tools | Interaction Tools
---|---|---|---|---|---|---
2020 | [79] | AR | Kuka KR6 r900 | Realsense D435, Gamepad, HoloLens | ROS and ROS bridge | Mixed Reality Toolkit |
2020 | [52] | VR | Baxter | Kinect, HTC VIVE | ROS, ROS2Unity, ROS# | PointNet, IAI Kinect, libfreenect2, PCL |
2021 | [47] | VR | COMAU Dual Arm | Oculus Rift | ROS, ROS#, Mirror | Moveit, Open Motion Planning Library |
2021 | [80] | VR | UR5 | HTC VIVE Pro, ZED-mini, Intel RealSense | ROS and ROS bridge, TCP sockets | RobCog-IAI
2022 | [81] | MR | UR5 | Camera, Kinect, HTC VIVE | ROS and ROS bridge | OpenCV |
Year | Reference | XR Type | Robots | Devices | Communication Tools | Interaction Tools
---|---|---|---|---|---|---
2018 | [83] | AR | Turtlebot | Tablet | ROS (no details on how Unity and ROS are connected) | Vuforia SDK |
2020 | [46] | AR | COMAU AURA, COMAU Racer5 | HoloLens, Smartwatch, Projector, Camera | ROS Bridge and Mirror | Microsoft Speech recognition library
2020 | [84] | AR | KUKA KR6 R7000 Sixx | HoloLens | ROS and ROS bridge | Vuforia SDK
2021 | [85] | VR | Franka Emika | N.A. | ROS, ROS# | N.A.
2021 | [86] | AR | Turtlebot2 | Tablet | ROS, ROS# | N.A. |
2021 | [37] | VR/AR | Turtlebot2, Turtlebot3, TIAGo, among others | Oculus Rift | ROS, ROS Bridge, TCP/IP, Photon | NewtonVR |
2022 | [87] | VR | Scitos G5 mobile robot | HoloLens 2, Azure Kinect | ROS, ROS# | Mixed Reality Toolkit |
2022 | [88] | MR/AR | ABB industrial robot | HoloLens | FB network | N.A.
2022 | [89] | VR | Universal Robot | Xsens motion, HTC VIVE | ROS, ROS# | N.A.
2022 | [90] | AR | Mobile dual-arm platform | HoloLens 2 | ROS, ROS# | Mixed Reality Toolkit |
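Several entries in the tables above report plain TCP/IP sockets, rather than a robotic middleware, as the link between the game engine and the robot controller. A recurring detail in such designs is message framing, since TCP delivers a byte stream with no message boundaries. The sketch below is a generic illustration under that assumption, not code from any surveyed system: it length-prefixes a JSON payload with a 4-byte big-endian header, and the field names are hypothetical.

```python
import json
import struct


def encode_frame(payload):
    """Serialize a dict to JSON and prefix it with a 4-byte big-endian length."""
    body = json.dumps(payload).encode("utf-8")
    return struct.pack(">I", len(body)) + body


def decode_frame(data):
    """Decode one length-prefixed frame; return (payload, remaining bytes)."""
    (length,) = struct.unpack(">I", data[:4])
    body = data[4:4 + length]
    return json.loads(body.decode("utf-8")), data[4 + length:]


# Hypothetical pose update from an HMD client to a robot-side server.
wire = encode_frame({"joint": [0.0, 1.57, 0.0], "gripper": "open"})
msg, rest = decode_frame(wire)
```

On the receiving side, a reader would first collect the 4-byte length, then exactly that many payload bytes, before decoding; this avoids the partial-read bugs that arise when JSON strings are written to a socket unframed.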
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Coronado, E.; Itadera, S.; Ramirez-Alpizar, I.G. Integrating Virtual, Mixed, and Augmented Reality to Human–Robot Interaction Applications Using Game Engines: A Brief Review of Accessible Software Tools and Frameworks. Appl. Sci. 2023, 13, 1292. https://doi.org/10.3390/app13031292