UX in AR-Supported Industrial Human–Robot Collaborative Tasks: A Systematic Review
Abstract
1. Introduction
- Augmented reality, as defined by Azuma et al., “supplements the real world with virtual (computer-generated) objects that appear to coexist in the same space as the real world” [4];
- Virtual reality implies full immersion in a fictitious, digitally generated world that completely shuts out the physical world [5];
- Mixed reality combines both of the previous technologies while enabling close interaction between the digital and physical worlds. Thus, the user’s interaction with the computer-generated environment provides feedback and vice versa [6].
- The human-centered view is primarily concerned “with how a robot can fulfil its task specification in a manner that is acceptable and comfortable to humans”;
- The robot-centered view “emphasizes the view of a robot as a creature, i.e., an autonomous entity that is pursuing its own goals based on its motivations, drives and emotions, whereby interaction with people serves to fulfil some of its ‘needs’”;
- The robot-cognition view considers “the robot as an intelligent system (in a traditional AI sense), i.e., a machine that makes decisions on its own and solves problems it faces as part of the tasks it needs to perform in a particular application domain.”
- A systematic review of AR-supported applications for human–robot collaborative tasks in industry, focusing on human aspects. As a result, the reader can understand whether and how UX approaches are currently adopted in the design of AR-supported collaborative solutions, as well as the main benefits and challenges of applying UX methods in this field;
- A UX-driven framework to design user-centric AR interfaces for industrial HRI, also discussing the main potential future developments, after having revealed the lack of such a structured framework in the literature.
2. Methodology
2.1. Systematic Literature Review
- Population consists of AR-supported industrial collaborative tasks;
- Intervention involves the HCD and UX approaches used to design AR applications for industrial collaborative tasks;
- Comparison can be made with current design approaches and similar set-ups;
- Outcomes can be measured in terms of common Key Performance Indicators (KPIs), such as time to complete the operation, cognitive demand, or physical workload;
- Context includes industrial human–robot applications.
2.2. Research Questions
- Q1: What are the state-of-the-art UX approaches in AR-supported collaborative solutions?
- Q2: What are the main benefits of adopting UX approaches in designing AR-supported collaborative solutions?
- Q3: What are the main challenges in designing AR-supported collaborative solutions?
2.3. Search and Selection Process
- Typology: the study considers articles in international journals, papers in conference proceedings, or books;
- Topics: the study contains the keywords “augmented reality” + “human robot interaction” or “human robot collaboration” + “user experience” or “user interface”. The search was applied to the “Title”, “Abstract”, and “Keywords” (TAK) fields. The term “Mixed Reality” was not included since it subsumes both AR and VR;
- Year: the study was not limited in terms of publication year.
- Language: the paper is not written in English;
- Scope: the paper is out of scope, focusing on a different research domain;
- Accessibility: the paper is not available.
- QC1: It reflects the quality of the journal in which the paper is published, where Qi refers to the quartile score according to the Scimago Journal Rank [32]. A score of 1 was assigned to Q1 journals, 0.5 to Q2, and 0.25 to Q3. If the journal belongs to Q4, does not yet belong to a specific quartile, or the paper is part of conference proceedings, a “/” is assigned, counting as 0;
- QC2: It reflects the relevance of the specific paper. A value of 1 is assigned if the paper has “User Experience”, “User Interface”, or “Human-centered Design” among its keywords. This choice was made to further understand whether the paper was intended to be searchable under UX-, HCD-, or UI-related topics;
- QC3: It reflects the citation impact. It compares the number of total citations of the paper (c) with the number of citations of the most cited paper (mc) among those included in the review. Admittedly, this criterion is not quantitatively meaningful for the most recent works, but it helps to identify the works most widely recognized by the scientific community. As a consequence, a final score ranging from 0 to 1 is determined for each paper (i) included in the review: QC3_i = c_i/mc.
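The three criteria sum to the overall quality score reported in the review table (e.g., 1 + 1 + 0.53 = 2.53 for the highest-ranked paper). The scheme can be sketched in a few lines of Python; the function name and argument layout are illustrative, not taken from the paper:

```python
def quality_score(quartile, has_ux_keyword, citations, max_citations):
    """Illustrative sketch of the paper-quality score QC1 + QC2 + QC3.

    quartile: Scimago quartile (1-4), or None for unranked journals
              and conference proceedings.
    """
    # QC1: journal quartile (Q4, unranked journals, and proceedings count as 0).
    qc1 = {1: 1.0, 2: 0.5, 3: 0.25}.get(quartile, 0.0)
    # QC2: 1 if "User Experience", "User Interface", or
    # "Human-centered Design" appears among the paper keywords.
    qc2 = 1.0 if has_ux_keyword else 0.0
    # QC3: citation ratio c_i / mc, ranging from 0 to 1.
    qc3 = citations / max_citations
    return qc1 + qc2 + qc3

# Example: a Q1 journal paper with a UX keyword and citation ratio 0.53
# scores 1 + 1 + 0.53 = 2.53, matching the top entry of the quality table.
```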
3. Review Results
3.1. What Are the State-of-the-Art UX Approaches in AR-Supported Collaborative Solutions?
- User testing is usually based on the collection of deconstructed data regarding device or interface usability, system likability, cognitive and physical workload, or the overall subjective sense of safety in performing the selected operation, without a robust reference model;
- Although growing attention is being paid to multimodal interfaces to optimize HRI, this trend is not yet mature enough to enhance human sensory capabilities by integrating different sensors (e.g., force/torque sensors, microphones, cameras, smartwatches, and AR glasses);
- AR application design does not consider the user perspective and does not help improve the ease of use of industrial workplaces by avoiding uncomfortable conditions (e.g., extra lighting and noise).
3.2. What Are the Main Benefits of Adopting UX Approaches in Designing AR-Supported Collaborative Solutions?
3.3. What Are the Main Challenges in Designing AR-Supported Collaborative Solutions?
4. Discussion on Review Results
- Requirements Gathering;
- AR Interface Design and Prototyping;
- UX Assessment.
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- UNI EN ISO 9241-210:2019. Ergonomics of Human-System Interaction—Part 210: Human-Centred Design for Interactive Systems; ISO: Geneva, Switzerland, 2019.
- UNI EN ISO 9241-11:2018. Ergonomics of Human-System Interaction—Part 11: Usability: Definitions and Concepts; ISO: Geneva, Switzerland, 2018.
- Romero, D.; Stahre, J.; Wuest, T.; Noran, O.; Bernus, P.; Fasth, A.; Gorecky, D. Towards an operator 4.0 typology: A human-centric perspective on the fourth industrial revolution technologies. In Proceedings of the International Conference on Computers & Industrial Engineering (CIE46), Tianjin, China, 29–31 October 2017; pp. 1–11.
- Azuma, R.; Baillot, Y.; Behringer, R.; Feiner, S.; Julier, S.; MacIntyre, B. Recent advances in augmented reality. IEEE Comput. Graph. Appl. 2001, 21, 34–47.
- Peruzzini, M.; Grandi, F.; Cavallaro, S.; Pellicciari, M. Using virtual manufacturing to design human-centric factories: An industrial case. Int. J. Adv. Manuf. Technol. 2020, 115, 873–887.
- Milgram, P. A Taxonomy of Mixed Reality Visual Displays. IEICE Trans. Inf. Syst. 1994, 77, 1321–1329.
- Krauß, M.; Leutert, F.; Scholz, M.R.; Fritscher, M.; Heß, R.; Lilge, C.; Schilling, K. Digital Manufacturing for Smart Small Satellites Systems. Procedia Comput. Sci. 2021, 180, 150–161.
- Hietanen, A.; Pieters, R.; Lanz, M.; Latokartano, J.; Kämäräinen, J.-K. AR-based interaction for human-robot collaborative manufacturing. Robot. Comput. Manuf. 2020, 63, 101891.
- Pacaux-Lemoine, M.-P.; Flemisch, F. Layers of Shared and Cooperative Control, assistance and automation. IFAC-PapersOnLine 2016, 49, 159–164.
- Prati, E.; Peruzzini, M.; Pellicciari, M.; Raffaeli, R. How to include User eXperience in the design of Human-Robot Interaction. Robot. Comput. Manuf. 2021, 68, 102072.
- Dautenhahn, K. Socially intelligent robots: Dimensions of human–robot interaction. Philos. Trans. R. Soc. B Biol. Sci. 2007, 362, 679–704.
- UNI EN ISO 10218-1:2012. Robots and Robotic Devices—Safety Requirements for Industrial Robots—Part 1: Robots; ISO: Geneva, Switzerland, 2012.
- ISO/TS 15066:2016. Robots and Robotic Devices—Collaborative Robots; ISO: Geneva, Switzerland, 2016.
- Avalle, G.; De Pace, F.; Fornaro, C.; Manuri, F.; Sanna, A. An Augmented Reality System to Support Fault Visualization in Industrial Robotic Tasks. IEEE Access 2019, 7, 132343–132359.
- Wöhle, L.; Gebhard, M. Towards Robust Robot Control in Cartesian Space Using an Infrastructureless Head- and Eye-Gaze Interface. Sensors 2021, 21, 1798.
- Rosen, E.; Whitney, D.; Phillips, E.; Chien, G.; Tompkin, J.; Konidaris, G.; Tellex, S. Communicating and controlling robot arm motion intent through mixed-reality head-mounted displays. Int. J. Robot. Res. 2019, 38, 1513–1526.
- Liu, H.; Wang, L. An AR-based Worker Support System for Human-Robot Collaboration. Procedia Manuf. 2017, 11, 22–30.
- De Pace, F.; Manuri, F.; Sanna, A.; Fornaro, C. A systematic review of Augmented Reality interfaces for collaborative industrial robots. Comput. Ind. Eng. 2020, 149, 106806.
- Michalos, G.; Karagiannis, P.; Makris, S.; Tokçalar, Ö.; Chryssolouris, G. Augmented Reality (AR) Applications for Supporting Human-robot Interactive Cooperation. Procedia CIRP 2016, 41, 370–375.
- Bolano, G.; Juelg, C.; Roennau, A.; Dillmann, R. Transparent Robot Behavior Using Augmented Reality in Close Human-Robot Interaction. In Proceedings of the 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), New Delhi, India, 14–18 October 2019.
- Andersen, R.S.; Madsen, O.; Moeslund, T.B.; Ben Amor, H. Projecting robot intentions into human environments. In Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA, 26–31 August 2016; pp. 294–301.
- Geng, J.; Song, X.; Pan, Y.; Tang, J.; Liu, Y.; Zhao, D.; Ma, Y. A systematic design method of adaptive augmented reality work instruction for complex industrial operations. Comput. Ind. 2020, 119, 103229.
- Grandi, F.; Khamaisi, R.; Peruzzini, M.; Raffaeli, R.; Pellicciari, M. A Reference Framework to Combine Model-Based Design and AR to Improve Social Sustainability. Sustainability 2021, 13, 2031.
- Chakravorty, A.; Rowe, A. UX design principles for mobile augmented reality applications. In Proceedings of the International Conferences on Interfaces and Human Computer Interaction 2018 (MCCSIS 2018), Madrid, Spain, 17–20 July 2018; IADIS Publications: Lisbon, Portugal, 2018; pp. 319–323.
- Cauchi, M.; Scerri, D. Enriching Tourist UX via a Location Based AR Treasure Hunt Game. In Proceedings of the 2019 IEEE 9th International Conference on Consumer Electronics (ICCE-Berlin), Berlin, Germany, 8–11 September 2019; pp. 199–204.
- Lankes, M.; Stiglbauer, B. GazeAR: Mobile Gaze-Based Interaction in the Context of Augmented Reality Games. Adv. Auton. Robot. 2016, 9768, 397–406.
- Law, E.L.-C.; Heintz, M. Augmented reality applications for K-12 education: A systematic review from the usability and user experience perspective. Int. J. Child-Comput. Interact. 2021, 30, 100321.
- Rusu, C.; Rusu, V.; Roncagliolo, S.; González, C.S. Usability and User Experience. Int. J. Inf. Technol. Syst. Approach 2015, 8, 1–12.
- Alenljung, B.; Lindblom, J.; Andreasson, R.; Ziemke, T. User Experience in Social Human-Robot Interaction. Int. J. Ambient. Comput. Intell. 2017, 8, 12–31.
- del Amo, I.F.; Galeotti, E.; Palmarini, R.; Dini, G.; Erkoyuncu, J.A.; Roy, R. An innovative user-centred support tool for Augmented Reality maintenance systems design: A preliminary study. Procedia CIRP 2018, 70, 362–367.
- Booth, A.; Sutton, A.; Papaioannou, D. Systematic Approaches to a Successful Literature Review, 1st ed.; Sage Publications Ltd.: London, UK, 2016; pp. 245–267.
- Scimago.com. Available online: https://www.scimagojr.com/ (accessed on 26 May 2021).
- Romero, D.; Stahre, J.; Taisch, M. The Operator 4.0: Towards socially sustainable factories of the future. Comput. Ind. Eng. 2020, 139, 106128.
- Papanastasiou, S.; Kousi, N.; Karagiannis, P.; Gkournelos, C.; Papavasileiou, A.; Dimoulas, K.; Baris, K.; Koukas, S.; Michalos, G.; Makris, S. Towards seamless human robot collaboration: Integrating multimodal interaction. Int. J. Adv. Manuf. Technol. 2019, 105, 3881–3897.
- Huy, D.Q.; Vietcheslav, I.; Lee, G.S.G. See-through and spatial augmented reality—A novel framework for human-robot interaction. In Proceedings of the 2017 3rd International Conference on Control, Automation and Robotics (ICCAR), Nagoya, Japan, 24–26 April 2017; pp. 719–726.
- Materna, Z.; Kapinus, M.; Beran, V.; Smrž, P.; Zemčík, P. Interactive Spatial Augmented Reality in Collaborative Robot Programming: User Experience Evaluation. In Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 27–31 August 2018; pp. 80–87.
- Aschenbrenner, D.; Li, M.; Dukalski, R.; Verlinden, J.; Lukosch, S. Collaborative Production Line Planning with Augmented Fabrication. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Tuebingen/Reutlingen, Germany, 18–22 March 2018; pp. 509–510.
- De Tommaso, D.; Calinon, S.; Caldwell, D.G. A Tangible Interface for Transferring Skills. Int. J. Soc. Robot. 2012, 4, 397–408.
- Bazzano, F.; Gentilini, F.; Lamberti, F.; Sanna, A.; Paravati, G.; Gatteschi, V.; Gaspardone, M. Immersive Virtual Reality-Based Simulation to Support the Design of Natural Human-Robot Interfaces for Service Robotic Applications. Lect. Notes Comput. Sci. 2016, 9768, 33–51.
- Cao, Y.; Wang, T.; Qian, X.; Rao, P.S.; Wadhawan, M.; Huo, K.; Ramani, K. GhostAR: A Time-space Editor for Embodied Authoring of Human-Robot Collaborative Task with Augmented Reality. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, New Orleans, LA, USA, 20–23 October 2019; pp. 521–534.
- Materna, Z.; Kapinus, M.; Beran, V.; Smrž, P.; Giuliani, M.; Mirnig, N.; Stadler, S.; Stollnberger, G.; Tscheligi, M. Using Persona, Scenario, and Use Case to Develop a Human-Robot Augmented Reality Collaborative Workspace. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6–9 March 2017; pp. 201–202.
- Kyjanek, O.; Al Bahar, B.; Vasey, L.; Wannemacher, B.; Menges, A. Implementation of an Augmented Reality AR Workflow for Human Robot Collaboration in Timber Prefabrication. In Proceedings of the 36th International Symposium on Automation and Robotics in Construction (ISARC), Banff, AB, Canada, 21–24 May 2019; pp. 1223–1230.
- Leutert, F.; Herrmann, C.; Schilling, K. A Spatial Augmented Reality system for intuitive display of robotic data. In Proceedings of the 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Tokyo, Japan, 3–6 March 2013; pp. 179–180.
- Ji, Z.; Liu, Q.; Xu, W.; Yao, B.; Hu, Y.; Feng, H.; Zhou, Z. Augmented reality-enabled intuitive interaction for industrial human-robot collaboration. In Advanced Human-Robot Collaboration in Manufacturing; Springer: Cham, Switzerland, 2021; pp. 395–411.
- Frank, J.A.; Moorhead, M.; Kapila, V. Realizing mixed-reality environments with tablets for intuitive human-robot collaboration for object manipulation tasks. In Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA, 26–31 August 2016; pp. 302–307.
- Green, S.A.; Chase, J.G.; Chen, X.; Billinghurst, M. Evaluating the augmented reality human-robot collaboration system. Int. J. Intell. Syst. Technol. Appl. 2010, 8, 130.
- Jones, B.; Zhang, Y.; Wong, P.N.Y.; Rintel, S. VROOM: Virtual Robot Overlay for Online Meetings. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–10.
- Xin, M.; Sharlin, E. Sheep and wolves: Test bed for human-robot interaction. In Proceedings of the CHI ’06 Extended Abstracts on Human Factors in Computing Systems, Montréal, QC, Canada, 22–27 April 2006; pp. 1553–1558.
- Fuste, A.; Reynolds, B.; Hobin, J.; Heun, V. Kinetic AR: A Framework for Robotic Motion Systems in Spatial Computing. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–8.
- Chan, W.P.; Hanks, G.; Sakr, M.; Zuo, T.; Van der Loos, H.M.; Croft, E. An Augmented Reality Human-Robot Physical Collaboration Interface Design for Shared, Large-Scale, Labour-Intensive Manufacturing Tasks. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020; pp. 11308–11313.
- Diehl, M.; Plopski, A.; Kato, H.; Ramirez-Amaro, K. Augmented Reality interface to verify Robot Learning. In Proceedings of the 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 31 August–4 September 2020; pp. 378–383.
- Scafà, M.; Serrani, E.B.; Papetti, A.; Brunzini, A.; Germani, M. Assessment of Students’ Cognitive Conditions in Medical Simulation Training: A Review Study. In Advances in Intelligent Systems and Computing; Springer International Publishing: Cham, Switzerland, 2017; pp. 224–233.
- Brooke, J. SUS: A ’Quick and Dirty’ Usability Scale. Usability Eval. Ind. 1996, 189, 4–7.
- AttrakDiff. Available online: http://attrakdiff.de/index-en.html (accessed on 5 July 2021).
- Hone, K.S.; Graham, R. Towards a tool for the Subjective Assessment of Speech System Interfaces (SASSI). Nat. Lang. Eng. 2000, 6, 287–303.
- Palmarini, R.; Erkoyuncu, J.A.; Roy, R.; Torabmostaedi, H. A systematic review of augmented reality applications in maintenance. Robot. Comput. Manuf. 2018, 49, 215–228.
- Quintero, C.P.; Li, S.; Pan, M.K.; Chan, W.P.; Van der Loos, H.M.; Croft, E. Robot Programming Through Augmented Trajectories in Augmented Reality. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1838–1844.
- Liu, H.; Zhang, Y.; Si, W.; Xie, X.; Zhu, Y.; Zhu, S.-C. Interactive Robot Knowledge Patching Using Augmented Reality. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 13 September 2018; pp. 1947–1954.
- Bloomberg. Available online: https://www.bloomberg.com/press-releases/2021-02-02/-125-billion-growth-in-global-augmented-reality-ar-and-virtual-reality-vr-market-2020-2024-apac-to-emerge-as-major-market (accessed on 2 July 2021).
- Why We Believe VR/AR will Boost Global GDP by $1.5 Trillion. Available online: https://www.pwc.co.uk/services/economics/insights/vr-ar-to-boost-global-gdp.html (accessed on 2 July 2021).
- Masood, T.; Egger, J. Augmented reality in support of Industry 4.0—Implementation challenges and success factors. Robot. Comput. Manuf. 2019, 58, 181–195.
- Gonçalves, E.M.N.; Freitas, A.; Botelho, S. An AutomationML Based Ontology for Sensor Fusion in Industrial Plants. Sensors 2019, 19, 1311.
- Romero, D.; Bernus, P.; Noran, O.; Stahre, J.; Berglund, Å.F. The operator 4.0: Human cyber-physical systems & adaptive automation towards human-automation symbiosis work systems. In IFIP Advances in Information and Communication Technology; Springer: New York, NY, USA, 2016; Volume 488, pp. 677–686.
- Woo, J.; Ohyama, Y.; Kubota, N. Robot Partner Development Platform for Human-Robot Interaction Based on a User-Centered Design Approach. Appl. Sci. 2020, 10, 7992.
- Palmarini, R.; del Amo, I.F.; Bertolino, G.; Dini, G.; Erkoyuncu, J.A.; Roy, R.; Farnsworth, M. Designing an AR interface to improve trust in Human-Robots collaboration. Procedia CIRP 2018, 70, 350–355.
- Still, B.; Crane, K. Fundamentals of User-Centered Design: A Practical Approach, 1st ed.; CRC Press: Boca Raton, FL, USA, 2017.
- Prati, E.; Grandi, F.; Peruzzini, M. Usability Testing on Tractor’s HMI: A Study Protocol. In Lecture Notes in Computer Science; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2021.
- Albert, W.; Tullis, T. Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics, 2nd ed.; Elsevier Inc.: Amsterdam, The Netherlands, 2013.
- Minge, M.; Thüring, M.; Wagner, I.; Kuhr, C.V. The meCUE Questionnaire: A Modular Tool for Measuring User Experience. Adv. Intell. Syst. Comput. 2017, 486, 115–128.
- Peruzzini, M.; Grandi, F.; Pellicciari, M. Benchmarking of Tools for User Experience Analysis in Industry 4.0. Procedia Manuf. 2017, 11, 806–813.
Search String | Database | Date | Found |
---|---|---|---|
TITLE-ABS-KEY ((augmented AND reality) AND (human AND robot AND interaction OR human AND robot AND collaboration) AND (user AND experience OR user AND interface)) | Scopus | 30/04/2021 | 27 |
Exclusion Criteria | Found | ||
Language | 27 | ||
Scope | 23 | ||
Accessibility | 21 |
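The sequential screening summarized above — 27 papers found on Scopus, reduced to 21 after the three exclusion criteria — can be expressed as a simple filter pipeline. The sketch below is illustrative; the record fields `language`, `in_scope`, and `accessible` are assumptions for this example, not the authors’ actual data model:

```python
def screen(records):
    """Apply the three exclusion criteria in order, tracking the
    number of papers remaining after each step (illustrative sketch)."""
    counts = {}
    # Language: exclude papers not written in English.
    records = [r for r in records if r["language"] == "en"]
    counts["Language"] = len(records)
    # Scope: exclude papers focusing on a different research domain.
    records = [r for r in records if r["in_scope"]]
    counts["Scope"] = len(records)
    # Accessibility: exclude papers that are not available.
    records = [r for r in records if r["accessible"]]
    counts["Accessibility"] = len(records)
    return records, counts
```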
Paper | Year of Publication | Publication Destination | QC1 | QC2 | QC3 | Quality |
---|---|---|---|---|---|---|
Hietanen, A., Pieters, R., Lanz, M., Latokartano, J., Kämäräinen, J.-K. [8] | 2020 | Journal | 1 | 1 | 0.53 | 2.53 |
Papanastasiou, S., Kousi, N., Karagiannis, P., Gkournelos, C., Papavasileiou, A., Dimoulas, K., Baris, K., Koukas, S., Michalos, G., Makris, S. [34] | 2019 | Journal | 1 | 1 | 0.53 | 2.53 |
De Pace, F., Manuri, F., Sanna, A., Fornaro, C. [18] | 2020 | Journal | 1 | 1 | 0 | 2 |
Huy, D.Q., Vietcheslav, I., Gerald, S.G.L. [35] | 2017 | Int. Conference | / | 1 | 0.33 | 1.33 |
Materna, Z., Kapinus, M., Beran, V., Smrž, P., Zemčík, P. [36] | 2018 | Int. Conference | / | 1 | 0.26 | 1.26 |
Aschenbrenner, D., Li, M., Dukalski, R., Verlinden, J., Lukosch, S. [37] | 2018 | Int. Conference | / | 1 | 0.26 | 1.26 |
de Tommaso, D., Calinon, S., Caldwell, D.G. [38] | 2012 | Journal | 1 | 0 | 0.13 | 1.13 |
Bazzano, F., Gentilini, F., Lamberti, F., Sanna, A., Paravati, G., Gatteschi, V., Gaspardone, M. [39] | 2016 | Journal | / | 1 | 0.13 | 1.13 |
Cao, Y., Wang, T., Qian, X., Rao, P.S., Wadhawan, M., Huo, K., Ramani, K. [40] | 2019 | Int. Conference | / | 1 | 0.1 | 1.1 |
Materna, Z., Kapinus, M., Beran, V., Smrž, P., Giuliani, M., Mirnig, N., Stadler, S., Stollnberger, G., Tscheligi, M. [41] | 2017 | Int. Conference | / | 1 | 0.06 | 1.06 |
Kyjanek, O., Al Bahar, B., Vasey, L., Wannemacher, B., Menges, A. [42] | 2019 | Int. Conference | / | 1 | 0.03 | 1.03 |
Leutert, F., Herrmann, C., Schilling, K. [43] | 2013 | Int. Conference | / | 0 | 1 | 1 |
Ji, Z., Liu, Q., Xu, W., Yao, B., Hu, Y., Feng, H., Zhou, Z. [44] | 2019 | Int. Conference | / | 1 | 0 | 1 |
Frank, J.A., Moorhead, M., Kapila, V. [45] | 2016 | Int. Conference | / | 0 | 0.83 | 0.83 |
Green, S.A., Chase, J.G., Chen, X.Q., Billinghurst, M. [46] | 2010 | Journal | / | 0 | 0.56 | 0.56 |
Jones, B., Zhang, Y., Wong, P.N.Y., Rintel, S. [47] | 2020 | Int. Conference | / | 0 | 0.03 | 0.03 |
Xin, M., Sharlin, E. [48] | 2006 | Int. Conference | / | 0 | 0.2 | 0.2 |
Fuste, A., Reynolds, B., Hobin, J., Heun, V. [49] | 2020 | Int. Conference | / | 0 | 0 | 0 |
Chan, W.P., Hanks, G., Sakr, M., Zuo, T., Machiel Van Der Loos, H.F., Croft, E. [50] | 2020 | Int. Conference | / | 0 | 0 | 0 |
Krauß, M., Leutert, F., Scholz, M.R., Fritscher, M., Heß, R., Lilge, C., Schilling, K. [6] | 2021 | Journal | / | 0 | 0 | 0 |
Diehl, M., Plopski, A., Kato, H., Ramirez-Amaro, K. [51] | 2020 | Int. Conference | / | 0 | 0 | 0 |
Paper | Benefits | Adopted UX Tools | Area of Application |
---|---|---|---|
J. A. Frank, M. Moorhead, and V. Kapila [45] | End-user’s intentions understanding to reduce operator cognitive burden | Custom questionnaire | Object manipulation |
W. P. Chan, G. Hanks, M. Sakr, T. Zuo, H. F. Machiel Van Der Loos, and E. Croft [50] | The system’s final application must be considered to prevent wrong choices in terms of interfaces and to avoid physical and cognitive repercussion on the user | NASA-TLX | Large-scale, labor-intensive manufacturing tasks |
C. P. Quintero, S. Li, M. K. Pan, W. P. Chan, H. F. Machiel Van Der Loos, and E. Croft [57] | Reducing robots’ programming operation time and cognitive demand | Custom questionnaire | Robot programming |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Khamaisi, R.K.; Prati, E.; Peruzzini, M.; Raffaeli, R.; Pellicciari, M. UX in AR-Supported Industrial Human–Robot Collaborative Tasks: A Systematic Review. Appl. Sci. 2021, 11, 10448. https://doi.org/10.3390/app112110448