Human Factors in Human–Robot Interaction

A special issue of Robotics (ISSN 2218-6581). This special issue belongs to the section "Humanoid and Human Robotics".

Deadline for manuscript submissions: closed (31 December 2023) | Viewed by 36,217

Special Issue Editors


Dr. Weitian Wang
Guest Editor
Department of Computer Science, Montclair State University, Montclair, NJ, USA
Interests: collaborative robotics; human-robot interaction; smart cyber-physical systems; human factors and cognitive ergonomics in HRI; distributed robotics and control systems; artificial intelligence; multimodal sensing

Dr. Michael Bixter
Guest Editor
Department of Psychology, Montclair State University, Montclair, NJ, USA
Interests: psychology and cognitive science; decisions about delayed and risky rewards

Prof. Dr. Quanjun Song
Guest Editor
Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
Interests: intelligent robotics; human–robot interaction

Special Issue Information

Dear Colleagues,

Robots are becoming part of people's everyday social lives. In the coming years, robots may serve as caretaking assistants for the elderly, academic tutors for our children, medical assistants, day care assistants, or psychological counsellors. Robots are, or soon will be, used in such critical domains as search and rescue, military operations, mine and bomb detection, scientific exploration, law enforcement, and hospital care. Such robots must coordinate their behaviors with the requirements and expectations of human team members; they are not mere tools but quasi-team members whose tasks have to be integrated with those of humans. Human factors in human–robot interaction (HRI), such as comfort and trust, are critical to enabling high-quality human–robot teamwork.

This Special Issue intends to collect and publish high-quality research papers and review articles discussing and addressing the design, development, and application of human-factors-related research questions in HRI. Studies that use computational modeling and socially interactive robots in different methodological roles to explore questions about social interaction and development with robots are welcome; relevant fields include, but are not limited to, psychology, cognitive science, HCI, human factors, artificial intelligence, and robotics.

Potential topics include but are not limited to the following:

  • User studies of HRI;
  • Experiments on HRI collaboration;
  • Human comfort and trust in HRI;
  • Decision making and emotion recognition in HRI;
  • Mixed initiative interaction;
  • Psychology modeling and application in HRI;
  • Multi-modal interaction;
  • Robot autonomy;
  • Cognitive control for heterogeneous teaming;
  • HRI group dynamics;
  • HRI software architectures;
  • Task allocation and coordination;
  • Distributed cognition (or distributed cognitive systems);
  • Self-learning and healing of social robots;
  • Robot intermediaries;
  • Lifelike/humanoid robots;
  • Remote robots;
  • HRI communication;
  • Robot–team learning;
  • Assistive robotics;
  • Risks such as privacy or safety;
  • Awareness and monitoring of humans;
  • Implicit dialogue.

Dr. Weitian Wang
Dr. Michael Bixter
Prof. Dr. Quanjun Song
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Robotics is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (9 papers)


Research


30 pages, 3362 KiB  
Article
Got It? Comparative Ergonomic Evaluation of Robotic Object Handover for Visually Impaired and Sighted Users
by Dorothea Langer, Franziska Legler, Pia Diekmann, André Dettmann, Sebastian Glende and Angelika C. Bullinger
Robotics 2024, 13(3), 43; https://doi.org/10.3390/robotics13030043 - 5 Mar 2024
Viewed by 1881
Abstract
The rapidly growing research on the accessibility of digital technologies has focused on blind or visually impaired (BVI) users. However, the field of human–robot interaction has largely neglected the needs of BVI users despite the increasing integration of assistive robots into daily life and their potential benefits for our aging societies. One basic robotic capability is object handover. Robots assisting BVI users should be able to coordinate handovers without eye contact. This study gathered insights on the usability of human–robot handovers, including 20 BVI and 20 sighted participants. In a standardized experiment with a mixed design, a handover robot prototype equipped with a voice user interface and haptic feedback was evaluated. The robot handed over everyday objects (i) by placing them on a table and (ii) by allowing for midair grasping. The usability target was met, and all user groups reported a positive user experience. In total, 97.3% of all handovers were successful. The qualitative feedback showed an appreciation for the clear communication of the robot’s actions and the handover reliability. However, the duration of the handover was seen as a critical issue. According to all subjective criteria, the BVI participants showed higher variances compared to the sighted participants. Design recommendations for improving robotic handovers equally supporting both user groups are given. Full article

13 pages, 1457 KiB  
Article
Are Friendly Robots Trusted More? An Analysis of Robot Sociability and Trust
by Travis Kadylak, Megan A. Bayles and Wendy A. Rogers
Robotics 2023, 12(6), 162; https://doi.org/10.3390/robotics12060162 - 29 Nov 2023
Viewed by 2605
Abstract
Older individuals prefer to maintain their autonomy while maintaining social connection and engagement with their family, peers, and community. Though individuals can encounter barriers to these goals, socially assistive robots (SARs) hold the potential for promoting aging in place and independence. Such domestic robots must be trusted, easy to use, and capable of behaving within the scope of accepted social norms for successful adoption to scale. We investigated perceived associations between robot sociability and trust in domestic robot support for instrumental activities of daily living (IADLs). In our multi-study approach, we collected responses from adults aged 65 years and older using two separate online surveys (Study 1, N = 51; Study 2, N = 43). We assessed the relationship between perceived robot sociability and robot trust. Our results consistently demonstrated a strong positive relationship between perceived robot sociability and robot trust for IADL tasks. These data have design implications for promoting robot trust and acceptance of SARs for use in the home by older adults. Full article
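For illustration only, the core analysis described above, relating a perceived-sociability score to a trust score across respondents, can be sketched as a simple correlation on synthetic data; the variable names and values below are assumptions, not the study's materials.

```python
# Illustrative sketch: correlating perceived robot sociability with trust
# across survey respondents. Data and variable names are hypothetical.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_respondents = 51                                    # matches the Study 1 sample size
sociability = rng.uniform(1, 7, n_respondents)        # e.g., mean Likert rating per respondent
trust = 0.6 * sociability + rng.normal(0, 1.0, n_respondents)  # synthetic positive association

r, p = pearsonr(sociability, trust)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```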

12 pages, 2566 KiB  
Article
Human–Exoskeleton Interaction Force Estimation in Indego Exoskeleton
by Mohammad Shushtari and Arash Arami
Robotics 2023, 12(3), 66; https://doi.org/10.3390/robotics12030066 - 1 May 2023
Cited by 5 | Viewed by 2718
Abstract
Accurate interaction force estimation can play an important role in optimizing human–robot interaction in an exoskeleton. In this work, we propose a novel approach for the system identification of exoskeleton dynamics in the presence of interaction forces as a whole multibody system without imposing any constraints on the exoskeleton dynamics. We hung the exoskeleton through a linear spring and excited the exoskeleton joints with chirp commands while measuring the exoskeleton–environment interaction force. Several structures of neural networks were trained to model the exoskeleton passive dynamics and estimate the interaction force. Our testing results indicated that a deep neural network with 250 neurons and 10 time delays could obtain a sufficiently accurate estimation of the interaction force, resulting in an RMSE of 1.23 on Z-normalized applied torques and an adjusted R2 of 0.89. Full article
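A minimal sketch of this kind of data-driven identification, regressing interaction torque on a short window of past joint states with a feed-forward network, is given below. The array names, synthetic signals, and training split are illustrative assumptions and not the authors' implementation; only the 250-neuron hidden layer and 10 delays mirror figures quoted in the abstract.

```python
# Minimal sketch (not the authors' code): estimate exoskeleton-environment
# interaction torque from lagged joint states with a feed-forward network,
# in the spirit of a time-delay neural network.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

N_DELAYS = 10                                      # past samples fed to the network
rng = np.random.default_rng(0)

# Placeholder signals standing in for logged joint states and the measured,
# Z-normalized interaction torque.
joint_state = rng.standard_normal((5000, 4))       # e.g., 2 joints x (position, velocity)
interaction_torque = rng.standard_normal(5000)

def make_delay_features(x, n_delays):
    """Stack the current sample with n_delays past samples into one feature row."""
    return np.asarray([x[i - n_delays:i + 1].ravel() for i in range(n_delays, len(x))])

X = make_delay_features(joint_state, N_DELAYS)
y = interaction_torque[N_DELAYS:]

model = MLPRegressor(hidden_layer_sizes=(250,), max_iter=500, random_state=0)
model.fit(X[:4000], y[:4000])                      # train on the first part of the record
print("held-out R^2:", r2_score(y[4000:], model.predict(X[4000:])))
```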

10 pages, 1454 KiB  
Article
Human Factors Assessment of a Novel Pediatric Lower-Limb Exoskeleton
by Anthony C. Goo, Jason J. Wiebrecht, Douglas A. Wajda and Jerzy T. Sawicki
Robotics 2023, 12(1), 26; https://doi.org/10.3390/robotics12010026 - 9 Feb 2023
Cited by 3 | Viewed by 2978
Abstract
While several lower-limb exoskeletons have been designed for adult patients, there remains a lack of pediatric-oriented devices. This paper presented a human factor assessment of an adjustable pediatric lower-limb exoskeleton for childhood gait assistance. The hip and knee exoskeleton uses an adjustable frame for compatibility with children 6–11 years old. This assessment evaluates the device’s comfort and ease of use through timed donning, doffing, and reconfiguration tasks. The able-bodied study participants donned the device in 6 min and 8 s, doffed it in 2 min and 29 s, and reconfigured it in 8 min and 23 s. The results of the timed trials suggest that the exoskeleton can be easily donned, doffed, and reconfigured to match the anthropometrics of pediatric users. A 6-min unpowered walking experiment was conducted while the child participant wore the exoskeletal device. Inspection of both the device and participant yielded no evidence of damage to either the device or wearer. Participant feedback on the device was positive with a system usability scale rating of 80/100. While minor improvements can be made to the adjustability indicators and padding placement, the results indicate the exoskeleton is suitable for further experimental evaluation through assistive control assessments. Full article

16 pages, 2969 KiB  
Article
Transformable Wheelchair–Exoskeleton Hybrid Robot for Assisting Human Locomotion
by Ronnapee Chaichaowarat, Sarunpat Prakthong and Siri Thitipankul
Robotics 2023, 12(1), 16; https://doi.org/10.3390/robotics12010016 - 18 Jan 2023
Cited by 17 | Viewed by 6832
Abstract
This paper presents a novel wheelchair–exoskeleton hybrid robot that can transform between sitting and walking modes. The lower-limb exoskeleton uses planetary-geared motors to support the hip and knee joints. Meanwhile, the ankle joints are passive. The left and right wheel modules can be retracted to the lower legs of the exoskeleton to prepare for walking or stepping over obstacles. The chair legs are designed to form a stable sitting posture to avoid falling while traveling on smooth surfaces with low energy consumption. Skateboard hub motors are used as the front driving wheels along with the rear caster wheels. The turning radius trajectory as the result of differential driving was observed in several scenarios. For assisting sit-to-stand motion, the desired joint velocities are commanded by the user while the damping of the motors is set. For stand-to-sit motion, the equilibrium of each joint is set to correspond to the standing posture, while stiffness is adjusted on the basis of assistive levels. The joint torques supported by the exoskeleton were recorded during motion, and leg muscle activities were studied via surface electromyography for further improvement. Full article
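The stand-to-sit assistance described above, pulling each joint toward a standing-posture equilibrium with a stiffness scaled by the assistive level, follows a standard impedance-control pattern. A schematic single-joint sketch under assumed gains and signal names (not the authors' controller) is:

```python
# Schematic impedance-style assistance for one joint (illustrative only):
# torque pulls the joint toward an equilibrium angle, with adjustable stiffness
# and fixed damping. Gains, angles, and the assist_level scaling are assumptions.
def assistive_torque(q, q_dot, q_eq, assist_level, k_max=60.0, b=2.0):
    """Return joint torque in N*m for a stand-to-sit assist phase.

    q, q_dot     : current joint angle (rad) and velocity (rad/s)
    q_eq         : equilibrium angle set to the standing posture (rad)
    assist_level : 0..1 scaling of the maximum stiffness k_max
    """
    k = assist_level * k_max            # stiffness grows with the assistive level
    return k * (q_eq - q) - b * q_dot   # spring toward standing posture plus damping

# Example: knee flexed 0.6 rad away from standing, moving at 0.2 rad/s
print(assistive_torque(q=0.6, q_dot=0.2, q_eq=0.0, assist_level=0.5))
```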

24 pages, 3449 KiB  
Article
Digital Twin as Industrial Robots Manipulation Validation Tool
by Vladimir Kuts, Jeremy A. Marvel, Murat Aksu, Simone L. Pizzagalli, Martinš Sarkans, Yevhen Bondarenko and Tauno Otto
Robotics 2022, 11(5), 113; https://doi.org/10.3390/robotics11050113 - 18 Oct 2022
Cited by 12 | Viewed by 3630
Abstract
The adoption of Digital Twin (DT) solutions for industrial purposes is increasing among small- and medium-sized enterprises and is already being integrated into many large-scale companies. As there is an increasing need for faster production and shortening of the learning curve for new emerging technologies, Virtual Reality (VR) interfaces for enterprise manufacturing DTs seem to be a good solution. Furthermore, with the emergence of the Industry 5.0 (I5.0) paradigm, human operators will be increasingly integrated in the systems interfaces through advanced interactions, pervasive sensors, real time tracking and data acquisition. This scenario is especially relevant in collaborative automated systems where the introduction of immersive VR interfaces based on production cell DTs might provide a solution for the integration of the human factors in the modern industrial scenarios. This study presents experimental results of the comparison between users controlling a physical industrial robot system via a traditional teach pendant and a DT leveraging a VR user interface. The study group involves forty subjects including experts in robotics and VR as well as non-experts. An analysis of the data gathered in both the real and the virtual use case scenario is provided. The collected information includes time for performing a task with an industrial robot, stress level evaluation, physical and mental effort, and the human subjects’ perceptions of the physical and simulated robots. Additionally, operator gazes were tracked in the VR environment. In this study, VR interfaces in the DT representation are exploited to gather user centered metrics and validate efficiency and safety standards for modern collaborative industrial systems in I5.0. The goal is to evaluate how the operators perceive and respond to the virtual robot and user interface while interacting with them and detect if any degradation of user experience and task efficiency exists compared to the real robot interfaces. Results demonstrate that the use of DT VR interfaces is comparable to traditional teach pendants for the given task and might be a valuable substitute for physical interfaces. Despite improving the overall task performance and considering the higher stress levels detected while using the DT VR interface, further studies are necessary to provide a clearer validation of both interfaces and user impact assessment methods. Full article
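For readers unfamiliar with how such interface comparisons are typically analysed, a toy example of contrasting task-completion times between a teach-pendant condition and a DT VR condition with Welch's t-test is shown below; the data are synthetic and the analysis is not taken from the paper.

```python
# Toy example with synthetic data: comparing task completion times between a
# physical teach-pendant condition and a Digital Twin VR condition.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
time_pendant = rng.normal(120, 15, 20)   # seconds, 20 hypothetical trials per condition
time_dt_vr = rng.normal(125, 18, 20)

t, p = ttest_ind(time_pendant, time_dt_vr, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.3f}")
```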

Review


24 pages, 2758 KiB  
Review
What Affects Human Decision Making in Human–Robot Collaboration?: A Scoping Review
by Yuan Liu, Glenda Caldwell, Markus Rittenbruch, Müge Belek Fialho Teixeira, Alan Burden and Matthias Guertler
Robotics 2024, 13(2), 30; https://doi.org/10.3390/robotics13020030 - 9 Feb 2024
Cited by 2 | Viewed by 3639
Abstract
The advent of Industry 4.0 has heralded advancements in Human–robot Collaboration (HRC), necessitating a deeper understanding of the factors influencing human decision making within this domain. This scoping review examines the breadth of research conducted on HRC, with a particular focus on identifying factors that affect human decision making during collaborative tasks and finding potential solutions to improve human decision making. We conducted a comprehensive search across databases including Scopus, IEEE Xplore and ACM Digital Library, employing a snowballing technique to ensure the inclusion of all pertinent studies, and adopting the PRISMA Extension for Scoping Reviews (PRISMA-ScR) for the reviewing process. Some of the important aspects were identified: (i) studies’ design and setting; (ii) types of human–robot interaction, types of cobots and types of tasks; (iii) factors related to human decision making; and (iv) types of user interfaces for human–robot interaction. Results indicate that cognitive workload and user interface are key in influencing decision making in HRC. Future research should consider social dynamics and psychological safety, use mixed methods for deeper insights and consider diverse cobots and tasks to expand decision-making studies. Emerging XR technologies offer the potential to enhance interaction and thus improve decision making, underscoring the need for intuitive communication and human-centred design. Full article

30 pages, 7237 KiB  
Review
Research Perspectives in Collaborative Assembly: A Review
by Thierry Yonga Chuengwa, Jan Adriaan Swanepoel, Anish Matthew Kurien, Mukondeleli Grace Kanakana-Katumba and Karim Djouani
Robotics 2023, 12(2), 37; https://doi.org/10.3390/robotics12020037 - 7 Mar 2023
Cited by 5 | Viewed by 5294
Abstract
In recent years, the emergence of Industry 4.0 technologies has introduced manufacturing disruptions that necessitate the development of accompanying socio-technical solutions. There is growing interest for manufacturing enterprises to embrace the drivers of the Smart Industry paradigm. Among these drivers, human–robot physical co-manipulation of objects has gained significant interest in the literature on assembly operations. Motivated by the requirement for human dyads between the human and the robot counterpart, this study investigates recent literature on the implementation methods of human–robot collaborative assembly scenarios. Using a combination of strings, the researchers performed a systematic review search, sourcing 451 publications from various databases (Science Direct (253), IEEE Xplore (49), Emerald (32), PubMed (21) and SpringerLink (96)). A coding assignment in Eppi-Reviewer helped screen the literature based on ‘exclude’ and ‘include’ criteria. The final number of full-text publications considered in this literature review is 118 peer-reviewed research articles published up until September 2022. The findings anticipate that research publications in the fields of human–robot collaborative assembly will continue to grow. Understanding and modeling the human interaction and behavior in robot co-assembly is crucial to the development of future sustainable smart factories. Machine vision and digital twins modeling begin to emerge as promising interfaces for the evaluation of tasks distribution strategies for mitigating the actual human ergonomic and safety risks in collaborative assembly solutions design. Full article

13 pages, 3542 KiB  
Review
A Narrative Review on Wearable Inertial Sensors for Human Motion Tracking in Industrial Scenarios
by Elisa Digo, Stefano Pastorelli and Laura Gastaldi
Robotics 2022, 11(6), 138; https://doi.org/10.3390/robotics11060138 - 2 Dec 2022
Cited by 15 | Viewed by 3203
Abstract
Industry 4.0 has promoted the concept of automation, supporting workers with robots while maintaining their central role in the factory. To guarantee the safety of operators and improve the effectiveness of the human-robot interaction, it is important to detect the movements of the workers. Wearable inertial sensors represent a suitable technology to pursue this goal because of their portability, low cost, and minimal invasiveness. The aim of this narrative review was to analyze the state-of-the-art literature exploiting inertial sensors to track the human motion in different industrial scenarios. The Scopus database was queried, and 54 articles were selected. Some important aspects were identified: (i) number of publications per year; (ii) aim of the studies; (iii) body district involved in the motion tracking; (iv) number of adopted inertial sensors; (v) presence/absence of a technology combined with the inertial sensors; (vi) a real-time analysis; (vii) the inclusion/exclusion of the magnetometer in the sensor fusion process. Moreover, an analysis and a discussion of these aspects were also developed. Full article
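One aspect the review highlights, whether the magnetometer is included in the sensor-fusion step, can be illustrated with a textbook complementary filter that estimates tilt from the gyroscope and accelerometer alone; this is a generic example, not code drawn from any of the reviewed studies.

```python
# Generic complementary filter for pitch estimation from a wearable IMU,
# deliberately omitting the magnetometer (illustrative example only).
import numpy as np

def complementary_pitch(gyro_y, acc_x, acc_z, dt=0.01, alpha=0.98):
    """Fuse gyroscope (rad/s) and accelerometer (m/s^2) samples into pitch (rad)."""
    pitch = 0.0
    estimates = []
    for w, ax, az in zip(gyro_y, acc_x, acc_z):
        pitch_acc = np.arctan2(-ax, az)                        # gravity-based pitch
        pitch = alpha * (pitch + w * dt) + (1 - alpha) * pitch_acc
        estimates.append(pitch)
    return np.array(estimates)

# Synthetic standstill data: no rotation, gravity along the sensor z-axis
n = 200
print(complementary_pitch(np.zeros(n), np.zeros(n), np.full(n, 9.81))[-1])
```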
