Article

An Augmented Reality Application for Wound Management: Enhancing Nurses’ Autonomy, Competence and Connectedness

Institute of Positive Computing, Hochschule Ruhr West—University of Applied Sciences, 46236 Bottrop, Germany
* Author to whom correspondence should be addressed.
Virtual Worlds 2024, 3(2), 208-229; https://doi.org/10.3390/virtualworlds3020011
Submission received: 2 February 2024 / Revised: 20 March 2024 / Accepted: 29 May 2024 / Published: 3 June 2024

Abstract

The use of Augmented Reality glasses opens up many possibilities in hospital care, as they facilitate treatments and their documentation. In this paper, we present a prototype for the HoloLens 2 supporting wound care and documentation. It was developed in a participatory process with nurses using the positive computing paradigm, with a focus on the improvement of the working conditions of nursing staff. In a qualitative study with 14 participants, the factors of autonomy, competence and connectedness were examined in particular. It was shown that good individual adaptability and flexibility of the system with respect to the work task and personal preferences lead to a high degree of autonomy. The availability of the right information at the right time strengthens the feeling of competence. On the one hand, the connection to patients is increased by the additional information in the glasses; on the other hand, it is hindered by the unusual appearance of the device and the lack of eye contact. In summary, the potential of Augmented Reality glasses in care was confirmed and approaches for a well-being-centered system design were identified; at the same time, a number of open research questions, including the effects on patients, emerged.

1. Introduction

Nowadays, employees in the nursing profession are confronted with an immense workload resulting from rising patient numbers [1] and a shortage of skilled professionals [2]. Economic requirements, compliance with standards and bureaucratic regulations are making nursing processes increasingly complex [3,4]. All these factors have to be reconciled with the need to provide good, humane care that responds to patients’ individual needs [5]. Digitalization offers opportunities to improve this situation: adequate solutions can free up resources for individual patient care, raise the quality of work, and provide relief from secondary tasks.
In line with this, new approaches to use technologies, such as extended reality, are now entering the healthcare sector [6,7,8,9]. Among them, Augmented Reality (AR) glasses are a promising technology to support nursing processes [7]. Based on the framework for task–technology fit, a model which suggests a strong connection between the requirements of a task and the characteristics of a technology when it comes to adoption and human performance, AR glasses are advantageous to optimize the use of space, as well as for tasks that need to be performed in a timely, hands-free manner, and with continuous attention [10]. In relation to nursing, this gives rise to a wide range of applications, such as supporting medication dispensation [11,12,13,14] or wound care [9,15]. Information can be retrieved in a timely manner in front of the patient, or while performing a nursing task. Hands-free interaction allows nurses to fulfill hygienic standards while controlling the system. However, using AR might also have negative effects, such as motion sickness, or raise privacy concerns [16].
Although both researchers and nursing staff expect many opportunities for the use of AR, many questions remain unanswered: What are the most promising and accepted application scenarios? How does the technology affect the interaction work of nurses and patients? Can it effectively reduce the workload? And, most importantly, as it can be considered an essential requirement, how does it affect well-being (in the sense of supporting important preconditions like autonomy, competence and connectedness) [17,18]?
Following the idea of the positive computing framework [17,18,19], the presented research uses an exploratory approach to see if and how an AR application can contribute to users’ feelings of autonomy, competence and connectedness, and to determine important factors influencing this experience. In doing so, a prototype was developed in a participatory process and tested in a simulated hospital environment with nursing staff.

2. Related Work

2.1. The Positive Computing Framework

As it is important for the acceptance and added value of new technology that it has positive effects on people’s well-being, we follow the positive computing framework, which aims to design and develop technology to support psychological well-being and human potential [17]. It extends beyond the common goals of effectiveness and efficiency to also consider the quality of life and well-being of the users of a digital system and its impact on society [18]. Based on Self-Determination Theory (SDT) [20], Peters [19] designed strategies for how technology can be developed to meet individuals’ needs for autonomy, competence, and connectedness:
Autonomy is the need to operate in compliance with one’s goals and values [19]. Accordingly, Peters claims that technology has to be accessible and embed optional levels of help, and that users should be able to choose their own goals and strategies. Individuals should be able to interact independently with the system and decide how to use it in a simple way. Instead of giving strict instructions, it is important to provide guidance and the constant opportunity to correct data. Users should be in control of the communication. The design of the interface should be kept simple, to bring focus and concentration to the essentials. Lastly, users should decide for themselves when and how often to use the system [19].
Competence describes a person’s desire to perceive themselves as in control of their environment, and to be able to anticipate it [21]. According to Peters, in order to generate a high degree of competence, it is important to divide larger tasks into sub-tasks. In addition, the system has to be updated constantly and offer the possibility of a simplified presentation of the information. It is beneficial to have informative feedback that guides users through the process [19].
Connectedness refers to the desire to experience interaction with other people. It is characterized by feeling connected to others and caring for them [22]. Peters states that this can be supported by ensuring that technological interactions with others are seamless. Communication should also continue to happen offline, and both intrinsic and extrinsic motivation should be provided. The sense of community should be emphasized, and all involved should be given the opportunity to contribute. Finally, the communication of kindness and positive influence should be encouraged [19].
The basic requirements regarding autonomy, competence and connectedness provide initial guidance on how an AR application could be designed to improve well-being and the quality of work. However, in order to develop a tailored solution for nursing, it is necessary to understand the users and their work environment.

2.2. Specific Requirements Based on Nurses’ Needs

Although there has been a change in recent decades, van der Cingel et al. [5] report that nurses are still seen as compassionate helpers who perform simple and straightforward tasks to care for patients. Due to the advancing professionalization and versatility of tasks, employees see themselves as an important professional group that does not simply follow doctors’ instructions, but also wants to contribute its own needs and knowledge [5]. An AR application should, therefore, support different skill levels and empower users to make their own decisions. Economic pressure on the healthcare system, bureaucratic regulations, increasing patient numbers, and less contact time with patients are leading to higher workloads, stress, and emotional exhaustion [4]. With regard to the use of AR glasses, surveyed nurses indicated that they hoped to save time and safeguard themselves through documentation that might become easier, as information and communication with others could be available in a timely manner, regardless of location [23]. Accordingly, an AR application has to offer a high degree of user friendliness, in particular by providing exactly the right information in the respective situation to support nursing and documentation tasks.
There is a growing recognition that interacting with people is an essential part of today’s work that requires much closer attention. Interaction work involves establishing a cooperative relationship, dealing with one’s own feelings, influencing the feelings of customers, clients and patients, and dealing with the imponderables that are part of working on and with people [24]. For example, many nursing interventions require the patient to be calm and relaxed. This can be impaired by the patient’s anxiety, but enabled and encouraged by the nurse’s composure and confidence [24].
Accordingly, besides using the AR glasses for documentation or treatment purposes, nurses have to continuously respond to the patient. They have to (1) control their own feelings, as these influence the patient’s emotions, and (2) be able to interpret the patient’s emotions and sense the situation, in order to (3) prove to be competent and trustworthy and thereby (4) establish a cooperative relationship.

2.3. Factors That Must Be Considered When Using AR Glasses in Interaction Work

Showing and interpreting emotions is crucial to establishing a trustful connection. Nonverbal communication is used to express emotions, convey attitudes, and demonstrate character traits. Humans are able to decode these subtle signals and interpret them in a culture-specific way [25,26]. Eye contact and glances are particularly effective nonverbal signals for building trust in the Western world [27]. Overall, emotions are perceived and interpreted through several of these channels. If one of the channels is not available (e.g., in the absence of eye contact), it is still possible to assess situations correctly through other signals [28]. Hence, there are multiple opportunities for nurses to express emotions, and multiple ways for patients to decode these emotions. However, many AR glasses cover large parts of the nurse’s face, and lenses are often tinted, making eye contact less likely. In previous studies, nurses indicated concerns about the negative impact on the relationship between them and the patient, fearing that AR glasses will be distracting rather than supportive, as eye contact is broken [23]. From the patients’ perspective, AR glasses can significantly reduce their estimates of healthcare workers’ abilities, and decrease their willingness to opt in to medical procedures [16]. Many AR glasses can be operated by gesture control. However, it is not obvious to outsiders whether a pointing gesture, for example, is used to operate the device or to communicate. The interpretation of non-verbal behavior could therefore be made more difficult by the glasses. Here, it is important to examine the ways in which nurses deal with the various interaction possibilities in order to meet the patient’s situation.
According to Böhle and Weihrich [24], it is fundamental in this subjectifying action to sense and feel (e.g., soft skin, acrid odor, or a nervous patient). Thinking and sensing are not performed from a distance, but are directly connected to the service. This leads to an explorative, dialogic-interactive approach, where action and reaction are interwoven [24]. The integration of new technologies in such processes could serve as a facilitator, as digital information could be consulted to better assess the patient’s condition and make decisions based on it. However, it also harbors the risk of triggering stress, since operating might be difficult at first and another parallel work thread that demands attention is added. With regard to AR glasses, it is also uncertain how the superimposed digital information influences the perception of the real environment. Is it still possible to sense and feel authentically, as described by Böhle and Weihrich [24], while using AR glasses?
The nurse’s external impact on the patient might be influenced by AR glasses. During the interaction, the patient observes the nurse’s behavior and makes assumptions about their feelings, traits, and motives. According to the attribution theory, people try to determine why others act in a certain way in order to uncover the feelings and traits behind their actions, distinguishing between internal personality-related attributions or external, situation-related attributions [29].
It is, therefore, questionable as to how caregivers themselves and patients evaluate the care situation mediated with AR glasses. Do they attribute the gain or loss of competence to the AR glasses, or to their own ability?
A lack of common ground might be another issue when using AR glasses. By common ground, Clark and Brennan [30] mean the knowledge that is shared by two or more individuals. Three heuristics allow one to infer the shared knowledge: (1) group membership (e.g., belonging to a certain profession), (2) physical co-presence (e.g., objects that everyone can see and know about), and (3) linguistic co-presence (e.g., content of the course of the conversation) [30]. Regarding AR glasses in particular, the second point might be critical, as patients are not able to see the digital content the caregiver is seeing through the glasses. Also, the third heuristic needs to be considered when choosing control opportunities for the AR glasses application, such as voice interfaces that cannot be heard by patients.

2.4. Summary and Research Questions

In summary, by using a participatory, user-oriented approach, this exploratory work studies the design and use of an AR-glasses prototype in a user study, observing the complexities in the interaction work of nurses with patients. It thereby considers factors from the positive computing framework, with the goal of supporting nurses in their autonomous and competent fulfillment of their work, while preserving their connectedness with patients.
Building on the research outlined above, we pose the following research questions:
Research Question 1: What strategies for using AR glasses do nurses map out in order to autonomously and competently establish a connection to patients in nursing interaction work?
Research Question 2: What ambivalences emerge from empirically identified chances and risks in relation to a successful integration of AR glasses in nursing interaction work?

3. Prototype, Materials, Methods and Procedure

To answer the research questions, a prototype was developed in a participatory process and evaluated in a qualitative study, using behavioral data from a simulation study and semi-structured interviews.

3.1. Prototype

In line with positive computing and participatory design, it is crucial for a successful integration of AR glasses to continuously involve the users of the system in the development process [17,31]. Emergent technologies, such as AR, come with unfamiliar forms of interaction. How the technology works and how it is experienced often remain vague and abstract, because users cannot yet draw on established mental models. This makes it hard for first-time users to state requirements and to formulate ideas for application areas.
According to the guiding principles of participatory design, democratic and hierarchy-free teamwork requires discussing actions in concrete, understandable situations rather than in the abstract, and using tools and techniques that help participants express their needs and visions in order to achieve mutual understanding [32]. Hence, tight feedback loops with two co-designing nurses allowed us to pick up the mental model of the caregivers, use plausible (still fictional) patient information, and integrate it meaningfully into a realistic use-case scenario by following the design thinking framework [33,34]. To create an AR application that meets the previously stated requirements and fits smoothly into a realistic patient scenario, we followed the participatory design approach, using contextual inquiries, interlocking workshops, and co-design techniques to develop a prototype in close coordination with nurses [35].

3.1.1. Use Case and Device Decision

In an early phase of the participatory process, nursing staff were very committed to developing a series of application scenarios in which they expected AR glasses to make their work significantly easier. At the end of the workshop series, the documentation of wounds was chosen as the most interesting use case.
Wound care with simultaneous documentation emerged as an important scenario for the use of AR glasses within surveys conducted with nurses with different levels of expertise and experience [23]. The respondents of [23] mentioned the availability of information regardless of location as the main advantage, as it is important to interact continuously with the patient. It is, therefore, well-suited to test the effects of AR glasses on interaction work. Adherence to hygiene standards is a high priority, which means that non-contact work with the AR glasses could be advantageous. Klinker et al. [36] already investigated wound management by testing an AR-based tablet application, and came to the conclusion that the provided information is beneficial, but handheld-based AR applications are impractical for medical professions, as they cannot be used without physical contact. This supports the idea of using AR glasses.
After jointly testing several AR glasses and discussing their system characteristics, the Microsoft HoloLens 2 (HL2) was chosen. In contrast to other devices, the HL2 allows users to interact with digital 3D holograms via gesture control and thus without a physical control device, so that hygienic requirements can be met. Caregivers prefer discreet AR glasses models, as they assume that these are less disconcerting for the patient [23,37]. However, these models have the disadvantage that they usually cannot be operated without physical contact, and they are unsuitable for people who wear glasses. They also usually sit less firmly on the head, and could slip during treatment. The HL2 compensates for these disadvantages, but is heavier, bulkier, and more conspicuous. However, its visor can be folded up and down, allowing eye contact with the patient.
The prototype was developed with Unity (version 2021.3 LTS, as recommended by Microsoft) and the Mixed Reality Toolkit (MRTK) 2.7. All UI elements used originate from this toolkit.

3.1.2. Aim and Scope of Functions

Based on the results of the former section, the prototype is designed to enable nurses to achieve three key objectives while using AR glasses:
  • Competence: the prototype’s structure supports common processes in nursing, and provides information to competently and safely assess, care for, and document the patient’s wound.
  • Autonomy: users are able to set up and use their own digitally augmented workspace autonomously and flexibly according to their individual needs and preferences.
  • Connectedness: the prototype allows nurses to stay close to the patients and to involve them in the care process.
Different types of features are installed to meet these objectives. The navigation structure and information architecture are based on existing nursing routines and the documentation system currently used by the co-designing nurses. Selected screenshots of the prototype are shown in Figure 1. By picking up on dialog structures, labels, and input options of the existing patient file, we take up the mental model of the nurses and place these building blocks in a new workflow adapted to wound care. In addition to the familiar contents, further information materials on the patient and auxiliary materials for the assessment of wounds are integrated into the concept. For this purpose, the wound care process is sorted into chronologically sequenced tabs (visible in Figure 1a,c). Necessary information is integrated into the main dialogues, and supporting content is incorporated with progressive disclosure mechanisms (Figure 1a,b). This allows the nurse to decide for him/herself whether, when, and where to use the support material.
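As a concrete illustration of the progressive disclosure mechanism, the following minimal Unity/MRTK 2.7 sketch hides supporting content until the nurse explicitly expands it via a toolkit button. The class and field names (ProgressiveDisclosurePanel, expandButton, supportingContent) are illustrative assumptions and do not reproduce the actual prototype code.

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Illustrative sketch (not the actual prototype code): auxiliary material, such as
// reference pictures for wound assessment, stays hidden until the nurse explicitly
// expands it via an MRTK button, keeping the main dialog focused on the essentials.
public class ProgressiveDisclosurePanel : MonoBehaviour
{
    [SerializeField] private Interactable expandButton;     // MRTK 2.7 button component
    [SerializeField] private GameObject supportingContent;  // optional supporting content

    private void Start()
    {
        // Collapsed by default; the main dialog shows only the necessary information.
        supportingContent.SetActive(false);

        // Toggle the supporting content whenever the expand button is clicked.
        expandButton.OnClick.AddListener(
            () => supportingContent.SetActive(!supportingContent.activeSelf));
    }
}
```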
Besides retrieving information, it is also possible to create a new documentation entry (Figure 1c). Step-by-step, each documentation entry can be adjusted by selecting pre-defined options. This way, standardized and quick documentation can be made while treating the patient’s wound. Additional material, such as reference pictures, can be faded in to support a correct wound assessment (Figure 1d). To be able to flexibly adapt the augmented workspace to the spatial conditions, the prototype is divided into two areas that can be separately positioned in the environment. In addition to the patient file, a second window displays a picture of the patient’s wound as previously documented, which can be aligned to the actual wound in the real world (Figure 1b). Accordingly, a change in wound status can directly be observed and documented. These features in combination allow the users to receive and document the individually needed information directly within the treatment procedure.
In the implemented prototype, users can choose between far (Figure 1b) and near (Figure 1c) gestures to control the system. This allows them to interact with the system from a distance (e.g., pointing at a window) or up close (e.g., clicking on a button). This form of control is more intuitive, deliberate, and intentional than eye-tracking methods. Voice control, as another alternative, was dismissed because of its susceptibility to unintended input: conversations with the patient might be interpreted as commands by the system and could lead to unintended actions. By using near and far gestures, the nurses are also able to arrange the windows from a position in the room that allows them to create physical closeness to or distance from the patient (Figure 1c). The visor of the HL2 can also be used to maintain eye contact. As the prototype provides detailed information about the patient and their current health condition, the user can easily refer to it while being in a dialogue with the patient.
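The re-positionable windows described above can be sketched in a similar way. The snippet below assumes MRTK 2.7’s ObjectManipulator (far, hand-ray manipulation) and NearInteractionGrabbable (near, direct grab) components; the MovableWindow class name and the logging hook are hypothetical and only illustrate the interaction setup, not the prototype’s actual implementation.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Illustrative sketch (not the actual prototype code): makes a window hologram,
// such as the patient file or the wound image, re-positionable via both far
// (hand-ray) and near (direct grab) gestures.
[RequireComponent(typeof(BoxCollider))] // near interaction requires a collider
public class MovableWindow : MonoBehaviour
{
    private void Awake()
    {
        // Far interaction: the window can be grabbed and moved with hand rays.
        var manipulator = gameObject.AddComponent<ObjectManipulator>();

        // Near interaction: marks the collider as directly grabbable by the hands.
        gameObject.AddComponent<NearInteractionGrabbable>();

        // Hypothetical hook: log where the nurse places the window, e.g., to trace
        // re-positioning behavior during the study.
        manipulator.OnManipulationEnded.AddListener(
            _ => Debug.Log($"{name} placed at {transform.position}"));
    }
}
```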
With this first set of functionalities, nurses already have several options to develop their own strategies on how exactly to use the augmented information in a patient situation. The following section presents the study we conducted to simulate realistic nurse–patient interactions, test the prototype in this context, and investigate our research questions.

3.2. Study

The study conducted consisted of two main phases. In the first phase, the participants were given the opportunity to test the AR glasses using the developed prototype in a realistic patient situation. In the second phase of the study, they were asked to reflect on their experiences and impressions of the use in a qualitative interview.
To investigate the use of AR glasses in as realistic a context as possible, a plausible case study was developed with the two co-designing nurses. Therefore, a fictitious patient was created, including all relevant patient data needed for documentation purposes, as well as handover information typically used at shift changes. This information was used as mock data in the prototype. The fictitious patient suffers from a chronic wound on the left lower leg, and was acted out by one of the co-designing nurses. Participants testing the prototype were asked to learn about the patient, care for the wound, and document the procedure.
The patient interaction situation took place in a simulation center. The participants were located in a typically furnished patient room. They were equipped with materials that they could use for the treatment (e.g., painkillers, bandages, and gloves). The researchers were in the control room next door, which allowed a view into the patient room through a one-way mirror. Additionally, three cameras were set up in the patient room to observe and record the situation from multiple perspectives. To trace what the participant sees and experiences in the HL2, a live stream was transmitted to the control room and recorded as well. Both rooms were connected by an intercom system that allowed the researchers to give instructions to the participants. The subsequent interview took place in a meeting room, and questions were asked face-to-face by a researcher. Screenshots illustrating the main components of the prototype were used as reference materials.
Figure 2 provides an overview of the study’s individual steps, methods used, and locations in which it took place. The individual steps will be described in more detail in the following.

3.2.1. Briefing and Introduction

Participants were welcomed and introduced to the study, its procedure, and privacy policy. They were informed that the trial was voluntary, and could be stopped at any time without giving a reason. Emphasis was placed on informing them that the aim was not to test their performance, but rather the usefulness of the prototype. What counts is their subjective opinion. Afterwards, an informed consent form was signed by all participants. They were brought to the simulation room by a researcher, where they were briefly introduced to the HL2 by explaining the main functions and how to use the visor. The main interaction patterns (near and far gesture) were explained, and eye calibration was performed to ensure the best individual experience possible. Afterwards, they were introduced to the patient’s case study by using a pre-recorded video on a laptop in which a nurse provides handover information from the previous shift. Finally, they were prompted to start the prototype on the HL2, and the researcher moved to the control room.

3.2.2. Familiarization with the Prototype

In the first phase, the test subjects were able to familiarize themselves with the prototype and its functionalities step-by-step in order to be prepared for the upcoming patient situation. First, they were encouraged to click on buttons, walk around the three-dimensional holograms, and re-position them. This situation was used to collect direct feedback on the prototype, its screens, and components using the think-aloud technique. Participants were asked to comment on anything that came to their mind while using the prototype, in order to uncover misinterpretations. Furthermore, participants were prompted to express what they liked or disliked, and to make suggestions for improvement. To obtain feedback on all screens, all users were given the same tasks, which subtly navigated them to each screen. The task guide was semi-structured and allowed for flexible responses. The tasks were coordinated in such a way that the users could prepare for the patient situation step-by-step and explore the prototype independently beforehand, covering all central components and screens.
After the participants had familiarized themselves with the prototype, they were asked to place the screens in the simulation room. They were free to decide where to place the wound documentation window and where the wound image should be displayed, while explaining what advantages they expected from each positioning.

3.2.3. Wound Care Simulation Using AR

Once participants were ready to receive the patient, they were encouraged to behave as they usually would in a real situation. However, the patient was acted out by one of the co-designing nurses, who followed a behavioral script to create comparable situations. The participants received information about the patient’s condition and the perceived wound pain through a simulated handover briefing. Afterwards, the patient was brought into the room and gave the same information about her state of health, so that the participants were able to react accordingly. If the participants did not introduce the AR glasses on their own, the patient addressed them at a specific time, so that all participants were encouraged to explain the use of the AR glasses in their own words. A wound was simulated on the patient’s leg with the help of a glued-on photo and makeup. The wound was initially bandaged, and had to be uncovered and treated by the participants. This created a situation in which the hygienic conditions could be addressed. During wound documentation, the patient learned that the wound condition had worsened, subsequently panicked, and demanded the nurse’s attention. As a result, all participants were confronted with an increasingly stressful and distracting situation.
During the interaction with the fictitious patient, the participating nurses were asked to treat the wound and complete the documentation at the same time (not afterwards), in order to test their confidence and ability to work with the hologram in realistic situations. In this situation, the participants could continue to provide comments and suggestions for optimizing the prototype at any time. From the control room, they received assistance when necessary. In the case of surprising actions or difficulties, the participants were also asked to comment on the situation. The trial ended when the participants reached a pre-defined point in the documentation process, or when the maximum time slot of 60 min was reached.

3.2.4. Sharing Experiences in Semi-Structured Interviews

The subsequent interview was semi-structured and followed a flexible guideline within approximately 30 min. The following topic areas were included: (a) demographic data and current role, (b) usability and overall experience, (c) applicability and integration in interaction work, and (d) perception of the SDT determinants autonomy, competence and connectedness.
Demographics included questions regarding the age and professional experience of each participant, their position in the respective hospital, and whether they had previously been involved in the project or had been using AR glasses before. Subsequently, the subjects were asked to describe their experiences and emotions while using the AR glasses. Participants were asked to relate to specific features and information provided by the prototype, and to share their preferences, criticisms, and suggestions for improvement.
In terms of autonomy, they were particularly asked to describe their regular wound care routine, and to compare it with their experiences during the simulation. If not mentioned by themselves, they were asked how functionalities like the flexible window positioning were perceived with regard to autonomy, and how they used it to create an individual work environment. This served to determine what strategy they chose, and what advantages and disadvantages they perceived as a result.
Concerning competence, the participants’ own perceptions were examined, as well as their evaluation of how competent they were considered by the patient during the simulation. On the one hand, technical competence and statements regarding the control and the handling of the glasses were collected. On the other hand, questions were aimed at nursing competence and to what extent the glasses support or hinder nursing care.
With regard to connectedness to the patient, the participants were first instructed to reflect on their own feelings during the wound treatment and on how they perceived the interaction with the patient. Additionally, we asked them to put themselves in the patient’s position, in order to recapitulate how they thought the patient might have felt during the treatment. They were asked if and how the AR glasses had an effect on the interaction work and the perceived connection. This topic area concluded with a request for suggestions on how connectedness could be increased. In this way, they were encouraged to identify strategies that they would use to improve their connection with the patient.
Finally, they were asked to imagine that AR glasses would be introduced into the daily hospital routine within the next one or two months. The subjects were requested to express their feelings towards this situation. We wanted to identify concerns, limitations, and necessary improvements.

3.2.5. Debriefing

Following the interviews, the participants were accompanied back to the initial area. They had another opportunity to ask questions and to receive further information about the future development of the project. Afterwards, they were dismissed.

3.3. Participants

The participants (N = 14) are or have been active in nursing, and were recruited from two different hospitals that previously supported the participative development of the AR application. One of the hospitals is located in a rural area, and the other in the city. Since people from both institutions participated in the design, some of the participants had previous involvement with the project. In addition, they have different levels of knowledge regarding wound management, and different levels of job experience. All participants are trained nurses, although some of them are currently employed as supervisors or division managers. However, they were all familiar with wound treatment. On average, they were 38.93 years old (SD = 10.05), ranging from 22 to 55, and had about 18.25 years (SD = 10.84) of professional experience. Two participants were male, twelve were female.
We reached saturation after 11 participants, meaning that no additional aspects related to the research questions were expressed in the think-alouds and semi-structured interviews by participants 12–14. Additionally, we consulted the findings of [39] to confirm that we had a reasonably large sample. Subsequently, we were able to start analyzing the data, which is described in greater detail in the following section.

3.4. Coding Scheme Development

To evaluate the study, the recorded material of the 14 participants was processed by transcribing the audio recordings of the observations and interviews, and by summarizing and synchronizing the video streams. All data were then imported into the MAXQDA software (2022, VERBI Software, Berlin, Germany) for further analysis. Qualitative content analysis was performed according to [40]. The coding system resulted from a combination of deductive and inductive procedures. An excerpt of our coding scheme can be viewed in greater detail in Table 1. This form of coding is increasingly preferred in current research, as it allows for subjective interpretation of the data and makes it possible to capture new and unforeseen findings [41]. Deductive main and subcategories provided a basic structure for the content analysis. Inductive codes were then generated within each thematic block. For example, we used deductive coding to pre-sort the participants’ statements (positive, neutral, and negative statements) and to assign them to the principles of positive computing (autonomy, competence, and connectedness). Points of discussion, concerns, and suggestions for optimization were coded inductively based on evolving themes and their similarity to each other. In the further qualitative analysis of the coded segments, multiple responses were clustered. Instead of counting each individual statement made by a person, we only counted whether a person made this statement or not. In addition to the transcript fragments, video excerpts and representative screenshots were collected for further coding: patient–participant interaction, initial window positioning, repositioning of windows, and flipped-up visor.
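As a small illustration of this counting rule (independent of the MAXQDA workflow itself), coded segments can be reduced to the number of distinct participants per code. The type and method names below are hypothetical.

```csharp
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch of the counting rule: each code is counted at most once per
// participant, regardless of how many coded statements that participant produced.
public static class CodeCounting
{
    public static Dictionary<string, int> ParticipantsPerCode(
        IEnumerable<(string participant, string code)> codedSegments)
    {
        return codedSegments
            .GroupBy(segment => segment.code)
            .ToDictionary(
                group => group.Key,
                group => group.Select(segment => segment.participant).Distinct().Count());
    }
}
```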

4. Results

In order to find answers to both research questions, we first give an overview of the general feedback on the prototype and how differently participants used it to create their work environment. We then go into more detail about how they perceived the situation in terms of autonomy, competence and connectedness with the patient. Afterwards, we discuss the assumed patient perception, and focus on the aspects that lead to different strategies to support interaction with the patient. Finally, we present concerns and suggestions for improvement that participants stated with regard to an integration of AR glasses in nursing interaction work.

4.1. General Feedback and Overall Experience with the Prototype

General feedback on the experience with the HL2 included both positive and negative statements. In summary, we counted 411 positive statements, 153 neutral statements, and 356 negative statements from all participants, including data from both the simulation and the interview. Negative statements referred to the wearing comfort of the HL2, which was perceived as too big and too heavy. It also became warm under the AR glasses, and some subjects began to sweat. A few participants complained about dizziness and first signs of motion sickness. With regard to usability, participants pointed out the poor performance of the gesture control in the think-aloud parts as well as in the interview. The video recordings show a high number of operating problems while using the near and far gestures. Video passages were coded as to whether a gesture successfully triggered the desired interaction, whether it was problematic, or whether it did not trigger the desired interaction at all (failure). Only interactions that were canceled by an interruption (e.g., the patient asking for attention) were excluded from the data set. In 70% (N = 392) of all observed interactions (N = 560), participants used far gestures; near interactions were used in only 30% of cases (N = 168). However, the near gestures achieved slightly better interaction results than the far gestures: 54% of interactions using near gestures were successful, compared to only 44% of interactions using far gestures. Accordingly, working with the HL2 was found to be exhausting and frustrating in large part.
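For orientation, and assuming the reported success rates refer to each gesture type’s own interactions, the implied absolute counts are approximately

\[
0.44 \times 392 \approx 172 \ \text{(far)}, \qquad 0.54 \times 168 \approx 91 \ \text{(near)}, \qquad \frac{172 + 91}{560} \approx 47\%,
\]

i.e., fewer than half of all retained interactions were coded as successful.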
The application and the concept of the prototype, on the other hand, were evaluated positively. The interviewees described the use of the application as exciting, work-saving, self-explanatory, clear, simply structured, and practical, since one has all the information directly at hand. In total, 12 of the 14 subjects indicated that they would fully trust the prototype; incorrect data were attributed to nurses’ input errors and not to the glasses (P01, P04). The prototype was conceptually based on the well-known PC-based hospital information system in several respects (terms, structure of information, etc.). Seven participants said that they recognized the parallels. However, these parallels also raised expectations, some of which are not yet covered by the prototype. During the administration of medication and the wound assessment, some subjects wanted more detailed information.

4.2. Creating the Individual Work Environment by Initial Window Positioning

Based on the video fragments demonstrating the initial window positioning, we were able to derive three recurring patterns (see Figure 3).
Most participants chose an initial window alignment oriented towards the patient (N = 10). In this group, half of the participants opted for an alignment at the head side of the bed (N = 5) and half for the long side of the bed (N = 5). In the associated think-alouds, the participants who chose the head side of the bed reasoned that they moved the patient file window to where the information is directly needed. With this positioning, they have the patient’s name in view (P02, P11) when speaking to them, but are also able to take another look at the bandage material that they usually prepare on the service desk (P08, P11). This combination was perceived as particularly practical:
“I think that’s just great because you always have so many patients in your head and you forget so many little things. Which foam dressing and what size was it? I think it’s just super practical that you can combine that now [with the room].” (P11, video transcript, pos.97)
Staying in contact with the patient was important. Participants pointed out that patients remain in the field of view when looking at the patient file (P03, P11), and that this position is less disturbing (P08) during conversations. P10 pointed out that the windows must not cover the patient’s face.
Participants who chose the long side of the bed for window alignment focused more on the patient’s wound. The information is in view during wound treatment, and can be referred to for direct comparison (P04, P14). In addition, the nurse can switch between the patient, the wound, and the information without having to change body position or turn the head too much (P05). P12 weighed a position where a light background would allow good legibility against the fact that she would then be standing with her back to the patient and, as a compromise, decided to move the windows to the side of the bed in order to be able to make eye contact with the patient by moving her head.
The remaining four participants decided to place (parts of) the application on the opposite wall, where the bandage material is stored. One reason was that they needed the information about the bandage material as a checklist. However, they verified whether they could still see the information from the bed, where they communicated with the patient. During the dialogue, they thus remained close to the patient.
In total, half of the participants (N = 7) chose to re-position the windows during the patient interaction scene. This behavior could be observed in all three initial groups. However, the participants who initially positioned the windows on the head side of the bed took the wound picture (P02, P03 and P08) and the documentation screens (P02 and P03) to the patient’s wound. From this position, we could observe fine tuning. On the one hand, participants pulled screens containing pictures closer to the wound for a better reference. On the other hand, they pulled the screens closer to themselves to use near gestures more comfortably while documenting.
Furthermore, participants recognized that the window positioning influenced the way in which they interacted with the patient. Two contrasting examples provide insights on what was perceived as “too close” or “too distant”. During the wound documentation of P12, the hologram hovered very close above the patient. When she used near gestures, she had to reach over the patient and the bed, which she perceived as disturbing (see Figure 4a).
In the other extreme, P13 was dissatisfied with her initial window position, because she had to write the documentation with her back to the patient. While documenting the wound, she communicated with the patient, but was not able to maintain eye contact. In addition, all participants described the setup of their individual workplace as very positive (N = 14).

4.3. Autonomy

Overall, participants felt very autonomous while interacting with the AR glasses. In total, 13 participants stated that they were free in all of their decisions. During the interviews, it was mentioned that the documentation of wounds must cover certain aspects anyway, so that the hologram was not perceived as having a dominating effect.
“Being more autonomous means that I can decide freely. It does not tell me what I have to do. I can click my way through as I wish. I am still free to make my own decisions.” (P02, interview, pos. 255)
In addition, when asked to what extent they felt dominated by the glasses, one participant commented that she could always take them off.
The fact that the AR glasses combine various functions and devices enabled the participants to feel more autonomous in their actions. Since the HL2 provides detailed information directly in the situation, the nurses are no longer dependent on their colleagues or the availability of a PC to finish their work. Participant 5 describes this as follows:
“Instead of how we are doing it now, having seen the picture before, going into the patient’s room and thinking, ‘Hm. Did that really look like this?’. Must return to the mobile PC again. This way I had it directly and could continue swiping to see what it looked like.” (P05, interview, pos. 95)
Despite all of the benefits in terms of perceived autonomy, two participants pointed out that the use of the AR glasses must be voluntary. Therefore, documentation via computer would still have to be available. Reference was made to the issues of motion sickness or the constricting feeling when wearing the glasses.

4.4. Competence

The participants commented positively on the AR glasses with regard to their own competence. Twelve participants reported that the glasses enabled them to work more efficiently. For example:
“While treating the wound, I would be able to document it immediately and would not have to do it two hours later, as it can happen in clinical practice sometimes.” (P03, interview, pos. 145)
Another important aspect was the reference images. Some participants perceived these as particularly competence-enhancing, since they were confident of evaluating the wound correctly by reviewing the images (N = 7). Participant 2 stated:
“Personally, I would have been even more uncertain in the evaluation of the wound. Because of the pictures, I was 100 % sure of how to describe them” (P02, interview, pos. 221)
Additionally, half of the participants recognized the similarity to systems they already knew from their own routine at work. This promoted their sense of competence, since they did not have to familiarize themselves with a completely new scheme and could start documenting immediately.
Even though the application was rated as very positive and competence-enhancing, six nurses stated that it would limit their interaction work (P04–P06, P09, P13 and P14). More specifically, they indicated that their focus was on the glasses rather than the patient. Additionally, the tinted visor was problematic for two participants, who reported that the wound could not be assessed correctly through it.
In addition, a major negative factor was the perceived limitation of competence in terms of prioritization. In total, seven participants stated that they had focused more on the glasses than on nursing activities (P01, P03, P05–P07, P09–P13). Thus, there is a concern that the interaction with patients could be neglected.
Another important point was the lack of different user profiles (N = 9). The subjects stated that wound experts need different information than certified nurses or nursing students. In addition, they expressed the wish that the software be tailored to ward-specific topics; for example, nurses in the intensive care unit need different information than those in the cardiology unit.

4.5. Connectedness

In the following, we focus more closely on the feeling of connectedness with the patient. Four participants explicitly stated that they perceived a “distance” between them and the patient (P01, P03, P05, P08). One argument was the reduction of eye contact or losing sight of the patient (P03, P05, P07, P10, P12, P13). P07 explains:
“For patients, it is important to at least be able to see the eyes and to see a little facial expression. That is totally important. Especially when we also wear this FFP2 mask.” (P07, interview, pos. 145)
Another participant referred to the distraction by the holograms that caught their visual attention (P05). Establishing a personal connection to the patients and interacting with them is a key attribute in nursing, as P01 describes:
“So in my field, it’s very much about having a relationship with patients and being able to cater to patients. Even before competence, before basic care and so on.” (P01, interview, pos. 238)
Accordingly, participants stated that they would establish a relationship with the patients first, before introducing the AR glasses. For example, P13 would feel more comfortable if patients had already been treated by her and knew how she “normally” delivered care (interview, pos. 68–70). P01 believes that once the patients realize that they are nevertheless being well looked after, then it is no longer a problem to use AR glasses (interview, pos. 273). For the subjects, the simulated scene represented an exceptional situation in several respects. As a result, some participants refer to the AR glasses, others to the application, and still others to the overall experience when talking about the impact on feelings of connectedness. The nurses were using the AR glasses for the first time, and had difficulties working with them. Accordingly, three participants pointed out technical difficulties with the HL2 that negatively influenced the patient interaction, as they felt distracted or insecure (P04, P07, and P11). However, all of them assumed that these problems would disappear after a certain period of familiarization and practice.
As the documentation of the wound usually takes place after the treatment, and not simultaneously with it, participants in our study struggled to concentrate on both at the same time. They mentioned that they focused and concentrated on the AR glasses and their content much more and on the patient much less (P01, P03, P05–P07, P09–P13). Here, again, some participants assumed that with a little practice, it will be easier to interact with the patient while documenting.

4.6. Assumed Patient Perception

The participants were concerned about how they would be perceived by the patients when wearing and handling the AR glasses and what effect this might have on the patients:
“It takes some getting used to, when you’re flailing around in the air. I think that’s weird for the patient at first, too, when there’s someone standing there waving in the air like crazy.” (P05, interview, pos. 15)
Participants in this study drew a connection between their ability to establish a connection with the patient and the perception of their competence as a nurse. In their assessments, they often referred to their interaction work and to whether they managed to connect with the patient with, or despite, the AR glasses. P07, for instance, was concerned that patients might lose trust, as they might equate poor technical skills in using the AR glasses with nursing skills in general.
Participants stated that, because of the information readily provided, they were able to address the patients in a more targeted way and to react more confidently: for example, addressing the patient by the correct name (P02) or referring to more complex details, such as those usually documented on the handover cheat sheet, in the patient’s file, or in the medication reference book (P03). It was also assumed that the usage of AR glasses would imply a form of quality control that would provide reassurance to the patient and enable the nurse to appear more professional:
“If someone comes in who wears AR glasses, then you assume that they will be checked, that […] they have guidelines. Control is perhaps the wrong word, but [the nurses] can double check and look at everything again and it all looks professional. That would give me [as a patient] a bit of security.” (P02, interview, Pos. 176)
In total, we found more negative comments (28 segments) related to the assumed patient perception in our sample than positive (5 segments) or neutral (8 segments) comments. Among the negative statements, the most common theme was that patients might feel neglected (N = 8). Participants reasoned that they were more preoccupied with the new and unpracticed technique of using the AR glasses, or with doing the documentation in parallel with the wound treatment and patient interaction (N = 6). The patients’ assumed feeling of being neglected was attributed to losing contact with the patient due to the AR glasses (P10, P12, P13). References were made to the lack of eye contact, and to the fact that patients cannot see what one sees oneself and may not dare to ask about it (P12, P13). One participant wished “that the patient has the feeling that when I look at him, I am really looking at him and not seeing any pictures.” (P04, interview, pos. 171).
Nurses were particularly concerned about patients with dementia and elderly, disoriented, or delirious patients (N = 7). Here, they see the use of AR glasses critically, as it could worsen the mental health status of these patients.
However, despite the many critical voices, some participants put their negative statements into perspective. In doing so, they mainly pointed to the still unpracticed situation and the fact that they were using the AR glasses for the first time. If the use of the AR glasses and the procedure of parallel documentation were practiced, the interaction with the patient would be different, and might be easier (P05, P09, P11). Subjects stated in this context that it was important to inform the patients about the AR glasses and how they work (P11, P13).

4.7. Strategies to Support Interaction Work with Patients

Within the role play scene, we were able to observe how the nurses managed to balance interaction work, documentation duties and taking care of the wound. We found different approaches for how participants tried to maintain a connection with the patient.
All participants were informed about the flip-up visor of the HL2 during the introduction, but less than half of the nurses made use of it to establish eye contact. Four participants used the visor selectively over short periods of time. In these shorter sequences, they raised the visor, usually at the beginning of the interaction scene, to greet the patient and introduce themselves, to explain the AR glasses, or when the patient demanded their attention (see Figure 4b). Two participants raised the visor for an extended period of time, and had to be reminded by the researcher to document the condition of the wound with the AR application. They flipped up the visor while collecting materials and treating the wound.
However, care must be taken when using the visor to ensure that the general hygienic conditions are met. Accordingly, one participant expressed the wish that the HL2 should offer the possibility to flip up the visor without physical contact. One participant found another compromise to deal with the tinted visor. She folded it down only halfway, so that she could squint at the information during the treatment.
Another way to better connect with the patient was to have the patient participate in the procedure. We observed that, similarly to traditional caregiver routines, participants commented in 26 cases on what they saw or did in the AR glasses.
“(to the patient:) I would just look at the picture again. I’ll put the glasses on now. I think that’s a little bit more extraordinary than usual.” (P13, video transcript, pos. 98)
However, only two participants introduced the AR glasses and their functioning on their own initiative at the start. The other twelve nurses did not explain them until the patient actively asked about them. All participants involved the patient in different ways. One participant (P02) explained that, during wound treatment without the HL2, some information is not available or is incomplete, so the patient is frequently asked about it. This creates a continuous dialog in which the patient is involved. Based on this, he/she was concerned that the patient could take on a more passive role, being reduced to a subject of documentation, when the AR glasses are used. Another challenge arises from the patient’s presence during documentation, as they could notice the entries. Depending on the patient’s state of health, this information should be expressed with caution, and nurses were uncertain how to react adequately:
“I wouldn’t say that out loud either: ‘this is infected, looks totally bad, etc.’. But I just thought out loud, what should I click here? And then I wasn’t sure either, should I continue thinking out loud?” (P14, interview, pos. 65–68)

5. Discussion

5.1. Factors Influencing Autonomy

The results of our study clearly show that the nurses used different ways to control the prototype autonomously. These observations are corroborated by statements from the qualitative interviews. Here, the participants described the operation and set-up of their own work environment as autonomy-promoting. They particularly emphasized the independent positioning of the individual windows, which enabled a high degree of flexibility. Through progressive disclosure mechanisms and the display of additional reference images, the application provided sufficiently graded support material that could be consulted as needed. The sequence of steps could also be customized through the use of tabs, providing the ability to tailor the process to one’s own needs and routines. Based on this positive feedback, we conclude that the application and the AR concept as such meet the requirements postulated at the beginning regarding autonomy [19]. Nevertheless, suggestions for improvement were also expressed, which relate closely to window positioning and progressive disclosure. Some reference images were displayed by extending an existing window. At this point, decoupling the information into a separate window would have allowed even more flexibility (e.g., so that only these images could be aligned to the wound). Based on these findings, we recommend presenting topics that are related in content as modular units. Similar to widgets on a desktop, individual window elements and holograms could then be plugged together to form an individual workspace. Compared to the current situation, the subjects stated that this form of documentation allowed for more flexibility.

5.2. Factors Influencing Competence

Regarding the perception of competence, impressions of the AR experience were diverse. Generally, the caregivers considered themselves competent, even though they sometimes had major problems operating the device. However, they attributed these problems to the maturity level of the AR glasses, not to their own abilities.
On the basis of the constantly retrievable information, the nurses became better informed about the respective medical conditions and were able to make treatment decisions more efficiently. Additionally, they assumed that this gives patients a sense of security, as the nurses’ actions can ultimately be checked against the displayed information. In this way, the AR glasses support the professionalization of nursing care. These findings are in line with the statements of van der Cingel and Brouwer [5], who argue that the self-image of caregivers has changed in recent years. Owing to the increased autonomy, the prototype can contribute to enabling nurses to take on more complex tasks and, thus, to changing how patients and medical doctors perceive them.
On the other hand, the subjects criticized the handling of the AR glasses with regard to competence. Besides usability difficulties, they described tapping in the air as looking ridiculous, assuming that patients would therefore perceive them as less competent. We conclude that, despite the option to use near or far gestures, the mode of operation should be improved further to strengthen the nurses’ feeling of competence.
Lastly, participants pointed out the importance of connecting with patients and making them feel comfortable, since, in their view, this is what makes a competent nurse.

5.3. Factors Influencing Connectedness

By surveying the nurses’ perceived connectedness to the patient, it became evident that establishing a connection to patients is considered a core competence in nursing, and that using AR glasses influences this perception. We identified both promoting and impairing factors.
A feeling of connectedness can be promoted by having access to all of the required information about the patient at any time. It enables nurses to respond to the patient’s needs as the situation demands (e.g., consulting the pain assessment and providing medication). In addition, the flexible workplace arrangement allows the caregivers to vary the amount of attention they pay to the patient. However, the test subjects pointed to the bulkiness of the HL2. Particularly in combination with a surgical mask, it was seen as a hindrance to connectedness, as nonverbal signals are harder to display and to interpret. This was the nurses’ main criticism, as they see the danger of not establishing sufficient eye contact with the patient through the glasses and, as a result, not creating an effective connection. Raising the visor could have been a solution to this problem, but it was used by only a few participants. The reasons were diverse, ranging from hygienic concerns to simply forgetting about this functionality. The handling of the visor should therefore be redesigned to meet hygienic standards. The participants proposed raising or lowering the visor by voice command. Additionally, they wished to use voice commands to bypass the gesture control, or speech input for documentation, which they assumed would be easier and faster than gesture control. We also observed uncertainty about how transparently the documentation should be completed in front of the patient. Loudly expressed documentation content or input commands could make the patient uncomfortable or anxious. This contrasts with statements from [24], which advocate maximizing transparency in order to create common ground. Concerns were also raised about inadvertently activating the voice control when talking to the patient, especially if such input is not noticed consciously.
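
One possible way to reconcile the wish for voice control with the concern about inadvertent activation is to require an explicit activation phrase and to accept commands only within a short time window afterwards. The following minimal sketch illustrates this idea; it is not part of the evaluated prototype, and the activation phrase, the command set and the timing are purely illustrative assumptions rather than the HL2’s actual speech interface.

```python
import time

ACTIVATION_PHRASE = "hey hololens"   # illustrative wake phrase, not the real HL2 keyword
ACTIVE_WINDOW_S = 5.0                # commands are accepted only shortly after activation

COMMANDS = {
    "raise visor": lambda: print("raising visor"),
    "lower visor": lambda: print("lowering visor"),
    "start documentation": lambda: print("opening documentation input"),
}

_last_activation = 0.0

def on_speech(utterance: str) -> None:
    """Handle recognized speech; treat normal conversation with the patient as no-op."""
    global _last_activation
    text = utterance.lower().strip()

    if ACTIVATION_PHRASE in text:
        _last_activation = time.monotonic()
        return  # activation only, no command executed yet

    # Outside the activation window, speech is treated as conversation and ignored.
    if time.monotonic() - _last_activation > ACTIVE_WINDOW_S:
        return

    action = COMMANDS.get(text)
    if action:
        action()                 # e.g., raise the visor without touching the device
        _last_activation = 0.0   # require re-activation for the next command

# Example: talking to the patient changes nothing, an activated command is executed.
on_speech("The wound looks much better today.")   # ignored as conversation
on_speech("hey hololens")                          # opens the listening window
on_speech("raise visor")                           # executed hands-free
```

Such a guard would also address the documentation issue raised above, since dictated entries would only be captured after deliberate activation rather than during conversation with the patient.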

5.4. Strategies of the Nurses to Support Interaction Work

To ensure successful interaction work, the nurses considered it more important to convey a positive feeling to the patients than to place the hologram in the position most comfortable for themselves. These strategies should be presented to and discussed with the nurses to shed more light on the underlying motives and their changing role, as described in [5]. In addition, the interaction concept should be developed further so that it meets both demands.
Similarly to the concept of Klinker et al. [15], our prototype was designed to allow documenting and caring for the wound simultaneously. However, some nurses initially did not complete the documentation during wound care, preferring to complete it afterwards, as usual. In these cases, we prompted them to complete it directly, so that they could gain experience with the HL2. Nurses pointed to this change in procedure and revealed that it introduces an additional stress factor. Additionally, when the interaction work concept of Böhle and Weihrich [24] is taken into account, it becomes apparent that not just two, but several complex tasks need to be performed simultaneously by the nurses: (1) responding to the patient’s needs, (2) treating the wound, (3) documenting the status, (4) exploring how to use the HL2 individually, (5) making the process transparent to the patient to establish common ground and, finally, (6) managing their own emotions to make a calm, competent and trustworthy impression. For further development, it will be crucial to figure out how best to support balancing these tasks. On the one hand, information can be brought much closer to the specific situation, both in time and space. On the other hand, the influence of AR operations that are visible to the outside world, but not comprehensible to outsiders, must be given greater consideration.

5.5. Chances and Risks for Integration

Overall, our study provided a realistic impression of what a nursing situation with AR technology could feel like in the future, which enabled the participants to provide meaningful feedback on the opportunities and risks of integration. In general, the use of AR was seen positively. However, some areas of tension were identified, revealing fundamental conditions that need to be considered more strongly in the further development of such a system. Besides technical shortcomings and the still unfamiliar handling, the integration of the glasses into nursing interaction work with patients represents a particularly complex challenge. With regard to the on-boarding process, it was stated that both nurses and patients need to be informed about the AR glasses, their functionality, and the context of usage. Nurses emphasized that informing patients cannot be their sole responsibility, and that patients need to be kept informed through other channels, such as brochures. In particular, the camera was mentioned in this context, as it could endanger the patient’s privacy.
The prototype’s concept, content and functionalities were rated as useful, helpful, and work-facilitating. Nevertheless, it also became apparent that providing extensive information and documentation functionality leads to an increasing parallelization of previously linear tasks, with the risk of shifting the focus from the patient to the documentation.
Referring to the involvement of patients, our results indicate that the bulky HL2 has to be seen as a disruptive factor in the interaction work between nurses and patients. Participants in our study suspected that some patients might feel uncomfortable if they, as caregivers, wore AR glasses. This assumption is supported by Klinker et al. [16], who investigated, from the patient’s perspective, to what extent patients would opt in to a treatment with the HL2. Here, some surveyed participants found that caregivers wearing smart glasses look inhumane, as their eyes can hardly be seen, and found it difficult to build a trusting relationship with such a person [16]. However, in the study conducted by Janssen and Prilla [9], in which caregivers tested AR glasses in a comparable nursing scenario, participants expressed fewer concerns regarding patients’ acceptance of AR glasses. Interestingly, none of the interviewees expected patients to have severe problems with nurses wearing AR glasses if the nurses explained the device properly beforehand [9]. In contrast to [16] and our study, [9] used another device, which is far less bulky, has no tinted visor, and more closely resembles a pair of conventional glasses. Additionally, ongoing technological improvements in hardware could resolve this obstacle in the future, so that eye contact is not disturbed and technical features, such as cameras, are less obvious.
However, with regard to trust, Klinker et al. [16] also reported positive aspects of smart glasses, as some patients mentioned the reduction of errors and the higher productivity of the caregivers. This maps well to statements from nurses in our study, who derived a sense of competence from the possibility of checking the augmented information and verifying their decisions against it. The information directly provided by AR can also help nurses respond more quickly to patients’ questions. Feeling competent is strongly influenced by the ability to establish a trustful relationship with the patient, as explained in [5,24,42] and also reported by our participants.

6. Limitations and Implications for Future Research

In the further development of AR systems for wound management, the first step should be to ensure that nurses perceive themselves as competent and autonomous. Good usability must be ensured so that the nurse can make optimal use of the contents and functions of the application in the nursing situation. Technical difficulties during usage led to frustration when the prototype did not respond to the nurse’s input instantly. Since the study used only a prototype implementation, not all usability problems could be eliminated in advance. We countered this limitation by informing the participants that they would be working with a prototypical version, and participants’ statements were therefore evaluated under this restriction. Furthermore, the analysis of the study indicated that some participants had difficulties navigating through the hologram, leading to a number of operating errors. Although we integrated the option for near and far gestures, a more robust system needs to be implemented in the future to maximize usability and user experience. Therefore, we suggest removing elements like the slider and replacing them with buttons or text input fields. In addition, the menu structure needs to be designed more intuitively, and drop-down buttons should be labeled more explicitly. In a second step, consideration should also be given to how these multimodal forms of interaction, such as gestures or voice commands, are perceived and interpreted by outsiders. This plays a central and complex role, especially in interaction work on and with humans, and has not yet been investigated in sufficient detail. Accordingly, future research should examine the impact of different AR glasses models on the self-perception of the caregivers who wear them and on how they are perceived by others. Special focus should be placed on the impact of eye contact, as it might have stronger effects on trust than other nonverbal behavior, such as body posture [43]. We observed additional strategies, such as involving the patient by explaining what is presented via AR to establish common ground and to keep the patient informed of what is happening, which may partly compensate for the reduced eye contact. As a result, special emphasis should be placed on how a trustful connection between the patient and the caregiver can be established when using AR technology. Nonverbal and verbal communication take place not only among people but, in this case, also between people and technology. The participants highlighted that the information and processes were presented more clearly through the HL2 than in the system currently in use. A direct comparison between traditional and AR-assisted documentation was not made, and could be explored in subsequent studies to substantiate this impression.
Complicating matters with AR is the lack of common ground, as only the caregiver, not the patient, can see the content. Interpersonal misunderstandings, as well as unintentional inputs, can quickly arise in these situations, as our results indicate, and must be adequately addressed by future application concepts. When designing interaction patterns, designers should consider not only intended inputs made by the users, but also unintended inputs, whether caused by the user or by the patient. We recommend that future research consider not only the perceptions of nurses, but also those of the patients. Both in our study and in [9,16], the need to introduce and explain the AR glasses to the patient emerges as a key requirement for successful integration into practice. Thus, we recommend extending the design process beyond the digital product itself and complementing it with service design methods that integrate additional artifacts, instructions and routines. Additionally, Friemer et al. [44] have already suggested that nurses need training not only to learn how to operate a new technology, but also to build an understanding of the context of its usage. A next step should be to investigate the effects of the changed workflow over longer work phases, for example when treating several patients in succession.
Additionally, some participants experienced motion sickness. According to the findings of [16], this is a common side effect of using AR glasses. To mitigate this limitation, we provided the participants with a familiarization period with the glasses at the beginning and offered them the opportunity to sit down, ensuring that no one had to terminate the study. Future research should focus on how AR holograms can be designed to maximize usability and minimize the feeling of dizziness.

7. Conclusions

In conclusion, it can be stated that the approach of well-being-centered system design following the positive computing framework, an intensive observation of the work task’s characteristics, and the involvement of domain experts led to a promising prototype, but also to the identification of further needs for research and development. The nurses’ perceived autonomy was supported by the option to control the glasses with both near and far gestures and to set up the workplace independently. Furthermore, it was observed that the presumed patient perception influenced the perceived personal competence. In addition, the hardware was considered too bulky; future developments are expected to result in smaller devices that are less disruptive to the connectedness with the patient. We recommend that future research focus on patients’ perceptions. Their perspective should also be considered and integrated into the design of applications in order to ensure that all requirements are covered. In addition, the technical and organizational framework conditions for integration into real hospital operations must be investigated. The basic conceptual approaches of this work suggest transferability to other fields of application, which should be investigated in more detail in future work.

Author Contributions

Conceptualization, C.A.-G., L.T., S.C.E. and S.G.; methodology, C.A.-G. and L.T.; software, C.A.-G.; validation, S.C.E. and S.G.; formal analysis, C.A.-G. and L.T.; investigation, C.A.-G. and L.T.; resources, C.A.-G., L.T. and S.G.; data curation, C.A.-G. and L.T.; writing—original draft preparation, C.A.-G. and L.T.; writing—review and editing, S.C.E. and S.G.; visualization, C.A.-G. and L.T.; supervision, S.C.E. and S.G.; project administration, S.C.E. and S.G.; funding acquisition, S.C.E. and S.G. All authors have read and agreed to the published version of the manuscript.

Funding

This work is part of the PARCURA project, funded by the Federal Ministry of Education and Research Germany and the European Social Fund of the EU (grant no. 02L18A164).

Institutional Review Board Statement

Ethical review and approval were not required for the study on human participants in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data are presented in aggregated format within this manuscript and throughout the results section. Further inquiries can be directed to the corresponding author. The data are not publicly accessible for the protection of the test subjects.

Acknowledgments

We would like to thank the partners and participants from the hospitals involved, especially the co-designers, for their valuable contributions.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Fendrich, K.; Hoffmann, W. More than just aging societies: The demographic change has an impact on actual numbers of patients. J. Public Health 2007, 15, 345–351. [Google Scholar] [CrossRef]
  2. Michel, J.P.; Ecarnot, F. The shortage of skilled workers in Europe: Its impact on geriatric medicine. Eur. Geriatr. Med. 2020, 11, 345–347. [Google Scholar] [CrossRef] [PubMed]
  3. Rössler, W. Stress, burnout, and job dissatisfaction in mental health workers. Eur. Arch. Psychiatry Clin. Neurosci. 2012, 262, 65–69. [Google Scholar] [CrossRef] [PubMed]
  4. Zander, B.; Dobler, L.; Busse, R. The introduction of DRG funding and hospital nurses’ changing perceptions of their practice environment, quality of care and satisfaction: Comparison of cross-sectional surveys over a 10-year period. Int. J. Nurs. Stud. 2013, 50, 219–229. [Google Scholar] [CrossRef] [PubMed]
  5. van der Cingel, M.; Brouwer, J. What makes a nurse today? A debate on the nursing professional identity and its need for change. Nurs. Philos. 2021, 22, e12343. [Google Scholar] [CrossRef] [PubMed]
  6. Prilla, M.; Recken, H.; Janßen, M.; Schmidt, A. Die Pflegebrille als Instrument der Digitalisierung in der Pflege: Nutzenpotentiale. In Assistive Technologien im Sozial- und Gesundheitssektor; Luthe, E.W., Müller, S.V., Schiering, I., Eds.; Springer Fachmedien: Wiesbaden, Germany, 2022; pp. 735–752. [Google Scholar] [CrossRef]
  7. Wüller, H.; Behrens, J.; Garthaus, M.; Marquard, S.; Remmers, H. A scoping review of augmented reality in nursing. BMC Nurs. 2019, 18, 19. [Google Scholar] [CrossRef] [PubMed]
  8. Park, S.; Bokijonov, S.; Choi, Y. Review of Microsoft HoloLens applications over the past five years. Appl. Sci. 2021, 11, 7259. [Google Scholar] [CrossRef]
  9. Janßen, M.; Prilla, M. Investigating the use of Head Mounted Devices for remote cooperation and guidance during the treatment of wounds. Proc. ACM Hum.-Comput. Interact. 2022, 6, 1–27. [Google Scholar] [CrossRef]
  10. Klinker, K.; Berkemeier, L.; Zobel, B.; Wüller, H.; Huck-Fries, V.; Wiesche, M.; Remmers, H.; Thomas, O.; Krcmar, H. Structure for innovations: A use case taxonomy for smart glasses in service processes. Multikonferenz Wirtsch. Lünebg. Dtschl. 2018, 4, 1599–1610. [Google Scholar]
  11. Aicher, S.; Klinker, K.; Wiesche, M.; Krcmar, H. Augmented Reality im Gesundheitswesen—Entwurf und Auswertung einer HoloLens-Anwendung zur Verteilung von Medikamenten. In Systematische Entwicklung von Dienstleistungsinnovationen; Wiesche, M., Welpe, I.M., Remmers, H., Krcmar, H., Eds.; Research; Springer Gabler: Berlin/Heidelberg, Germany, 2021; pp. 287–306. [Google Scholar] [CrossRef]
  12. Schneidereith, T. Seeing Through Google Glass: Using an Innovative Technology to Improve Medication Safety Behaviors in Undergraduate Nursing Students. Nurs. Educ. Perspect. 2015, 36, 337–339. [Google Scholar] [CrossRef]
  13. Othman, S.B.; Foinard, A.; Herbommez, P.; Storme, L.; Décaudin, B.; Hammadi, S.; Odou, P. Augmented reality for risks management in injectable drugs preparation in hospital pharmacy. In Proceedings of the GERPAC, Hyères, France, 5–7 October 2016. [Google Scholar]
  14. Chang, W.J.; Chen, L.B.; Hsu, C.H.; Chen, J.H.; Yang, T.C.; Lin, C.P. MedGlasses: A Wearable Smart-Glasses-Based Drug Pill Recognition System Using Deep Learning for Visually Impaired Chronic Patients. IEEE Access 2020, 8, 17013–17024. [Google Scholar] [CrossRef]
  15. Klinker, K.; Wiesche, M.; Krcmar, H. Digital transformation in health care: Augmented reality for hands-free service innovation. Inf. Syst. Front. 2020, 22, 1419–1431. [Google Scholar] [CrossRef]
  16. Klinker, K.; Wiesche, M.; Krcmar, H. Smart Glasses in Health Care: A Patient Trust Perspective. In Proceedings of the Hawaii International Conference on System Sciences, Maui, HI, USA, 7–10 January 2020. [Google Scholar]
  17. Calvo, R.A.; Peters, D. Positive Computing: Technology for Wellbeing and Human Potential; MIT Press: Cambridge, MA, USA, 2014. [Google Scholar] [CrossRef]
  18. Pawlowski, J.M.; Eimler, S.C.; Jansen, M.; Stoffregen, J.; Geisler, S.; Koch, O.; Müller, G.; Handmann, U. Positive computing: A new trend in business and information systems engineering? Bus. Inf. Syst. Eng. 2015, 57, 405–408. [Google Scholar] [CrossRef]
  19. Peters, D.; Calvo, R.A.; Ryan, R.M. Designing for motivation, engagement and wellbeing in digital experience. Front. Psychol. 2018, 9, 797–812. [Google Scholar] [CrossRef] [PubMed]
  20. Ryan, R.; Deci, E. Self-Determination Theory and the Facilitation of Intrinsic Motivation, Social Development, and Well-Being. Am. Psychol. 2000, 55, 68–78. [Google Scholar] [CrossRef] [PubMed]
  21. Diener, E.; Diener, C. Monitoring psychosocial prosperity for social change. In Positive Psychology as Social Change; Springer: Dordrecht, The Netherlands, 2011; pp. 53–71. [Google Scholar] [CrossRef]
  22. Gaggioli, A.; Riva, G.; Peters, D.; Calvo, R.A. Chapter 18—Positive Technology, Computing, and Design: Shaping a Future in Which Technology Promotes Psychological Well-Being. In Emotions and Affect in Human Factors and Human-Computer Interaction; Jeon, M., Ed.; Academic Press: San Diego, CA, USA, 2017; pp. 477–502. [Google Scholar] [CrossRef]
  23. Wüller, H.; Behrens, J. Anforderungen an Augmented Reality in der Pflege. In Systematische Entwicklung von Dienstleistungsinnovationen: Augmented Reality für Pflege und Industrielle Wartung; Wiesche, M., Welpe, I.M., Remmers, H., Krcmar, H., Eds.; Springer Fachmedien: Wiesbaden, Germany, 2021; pp. 153–169. [Google Scholar] [CrossRef]
  24. Böhle, F.; Weihrich, M. Das Konzept der Interaktionsarbeit. Z. Arbeitswissenschaft 2020, 74, 9–22. [Google Scholar] [CrossRef]
  25. Ekman, P.; Friesen, W.V. Constants across cultures in the face and emotion. J. Personal. Soc. Psychol. 1971, 17, 124–129. [Google Scholar] [CrossRef] [PubMed]
  26. Ekman, P.E.; Davidson, R.J. The Nature of Emotion: Fundamental Questions; Oxford University Press: Oxford, UK, 1994. [Google Scholar]
  27. Argyle, M. Rules for social relationships in four cultures. Aust. J. Psychol. 1986, 38, 309–318. [Google Scholar] [CrossRef]
  28. Archer, D.; Akert, R.M. Words and everything else: Verbal and nonverbal cues in social interpretation. J. Personal. Soc. Psychol. 1977, 35, 443–449. [Google Scholar] [CrossRef]
  29. Heider, F. Social perception and phenomenal causality. Psychol. Rev. 1944, 51, 358. [Google Scholar] [CrossRef]
  30. Clark, H.H.; Brennan, S.E. Grounding in communication. In Perspectives on Socially Shared Cognition; Resnick, L.B., Levine, J.M., Teasley, S.D., Eds.; American Psychological Association: Washington, DC, USA, 1991; pp. 127–149. [Google Scholar] [CrossRef]
  31. Robertson, T.; Simonsen, J. Participatory Design: An introduction. In Routledge International Handbook of Participatory Design; Simonsen, J., Robertson, T., Eds.; Routledge: New York, NY, USA, 2012; pp. 1–17. [Google Scholar] [CrossRef]
  32. Greenbaum, J.; Loi, D. Participation, the camel and the elephant of design: An introduction. CoDesign 2012, 8, 81–85. [Google Scholar] [CrossRef]
  33. Brown, T. Design thinking. Harv. Bus. Rev. 2008, 86, 84–92. [Google Scholar] [PubMed]
  34. Ball, J. The Double Diamond: A Universally Accepted Depiction of the Design Process. 2019. Available online: https://www.designcouncil.org.uk/our-resources/archive/articles/double-diamond-universally-accepted-depiction-design-process/ (accessed on 30 April 2024).
  35. Albrecht-Gansohr, C.; Geisler, S.; Eimler, S.C. Playful Co-Design: Creating an AR-Prototype with Nurses in Interlocking Remote and On-Site Workshops. In Proceedings of the Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023; pp. 1–8. [Google Scholar] [CrossRef]
  36. Klinker, K.; Przybilla, L.; Huck-Fries, V.; Wiesche, M.; Krcmar, H. Wundmanagement mittels Tablet-basierter Augmented Reality Anwendungen. In Systematische Entwicklung von Dienstleistungsinnovationen: Augmented Reality für Pflege und Industrielle Wartung; Wiesche, M., Welpe, I.M., Remmers, H., Krcmar, H., Eds.; Springer Fachmedien: Wiesbaden, Germany, 2021; pp. 245–262. [Google Scholar] [CrossRef]
  37. Klinker, K.; Przybilla, L.; Wiesche, M.; Krcmar, H. Augmented Reality für das Wundmanagement: Hands-Free Service Innovation mittels Datenbrillen. In Systematische Entwicklung von Dienstleistungsinnovationen: Augmented Reality für Pflege und Industrielle Wartung; Wiesche, M., Welpe, I.M., Remmers, H., Krcmar, H., Eds.; Springer Fachmedien: Wiesbaden, Germany, 2021; pp. 263–285. [Google Scholar] [CrossRef]
  38. PARCURA Consortium. Der Erste Prototyp. 2022. Available online: https://www.parcura.de/media/parcura_hrw_simulationsstudie_prototyp_promo.mp4 (accessed on 30 April 2024).
  39. Hennink, M.; Kaiser, B.N. Sample sizes for saturation in qualitative research: A systematic review of empirical tests. Soc. Sci. Med. 2022, 292, 114523. [Google Scholar] [CrossRef] [PubMed]
  40. Mayring, P.; Fenzl, T. Qualitative Inhaltsanalyse. In Handbuch Methoden der Empirischen Sozialforschung; Baur, N., Blasius, J., Eds.; Springer Fachmedien: Wiesbaden, Germany, 2019; pp. 633–648. [Google Scholar] [CrossRef]
  41. Ruin, S. Categories as an Expression of an Identified Observer Perspective? A Constructive Proposal for a more Qualitative Qualitative Content Analysis. Forum Qual. Sozialforschung/Forum Qual. Soc. Res. 2019, 20, Art. 37. [Google Scholar] [CrossRef]
  42. Crigger, N.; Godfrey, N. From the Inside Out: A New Approach to Teaching Professional Identity Formation and Professional Ethics. J. Prof. Nurs. 2014, 30, 376–382. [Google Scholar] [CrossRef]
  43. Hillen, M.; van Tienhoven, G.; Bijker, N.; Laarhoven, H.; Vermeulen, D.; Smets, E. All eyes on the patient: The influence of oncologists’ nonverbal communication on breast cancer patients’ trust. Breast Cancer Res. Treat. 2015, 153, 161–171. [Google Scholar] [CrossRef]
  44. Friemer, A. Digitale Technik droht? Bedroht? Wirklich nur? Kompetenzentwicklung in Veränderungsprojekten. In Digitalisierung der Arbeit in der Langzeitpflege als Veränderungsprojekt; Bleses, P., Busse, B., Friemer, A., Eds.; Springer: Berlin/Heidelberg, Germany, 2020; pp. 135–150. [Google Scholar] [CrossRef]
Figure 1. Screenshots from the prototype. The wounds are only pixelated for publication. (a) shows the last documented status. The top button bar can be used to call up the individual steps in wound treatment and documentation as described by the nursing staff involved. (b) The nurse moves the window to the position where she needs it for her work. (c) shows how a new pain score is documented. (d) An additional window to document the wound stage is activated. A video of the prototype is available at https://parcura.de/media/parcura_hrw_simulationsstudie_prototyp_promo.mp4 (accessed on 20 May 2024) [38].
Figure 2. Study setup: overview of the steps of the study and methods used.
Figure 3. The three identified window positioning patterns and their distribution in the sample.
Figure 4. Special work situations at the bedside with the HoloLens 2: (a) Participant reaches over the patient to interact with the system using near gestures. (b) Participant raises the visor to maintain eye contact with the patient.
Table 1. Coding scheme.

Inductive Codes                  Deductive Codes
Screen 1: Patient selection      Autonomy
Screen 2: Wound selection        Competence
Screen 3: Patient information    Connectedness
Screen 4: View of tabs           Neutral statement
Screen 5: Wound image            Positive statement
Screen 6: Documentation input    Negative statement
Concerns                         Motion Sickness
Suggestions                      Successful interaction
Expectations                     Interaction with problems
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
