Article

Gamification of Upper Limb Rehabilitation in Mixed-Reality Environment

by Aditya Pillai 1, Md Samiul Haque Sunny 2,*, Md Tanzil Shahria 2, Nayan Banik 2 and Mohammad Habibur Rahman 1,2,3

1 Biorobotics Lab, University of Wisconsin-Milwaukee, Milwaukee, WI 53212, USA
2 Computer Science, University of Wisconsin-Milwaukee, Milwaukee, WI 53211, USA
3 Mechanical Engineering, University of Wisconsin-Milwaukee, Milwaukee, WI 53211, USA
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(23), 12260; https://doi.org/10.3390/app122312260
Submission received: 26 October 2022 / Revised: 24 November 2022 / Accepted: 24 November 2022 / Published: 30 November 2022
(This article belongs to the Special Issue AR, VR: From Latest Technologies to Novel Applications)

Abstract
The advancements in mixed reality (MR) technology in recent years have provided excellent prospects for creating novel approaches that supplement conventional physiotherapy and help maintain a sufficient quantity and quality of rehabilitation. The use of MR systems to facilitate patients’ participation in intensive, repetitive, and task-oriented practice using cutting-edge technologies to enhance functionality and facilitate recovery is very encouraging. Multiple studies have found that patients who undergo MR-based therapy experience significant improvements in upper limb function; however, assessing the efficacy of MR is challenging due to the wide variety of methods and tools used. To address these challenges, a novel gamified MR-based solution for upper extremity rehabilitation is proposed: an MR application for the Microsoft HoloLens 2, complete with game levels, that can measure the ranges of motion of the arm joints. The proposed rehabilitative system’s functionality and usability were evaluated with ten healthy adult participants with no prior arm-related injuries and two occupational therapists (OTs). The system successfully provided rehab exercises for upper limb injuries through interactive mixed-reality games and can mimic upper limb behavior without additional sensors during rehab sessions. Unlike previously researched technology-based rehabilitation methods, it integrates arm-joint data directly within the application and operates as a standalone system. The results and comparisons show that this system is relevant, accurate, and superior to previous VR-based rehabilitation methods: VR-based systems are blind to the surroundings, whereas the proposed approach has spatial awareness of the environment.

1. Introduction

Upper extremity (UE) joint injuries are a significant concern because they can limit the range of motion of the arm joints, which can hinder a person’s ability to conduct day-to-day chores. In the emergency room, 20% of patients present with injuries to their upper extremities [1]. These injuries can be painful and can quickly escalate if individuals do not receive proper treatment. Patients will continue to use their arms and hands until the discomfort and loss of movement become intolerable [2].
To help patients recover from these injuries, therapists generally create a personalized treatment plan that includes physical therapy, exercises, and stretches to boost the body’s healing process. During each session, patients must repeatedly work through a series of activities involving the injured part of the upper extremity to keep joints and muscles from tightening and weakening. By monitoring their patients’ activity in rehabilitation, therapists can ensure that their patients are making progress and benefiting from the treatment. However, the most common approaches to rehabilitation after such injuries have limitations that may slow the patient’s progress. The motions used in today’s treatments for joint injuries rarely provide any additional stimulation to keep the patient engaged in the appropriate rehabilitation [3], and a patient’s recovery might be delayed if they lose interest in their rehab because of monotony [4]. Most home-based rehabilitation programs also rely heavily on patient compliance with therapy sessions. Unfortunately, there are few efficient methods available to occupational therapists, as most forms of rehabilitation therapy consist of only the most fundamental activities.
In general, rehabilitation approaches for arm injuries of this nature are designed based on prior experience. Therefore, there is a pressing need for an innovative approach to rehabilitation. One example is sensor-based rehabilitation, a form of therapy with many variations. For instance, Inertial Measurement Units (IMUs), sensors that measure acceleration and rotation, are used to record the arm’s motions and follow the patient’s rehabilitation progress [5]. However, the patient must correctly position the IMUs on the arm to measure certain variables, such as the range of motion at various joints. Surface electromyography (sEMG) is another form of sensor-based therapy, in which EMG sensors monitor the electrical gradient along muscles as they contract to guide treatment for various conditions; however, a slight change in sensor placement can affect its precision [5]. Neither of these methods offers a way to keep the patient engaged and motivated, which can hamper the patient’s recovery.
Due to these challenges in the field, we propose a new approach: a gamified Mixed Reality (MR)-based solution for upper extremity rehabilitation. This solution is fundamentally an MR application for the Microsoft HoloLens 2, including game levels comparable to a video game and an algorithm for measuring the ranges of motion of the various arm joint degrees of freedom. Gamification was chosen as the answer to this problem because research has shown that it can improve learning in various contexts by increasing engagement levels [6]. Choosing MR as the foundation benefits the field of rehabilitation, especially gait rehabilitation [7], because MR comes with spatial awareness, which offers more safety than existing VR-based rehabilitation. The patient will be able to perform rehabilitation in the comfort of their own home thanks to the proposed solution’s exceptional portability. OTs/clinicians can monitor the patient’s engagement and progress in therapy by examining the information collected in real-time throughout each session. Furthermore, using a unique algorithm and input from the OT/clinician, the program is patient-specific and can adjust to the patient’s individual needs.
The remainder of this manuscript is arranged as follows: Section 2 reviews recent advancements in similar fields, while Section 3 provides a detailed illustration of the system’s development. The sequence of different events is laid out in Section 4. Section 5 contains the experimental procedures and outcomes, along with a detailed discussion of the proposed approach and the criteria for evaluating the proposed gamified system. Finally, Section 6 concludes the manuscript.

2. Related Works

Conventional therapy for patients with UE joint injuries requires long-term commitment from the patient, active involvement of therapists, and specialized resources in a controlled environment. In this approach, the patient must perform routine therapeutic exercises that are non-engaging and repetitive, and such unmotivating tasks make the patient reluctant to participate in therapy sessions. In addition, the devices used in conventional therapy are expensive and limited to use under the supervision of trained professionals. Moreover, these manual devices lack sensors to record therapy session data for analysis. Therefore, affordable home-based tech-assisted therapeutic solutions are an active research domain [8].
Recent advancements in gamified approaches using Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) technology improve UE rehabilitation, overcoming the drawbacks of the conventional counterpart [9]. In this model, a patient can receive home-based therapeutic exercises without the continuous assistance of a therapist. The game’s combination of natural and virtual objects, and real-time interaction with them, provides a familiar environment. Moreover, the patient can receive automated guidance and correction from reliable feedback, which ensures motivation and reward-based engagement. From a neurological point of view, stimulation of brain plasticity during gameplay improves motor functions [10]. Continuous state recording ensures progress tracking, and the recorded data can also be analyzed to inform the treatment of other patients. Moreover, unlike manual hardware, the virtual environment can be modified, and challenging in-game activities can be incorporated based on the patient’s abilities. As gamified therapy systems are accessible and provide many advantages over conventional ones, ongoing work targets different tools and techniques to develop such systems.
VR-based games for therapy portray a computer-generated environment in which the patient acts through the virtual presence of a human-resembling model. AR incorporates interaction with those virtual objects using smart handheld devices. The latest addition to this lineage is MR technology, where the real-world environment co-exists with the virtual world and communication between the two happens in real-time. These emerging gamified therapy approaches provide a multimedia-based realistic environment that helps the patient proactively induce recovery, discover unknown phenomena through exploration, increase their willingness to accept challenges, and participate in an inclusive environment that improves social communication [11].
VR/AR/MR-based games can be classified by degree of immersion and commercial availability. Typically, non-immersive VR games are displayed on a computer screen, where the patient interacts with the environment using simple hardware devices such as a mouse, joysticks, and haptic devices. In contrast, AR/MR games are immersive because they combine the virtual and natural worlds with improved sensory outputs; such systems require big-screen projection, sophisticated head-mounted displays, and various sensors to capture the actions [12]. Moreover, VR/AR/MR games come in commercial and open-source varieties. Notable commercially released games include those for the Microsoft Xbox Kinect, Nintendo Wii, and Sony’s PlayStation Move. Although proprietary games are developed for healthy people and lack task-specific modifications, existing works examine the efficacy of such games for UE therapy based on randomized controlled trials (RCTs) with evaluation measures on different scales [13].
Existing literature shows that VR-based UE therapy is an ongoing research effort for patients with chronic neurological and musculoskeletal diseases, including stroke [14], multiple sclerosis, and cerebral palsy. Non-immersive VR therapy for UE was applied in [15], where the authors used the commercial VR-based software MIRA with Kinect sensors. In this RCT of fifty-two participants, the experimental group received two types of games from the functional category, targeting movement coordination with progressive difficulty levels. The in-person therapist assigned the games based on the patient’s functional capacity. Several assessment scores suggested that properly designed and functionally adapted standalone gamified VR therapy can improve UE motor functions in acute and chronic stroke patients.
User experience-centric research on gamification showed that it is effective in multiuser, support-group settings. The authors of [16] used the Jintronix Rehabilitation System with a Kinect camera to track limb movements, where patients could choose exercises from four to six prescribed games depending on the on-site presence of the physiotherapist. Based on the assessment scores, the work concluded that VR-based therapy is reliable, cheap, and effective even when remotely monitored in a support-group scenario. Although a prior study [17] claimed that Kinect-based therapy was ineffective compared to VR-based therapy, the authors of [18] built a home-based therapeutic solution by using a Kinect-enabled device to develop a Unity3D-based gamified intervention for stroke patients. The games (ball bump, trajectory tracing, and food fight) involved manipulating virtual objects and training the patients’ hand movements. The results showed that multiuser VR therapy in the home environment is more effective than a single-user system.
Gamified therapy showed improvements in different UE motor functions based on the standard Fugl–Meyer Assessment-Upper Extremity (FMA-UE) score. A recent RCT [19] used a Leap Motion Controller (LMC)-based device to play games targeting different hand movements. A multilevel leap ball game was developed to train hand grasp movements such as wrist and finger flexion–extension, while a Pong game was used for wrist-elbow movements such as radial-ulnar deviation, elbow flexion–extension, and forearm pronation–supination. The results showed an increased range of motion for the hand overall and, specifically, for wrist movements. However, the authors noted that although the VR-based approach improved the FMA-UE and other scores, it was less effective in strengthening grip. In contrast, the work in [20] used an LMC device as hardware and the Unity3D game engine to develop a game providing seven functional tasks for the hemiplegic UE.
The effect of the gamified approach to UE functions on early-stage stroke patients showed satisfactory results [21]. Moreover, Nintendo Wii-based gamified therapy improved visual and motor performance, as shown in [22,23]. However, the authors of [24] claimed the opposite when they performed an RCT comparing Nintendo Wii-enabled VR-based therapeutic exercises with occupational therapy for early-stage stroke patients. The patients played three sports games (swordplay, table tennis, and canoe), and the experimental results showed that although it improved UE functions and ADL, this device-dependent VR-based therapy is less effective during the early stage of stroke than conventional therapy.
Incorporating VR-based gamified therapy into conventional therapy is considered effective in improving UE motor functions [25,26,27]. The authors of [28] claimed the same, combining traditional therapy with VR game-based task-oriented therapy using the Armeo Spring device for chronic stroke patients. In another RCT for multiple sclerosis patients [29], the authors developed six serious games using Unity3D following clinicians’ guidelines, reflecting activities commonly performed in traditional rehabilitation, such as hand pronation–supination and finger flexion–extension. Based on their assessment, LMC-based gamified UE therapy combined with conventional therapy can enhance gross and fine manual dexterity and hand coordination. A combination of physiotherapy and Xbox Kinect-based VR training can improve UE function for chronic stroke patients [30,31]. A recent RCT [32] used Xbox Kinect 360 devices with three preloaded games (tennis player, joy riding, rally ball) from the Kinect adventure and sports packs. The work concluded that Kinect games could be applied as part of conventional therapy to improve UE function in stroke patients.
Comparative analysis of available AR games for the UE is necessary for physicians to prescribe therapy depending on the patient’s condition. Studies show dropout rates of 12–56% for conventional therapy and 10% for VR-based therapy [33,34]. It is also essential for game developers to understand existing limitations and propose modifications. Research in this domain shows that task-specific actions are beneficial in certain games: the authors of [35] performed an RCT with three multilevel AR games (balance it, bubble pop, scoop’d) based on Microsoft’s Kinect V2 sensor. In their results, the ’balance it’ game provided the best performance, as the patient had to stay active and concentrate on balancing the virtual objects.

3. Development

3.1. Concept

The motivation behind this project was to construct a system that patients could use to complete therapy at home. With this solution, patients can use a gamified, fully digital system to complete their rehabilitation at home while still being monitored and supervised by OTs. The proposed system has several unique features that distinguish it from existing MR/VR rehabilitation systems. It is “sensorless” in the sense that the patient does not need any sensors beyond those already included in the HoloLens 2 to complete upper extremity rehabilitation. This is important because many other telerehabilitation techniques rely on external sensors for data collection, and in a home rehabilitation setting the patient may inadvertently collect erroneous data if the sensors are not set up properly. Therefore, this was a pivotal point in the development of this project.

3.2. Components

Despite being a single MR application, this system incorporates several critical components. Not only must the video game run without hiccups, but the data must also be processed and packaged in a way that is suitable for review by the administrative OT. The following is a list of components that each play an important role in the functioning of the system. Figure 1 shows the components and how they are connected in the development of the proposed system.

3.2.1. HoloLens 2

Microsoft’s HoloLens 2 [36] is a pair of fully immersive MR smartglasses released in 2019 with upgraded features over its predecessor, the HoloLens (1st gen). Running Windows Holographic OS, the HoloLens 2 is packed with sensors, including visible-light cameras for hand tracking, IR cameras for eye tracking, a depth sensor, an IMU (accelerometer, gyroscope, magnetometer), a 5-channel microphone array, and built-in speakers. Moreover, this lightweight, untethered device provides resizable holographic windows, real-time multimedia sharing, and monthly OS updates. Typical use cases for the HoloLens 2 include remote assistance, visual teaching and learning with holographic instructions, and building apps for different domains.

3.2.2. Unity and Visual Studio

Unity [37], a popular game engine, served as the primary development platform for the proposed system. Version 2019.4.33f1 with C# support was used because of its continued support for MR-based HoloLens 2 application development; Unity was chosen mainly for its simple interface and compatibility with MR application development. In conjunction with Unity, the Visual Studio integrated development environment (IDE) [38] was used both to program the video game portion of the system and to export builds of the video game to the HoloLens 2 for testing. Visual Studio was preferred over other IDEs primarily for its seamless compatibility with Unity.

3.2.3. Mixed Reality ToolKit (MRTK)

This open-source toolkit provided by Microsoft serves as the foundation for the majority of MR applications available for the HoloLens 2. MRTK [39] provides core MR application features such as sample user interface (UI) game objects and scripts detailing player–environment interactions. These aspects of the MRTK, along with a number of other useful features, make it essential not only for creating the individual game levels but also for designing the algorithms that measure the range of motion (ROM) of different joints. Certain MRTK components, such as buttons and sliders, were used repeatedly during development.

3.3. Gamification

The button and slider UI game objects were extremely useful for navigating and interacting with the system’s video game portion. Though they served other minor functions in the system, buttons were primarily used for navigating the game’s main menu, as they allowed the user to indicate which part of their body they wanted to test. The pathway created for this purpose works by connecting each button to a new screen corresponding to that particular button press; for example, if the user clicks the hand button within the main menu scene that asks, “which part of the upper extremity do you want to test?”, the user is guided to a new menu scene that may then ask, “which finger do you want to test?” The slider’s primary function in our system was to interact with the upper arm levels. These levels differ from the others in that the main joints being tested are the shoulder and the elbow, and because the HoloLens 2’s cameras cannot recognize those joints, a newly developed algorithm had to be implemented. This algorithm can estimate the positions of the shoulder and elbow joints as well as the ROM for all three degrees of freedom (DoFs) of the shoulder and the flexion and extension of the elbow. However, for this algorithm to function properly, it requires some of the user’s personal information, specifically upper arm length, forearm length, shoulder width, and the distance between the eye and the clavicle. The sliders were useful here because they allowed the user to input these personalized dimensions at the start of each upper arm level, resulting in accurate data being collected by the system.
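As an illustration of this menu flow, the following minimal sketch shows how a button press can load the next menu scene in Unity. The script and scene names are hypothetical (the project’s actual navigation code is not published); a method like this would be wired to an MRTK button’s OnClick event in the Unity inspector.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical navigation helper for the main menu pathway described above.
public class MenuNavigation : MonoBehaviour
{
    // Wired to an MRTK button's OnClick event in the inspector; e.g., the
    // "hand" button in the "which part of the upper extremity do you want
    // to test?" scene could pass a scene name such as "FingerMenu".
    public void LoadMenuScene(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```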
The levels designed for this application are intended for use in a rehabilitative setting. To begin, the levels are organized by upper extremity part—upper arm, wrist, or finger—and are designed in such a way that people of all ages can easily participate in the exercises. Figure 2 shows some examples of the game levels. This was accomplished by basing the levels on very basic and rudimentary tasks. Among these tasks are shooting targets, constructing a structure, and completing a puzzle. In order to be as clear as possible about the given level, instructions for completing the tasks appear before the level begins.
Furthermore, levels were divided into three difficulty levels—easy, medium, and hard—to provide the user with a challenge based on their currently accessible ROM for a specific joint. A patient with very limited ROM, for example, may play an “easy” level, whereas a patient much further into their rehabilitation journey, and thus with much greater ROM, may play a “hard” level. For more information on the categorization of the different levels, see Table 1. Additionally, challenge levels have been developed to provide patients with a timed challenge, both testing their collaborative abilities and providing a source of competition for more competitive patients.
The scripts are useful for more technical aspects of the system, such as calculating upper extremity ROM and player–object interactions. RadialView and SolverHandler were especially helpful in configuring the algorithms that measure the ROMs of the upper extremity, because they were designed to recognize the player’s hand, finger joints, and wrist. As a result, those C# scripts were used to determine the ROM of the finger joints, the wrist joint, and, via a custom C# script created as part of the system, the elbow joint. In the video game portion of the system, the NearInteractionGrabbable and ObjectManipulator C# scripts were used to implement player–object interactions. This applies to every aspect of the video game, including UI elements such as sliders and buttons, as well as in-game objectives such as the cabbages in the level shown in Figure 2e.
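As a concrete illustration, the sketch below shows how an in-game target (such as a cabbage in the level from Figure 2e) can be made grabbable with the two MRTK scripts named above. The component names come from MRTK 2; the setup routine itself is our reconstruction, not the project’s published code.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Hypothetical helper that enables player-object interaction on a game object.
public class GrabbableSetup : MonoBehaviour
{
    private void Awake()
    {
        // A collider is required so hand rays and articulated hands can hit the object.
        if (GetComponent<Collider>() == null)
        {
            gameObject.AddComponent<BoxCollider>();
        }

        // Lets articulated hands grab the object at close range.
        gameObject.AddComponent<NearInteractionGrabbable>();

        // Handles one- and two-handed move/rotate/scale manipulation.
        gameObject.AddComponent<ObjectManipulator>();
    }
}
```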

4. Event Sequence

4.1. Game Creation and Launch

After the video game that aids in rehabilitating arm–joint injuries is created, it is deployed to the MR environment through the HoloLens 2. Figure 3 shows the game levels in MR, where the user can observe the spatial awareness. Once the tasks that require players to use their elbow and shoulder joints were determined, the user selects levels in which the player must perform actions involving both of these arm joints.

4.2. Upper Extremity Joint Measurement

A C# script uses the positions of the user’s headset and hands, together with the player’s input on their body dimensions, to calculate the DoFs of the shoulder and elbow joints. Four DoFs are measured in total: three from the shoulder (vertical flexion/extension, abduction/adduction, and internal/external rotation) and one from the elbow (flexion/extension). The first step in this process was to create a user interface (UI) for players to enter information about their bodies. This was accomplished through the MRTK-provided slider UI shown in Figure 4, which allows participants to enter a value ranging from 0.00 m to 1.00 m for their forearm length, upper arm length, shoulder width, and the distance between their eye level and clavicle bone. The entered values are then accessed by the methods used to determine the positions of the various joints.
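A minimal sketch of reading one such dimension from an MRTK 2 PinchSlider follows; the script and field names are hypothetical. PinchSlider reports a normalized value in [0, 1], which here maps directly onto the 0.00 m to 1.00 m input range.

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Hypothetical script storing one user-entered body dimension.
public class BodyDimensionInput : MonoBehaviour
{
    [SerializeField] private PinchSlider slider; // assigned in the inspector

    // Forearm length in meters; the slider's normalized value [0, 1]
    // maps directly to the 0.00-1.00 m range used by the system.
    public float ForearmLengthMeters { get; private set; }

    private void OnEnable()  { slider.OnValueUpdated.AddListener(OnSliderUpdated); }
    private void OnDisable() { slider.OnValueUpdated.RemoveListener(OnSliderUpdated); }

    private void OnSliderUpdated(SliderEventData data)
    {
        ForearmLengthMeters = data.NewValue;
    }
}
```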
The second step in this process was to locate the shoulder and elbow joints and determine the different DoFs. It is important to note that the x-axis in Unity represents the horizontal direction, where moving to the right is positive. Furthermore, the y-axis represents the vertical direction, with upward movement being positive, and the z-axis represents depth, with forward movement being positive. The shoulder position was determined from the user inputs: a new game object is instantiated at the offset x and y positions while retaining the same z position as the headset. We assumed that the z position would remain constant because the headset and shoulder lie in the same plane, given that the user’s neck must be straight to interact with the game appropriately.
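The sketch below reconstructs this shoulder estimate under the stated assumptions, with the headset pose taken from Unity’s main camera. The exact offsets and the right-shoulder sign convention are our assumptions, not the published code.

```csharp
using UnityEngine;

// Hypothetical reconstruction of the shoulder-position estimate.
public class ShoulderEstimator : MonoBehaviour
{
    public float eyeToClavicle = 0.25f; // user-entered, meters
    public float shoulderWidth = 0.40f; // user-entered, meters

    // Offsets the headset position down by the eye-to-clavicle distance and
    // right by half the shoulder width, keeping the same z position
    // (headset and shoulder assumed to lie in the same plane).
    public Vector3 EstimateRightShoulder()
    {
        Vector3 head = Camera.main.transform.position;
        return new Vector3(head.x + shoulderWidth / 2f,
                           head.y - eyeToClavicle,
                           head.z);
    }
}
```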
The elbow position, on the other hand, was calculated using the hand’s position and the forearm length input. This was done using the Ray class [40] within the UnityEngine namespace. The origin point and ray’s intended path must be determined before the ray can be built. Both were discovered using the MRTK and the HoloLens 2’s hand-tracking capabilities in Unity. After the ray was constructed, the elbow position was determined using the Ray class’s GetPoint method. Based on user input, the parameter distance was set to the distance between the wrist and the middle knuckle plus the forearm length. A new elbow game object was created at the specified position. This method, however, only determines an accurate elbow position when the top of the hand is parallel to the forearm. To control this factor, subjects were instructed to wear a wrist brace to straighten their wrists.
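The elbow estimate described above can be sketched as follows. The knuckle and wrist positions would come from the MRTK/HoloLens 2 hand tracking; the helper and variable names are illustrative.

```csharp
using UnityEngine;

// Hypothetical sketch of the ray-based elbow estimate.
public static class ElbowEstimator
{
    public static Vector3 EstimateElbow(Vector3 middleKnuckle, Vector3 wrist,
                                        float forearmLength)
    {
        // Cast a ray from the middle knuckle through the wrist, i.e.,
        // along the forearm toward the elbow.
        Ray ray = new Ray(middleKnuckle, (wrist - middleKnuckle).normalized);

        // The elbow lies at the knuckle-to-wrist distance plus the
        // user-entered forearm length along this ray (valid while the
        // top of the hand is parallel to the forearm).
        float distance = Vector3.Distance(middleKnuckle, wrist) + forearmLength;
        return ray.GetPoint(distance);
    }
}
```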

4.3. Finger and Wrist Joint Measurement

The method used to record the ROM values for the flexion and extension of the finger joints (metacarpophalangeal (MCP), proximal interphalangeal (PIP), and distal interphalangeal (DIP)) was far less complex than that used for the upper arm joints. For the wrist joint, the proposed system can calculate the two DoFs—wrist flexion/extension and radial/ulnar deviation—with high precision. When viewing the user’s hand, the HoloLens 2 can locate the different finger joints using the SolverHandler script. Figure 5 shows the visual representation of the hand calibration in mixed reality. From the estimated positions of the finger joints—MCP, PIP, and DIP—the ROM can be calculated. The following example shows how the ROM-calculating algorithm functions. Suppose the algorithm is calculating the ROM of the index finger’s PIP joint. It will locate the two game objects on the index finger directly next to the joint of interest; in this case, the game objects are initialized at the user’s MCP and DIP joints. For the other two cases, the game objects next to the joint of interest are not always other joints of the finger: for the DIP, the two neighboring game objects are the PIP and the fingertip, and for the MCP, they are the PIP and the finger’s respective metacarpal bone. The algorithm then calculates the individual distances between the joint of interest and its two neighboring game objects, as well as the distance between those two game objects. The law of cosines is then used to calculate the flexion/extension ROM of the finger joint. Recording the wrist’s degrees of freedom utilizes the same script but is a bit more complex than for the fingers. The first part of the method uses the script to identify different areas of the hand: the wrist and the palm. When this script is initialized, game objects with the script attached are spawned in the middle of the user’s palm and on the user’s wrist. With this information, the system calculates the ROMs of the wrist’s different DoFs.
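The law-of-cosines step can be sketched as follows: given the joint of interest and its two neighboring game objects, the interior angle at the joint follows from the three pairwise distances. Reporting flexion as the deviation from a straight (180 degree) finger is our assumed convention.

```csharp
using UnityEngine;

// Hypothetical sketch of the law-of-cosines ROM calculation.
public static class JointAngle
{
    // joint: the joint of interest (e.g., PIP); a and b: the two game
    // objects directly next to it (e.g., MCP and DIP for the PIP joint).
    public static float FlexionDegrees(Vector3 joint, Vector3 a, Vector3 b)
    {
        float ja = Vector3.Distance(joint, a);
        float jb = Vector3.Distance(joint, b);
        float ab = Vector3.Distance(a, b);

        // Law of cosines gives the interior angle at the joint.
        float cos = (ja * ja + jb * jb - ab * ab) / (2f * ja * jb);
        float interiorDeg = Mathf.Acos(Mathf.Clamp(cos, -1f, 1f)) * Mathf.Rad2Deg;

        // A fully extended finger yields roughly 180 degrees, so flexion
        // is measured as the deviation from straight (assumed convention).
        return 180f - interiorDeg;
    }
}
```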
However, for the system to produce accurate results when recording the ROM of the wrist’s DoFs, the forearm must face forward and the upper portion of the arm must be kept still, which is a potential source of error. Because the cardinal axes are stationary in the gamified environment, if the arm moves in a certain way or does not face forward, the recording system may incorrectly measure the wrist’s ROM. A solution to this problem is an experimental setup that the user can deploy to obtain accurate results from the system’s wrist ROM algorithms. The setup uses a table or another flat, stationary surface on which the user places their elbow. The user sits next to this surface in a chair and rests their elbow on it while playing the levels that target the wrist’s DoFs. For the calculations to be accurate, the user should avoid using their upper arm or moving their elbow, so that the variations within the data are caused solely by the user’s wrist movements.

4.4. Data Transfer

Because the system comprises many different components, it is critical to understand how each component fits into the data processing and exporting structure; Figure 6 depicts the system’s data flow. The final step was to export the recorded data to a CSV file in the HoloLens 2 directory. This was accomplished by assigning each recorded value to its own column. Scripts were tasked with creating a CSV file at the specified parameter path, which was set to a folder within the HoloLens 2 where access permissions for HoloLens 2 applications can be modified. The components of this system can be classified based on how they affect the data flow and whether they are critical to transporting the quantitative ROM data.
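A minimal sketch of this export step is shown below; the column names and file name are illustrative, and the output path is assumed to be Unity’s persistent data path, which maps to app-accessible storage on the HoloLens 2 (UWP).

```csharp
using System.IO;
using System.Text;
using UnityEngine;

// Hypothetical sketch of exporting recorded ROM samples to a CSV file.
public static class RomCsvExporter
{
    public static void Export(float[] shoulderFlexionDeg, float[] elbowFlexionDeg)
    {
        var sb = new StringBuilder();
        sb.AppendLine("ShoulderFlexionDeg,ElbowFlexionDeg"); // one value per column

        for (int i = 0; i < shoulderFlexionDeg.Length; i++)
        {
            sb.AppendLine($"{shoulderFlexionDeg[i]},{elbowFlexionDeg[i]}");
        }

        // Assumed output location: a folder the application is permitted to write to.
        string path = Path.Combine(Application.persistentDataPath, "rom_session.csv");
        File.WriteAllText(path, sb.ToString());
    }
}
```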

5. Experiments and Results

The functionality and usefulness of the proposed rehabilitation system were examined with ten healthy adult volunteers with no past arm-related injuries and two occupational therapists (OTs) (n = 12). The average age, height, and body weight of the participants were 26.3 ± 2.6 years, 175 ± 10 cm, and 135 ± 25 lb, respectively. The target population of the proposed system, however, includes patients with Motricity Index (MI) scores of 50–100, focusing on pinch grip, elbow flexion, and shoulder abduction muscle force, as well as mobilization of the upper limbs. Figure 7 depicts users playing the interactive games with the HoloLens 2. The system performed as anticipated and accurately recorded the arm joints of interest. The algorithm produced a CSV file with all the information without any glitches in data collection. An informal post-test assessment of the participants showed an excellent user experience with minimal to no technical issues. The game’s goal was clearly explained to the participants, and the game was well-received as entertaining and engaging. A breakdown of the game’s difficulty by DoFs and ROM is provided in Table 1.
While a participant plays the level shown in Figure 2b, in-game text displays the upper arm joint ROM, as shown in Figure 8. As the Kinect V2 sensor has proven effective in measuring upper-body kinematics [41,42,43], the accuracy of the upper limb joint measurements was determined by statistically comparing the arm motion data with data calculated from the Kinect sensor while the users were playing the games [44]. Figure 9 and Figure 10 display, respectively, the joint angles a user exhibited throughout a session of the game shown in Figure 2a as measured by the proposed system, and the joint angles generated by the MATLAB program (based on Kinect sensor data) for the same user.
Table 2 shows the average 1-sample t-test p-values for all ten control-testing individuals, used to assess the precision and effectiveness of the MR game. As seen in the table, all of the p-values are above the chosen α value of 0.05. The 1-sample t-test therefore shows that the differences between the ROM values recorded by the MR system and those computed by the MATLAB program are not statistically significant, which in turn supports the accuracy of the MR system’s capability to correctly record the ROMs of the user’s upper arm joint DoFs.
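For reference, the statistic behind such a test can be sketched as below: a 1-sample t-test on the per-frame differences between the MR-recorded and Kinect-derived angles, under the null hypothesis of zero mean difference. The pairing of the two data streams is our assumption about how the comparison was set up; the p-value would then be obtained from a t distribution with n - 1 degrees of freedom.

```csharp
using System;
using System.Linq;

// Hypothetical sketch of the 1-sample t statistic on MR-vs-Kinect differences.
public static class OneSampleTTest
{
    public static double TStatistic(double[] mrAngles, double[] kinectAngles)
    {
        // Per-frame differences; null hypothesis: mean difference is zero.
        double[] d = mrAngles.Zip(kinectAngles, (m, k) => m - k).ToArray();
        int n = d.Length;

        double mean = d.Average();
        double variance = d.Sum(x => (x - mean) * (x - mean)) / (n - 1);

        // Compare against a t distribution with n - 1 degrees of freedom
        // to obtain the p-value.
        return mean / Math.Sqrt(variance / n);
    }
}
```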
This “sensorless” system was achieved by developing custom algorithms for measuring the DoFs’ ROMs of the upper extremity joints. This was done using the RadialView and SolverHandler C# scripts provided by the MRTK, as explained in Section 3. These two scripts located some of the joints of interest for the proposed system, such as the finger and wrist joints. Then, using mathematical algorithms based on certain geometric principles, the ROMs of the finger and wrist joints were easily found. Recognizing and finding the ROM of other joints proved more challenging: the shoulder and elbow required custom programs because the HoloLens 2 cannot recognize them upon gazing at those joints. These programs relied heavily on the user’s body dimensions to make assumptions about the user’s structure and estimate the positions of the elbow and shoulder. Once these estimations were made, finding the ROMs of the shoulder and elbow was simple and relied on mathematical algorithms similar to those used for the finger and wrist joints.
The gamified approach to UE rehabilitation is evaluated from both objective and subjective perspectives. Objective measures include comparable in-game performance metrics such as task completion, level achievements, badge collection, milestone crossing, and the number of hours spent playing the game in a timeframe. The subjective measures are surveys among the participants about their post-game experience and feedback from the OTs.
Traditional games provide engagement to players by offering several in-game achievements that depict the status of the player. Among them, task completion is a primary criterion for measuring success. Some tasks are virtual representations of ADLs performed by the in-game avatar, such as cleaning dishes, brushing teeth, combing hair, and moving objects. Tasks for a UE patient may include different joint movements, such as shoulder abduction–adduction, elbow flexion–extension, and forearm pronation–supination, to make changes to the in-game virtual objects. Though task completion can be recorded as a binary outcome (successful or unsuccessful), scoring tasks on a predetermined scale makes the data easier to analyze for suggesting improvements.
Qualitative evaluation of the gamified system involves the patients’ views of the game and opinions from the OTs. Before using the system, a questionnaire is given to the patient to collect their expectations for the study, including basic knowledge of using a virtual reality environment, prior experience with gamified rehab systems, and the study’s time-frame commitments. This is required because the post-game questionnaire helps analyze whether the system satisfied the patients’ initial expectations. Similarly, feedback from the OTs is recorded in a post-therapy survey form in which they evaluate the current system and provide suggestions for improvement. Aspects of the survey include game complexity, learning difficulty, ease of use, and integration of required and consistent functionalities. In the proposed system, this qualitative evaluation was performed on a Likert scale among the participants. The healthy participants provided evaluations on a 1–5 scale (1 = strongly disagree, 2 = disagree, 3 = somewhat agree, 4 = agree, 5 = strongly agree), and the OTs gave their assessment on a 0–5 scale (0 = minimum and 5 = maximum). The individual scores were averaged, and the final summative assessment scores are given in Table 3. From Table 3, it can be seen that the average scores from the healthy participants are between 4.5 and 4.8, signifying the effectiveness of the proposed system. Similarly, the responses from the OT participants show promising scores between 4.6 and 4.9.

6. Conclusions

Repetitive tasks of the same complexity fail to generalize the improvement of UE motor functions. Therefore, increasing the complexity of different tasks, and of the actions needed to complete them, can be judged by the different achievement levels in a game. In such an assessment, patients’ in-game levels better reflect their engagement and intention to improve their situation. Another means of evaluation is the number of badges a patient collects in the game: a badge is a visual marker of success that can be unlocked by completing tasks or achieving levels, and the player can add badges to their profile. Similarly, crossing milestones or checkpoints shows the patient’s involvement and improvement through gameplay. Currently, however, these games are not open source; they will be publicly released after further testing and evaluation.
This study proposes a novel approach to upper extremity rehabilitation: a gamified MR-based solution for the Microsoft HoloLens 2, an MR application with game levels that can measure the ranges of motion of the arm joints. Ten adults with no history of arm injuries served as test subjects for the proposed rehabilitation system, along with two occupational therapists. During rehabilitation sessions, the system can mimic the behavior of the upper limb without needing external sensors. It integrates arm joint data in real-time into the application while remaining standalone, making it unique among technology-based rehabilitation methods. By comparing and contrasting, we find that the spatial awareness of the mixed-reality environment, which provides safety to the users, makes this system superior to earlier VR-based rehabilitation techniques; it is also more relevant and accurate than existing MR-based rehabilitation. The overall experimental results obtained from the participants indicate a promising solution for individuals with UE motor dysfunctions. Although participants provided positive feedback about the proposed MR game and its user-friendly design, we acknowledge that it is difficult to generalize the findings from healthy participants alone, and there remains some uncertainty in the participants’ feedback because it is subjective. Future studies must include people with upper limb mobility impairments to derive robust conclusions about the proposed system; the next step will be to test the system in a clinical setting with individuals with upper extremity dysfunctions.

Author Contributions

Conceptualization, methodology, software, A.P.; validation, investigation, formal analysis, A.P., M.S.H.S., M.T.S. and N.B.; resources, supervision, project administration, M.H.R.; writing—original draft preparation, A.P., M.S.H.S. and M.H.R.; visualization, A.P. and M.S.H.S.; writing—review and editing, A.P., N.B., M.S.H.S. and M.H.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Institutional Review Board of the University of Wisconsin-Milwaukee (protocol UWM IRB 22.220, approved on 1 August 2022).

Informed Consent Statement

Informed consent was obtained from all participants involved in the study. Written informed consent has been obtained from the participants to publish this paper.

Data Availability Statement

The data generated during and/or analyzed during the current study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
UE: Upper Extremity
VR: Virtual Reality
AR: Augmented Reality
MR: Mixed Reality
OT: Occupational Therapists
IMUs: Inertial Measurement Units
sEMG: Surface Electromyography
RCT: Randomized Controlled Trial
ARAT: Action Research Arm Test
WMFT: Wolf Motor Function Test
LMC: Leap Motion Controller
FMA-UE: Fugl–Meyer Assessment-Upper Extremity
VRTE: VR-based Therapeutic Exercise
ADL: Activity of Daily Living
VT: Virtual Therapy
BRS: Brunnstrom Recovery Stage
BI: Barthel Index
UI: User Interface
ROM: Range of Motion
DoFs: Degrees of Freedom

References

  1. Gideroğlu, K.; Sağlam, İ.; Çakıcı, H.; Özturan, K.; Güven, M.; Görgü, M. Epidemiology of the hand injuries in Bolu region: A retrospective clinical study. Abant. Med. J. 2012, 1, 13–15. [Google Scholar] [CrossRef]
  2. Upper Extremity Injury. 2022. Available online: https://www.jonathanohebmd.com/services/upper-extremity-injury (accessed on 23 October 2022).
  3. Jaggi, A.; Alexander, S. Suppl-6, M13: Rehabilitation for shoulder instability–current approaches. Open Orthop. J. 2017, 11, 957. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Kenah, K.; Bernhardt, J.; Cumming, T.; Spratt, N.; Luker, J.; Janssen, H. Boredom in patients with acquired brain injuries during inpatient rehabilitation: A scoping review. Disabil. Rehabil. 2018, 40, 2713–2722. [Google Scholar] [CrossRef]
  5. Maceira-Elvira, P.; Popa, T.; Schmid, A.C.; Hummel, F.C. Wearable technology in stroke rehabilitation: Towards improved diagnosis and treatment of upper-limb motor impairment. J. Neuroeng. Rehabil. 2019, 16, 1–18. [Google Scholar] [CrossRef] [PubMed]
  6. Smiderle, R.; Rigo, S.J.; Marques, L.B.; Peçanha de Miranda Coelho, J.A.; Jaques, P.A. The impact of gamification on students’ learning, engagement and behavior based on their personality traits. Smart Learn. Environ. 2020, 7, 1–11. [Google Scholar] [CrossRef] [Green Version]
  7. Held, J.P.O.; Yu, K.; Pyles, C.; Veerbeek, J.M.; Bork, F.; Heining, S.M.; Navab, N.; Luft, A.R. Augmented reality–based rehabilitation of gait impairments: Case report. JMIR mHealth uHealth 2020, 8, e17804. [Google Scholar] [CrossRef] [PubMed]
  8. Chen, J.; Or, C.K.; Chen, T. Effectiveness of Using Virtual Reality–Supported Exercise Therapy for Upper Extremity Motor Rehabilitation in Patients with Stroke: Systematic Review and Meta-analysis of Randomized Controlled Trials. J. Med. Internet Res. 2022, 24, e24111. [Google Scholar] [CrossRef] [PubMed]
  9. Aguilera-Rubio, Á.; Cuesta-Gómez, A.; Mallo-López, A.; Jardón-Huete, A.; Oña-Simbaña, E.D.; Alguacil-Diego, I.M. Feasibility and Efficacy of a Virtual Reality Game-Based Upper Extremity Motor Function Rehabilitation Therapy in Patients with Chronic Stroke: A Pilot Study. Int. J. Environ. Res. Public Health 2022, 19, 3381. [Google Scholar] [CrossRef]
  10. Valentin, L.S.S. Can Digital Games Be a Way of Improving the Neuroplasticity in Stroke Damage? Can the Adult Brain Grow New Cells or Rewire Itself in Response to a New Experience? Open J. Med. Psychol. 2017, 6, 153–165. [Google Scholar] [CrossRef] [Green Version]
  11. Leong, S.C.; Tang, Y.M.; Toh, F.M.; Fong, K.N. Examining the effectiveness of virtual, augmented, and mixed reality (VAMR) therapy for upper limb recovery and activities of daily living in stroke patients: A systematic review and meta-analysis. J. Neuroeng. Rehabil. 2022, 19, 1–20. [Google Scholar] [CrossRef]
  12. Phan, H.L.; Le, T.H.; Lim, J.M.; Hwang, C.H.; Koo, K.i. Effectiveness of augmented reality in stroke rehabilitation: A Meta-Analysis. Appl. Sci. 2022, 12, 1848. [Google Scholar] [CrossRef]
  13. Fang, Z.; Wu, T.; Lv, M.; Chen, M.; Zeng, Z.; Qian, J.; Chen, W.; Jiang, S.; Zhang, J. Effect of traditional plus virtual reality rehabilitation on prognosis of stroke survivors: A systematic review and meta-analysis of randomized controlled trials. Am. J. Phys. Med. Rehabil. 2022, 101, 217–228. [Google Scholar] [CrossRef] [PubMed]
  14. Faria, A.L.; Cameirão, M.S.; Couras, J.F.; Aguiar, J.R.; Costa, G.M.; Bermúdez i Badia, S. Combined cognitive-motor rehabilitation in virtual reality improves motor outcomes in chronic stroke–a pilot study. Front. Psychol. 2018, 9, 854. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  15. Miclaus, R.; Roman, N.; Caloian, S.; Mitoiu, B.; Suciu, O.; Onofrei, R.R.; Pavel, E.; Neculau, A. Non-Immersive virtual reality for post-stroke upper extremity rehabilitation: A small cohort randomized trial. Brain Sci. 2020, 10, 655. [Google Scholar] [CrossRef] [PubMed]
  16. Johnson, L.; Bird, M.L.; Muthalib, M.; Teo, W.P. An innovative STRoke Interactive Virtual thErapy (STRIVE) online platform for community-dwelling stroke survivors: A randomized controlled trial. Arch. Phys. Med. Rehabil. 2020, 101, 1131–1137. [Google Scholar] [CrossRef]
  17. Kim, W.S.; Cho, S.; Park, S.H.; Lee, J.Y.; Kwon, S.; Paik, N.J. A low cost kinect-based virtual rehabilitation system for inpatient rehabilitation of the upper limb in patients with subacute stroke: A randomized, double-blind, sham-controlled pilot trial. Medicine 2018, 97, e11173. [Google Scholar] [CrossRef]
  18. Thielbar, K.O.; Triandafilou, K.M.; Barry, A.J.; Yuan, N.; Nishimoto, A.; Johnson, J.; Stoykov, M.E.; Tsoupikova, D.; Kamper, D.G. Home-based upper extremity stroke therapy using a multiuser virtual reality environment: A randomized trial. Arch. Phys. Med. Rehabil. 2020, 101, 196–203. [Google Scholar] [CrossRef]
  19. Keskin, Y.; Atci, A.; Urkmez, B.; Akgul, Y.; Ozaras, N.; Aydin, T. Efficacy of a video-based physical therapy and rehabilitation system in patients with post-stroke hemiplegia: A randomized, controlled, pilot study. Turkish J. Geriatr.-Turk Geriatr. Derg. 2020, 23, 118–128. [Google Scholar] [CrossRef]
  20. Fong, K.N.; Tang, Y.M.; Sie, K.; Yu, A.K.; Lo, C.C.; Ma, Y.W. Task-specific virtual reality training on hemiparetic upper extremity in patients with stroke. Virtual Real. 2022, 26, 453–464. [Google Scholar] [CrossRef]
  21. Lee, M.M.; Lee, K.J.; Song, C.H. Game-based virtual reality canoe paddling training to improve postural balance and upper extremity function: A preliminary randomized controlled study of 30 patients with subacute stroke. Med. Sci. Monit. Int. Med. J. Exp. Clin. Res. 2018, 24, 2590. [Google Scholar] [CrossRef]
  22. Choi, D.; Choi, W.; Lee, S. Influence of Nintendo Wii Fit balance game on visual perception, postural balance, and walking in stroke survivors: A pilot randomized clinical trial. Games Health J. 2018, 7, 377–384. [Google Scholar] [CrossRef]
  23. Kim, J.H. Effects of a virtual reality video game exercise program on upper extremity function and daily living activities in stroke patients. J. Phys. Ther. Sci. 2018, 30, 1408–1411. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  24. Cho, H.y.; Song, E.; Moon, J.H.; Hahm, S.C. Effects of Virtual Reality Based Therapeutic Exercise on the Upper Extremity Function and Activities of Daily Living in Patients with Acute Stroke: A Pilot Randomized Controlled Trial. Med.-Leg. Update 2021, 21, 676–682. [Google Scholar]
  25. Oh, Y.B.; Kim, G.W.; Han, K.S.; Won, Y.H.; Park, S.H.; Seo, J.H.; Ko, M.H. Efficacy of virtual reality combined with real instrument training for patients with stroke: A randomized controlled trial. Arch. Phys. Med. Rehabil. 2019, 100, 1400–1408. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  26. Ahmad, M.A.; Singh, D.K.A.; Mohd Nordin, N.A.; Hooi Nee, K.; Ibrahim, N. Virtual reality games as an adjunct in improving upper limb function and general health among stroke survivors. Int. J. Environ. Res. Public Health 2019, 16, 5144. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  27. Kiper, P.; Szczudlik, A.; Agostini, M.; Opara, J.; Nowobilski, R.; Ventura, L.; Tonin, P.; Turolla, A. Virtual reality for upper limb rehabilitation in subacute and chronic stroke: A randomized controlled trial. Arch. Phys. Med. Rehabil. 2018, 99, 834–842. [Google Scholar] [CrossRef]
  28. Abd El-Kafy, E.M.; Alshehri, M.A.; El-Fiky, A.A.R.; Guermazi, M.A. The effect of virtual reality-based therapy on improving upper limb functions in individuals with stroke: A randomized control trial. Front. Aging Neurosci. 2021, 13, 731343. [Google Scholar] [CrossRef]
  29. Cuesta-Gómez, A.; Sánchez-Herrera-Baeza, P.; Oña-Simbaña, E.D.; Martínez-Medina, A.; Ortiz-Comino, C.; Balaguer-Bernaldo-de Quirós, C.; Jardón-Huete, A.; Cano-de-la Cuerda, R. Effects of virtual reality associated with serious games for upper limb rehabilitation in patients with multiple sclerosis: Randomized controlled trial. J. Neuroeng. Rehabil. 2020, 17, 1–10. [Google Scholar] [CrossRef]
  30. Afsar, S.I.; Mirzayev, I.; Yemisci, O.U.; Saracgil, S.N.C. Virtual reality in upper extremity rehabilitation of stroke patients: A randomized controlled trial. J. Stroke Cerebrovasc. Dis. 2018, 27, 3473–3478. [Google Scholar] [CrossRef]
  31. Aşkın, A.; Atar, E.; Koçyiğit, H.; Tosun, A. Effects of Kinect-based virtual reality game training on upper extremity motor recovery in chronic stroke. Somatosens. Mot. Res. 2018, 35, 25–32. [Google Scholar] [CrossRef]
  32. Ain, Q.U.; Khan, S.; Ilyas, S.; Yaseen, A.; Tariq, I.; Liu, T.; Wang, J. Additional effects of Xbox kinect training on upper limb function in chronic stroke patients: A randomized control trial. Healthcare 2021, 9, 242. [Google Scholar] [CrossRef] [PubMed]
  33. Blasco-Peris, C.; Fuertes-Kenneally, L.; Vetrovsky, T.; Sarabia, J.M.; Climent-Paya, V.; Manresa-Rocamora, A. Effects of Exergaming in Patients with Cardiovascular Disease Compared to Conventional Cardiac Rehabilitation: A Systematic Review and Meta-Analysis. Int. J. Environ. Res. Public Health 2022, 19, 3492. [Google Scholar] [CrossRef] [PubMed]
  34. Gauthier, L.V.; Nichols-Larsen, D.S.; Uswatte, G.; Strahl, N.; Simeo, M.; Proffitt, R.; Kelly, K.; Crawfis, R.; Taub, E.; Morris, D.; et al. Video game rehabilitation for outpatient stroke (VIGoROUS): A multi-site randomized controlled trial of in-home, self-managed, upper-extremity therapy. EClinicalMedicine 2022, 43, 101239. [Google Scholar] [CrossRef]
  35. Malick, W.H.; Butt, R.; Awan, W.A.; Ashfaq, M.; Mahmood, Q. Effects of Augmented Reality Interventions on the Function of Upper Extremity and Balance in Children with Spastic Hemiplegic Cerebral Palsy: A Randomized Clinical Trial. Front. Neurol. 2022, 13, 895055. [Google Scholar] [CrossRef] [PubMed]
  36. HoloLens 2—Overview, Features, and Specs|Microsoft HoloLens. 2022. Available online: https://www.microsoft.com/en-us/hololens/hardware (accessed on 23 October 2022).
  37. Unity Technologies. Unity. 2022. Available online: https://unity.com (accessed on 23 October 2022).
  38. Visual Studio: IDE and Code Editor for Software Developers and Teams. 2022. Available online: https://visualstudio.microsoft.com (accessed on 23 October 2022).
  39. Microsoft. MRTK2-Unity Developer Documentation—MRTK 2. 2022. Available online: https://learn.microsoft.com/en-us/windows/mixed-reality/mrtk-unity/mrtk2/?view=mrtkunity-2022-05 (accessed on 23 October 2022).
  40. Unity Technologies. Unity Scripting API: Ray. 2022. Available online: https://docs.unity3d.com/ScriptReference/Ray.html (accessed on 22 October 2022).
  41. Faity, G.; Mottet, D.; Froger, J. Validity and reliability of Kinect v2 for quantifying upper body kinematics during seated reaching. Sensors 2022, 22, 2735. [Google Scholar] [CrossRef]
  42. Milosevic, B.; Leardini, A.; Farella, E. Kinect and wearable inertial sensors for motor rehabilitation programs at home: State of the art and an experimental comparison. Biomed. Eng. Online 2020, 19, 1–26. [Google Scholar] [CrossRef] [Green Version]
  43. Ma, M.; Proffitt, R.; Skubic, M. Validation of a Kinect V2 based rehabilitation game. PLoS ONE 2018, 13, e0202338. [Google Scholar] [CrossRef]
  44. Assad-Uz-Zaman, M.; Islam, M.R.; Miah, S.; Rahman, M.H. NAO robot for cooperative rehabilitation training. J. Rehabil. Assist. Technol. Eng. 2019, 6, 2055668319862151. [Google Scholar] [CrossRef]
Figure 1. Schematic diagram of how the proposed system was developed.
Figure 2. Different game levels. (a) A level where the user selects all the spheres in order to “destroy” them. (b) A level where the user knocks down structures with blue spheres. (c) Picking apples from a tree. (d) A level where the user must put a ball in a box. (e) Picking cabbages in the garden. (f) Pouring a cup of tea.
Figure 3. Game levels shown in the Microsoft HoloLens 2 MR environment with its spatial awareness feature (i.e., recognizing and including the room details). (a) Picking cabbages in the garden. (b) Picking apples from a tree. (c) Pouring a cup of tea.
Figure 4. User interface (UI) for users to input upper arm joint information. (a) Game objects for reference on the hand, along with cardinal axes for the Unity environment. (b) Interactable sliders for users to input body dimensions.
Figure 5. Visual representation of the hand calibration tool in mixed-reality.
Figure 6. Data flow of the system.
Figure 7. Users testing the proposed MR-based video game on the HoloLens 2: (a) knocking down structures in a game level; (b) destroying all the spheres in a game level; (c) completing a game level by picking an apple from a tree.
Figure 8. In-game text displaying the ROM of the upper limb joints’ DoFs while a user is playing the game in Figure 2b.
Figure 9. The upper limb joint angles as calculated by the proposed MR system during a session when one of the subjects was playing the game displayed in Figure 2a.
Figure 10. The upper limb joint angles as calculated based on the Kinect sensor data during the same session as in Figure 9.
Table 1. Game level difficulties based on the DoFs and ROM.
| Game Level | Main Focus Area | DoFs | ROM Covered (Degrees) | Difficulty |
| --- | --- | --- | --- | --- |
| Destroy all the spheres | Shoulder | Flexion/extension and abduction/adduction | 0–100 | Medium |
| Knock down the structures | Shoulder and Elbow | Elbow flexion/extension, shoulder flexion/extension, and shoulder abduction/adduction | 0–150 (Elbow), 0–120 (Shoulder) | Hard |
| Picking apples from a tree | Shoulder | Flexion/extension and abduction/adduction | 0–150 | Hard |
| Picking cabbages in the garden | Shoulder | Flexion/extension and abduction/adduction | 0–60 | Easy |
| Select the spheres in the order of the rainbow | Shoulder | Flexion/extension and abduction/adduction | 0–100 | Medium |
| Pour a cup of tea | Shoulder and Elbow | Elbow flexion/extension and shoulder rotation | 0–50 (Elbow), 0–30 (Shoulder) | Easy |
| Put the ball in the box | Wrist and finger | Wrist flexion/extension, wrist abduction/adduction, and finger flexion/extension | −50–50 (Wrist flexion/extension), −30–30 (Wrist abduction/adduction), 0–60 (Finger flexion/extension) | Hard |
Table 2. Example of the average p-values calculated using a 1-sample t-test for all ten subjects, as recorded from their data when playing the level shown in Figure 2a. The subjects were part of the control-testing cohort used to assess the precision of the MR game.
| Range of Motion | Shoulder Abduction/Adduction | Shoulder Flexion/Extension | Shoulder Internal/External Rotation | Elbow Flexion/Extension |
| --- | --- | --- | --- | --- |
| p-Value | 0.82 | 0.25 | 0.19 | 0.14 |
Table 3. Evaluation and OT assessment of the proposed MR-based gamified rehabilitation system, n = 12 (10 Healthy participants and 2 OTs).
| Item | Criteria | Questions | Avg. Score (n = 12) |
| --- | --- | --- | --- |
| 1 | Usability | I think the MR Game is easy to use. | 4.64 |
| 2 | Functionality | I think the functionalities of the MR Game are well integrated. | 4.57 |
| 3 | Learnability | I think most users can quickly learn to use the MR Game. | 4.68 |
| 4 | Motivation | I am confident when using the MR Game. | 4.8 |
| 5 | Comfortability | I would like to use the MR Game on a daily basis. | 4.78 |
| 6 | Overall Satisfaction | How do you rate the MR game? | 4.72 |
| 7 | OT Assessment | (i) How do you rate the comfort of using the MR Game? | 4.88 |
|  |  | (ii) How do you rate the accuracy of the MR Game’s capabilities for measuring the shoulder’s flexion/extension ROM? | 4.83 |
|  |  | (iii) How do you rate the accuracy of the MR Game’s capabilities for measuring the shoulder’s abduction/adduction ROM? | 4.89 |
|  |  | (iv) How do you rate the accuracy of the MR Game’s capabilities for measuring the shoulder’s internal/external rotation ROM? | 4.60 |
A scale ranging from 1 to 5 is used for items 1–6, where 1 = strongly disagree, 2 = disagree, 3 = somewhat agree, 4 = agree, 5 = strongly agree. Another scale ranging from 0 to 5 is used for item 7, where 0 = minimum and 5 = maximum.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
