1. Introduction
In job training, it is important to give trainees realistic experience of an unfamiliar work environment. Repetitive practice is equally important; however, the range of work that can be practiced is limited, leaving trainees few opportunities to accumulate skills. Moreover, training in a real-world environment faces various difficulties, such as limited physical space, facility management, safety hazards, and the cost of consumable materials during repeated training.
In recent years, virtual reality (VR) devices have become widespread, and virtual environments have been applied to fields such as games, education, medical care, and sports. VR can provide safe, repeatable, and targeted training focused on the social and practical skills required in the real world. In particular, it can offer an effective job training environment through direct interaction with and experience of objects, rather than the passive, visually oriented interaction of conventional virtual education [
1,
2]. Furthermore, the use of physical tools in virtual training can increase the effectiveness of training by improving the transfer and acceptance of learning [
3].
For VR training to be effective during job training, it must present realistic training situations. Hence, studies have added appropriate physical objects to the virtual environment and used them for user interaction to increase immersion and situational awareness [
4,
5,
6]. Studies using haptic tools or hand tracking have also been actively conducted to enhance the interaction effect of the user [
7,
8,
9,
10,
11,
12,
13]. Some studies have been published to improve the coexistence and mutual dependence between users and other objects in a virtual environment based on the presence of user avatars [
14,
15].
However, when only VR controllers are used, as in conventional VR experiences, objects cannot be touched directly; interaction is instead performed by in-air selection while holding the controller. This approach does not support the awareness of real tools required in job training, making tangible interaction difficult to implement. In mixed reality (MR) systems, for example, physical objects can be manipulated only in simple ways, such as translation and rotation, without any physical sense of touch. Consequently, this approach supports only simple experiences in fields where training requires real contact (i.e., training with tools) rather than mere selection, and it cannot reproduce the forces actually applied in training. Finger interaction methods have been proposed that use glove-type haptic tools [
16,
17,
18] to provide the feeling of grasping a virtual object through force feedback. However, these are difficult to use in interaction-based content (tool-based job training), which requires exchanging force with the whole arm.
In this paper, we propose a VR-based job training system in which users handle real tools, providing a tangible virtual training service. The proposed system offers job trainees a steam car wash training program for safe, repeatable, and practical training. Specifically, the system aligns the vehicle model in the virtual environment with the coordinate system of the vehicle body in the real environment, allowing users to feel actual contact when wiping with a mop in the virtual environment. Furthermore, to improve immersion in the virtual environment, the proposed system virtualizes the user's entire body by streaming the 3D point and color data obtained from the calibrated Kinect sensor [
19] into the content. Virtual training for novice trainees can also incur high costs because it requires intensive intervention by trained staff [
20]. Therefore, this system was designed to minimize expert intervention by providing visual and audio guidance on the training within the virtual environment during repeated training. The proposed method can increase immersion and effectiveness through real-world-like training.
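The real-to-virtual alignment described above can be illustrated with a minimal sketch. Assuming the extrinsic calibration of the Kinect sensor yields a 4 × 4 homogeneous transform (the matrix and function names below are hypothetical, not taken from the authors' implementation), captured 3D points can be mapped into the virtual scene's coordinate system as follows:

```python
import numpy as np

def to_virtual_coords(points_real, T_virtual_from_real):
    """Map captured 3D points (N x 3, sensor/real coordinates) into the
    virtual scene using a 4 x 4 homogeneous calibration transform."""
    n = points_real.shape[0]
    homogeneous = np.hstack([points_real, np.ones((n, 1))])   # N x 4
    mapped = homogeneous @ T_virtual_from_real.T              # apply transform
    return mapped[:, :3]
```

In practice, such a transform would be estimated once during sensor calibration and then applied to every frame of point and color data sent to the content.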
This paper is structured as follows.
Section 2 analyzes previous related studies, whereas
Section 3 introduces an overview of the system for tangible training.
Section 4 then describes the construction of a reality-based virtual environment, and
Section 5 details the tool-based user posture estimation method. Next,
Section 6 shows the results of user experiments, and
Section 7 provides some concluding remarks and introduces areas of future research.
2. Related Works
Various physical objects and devices have recently been used to increase the user's tangible sensation. Simeone et al. placed physical objects corresponding to virtual objects and investigated the participants' responses [
4], whereas Lee et al. used a physical-virtual table to study the interaction with virtual characters and the physical effects [
5]. He et al. proposed a design for presenting surrounding information to alleviate safety problems and psychological discomfort and to increase situational awareness of physical elements [
6].
Most VR content has been developed to be experienced with a handheld controller for reasons of convenience and cost. In recent years, research using haptic tools or hand tracking has been actively conducted to enhance the interaction effect. Loch et al. proposed a haptic-interaction-based virtual training system that incorporates physical components so that users can sense the feel, size, and weight of the tools during training [
7]. Seo et al. proposed a method of learning through interaction more intuitive than a VR controller by delivering tactile sensations of air pressure and vibration through a fire-extinguisher-shaped controller [
8]. Arora et al. proposed VirtualBricks, a set of LEGO-based controllers that enables physical manipulation in a VR environment [
9]. Zhu et al. compared real tools with a VR controller, confirmed that real tools yield the strongest tangible interaction, and developed haptic tools resembling real tools [
10]. Shigeyama et al. developed Transcalibur, a portable VR controller that can render shapes by changing the mass properties on a two-dimensional plane to provide a realistic tool feeling [
11]. Zenner and Krüger also proposed a technique for changing the drag and rotational inertia felt by the user by dynamically adjusting the surface area, delivering realistic kinesthetic sensations [
12]. Furthermore, VR- and augmented reality (AR)-based medical simulators are instruments for improving the skills of medical workers; Talhan and Jeon conducted a study providing various haptic effects by simulating physical effects through pneumatic actuation [
13]. Although many devices that track the user’s hand have recently appeared [
16,
17,
18], the majority have modeled and visualized only part of the user's body (e.g., the head or hands) for VR interaction, and thus lack a feeling of realism.
In a study on the effect of user avatars in VR, Heidicker et al. presented a complete body model mapped to the user's movement, which produced higher coexistence and mutual dependence than a model composed of only the user's head and hands [
14]. Franco and Peck reported that representing a user avatar in a virtual environment increased subjective presence, supporting stronger immersion and experience [
15].
6. Experimental Results
The system used in this study consists of a personal computer (with an Intel i7-9700 3-GHz CPU and an NVIDIA RTX 2080 GPU) connected to two Kinect sensors (Microsoft Azure Kinect DK) [
19] and two data transmitters (Zigbee) of the training tools (steam gun and mop). As shown in
Figure 15, the users participated in job training through the steam car wash training content created using a wireless HMD (VIVE Pro) [
25] and the Unity game engine [
26]. The accompanying video is provided in the
Supplementary Materials.
6.1. Training and Evaluation Method
In the real-world environment, the steam gun must be used carefully; it is dangerous to spray steam toward one's body or at very close range to the car. Therefore, the training content should determine the spraying direction of the steam and the position of the user's body to warn the user of dangerous situations. To this end, the training content created in this study uses the direction of the steam gun, the steam spraying distance, and the user's depth data to recognize dangerous situations and issue voice warnings. Furthermore, we measured and evaluated each user's training performance with scores for four items: steam spraying (steam: whether steam was sprayed evenly over the car), steam distance (distance: whether steam was sprayed too close to the car), wiping (wipe: whether the sprayed steam was wiped evenly and properly), and user safety (safety: whether steam was sprayed toward the user's body). The steam and wipe items were scored additively, starting from zero and gaining points according to how much of the region was covered. The two safety-related items were scored by deduction, i.e., points were subtracted whenever the situation was deemed unsafe. Finally, the time spent on the steam spraying and wiping procedures was also measured.
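The two scoring schemes can be summarized in a short sketch; the function names, maximum score, and penalty value below are illustrative assumptions rather than values taken from the actual implementation:

```python
def coverage_score(filled_points, total_points, max_score=100):
    """Additive scoring for the steam and wipe items: points grow from
    zero in proportion to how much of the region has been covered."""
    return max_score * filled_points / total_points

def deduction_score(violations, penalty=5, max_score=100):
    """Deductive scoring for the distance and safety items: points are
    subtracted for each unsafe event, never dropping below zero."""
    return max(0, max_score - penalty * violations)
```

A trainee who covers half the region with steam would thus receive half the maximum steam score, while each unsafe spray toward the body would subtract a fixed penalty from the safety score.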
As shown in
Figure 14, if the steam gun is pointed toward the sampling points of the automobile body within the predefined region, the intersection position is calculated from the distance and direction of the steam gun. In the training content, the spraying distance and direction of the steam can be set. Based on this, the steam item is evaluated by the degree of interaction with the sampling points of the automobile body, and points are added only when, after pressing the button, the steam gun is within an appropriate distance (set to 20–35 cm, as in an actual steam car wash). For example, if the distance between the intersection points and the steam gun is too great (more than 35 cm), the steam is not applied to the automobile surface, and a voice warning informs the user that the steam gun is too far away. If the steam gun is too close to the car (within 20 cm), the user is informed of being too close, and the distance score is reduced. Furthermore, if the user sprays steam toward his or her body, the safety score is reduced and a voice warning is given.
Figure 16 shows examples of the evaluation according to the steam spraying distance and direction.
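The distance- and direction-based evaluation can be sketched as follows. The 20–35 cm valid range comes from the text; the 15° spraying cone and the function interface are assumptions made for illustration:

```python
import numpy as np

NEAR, FAR = 0.20, 0.35   # valid spraying range in metres (20-35 cm)
CONE_DEG = 15            # assumed half-angle of the spraying cone

def evaluate_spray(gun_pos, gun_dir, sample_point):
    """Classify one spray attempt against a sampling point of the car body."""
    offset = sample_point - gun_pos
    d = np.linalg.norm(offset)
    if d > FAR:
        return "too_far"      # steam not applied; voice warning issued
    if d < NEAR:
        return "too_close"    # distance score reduced; voice warning issued
    # Point is within range: check the gun is actually aimed at it.
    if np.dot(gun_dir, offset / d) > np.cos(np.radians(CONE_DEG)):
        return "hit"          # steam applied; points added
    return "miss"
```

In the actual system this classification would be run against many sampling points per frame, combining the spraying result with the user's depth data for the safety check.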
During wiping training, the user holds a mop in one hand and wipes the car within the automobile region; contaminants are removed only when the mop is close to the automobile model and the input signals arrive. Points are given only when the mop is within a predetermined distance of the actual automobile body and the user wipes with at least a certain pressure. If the user wipes too weakly or too strongly, a voice warning is provided. For the pressure data, because each user's hand size differs and it is difficult to press all pressure sensors while wiping, the highest of the 13 sensor values is used for the calculation.
Figure 17 shows the changes in the sampling points after steam-spraying and mop-wiping on the surface of the automobile body.
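The pressure-handling rule (take the maximum of the 13 sensor readings, then compare it against weak/strong limits) can be sketched as below; the threshold values are hypothetical, as the paper does not list them:

```python
# Hypothetical normalized pressure limits; the paper does not specify them.
TOO_WEAK, TOO_STRONG = 0.2, 0.8

def wipe_feedback(sensor_values):
    """Evaluate one wiping action from the mop's pressure sensor readings.
    Because hand sizes differ and not every sensor is pressed while wiping,
    only the maximum reading is used."""
    p = max(sensor_values)
    if p < TOO_WEAK:
        return "too_weak"      # voice warning: wipe harder
    if p > TOO_STRONG:
        return "too_strong"    # voice warning: wipe more gently
    return "ok"                # point given
```

Using the maximum rather than the mean makes the check robust to sensors that a small hand simply cannot reach.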
After steam spraying is finished, the system tells the user which parts of the automobile lack steam. For this purpose, the work surface is divided into a 7 × 5 grid, and the areas with low scores are selected in ascending order and indicated with arrows, as shown in
Figure 18. For wiping training, parts that have not been wiped properly are likewise indicated with arrows. When all training procedures are completed, the effectiveness of the training is measured through the final scores and the elapsed time. The system can save the training evaluation results for each user, accumulating the data required for analyzing the training effect.
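Selecting the low-scoring grid cells to mark with arrows can be sketched as below; the score threshold and the cap on the number of arrows are assumptions, since the paper specifies only the 7 × 5 division and the ascending-order selection:

```python
def areas_needing_work(grid_scores, threshold, max_arrows=3):
    """Return up to max_arrows (row, col) cells of the score grid whose
    scores fall below threshold, lowest score first (ascending order)."""
    low = [(score, (r, c))
           for r, row in enumerate(grid_scores)
           for c, score in enumerate(row)
           if score < threshold]
    low.sort()                            # ascending order of score
    return [cell for _, cell in low[:max_arrows]]
```

Each returned cell would then be rendered as an arrow over the corresponding region of the automobile body.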
6.2. User Test
To verify the usefulness and effectiveness of the job training content created in this study, we measured the improvement in work proficiency of 12 participants (university students majoring in engineering). The participants underwent the steam car wash training in both the tangible interaction and VR controller environments 2–3 times per week for four weeks, and the average training scores of the four items (steam, distance, wipe, and safety in
Figure 19) and time were recorded.
In the tangible interaction environment, the work proficiency of the participants continued improving during the evaluation, as shown in
Figure 19. Relatively high scores were recorded for steam spraying (steam) and user safety (safety), which implies that the participants properly recognized and followed the warnings given by the content. The participants found wiping more difficult than steam spraying, and maintaining a proper spraying distance was the most difficult task. For wiping, relatively many sensors were attached to reproduce the sophistication of the real-world task, and each participant had to learn the posture needed to maintain an appropriate spraying distance in 3D space, as in a real-world environment. Finally, the measured work time was 4–5 min during the first week and decreased to 2–3 min thereafter.
In the VR-controller-based environment, in which the real automobile frame and car washing tools were not used, relatively high scores were recorded, and no noticeable improvement in work proficiency was observed during the evaluation period, as shown in
Figure 19. This was because, with a VR controller, physical adjustments such as force control and posture correction were not required, as the car washing could be conducted through relatively simple interactions. The work was also finished relatively quickly overall, typically within 1–2 min.
7. Conclusions
An intuitive interaction with the virtual environment is needed to increase the effectiveness and efficiency of virtual training. In this study, we developed a virtual training system that uses real objects to improve tangible interaction during virtual environment-based training. The proposed system tracks the real training tools with multiple sensors to increase immersion by matching the real-world environment to the virtual environment, visualizes the virtual space aligned with the user's real-world space, and maps real and virtual objects to provide a realistic physical sense of touch; on this basis, we also proposed a training evaluation method.
In this study, we verified the effectiveness of virtual training through user tests and obtained positive results for tangible interactions. Using a sensor-equipped sprayer and mop, we observed that training can be more effective than simple VR or MR experiences with conventional controllers, because users can judge how much force to apply from the actual contact and pressing forces they feel.
The training system used in this study relies on a real object (an automobile frame). Therefore, to use virtual models of various shapes, correspondingly shaped real models are required despite the use of a virtual training environment. If a wall-shaped object with a similar tactile sensation is created instead of a real automobile, the cost of constructing the training environment can be reduced and various training models can be provided. In addition, 3D printing can be used to fabricate the necessary objects cost-effectively.
Although only one user can currently be trained in the virtual environment, we plan in future work to analyze the movement of each user when two or more people share the same space and to provide interactions for collaborative training. We also plan to predict user behavior more precisely by expanding the training evaluation and analyzing the starting position and the moving direction of the tools when spraying steam or wiping the automobile.