Article

Performance Comparison of User Interface Devices for Controlling Mining Software in Virtual Reality Environments

Department of Energy Resources Engineering, Pukyong National University, Busan 48513, Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2019, 9(13), 2584; https://doi.org/10.3390/app9132584
Submission received: 9 May 2019 / Revised: 14 June 2019 / Accepted: 21 June 2019 / Published: 26 June 2019

Abstract

Recently, many studies have been conducted to apply virtual reality (VR) technology to the mining industry. To accomplish this, it is necessary to develop user interface devices that can effectively control software. Most VR content in the mining industry requires precise device control for equipment operation or accident response. In this study, we compared the performance of four user interface devices (a 2D mouse, 2D & 3D mice, a VR controller, and a Kinect (Microsoft) sensor with a bend-sensing data glove) for controlling mining industry software in a VR environment. The total working time, number of device clicks, and click accuracy were analyzed for 10 experimenters performing 3D orebody modeling using each device in the VR environment. Furthermore, we conducted a survey to evaluate the ease of learning, ease of use, immersion and fatigue of each device after the experiment. The results show that the 2D mouse yields high performance in terms of working time, click accuracy, ease of learning and ease of use. However, the 2D mouse did not fully leverage the VR environment, owing to low user immersion. The Kinect sensor and bend-sensing data glove could control the software efficiently while maximizing user immersion. Our findings are expected to provide a useful reference for the future development of user interface devices in the mining industry.

1. Introduction

Virtual reality (VR) technology enables users to perceive the virtual world as if it were real, and to interact with it, by providing various sensations in a virtual world implemented using a computer [1,2]. Studies on VR technology have been conducted since Ivan Sutherland [3] developed a head mounted display (HMD) in 1968. VR technology has since been used widely in various industries. For example, Lele [4] presented an assessment of VR technology for military applications. Parsons and Rizzo [5] assessed the feasibility of VR technology for medical treatment, and Seymour et al. [6] analyzed the use of VR technology for medical training in the operating room. Portman and Natapov [7] introduced the application of VR technology to architecture, landscape architecture, and environmental planning. Gotsis et al. [8] developed a game platform based on VR technology for body exercises and rehabilitation. McMahan et al. [9] evaluated the effects of fidelity on the user in a complex performance-intensive context using a VR game. Grossard et al. [10] proposed the use of games utilizing VR technology for treating individuals with autism spectrum disorders.
Moreover, VR technology has been intensively studied for human training purposes. For instance, Chenechal et al. [11] developed a remote guiding system that helps a user by way of two virtual arms controlled by a remote expert in a virtual reality environment. Torres et al. [12] developed a virtual workspace in which workers are trained in welding operations. Anton et al. [13] proposed telerehabilitation systems that can support physical therapy in virtual environments, and Quail et al. [14] introduced a digital game for medical education to improve the engagement, confidence and knowledge of undergraduate students.
Several studies have focused on VR technology in the mining industry. For example, van Wyk and de Villiers [15] developed simulation software that allows workers to indirectly experience workplaces in a VR environment in order to prevent safety accidents at South African mining sites. Their work enabled workers to effectively learn the hazards at mining sites, and how to cope with accidents should they occur. In a similar study, Kizil et al. [16] developed software for training workers in the control of equipment in underground mines and ventilation facilities in a VR environment. Orr et al. [17] developed software for training users on how to respond to underground mine fires. A number of start-up companies have recently launched products that incorporate VR technology into the mining industry. For example, Llamazoo [18] developed software that enables intuitive development planning by visualizing mining site data in a VR environment, and MOON PATROL VR [19] developed a system that offers a 360-degree birds-eye view of mining sites in a VR environment. In addition, VR software for preventing safety accidents at mining sites [20] and for training users in equipment operation [21] has also been launched onto the market.
To apply VR technology to the mining industry, it is essential to develop devices that can effectively control software. Most VR content in the mining industry requires precise control for equipment operation or accident response. In other industries, studies have been conducted on the development of user interface devices that can enhance user immersion and effectively control software in a VR environment [22,23,24,25,26,27]. For example, Kaiser et al. [24] developed 3D multimodal interaction in immersive virtual reality environments using gesture, language, and gaze. Kok and van Liere [25] developed a 3D interaction system that enables users to control 3D widgets, used to interact with visualized data, by means of a head tracker, stereo glasses, and a camera in a virtual environment. Such user interfaces enable users to control objects more intuitively and with greater immersion in the virtual reality environment.
In the mining industry, several studies have been conducted on the development of 3D user interfaces (3DUIs) to control VR-based software [28,29]. Kim and Choi [28] developed a 3D user interface based on gesture and hand tracking, using a Kinect sensor and a bend-sensing data glove, to control mining software in a virtual reality environment; Bednarz et al. [29] proposed a new interaction approach that can control mining equipment using a touch screen, data glove and spherical dome in an immersive virtual reality environment. However, previous studies have not quantitatively compared the performance of various user interface devices for controlling mining industry software in a VR environment, as they focused mainly on the development of 3DUI devices.
In this study, we compared the performance of four user interface devices (a 2D mouse, 2D & 3D mice, a VR controller, and a bend-sensing data glove with a Kinect sensor) when controlling KmodStudio [30], a mining industry software product, in a VR environment. We analyzed the total working time, the number of device clicks, and the click accuracy for 10 experimenters performing 3D orebody modeling using the four user interface devices in a VR environment. In addition, we conducted a user survey to evaluate the ease of learning, ease of use, immersion and fatigue for each device after the experiment.

2. Materials and Methods

We compared the performance of four user interface devices for controlling mining industry software in a virtual reality (VR) environment. The total working time, number of device clicks, and click accuracy were analyzed for ten experimenters as they performed 3D orebody modeling using each device in the VR environment.
Furthermore, we conducted a survey to evaluate the ease of learning, ease of use, immersion and fatigue for each device after the experiment. Table 1 describes the experimental conditions of this study.

2.1. User Interface Device

We used a typical 2D mouse in the experiment (Figure 1a). The experimenters placed the 2D mouse on a desk and, while sitting in a chair, controlled KmodStudio with the mouse according to the predefined procedure.
Similar to the 2D mouse experiment, for the 2D & 3D mice the experimenters placed both devices on the desk and used them while sitting in the chair. Each experimenter controlled KmodStudio using the 2D mouse with the right hand and the 3D mouse (Figure 1b) with the left hand. Using the controller located at the center of the 3D mouse, the experimenters performed rotation, zoom-in, and zoom-out operations along the X, Y, and Z directions.
Figure 1c shows the VR controller used in this study. While standing, the experimenters controlled the position of the mouse cursor by holding the device in both hands and moving their wrists in the VR environment. Click events were triggered using the controller buttons.
The Kinect sensor and bend-sensing data glove (Figure 1d) were developed by Kim and Choi [28] as a 3DUI for use in a VR environment. The experimenters controlled the position of the cursor through hand and arm tracking performed by the Kinect sensor, a 3D depth-measurement camera. KmodStudio executes the commands listed in Table 2 according to the finger bending motions detected by the bend-sensing data glove, which measures the degree of bending of each finger. The experimenters used this device while standing.
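As an illustration of how such finger-bending shortcuts can be dispatched, the minimal Python sketch below thresholds normalized bend readings and maps the set of flexed fingers to the commands in Table 2. The sensor names, threshold value, and command labels are assumptions made for this example, not the implementation of Kim and Choi [28].

```python
# A minimal sketch (not the authors' implementation) of mapping bend-sensor
# readings to the KmodStudio commands in Table 2. Sensor names, the threshold
# value, and command labels are hypothetical, chosen only for illustration.

BEND_THRESHOLD = 0.6  # normalized bend value above which a finger counts as "flexed"

# Command table mirroring Table 2: set of flexed fingers -> command label
COMMANDS = {
    frozenset({"thumb"}): "press_enter",
    frozenset({"index"}): "left_click",
    frozenset({"middle"}): "get_pointset_variogram",
    frozenset({"ring"}): "get_pointset_ordinary_kriging",
    frozenset({"thumb", "index"}): "import_kmod_file",
    frozenset({"middle", "ring"}): "define_xyz_display_range",
}

def flexed_fingers(bend_values):
    """Return the set of fingers whose normalized bend exceeds the threshold."""
    return frozenset(f for f, v in bend_values.items() if v > BEND_THRESHOLD)

def resolve_command(bend_values):
    """Map the current glove reading to a command label, or None if no match."""
    return COMMANDS.get(flexed_fingers(bend_values))

# Example reading: thumb and index flexed together -> file import shortcut
reading = {"thumb": 0.82, "index": 0.75, "middle": 0.10, "ring": 0.05}
print(resolve_command(reading))  # -> "import_kmod_file"
```

Grouping simultaneous finger flexions into one shortcut in this way is also what allows a single glove gesture to replace several mouse clicks, as reflected in the shorter operation list reported in Section 2.3.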

2.2. Experimenter

The experimenters consisted of six males and four females between 22 and 27 years old (average 24.8). All of the experimenters had normal or corrected-to-normal vision. They did not have any physical issues inhibiting the use of the four interface devices, and primarily used their right hand when operating the devices. The experimenters were already familiar with PCs, 2D mice, and KmodStudio. Apart from the 2D mouse, however, they had never used the other three interface devices, and they had no previous experience controlling software in a VR environment with an HMD. Therefore, before conducting the comparison experiment, we provided the 10 experimenters with 30 min of training in controlling KmodStudio with the HMD using each interface device. To prevent the learning effects that may occur when experimenters use the interface devices in the same order, the experimenters performed the work using the interface devices in different orders. In addition, to prevent the accumulation of fatigue from repeated device use, rest time was provided after each interface was used.

2.3. Software

KmodStudio is software for 3D orebody modeling and mine design that is widely used in the mining industry [30]. In this study, we set up a scenario in which 3D orebody modeling was performed with KmodStudio using the following procedure:
(1) After visualizing the data representing the orebody grade in 3D space, the user checks the geometric isotropy of the data. To visualize the data, the menus in the menu bar at the top are clicked in sequence. To check geometric isotropy, the modeling results are confirmed in real time by clicking various numerical values in the variogram modeling window, as shown in Figure 2b.
(2) Perform spatial interpolation of the 3D ore grade values using ordinary Kriging (Gaussian process regression). The GetPointSetOKG function is used to perform ordinary Kriging; to use this function, the variogram modeling results produced in the previous step are called up, and the numerical values are entered (a minimal sketch of the underlying computation is given after this list).
(3) After adjusting the range of the X, Y and Z axes to visualize the spatially interpolated data, check the results of the 3D orebody grade estimation by zooming in and out of the screen. To define the range of the X, Y and Z axes, we adjusted the scale bar for the visualization area, as shown in Figure 2d.
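KmodStudio's GetPointSetOKG function performs the ordinary Kriging step internally. As a rough illustration of the underlying computation only (not KmodStudio's implementation), the following Python sketch solves the ordinary Kriging system with a Gaussian variogram model; the variogram parameters and the drill-hole samples are hypothetical.

```python
import numpy as np

def gaussian_variogram(h, nugget=0.0, sill=1.0, rng=50.0):
    """Gaussian variogram model gamma(h); parameter values are illustrative."""
    return nugget + sill * (1.0 - np.exp(-(h / rng) ** 2))

def ordinary_kriging(points, values, target, variogram=gaussian_variogram):
    """Estimate the grade at `target` from sampled `points`/`values`.

    points : (n, 3) array of sample coordinates
    values : (n,) array of ore grades at the samples
    target : (3,) coordinates of the estimation point
    """
    n = len(points)
    # Pairwise sample distances and sample-to-target distances
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    d0 = np.linalg.norm(points - target, axis=1)

    # Ordinary Kriging system: variogram matrix bordered by the unbiasedness constraint
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0
    b = np.append(variogram(d0), 1.0)

    weights = np.linalg.solve(A, b)[:n]
    return float(weights @ values)

# Toy example with hypothetical drill-hole samples (x, y, z) and grades
samples = np.array([[0, 0, 0], [40, 10, 5], [10, 35, 20], [30, 30, 10]], dtype=float)
grades = np.array([1.2, 0.8, 1.5, 1.0])
print(ordinary_kriging(samples, grades, np.array([20.0, 20.0, 10.0])))
```

In practice, the variogram parameters come from the modeling step in (1), and the estimation is repeated over a 3D grid of block centers to build the orebody grade model visualized in step (3).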
According to the above procedure, we classified the unit operations, such as mouse cursor movements and clicks, required to control KmodStudio, and compiled a list of 69 operations. Following this operation list, the experimenters performed the comparison experiment by controlling KmodStudio using the four user interface devices. However, when using the Kinect sensor and the bend-sensing data glove, users can perform multiple unit operations at once according to the finger bending motions, as shown in Table 2; for this device, the list therefore comprised 41 operations. Figure 2 shows the major unit operations required for 3D orebody modeling using KmodStudio.

2.4. PC and HMD Device

We conducted the experiment on a desktop PC with an Intel Core i9-7940X CPU (Intel, Santa Clara, CA, USA), 64 GB of Samsung RAM (Samsung Electronics, Suwon, Korea), and an ASUS GTX 1080Ti 11 GB graphics card (ASUS, Beitou, Taiwan). Windows 10 Pro for Workstations was used as the operating system. We used the Oculus Rift (Oculus, Irvine, CA, USA) as the HMD for implementing the VR environment, and the Virtual Desktop (Oculus, Irvine, CA, USA) software to project the PC screen to the HMD. Additionally, we connected a large monitor to the PC so that we could view the screen that the experimenters saw through the HMD. This allowed us to analyze the working time spent on each unit operation, the number of device clicks, and the click accuracy by recording the process as each experimenter controlled KmodStudio using each of the user interface devices (Figure 3).

2.5. Survey

We conducted a user survey to evaluate the ease of learning, ease of use, immersion and fatigue for the four interface devices. Ease of learning and ease of use were each rated from 1 (very difficult) to 5 (very easy). The level of immersion that users felt in the VR environment when using each device was rated from 1 (none) to 5 (very high), as was the level of fatigue felt when using each device.

3. Experimental Results

Figure 4 shows the cumulative working times of three of the experimenters wearing the HMD while performing the 69 unit operations using the four user interface devices in the VR environment. The working times for each user interface device differed slightly between experimenters. Experimenter No. 1 finished the operations fastest when using the Kinect sensor and the bend-sensing data glove, and spent the longest time when using the 2D & 3D mice. On the other hand, Experimenters No. 2 and No. 3 spent the shortest time when using the 2D mouse. Apart from the VR controller, no significant difference was observed in the working time spent by Experimenter No. 2 across the other three user interface devices. However, for Experimenters No. 1 and No. 3, the working times differed across all four user interface devices.
The variation in the number of device clicks while performing the operations was relatively small between experimenters (Figure 5). All experimenters (Nos. 1–3) recorded the fewest clicks when using the Kinect sensor and the bend-sensing data glove, followed by the 2D mouse, the 2D & 3D mice, and the VR controller. The number of clicks was generally higher when using the VR controller, presumably because this device is difficult to control precisely.
Figure 6 shows the means and box plots of the experimental results for the 10 experimenters. The box plots include the minimum, median and maximum values, along with the interquartile range (IQR) bounded by the first quartile (Q1) and third quartile (Q3). The mean value for the 10 experimenters is denoted by the X symbol. The 2D mouse recorded the shortest total working time, with an average of 99.3 s, followed in order by the Kinect sensor and the bend-sensing data glove and then the 2D & 3D mice. The VR controller recorded the longest time, taking 30 s longer than the 2D mouse. The Kinect sensor and the bend-sensing data glove took longer than the 2D mouse, but the difference was only 3.6 s.
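For reference, the box-plot statistics summarized in Figure 6 can be reproduced from the raw measurements as in the short Python sketch below; the working-time values used here are placeholders, not the measured data.

```python
import numpy as np

# Hypothetical total working times (seconds) for the 10 experimenters with one
# device; the numbers are illustrative, not the values measured in this study.
working_times = np.array([92.1, 95.4, 97.8, 98.2, 99.0,
                          100.1, 100.3, 101.5, 103.7, 104.9])

q1, median, q3 = np.percentile(working_times, [25, 50, 75])
summary = {
    "min": working_times.min(),
    "Q1": q1,
    "median": median,
    "Q3": q3,
    "max": working_times.max(),
    "IQR": q3 - q1,               # box height in the plot
    "mean": working_times.mean(), # plotted as the X symbol in Figure 6
}
print(summary)
```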
The number of device clicks while performing the operations was the fewest (31.4 clicks) when using the Kinect sensor and the bend-sensing data glove. This is because the shortcut command functions assigned to the finger motions, as shown in Table 2, were used when controlling KmodStudio. This was followed in order by the 2D mouse, the 2D & 3D mice, and the VR controller, which recorded the greatest number of clicks. The differences in the number of clicks between user interface devices arise because incorrect clicks made while performing the operations are also included in the total; a greater number of clicks for the same operations therefore indicates more incorrect clicks.
The click accuracy refers to the ratio of the number of correct clicks to the total number of clicks. Using the 2D mouse resulted in a 6% higher click accuracy than using the 2D & 3D mice together, and a 5% higher click accuracy than using the Kinect sensor and the bend-sensing data glove. That is, the click accuracies of these three devices were similar. On the other hand, there was a significant difference (of 14%) between the 2D mouse and the VR controller. When using the 2D & 3D mice together, the experimenters tended to have trouble controlling devices with both hands, resulting in incorrect clicks.
When using the Kinect sensor and the bend-sensing data glove, the experimenters often made incorrect clicks because unintended finger bending or wrist movements triggered click events. With the VR controller in particular, the experimenters often failed to click accurately because the cursor moved considerably, even with slight wrist movements.
In terms of total working time, the results were widely distributed across experimenters when the 2D & 3D mice were used together. In contrast, when using the Kinect sensor and the bend-sensing data glove, most of the experimenters were close to the average value. The number of device clicks during the total operation showed large variation between experimenters using the VR controller, while the other three interfaces showed similar results overall. Click accuracy also varied widely across experimenters when using the VR controller. This is attributed to the experimenters reaching considerably different levels of proficiency with the VR controller during the training time (30 min provided for each interface device before starting the experiment). In contrast, the experimenters reached similar levels of proficiency with the Kinect sensor and the bend-sensing data glove.
Figure 7 shows the results of the survey on ease of learning, ease of use, immersion and fatigue. Regarding ease of learning, the 2D mouse received the highest score (4.8). This was followed in order by the Kinect sensor and the bend-sensing data glove, the 2D & 3D mice, and finally the VR controller. While the 2D mouse did not require any additional learning, the experimenters appeared to have difficulties rotating objects in the software with the 3D mouse joystick when using the 2D & 3D mice together. Regarding the Kinect sensor and the bend-sensing data glove, the experimenters had difficulties in learning the control functions assigned to the finger bending motions listed in Table 2.
Regarding ease of use, the 2D mouse also received the highest score. For the other three user interface devices, ease of use showed similar patterns to ease of learning. However, the ease-of-use score for the VR controller was significantly lower than those of the other interface devices. This is considered to be because the experimenters had difficulties in controlling the software using the VR controller.
Regarding the immersion that the experimenters felt while performing operations in the VR environment, the Kinect sensor and the bend-sensing data glove received the highest score, followed by the VR controller. The 2D mouse and the 2D & 3D mice received relatively low scores. These two interface devices received low immersion scores because the experimenters used them on the desk while sitting in a chair, which limited their movements and reduced their immersion in the VR environment.
In the VR environment, the fatigue that the experimenters felt for each interface device showed similar patterns to immersion. The experimenters showed high fatigue when using the Kinect sensor and the bend-sensing data glove and the VR controller, which they used while standing. It was found that the experimenters felt relatively low fatigue when using the 2D mouse and the 2D & 3D mice while sitting.

4. Discussion

The results of the comparison experiment performed in this study show that the 2D mouse recorded the shortest total working time and the highest click accuracy. This is considered to be because the experimenters were all familiar with the use of a 2D mouse, but were less familiar with the other three interface devices. However, the 2D mouse made it difficult for the experimenters to move freely, as they used the device on the desk, which reduced their immersion in the VR environment. The most important reasons for implementing a VR environment are realism and immersion, which enable users to perceive the virtual world as if it were real. In this respect, our experimental results confirm that it is difficult to maximize the advantages of a VR environment with an interface device, such as a 2D mouse, that limits body movements.
Comparing the two user interface configurations that provided high freedom of movement and immersion (the VR controller, and the Kinect sensor with bend-sensing data glove), the Kinect sensor and bend-sensing data glove reduced the total working time by 29.53 s and yielded 9.17% higher accuracy compared with the VR controller. General VR content controlled using VR controllers does not require high control accuracy, as it involves selecting or moving relatively large objects. However, a relatively high level of control accuracy is required when performing 3D orebody modeling using mining industry software in a VR environment. The results of the comparison experiment in this study indicate that the Kinect sensor and bend-sensing data glove were the most suitable of the tested user interface devices for controlling mining industry software in a VR environment.

5. Conclusions

In this study, we conducted a performance comparison of four interface devices (a 2D mouse, 2D & 3D mice, a VR controller, and a Kinect sensor with bend-sensing data glove) for controlling KmodStudio, a mining industry software product, in a VR environment. We analyzed the total working time, the number of device clicks, and the click accuracy for a total of ten experimenters after they completed the operations predefined in KmodStudio.
We then conducted a survey to evaluate the ease of learning, ease of use, immersion and fatigue for each device. Our findings show that the 2D mouse achieved the best performance in terms of working time, click accuracy, ease of learning and ease of use, while the Kinect sensor and the bend-sensing data glove received the highest scores for immersion and fatigue.
The 2D mouse imposes many restrictions on overall body movements because users control it while sitting in a chair. Therefore, given that the goal of implementing a VR environment is to maximize the immersion that enables us to perceive the virtual world as if it were real, a 2D mouse that limits body movements is not suitable. The Kinect sensor and the bend-sensing data glove, which maximized immersion with relatively few restrictions on body movements, showed excellent performance in working time and click accuracy. Thus, this combination is considered a suitable user interface for controlling mining industry software in a VR environment.
Currently, 3D user interface devices such as the Kinect sensor and bend-sensing data glove cannot completely replace traditional 2D user interface devices such as keyboards and mice. Although 3D user interface devices are effective in providing a high level of immersion in a VR environment, some operations still require character inputs using keyboards, or precise control using 2D mice to control mining industry software. Nevertheless, as VR technology rapidly advances and various HMD devices are commercialized, it is expected that the market demand for the development and utilization of user interface devices for controlling mining industry software will be even greater.
Although several previous studies have addressed the development of user interface devices that can enhance user immersion and effectively control software in a VR environment [22,23,24,25,26,27,28,29], they did not quantitatively compare the performance of the various user interface devices that can control mining industry software. Therefore, our research findings are expected to serve as a useful reference for the future development of user interface devices in the mining industry. In future work, the workload associated with each user interface device should be evaluated using standard instruments such as NASA-TLX [31].

Author Contributions

Y.C. conceived and designed the experiments; H.K. performed the experiments; H.K. and Y.C. analyzed the data; Y.C. contributed reagents/materials/analysis tools; H.K. and Y.C. wrote the paper.

Funding

This work was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2018R1D1A1A09083947).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Steuer, J. Defining Virtual Reality: Dimensions Determining Telepresence. J. Commun. 1992, 42, 73–93.
  2. Sherman, W.R.; Craig, A.B. Understanding Virtual Reality: Interface, Application, and Design; Morgan Kaufmann: San Francisco, CA, USA, 2003; ISBN 1-55860-353-0.
  3. Sutherland, I.E. A head-mounted three dimensional display. In Proceedings of the Fall Joint Computer Conference, Part I, San Francisco, CA, USA, 9–11 December 1968; Association for Computing Machinery (ACM): New York, NY, USA, 1968; pp. 757–764.
  4. Lele, A. Virtual reality and its military utility. J. Ambient Intell. Humaniz. Comput. 2013, 4, 17–26.
  5. Parsons, T.D.; Rizzo, A.A. Affective outcomes of virtual reality exposure therapy for anxiety and specific phobias: A meta-analysis. J. Behav. Ther. Exp. Psychiatry 2008, 39, 250–261.
  6. Seymour, N.E.; Gallagher, A.G.; Roman, S.A.; O'Brien, M.K.; Bansal, V.K.; Andersen, D.K.; Satava, R.M. Virtual Reality Training Improves Operating Room Performance: Results of a Randomized, Double-Blinded Study. Ann. Surg. 2002, 236, 458–464.
  7. Portman, M.E.; Natapov, A.; Fisher-Gewirtzman, D. To go where no man has gone before: Virtual reality in architecture, landscape architecture and environmental planning. Comput. Environ. Urban Syst. 2015, 54, 376–384.
  8. Gotsis, M.; Lympouridis, V.; Turpin, D.; Tasse, A.; Poulos, I.C.; Tucker, D.; Swider, M.; Thin, A.G.; Jordan-Marsh, M. Mixed reality game prototypes for upper body exercise and rehabilitation. In Proceedings of the 2012 IEEE Virtual Reality Workshops (VRW), Costa Mesa, CA, USA, 4–8 March 2012; IEEE: New York, NY, USA, 2012; pp. 181–182.
  9. McMahan, R.P.; Bowman, D.A.; Zielinski, D.J.; Brady, R.B. Evaluating Display Fidelity and Interaction Fidelity in a Virtual Reality Game. IEEE Trans. Vis. Comput. Graph. 2012, 18, 626–633.
  10. Grossard, C.; Grynspan, O.; Serret, S.; Jouen, A.L.; Bailly, K.; Cohen, D. Serious games to teach social interactions and emotions to individuals with autism spectrum disorders (ASD). Comput. Educ. 2017, 113, 195–211.
  11. Chenechal, M.L.; Duval, T.; Gouranton, V.; Royan, J.; Arnaldi, B. Vishnu: Virtual Immersive Support for HelpiNg Users, an interaction paradigm for collaborative remote guiding in mixed reality. In Proceedings of the 2016 IEEE Third VR International Workshop on Collaborative Virtual Environments (3DCVE), Greenville, SC, USA, 20 March 2016; IEEE: New York, NY, USA, 2016.
  12. Torres, F.; Tovar, L.A.N.; del Rio, M.S. A learning evaluation for an immersive virtual laboratory for technical training applied into a welding workshop. J. Math. Sci. Technol. Educ. 2017, 13, 521–532.
  13. Anton, D.; Berges, I.; Bermúdez, J.; Goñi, A.; Illarramendi, A. A Telerehabilitation System for the Selection, Evaluation and Remote Management of Therapies. Sensors 2018, 18, 1459.
  14. Quail, N.; Holmes, K.; Linn, A.; Rea, P.; Livingstone, D.; Boyle, J. Serious Digital Games Using Virtual Patients and Digital Chalk-talk Videos: A Novel Approach to Teaching Acute Diabetes Care in the Undergraduate Medical Curriculum. In Proceedings of the Diabetes UK Professional Conference 2018, London, UK, 14–16 March 2018.
  15. van Wyk, E.; de Villiers, R. Virtual Reality Training Applications for the Mining Industry. In Proceedings of the 6th International Conference on Computer Graphics, Virtual Reality, Visualisation and Interaction in Africa, Pretoria, South Africa, 4–6 February 2009; Spencer, S.N., Ed.; Association for Computing Machinery (ACM): New York, NY, USA, 2009; pp. 53–63.
  16. Kizil, M.S.; Kerridge, A.P.; Hancock, M.G. Use of virtual reality in mining education and training. In Proceedings of the 2004 CRCMining Research and Effective Technology Transfer Conference, Noosa Head, Queensland, Australia, 15–16 June 2004; Gurgenci, H., Hall, A., Howarth, D., Lever, P., Meyer, T., Nebot, E., Eds.; Cooperative Research Centre-Mining (CRCMining): Brisbane, Australia, 2004; pp. 1–7.
  17. Orr, T.J.; Mallet, L.G.; Margolis, K.A. Enhanced fire escape training for mine workers using virtual reality simulation. Min. Eng. 2009, 61, 41–44.
  18. Llamazoo. MineLife VR. Available online: https://www.llamazoo.com/wp-content/uploads/2018/05/MineLife.2Pager.LlamaZOO.pdf (accessed on 12 April 2019).
  19. MOON PATROL VR. Available online: https://moonpatrolvr.com/ (accessed on 12 April 2019).
  20. Coal Services. Virtual Reality Technologies (VRT). Available online: https://www.coalservices.com.au/mining/mines-rescue/virtual-reality-technologies-vrt/ (accessed on 12 April 2019).
  21. TECKNOTROVE. Virtual Reality in Mine Training. Available online: http://tecknotrove.com/industries/mining-simulators/dump-truck-simulators/ (accessed on 12 April 2019).
  22. LaViola, J.J., Jr.; Kruijff, E.; McMahan, R.P.; Bowman, D.A.; Poupyrev, I. 3D User Interfaces: Theory and Practice, 2nd ed.; Taub, M., Ed.; Addison-Wesley Professional: Boston, MA, USA, 2014; ISBN 0-13-403446-5.
  23. Kim, Y.W.; Jo, D.S.; Kim, Y.H.; Kim, H.M.; Kim, K.H. Realistic Interaction Technology for Virtual Reality. Electron. Telecommun. Trends 2012, 27, 62–72.
  24. Kaiser, E.; Olwal, A.; McGee, D.; Benko, H.; Corradini, A.; Li, X.; Cohen, P.; Feiner, S. Mutual Disambiguation of 3D Multimodal Interaction in Augmented and Virtual Reality. In Proceedings of the 5th International Conference on Multimodal Interfaces, Vancouver, BC, Canada, 5–7 November 2003; Association for Computing Machinery (ACM): New York, NY, USA, 2003; pp. 12–19.
  25. Kok, A.J.F.; van Liere, R. A multimodal virtual reality interface for 3D interaction with VTK. Knowl. Inf. Syst. 2007, 13, 197–219.
  26. Park, J.H.; Park, K.S. Human performance evaluation of the three-dimensional input devices in virtual environment system. J. Ergon. Soc. Korea 2000, 19, 49–61.
  27. Anton, D.; Kurillo, G.; Bajcsy, R. User experience and interaction performance in 2D/3D telecollaboration. Future Gener. Comput. Syst. 2017, 82, 77–88.
  28. Kim, H.; Choi, Y. Development of a 3D User Interface based on Kinect Sensor and Bend-Sensing Data Glove for Controlling Software in the Mining Industry. J. Korean Soc. Miner. Energy Resour. Eng. 2019, 56, 44–52.
  29. Bednarz, T.P.; Caris, C.; Thompson, J.; Wesner, C.; Dunn, M. Human-computer interaction experiments - Immersive virtual reality applications for the mining industry. In Proceedings of the 2010 24th IEEE International Conference on Advanced Information Networking and Applications, Perth, WA, Australia, 20–23 April 2010; IEEE: New York, NY, USA, 2010; pp. 1323–1327.
  30. Korea Resources Corporation (KORES). Available online: https://www.kores.or.kr/ (accessed on 12 April 2019).
  31. NASA. Task Load Index (TLX). Available online: https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20000021488.pdf/ (accessed on 10 June 2019).
Figure 1. Three-dimensional (3D) user interface devices used in this study. (a) Two-dimensional (2D) mouse (M185, Logitech, Lausanne, Switzerland); (b) 3D mouse (Space Explorer, 3D Connexion, München, Germany); (c) virtual reality (VR) controller (Oculus Touch, Oculus, Irvine, CA, USA); and (d) bend-sensing data glove with Kinect sensor [28].
Figure 2. Major unit operations for orebody modeling using KmodStudio. (a) Three-dimensional (3D) visualization of sampling data representing the grade of an orebody; (b) variogram modeling; (c) spatial interpolation using ordinary Kriging; and (d) 3D visualization and evaluation of the estimated grade of the orebody.
Figure 3. Images of experimenters controlling the KmodStudio software in a virtual reality (VR) environment using different user interface devices. (a) Two-dimensional (2D) mouse; (b) 2D and three-dimensional (3D) mice; (c) VR controller; and (d) bend-sensing data glove with a Kinect sensor.
Figure 4. Cumulative working time for controlling KmodStudio in a virtual reality (VR) environment, using different user interface devices. Experimenter No. (a) 1, (b) 2, and (c) 3.
Figure 5. The cumulative number of clicks for controlling KmodStudio in a virtual reality (VR) environment using different user interface devices. Experimenter No. (a) 1, (b) 2, and (c) 3.
Figure 6. Results from experiments on controlling KmodStudio in a virtual reality (VR) environment using different user interface devices. (a) Total working time; (b) number of total clicks; and (c) click accuracy.
Figure 7. Questionnaire survey results for users controlling KmodStudio in a virtual reality (VR) environment using different user interface devices. (a) Ease of learning; (b) ease of use; (c) level of immersion; and (d) level of fatigue.
Table 1. Experimental conditions and evaluation criteria related to user interface devices, experimenter, software, and the head mounted display (HMD) device.
Item | Condition
User interface device | 2D mouse, 2D & 3D mice, VR controller, bend-sensing data glove with Kinect sensor
Experimenter | 10 people (6 male, 4 female, average 24.8 years old)
Mining software | KmodStudio (Korea Resources Corporation, Wonju, Korea)
Task | 3D orebody modeling consisting of 69 operations
HMD device | Oculus Rift (Oculus, Irvine, CA, USA)
Quantitative evaluation criteria | Total working time, number of total clicks, click accuracy
Survey evaluation criteria | Ease of learning, ease of use, level of immersion, level of fatigue
Table 2. Functions of bend-sensing data glove according to the user’s finger bending motion.
Flexed Finger | Function
Thumb | Enter key of keyboard
Index finger | Left click of 2D mouse
Middle finger | Get point data set for variogram
Ring finger | Get point data set for ordinary Kriging
Thumb and index finger | Import Kmod data file
Middle and ring finger | Define X, Y, and Z display ranges

