Article

Agreement Analysis between Vive and Vicon Systems to Monitor Lumbar Postural Changes

by Susanne M. van der Veen 1,*, Martine Bordeleau 2, Peter E. Pidcoe 1, Christopher R. France 3 and James S. Thomas 1

1 Department of Physical Therapy, Virginia Commonwealth University, MCV Campus, Richmond, VA 23220, USA
2 Department of Neuroscience, Université Laval, Québec, QC G1V 4G2, Canada
3 Department of Psychology, College of Arts & Sciences, Ohio University, Athens, OH 45701, USA
* Author to whom correspondence should be addressed.
Sensors 2019, 19(17), 3632; https://doi.org/10.3390/s19173632
Submission received: 26 July 2019 / Revised: 14 August 2019 / Accepted: 19 August 2019 / Published: 21 August 2019
(This article belongs to the Special Issue Wireless Body Sensors)

Abstract

Immersive virtual reality has recently developed into a readily available system that allows for full-body tracking. Can this affordable system be used for component tracking to advance or replace expensive kinematic systems for motion analysis in the clinic? The aim of this study was to assess the accuracy of position and orientation measures from Vive wireless body trackers when compared to Vicon optoelectronic tracked markers attached to (1) a robot simulating trunk flexion and rotation by repeatedly moving to known locations, and (2) healthy adults playing virtual reality games necessitating significant trunk displacements. The comparison of the two systems showed that component tracking with Vive trackers is accurate to within 0.68 ± 0.32 cm translationally and 1.64 ± 0.18° rotationally when compared with a three-dimensional motion capture system. No significant differences between the Vive trackers and the Vicon system were found, suggesting the Vive wireless sensors can be used to accurately track joint motion for clinical and research purposes.

1. Introduction

The last decade has witnessed a booming development in immersive virtual reality (VR) technology, where the user has the perception of being physically present in a non-physical environment. These alternative realities can be created by surrounding the user with computer-generated sensory perceptions stimulating vision (e.g., head-mounted displays [1], cave automatic virtual environments [2], etc.), hearing (e.g., virtual acoustics [3], binaural sounds [4], etc.), touch (e.g., haptics technology [5], tactile [6] or force [7] feedback, etc.), smell (e.g., scent cartridges [8,9]), and taste (controlled electrical pulses [10] and aromas [11,12]).
VR has long been associated with gaming, but now it is expanding into other fields like the healthcare industry. According to a new report by Grand View Research, the VR and augmented reality healthcare market is expected to reach United States Dollars (USD) 5.1 billion by 2025 [13]. VR has been used in numerous biomedical applications from surgical training to medical education, and from psychiatric to motor rehabilitation. Such applications for patient care include the treatment of acute and chronic pain [14,15,16,17,18], specific phobias [19,20], and post-traumatic stress disorder [20,21]. Among several other applications, VR has been successfully used in cognitive and physical rehabilitation after stroke [22,23] and traumatic brain injury [24].
When combined with an optoelectronic three-dimensional (3D) motion capture system, VR becomes a valuable tool to enhance patient motivation during long-term rehabilitation sessions while giving the clinician real-time measurement of joint movement and motor control. However, optoelectronic 3D motion capture systems, such as Vicon, can cost up to a hundred thousand US dollars, putting this technology out of reach for most clinicians and researchers.
Motion capture systems are used in numerous fields, and studies based on motion capture data can be found in the biomechanical, sport, and clinical sciences. Optoelectronic motion capture systems based on passive reflective markers were originally developed for gait analysis [25], but this method of motion analysis has also been used to quantify lumbar flexion during reaching tasks [26,27,28]. Optoelectronic motion capture systems provide a powerful measuring tool for biomechanical applications, as assessments of position accuracy [29] with these systems have reported errors of less than 2 mm [30].
In April 2016, HTC and Valve Corporation released a head-mounted display (HMD) called the HTC Vive. The HTC Vive is an HMD that provides a fully immersive VR environment and costs less than traditional motion tracking systems, such as Vicon. While tracking the orientation and position of the HMD in VR is critical to providing a seamless immersive environment, we found only one recent article that has quantified the precision and accuracy of the position and orientation of the Vive HMD [31] and one evaluating the accuracy of the HTC Vive hand controllers and trackers [32]. Niehorster and colleagues [31] compared the Vive HMD tracking with a Cartesian grid and a WorldViz PPT-X optical tracking system and suggested that the tracking accuracy of the Vive HMD was unsuitable for scientific experiments that require accurate visual simulation of self-motion through a virtual world. The larger errors reported were likely due to tracking of the HMD being lost by the lighthouses (the hardware used to track the HTC Vive HMD and controllers) and tilting of the orientation of the VR space when tracking was re-established. However, others recently found Vive controllers and trackers to have high accuracy (i.e., less than a millimetre) in both static and dynamic planar tasks when additional custom tracking algorithms were applied [32]. Additionally, a recent master's thesis on the accuracy of Vive tracker position reported a modest error in positional tracking (i.e., 7.5 mm compared to real-world positioning and 1 cm with respect to a research-grade system) [33]. Although Vive trackers have been shown to have reasonable positional accuracy in static tasks, a more robust examination during dynamic multiplanar tasks is needed to address whether these devices are suitable alternatives for biomedical applications.
As Vive trackers are wireless and lightweight, they can be securely attached to different parts of the body; it might therefore be possible to accurately track body motion, eliminating the need for an expensive 3D kinematic system. The aim of this study was to assess the accuracy of position and orientation measures reported by Vive trackers in comparison to an optoelectronic 3D motion capture system. The accuracy of the Vive trackers was examined using (1) a robot producing fixed single-plane and multiplanar motions and (2) trackers attached to the thorax and sacrum of human participants engaging in VR gameplay.

2. Materials and Methods

Simultaneous recordings of the position and orientation of sacral and thoracic marker clusters were made using custom 3D-printed mounting plates that integrated light reflective markers with HTC Vive trackers (see Figure 1b,c). We examined the accuracy of the trackers under two conditions. In the first condition, the sacral and thoracic marker clusters were mounted onto a robot arm (SCORBOT) that made single-plane and multiplanar movements. In the second condition, the sacral and thoracic marker clusters were attached to healthy participants (n = 7) during VR gameplay.

2.1. Motion Capture Systems

2.1.1. Vive System

The Vive kinematic system employed in this experiment consisted of two HTC Vive wireless trackers (HTC America, Inc., Seattle, WA, USA) (Table 1). The position and orientation of these trackers were determined by two fixed infrared laser emitters, or “Lighthouses” (HTC Vive Lighthouses, Valve Corporation, Bellevue, WA, USA). Our experiments used first generation (1.0) “Lighthouses” during the human participant gameplay and second generation (2.0) “Lighthouses” for the robot-simulated movements.
First generation (1.0) “Lighthouses” use infrared lasers that alternately sweep horizontal and vertical lines of light through a 120° field-of-view collection volume. A synchronization pulse at the beginning of each sweep resets the internal clock of the trackers to time zero. The time from the start of a sweep to beam detection by a surface-mounted photodiode infers the angular position of the tracker relative to the fixed “Lighthouse” location. Multiple photodiode detectors on each tracker provide additional orientation information. Time-stamped data are then transformed to a world reference frame and recorded. Second generation (2.0) “Lighthouses” use the same laser projections, but the synchronization flashes are replaced by a sync-on-beam signal that encodes angular position directly, eliminating the need for a time-based calculation [34].
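To make the first-generation timing scheme concrete, the sketch below converts a sync-to-detection delay into a beam angle. This is a minimal illustration; the 60 Hz rotor rate is our assumption about 1.0 lighthouses, not a figure reported in this paper.

```python
# Minimal sketch of the 1.0 lighthouse timing-to-angle conversion.
# The 60 Hz rotor rate is an assumed value, not taken from this paper.
SWEEP_HZ = 60.0  # assumed rotations per second of the sweeping laser

def beam_angle_deg(t_sync: float, t_hit: float) -> float:
    """Angle swept between the synchronization pulse (t_sync) and beam
    detection at a photodiode (t_hit), both in seconds."""
    return (t_hit - t_sync) * SWEEP_HZ * 360.0

# A photodiode hit 2.5 ms after the sync pulse lies 54 deg into the sweep.
print(beam_angle_deg(0.0, 0.0025))  # 54.0
```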
Using custom software developed within the Unity game engine (version 2018.2.6f1, Unity Technologies, San Francisco, CA, USA), the 6-degrees-of-freedom (DOF) kinematic data collection from the trackers was set to a sampling frequency of 100 Hz. However, variations in software and hardware processing resulted in actual sampling frequencies ranging from 58 to 100 Hz. Since each sample was time-stamped, data were adjusted in post-processing using linear interpolation methods within the Matlab environment (version R2018a, MathWorks, Natick, MA, USA) to ensure a 100 Hz output.
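A minimal sketch of this resampling step is shown below, assuming time-stamped samples and linear interpolation onto a uniform 100 Hz grid. The original post-processing was done in Matlab; the Python names here are illustrative rather than taken from the authors' code.

```python
import numpy as np

def resample_100hz(t, x, fs=100.0):
    """Linearly interpolate time-stamped samples onto a uniform grid.

    t: 1-D array of sample timestamps in seconds (irregular, 58-100 Hz)
    x: 1-D array of signal values (one channel of position or angle)
    Returns the uniform time base and the resampled signal.
    """
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float)
    t_uniform = np.arange(t[0], t[-1], 1.0 / fs)
    return t_uniform, np.interp(t_uniform, t, x)

# Usage (illustrative): t_u, thorax_x_u = resample_100hz(stamps, thorax_x)
```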

2.1.2. Vicon System

The Vicon kinematic system consisted of ten Vicon Bonita 10 cameras (Vicon Motion Systems Ltd., Oxford, UK), which tracked the 6-DOF position and orientation of light-reflective marker clusters mounted on the custom 3D-printed mounting plates (on which the HTC Vive trackers were co-located). These plates were attached to the thorax and sacrum of human participants and to the robot using elastic straps. The 6-DOF kinematic data from the light reflective markers were collected at a sampling frequency of 100 Hz (with a spatial resolution of 0.1 mm) and recorded using Motion Monitor software (Innovative Sports Training, Chicago, IL, USA) in the Euler angle sequence y, z, x (Figure 2).

2.2. Immersive Virtual Reality Environment

We created a custom virtual reaching game, called Matchality (see Supplementary Materials: Multimedia Appendix 1), using Unity software. The game requires players to reach toward a static set of cubes, arranged in a 4 × 4 grid, at a self-selected pace. The locations of the cubes in the virtual space are such that the participant can touch the highest row of cubes with 15° of lumbar flexion, while the second through fourth rows necessitate 30°, 45°, and 60° of lumbar flexion, respectively. These target positions are based on a standardized algorithm that takes into account the anthropometrics of the participant [35]; a purely illustrative sketch of such a computation is given below. Given the number of body segments involved in whole-body reaching, however, kinematic redundancy allows for successful completion of the task with an infinite number of movement strategies. Game play begins with the random illumination of two cubes for 100 ms each. Participants must then move such that their avatar touches the previously lit cubes in the same sequence as presented. After each successful completion, the pattern is repeated with an additional cube added to create a longer sequence. The sequence length continues to increase until the player is unable to correctly match the sequence, at which point the game reverts to a sequence of only two cubes. This is a time-based game that lasts 90 s per set. There are two sets per level and three levels per game. The HTC Vive system (HTC America, Inc., Seattle, WA, USA) was used to allow physical movement of the virtual avatar within the virtual environment. The participants were immersed using a wired HMD (470 g) that uses an organic light-emitting diode display and provides a resolution of 1080 × 1200 per eye, with a refresh rate of 90 Hz and a field of view of 110°. See Multimedia Appendix 1 for an example of the Matchality game.
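The sketch below places a target where the hand would land with a given trunk flexion, the elbow extended, and the shoulder flexed 90° (see Section 2.6). This is a hedged reading of the approach; the authors' actual algorithm [35] is more detailed, and all formulas and numbers here are assumptions.

```python
import math

def target_position(hip_height_m, trunk_len_m, arm_len_m, flexion_deg):
    """Illustrative target location (height, forward distance) reachable
    with the given trunk flexion, elbow extended, shoulder flexed 90 deg.
    A simplification of the anthropometric algorithm in [35]."""
    theta = math.radians(flexion_deg)
    height = hip_height_m + trunk_len_m * math.cos(theta)
    forward = trunk_len_m * math.sin(theta) + arm_len_m
    return height, forward

# Example anthropometrics (made up): hip 0.95 m, trunk 0.50 m, arm 0.70 m.
for row, angle in enumerate((15, 30, 45, 60), start=1):
    h, d = target_position(0.95, 0.50, 0.70, angle)
    print(f"row {row}: height {h:.2f} m, forward {d:.2f} m")
```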

2.3. Measurement Set Up

Two HTC Vive lighthouses and ten Vicon Bonita 10 cameras were positioned around a 2.5 m × 2.5 m platform, providing an adequate data collection volume for unconstrained gameplay (Matchality is a stationary game; when a VR game requires greater player movement, a larger volume may need to be covered). The axes of the world reference frame for the Vicon system were such that positive z faces upward, positive x faces forward, and positive y faces leftward relative to the position of the SCORBOT and of the subject. Thus, flexion of the spine results in a clockwise rotation about the y-axis and a forward displacement along the x-axis. Twisting of the trunk results in a rotation about the z-axis with minimal displacement along the x- or y-axes.
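Since Unity reports positions in a left-handed, y-up frame while the Vicon world frame described above is z-up and x-forward, the Vive data must be re-expressed in the Vicon axes before comparison (see Section 2.8). The mapping below is a sketch under the assumption that Unity's +z (forward) aligns with Vicon +x; the paper does not report the exact transform used.

```python
import numpy as np

# Assumed axis mapping from Unity (x right, y up, z forward, left-handed)
# to the Vicon world frame described above (x forward, y left, z up).
UNITY_TO_VICON = np.array([
    [ 0.0, 0.0, 1.0],  # Vicon x (forward)  <- Unity z (forward)
    [-1.0, 0.0, 0.0],  # Vicon y (leftward) <- minus Unity x (rightward)
    [ 0.0, 1.0, 0.0],  # Vicon z (upward)   <- Unity y (upward)
])

def unity_to_vicon(p_unity):
    """Convert an (N, 3) array of Unity positions into Vicon axes."""
    return np.asarray(p_unity) @ UNITY_TO_VICON.T

print(unity_to_vicon([[1.0, 0.0, 0.0]]))  # Unity +x -> Vicon (0, -1, 0)
```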

2.4. Robot Instrumentation

To evaluate the accuracy and repeatability of position and orientation measurements in a controlled environment, we used a SCORBOT ER VII (Eshed Robotec/RoboGroup, Rosh Ha'Ayin, Israel) to rotate the HTC Vive trackers to known positions requiring angular rotation about the y- and z-axes (simulating spine flexion and rotation). Note that rotation about the y-axis results in a significant displacement along the x-axis, while rotation about the z-axis has minimal displacement along the x- and y-axes. As illustrated in Figure 3, the SCORBOT ER VII is a 5-axis robot allowing controlled 6-DOF movement of the distal link. The thoracic marker plate was placed on the end-effector (or gripper, as labeled in Figure 3) and the lumbar marker plate was placed on the rigid segment between the elbow and shoulder (Figure 3). This configuration allowed movement of the gripper (about the z-axis) to simulate lumbar rotation and movement of the elbow (about the y-axis) to simulate lumbar flexion. The SCORBOT ER VII was programmed to move to positions of approximately 15°, 30°, 45°, and 60° of rotation about the y-axis and 0°, 15°, and 45° of rotation about the z-axis, displacing the Vicon marker clusters and HTC Vive trackers synchronously.

2.5. Participants

Seven healthy adults between the ages of 18 and 35 were recruited for this study. The exclusion criteria were (1) a history of low back injury that required medical attention, including chiropractic care or missing school or work, (2) current low back pain or low back pain within the last 6 months, or (3) any orthopedic or neurological impairment that would prevent participants from executing tasks requiring moderate amounts of trunk flexion. The Institutional Review Board of Ohio University approved the study protocol and all participants signed an informed consent form. In compensation for their time, participants received an Ohio University Motor Control Lab t-shirt.
The participants were asked to wear shorts, a shirt, and gym shoes provided by the laboratory. Vicon light-reflective markers were attached to custom-designed 3D-printed plates to create marker clusters for measuring 6-DOF motion. These plates were designed to allow the attachment of HTC Vive trackers in the center of the marker clusters, thus co-locating the two sensors. The 3D-printed components were attached to the thorax and sacrum using elastic straps.

2.6. Procedure

(i) SCORBOT: With the sacrum and thorax components attached to the SCORBOT, four consecutive trials of 20 reaches were performed at 15°, 30°, 45°, and 60° of rotation about the y-axis, with five reaches at each of 0°, 15°, 30°, and 60° of additional rotation about the z-axis. For each reach, the motion was paused for 3 s at the target angle about the y- and z-axes. Each movement started from a 0° flexion/0° rotation position and data were recorded for the entirety of the movement.
(ii) Participants: Participants played one game of Matchality, which consisted of two sets at each of the three levels of the game (see link to video in Supplementary Materials). The location of the 4 × 4 set of blocks in this game was determined based on the participant's arm length, trunk length, and hip height. Specifically, row 1 would be reached, in theory, with 15° of trunk flexion with the elbow extended and the shoulder flexed 90°; however, the body is a redundant system with multiple degrees of freedom [27,35]. Rows 1–4 could be reached by flexing the trunk 15°, 30°, 45°, and 60°, respectively.

2.7. Outcomes

The time series data were examined to determine the root mean square (RMS) error and the mean difference at peak displacement for both spatial position (position of the sacrum and thorax plates within the global coordinate system, in cm) and orientation (rotation of the sacrum and thorax plates within the global coordinate system, in degrees).
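A minimal sketch of these two outcome measures, assuming the Vive and Vicon series are already resampled and temporally aligned (Sections 2.1.1 and 2.8):

```python
import numpy as np

def rms_error(vive, vicon):
    """Root mean square difference between two aligned time series."""
    d = np.asarray(vive, dtype=float) - np.asarray(vicon, dtype=float)
    return np.sqrt(np.mean(d ** 2))

def peak_difference(vive, vicon):
    """Difference between the two systems at their peak displacements."""
    vive = np.asarray(vive, dtype=float)
    vicon = np.asarray(vicon, dtype=float)
    return vive[np.argmax(np.abs(vive))] - vicon[np.argmax(np.abs(vicon))]
```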

2.8. Data Collection

The time series position and orientation data for both systems were derived from the global coordinate system data with the Motion Monitor and Unity software. As the axes differ between the two systems, the data from the HTC Vive trackers were translated to conform to the world axes of the Vicon system before being sent to the Motion Monitor software. The time-series Euler angle data and position data from the Vicon clusters and the HTC Vive trackers were exported from the Motion Monitor software. These time series data were imported into Matlab and processed using custom programs. All data were smoothed using a 40-point Savitzky–Golay filter [36] and the DC offset was removed. The two data sets were temporally aligned based on known events (the initial start-of-game output from the Unity game engine).
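A hedged Python equivalent of the smoothing and detrending step is sketched below. scipy's savgol_filter expects an odd window length, so a 41-point window stands in for the paper's 40-point filter, and the polynomial order is our assumption (the paper does not report one).

```python
import numpy as np
from scipy.signal import savgol_filter

def smooth_and_detrend(x, window=41, polyorder=3):
    """Savitzky-Golay smoothing followed by DC offset removal.
    window=41 approximates the paper's 40-point filter (scipy requires
    an odd window); polyorder=3 is an assumed value."""
    x = np.asarray(x, dtype=float)
    x_smooth = savgol_filter(x, window_length=window, polyorder=polyorder)
    return x_smooth - x_smooth.mean()  # remove the DC offset
```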

2.9. Statistical Analysis

RMS error was computed on the time-series data from the Vive and Vicon positional and orientation data streams. The average RMS ± standard deviation (STD) is presented separately for the SCORBOT and for the data collected on human participants. Separate 2-way repeated-measures ANOVAs (SPSS version 24.0, IBM) were used to assess differences in peak positional and orientation displacement between the HTC Vive trackers and Vicon marker clusters for the SCORBOT and the participants. Normality was assessed using standard skewness and kurtosis thresholds (i.e., <~1.25). The within-subject factors for both the SCORBOT and participant ANOVAs were system (i.e., Vicon, Vive) and target angle (15°, 30°, 45°, 60°). Post hoc comparisons were assessed using Bonferroni tests with adjustment for multiple comparisons. p-values < 0.05 were considered significant.
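For readers without SPSS, an equivalent 2-way repeated-measures ANOVA can be run in Python with statsmodels, as sketched below. The file name and column names are hypothetical, and the data must be in long format with one row per subject-by-condition cell.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format table: one row per subject x system x angle,
# with the peak displacement for that cell in the 'peak' column.
df = pd.read_csv("peak_displacement.csv")  # columns: subject, system, angle, peak

result = AnovaRM(df, depvar="peak", subject="subject",
                 within=["system", "angle"]).fit()
print(result)  # F statistics and p-values for System, Angle, System x Angle
```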

3. Results

Seven healthy adults were recruited for this study (see Table 2 for participant characteristics). The SCORBOT data are divided into position and orientation for the target angles of 15°, 30°, 45°, and 60°. Descriptive data for both experiments are reported as means ± standard deviation (see Table 3 and Table 4).

3.1. Position

Positional RMS error, averaged across the thorax and sacrum components as well as across the SCORBOT and participants, was 0.58 ± 0.89 cm. Positional RMS error, averaged across the SCORBOT and participants, was larger for the thorax component (0.77 ± 1.07 cm) than for the sacrum component (0.38 ± 0.61 cm). This is most likely driven by the multiplanar movement of the thorax component compared to the single-plane motion of the sacrum component. Finally, RMS error was larger in participant gameplay (1.30 ± 0.92 cm) than for the SCORBOT (0.02 ± 0.01 cm).
(i) SCORBOT: No main effect of system was found between the Vicon (0.46 ± 0.01 cm) and Vive (0.68 ± 0.24 cm) systems on peak displacement of the thorax component (p = 0.37). Further, there was no interaction (p = 0.82) between system and target angle (15°, 30°, 45°, 60° rotation about the y-axis). However, as expected, there was a main effect of target angle on peak displacement (F(3,6) = 2328.4, p < 0.001).
(ii) Participants: No main effect of system was found between the Vicon (0.24 ± 0.02 cm) and Vive (0.21 ± 0.04 cm) systems on peak displacement of the thorax component (p = 0.41). While an interaction effect between system and target angle (15°, 30°, 45°, 60°) about the y-axis trended toward significance (p = 0.053), pairwise comparisons revealed no significant differences in peak displacement between Vicon and Vive for the four target angles (15°, p = 0.36; 30°, p = 0.42; 45°, p = 0.42; 60°, p = 0.13).

3.2. Orientation

The average RMS error in orientation was 1.46 ± 0.59°. The difference between the two measurement methods was similar for the participants (1.61 ± 0.62°) and the SCORBOT (1.66 ± 0.58°). The rotational RMS error was larger for the thorax component (1.90 ± 0.65°) than for the sacrum component (1.38 ± 0.39°); however, as noted above, motion of the thorax component was multiplanar while motion of the sacrum component was primarily about a single plane. There was a noticeable increase of error in the SCORBOT when axial rotation was introduced (see Table 3 and Figure 4).
(i) SCORBOT: There was no main effect of system on the peak y-axis rotation of the thorax component (p = 0.073). On average, Vicon had a peak rotation of 33.2 ± 0.75° while Vive had a peak rotation of 33.4 ± 0.79°. However, there was a significant interaction between system and target angle (F(3,6) = 6.425, p = 0.027). A pairwise comparison showed that the displacement measured by Vicon and Vive was significantly different only for 45° rotation about the y-axis (Vicon 41.88 ± 0.09°, Vive 42.12 ± 0.05°, F(1,8) = 8.703, p = 0.018). Finally, there was a significant main effect of target angle on peak rotation (F(3,6) = 389,022.8, p ≤ 0.001).
(ii) Participants: No main effect of system on the peak y-axis rotation of the thorax component was found (p = 0.75), with the Vicon averaging 23.79 ± 3.58° and the Vive 23.49 ± 4.10°. Furthermore, there was no significant interaction between system and target angle (p = 0.64). Finally, there was no main effect of target angle.

4. Discussion

This paper aimed to establish the agreement between a VR tracking system, the HTC Vive, and a 3D optoelectronic kinematic system (Vicon) during multiplanar dynamic tasks. As a traditional method of motion tracking, Vicon has an established accuracy of 0.1 mm and 0.1° within a calibrated volume. This analysis shows that HTC Vive trackers agree with Vicon motion tracking with an average error of 0.58 ± 0.89 cm and 1.46 ± 0.62° for position and orientation, respectively. The inclusion of the SCORBOT robot allowed specifically controlled simultaneous rotations about two cardinal axes, while natural human motion during gameplay in the VR space (i.e., while playing Matchality) provided a comparison that would apply in more practical scenarios.
Positional accuracy of the HTC Vive trackers assessed using the RMS of time series data ranged from 0.02 to 0.05 cm for the SCORBOT and 0.45 to 3.69 cm for participants, which is substantially larger than reported by Niehorster and colleagues [31] and Borges et al. [32]. Niehorster and colleagues reported an RMS between 0.0049 cm and 0.0080 cm for the Vive head-mounted display [31]. However, the large differences are most likely driven by key differences in methodology. Niehorster et al. used a median sample-to-sample RMS over three data points of 1 s of stationary positioning of the HMD [31], compared to our use of 3 min of data during dynamic multiplanar movements, and Borges et al. used a custom tracking algorithm [32]. Besides the differences in analysis methods, these studies differed from ours in that we used 3D dynamic motion rather than only static and 2D motion [31,32].
Similar to the magnitude of position error, the average RMS error for rotation about the y-axis (i.e., flexion) measured between Vive and Vicon was 1.64 ± 0.62°, which is larger than previously reported (0.0053 to 0.0111°) [31]. The larger RMS error values measured in the SCORBOT occurred with simultaneous rotation about the z-axis (i.e., axial rotation) and the y-axis (flexion). Specifically, as seen in Table 3, RMS error increased in SCORBOT_15 and SCORBOT_30 compared to SCORBOT_0. However, it is worth noting that the RMS error was considerably smaller (i.e., 1.61 ± 0.62°) during human motions captured during VR gameplay than the RMS error of 2.56 ± 0.77° observed during the SCORBOT_30 condition.
Previous studies have attempted to establish the accuracy and precision of position and orientation tracking of the HTC Vive [31,32,33]. However, these studies have reported various levels of positional and rotational accuracy (discussed above) [31,33] and difficulty with reference planes when tracking of the HMD was lost [31]. We did not encounter this problem, possibly due to (1) the continuous updates provided by SteamVR resolving initial orientation errors, (2) careful setup of the play space, and (3) ensuring the hand controllers were well tracked and static during the setup process.
While the differences between the Vive and Vicon exceeded previously described errors, visual examination of the time series (Figure 4) suggests robust accuracy, especially for human participants engaged in typical motions (e.g., 15–45° of trunk flexion) during gameplay. Our results highlight the importance of considering movements about multiple axes simultaneously, as differences between the two systems increase under these conditions. However, it is possible that these errors are partly driven by differences in the Euler angle sequences used by the two systems (i.e., Vive sequence x, y, z; Vicon sequence y, z, x), as illustrated below.
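The sketch below illustrates this point: the same physical rotation decomposed with the two sequences yields different angle triplets once motion involves more than one axis. The extrinsic-rotation convention is our assumption, since the paper only reports the axis orders.

```python
from scipy.spatial.transform import Rotation as R

# One rotation combining 30 deg of flexion with 15 deg of axial rotation,
# built with the Vicon-style y-z-x sequence.
rot = R.from_euler("yzx", [30.0, 15.0, 0.0], degrees=True)

print(rot.as_euler("yzx", degrees=True))  # [30, 15, 0] by construction
print(rot.as_euler("xyz", degrees=True))  # a different triplet for the
                                          # same rotation under x-y-z
```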

5. Conclusions

The HTC Vive trackers are wireless, lightweight, and inexpensive. They are easily attached to the body and allow reasonably accurate measures of joint motion (both position and orientation) during human movement in VR gameplay. While our results suggest that the accuracy of Vive trackers is lower than previously reported [31,32], they provide acceptable accuracy for tracking joint motion in human subjects using wireless body sensors (see Figure 4A). These data are very promising for future applications of this system for self-representation in the VR environment, clinical practice, and research labs. With the introduction of the second generation “Lighthouses”, additional “Lighthouses” can be linked, covering up to a 10 m diagonal space. This would increase the tracking space and cater to experiments needing larger environments, multiple participants, or other clinical measures such as gait analyses. The ability to easily measure human motion using wireless sensors such as the Vive trackers could have significant clinical implications, allowing clinicians to quickly generate objective data to improve treatment. While these updates might not change the accuracy of positional and orientation data, they illustrate how fast VR equipment and software evolve. Accurate wireless tracking increases agency and the sense of presence in the VR environment, which has real-world implications for greater use of VR in rehabilitation, as shaping of motor behavior within VR should then translate to improved movements in real life. As new VR motion tracking devices are constantly being developed (e.g., finger tracking in the new Valve Index controllers), the accuracy and fidelity of these devices will need to be assessed in controlled conditions. Future research should examine the tracking accuracy of multiple trackers at higher movement velocities to further elucidate the potential of these motion tracking devices.

Supplementary Materials

Multimedia Appendix 1: Video files, https://www.youtube.com/watch?v=jcBAEpnz9XY.

Author Contributions

J.S.T. and M.B. conceived and designed the experiments; S.M.v.d.V. performed the experiments; J.S.T. and S.M.v.d.V. analyzed the data; P.E.P. contributed reagents/materials/analysis tools/manuscript editing; S.M.v.d.V., M.B., C.R.F. and J.S.T. contributed to writing the manuscript.

Funding

This research was funded by National Institutes of Health grant R01HD088417, awarded to J.S.T. and C.R.F.

Acknowledgments

The authors would like to thank Samantha Baldwin, Emma Fish, Dana Nocera, Kellen Kubik, and Dave Hammond for their help with the data collection, and Samantha Baldwin for her assistance with the review of the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Shibata, T. Head mounted display. Displays 2002, 23, 57–64. [Google Scholar] [CrossRef]
  2. Creagh, H. Cave Automatic Virtual Environment. In Proceedings of the Electrical Insulation Conference and Electrical Manufacturing and Coil Winding Technology Conference Cat. No. 03CH37480, Indianapolis, IN, USA, 25–25 September 2003; pp. 499–504. [Google Scholar]
  3. Rungta, A.; Rewkowski, N.; Klatzky, R.; Lin, M.; Manocha, D. Effects of virtual acoustics on dynamic auditory distance perception. J. Acoust. Soc. Am. 2017, 141, EL427–EL432. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Zaunschirm, M.; Schörkhuber, C.; Höldrich, R. Binaural rendering of Ambisonic signals by head-related impulse response time alignment and a diffuseness constraint. J. Acoust. Soc. Am. 2018, 143, 3616–3627. [Google Scholar] [CrossRef] [PubMed]
  5. Mansor, N.N.; Jamaluddin, M.H.; Zaki Shukor, A. Concept and Application of Virtual Reality Haptic Technology: A Review. J. Theor. Appl. Inf. Technol. 2017, 31. [Google Scholar]
  6. Dimbwadyo-Terrer, I.; Trincado-Alonso, F.; de los Reyes-Guzmán, A.; Aznar, M.A.; Alcubilla, C.; Pérez-Nombela, S.; del Ama-Espinosa, A.; Polonio-López, B.; Gil-Agudo, Á. Upper limb rehabilitation after spinal cord injury: A treatment based on a data glove and an immersive virtual reality environment. Disabil. Rehabil. Assist. Technol. 2016, 11, 462–467. [Google Scholar] [CrossRef] [PubMed]
  7. Aiple, M.; Schiele, A. Pushing the limits of the CyberGrasp for haptic rendering. In Proceedings of the IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 3541–3546. [Google Scholar]
  8. VAQSO Inc. Scent Device for VR. Available online: https://vaqso.com (accessed on 8 September 2019).
  9. Feelreal Inc. Available online: https://feelreal.com/ (accessed on 8 September 2019).
  10. Keio-NUS CUTE Center. Taste+. Available online: http://cutecenter.nus.edu.sg/projects/taste+.html (accessed on 8 September 2019).
  11. Ranasinghe, N.; Tram Nguyen, N.; Liangkun, Y.; Lin, L.-Y.; Tolley, D.; Yi-Luen Do, E. Vocktail: A Virtual Cocktail for Pairing Digital Taste, Smell, and Color Sensations. In Proceedings of the 25th ACM international conference on Multimedia, Mountain View, CA, USA, 23–27 October 2017. [Google Scholar]
  12. Project Nourished. Heighten Sensation of Food and Medicine. Available online: http://www.projectnourished.com (accessed on 8 September 2019).
  13. Grand View Research Inc. Augmented Reality (AR) & Virtual Reality (VR) in Healthcare Market Analysis By Component (Hardware, Software, and Service), By Technology (Augmented Reality, Virtual Reality), And Segment Forecasts 2018–2025; Market Research Report; Grand View Research, Inc.: San Francisco, CA, USA, 2017. [Google Scholar]
  14. Lin, H.-T.; Li, Y.-I.; Hu, W.-P.; Huang, C.-C.; Du, Y.-C.; Lin, H.-T.; Li, Y.-I.; Hu, W.-P.; Huang, C.-C.; Du, Y.-C. A Scoping Review of The Efficacy of Virtual Reality and Exergaming on Patients of Musculoskeletal System Disorder. J. Clin. Med. 2019, 8, 791. [Google Scholar] [CrossRef]
  15. Collado-Mateo, D.; Merellano-Navarro, E.; Olivares, P.R.; García-Rubio, J.; Gusi, N. Effect of exergames on musculoskeletal pain: A systematic review and meta-analysis. Scand. J. Med. Sci. Sports 2018, 28, 760–771. [Google Scholar] [CrossRef]
  16. Austin, P.D.; Siddall, P.J. Virtual reality for the treatment of neuropathic pain in people with spinal cord injuries: A scoping review. J. Spinal Cord Med. 2019, 1–11. [Google Scholar] [CrossRef]
  17. Scapin, S.; Echevarría-Guanilo, M.E.; Boeira Fuculo Junior, P.R.; Gonçalves, N.; Rocha, P.K.; Coimbra, R. Virtual Reality in the treatment of burn patients: A systematic review. Burns 2018, 44, 1403–1416. [Google Scholar] [CrossRef]
  18. Mallari, B.; Spaeth, E.K.; Goh, H.; Boyd, B.S. Virtual reality as an analgesic for acute and chronic pain in adults: a systematic review and meta-analysis. J. Pain Res. 2019, 12, 2053–2085. [Google Scholar] [CrossRef]
  19. Botella, C.; Fernández-Álvarez, J.; Guillén, V.; García-Palacios, A.; Baños, R. Recent Progress in Virtual Reality Exposure Therapy for Phobias: A Systematic Review. Curr. Psychiatry Rep. 2017, 19, 42. [Google Scholar] [CrossRef] [PubMed]
  20. Maples-Keller, J.L.; Yasinski, C.; Manjin, N.; Rothbaum, B.O. Virtual Reality-Enhanced Extinction of Phobias and Post-Traumatic Stress. Neurotherapeutics 2017, 14, 554–563. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  21. Reger, G.M.; Koenen-Woods, P.; Zetocha, K.; Smolenski, D.J.; Holloway, K.M.; Rothbaum, B.O.; Difede, J.A.; Rizzo, A.A.; Edwards-Stewart, A.; Skopp, N.A.; et al. Randomized controlled trial of prolonged exposure using imaginal exposure vs. virtual reality exposure in active duty soldiers with deployment-related posttraumatic stress disorder (PTSD). J. Consult. Clin. Psychol. 2016, 84, 946–959. [Google Scholar] [CrossRef] [PubMed]
  22. Lee, H.S.; Park, Y.J.; Park, S.W. The Effects of Virtual Reality Training on Function in Chronic Stroke Patients: A Systematic Review and Meta-Analysis. Biomed Res. Int. 2019, 2019, 1–12. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  23. Ahn, S.; Hwang, S. Virtual rehabilitation of upper extremity function and independence for stoke: A meta-analysis. J. Exerc. Rehabil. 2019, 15, 358. [Google Scholar] [CrossRef] [PubMed]
  24. Aida, J.; Chau, B.; Dunn, J. Immersive virtual reality in traumatic brain injury rehabilitation: A literature review. NeuroRehabilitation 2018, 42, 441–448. [Google Scholar] [CrossRef] [PubMed]
  25. Cappozzo, A.; Della Croce, U.; Leardini, A.; Chiari, L. Human movement analysis using stereophotogrammetry. Part 1: Theoretical background. Gait Posture 2005. [Google Scholar]
  26. Clark, B.C.; Russ, D.W.; Nakazawa, M.; France, C.R.; Walkowski, S.; Law, T.D.; Applegate, M.; Mahato, N.; Lietkam, S.; Odenthal, J.; et al. A randomized control trial to determine the effectiveness and physiological effects of spinal manipulation and spinal mobilization compared to each other and a sham condition in patients with chronic low back pain: Study protocol for The RELIEF Study. Contemp. Clin. Trials 2018, 70, 41–52. [Google Scholar] [CrossRef] [PubMed]
  27. Thomas, J.S.; Gibson, G.E. Coordination and timing of spine and hip joints during full body reaching tasks. Hum. Mov. Sci. 2007, 26, 124–140. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  28. Trost, Z.; France, C.R.; Thomas, J.S. Exposure to movement in chronic back pain: Evidence of successful generalization across a reaching task. Pain 2008, 137, 26–33. [Google Scholar] [CrossRef]
  29. Windolf, M.; Götzen, N.; Morlock, M. Systematic accuracy and precision analysis of video motion capturing. J. Biomech. 2008. [Google Scholar] [CrossRef] [PubMed]
  30. Merriaux, P.; Dupuis, Y.; Boutteau, R.; Vasseur, P.; Savatier, X. A study of vicon system positioning performance. Sensors 2017, 17, 1591. [Google Scholar] [CrossRef] [PubMed]
  31. Niehorster, D.C.; Li, L.; Lappe, M. The accuracy and precision of position and orientation tracking in the HTC vive virtual reality system for scientific research. Iperception 2017, 8, 204166951770820. [Google Scholar] [CrossRef] [PubMed]
  32. Borges, M.; Symington, A.; Coltin, B.; Smith, T.; Ventura, R. HTC Vive: Analysis and Accuracy Improvement. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 2610–2615. [Google Scholar]
  33. Luckett, E. A Quantitative Evaluation of the HTC Vive for Virtual Reality Research. Ph.D. Thesis, The University of Mississippi, Oxford, MS, USA, 2018. [Google Scholar]
  34. Valve Corporation. SteamVR Tracking HDK. Available online: https://steamcommunity.com/games/steamvrtracking/announcements/detail/1264796421606498053 (accessed on 6 May 2018).
  35. Thomas, J.S.; Corcos, D.M.; Hasan, Z. Kinematic and Kinetic Constraints on Arm, Trunk, and Leg Segments in Target-Reaching Movements. J. Neurophysiol. 2004, 93, 352–364. [Google Scholar] [CrossRef]
  36. Press, W.H.; Teukolsky, S.A.; Vetterling, W.T.; Flannery, B.P. Numerical Recipes in C: The Art of Scientific Computing; Cambridge University Press: Cambridge, UK, 1992; ISBN 0521431085. [Google Scholar]
Figure 1. Participant instrumentation: (A) 3D-printed plate for the head with Vicon markers (not compared in this study); (B,C) 3D-printed plates with Vicon markers and Vive trackers at the thoracic and sacrum levels; (D) HTC Vive controller; and (E) HTC Vive wired HMD.
Figure 2. Illustration of Vive and Vicon coordinate systems.
Figure 3. Representation of the SCORBOT ER VII, with the Vicon axis system represented by the red (X), green (Y), and blue (Z) arrows. Sections of the SCORBOT: (1) Base—the lower part of the robot, which rotates 310° about the z-axis; (2) Shoulder—connects to the base by way of a joint that rotates 35°–130° about the y-axis; (3) Elbow—connects to the shoulder by way of a joint that also rotates 130° about the y-axis; (4) Wrist—connected to the elbow, giving the robot its final two degrees of freedom by rotating 360° about the z-axis and 130° about the y-axis; and (5) Gripper—the end effector attached to the wrist, capable of opening and closing.
Figure 4. Time series of Vive and Vicon. Graphs (A,B) show an example of the thorax position and orientation of a participant playing Matchality. Graphs (C,D) show the thorax position and flexion angle of the SCORBOT moving 15° about the y-axis with 0°, 15°, and 45° rotation about the z-axis.
Table 1. Description of motion capture systems under study.

| Characteristic | Vive | Vicon |
|---|---|---|
| Tracking system for lumbar motion | 2 infrared laser lighthouses; 2 Vive trackers (HTC America, Inc., Seattle, WA, USA) | 10 Bonita 10 infrared cameras; 10 light reflective markers mounted on 2 3D-printed plates (Vicon Motion Systems Ltd., Oxford, UK) |
| Platform software | Steam VR (Valve Corporation, Bellevue, WA, USA) | Tracker version 3.4 (Vicon Motion Systems Ltd., Oxford, UK) |
| Motion capture software | Unity 2019.2.6f1 (Unity Technologies, San Francisco, CA, USA) | Motion Monitor (Innovative Sports Training, Chicago, IL, USA) |
| Sampling rate (Hz) | 58–100 | 100 |
| Latency (ms) | 22 | 20 |
| 3D parameter | Euler angle (x, y, z) | Euler angle (y, z, x) |
Table 2. Participant characteristics.

| ID | Sex | Age | Weight (kg) | Height (m) | BMI |
|---|---|---|---|---|---|
| 1 | F | 24 | 59 | 1.60 | 23 |
| 2 | F | 23 | 81 | 1.68 | 29 |
| 3 | M | 25 | 82 | 1.75 | 27 |
| 4 | M | 22 | 77 | 1.89 | 22 |
| 5 | F | 41 | 90 | 1.68 | 32 |
| 6 | F | 22 | 57 | 1.68 | 20 |
| 7 | M | 30 | 59 | 1.70 | 20 |
| Mean | | 26.71 | 72.14 | 1.71 | 24.71 |
| STD | | 6.87 | 13.50 | 0.09 | 4.68 |
Table 3. Root mean square error between Vive and Vicon for the participants and for the SCORBOT with 0°, 15°, and 30° of additional rotation about the z-axis. Values are mean ± STD.

| ID | Sacrum position (cm) | Sacrum rotation (°) | Thorax position (cm) | Thorax rotation (°) |
|---|---|---|---|---|
| SCORBOT_0 | 0.02 ± 0.00 | 1.65 ± 0.52 | 0.02 ± 0.02 | 1.21 ± 0.34 |
| SCORBOT_15 | 0.02 ± 0.00 | 1.47 ± 0.38 | 0.02 ± 0.01 | 1.62 ± 0.16 |
| SCORBOT_30 | 0.02 ± 0.01 | 1.47 ± 0.37 | 0.03 ± 0.02 | 2.56 ± 0.77 |
| Participants | 0.67 ± 0.69 | 1.18 ± 0.33 | 1.74 ± 0.96 | 1.98 ± 0.54 |
| Average | 0.18 ± 0.61 | 1.44 ± 0.39 | 0.45 ± 1.07 | 1.84 ± 0.65 |
Table 4. Average difference over 5 reaches between Vive and Vicon position and orientation of the thorax at peak displacement. Values are mean ± STD.

∆ (Vicon−Vive) position (cm)

| ID | 15° | 30° | 45° | 60° |
|---|---|---|---|---|
| SCORBOT_0 | 0.02 ± 0.01 | 0.01 ± 0.01 | 0.01 ± 0.02 | 0.00 ± 0.01 |
| SCORBOT_15 | 0.02 ± 0.01 | 0.01 ± 0.01 | 0.01 ± 0.02 | 0.01 ± 0.01 |
| SCORBOT_30 | 0.01 ± 0.01 | 0.01 ± 0.01 | −2.81 ± 4.89 | 0.03 ± 0.05 |
| Participants | −0.01 ± 0.02 | −0.01 ± 0.04 | 0.00 ± 0.01 | 0.00 ± 0.01 |
| Average | 0.01 ± 0.02 | 0.00 ± 0.03 | −0.52 ± 2.12 | 0.01 ± 0.02 |

∆ (Vicon−Vive) orientation (°)

| ID | 15° | 30° | 45° | 60° |
|---|---|---|---|---|
| SCORBOT_0 | 0.38 ± 0.59 | −0.23 ± 0.42 | −0.37 ± 0.40 | −0.08 ± 0.48 |
| SCORBOT_15 | 0.21 ± 0.43 | −0.15 ± 0.62 | −0.24 ± 0.19 | −0.20 ± 0.36 |
| SCORBOT_30 | −0.02 ± 0.41 | −0.43 ± 0.28 | −0.14 ± 0.13 | −0.35 ± 0.31 |
| Participants | −0.11 ± 0.87 | 1.42 ± 5.95 | 2.16 ± 4.64 | −3.31 ± 5.93 |
| Average | 0.06 ± 0.66 | 0.06 ± 0.66 | 0.14 ± 4.68 | 0.81 ± 3.19 |
