Technologies for Multimodal Interaction in Extended Reality—A Scoping Review
Abstract
1. Introduction
2. Background
Taxonomies for Multimodal Interaction
3. Review Methodology
4. Multimodal Interaction Technologies
4.1. Vision
4.1.1. Gestural Interaction in XR
4.1.2. Facial Expression and Emotion Recognition Interfaces
4.1.3. Gaze
4.2. Audition
4.2.1. 3D Audio
4.2.2. Speech
4.2.3. Exhalation Interfaces
4.3. Haptics
4.3.1. Utilizing Kinesthetics and Tactility to Create Meaningful Interaction
4.3.2. Tactile Feedback
Non-Contact Interaction
Surface-Based or Hand-Held Interaction
4.3.3. Kinesthetic Feedback
Wearable Device Interaction
Multi-Device and Full-Body Interaction
Locomotion Interfaces
Tongue Interfaces
Dynamic Physical Environments and Shape-Changing Objects
4.4. Olfaction
4.5. Gustation
4.6. Brain-Computer Interface (BCI)
4.7. Galvanism
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Milgram, P.; Kishino, F. Taxonomy of mixed reality visual displays. Inst. Electron. Inf. Commun. Eng. Trans. Inf. Syst. 1994, E77-D, 1321–1329. [Google Scholar]
- LaValle, S. Virtual Reality; National Programme on Technology Enhanced Learning: Bombay, India, 2016. [Google Scholar]
- Benzie, P.; Watson, J.; Surman, P.; Rakkolainen, I.; Hopf, K.; Urey, H.; Sainov, V.; Von Kopylow, C. A survey of 3DTV displays: Techniques and technologies. Inst. Electr. Electron. Eng. Trans. Circuits Syst. Video Technol. 2007, 17, 1647–1657. [Google Scholar] [CrossRef]
- Cruz-Neira, C.; Sandin, D.J.; DeFanti, T.A. Surround-screen projection-based virtual reality: The design and implementation of the CAVE. In Proceedings of the 20th Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH 1993, Anaheim, CA, USA, 2–6 August 1993; pp. 135–142. [Google Scholar]
- Rakkolainen, I.; Sand, A.; Palovuori, K. Midair User Interfaces Employing Particle Screens. Inst. Electr. Electron. Eng. Comput. Graph. Appl. 2015, 35, 96–102. [Google Scholar] [CrossRef] [PubMed]
- Bimber, O.; Raskar, R. Spatial Augmented Reality: Merging Real and Virtual Worlds; AK Peters: Wellesley, MA, USA, 2005; ISBN 9781439864944. [Google Scholar]
- Arksey, H.; O’Malley, L. Scoping studies: Towards a methodological framework. Int. J. Soc. Res. Methodol. 2005, 8, 19–32. [Google Scholar] [CrossRef] [Green Version]
- Colquhoun, H.L.; Levac, D.; O’Brien, K.K.; Straus, S.; Tricco, A.C.; Perrier, L.; Kastner, M.; Moher, D. Scoping reviews: Time for clarity in definition, methods, and reporting. J. Clin. Epidemiol. 2014, 67, 1291–1294. [Google Scholar] [CrossRef] [PubMed]
- Peters, M.D.J.; Godfrey, C.M.; Khalil, H.; McInerney, P.; Parker, D.; Soares, C.B. Guidance for conducting systematic scoping reviews. Int. J. Evid. Based Healthc. 2015, 13, 141–146. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Raisamo, R. Multimodal Human-Computer Interaction: A Constructive and Empirical Study; University of Tampere: Tampere, Finland, 1999. [Google Scholar]
- Spence, C. Multisensory contributions to affective touch. Curr. Opin. Behav. Sci. 2022, 43, 40–45. [Google Scholar] [CrossRef]
- Engelbart, D. A demonstration at AFIPS. In Proceedings of the Fall Joint Computer Conference, San Francisco, CA, USA, 9–11 December 1968. [Google Scholar]
- Sutherland, I.E. A head-mounted three dimensional display. In Proceedings of the Fall Joint Computer Conference, San Francisco, CA, USA, 9–11 December 1968; Association for Computing Machinery Press: New York, NY, USA, 1968; Volume 3, p. 757. [Google Scholar]
- Bolt, R.A. “Put-that-there”: Voice and gesture at the graphics interface. In Proceedings of the 7th Annual Conference on Computer Graphics and Interactive Techniques, Seattle, WA, USA, 14–18 July 1980; pp. 262–270. [Google Scholar] [CrossRef]
- Rekimoto, J.; Nagao, K. The world through the computer. In Proceedings of the 8th Annual Association for Computing Machinery Symposium on User Interface and Software Technology—UIST’95, Pittsburgh, PA, USA, 15–17 November 1995; Association for Computing Machinery Press: New York, NY, USA, 1995; pp. 29–36. [Google Scholar]
- Feiner, S.; MacIntyre, B.; Höllerer, T.; Webster, A. A touring machine: Prototyping 3D mobile augmented reality systems for exploring the urban environment. Pers. Ubiquitous Comput. 1997, 1, 208–217. [Google Scholar] [CrossRef] [Green Version]
- Van Dam, A. Post-WIMP user interfaces. Commun. Assoc. Comput. Mach. 1997, 40, 63–67. [Google Scholar] [CrossRef]
- Turk, M. Multimodal interaction: A review. Pattern Recognit. Lett. 2014, 36, 189–195. [Google Scholar] [CrossRef]
- LaViola, J.J., Jr.; Kruijff, E.; Bowman, D.; Poupyrev, I.P.; McMahan, R.P. 3D User Interfaces: Theory and Practice, 2nd ed.; Addison-Wesley: Boston, MA, USA, 2017. [Google Scholar]
- Steed, A.; Takala, T.M.; Archer, D.; Lages, W.; Lindeman, R.W. Directions for 3D User Interface Research from Consumer VR Games. Inst. Electr. Electron. Eng. Trans. Vis. Comput. Graph. 2021, 27, 4171–4182. [Google Scholar] [CrossRef]
- Jerald, J. The VR Book: Human-Centered Design for Virtual Reality; Morgan & Claypool: San Rafael, CA, USA, 2015; ISBN 9781970001129. [Google Scholar]
- Rash, C.; Russo, M.; Letowski, T.; Schmeisser, E. Helmet-Mounted Displays: Sensation, Perception and Cognition Issues; Army Aeromedical Research Laboratory: Fort Rucker, AL, USA, 2009. [Google Scholar]
- Schmalstieg, D.; Höllerer, T. Augmented Reality: Principles and Practice; Addison-Wesley Professional: Boston, MA, USA, 2016. [Google Scholar]
- Billinghurst, M.; Clark, A.; Lee, G. A Survey of Augmented Reality. Found. Trends® Hum.–Comput. Interact. 2015, 8, 73–272. [Google Scholar] [CrossRef]
- Rubio-Tamayo, J.L.; Barrio, M.G.; García, F.G. Immersive environments and virtual reality: Systematic review and advances in communication, interaction and simulation. Multimodal Technol. Interact. 2017, 1, 21. [Google Scholar] [CrossRef] [Green Version]
- Augstein, M.; Neumayr, T. A Human-Centered Taxonomy of Interaction Modalities and Devices. Interact. Comput. 2019, 31, 27–58. [Google Scholar] [CrossRef]
- Blattner, M.M.; Glinert, E.P. Multimodal integration. Inst. Electr. Electron. Eng. Multimed. 1996, 3, 14–24. [Google Scholar] [CrossRef]
- Benoit, C.; Martin, J.; Pelachaud, C.; Schomaker, L.; Suhm, B. Audio-visual and Multimodal Speech Systems. In Handbook of Standards and Resources for Spoken Language Systems-Supplement; Kluwer: Dordrecht, The Netherlands, 2000; Volume 500, pp. 1–95. [Google Scholar]
- Koutsabasis, P.; Vogiatzidakis, P. Empirical Research in Mid-Air Interaction: A Systematic Review. Int. J. Hum. Comput. Interact. 2019, 35, 1747–1768. [Google Scholar] [CrossRef]
- Mewes, A.; Hensen, B.; Wacker, F.; Hansen, C. Touchless interaction with software in interventional radiology and surgery: A systematic literature review. Int. J. Comput. Assist. Radiol. Surg. 2017, 12, 291–305. [Google Scholar] [CrossRef]
- Kim, J.; Laine, T.; Åhlund, C. Multimodal Interaction Systems Based on Internet of Things and Augmented Reality: A Systematic Literature Review. Appl. Sci. 2021, 11, 1738. [Google Scholar] [CrossRef]
- Serafin, S.; Geronazzo, M.; Erkut, C.; Nilsson, N.C.; Nordahl, R. Sonic Interactions in Virtual Reality: State of the Art, Current Challenges, and Future Directions. Inst. Electr. Electron. Eng. Comput. Graph. Appl. 2018, 38, 31–43. [Google Scholar] [CrossRef]
- Krueger, M.W.; Gionfriddo, T.; Hinrichsen, K. VIDEOPLACE—An artificial reality. In Proceedings of the 8th Annual Association for Computing Machinery Symposium on User Interface and Software Technology, San Francisco, CA, USA, 1 April 1985; pp. 35–40. [Google Scholar] [CrossRef]
- Mayer, S.; Reinhardt, J.; Schweigert, R.; Jelke, B.; Schwind, V.; Wolf, K.; Henze, N. Improving Humans’ Ability to Interpret Deictic Gestures in Virtual Reality. In Proceedings of the Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; Volume 20, pp. 1–14. [Google Scholar]
- Henrikson, R.; Grossman, T.; Trowbridge, S.; Wigdor, D.; Benko, H. Head-Coupled Kinematic Template Matching: A Prediction Model for Ray Pointing in VR. In Proceedings of the Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–14. [Google Scholar]
- Li, N.; Han, T.; Tian, F.; Huang, J.; Sun, M.; Irani, P.; Alexander, J. Get a Grip: Evaluating Grip Gestures for VR Input using a Lightweight Pen. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–13. [Google Scholar]
- Mann, S. Wearable computing: A first step toward personal imaging. Computer 1997, 30, 25–32. [Google Scholar] [CrossRef]
- Starner, T.; Mann, S.; Rhodes, B.; Levine, J.; Healey, J.; Kirsch, D.; Picard, R.W.; Pentland, A. Augmented reality through wearable computing. Presence Teleoperators Virtual Environ. 1997, 6, 386–398. [Google Scholar] [CrossRef]
- Kölsch, M.; Bane, R.; Höllerer, T.; Turk, M. Multimodal interaction with a wearable augmented reality system. Inst. Electr. Electron. Eng. Comput. Graph. Appl. 2006, 26, 62–71. [Google Scholar] [CrossRef] [PubMed]
- Li, Y.; Huang, J.; Tian, F.; Wang, H.A.; Dai, G.Z. Gesture interaction in virtual reality. Virtual Real. Intell. Hardw. 2019, 1, 84–112. [Google Scholar] [CrossRef]
- Chen, Z.; Li, J.; Hua, Y.; Shen, R.; Basu, A. Multimodal interaction in augmented reality. In Proceedings of the 2017 Institution of Electrical Engineers International Conference on Systems, Man, and Cybernetics (SMC), Banff, AB, Canada, 5–8 October 2017; Volume 2017, pp. 206–209. [Google Scholar]
- Yi, S.; Qin, Z.; Novak, E.; Yin, Y.; Li, Q. GlassGesture: Exploring head gesture interface of smart glasses. In Proceedings of the 2016 Institution of Electrical Engineers Conference on Computer Communications Workshops (INFOCOM WKSHPS), San Francisco, CA, USA, 10–14 April 2016; Volume 2016, pp. 1017–1018. [Google Scholar]
- Zhao, J.; Allison, R.S. Real-time head gesture recognition on head-mounted displays using cascaded hidden Markov models. In Proceedings of the 2017 Institution of Electrical Engineers International International Conference on Systems, Man, and Cybernetics (SMC), Banff, AB, Canada, 5–8 October 2017; Volume 2017, pp. 2361–2366. [Google Scholar]
- Yan, Y.; Yu, C.; Yi, X.; Shi, Y. HeadGesture: Hands-Free Input Approach Leveraging Head Movements for HMD Devices. Proc. Assoc. Comput. Mach. Interact. Mob. Wearable Ubiquitous Technol. 2018, 2, 1–23. [Google Scholar] [CrossRef]
- Zhao, J.; Allison, R.S. Comparing head gesture, hand gesture and gamepad interfaces for answering Yes/No questions in virtual environments. Virtual Real. 2020, 24, 515–524. [Google Scholar] [CrossRef]
- Ren, D.; Goldschwendt, T.; Chang, Y.; Hollerer, T. Evaluating wide-field-of-view augmented reality with mixed reality simulation. In Proceedings of the 2016 Institution of Electrical Engineers Virtual Reality (VR), Greenville, SC, USA, 19–23 March 2016; Volume 2016, pp. 93–102. [Google Scholar]
- Cardoso, J.C.S. A Review of Technologies for Gestural Interaction in Virtual Reality; Cambridge Scholars Publishing: Newcastle upon Tyne, UK, 2019; ISBN 9781527535367. [Google Scholar]
- Rautaray, S.S.; Agrawal, A. Vision based hand gesture recognition for human computer interaction: A survey. Artif. Intell. Rev. 2015, 43, 1–54. [Google Scholar] [CrossRef]
- Cheng, H.; Yang, L.; Liu, Z. Survey on 3D Hand Gesture Recognition. Inst. Electr. Electron. Eng. Trans. Circuits Syst. Video Technol. 2016, 26, 1659–1673. [Google Scholar] [CrossRef]
- Vuletic, T.; Duffy, A.; Hay, L.; McTeague, C.; Campbell, G.; Grealy, M. Systematic literature review of hand gestures used in human computer interaction interfaces. Int. J. Hum. Comput. Stud. 2019, 129, 74–94. [Google Scholar] [CrossRef] [Green Version]
- Chen, W.; Yu, C.; Tu, C.; Lyu, Z.; Tang, J.; Ou, S.; Fu, Y.; Xue, Z. A survey on hand pose estimation with wearable sensors and computer-vision-based methods. Sensors 2020, 20, 1074. [Google Scholar] [CrossRef] [Green Version]
- Alam, M.; Samad, M.D.; Vidyaratne, L.; Glandon, A.; Iftekharuddin, K.M. Survey on Deep Neural Networks in Speech and Vision Systems. Neurocomputing 2020, 417, 302–321. [Google Scholar] [CrossRef]
- Beddiar, D.R.; Nini, B.; Sabokrou, M.; Hadid, A. Vision-based human activity recognition: A survey. Multimed. Tools Appl. 2020, 79, 30509–30555. [Google Scholar] [CrossRef]
- DecaGear. Available online: https://www.deca.net/decagear/ (accessed on 3 December 2021).
- Bai, H.; Sasikumar, P.; Yang, J.; Billinghurst, M. A User Study on Mixed Reality Remote Collaboration with Eye Gaze and Hand Gesture Sharing. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–13. [Google Scholar]
- Majaranta, P.; Ahola, U.-K.; Špakov, O. Fast gaze typing with an adjustable dwell time. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; Association for Computing Machinery: New York, NY, USA, 2009; pp. 357–360. [Google Scholar]
- Kowalczyk, P.; Sawicki, D. Blink and wink detection as a control tool in multimodal interaction. Multimed. Tools Appl. 2019, 78, 13749–13765. [Google Scholar] [CrossRef]
- Schweigert, R.; Schwind, V.; Mayer, S. EyePointing: A Gaze-Based Selection Technique. In Proceedings of the Mensch und Computer 2019 (MuC’19), Hamburg, Germany, 8–11 September 2019; pp. 719–723. [Google Scholar]
- Parisay, M.; Poullis, C.; Kersten-Oertel, M. EyeTAP: Introducing a multimodal gaze-based technique using voice inputs with a comparative analysis of selection techniques. Int. J. Hum. Comput. Stud. 2021, 154, 102676. [Google Scholar] [CrossRef]
- Nukarinen, T.; Kangas, J.; Rantala, J.; Koskinen, O.; Raisamo, R. Evaluating ray casting and two gaze-based pointing techniques for object selection in virtual reality. In Proceedings of the 24th Association for Computing Machinery Symposium on Virtual Reality Software and Technology, Tokio, Japan, 18 November–1 December 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 1–2. [Google Scholar]
- Hyrskykari, A.; Istance, H.; Vickers, S. Gaze gestures or dwell-based interaction? In Proceedings of the Symposium on Eye Tracking Research and Applications—ETRA’12, Santa Barbara, CA, USA, 28–30 March 2012; Association for Computing Machinery Press: New York, NY, USA, 2012; p. 229. [Google Scholar]
- Drewes, H.; Schmidt, A. Interacting with the Computer Using Gaze Gestures. In Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2007; Volume 4663, pp. 475–488. ISBN 9783540747994. [Google Scholar]
- Istance, H.; Hyrskykari, A.; Immonen, L.; Mansikkamaa, S.; Vickers, S. Designing gaze gestures for gaming: An investigation of performance. In Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications—ETRA’10, Austin, TX, USA, 22–24 March 2010; Association for Computing Machinery Press: New York, NY, USA, 2010; p. 323. [Google Scholar]
- Vidal, M.; Bulling, A.; Gellersen, H. Pursuits: Spontaneous interaction with displays based on smooth pursuit eye movement and moving targets. In Proceedings of the 2013 Association for Computing Machinery International Joint Conference on Pervasive and Ubiquitous Computing, Zurich, Switzerland, 8–12 September 2013; Association for Computing Machinery: New York, NY, USA, 2013; pp. 439–448. [Google Scholar]
- Esteves, A.; Velloso, E.; Bulling, A.; Gellersen, H. Orbits. In Proceedings of the 28th Annual Association for Computing Machinery Symposium on User Interface Software & Technology, Charlotte, NC, USA, 8–11 November 2015; Association for Computing Machinery: New York, NY, USA, 2015; pp. 457–466. [Google Scholar]
- Sidenmark, L.; Clarke, C.; Zhang, X.; Phu, J.; Gellersen, H. Outline Pursuits: Gaze-assisted Selection of Occluded Objects in Virtual Reality. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–13. [Google Scholar]
- Duchowski, A. Eye Tracking Methodology: Theory and Practice; Springer: London, UK, 2007; ISBN 978-1-84628-608-7. [Google Scholar]
- Hansen, D.W.; Qiang, J. In the Eye of the Beholder: A Survey of Models for Eyes and Gaze. Inst. Electr. Electron. Eng. Trans. Pattern Anal. Mach. Intell. 2010, 32, 478–500. [Google Scholar] [CrossRef]
- Khamis, M.; Kienle, A.; Alt, F.; Bulling, A. GazeDrone. In Proceedings of the 4th Association for Computing Machinery Workshop on Micro Aerial Vehicle Networks, Systems, and Applications, Munich, Germany, 10–15 June 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 66–71. [Google Scholar]
- Majaranta, P.; Bulling, A. Eye Tracking and Eye-Based Human–Computer Interaction. In Advances in Physiological Computing; Gilleade, S.F.K., Ed.; Springer: Berlin/Heidelberg, Germany, 2014; pp. 39–65. [Google Scholar]
- Hutchinson, T.E.; White, K.P.; Martin, W.N.; Reichert, K.C.; Frey, L.A. Human-computer interaction using eye-gaze input. Inst. Electr. Electron. Eng. Trans. Syst. Man Cybern. 1989, 19, 1527–1534. [Google Scholar] [CrossRef]
- Majaranta, P.; Räihä, K.J. Twenty years of eye typing: Systems and design issues. In Proceedings of the Eye Tracking Research and Applications Symposium (ETRA), New Orleans, LA, USA, 25–27 March 2002; pp. 15–22. [Google Scholar]
- Rozado, D.; Moreno, T.; San Agustin, J.; Rodriguez, F.B.; Varona, P. Controlling a smartphone using gaze gestures as the input mechanism. Hum.-Comput. Interact. 2015, 30, 34–63. [Google Scholar] [CrossRef]
- Holland, C.; Komogortsev, O. Eye tracking on unmodified common tablets: Challenges and solutions. In Proceedings of the Symposium on Eye Tracking Research and Applications—ETRA’12, Santa Barbara, CA, USA, 28–30 March 2012; Association for Computing Machinery Press: New York, NY, USA, 2012; p. 277. [Google Scholar]
- Akkil, D.; Kangas, J.; Rantala, J.; Isokoski, P.; Spakov, O.; Raisamo, R. Glance Awareness and Gaze Interaction in Smartwatches. In Proceedings of the 33rd Annual Association for Computing Machinery Conference Extended Abstracts on Human Factors in Computing Systems, Seoul, Korea, 18–23 April 2015; Association for Computing Machinery: New York, NY, USA, 2015; Volume 18, pp. 1271–1276. [Google Scholar]
- Zhang, L.; Li, X.Y.; Huang, W.; Liu, K.; Zong, S.; Jian, X.; Feng, P.; Jung, T.; Liu, Y. It starts with iGaze: Visual attention driven networking with smart glasses. In Proceedings of the Annual International Conference on Mobile Computing and Networking, MOBICOM, Maui, HI, USA, 7–11 September 2014; pp. 91–102. [Google Scholar]
- Zhang, Y.; Bulling, A.; Gellersen, H. SideWays: A gaze interface for spontaneous interaction with situated displays. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, France, 27 April–2 May 2013; Association for Computing Machinery: New York, NY, USA, 2013; pp. 851–860. [Google Scholar]
- Hansen, J.P.; Alapetite, A.; MacKenzie, I.S.; Møllenbach, E. The use of gaze to control drones. In Proceedings of the Symposium on Eye Tracking Research and Applications, Safety Harbor, FL, USA, 26–28 March 2014; Association for Computing Machinery: New York, NY, USA, 2014; pp. 27–34. [Google Scholar]
- Yuan, L.; Reardon, C.; Warnell, G.; Loianno, G. Human gaze-driven spatial tasking of an autonomous MAV. Inst. Electr. Electron. Eng. Robot. Autom. Lett. 2019, 4, 1343–1350. [Google Scholar] [CrossRef]
- Clay, V.; König, P.; König, S.U. Eye tracking in virtual reality. J. Eye Mov. Res. 2019, 12. [Google Scholar] [CrossRef]
- Piumsomboon, T.; Lee, G.; Lindeman, R.W.; Billinghurst, M. Exploring natural eye-gaze-based interaction for immersive virtual reality. In Proceedings of the 2017 Institute of Electrical and Electronics Engineers Symposium on 3D User Interfaces (3DUI), Los Angeles, CA, USA, 18–19 March 2017; Volume 3DUI, pp. 36–39. [Google Scholar]
- Nukarinen, T.; Kangas, J.; Rantala, J.; Pakkanen, T.; Raisamo, R. Hands-free vibrotactile feedback for object selection tasks in virtual reality. In Proceedings of the 24th Association for Computing Machinery Symposium on Virtual Reality Software and Technology, Tokio, Japan, 18 November–1 December 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 1–2. [Google Scholar]
- Meißner, M.; Pfeiffer, J.; Pfeiffer, T.; Oppewal, H. Combining virtual reality and mobile eye tracking to provide a naturalistic experimental environment for shopper research. J. Bus. Res. 2019, 100, 445–458. [Google Scholar] [CrossRef]
- Tobii VR. Available online: https://vr.tobii.com/ (accessed on 3 December 2021).
- Varjo Eye. Tracking in VR. Available online: https://varjo.com/blog/how-to-do-eye-tracking-studies-in-virtual-reality/ (accessed on 3 December 2021).
- Burova, A.; Mäkelä, J.; Hakulinen, J.; Keskinen, T.; Heinonen, H.; Siltanen, S.; Turunen, M. Utilizing VR and Gaze Tracking to Develop AR Solutions for Industrial Maintenance. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–13. [Google Scholar]
- Gardony, A.L.; Lindeman, R.W.; Brunyé, T.T. Eye-tracking for human-centered mixed reality: Promises and challenges. In Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR); Kress, B.C., Peroz, C., Eds.; International Society for Optics and Photonics: Bellingham, WA, USA, 2020; Volume 11310, p. 27. [Google Scholar]
- Sims, S.D.; Conati, C. A Neural Architecture for Detecting User Confusion in Eye-tracking Data. In Proceedings of the 2020 International Conference on Multimodal Interaction, Online, 25–29 October 2020; Association for Computing Machinery: New York, NY, USA, 2020; Volume ICMI’20, pp. 15–23. [Google Scholar]
- DeLucia, P.R.; Preddy, D.; Derby, P.; Tharanathan, A.; Putrevu, S. Eye Movement Behavior During Confusion. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2014, 58, 1300–1304. [Google Scholar] [CrossRef]
- Marshall, S.P. Identifying cognitive state from eye metrics. Aviat. Sp. Environ. Med. 2007, 78, B165–B175. [Google Scholar]
- Boraston, Z.; Blakemore, S.J. The application of eye-tracking technology in the study of autism. J. Physiol. 2007, 581, 893–898. [Google Scholar] [CrossRef]
- Drèze, X.; Hussherr, F.-X. Internet advertising: Is anybody watching? J. Interact. Mark. 2003, 17, 8–23. [Google Scholar] [CrossRef] [Green Version]
- Raisamo, R.; Rakkolainen, I.; Majaranta, P.; Salminen, K.; Rantala, J.; Farooq, A. Human augmentation: Past, present and future. Int. J. Hum. Comput. Stud. 2019, 131, 131–143. [Google Scholar] [CrossRef]
- Hyrskykari, A.; Majaranta, P.; Räihä, K.J. From Gaze Control to Attentive Interfaces. In Proceedings of the 11th International Conference on Human-Computer Interaction, Las Vegas, NV, USA, 22–27 July 2005. [Google Scholar]
- Hansen, J.; Hansen, D.; Johansen, A.; Elvesjö, J. Mainstreaming gaze interaction towards a mass market for the benefit of all. In Universal Access in HCI: Exploring New Interaction Environments; Stephanidis, C., Ed.; Lawrence Erlbaum Associates, Inc.: Mahwah, NJ, USA, 2005; Volume 7. [Google Scholar]
- Freeman, E.; Wilson, G.; Vo, D.-B.; Ng, A.; Politis, I.; Brewster, S. Multimodal feedback in HCI: Haptics, non-speech audio, and their applications. In The Handbook of Multimodal-Multisensor Interfaces: Foundations, User Modeling, and Common Modality Combinations; ACM Books: New York, NY, USA, 2017; Volume 1, pp. 277–317. [Google Scholar]
- Miccini, R.; Spagnol, S. HRTF Individualization using Deep Learning. In Proceedings of the 2020 Institute of Electrical and Electronics Engineers Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Atlanta, GA, USA, 22–26 March 2020; pp. 390–395. [Google Scholar]
- Wolf, M.; Trentsios, P.; Kubatzki, N.; Urbanietz, C.; Enzner, G. Implementing Continuous-Azimuth Binaural Sound in Unity 3D. In Proceedings of the 2020 Institute of Electrical and Electronics Engineers Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Atlanta, GA, USA, 22–26 March 2020; pp. 384–389. [Google Scholar]
- Sra, M.; Xu, X.; Maes, P. BreathVR: Leveraging breathing as a directly controlled interface for virtual reality games. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; Association for Computing Machinery: New York, NY, USA, 2018; Volume 2018, pp. 1–12. [Google Scholar]
- Kusabuka, T.; Indo, T. IBUKI: Gesture Input Method Based on Breathing. In Proceedings of the 33rd Annual Association for Computing Machinery Symposium on User Interface Software and Technology, Online, 20–23 October 2020; pp. 102–104. [Google Scholar]
- Chen, Y.; Bian, Y.; Yang, C.; Bao, X.; Wang, Y.; De Melo, G.; Liu, J.; Gai, W.; Wang, L.; Meng, X. Leveraging Blowing as a Directly Controlled Interface. In Proceedings of the 2019 Institute of Electrical and Electronics Engineers SmartWorld, Ubiquitous Intelligence & Computing, Advanced & Trusted Computing, Scalable Computing & Communications, Cloud & Big Data Computing, Internet of People and Smart City Innovation (SmartWorld/SCALCOM/UIC/ATC/CBDCom/IOP/SCI), Los Alamitos, CA, USA, 19–23 August 2019; pp. 419–424. [Google Scholar]
- Goldstein, E.B. Sensation & Perception, 5th ed.; Brooks/Cole Publishing Company: Pacific Grove, CA, USA, 1999. [Google Scholar]
- Biswas, S.; Visell, Y. Emerging Material Technologies for Haptics. Adv. Mater. Technol. 2019, 4, 1900042. [Google Scholar] [CrossRef] [Green Version]
- Asaga, E.; Takemura, K.; Maeno, T.; Ban, A.; Toriumi, M. Tactile evaluation based on human tactile perception mechanism. Sens. Actuators A Phys. 2013, 203, 69–75. [Google Scholar] [CrossRef]
- Kandel, E.; Schwartz, J.; Jesell, T.; Siegelbaum, S. Hudspeth Principles of Neural Science; McGraw-Hill: New York, NY, USA, 2013. [Google Scholar]
- Proske, U.; Gandevia, S.C. The proprioceptive senses: Their roles in signaling body shape, body position and movement, and muscle force. Physiol. Rev. 2012, 92, 1651–1697. [Google Scholar] [CrossRef] [PubMed]
- Oakley, I.; McGee, M.R.; Brewster, S.; Gray, P. Putting the feel in ‘look and feel’. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems—CHI’00, The Hague, The Netherlands, 1–6 April 2000; Association for Computing Machinery Press: New York, NY, USA, 2000; pp. 415–422. [Google Scholar]
- Barnett-Cowan, M. Vestibular Perception is Slow: A Review. Multisens. Res. 2013, 26, 387–403. [Google Scholar] [CrossRef] [PubMed]
- Morphew, M.E.; Shively, J.R.; Casey, D. Helmet-mounted displays for unmanned aerial vehicle control. In Proceedings of the Helmet- and Head-Mounted Displays IX: Technologies and Applications, Orlando, FL, USA, 12–13 April 2004; Volume 5442, p. 93. [Google Scholar]
- Mollet, N.; Chellali, R. Virtual and Augmented Reality with Head-Tracking for Efficient Teleoperation of Groups of Robots. In Proceedings of the 2008 International Conference on Cyberworlds, Hangzhou, China, 22–24 September 2008; pp. 102–108. [Google Scholar]
- Higuchi, K.; Fujii, K.; Rekimoto, J. Flying head: A head-synchronization mechanism for flying telepresence. In Proceedings of the 2013 23rd International Conference on Artificial Reality and Telexistence (ICAT), Tokyo, Japan, 11–13 December 2013; pp. 28–34. [Google Scholar]
- Smolyanskiy, N.; Gonzalez-Franco, M. Stereoscopic first person view system for drone navigation. Front. Robot. AI 2017, 4, 11. [Google Scholar] [CrossRef] [Green Version]
- Pittman, C.; LaViola, J.J. Exploring head tracked head mounted displays for first person robot teleoperation. In Proceedings of the 19th International Conference on Intelligent User Interfaces, Haifa, Israel, 24–27 February 2014; Association for Computing Machinery: New York, NY, USA, 2014; pp. 323–328. [Google Scholar]
- Teixeira, J.M.; Ferreira, R.; Santos, M.; Teichrieb, V. Teleoperation Using Google Glass and AR, Drone for Structural Inspection. In Proceedings of the 2014 XVI Symposium on Virtual and Augmented Reality, Piata Salvador, Brazil, 12–15 May 2014; pp. 28–36. [Google Scholar]
- Doisy, G.; Ronen, A.; Edan, Y. Comparison of three different techniques for camera and motion control of a teleoperated robot. Appl. Ergon. 2017, 58, 527–534. [Google Scholar] [CrossRef]
- Culbertson, H.; Schorr, S.B.; Okamura, A.M. Haptics: The Present and Future of Artificial Touch Sensation. Annu. Rev. Control. Robot. Auton. Syst. 2018, 1, 385–409. [Google Scholar] [CrossRef]
- Bermejo, C.; Hui, P. A Survey on Haptic Technologies for Mobile Augmented Reality. Assoc. Comput. Mach. Comput. Surv. 2022, 54, 1–35. [Google Scholar] [CrossRef]
- Choi, S.; Kuchenbecker, K.J. Vibrotactile Display: Perception, Technology, and Applications. Proc. Inst. Electr. Electron. Eng. 2013, 101, 2093–2104. [Google Scholar] [CrossRef]
- Wang, D.; Ohnishi, K.; Xu, W. Multimodal haptic display for virtual reality: A survey. Inst. Electr. Electron. Eng. Trans. Ind. Electron. 2020, 67, 610–623. [Google Scholar] [CrossRef]
- Hamza-Lup, F.G.; Bergeron, K.; Newton, D. Haptic Systems in User Interfaces. In Proceedings of the 2019 Association for Computing Machinery Southeast Conference, Kennesaw, GA, USA, 18–20 April 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 141–148. [Google Scholar]
- McGlone, F.; Vallbo, A.B.; Olausson, H.; Loken, L.; Wessberg, J. Discriminative touch and emotional touch. Can. J. Exp. Psychol. Can. Psychol. Expérimentale 2007, 61, 173–183. [Google Scholar] [CrossRef] [Green Version]
- Pacchierotti, C.; Sinclair, S.; Solazzi, M.; Frisoli, A.; Hayward, V.; Prattichizzo, D. Wearable haptic systems for the fingertip and the hand: Taxonomy, review, and perspectives. Inst. Electr. Electron. Eng. Trans. Haptics 2017, 10, 580–600. [Google Scholar] [CrossRef] [Green Version]
- Yu, X.; Xie, Z.; Yu, Y.; Lee, J.; Vazquez-Guardado, A.; Luan, H.; Ruban, J.; Ning, X.; Akhtar, A.; Li, D.; et al. Skin-integrated wireless haptic interfaces for virtual and augmented reality. Nature 2019, 575, 473–479. [Google Scholar] [CrossRef]
- De Jesus Oliveira, V.A.; Nedel, L.; Maciel, A.; Brayda, L. Spatial discrimination of vibrotactile stimuli around the head. In Proceedings of the 2016 Institute of Electrical and Electronics Engineers Haptics Symposium (HAPTICS), Philadelphia, PA, USA, 8–11 April 2016; pp. 1–6. [Google Scholar]
- Kaul, O.B.; Rohs, M. HapticHead: A spherical vibrotactile grid around the head for 3D guidance in virtual and augmented reality. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; Association for Computing Machinery: New York, NY, USA, 2017; Volume 2017, pp. 3729–3740. [Google Scholar]
- Iwamoto, T.; Tatezono, M.; Shinoda, H. Non-contact Method for Producing Tactile Sensation Using Airborne Ultrasound. In Haptics: Perception, Devices and Scenarios; Springer: Berlin/Heidelberg, Germany, 2008; Volume 5024, pp. 504–513. ISBN 3540690565. [Google Scholar]
- Long, B.; Seah, S.A.; Carter, T.; Subramanian, S. Rendering volumetric haptic shapes in mid-air using ultrasound. Assoc. Comput. Mach. Trans. Graph. 2014, 33, 1–10. [Google Scholar] [CrossRef] [Green Version]
- Rakkolainen, I.; Freeman, E.; Sand, A.; Raisamo, R.; Brewster, S. A Survey of Mid-Air Ultrasound Haptics and Its Applications. Inst. Electr. Electron. Eng. Trans. Haptics 2021, 14, 2–19. [Google Scholar] [CrossRef]
- Farooq, A.; Evreinov, G.; Raisamo, R.; Hippula, A. Developing Intelligent Multimodal IVI Systems to Reduce Driver Distraction. In Intelligent Human Systems Integration 2019. IHSI 2019. Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2019; Volume 903, pp. 91–97. [Google Scholar]
- Hoshi, T.; Abe, D.; Shinoda, H. Adding tactile reaction to hologram. In Proceedings of the RO-MAN 2009—The 18th Institute of Electrical and Electronics Engineers International Symposium on Robot and Human Interactive Communication, Toyama, Japan, 27 September–2 October 2009; pp. 7–11. [Google Scholar]
- Martinez, J.; Griffiths, D.; Biscione, V.; Georgiou, O.; Carter, T. Touchless Haptic Feedback for Supernatural VR Experiences. In Proceedings of the 2018 Institute of Electrical and Electronics Engineers International Conference on Virtual Reality and 3D User Interfaces (VR), Reutlingen, Germany, 18–22 March 2018; pp. 629–630. [Google Scholar]
- Furumoto, T.; Fujiwara, M.; Makino, Y.; Shinoda, H. BaLuna: Floating Balloon Screen Manipulated Using Ultrasound. In Proceedings of the 2019 Institute of Electrical and Electronics Engineers International Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 937–938. [Google Scholar]
- Kervegant, C.; Raymond, F.; Graeff, D.; Castet, J. Touch hologram in mid-air. In Proceedings of the Association for Computing Machinery SIGGRAPH 2017 Emerging Technologies, Los Angeles, CA, USA, 30 July–3 August 2017; Association for Computing Machinery: New York, NY, USA, 2017; pp. 1–2. [Google Scholar]
- Sand, A.; Rakkolainen, I.; Isokoski, P.; Kangas, J.; Raisamo, R.; Palovuori, K. Head-mounted display with mid-air tactile feedback. In Proceedings of the 21st Association for Computing Machinery Symposium on Virtual Reality Software and Technology, Beijing, China, 13–15 November 2015; Association for Computing Machinery: New York, NY, USA, 2015; Volume 13, pp. 51–58. [Google Scholar]
- Palovuori, K.; Rakkolainen, I.; Sand, A. Bidirectional touch interaction for immaterial displays. In Proceedings of the 18th International Academic MindTrek Conference on Media Business, Management, Content & Services—AcademicMindTrek’14, Tampere, Finland, 4–6 November 2014; Association for Computing Machinery Press: New York, NY, USA, 2014; pp. 74–76. [Google Scholar]
- Wilson, G.; Carter, T.; Subramanian, S.; Brewster, S.A. Perception of ultrasonic haptic feedback on the hand. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Toronto, ON, Canada, 26 April–1 May 2014; Association for Computing Machinery: New York, NY, USA, 2014; pp. 1133–1142. [Google Scholar]
- Van Neer, P.; Volker, A.; Berkhoff, A.; Schrama, T.; Akkerman, H.; Van Breemen, A.; Peeters, L.; Van Der Steen, J.L.; Gelinck, G. Development of a flexible large-area array based on printed polymer transducers for mid-air haptic feedback. Proc. Meet. Acoust. 2019, 38, 45008. [Google Scholar] [CrossRef] [Green Version]
- Farooq, A.; Weitz, P.; Evreinov, G.; Raisamo, R.; Takahata, D. Touchscreen Overlay Augmented with the Stick-Slip Phenomenon to Generate Kinetic Energy. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology, Tokyo, Japan, 16–19 October 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 179–180. [Google Scholar]
- Desai, A.P.; Pena-Castillo, L.; Meruvia-Pastor, O. A Window to Your Smartphone: Exploring Interaction and Communication in Immersive VR with Augmented Virtuality. In Proceedings of the 2017 14th Conference on Computer and Robot Vision (CRV), Edmonton, AB, Canada, 16–19 May 2017; Volume 2018, pp. 217–224. [Google Scholar]
- Chuah, J.H.; Lok, B. Experiences in Using a Smartphone as a Virtual Reality Interaction Device. Int. J. Virtual Real. 2012, 11, 25–31. [Google Scholar] [CrossRef] [Green Version]
- Qian, J.; Ma, J.; Li, X.; Attal, B.; Lai, H.; Tompkin, J.; Hughes, J.F.; Huang, J. Portal-ble: Intuitive Free-hand Manipulation in Unbounded Smartphone-based Augmented Reality. In Proceedings of the 32nd Annual Association for Computing Machinery Symposium on User Interface Software and Technology, New Orleans, LA, USA, 20–23 October 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 133–145. [Google Scholar]
- Nakagaki, K.; Fitzgerald, D.; Ma, Z.J.; Vink, L.; Levine, D.; Ishii, H. InFORCE: Bi-directional “Force” Shape Display For Haptic Interaction. In Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction, Tempe, AR, USA, 17–20 March 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 615–623. [Google Scholar]
- Allman-Farinelli, M.; Ijaz, K.; Tran, H.; Pallotta, H.; Ramos, S.; Liu, J.; Wellard-Cole, L.; Calvo, R.A. A Virtual Reality Food Court to Study Meal Choices in Youth: Design and Assessment of Usability. JMIR Form. Res. 2019, 3, e12456. [Google Scholar] [CrossRef]
- Stelick, A.; Penano, A.G.; Riak, A.C.; Dando, R. Dynamic Context Sensory Testing-A Proof of Concept Study Bringing Virtual Reality to the Sensory Booth. J. Food Sci. 2018, 83, 2047–2051. [Google Scholar] [CrossRef]
- Kaluschke, M.; Weller, R.; Zachmann, G.; Pelliccia, L.; Lorenz, M.; Klimant, P.; Knopp, S.; Atze, J.P.G.; Mockel, F. A Virtual Hip Replacement Surgery Simulator with Realistic Haptic Feedback. In Proceedings of the 2018 Institute of Electrical and Electronics Engineers International Conference on Virtual Reality and 3D User Interfaces (VR), Reutlingen, Germany, 18–22 March 2018; pp. 759–760. [Google Scholar]
- Brazil, A.L.; Conci, A.; Clua, E.; Bittencourt, L.K.; Baruque, L.B.; da Silva Conci, N. Haptic forces and gamification on epidural anesthesia skill gain. Entertain. Comput. 2018, 25, 1–13. [Google Scholar] [CrossRef]
- Karafotias, G.; Korres, G.; Sefo, D.; Boomer, P.; Eid, M. Towards a realistic haptic-based dental simulation. In Proceedings of the 2017 Institute of Electrical and Electronics Engineers International Symposium on Haptic, Audio and Visual Environments and Games (HAVE), 21–22 October 2017; Volume 2017, pp. 1–6. [Google Scholar]
- Holoride: Virtual Reality Meets the Real World. Available online: https://www.audi.com/en/experience-audi/mobility-and-trends/digitalization/holoride-virtual-reality-meets-the-real-world.html (accessed on 3 December 2021).
- Ma, Z.; Ben-Tzvi, P. Design and optimization of a five-finger haptic glove mechanism. J. Mech. Robot. 2015, 7, 041008. [Google Scholar] [CrossRef] [Green Version]
- Turner, M.L.; Gomez, D.H.; Tremblay, M.R.; Cutkosky, M.R. Preliminary tests of an arm-grounded haptic feedback device in telemanipulation. In Proceedings of the 2001 ASME International Mechanical Engineering Congress and Exposition, New York, NY, USA, 11–16 November 2001; Volume 64, pp. 145–149. [Google Scholar]
- Bouzit, M.; Popescu, G.; Burdea, G.; Boian, R. The Rutgers Master II-ND force feedback glove. In Proceedings of the 10th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Orlando, FL, USA, 24–25 March 2002; pp. 145–152. [Google Scholar]
- Perret, J.; Poorten, E. Vander Touching virtual reality: A review of haptic gloves. In Proceedings of the ACTUATOR 2018—16th International Conference and Exhibition on New Actuators and Drive Systems, Bremen, Germany, 25–27 June 2018; pp. 270–274. [Google Scholar]
- Caeiro-Rodríguez, M.; Otero-González, I.; Mikic-Fonte, F.A.; Llamas-Nistal, M. A systematic review of commercial smart gloves: Current status and applications. Sensors 2021, 21, 2667. [Google Scholar] [CrossRef]
- Lindeman, R.W.; Page, R.; Yanagida, Y.; Sibert, J.L. Towards full-body haptic feedback: The design and deployment of a spatialized vibrotactile feedback system. In Proceedings of the Association for Computing Machinery Symposium on Virtual Reality Software and Technology—VRST’04, Tokyo, Japan, 28 November–1 December 2018; Association for Computing Machinery Press: New York, NY, USA, 2004; p. 146. [Google Scholar]
- Farooq, A.; Coe, P.; Evreinov, G.; Raisamo, R. Using Dynamic Real-Time Haptic Mediation in VR and AR Environments. In Advances in Intelligent Systems and Computing; Ahram, T., Taiar, R., Colson, S., Choplin, A., Eds.; Springer: Cham, Switzerland, 2020; Volume 1018, pp. 407–413. ISBN 9783030256289. [Google Scholar]
- Kasahara, S.; Konno, K.; Owaki, R.; Nishi, T.; Takeshita, A.; Ito, T.; Kasuga, S.; Ushiba, J. Malleable Embodiment: Changing sense of embodiment by spatial-temporal deformation of virtual human body. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; Association for Computing Machinery: New York, NY, USA, 2017; Volume 2017, pp. 6438–6448. [Google Scholar]
- Jiang, F.; Yang, X.; Feng, L. Real-time full-body motion reconstruction and recognition for off-the-shelf VR devices. In Proceedings of the 15th Association for Computing Machinery SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry—Volume 1, Zhuhai, China, 3–4 December 2016; Association for Computing Machinery: New York, NY, USA, 2016; Volume 1, pp. 309–318. [Google Scholar]
- Slater, M.; Wilbur, S. A framework for immersive virtual environments (FIVE): Speculations on the role of presence in virtual environments. Presence Teleoperators Virtual Environ. 1997, 6, 603–616. [Google Scholar] [CrossRef]
- Caserman, P.; Garcia-Agundez, A.; Gobel, S. A Survey of Full-Body Motion Reconstruction in Immersive Virtual Reality Applications. Inst. Electr. Electron. Eng. Trans. Vis. Comput. Graph. 2020, 26, 3089–3108. [Google Scholar] [CrossRef]
- Olivier, A.H.; Bruneau, J.; Kulpa, R.; Pettre, J. Walking with Virtual People: Evaluation of Locomotion Interfaces in Dynamic Environments. Inst. Electr. Electron. Eng. Trans. Vis. Comput. Graph. 2018, 24, 2251–2263. [Google Scholar] [CrossRef] [Green Version]
- Nilsson, N.C.; Serafin, S.; Steinicke, F.; Nordahl, R. Natural walking in virtual reality: A review. Comput. Entertain. 2018, 16, 1–22. [Google Scholar] [CrossRef]
- Boletsis, C. The new era of virtual reality locomotion: A systematic literature review of techniques and a proposed typology. Multimodal Technol. Interact. 2017, 1, 24. [Google Scholar] [CrossRef] [Green Version]
- Suzuki, Y.; Sekimori, K.; Yamato, Y.; Yamasaki, Y.; Shizuki, B.; Takahashi, S. A Mouth Gesture Interface Featuring a Mutual-Capacitance Sensor Embedded in a Surgical Mask. In Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2020; Volume 12182, pp. 154–165. [Google Scholar]
- Hashimoto, T.; Low, S.; Fujita, K.; Usumi, R.; Yanagihara, H.; Takahashi, C.; Sugimoto, M.; Sugiura, Y. TongueInput: Input Method by Tongue Gestures Using Optical Sensors Embedded in Mouthpiece. In Proceedings of the 2018 57th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE), Nara, Japan, 11–14 September 2018; pp. 1219–1224. [Google Scholar]
- Visell, Y.; Law, A.; Cooperstock, J.R. Touch is everywhere: Floor surfaces as ambient haptic interfaces. Inst. Electr. Electron. Eng. Trans. Haptics 2009, 2, 148–159. [Google Scholar] [CrossRef] [PubMed]
- Bouillot, N.; Seta, M. A Scalable Haptic Floor Dedicated to Large Immersive Spaces. In Proceedings of the 17th Linux Audio Conference (LAC-19), Stanford, CA, USA, 23–26 March 2019. [Google Scholar]
- Yixian, Y.; Takashima, K.; Tang, A.; Tanno, T.; Fujita, K.; Kitamura, Y. ZoomWalls: Dynamic walls that simulate haptic infrastructure for room-scale VR world. In Proceedings of the 33rd Annual Association for Computing Machinery Symposium on User Interface Software and Technology, Online, 20–23 October 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 223–235. [Google Scholar]
- Bouzbib, E.; Bailly, G.; Haliyo, S.; Frey, P. CoVR: A Large-Scale Force-Feedback Robotic Interface for Non-Deterministic Scenarios in VR. In Proceedings of the 33rd Annual Association for Computing Machinery Symposium on User Interface Software and Technology, Online, 20–23 October 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 209–222. [Google Scholar]
- Kovacs, R.; Ofek, E.; Gonzalez Franco, M.; Siu, A.F.; Marwecki, S.; Holz, C.; Sinclair, M. Haptic PIVOT: On-demand handhelds in VR. In Proceedings of the 33rd Annual Association for Computing Machinery Symposium on User Interface Software and Technology, Online, 20–23 October 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1046–1059. [Google Scholar]
- Munyan, B.G.; Neer, S.M.; Beidel, D.C.; Jentsch, F. Olfactory Stimuli Increase Presence in Virtual Environments. PLoS ONE 2016, 11, e0157568. [Google Scholar] [CrossRef] [Green Version]
- Hopf, J.; Scholl, M.; Neuhofer, B.; Egger, R. Exploring the Impact of Multisensory VR on Travel Recommendation: A Presence Perspective. In Information and Communication Technologies in Tourism 2020; Springer: Cham, Switzerland, 2020; pp. 169–180. [Google Scholar]
- Baus, O.; Bouchard, S.; Nolet, K. Exposure to a pleasant odour may increase the sense of reality, but not the sense of presence or realism. Behav. Inf. Technol. 2019, 38, 1369–1378. [Google Scholar] [CrossRef]
- Ranasinghe, N.; Jain, P.; Thi Ngoc Tram, N.; Koh, K.C.R.; Tolley, D.; Karwita, S.; Lien-Ya, L.; Liangkun, Y.; Shamaiah, K.; Eason Wai Tung, C.; et al. Season Traveller: Multisensory narration for enhancing the virtual reality experience. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; Association for Computing Machinery: New York, NY, USA, 2018; Volume 2018, pp. 1–13. [Google Scholar]
- Tortell, R.; Luigi, D.-P.; Dozois, A.; Bouchard, S.; Morie, J.F.; Ilan, D. The effects of scent and game play experience on memory of a virtual environment. Virtual Real. 2007, 11, 61–68. [Google Scholar] [CrossRef]
- Murray, N.; Lee, B.; Qiao, Y.; Muntean, G.M. Olfaction-enhanced multimedia: A survey of application domains, displays, and research challenges. Assoc. Comput. Mach. Comput. Surv. 2016, 48, 1–34. [Google Scholar] [CrossRef]
- Obrist, M.; Velasco, C.; Vi, C.T.; Ranasinghe, N.; Israr, A.; Cheok, A.D.; Spence, C.; Gopalakrishnakone, P. Touch, Taste, & Smell User Interfaces. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; Association for Computing Machinery: New York, NY, USA, 2016; Volume 7, pp. 3285–3292. [Google Scholar]
- Cheok, A.D.; Karunanayaka, K. Virtual Taste and Smell Technologies for Multisensory Internet and Virtual Reality. In Human–Computer Interaction Series; Springer International Publishing: Cham, Switzerland, 2018; ISBN 978-3-319-73863-5. [Google Scholar]
- Spence, C.; Obrist, M.; Velasco, C.; Ranasinghe, N. Digitizing the chemical senses: Possibilities & pitfalls. Int. J. Hum. Comput. Stud. 2017, 107, 62–74. [Google Scholar] [CrossRef]
- Spangenberg, E.R.; Crowley, A.E.; Henderson, P.W. Improving the Store Environment: Do Olfactory Cues Affect Evaluations and Behaviors? J. Mark. 1996, 60, 67–80. [Google Scholar] [CrossRef]
- Salminen, K.; Rantala, J.; Isokoski, P.; Lehtonen, M.; Müller, P.; Karjalainen, M.; Väliaho, J.; Kontunen, A.; Nieminen, V.; Leivo, J.; et al. Olfactory Display Prototype for Presenting and Sensing Authentic and Synthetic Odors. In Proceedings of the 20th Association for Computing Machinery International Conference on Multimodal Interaction, Boulder, CO, USA, 16–20 October 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 73–77. [Google Scholar]
- Niedenthal, S.; Lunden, P.; Ehrndal, M.; Olofsson, J.K. A Handheld Olfactory Display For Smell-Enabled VR Games. In Proceedings of the 2019 Institute of Electrical and Electronics Engineers International Symposium on Olfaction and Electronic Nose (ISOEN), Fukuoka, Japan, 26–29 May 2019; pp. 1–4. [Google Scholar]
- Wang, Y.; Amores, J.; Maes, P. On-Face Olfactory Interfaces. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–9. [Google Scholar]
- Brooks, J.; Nagels, S.; Lopes, P. Trigeminal-based Temperature Illusions. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–12. [Google Scholar]
- Kato, S.; Nakamoto, T. Wearable Olfactory Display with Less Residual Odor. In Proceedings of the 2019 Institute of Electrical and Electronics Engineers International Symposium on Olfaction and Electronic Nose (ISOEN), Fukuoka, Japan, 26–29 May 2019; pp. 1–3. [Google Scholar]
- Narumi, T.; Nishizaka, S.; Kajinami, T.; Tanikawa, T.; Hirose, M. Augmented reality flavors. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; Association for Computing Machinery: New York, NY, USA, 2011; pp. 93–102. [Google Scholar]
- Yanagida, Y. A survey of olfactory displays: Making and delivering scents. In Proceedings of the 11th Institute of Electrical and Electronics Engineers Sensors Conference, Taipei, Taiwan, 28–31 October 2012; pp. 1–4. [Google Scholar] [CrossRef]
- Ravia, A.; Snitz, K.; Honigstein, D.; Finkel, M.; Zirler, R.; Perl, O.; Secundo, L.; Laudamiel, C.; Harel, D.; Sobel, N. A measure of smell enables the creation of olfactory metamers. Nature 2020, 588, 118–123. [Google Scholar] [CrossRef]
- Iwata, H. Taste interfaces. In HCI Beyond the GUI: Design for Haptic, Speech, Olfactory, and Other Nontraditional Interfaces; Kortum, P., Ed.; Elsevier Inc.: Cambridge, MA, USA, 2008. [Google Scholar]
- Auvray, M.; Spence, C. The multisensory perception of flavor. Conscious. Cogn. 2008, 17, 1016–1031. [Google Scholar] [CrossRef]
- Aisala, H.; Rantala, J.; Vanhatalo, S.; Nikinmaa, M.; Pennanen, K.; Raisamo, R.; Sözer, N. Augmentation of Perceived Sweetness in Sugar Reduced Cakes by Local Odor Display. In Proceedings of the 2020 International Conference on Multimodal Interaction, Utrecth, The Netherlands, 25–29 October 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 322–327. [Google Scholar]
- Kerruish, E. Arranging sensations: Smell and taste in augmented and virtual reality. Senses Soc. 2019, 14, 31–45. [Google Scholar] [CrossRef]
- Maynes-Aminzade, D. Edible Bits: Seamless Interfaces between People, Data and Food. In Proceedings of the 2005 Association for Computing Machinery Conference on Human Factors in Computing Systems (CHI’2005), Portland, OR, USA, 2–7 April 2005; pp. 2207–2210. [Google Scholar]
- Ranasinghe, N.; Nguyen, T.N.T.; Liangkun, Y.; Lin, L.-Y.; Tolley, D.; Do, E.Y.-L. Vocktail: A virtual cocktail for pairing digital taste, smell, and color sensations. In Proceedings of the 25th Association for Computing Machinery International Conference on Multimedia, Mountain View, CA, USA, 23–27 October 2017; Association for Computing Machinery: New York, NY, USA, 2017; Volume MM’17, pp. 1139–1147. [Google Scholar]
- Nakamura, H.; Miyashita, H. Development and evaluation of interactive system for synchronizing electric taste and visual content. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012; Association for Computing Machinery: New York, NY, USA, 2012; pp. 517–520. [Google Scholar]
- Ranasinghe, N.; Cheok, A.; Nakatsu, R.; Do, E.Y.-L. Simulating the sensation of taste for immersive experiences. In Proceedings of the 2013 Association for Computing Machinery International Workshop on Immersive Media Experiences—ImmersiveMe’13, Barcelona, Spain, 22 October 2013; Association for Computing Machinery Press: New York, NY, USA, 2013; pp. 29–34. [Google Scholar]
- Suzuki, C.; Narumi, T.; Tanikawa, T.; Hirose, M. Affecting tumbler: Affecting our flavor perception with thermal feedback. In Proceedings of the 11th Conference on Advances in Computer Entertainment Technology, Funchal, Portugal, 11–14 November 2014; Association for Computing Machinery: New York, NY, USA, 2014; pp. 1–10. [Google Scholar]
- Koskinen, E.; Rakkolainen, I.; Raisamo, R. Direct retinal signals for virtual environments. In Proceedings of the 23rd Association for Computing Machinery Symposium on Virtual Reality Software and Technology, Gothenburg, Sweden, 8–10 November 2017; Association for Computing Machinery: New York, NY, USA, 2017; Volume F1319, pp. 1–2. [Google Scholar]
- Abiri, R.; Borhani, S.; Sellers, E.W.; Jiang, Y.; Zhao, X. A comprehensive review of EEG-based brain–computer interface paradigms. J. Neural Eng. 2019, 16, 011001. [Google Scholar] [CrossRef]
- Bernal, G.; Yang, T.; Jain, A.; Maes, P. PhysioHMD. In Proceedings of the 2018 Association for Computing Machinery International Symposium on Wearable Computers, Singapore, 8–12 October 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 160–167. [Google Scholar]
- Vourvopoulos, A.; Niforatos, E.; Giannakos, M. EEGlass: An EEG-eyeware prototype for ubiquitous brain-computer interaction. In Proceedings of the 2019 Association for Computing Machinery International Joint Conference on Pervasive and Ubiquitous Computing, London, UK, 11–13 September 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 647–652. [Google Scholar]
- Luong, T.; Martin, N.; Raison, A.; Argelaguet, F.; Diverrez, J.-M.; Lecuyer, A. Towards Real-Time Recognition of Users Mental Workload Using Integrated Physiological Sensors Into a VR HMD. In Proceedings of the 2020 Institute of Electrical and Electronics Engineers International Symposium on Mixed and Augmented Reality (ISMAR), Online, 9–13 November 2020; pp. 425–437. [Google Scholar]
- Barde, A.; Gumilar, I.; Hayati, A.F.; Dey, A.; Lee, G.; Billinghurst, M. A Review of Hyperscanning and Its Use in Virtual Environments. Informatics 2020, 7, 55. [Google Scholar] [CrossRef]
- Losey, D.M.; Stocco, A.; Abernethy, J.A.; Rao, R.P.N. Navigating a 2D virtual world using direct brain stimulation. Front. Robot. AI 2016, 3, 72. [Google Scholar] [CrossRef] [Green Version]
- Lee, W.; Kim, H.C.; Jung, Y.; Chung, Y.A.; Song, I.U.; Lee, J.H.; Yoo, S.S. Transcranial focused ultrasound stimulation of human primary visual cortex. Sci. Rep. 2016, 6, 34026. [Google Scholar] [CrossRef]
- Farooq, U.; Grudin, J. Human-computer integration. Interactions 2016, 23, 26–32. [Google Scholar] [CrossRef]
- Sra, M.; Jain, A.; Maes, P. Adding Proprioceptive Feedback to Virtual Reality Experiences Using Galvanic Vestibular Stimulation. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 1–14. [Google Scholar]
- Spicer, R.P.; Russell, S.M.; Rosenberg, E.S. The mixed reality of things: Emerging challenges for human-information interaction. In Next-Generation Analyst V; International Society for Optics and Photonics: Bellingham, WA, USA, 2017; Volume 10207, p. 102070A. ISBN 9781510609150. [Google Scholar]
- Mueller, F.F.; Lopes, P.; Strohmeier, P.; Ju, W.; Seim, C.; Weigel, M.; Nanayakkara, S.; Obrist, M.; Li, Z.; Delfa, J.; et al. Next Steps for Human-Computer Integration. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–15. [Google Scholar]
- Hainich, R.R. The End of Hardware: Augmented Reality and Beyond; BookSurge: Charleston, SC, USA, 2009. [Google Scholar]
- Bariya, M.; Li, L.; Ghattamaneni, R.; Ahn, C.H.; Nyein, H.Y.Y.; Tai, L.C.; Javey, A. Glove-based sensors for multimodal monitoring of natural sweat. Sci. Adv. 2020, 6, 8308. [Google Scholar] [CrossRef]
- Lawrence, J. Review of Communication in the Age of Virtual Reality. Contemp. Psychol. A J. Rev. 1997, 42, 170. [Google Scholar] [CrossRef]
- Hendaoui, A.; Limayem, M.; Thompson, C.W. 3D social virtual worlds: Research issues and challenges. Inst. Electr. Electron. Eng. Internet Comput. 2008, 12, 88–92. [Google Scholar] [CrossRef]
- Wann, J.P.; Rushton, S.; Mon-Williams, M. Natural problems for stereoscopic depth perception in virtual environments. Vis. Res. 1995, 35, 2731–2736. [Google Scholar] [CrossRef] [Green Version]
- Ahmed, S.; Irshad, L.; Demirel, H.O.; Tumer, I.Y. A Comparison Between Virtual Reality and Digital Human Modeling for Proactive Ergonomic Design. In Proceedings of the International Conference on Human-Computer Interaction, Orlando, FL, USA, 26–31 July 2019; Springer: Cham, Switzerland, 2019; pp. 3–21. [Google Scholar]
- Bonner, E.; Reinders, H. Augmented and Virtual Reality in the Language Classroom: Practical Ideas. Teach. Engl. Technol. 2018, 18, 33–53. [Google Scholar]
- Royakkers, L.; Timmer, J.; Kool, L.; van Est, R. Societal and ethical issues of digitization. Ethics Inf. Technol. 2018, 20, 127–142. [Google Scholar] [CrossRef] [Green Version]
- Welch, G.; Bruder, G.; Squire, P.; Schubert, R. Anticipating Widespread Augmented Reality; University of Central Florida: Orlando, FL, USA, 2018. [Google Scholar]
- Smits, M.; Bart Staal, J.; Van Goor, H. Could Virtual Reality play a role in the rehabilitation after COVID-19 infection? BMJ Open Sport Exerc. Med. 2020, 6, 943. [Google Scholar] [CrossRef] [PubMed]
- Huang, H.M.; Rauch, U.; Liaw, S.S. Investigating learners’ attitudes toward virtual reality learning environments: Based on a constructivist approach. Comput. Educ. 2010, 55, 1171–1182. [Google Scholar] [CrossRef]
- Siricharoen, W.V. The Effect of Virtual Reality as a Form of Escapism. In Proceedings of the International Conference on Information Resources Management, Auckland, New Zealand, 27–29 May 2019; p. 36. [Google Scholar]
- Pesce, M. AR’s Prying Eyes. Inst. Electr. Electron. Eng. Spectr. 2021, 19. [Google Scholar]
- DeCarlo, C.C. Toward the Year 2018; Foreign Policy Association, Ed.; Cowles Educational Corp.: New York, NY, USA, 1968. [Google Scholar]
- Aati, K.; Chang, D.; Edara, P.; Sun, C. Immersive Work Zone Inspection Training using Virtual Reality. Transp. Res. Rec. J. Transp. Res. Board 2020, 2674, 224–232. [Google Scholar] [CrossRef]
- Sowndararajan, A.; Wang, R.; Bowman, D.A. Quantifying the benefits of immersion for procedural training. In Proceedings of the IPT/EDT 2008—Immersive Projection Technologies/Emerging Display Technologies Workshop, Los Angeles, CA, USA, 9–10 August 2008; Volume 2, pp. 1–4. [Google Scholar]
- Nigay, L.; Coutaz, J. A design space for multimodal systems. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems—CHI’93, Amsterdam, The Netherlands, 24–29 April 1993; Association for Computing Machinery Press: New York, NY, USA, 1993; pp. 172–178. [Google Scholar]
- Covarrubias, M.; Bordegoni, M.; Rosini, M.; Guanziroli, E.; Cugini, U.; Molteni, F. VR system for rehabilitation based on hand gestural and olfactory interaction. In Proceedings of the 21st Association for Computing Machinery Symposium on Virtual Reality Software and Technology, Beijing, China, 13–15 November 2015; Association for Computing Machinery: New York, NY, USA, 2015; Volume 13, pp. 117–120. [Google Scholar]
- Yeh, S.C.; Lee, S.H.; Chan, R.C.; Wu, Y.; Zheng, L.R.; Flynn, S. The Efficacy of a Haptic-Enhanced Virtual Reality System for Precision Grasp Acquisition in Stroke Rehabilitation. J. Healthc. Eng. 2017, 2017, 9840273. [Google Scholar] [CrossRef] [Green Version]
- Manuel, D.; Moore, D.; Charissis, V. An investigation into immersion in games through motion control and stereo audio reproduction. In Proceedings of the 7th Audio Mostly Conference: A Conference on Interaction with Sound—AM’12, Corfu, Greece, 26–28 September 2012; Association for Computing Machinery Press: New York, NY, USA, 2012; pp. 124–129. [Google Scholar]
- Shaw, L.A.; Wuensche, B.C.; Lutteroth, C.; Buckley, J.; Corballis, P. Evaluating sensory feedback for immersion in exergames. In Proceedings of the Australasian Computer Science Week Multiconference, Geelong, Australia, 30 January–3 February 2017; Association for Computing Machinery: New York, NY, USA, 2017; pp. 1–6. [Google Scholar]
- Triantafyllidis, E.; McGreavy, C.; Gu, J.; Li, Z. Study of multimodal interfaces and the improvements on teleoperation. Inst. Electr. Electron. Eng. Access 2020, 8, 78213–78227. [Google Scholar] [CrossRef]
Comparison of commercial haptic gloves, thimbles, and hand exoskeletons.

| Device | Type | Fingers | Wireless | Actuator | Force Feedback | Tactile Feedback | Hand Tracking | Active DoFs | Weight (g) | Price |
|---|---|---|---|---|---|---|---|---|---|---|
| Gloveone 1 | Glove | 5 | yes | Electromagnetic | no | yes | yes | 10 | n/a | 499 € |
| AvatarVR 2 | Glove | 5 | yes | Electromagnetic | no | yes | yes | 10 | n/a | 1100 € |
| Senso Glove 3 | Glove | 5 | yes | Electromagnetic | no | yes | yes | 5 | n/a | $599 |
| Cynteract 4 | Glove | 5 | yes | Electromagnetic | yes | no | yes | 5 | n/a | n/a |
| Maestro 5 | Glove | 1 | yes | Electromagnetic | no | yes | yes | 5 | 590 | n/a |
| GoTouchVR 6 | Thimble | 1 | yes | Electromagnetic | yes | yes | yes | 2 | 20 | n/a |
| Exo-Glove Poly 7 | Thimble | 1–3 | no | Soft polymer (20–40 N) | yes | yes | no | n/a | 194 | n/a |
| Prime X Series 8 | Glove | 5 | yes | Electromagnetic | no | yes | yes | 9 | n/a | 3999 € |
| CyberGrasp 9 | Exosk. | 5 | no | Electromagnetic | yes | yes | yes | 5 | 450 | $50,000 |
| Dextarobotics 10 | Exo-gloves | 5 + 5 | yes | Electromagnetic | yes | yes | yes | 11 | 320 | $12,000 |
| HaptX 11 | Exosk. | 5 | no | Pneumatic | yes | yes | yes | n/a | n/a | n/a |
| VRgluv 12 | Exosk. | 5 | yes | Electromagnetic | yes | yes | yes | 5 | n/a | $579 |
| Sense Glove 1 13 | Exosk. | 5 | yes | Electromagnetic | no | yes | yes | 5 | 300 | 999 € |
| Sense Glove N 14 | Exosk. | 5 | yes | Electromagnetic | yes | yes | yes | 24 | n/a | 4500 € |
| HGlove 15 | Exosk. | 3 | no | Electromagnetic | yes | yes | no | 9 | 750 | 30,000 € |
| Noitom Hi5 16 | Glove | 5 | yes | Electromagnetic | no | yes | yes | 9 | 105 | $999 |
| Sensoryx VR Fr 17 | Glove | 5 + 5 | yes | Electromagnetic | no | yes | yes | 10 | n/a | 600 € |
| Tesla Glove 18 | Exosk. | 5 | yes | Electromagnetic, 3 × 3 display/finger | yes | yes | yes | 9 | 300 | $5000 |
| ThermoReal Plus 19 | HMD, glove, sleeve | 1 pt | yes | Thermal (hot and cold stimulation) | no | yes | no | n/a | n/a | n/a |
| WEART Thimble 20 | Thimbles | 3 | yes | Electromagnetic and thermal actuation | yes | yes | yes | n/a | n/a | $3999 |
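Read column-wise, the table separates kinesthetic (force) feedback from tactile feedback and records whether a device is untethered. As a minimal sketch of how such a comparison can be queried programmatically, each row can be encoded as a small record; the type name, field names, and the four sample rows below are transcribed from the table for illustration only and do not correspond to any vendor API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HapticGlove:
    """One row of the comparison table (illustrative subset of columns)."""
    name: str
    kind: str                  # "Glove", "Thimble", or "Exosk."
    fingers: str
    wireless: bool
    force_feedback: bool
    tactile_feedback: bool
    active_dofs: Optional[int]  # None where the table lists "n/a"

# Four representative rows transcribed from the table above.
DEVICES = [
    HapticGlove("Gloveone", "Glove", "5", True, False, True, 10),
    HapticGlove("GoTouchVR", "Thimble", "1", True, True, True, 2),
    HapticGlove("CyberGrasp", "Exosk.", "5", False, True, True, 5),
    HapticGlove("Sense Glove N", "Exosk.", "5", True, True, True, 24),
]

# Example query: untethered devices that combine kinesthetic (force)
# and tactile feedback, the combination discussed in Section 4.3.
both = [d.name for d in DEVICES
        if d.wireless and d.force_feedback and d.tactile_feedback]
print(both)  # ['GoTouchVR', 'Sense Glove N']
```

Encoding the rows this way makes the trade-offs explicit: in this sample, only the thimble and exoskeleton form factors combine wireless operation with both feedback types.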
Comparison of full-body and torso-worn haptic feedback systems.

| Device | Coverage | Tracking | Feedback Type | Haptic Library | Dev. API | HMD Platform | Price ($) |
|---|---|---|---|---|---|---|---|
| Nullspace VR 1 | 32 independent zones (chest, abdomen, shoulders, arms, and hands) | Yes | Vibrotactile (via Bluetooth and wired) | 117 fixed effects plus a customization effects editor | Unity 3D | Multiplatform with 3rd-party tracking | 299 (vest only) |
| Tesla Suit 2 | Full-body suit, 80 embedded electrostatic channels | Yes; biometrics, 10 motion capture sensors | EM vibrotactile (via Bluetooth) | Customizable: 1–300 Hz, 1–260 ms, up to 0.15 A/channel | Unity 5 and Unreal Engine 4 | Multiplatform | 13,000 (suit and gloves) |
| Axon VR/HaptX 3 | Full-body suit, gloves, and pads | Yes; magnetic motion tracking | Vibrotactile, micro-fluidic, force-feedback exoskeleton | Temperature, vibration, motion, shape, and texture | Unity 5, Unreal Engine, SteamVR, HaptX SDK | Multiplatform | n/a (enterprise solution) |
| Tactsuit x16/x40 (bHaptics) 4 | Full-body suit, gloves, pads, and feet guard; 70 zones (x16, x40) | Yes | Vibrotactile; Bluetooth actuator versions x16 and x40 | Customizable | Unity 5, Unreal Engine | Multiplatform | x16 = 299; x40 = 499 (pre-order price) |
| Rapture VR 5 | HMD and vest by uploadVR | Yes | Vibrotactile | n/a | Unity 5, Unreal Engine | Only for the VOID experience | n/a |
| Synesthesia Suit 6 | Vest, gloves, and pads; 26 active zones | Yes | Vibrotactile | Customizable (triggered by audio feedback) | PS VR, Unity 5, Unreal Engine | Multiplatform including PlayStation | Under development |
| Hands Omni 7 | Gloves, vest, pads, treadmill | Yes | Vibrotactile | Customizable | Unity 5, Unreal Engine | Multiplatform | Enterprise solution |
| HoloSuit 8 | Glove, jacket, and pants; 40 sensors, nine actuation elements | Yes | Vibrotactile | Customizable | Unity, UE 4, and MotionBuilder | Multiplatform | Under development |
| Woojer 9 | Vest (two actuators at sides, back, front) and waist strap (one actuator) | No | Vibrotactile, audio (1–200 Hz), and TI wireless control | Customizable (triggered alongside audio feedback) | Used over any audio-based interface | Multiplatform/open | Vest 349; strap 129 |
| NeoSensory Exoskin VR suit 10 | Haptic jacket, vest, and wrist; 32 actuation motors | No | Vibrotactile, adjustable-frequency signals | Customizable | Custom SDK, Unity, and UE | Multiplatform | 400 (SDK and developer package) |
| Shockwave 11 (Kickstarter) | Vest and wearable straps (on the legs) with eight zones | Full body; eight IMUs (wireless) | 64-point vibrotactile feedback (HD haptics) | Customizable | Unity, Unreal Engine 4 | Most VR headsets (requires dev. support) | 300 (Kickstarter price) |
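The haptic-library column shows a common pattern: most suits expose effects as customizable per-channel parameters (frequency, duration, drive strength) rather than as fixed patterns. The sketch below illustrates one plausible effect descriptor with validation against the ranges quoted for the Tesla Suit row (1–300 Hz, 1–260 ms, 0.15 A per channel); it is a hypothetical data structure assumed for illustration, not any vendor's actual SDK.

```python
from dataclasses import dataclass

# Parameter ranges quoted for the Tesla Suit haptic library in the table above.
FREQ_HZ = (1, 300)
DURATION_MS = (1, 260)
MAX_CURRENT_A = 0.15

@dataclass(frozen=True)
class VibrotactileEffect:
    """A single per-channel effect command (hypothetical descriptor)."""
    channel: int        # e.g., 0..79 for an 80-channel suit
    freq_hz: float
    duration_ms: float
    amplitude: float    # normalized 0..1, scaled to the per-channel current limit

    def validate(self) -> None:
        if not FREQ_HZ[0] <= self.freq_hz <= FREQ_HZ[1]:
            raise ValueError(f"frequency {self.freq_hz} Hz outside {FREQ_HZ}")
        if not DURATION_MS[0] <= self.duration_ms <= DURATION_MS[1]:
            raise ValueError(f"duration {self.duration_ms} ms outside {DURATION_MS}")
        if not 0.0 <= self.amplitude <= 1.0:
            raise ValueError("amplitude must be normalized to 0..1")

    @property
    def drive_current_a(self) -> float:
        """Physical drive current implied by the normalized amplitude."""
        return self.amplitude * MAX_CURRENT_A

# A short "heartbeat"-style pulse on one chest channel (channel index illustrative).
pulse = VibrotactileEffect(channel=12, freq_hz=60, duration_ms=120, amplitude=0.5)
pulse.validate()
print(f"{pulse.drive_current_a:.3f} A")  # 0.075 A
```

Keeping amplitude normalized and validating against device limits is what makes such libraries portable across suits with different channel counts and actuator strengths.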
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).