Design of Interactions for Handheld Augmented Reality Devices Using Wearable Smart Textiles: Findings from a User Elicitation Study
Abstract
1. Introduction
2. Related Work
2.1. Participatory Elicitation Studies
2.2. Hand-Worn Clothing-Based Interfaces
2.3. Summary
3. Methodology
Fabric-Based Hand-Worn Interface
4. User Study
4.1. Participants
4.2. Experimental Setup
4.3. Handheld Augmented Reality Tasks
4.4. Procedure
4.4.1. Introduction
4.4.2. Pre-Elicitation
4.4.3. Elicitation
4.4.4. Semi-Structured Interview
4.5. Measures
5. Results
5.1. Classification of Gestures
5.2. Consensus between the Users
5.3. Effects of Users’ Prior Experience on Agreement
5.4. Taxonomy of Wrist and Thumb-to-Index Touch Gestures
- Complexity (see Figure 4) classifies both touch and wrist gestures as either (a) simple or (b) complex. A simple gesture uses a single gesture type, either a wrist or a touch gesture alone; for example, moving the wrist downward toward the palm (downward flexion) or pressing any one of the soft foam buttons with the thumb is a simple gesture. A complex gesture combines two or more distinct gestures, such as pressing a soft foam button and then moving the wrist downward toward the palm. We adopted this dimension from [52].
- Structure (see Figure 5) captures the relative roles of the wrist and touch inputs in the elicited HAR gestures, with seven categories: (a) wrist, (b) touch (button one), (c) touch (button two), (d) touch (button three), (e) touch (button one) and wrist, (f) touch (button two) and wrist, and (g) touch (button three) and wrist. We modified this category from the taxonomy of Vatavu and Pentiuc [51]. For example, in the touch-only categories, a tap or hold gesture was performed on any one of the three buttons.
- Action (see Figure 6) groups gestures by the action performed rather than by their semantic meaning, with five categories: (a) scroll, (b) swipe, (c) tap, (d) hold, and (e) compound. We adopted and modified this classification from Chan et al. [35], who used these dimensions to characterize user-designed single-hand microgestures without a specific application domain. For example, downward flexion and upward extension were grouped as scrolls, while leftward flexion and rightward extension were grouped as swipes. All three dimensions are modeled concretely in the code sketch after this list.
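As a concrete reading of the taxonomy, the following sketch encodes the three dimensions as plain data types and classifies one elicited proposal. It is illustrative only; the type and field names (`Complexity`, `Structure`, `Action`, `GestureProposal`) are our own and not part of the study's materials.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Complexity(Enum):
    SIMPLE = auto()   # one gesture type only: wrist OR touch
    COMPLEX = auto()  # two or more distinct gestures combined

class Structure(Enum):
    WRIST = auto()
    TOUCH_BUTTON_1 = auto()
    TOUCH_BUTTON_2 = auto()
    TOUCH_BUTTON_3 = auto()
    TOUCH_BUTTON_1_AND_WRIST = auto()
    TOUCH_BUTTON_2_AND_WRIST = auto()
    TOUCH_BUTTON_3_AND_WRIST = auto()

class Action(Enum):
    SCROLL = auto()    # e.g., downward flexion / upward extension
    SWIPE = auto()     # e.g., leftward flexion / rightward extension
    TAP = auto()
    HOLD = auto()
    COMPOUND = auto()  # combinations of the above

@dataclass
class GestureProposal:
    referent: str  # the HAR task the gesture was proposed for
    complexity: Complexity
    structure: Structure
    action: Action

# A complex proposal: press button one, then flex the wrist downward.
example = GestureProposal(
    referent="(T8) Scroll down",
    complexity=Complexity.COMPLEX,
    structure=Structure.TOUCH_BUTTON_1_AND_WRIST,
    action=Action.COMPOUND,
)
```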
5.5. Consensus Gesture Set for HAR Interactions
5.6. Users’ Feedback
6. Discussion
6.1. Design Recommendations for Fabric-Based Interface for HAR Interactions
6.1.1. Simple Gestures Were Preferred for HAR Interactions
6.1.2. Utilize Familiar Touch-Based Input Methods for Discrete HAR Tasks
6.1.3. Design Wrist Gestures for Continuous HAR Tasks
6.1.4. Favor Stretchable Fabric with Fingerless Design
6.1.5. Consider Resistive Sensing Techniques to Capture Both Wrist and Touch Inputs
6.1.6. Design Fabric-Based Interfaces That Foster Socially Acceptable Gestures
6.2. Limitations
7. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Azuma, R.T. A Survey of Augmented Reality. Presence Teleoperators Virtual Environ. 1997, 6, 355–385.
- Mekni, M.; Lemieux, A. Augmented Reality: Applications, Challenges and Future Trends. Appl. Comput. Sci. 2014, 205, 205–214.
- Chatzopoulos, D.; Bermejo, C.; Huang, Z.; Hui, P. Mobile Augmented Reality Survey: From Where We Are to Where We Go. IEEE Access 2017, 5, 6917–6950.
- Zhou, F.; Duh, H.B.-L.; Billinghurst, M. Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR. In Proceedings of the 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, Cambridge, UK, 15–18 September 2008; pp. 193–202.
- Kim, K.; Billinghurst, M.; Bruder, G.; Duh, H.B.-L.; Welch, G.F. Revisiting trends in augmented reality research: A review of the 2nd decade of ISMAR (2008–2017). IEEE Trans. Vis. Comput. Graph. 2018, 24, 2947–2962.
- Bekele, M.K.; Town, C.; Pierdicca, R.; Frontoni, E.; Malinverni, E.S. A Survey of Augmented, Virtual, and Mixed Reality for Cultural Heritage. ACM J. Comput. Cult. Herit. 2018, 11, 7.
- Polvi, J.; Taketomi, T.; Yamamoto, G.; Dey, A.; Sandor, C.; Kato, H. SlidAR: A 3D positioning method for SLAM-based handheld augmented reality. Comput. Graph. 2016, 55, 33–43.
- Grandi, J.G.; Debarba, H.G.; Berndt, I.; Nedel, L.; Maciel, A. Design and Assessment of a Collaborative 3D Interaction Technique for Handheld Augmented Reality. In Proceedings of the 25th IEEE Conference on Virtual Reality and 3D User Interfaces (VR 2018), Reutlingen, Germany, 18–22 March 2018; pp. 49–56.
- Mossel, A.; Venditti, B.; Kaufmann, H. 3DTouch and HOMER-S: Intuitive Manipulation Techniques for One-Handed Handheld Augmented Reality. In Proceedings of the Virtual Reality International Conference on Laval Virtual—VRIC ’13, Laval, France, 20–22 March 2013; p. 1.
- Samini, A.; Palmerius, K.L. A study on improving close and distant device movement pose manipulation for hand-held augmented reality. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology—VRST ’16, Munich, Germany, 2–4 November 2016; pp. 121–128.
- Bai, H.; Lee, G.A.; Ramakrishnan, M.; Billinghurst, M. 3D gesture interaction for handheld augmented reality. In Proceedings of the SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications—SA ’14, Shenzhen, China, 3–6 December 2014; pp. 1–6.
- Wacker, P.; Nowak, O.; Voelker, S.; Borchers, J. ARPen: Mid-Air Object Manipulation Techniques for a Bimanual AR System with Pen & Smartphone. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems—CHI ’19, Glasgow, UK, 4–9 May 2019; pp. 1–12.
- Goh, E.S.; Sunar, M.S.; Ismail, A.W. 3D object manipulation techniques in handheld mobile augmented reality interface: A review. IEEE Access 2019, 7, 40581–40601.
- Yin, J.; Fu, C.; Zhang, X.; Liu, T. Precise Target Selection Techniques in Handheld Augmented Reality Interfaces. IEEE Access 2019, 7, 17663–17674.
- Liang, H.-N.; Williams, C.; Semegen, M.; Stuerzlinger, W.; Irani, P. An investigation of suitable interactions for 3D manipulation of distant objects through a mobile device. Int. J. Innov. Comput. Inf. Control 2013, 9, 4737–4752.
- Liang, H.-N.; Williams, C.; Semegen, M.; Stuerzlinger, W.; Irani, P. User-defined surface + motion gestures for 3D manipulation of objects at a distance through a mobile device. In Proceedings of the 10th Asia Pacific Conference on Computer Human Interaction—APCHI ’12, Shimane, Japan, 28–31 August 2012; p. 299.
- Yusof, C.S.; Bai, H.; Billinghurst, M.; Sunar, M.S. A review of 3D gesture interaction for handheld augmented reality. J. Teknol. 2016, 78, 15–20.
- Malhotra, Y.; Galletta, D.F. Extending the technology acceptance model to account for social influence: Theoretical bases and empirical validation. In Proceedings of the 32nd Annual Hawaii International Conference on System Sciences (HICSS-32), Maui, HI, USA, 5–8 January 1999; p. 14.
- Stoppa, M.; Chiolerio, A. Wearable electronics and smart textiles: A critical review. Sensors 2014, 14, 11957–11992.
- Weiser, M. The Computer for the 21st Century. Sci. Am. 1991, 265, 94–104.
- Nanjappan, V.; Liang, H.-N.; Lau, K.; Choi, J.; Kim, K.K. Clothing-based wearable sensors for unobtrusive interactions with mobile devices. In Proceedings of the 2017 International SoC Design Conference (ISOCC), Seoul, Korea, 5–8 November 2017; pp. 139–140.
- Parzer, P.; Sharma, A.; Vogl, A.; Steimle, J.; Olwal, A.; Haller, M. SmartSleeve: Real-time Sensing of Surface and Deformation Gestures on Flexible, Interactive Textiles, using a Hybrid Gesture Detection Pipeline. In Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology—UIST ’17, Quebec City, QC, Canada, 22–25 October 2017; pp. 565–577.
- Schneegass, S.; Voit, A. GestureSleeve: Using Touch Sensitive Fabrics for Gestural Input on the Forearm for Controlling Smartwatches. In Proceedings of the 2016 ACM International Symposium on Wearable Computers—ISWC ’16, Heidelberg, Germany, 12–16 September 2016; pp. 108–115.
- Yoon, S.H.; Huo, K.; Nguyen, V.P.; Ramani, K. TIMMi: Finger-worn Textile Input Device with Multimodal Sensing in Mobile Interaction. In Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction—TEI ’15, Stanford, CA, USA, 15–19 January 2015; pp. 269–272.
- Peshock, A.; Dunne, L.E.; Duvall, J. Argot: A wearable one-handed keyboard glove. In Proceedings of the 2014 ACM International Symposium on Wearable Computers Adjunct Program—ISWC ’14 Adjunct, Seattle, WA, USA, 13–17 September 2014; pp. 87–92.
- Hsieh, Y.-T.; Jylhä, A.; Orso, V.; Gamberini, L.; Jacucci, G. Designing a Willing-to-Use-in-Public Hand Gestural Interaction Technique for Smart Glasses. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems—CHI ’16, San Jose, CA, USA, 7–12 May 2016; pp. 4203–4215.
- Hansson, P.; Wallberg, A.; Simsarian, K. Techniques for ‘Natural’ Interaction in Multi-User CAVE-Like Environments. In Proceedings of the Fifth European Conference on Computer Supported Cooperative Work—ECSCW ’97 Poster Description, Lancaster, UK, 7–11 September 1997.
- Nacenta, M.A.; Kamber, Y.; Qiang, Y.; Kristensson, P.O. Memorability of pre-designed and user-defined gesture sets. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems—CHI ’13, Paris, France, 27 April–2 May 2013; p. 1099.
- Wobbrock, J.O.; Aung, H.H.; Rothrock, B.; Myers, B.A. Maximizing the guessability of symbolic input. In Proceedings of the CHI ’05 Extended Abstracts on Human Factors in Computing Systems—CHI ’05, Portland, OR, USA, 2–7 April 2005; pp. 1869–1872.
- Seyed, T.; Burns, C.; Sousa, M.C.; Maurer, F.; Tang, A. Eliciting Usable Gestures for Multi-Display Environments. In Proceedings of the International Conference on Interactive Tabletops and Surfaces (ITS ’12), Cambridge, MA, USA, 11–14 November 2012; pp. 41–50.
- Wobbrock, J.O.; Morris, M.R.; Wilson, A.D. User-defined gestures for surface computing. In Proceedings of the 27th International Conference on Human Factors in Computing Systems—CHI ’09, Boston, MA, USA, 4–9 April 2009; p. 1083.
- Nanjappan, V.; Liang, H.-N.; Lu, F.; Papangelis, K.; Yue, Y.; Man, K.L. User-elicited dual-hand interactions for manipulating 3D objects in virtual reality environments. Hum. Cent. Comput. Inf. Sci. 2018, 8, 31.
- Vatavu, R.-D.; Wobbrock, J.O. Formalizing Agreement Analysis for Elicitation Studies: New Measures, Significance Test, and Toolkit. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems—CHI ’15, Seoul, Korea, 18–23 April 2015; pp. 1325–1334.
- Piumsomboon, T.; Clark, A.; Billinghurst, M.; Cockburn, A. User-defined gestures for augmented reality. In Proceedings of the CHI ’13 Extended Abstracts on Human Factors in Computing Systems—CHI ’13, Paris, France, 27 April–2 May 2013; pp. 955–960.
- Chan, E.; Seyed, T.; Stuerzlinger, W.; Yang, X.-D.; Maurer, F. User Elicitation on Single-hand Microgestures. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems—CHI ’16, San Jose, CA, USA, 7–12 May 2016; pp. 3403–3414.
- Gong, J.; Yang, X.-D.; Irani, P. WristWhirl: One-handed Continuous Smartwatch Input using Wrist Gestures. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology—UIST ’16, Tokyo, Japan, 16–19 October 2016; pp. 861–872.
- Rekimoto, J. GestureWrist and GesturePad: Unobtrusive wearable interaction devices. In Proceedings of the Fifth International Symposium on Wearable Computers, Zurich, Switzerland, 8–9 October 2001; pp. 21–27.
- Gheran, B.-F.; Vanderdonckt, J.; Vatavu, R.-D. Gestures for Smart Rings: Empirical Results, Insights, and Design Implications. In Proceedings of the 2018 Designing Interactive Systems Conference—DIS ’18, Hong Kong, China, 9–13 June 2018; pp. 623–635.
- Nanjappan, V.; Shi, R.; Liang, H.-N.; Lau, K.K.-T.; Yue, Y.; Atkinson, K. Towards a Taxonomy for In-Vehicle Interactions Using Wearable Smart Textiles: Insights from a User-Elicitation Study. Multimodal Technol. Interact. 2019, 3, 33.
- Bianchi, M.; Haschke, R.; Büscher, G.; Ciotti, S.; Carbonaro, N.; Tognetti, A. A Multi-Modal Sensing Glove for Human Manual-Interaction Studies. Electronics 2016, 5, 42.
- Whitmire, E.; Jain, M.; Jain, D.; Nelson, G.; Karkar, R.; Patel, S.; Goel, M. DigiTouch: Reconfigurable Thumb-to-Finger Input and Text Entry on Head-mounted Displays. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2017, 1, 1–21.
- Miller, S.; Smith, A.; Bahram, S.; Amant, R.S. A glove for tapping and discrete 1D/2D input. In Proceedings of the 2012 ACM International Conference on Intelligent User Interfaces—IUI ’12, Lisbon, Portugal, 14–17 February 2012; p. 101.
- Wolf, K.; Naumann, A.; Rohs, M.; Müller, J. A taxonomy of microinteractions: Defining microgestures based on ergonomic and scenario-dependent requirements. Lect. Notes Comput. Sci. 2011, 6946, 559–575.
- Cheung, V.; Eady, A.K.; Girouard, A. Exploring Eyes-free Interaction with Wrist-Worn Deformable Materials. In Proceedings of the Eleventh International Conference on Tangible, Embedded, and Embodied Interaction—TEI ’17, Yokohama, Japan, 20–23 March 2017; pp. 521–528.
- Lopes, P.; Ion, A.; Mueller, W.; Hoffmann, D.; Jonell, P.; Baudisch, P. Proprioceptive Interaction. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems—CHI ’15, Seoul, Korea, 18–23 April 2015; pp. 939–948.
- Strohmeier, P.; Vertegaal, R.; Girouard, A. With a flick of the wrist: Stretch Sensors as Lightweight Input for Mobile Devices. In Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction—TEI ’12, Kingston, ON, Canada, 19–22 February 2012; p. 307.
- Ferrone, A.; Maita, F.; Maiolo, L.; Arquilla, M.; Castiello, A.; Pecora, A.; Jiang, X.; Menon, C.; Colace, L. Wearable band for hand gesture recognition based on strain sensors. In Proceedings of the 2016 6th IEEE International Conference on Biomedical Robotics and Biomechatronics, Singapore, 26–29 June 2016; pp. 1319–1322.
- Morris, M.R.; Danielescu, A.; Drucker, S.; Fisher, D.; Lee, B.; Wobbrock, J.O. Reducing legacy bias in gesture elicitation studies. Interactions 2014, 21, 40–45.
- Rahman, M.; Gustafson, S.; Irani, P.; Subramanian, S. Tilt Techniques: Investigating the Dexterity of Wrist-based Input. In Proceedings of the 27th International Conference on Human Factors in Computing Systems—CHI ’09, Boston, MA, USA, 4–9 April 2009; p. 1943.
- Ruiz, J.; Vogel, D. Soft-Constraints to Reduce Legacy and Performance Bias to Elicit Whole-body Gestures with Low Arm Fatigue. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems—CHI ’15, Seoul, Korea, 18–23 April 2015; pp. 3347–3350.
- Vatavu, R.-D.; Pentiuc, S.G. Multi-Level Representation of Gesture as Command for Human Computer Interaction. Comput. Inform. 2008, 27, 837–851.
- Ruiz, J.; Li, Y.; Lank, E. User-defined motion gestures for mobile interaction. In Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems—CHI ’11, Vancouver, BC, Canada, 7–12 May 2011; p. 197.
- Ens, B.; Grossman, T.; Anderson, F.; Matejka, J.; Fitzmaurice, G. Candid Interaction: Revealing Hidden Mobile and Wearable Computing Activities. In Proceedings of the 28th Annual ACM Symposium on User Interface Software and Technology—UIST ’15, Charlotte, NC, USA, 11–15 November 2015; pp. 467–476.
- Morris, M.R. Web on the wall: Insights from a Multimodal Interaction Elicitation Study. In Proceedings of the 2012 ACM International Conference on Interactive Tabletops and Surfaces—ITS ’12, Cambridge, MA, USA, 11–14 November 2012; p. 95.
- Seyedin, S.; Zhang, P.; Naebe, M.; Qin, S.; Chen, J.; Wang, X.; Razal, J.M. Textile strain sensors: A review of the fabrication technologies, performance evaluation and applications. Mater. Horiz. 2019, 6, 219–249.
- Tewari, A.; Gandla, S.; Bohm, S.; McNeill, C.R.; Gupta, D. Highly Exfoliated MWNT-rGO Ink-Wrapped Polyurethane Foam for Piezoresistive Pressure Sensor Applications. ACS Appl. Mater. Interfaces 2018, 10, 5185–5195.
- Bevan, N. Usability is quality of use. Adv. Hum. Factors Ergon. 1995, 20, 349–354.
- Knight, J.F.; Baber, C. A Tool to Assess the Comfort of Wearable Computers. Hum. Factors J. Hum. Factors Ergon. Soc. 2005, 47, 77–91.
- Rico, J.; Brewster, S. Usable gestures for mobile interfaces. In Proceedings of the 28th International Conference on Human Factors in Computing Systems—CHI ’10, Atlanta, GA, USA, 10–15 April 2010; p. 887.
The 27 handheld AR tasks (referents) used in the elicitation study, grouped by category:

User Interface (Navigation) | Object Transformation | Camera
---|---|---
(T1) Select next section | (T11) Move left | (T21) Switch to photo
(T2) Select previous section | (T12) Move right | (T22) Switch to front camera
(T3) Go to next target | (T13) Move closer | (T23) Switch to rear camera
(T4) Go to previous target | (T14) Move further | (T24) Take a photo
(T5) Select the target | (T15) Uniform scale up | (T25) Switch to video
(T6) Long press on the target | (T16) Uniform scale down | (T26) Start video recording
(T7) Scroll up | (T17) Yaw left | (T27) Stop video recording
(T8) Scroll down | (T18) Yaw right |
(T9) Swipe left | (T19) Pitch up |
(T10) Swipe right | (T20) Pitch down |
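When implementing an elicitation session around this task set, it is convenient to keep the referents grouped exactly as in the table. The sketch below is our own illustration, not the authors' study software: it stores the referents by category and produces a randomized presentation order per participant, a common practice in elicitation studies.

```python
import random

# The 27 referent tasks from the table above, keyed by category.
REFERENTS = {
    "User Interface (Navigation)": [
        "(T1) Select next section", "(T2) Select previous section",
        "(T3) Go to next target", "(T4) Go to previous target",
        "(T5) Select the target", "(T6) Long press on the target",
        "(T7) Scroll up", "(T8) Scroll down",
        "(T9) Swipe left", "(T10) Swipe right",
    ],
    "Object Transformation": [
        "(T11) Move left", "(T12) Move right", "(T13) Move closer",
        "(T14) Move further", "(T15) Uniform scale up",
        "(T16) Uniform scale down", "(T17) Yaw left", "(T18) Yaw right",
        "(T19) Pitch up", "(T20) Pitch down",
    ],
    "Camera": [
        "(T21) Switch to photo", "(T22) Switch to front camera",
        "(T23) Switch to rear camera", "(T24) Take a photo",
        "(T25) Switch to video", "(T26) Start video recording",
        "(T27) Stop video recording",
    ],
}

def presentation_order(seed: int) -> list[str]:
    """Flatten the referents and shuffle them for one participant."""
    tasks = [t for group in REFERENTS.values() for t in group]
    random.Random(seed).shuffle(tasks)  # reproducible per-participant order
    return tasks
```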
Agreement rates (AR) for each task, overall and split by participants' prior HAR experience (with/without) and wearable-device use (used/never); p values compare the two subgroups:

Category | Task | AR | With HAR Experience | Without HAR Experience | p | Used Wearable Device | Never Used Wearable Device | p
---|---|---|---|---|---|---|---|---
Discrete | (T1) Select next section | 0.328 | 0.307 | 0.371 | 0.257 | 0.392 | 0.264 | 0.057
 | (T2) Select previous section | 0.167 | 0.150 | 0.171 | 0.700 | 0.187 | 0.132 | 0.354
 | (T3) Go to next target | 0.110 | 0.078 | 0.114 | 0.510 | 0.129 | 0.077 | 0.390
 | (T4) Go to previous target | 0.110 | 0.085 | 0.114 | 0.588 | 0.123 | 0.077 | 0.440
 | (T5) Select the target | 0.205 | 0.144 | 0.295 | 0.020 | 0.228 | 0.143 | 0.177
 | (T6) Long press on the target | 0.210 | 0.124 | 0.343 | 0.003 | 0.234 | 0.154 | 0.201
 | (T21) Switch to photo | 0.053 | 0.033 | 0.057 | 0.656 | 0.041 | 0.044 | 0.959
 | (T22) Switch to front camera | 0.106 | 0.092 | 0.095 | 0.845 | 0.129 | 0.088 | 0.494
 | (T23) Switch to rear camera | 0.106 | 0.065 | 0.114 | 0.375 | 0.123 | 0.088 | 0.553
 | (T24) Take a photo | 0.182 | 0.190 | 0.210 | 0.710 | 0.175 | 0.154 | 0.708
 | (T25) Switch to video | 0.064 | 0.059 | 0.038 | 0.701 | 0.058 | 0.044 | 0.801
 | (T26) Start video recording | 0.133 | 0.118 | 0.210 | 0.120 | 0.105 | 0.154 | 0.417
 | (T27) Stop video recording | 0.172 | 0.170 | 0.257 | 0.178 | 0.105 | 0.275 | 0.019
Continuous | (T7) Scroll up | 0.110 | 0.131 | 0.067 | 0.258 | 0.105 | 0.132 | 0.650
 | (T8) Scroll down | 0.178 | 0.229 | 0.124 | 0.082 | 0.193 | 0.165 | 0.630
 | (T9) Swipe left | 0.189 | 0.190 | 0.162 | 0.616 | 0.240 | 0.154 | 0.175
 | (T10) Swipe right | 0.184 | 0.190 | 0.152 | 0.499 | 0.240 | 0.143 | 0.133
 | (T11) Move left | 0.220 | 0.255 | 0.152 | 0.088 | 0.181 | 0.242 | 0.316
 | (T12) Move right | 0.227 | 0.301 | 0.133 | 0.012 | 0.193 | 0.231 | 0.520
 | (T13) Move closer | 0.138 | 0.124 | 0.152 | 0.610 | 0.158 | 0.088 | 0.255
 | (T14) Move further | 0.138 | 0.144 | 0.105 | 0.475 | 0.135 | 0.121 | 0.813
 | (T15) Uniform scale up | 0.100 | 0.052 | 0.171 | 0.052 | 0.123 | 0.066 | 0.344
 | (T16) Uniform scale down | 0.123 | 0.085 | 0.133 | 0.382 | 0.117 | 0.110 | 0.904
 | (T17) Yaw left | 0.102 | 0.072 | 0.124 | 0.350 | 0.129 | 0.077 | 0.390
 | (T18) Yaw right | 0.102 | 0.072 | 0.124 | 0.350 | 0.129 | 0.077 | 0.390
 | (T19) Pitch up | 0.053 | 0.052 | 0.095 | 0.433 | 0.058 | 0.088 | 0.619
 | (T20) Pitch down | 0.098 | 0.092 | 0.086 | 0.985 | 0.099 | 0.088 | 0.841
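The AR column is an agreement rate in the sense of Vatavu and Wobbrock's CHI ’15 formalization (cited in the reference list above): AR(r) = |P|/(|P|−1) · Σ(|Pi|/|P|)² − 1/(|P|−1), where P is the set of proposals elicited for referent r and the Pi are its groups of identical proposals. The helper below is a minimal sketch of that formula, assuming this is the measure behind the table; the between-group p values would come from the significance test in the same toolkit, which is not reproduced here, and the gesture labels in the example are invented.

```python
from collections import Counter

def agreement_rate(proposals: list[str]) -> float:
    """Agreement rate AR(r) for one referent (Vatavu & Wobbrock, CHI '15).

    `proposals` holds one gesture label per participant, e.g.
    ["tap-b1", "tap-b1", "flex-down", ...].
    """
    n = len(proposals)
    if n < 2:
        raise ValueError("Agreement needs at least two proposals.")
    groups = Counter(proposals)  # sizes |Pi| of identical-proposal groups
    sum_sq = sum((size / n) ** 2 for size in groups.values())
    return n / (n - 1) * sum_sq - 1 / (n - 1)

# Toy check: 20 participants; 12 agree on one gesture, 5 on another, 3 differ.
labels = ["tap-b1"] * 12 + ["flex-down"] * 5 + ["other"] * 3
print(round(agreement_rate(labels), 3))  # 0.416
```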
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).