Proceeding Paper

Smarty Pants: Exploring Textile Pressure Sensors in Trousers for Posture and Behaviour Classification †

Sophie Skach, Rebecca Stewart and Patrick G. T. Healey
1 School of Electronic Engineering and Computer Science, Queen Mary University of London, London E1 4NS, UK
2 Dyson School of Design Engineering, Imperial College London, London SW7 2DB, UK
* Author to whom correspondence should be addressed.
Presented at the International Conference on the Challenges, Opportunities, Innovations and Applications in Electronic Textiles (E-Textiles 2019), London, UK, 12 November 2019.
Proceedings 2019, 32(1), 19; https://doi.org/10.3390/proceedings2019032019
Published: 30 December 2019

Abstract

In this paper, we introduce a new modality for capturing body postures and social behaviour and, conversely, propose a new application area for on-body textile sensors. We have developed “smart trousers” with embedded textile pressure sensors that allow a large variety of postural movements and interactional states to be classified. Random Forest models are used to classify these postures and behaviours from the pressure data. Here, we give an overview of the research conducted and discuss potential use cases of the presented design.

1. Introduction

The ubiquitous integration of technology into our day-to-day lives has long been established. Smart environments [1] and interior designs [2,3] have contributed to imagining a fully connected, Internet of Things inspired life. A central part of such sensing networks is taking measurements of the human body itself. Often, this is achieved by collecting data through objects we come in touch with or encounter, transforming them into sensing surfaces such as seat covers [2] or even ceiling lights [4]. A more dynamic and flexible approach to human-centred sensing is wearable computing, using mobile phones, smart watches, or other accessories to track bodily actions continuously.
The range of applications for these technologies is wide. They are often presented in the context of novel interface designs in Human–Computer Interaction, but are also used in health and medical care for patient monitoring or rehabilitation, as well as in the social sciences to explore the interactional dynamics of individuals and groups [5,6]. There are, however, problems with such sensing systems. Many of them are obtrusive and alter our natural environment. Furthermore, many devices record more data than necessary, raising problems of privacy and of obtaining subjects' consent. One modality that has proven to overcome many of these concerns is textile sensing. Utilising this material, with which we are so familiar and by which we are surrounded in many situations, has gained increasing interest in the past decade. Embedded in pillows [7], tablecloths [8], or garments [9,10], electronic textiles (e-textiles) can replace rigid components and capture bodily movement in a novel, holistic way on which other technologies have had to compromise.
In our work, we aim to establish smart clothing as a novel method to capture social interaction. We focus on measuring non-verbal behaviours like postural movement and touch interactions through textile pressure sensors in custom-made trousers. Here, we present an overview of this research and report on the design engineering, verification, and findings of our “smart trousers”.

2. Designing Sensing Trousers

The way we move in social interaction (and in general) is, in part, determined by the affordances of our clothes. Certain materials and pattern cuts allow or restrict bodily actions. This has to be accounted for when designing textile sensors, too. Shape, number, position, and type of sensors should follow the purpose of the determined use case. Additionally, the integration of these sensors in a garment also requires specific manufacturing considerations and design principles.
The starting point for addressing these aspects was a set of ethnographic studies of seated, spontaneous conversations between people. The findings of these observations informed hypotheses on postures linked to social behaviours, for example identifying typical movements of speakers compared to listeners.
The shell of the trousers is made of a conventional fabric, a viscose–cotton mix jersey knit, for a legging-like design and good wearing comfort. The embedded sensing system uses conductive fabrics as well as knitted single jerseys with different properties. We used a resistive fabric from Eeonyx (EeonTex LTT-SLPA Conductive Stretchable Fabric) for the middle layer of the pressure-sensing area, and highly conductive nylon strips to build the connections (see top right of Figure 1). The fabrics were arranged to form a 10 × 10 sensor matrix in each leg of the trousers. This creates 100 data points per leg, mapped around the thighs and buttocks, as shown on the left side of Figure 1. Each crossing point of the rows and columns of the matrix can be treated as a separate pressure sensor.
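As an illustration of how this data structure can be handled downstream, the following minimal Python sketch flattens the two 10 × 10 grids into a single feature vector. The array names, shapes, and value ranges are assumptions for illustration and are not taken from the authors' implementation.

```python
import numpy as np

# One pressure frame per leg: a 10 x 10 grid of readings, one value per
# row/column crossing. Names, shapes, and value ranges are illustrative.
N_ROWS, N_COLS = 10, 10

def frame_to_features(left_leg: np.ndarray, right_leg: np.ndarray) -> np.ndarray:
    """Flatten the two 10 x 10 pressure grids into a single 200-dimensional
    feature vector (100 data points per leg) for later classification."""
    assert left_leg.shape == (N_ROWS, N_COLS)
    assert right_leg.shape == (N_ROWS, N_COLS)
    return np.concatenate([left_leg.ravel(), right_leg.ravel()])

# Example with simulated readings in arbitrary ADC units.
rng = np.random.default_rng(0)
left = rng.uniform(0, 1023, (N_ROWS, N_COLS))
right = rng.uniform(0, 1023, (N_ROWS, N_COLS))
features = frame_to_features(left, right)   # shape: (200,)
```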
The sensors are connected to a battery-powered microcontroller that logs their data locally to a micro SD card. The circuit board is hidden in the hem of the trousers to be as unobtrusive as possible for the wearer. Pressure readings were collected at 4 Hz, following the principle of [11]. For further pre-processing and the final analyses, the sensor data was normalised.
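The paper does not detail the normalisation scheme, so the sketch below assumes a simple per-sensor min–max scaling over a recording; the data shapes and function names are illustrative only.

```python
import numpy as np

def normalise_recording(frames: np.ndarray) -> np.ndarray:
    """Normalise a recording of pressure frames to the range [0, 1].

    `frames` has shape (T, 200): T time steps sampled at 4 Hz, each a
    flattened frame of 200 sensor values (both legs). Scaling is done per
    sensor, so baseline differences between crossings are removed.
    """
    lo = frames.min(axis=0)
    hi = frames.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)   # avoid division by zero
    return (frames - lo) / span

# Example: 60 s of data at 4 Hz -> 240 frames of 200 sensor values.
rng = np.random.default_rng(1)
raw = rng.uniform(0, 1023, (240, 200))
norm = normalise_recording(raw)
```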
For the integration of this sensing system, the trousers were tailored to conceal all electronic components and wiring in an unobtrusive way. Panels on the inside leg were added for wiring and connections to the microcontroller, different clothing sizes were accounted for, and the fabric layering was arranged for optimised wearing comfort. Details are shown in Figure 1 (left). A more detailed description of the design process can be found in [12].
Figure 1. Left: Inside and outside view of “smart trousers” with the layer of textile pressure sensors embedded around thighs and buttocks on upper legs. Right: Inside of the matrix, showing the distribution of rows and columns as well as the layering of the pressure sensors.

3. Methodology

The performance of our “smart trousers” as a method to capture conversational postures was tested in two user studies. First, we evaluated how well the trousers can distinguish a large variety of static sitting postures. Single users were instructed to perform a set of 19 postures combining positions of the upper and lower legs, hand touch on the legs, and the corresponding positions of the upper body. A visualisation of two leg-crossing postures is shown as an example in Figure 2.
Secondly, we looked at seated multiparty conversations, focusing on behavioural cues that do not necessarily correlate with marked postural movement but that determine conversational states. In particular, we evaluated the following against each other: talking, listening, laughing, nodding, and backchanneling. This was examined in three-way conversations recorded and processed in a similar way to the first user study. An overview and comparison of both studies is shown in Table 1.
Figure 2. Simple visualisation of pressure distribution for two leg crossing sitting postures. Each circle represents a data point of the textile sensor matrix in the trousers.
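A minimal sketch of how a circle-based visualisation like Figure 2 could be reproduced in Python with matplotlib is given below; the coordinates, circle sizes, and colour mapping are illustrative assumptions rather than the authors' plotting code.

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_pressure_frame(frame: np.ndarray, title: str = "") -> None:
    """Draw one 10 x 10 pressure frame as a grid of circles, where circle
    size and colour scale with the normalised pressure at that sensor."""
    rows, cols = np.meshgrid(np.arange(frame.shape[0]),
                             np.arange(frame.shape[1]), indexing="ij")
    plt.figure(figsize=(4, 4))
    plt.scatter(cols.ravel(), rows.ravel(),
                s=20 + 300 * frame.ravel(), c=frame.ravel(), cmap="viridis")
    plt.gca().invert_yaxis()   # row 0 at the top, matching the matrix layout
    plt.title(title)
    plt.axis("off")
    plt.show()

# Example: visualise a simulated, normalised frame for one leg.
rng = np.random.default_rng(2)
plot_pressure_frame(rng.uniform(0, 1, (10, 10)), title="Right leg, legs crossed")
```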
Table 1. Overview of the settings used in the two studies in which the smart trousers were tested.
Study | Mode | Participants | Set Up | Software | Classification | Classes
Study 1: Postures | single user | 6 | controlled | Weka, ELAN | Random Forest | 19
Study 2: Behaviours | multi-party | 20 | spontaneous | Weka, ELAN | Random Forest | 4 + 1

4. Results

Both studies were video recorded and annotated using ELAN [13]. The sensor data was time-aligned with the annotations of the video recordings, marking the gross events we examine here, i.e., postures for the first study and conversational cues for the second.
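As an illustration of this alignment step, the sketch below assumes the ELAN annotations are available as (start, end, label) intervals in seconds and assigns each 4 Hz sensor frame the label of the interval it falls into. This is an assumed, simplified pipeline rather than the authors' actual processing code.

```python
import numpy as np

def label_frames(timestamps: np.ndarray,
                 annotations: list[tuple[float, float, str]]) -> list[str]:
    """Assign each sensor frame the label of the annotation interval it falls
    into, or 'none' if no interval covers that timestamp.

    `timestamps` are frame times in seconds (4 Hz sampling); `annotations`
    are (start, end, label) tuples, e.g. exported from ELAN.
    """
    labels = []
    for t in timestamps:
        match = next((lab for start, end, lab in annotations if start <= t < end), "none")
        labels.append(match)
    return labels

# Example: 10 s of frames at 4 Hz, with two annotated events.
ts = np.arange(0, 10, 0.25)
ann = [(1.0, 4.5, "talking"), (5.0, 7.25, "laughing")]
frame_labels = label_frames(ts, ann)
```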
In both studies, we used machine learning techniques to detect the determined postures and behaviours. Random Forest classification with 10-fold cross-validation was used to test individual and community models. Average accuracies and F-measures were notably better overall for individual participants than for groups. Moreover, posture classification showed significantly better results than behaviour classification, with up to 99.75% accuracy for individual models and 64.26% for community models. In summary, this aligns with our preliminary expectations of this new sensing system. While there is room for improvement both in the analysis methods and in the data processing, we were able to show that simple textile sensors in trousers are a valid method for capturing a large range of human behaviour.
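The classification itself was run in Weka; an analogous pipeline sketched in Python with scikit-learn, using simulated data in place of the recorded frames, might look as follows. Hyperparameters, data shapes, and values are illustrative assumptions, not the reported set-up.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Illustrative data: 2000 labelled frames of 200 normalised sensor values,
# each assigned one of 19 posture classes (values are simulated here).
rng = np.random.default_rng(3)
X = rng.uniform(0, 1, (2000, 200))
y = rng.integers(0, 19, 2000)

# Random Forest evaluated with 10-fold cross-validation, mirroring the
# evaluation scheme reported in the paper (defaults used for the forest).
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
print(f"mean accuracy: {scores.mean():.4f} (+/- {scores.std():.4f})")
```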

5. Discussion

In summary, the results of both studies show that there is large variation across participants, leading to poorer results in community models. Nevertheless, there are clearly recurring patterns of movement and pressure distribution within the same classes of posture or behaviour. The conclusion to draw from this is that we move similarly to ourselves, but there can be significant differences in how different people perform the same movements. This finding can be used for future applications that focus on individuals, for example patients in rehabilitation or athletes' customised training programmes. With a larger data set, the performance of community models could be improved as well, leaving room for further use cases assessing the social dynamics of groups through smart clothing, such as detecting the behavioural actions of new interactants.
In this research, we have explored the potential of one modality: pressure sensing. However, there are wearable sensor designs that offer a more hybrid approach, additionally capturing proximity or other touch interactions. Further iterations on the design of the sensor matrix used here, regarding materials, embedding in trousers, or optimised pattern cutting, are proposed in [14] and can contribute towards a more fine-grained and even more connected sensing network for gathering bodily data. In addition, if trousers can tell whether someone is speaking or not without recording video or audio, other pieces of clothing may be able to capture more fine-grained and complex behaviours, and could be validated as a method to understand non-verbal behaviour in even more detail.

Funding

This research was funded by the EPSRC and AHRC Centre for Doctoral Training in Media and Arts Technology (EP/L01632X/1).

Acknowledgments

The authors would like to thank colleagues for their valuable support and suggestions. In particular, we thank Adan Benito Temprano and Tom Gurion for their help with data processing. Furthermore, we thank all volunteers for participating in our studies.

References

  1. Olivier, P.; Xu, G.; Monk, A.; Hoey, J. Ambient kitchen: Designing situated services using a high fidelity prototyping environment. In Proceedings of the 2nd International Conference on Pervasive Technologies Related to Assistive Environments, Corfu, Greece, 9–13 June 2009; ACM: New York, NY, USA, 2009; p. 47. [Google Scholar]
  2. Skach, S.; Healey, P.G.; Stewart, R. Talking Through Your Arse: Sensing Conversation with Seat Covers. In Proceedings of the 39th Annual Meeting of the Cognitive Science Society (CogSci), London, UK, 26–29 July 2017. [Google Scholar]
  3. Bleda, A.L.; Fernández-Luque, F.J.; Rosa, A.; Zapata, J.; Maestre, R. Smart sensory furniture based on WSN for ambient assisted living. IEEE Sens. J. 2017, 17, 5626–5636. [Google Scholar] [CrossRef]
  4. Li, T.; An, C.; Tian, Z.; Campbell, A.T.; Zhou, X. Human sensing using visible light communication. In Proceedings of the 21st Annual International Conference on Mobile Computing and Networking, Paris, France, 7–11 September 2015; ACM: New York, NY, USA, 2015; pp. 331–344. [Google Scholar]
  5. Choudhury, T.; Pentland, A. Sensing and modeling human networks using the sociometer. In Proceedings of the Seventh IEEE International Symposium on Wearable Computers, White Plains, New York, USA, 21–23 October 2003; IEEE: Piscataway, NJ, USA, 2003; pp. 216–222. [Google Scholar]
  6. Katevas, K.; Hänsel, K.; Clegg, R.; Leontiadis, I.; Haddadi, H.; Tokarchuk, L. Finding Dory in the Crowd: Detecting Social Interactions using Multi-Modal Mobile Sensing. arXiv 2018, arXiv:1809.00947. [Google Scholar]
  7. Lokavee, S.; Puntheeranurak, T.; Kerdcharoen, T.; Watthanwisuth, N.; Tuantranont, A. Sensor pillow and bed sheet system: Unconstrained monitoring of respiration rate and posture movements during sleep. In Proceedings of the 2012 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Seoul, Korea, 14–17 October 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 1564–1568. [Google Scholar]
  8. Gaver, W.; Bowers, J.; Boucher, A.; Law, A.; Pennington, S.; Villar, N. The history tablecloth: Illuminating domestic activity. In Proceedings of the 6th Conference on Designing Interactive Systems, University Park, PA, USA, 26–28 June 2006; ACM: New York, NY, USA, 2006; pp. 199–208. [Google Scholar]
  9. Gioberto, G.; Compton, C.; Dunne, L. Machine-Stitched E-Textile Stretch Sensors. Sens. Transducers J. 2016, 202, 25–37. [Google Scholar]
  10. Perner-Wilson, H.; Buechley, L.; Satomi, M. Handcrafting textile interfaces from a kit-of-no-parts. In Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction, Funchal, Portugal, 22–26 January 2011; ACM: New York, NY, USA, 2011; pp. 61–68. [Google Scholar]
  11. Donneaud, M.; Honnet, C.; Strohmeier, P. Designing a multi-touch etextile for music performances. In Proceedings of the NIME, Aalborg University, Copenhagen, Denmark, 15–18 May 2017; pp. 7–12. [Google Scholar]
  12. Skach, S.; Stewart, R.; Healey, P.G.T. Smart Arse: Posture Classification with Textile Sensors in Trousers. In Proceedings of the 20th ACM International Conference on Multimodal Interaction (ICMI ’18), Boulder, CO, USA, 16–20 October 2018; ACM: New York, NY, USA, 2018; pp. 116–124. [Google Scholar] [CrossRef]
  13. Brugman, H.; Russel, A. Annotating Multi-media/Multi-modal Resources with ELAN. In Proceedings of the LREC, Lisbon, Portugal, 26–28 May 2004. [Google Scholar]
  14. Skach, S.; Stewart, R. One Leg at a Time: Towards Optimised Design Engineering of Textile Sensors in Trousers. In Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, (UbiComp/ISWC ’19) Adjunct, London, UK, 9–13 September 2019; ACM: New York, NY, USA, 2019; pp. 206–209. [Google Scholar] [CrossRef]
