Recognizing Solo Jazz Dance Moves Using a Single Leg-Attached Inertial Wearable Device
Abstract
1. Introduction
2. Materials and Methods
2.1. Data Acquisition
2.1.1. Materials
2.1.2. Measurements
2.2. Signal Processing Overview
- Dance tempo estimation and signal temporal scaling, achieved using a bank of enhanced comb filters, as presented in [39];
- Initial template matching, performed on a sliding correlation basis using the magnitudes of the temporally scaled acceleration and angular velocity;
- Signal transformation to the templates’ coordinate system; and
- Final template matching, performed again on a sliding correlation basis, but using the 3D projections of the acceleration and angular velocity on the template coordinate system axes instead of their magnitudes.
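Both template-matching steps above rely on sliding correlation between a recorded signal and a stored move template. The following is a minimal sketch of such a matcher on a one-dimensional magnitude signal; the function name, toy signal, and embedding position are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sliding_correlation(signal, template):
    """Pearson correlation of the template against every window of
    the signal (illustrative sketch, not the paper's exact method)."""
    n = len(template)
    # Pre-normalize the template so each score is a correlation in [-1, 1].
    t = (template - template.mean()) / (template.std() * n)
    scores = np.empty(len(signal) - n + 1)
    for i in range(len(scores)):
        w = signal[i:i + n]
        scores[i] = np.sum(t * (w - w.mean())) / (w.std() + 1e-12)
    return scores

# Toy magnitude signal with one hypothetical "move" template embedded at sample 100.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 2 * np.pi, 50))
signal = rng.normal(0, 0.2, 300)
signal[100:150] += template

scores = sliding_correlation(signal, template)
best = int(np.argmax(scores))  # offset of the best match, near 100
```

In the final matching step, the same idea extends to the 3D case by correlating the three axis projections jointly rather than a single magnitude trace.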
2.3. Signal Pre-Processing
2.4. Templates’ Database
2.4.1. Template Extraction
2.4.2. Templates Similarity Measures
2.5. Dance Move Recognition
2.5.1. Dance Tempo Estimation and Temporal Scaling
2.5.2. Initial Template Matching
2.5.3. Signal Transformation
2.5.4. Final Template Matching
2.6. Recognition Performance Assessment
3. Results and Discussion
3.1. Database of Template Moves
3.2. Dance Move Recognition
3.2.1. Validation Using the Professional Dancer’s Test Sequences
3.2.2. Validation Using the Recreational Dancer’s Test Sequences
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Appendix A.1. Signal Matrices Similarity
Appendix A.2. Dance Move Repetition Extraction
Appendix A.3. Magnitudes Similarity
Appendix A.4. Projections Similarity
Appendix A.5. Initial Template Matching
Appendix A.6. Final Template Matching
Appendix B
Orientation Independent Transformation
References
- Kyan, M.; Sun, G.; Li, H.; Zhong, L.; Muneesawang, P.; Dong, N.; Elder, B.; Guan, L. An Approach to Ballet Dance Training through MS Kinect and Visualization in a CAVE Virtual Reality Environment. ACM Trans. Intell. Syst. Technol. 2015, 6, 1–37.
- Aich, A.; Mallick, T.; Bhuyan, H.B.G.S.; Das, P.; Majumdar, A.K. NrityaGuru: A Dance Tutoring System for Bharatanatyam using Kinect. In Computer Vision, Pattern Recognition, Image Processing, and Graphics; Rameshan, R., Arora, C., Dutta Roy, S., Eds.; Springer: Singapore, 2018; pp. 481–493.
- Dos Santos, A.D.P.; Yacef, K.; Martinez-Maldonado, R. Let’s dance: How to build a user model for dance students using wearable technology. In Proceedings of the 25th Conference on User Modeling, Adaptation and Personalization, Bratislava, Slovakia, 9–12 July 2017; pp. 183–191.
- Drobny, D.; Weiss, M.; Borchers, J. Saltate!: A sensor-based system to support dance beginners. In Proceedings of the 27th Annual CHI Conference on Human Factors in Computing Systems, Boston, MA, USA, 4–9 April 2009; pp. 3943–3948.
- Romano, G.; Schneider, J.; Drachsler, H. Dancing Salsa with Machines—Filling the Gap of Dancing Learning Solutions. Sensors 2019, 19, 3661.
- Ofli, F.; Erzin, E.; Yemez, Y.; Tekalp, A.M.; Erdem, C.E.; Erdem, A.T.; Abaci, T.; Ozkan, M.K. Unsupervised dance figure analysis from video for dancing Avatar animation. In Proceedings of the 15th IEEE International Conference on Image Processing, San Diego, CA, USA, 12–15 October 2008; pp. 1484–1487.
- Yamane, R.; Shakunaga, T. Dance motion analysis by correlation matrix between pose sequences. In Proceedings of the 25th International Conference of Image and Vision Computing, Queenstown, New Zealand, 8–9 November 2010; pp. 1–6.
- Shikanai, N.; Hachimura, K. The Effects of the Presence of an Audience on the Emotions and Movements of Dancers. Procedia Technol. 2014, 18, 32–36.
- Shikanai, N. Relations between Femininity and the Movements in Japanese Traditional Dance. In Proceedings of the IEEE International Conference on Consumer Electronics–Asia (ICCE–Asia), Bangkok, Thailand, 12–14 June 2019; pp. 146–148.
- Kim, D.; Kim, D.H.; Kwak, K.C. Classification of K-Pop Dance Movements Based on Skeleton Information Obtained by a Kinect Sensor. Sensors 2017, 17, 1261.
- Bakalos, N.; Protopapadakis, E.; Doulamis, A.; Doulamis, N. Dance Posture/Steps Classification Using 3D Joints from the Kinect Sensors. In Proceedings of the IEEE 16th International Conference on Dependable, Autonomic and Secure Computing, 16th International Conference on Pervasive Intelligence and Computing, 4th International Conference on Big Data Intelligence and Computing and Cyber Science and Technology Congress (DASC/PiCom/DataCom/CyberSciTech), Athens, Greece, 12–15 August 2018; pp. 868–873.
- Ho, C.; Tsai, W.; Lin, K.; Chen, H.H. Extraction and alignment evaluation of motion beats for street dance. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, Vancouver, BC, Canada, 26–31 May 2013; pp. 2429–2433.
- Cornacchia, M.; Ozcan, K.; Zheng, Y.; Velipasalar, S. A Survey on Activity Detection and Classification Using Wearable Sensors. IEEE Sens. J. 2017, 17, 386–403.
- Siirtola, P.; Röning, J. Recognizing Human Activities User-independently on Smartphones Based on Accelerometer Data. Int. J. Artif. Intell. Interact. Multimed. 2012, 1, 38–45.
- Long, X.; Yin, B.; Aarts, R.M. Single-accelerometer-based daily physical activity classification. In Proceedings of the 31st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC 2009), Minneapolis, MN, USA, 3–6 September 2009; pp. 6107–6110.
- Lara, O.D.; Labrador, M.A. A Survey on Human Activity Recognition using Wearable Sensors. IEEE Commun. Surv. Tutor. 2013, 15, 1192–1209.
- Niemann, F.; Reining, C.; Moya Rueda, F.; Nair, N.R.; Steffens, J.A.; Fink, G.A.; ten Hompel, M. LARa: Creating a Dataset for Human Activity Recognition in Logistics Using Semantic Attributes. Sensors 2020, 20, 4083.
- Sousa Lima, W.; Souto, E.; El-Khatib, K.; Jalali, R.; Gama, J. Human Activity Recognition Using Inertial Sensors in a Smartphone: An Overview. Sensors 2019, 19, 3213.
- Shoaib, M.; Bosch, S.; Incel, O.D.; Scholten, H.; Havinga, P.J.M. Complex Human Activity Recognition Using Smartphone and Wrist-Worn Motion Sensors. Sensors 2016, 16, 426.
- Gadaleta, M.; Rossi, M. IDNet: Smartphone-based gait recognition with convolutional neural networks. Pattern Recognit. 2018, 74, 25–37.
- Sprager, S.; Juric, M.B. Inertial Sensor-Based Gait Recognition: A Review. Sensors 2015, 15, 22089–22127.
- Cola, G.; Avvenuti, M.; Vecchio, A. Real-time identification using gait pattern analysis on a standalone wearable accelerometer. Comput. J. 2017, 60, 1173–1186.
- Junker, H.; Amft, O.; Lukowicz, P.; Tröster, G. Gesture spotting with body-worn inertial sensors to detect user activities. Pattern Recognit. 2008, 41, 2010–2024.
- Stančin, S.; Tomažič, S. Early Improper Motion Detection in Golf Swings Using Wearable Motion Sensors: The First Approach. Sensors 2013, 13, 7505–7521.
- Martínez, A.; Jahnel, R.; Buchecker, M.; Snyder, C.; Brunauer, R.; Stöggl, T. Development of an Automatic Alpine Skiing Turn Detection Algorithm Based on a Simple Sensor Setup. Sensors 2019, 19, 902.
- Kos, A.; Umek, A. Smart sport equipment: SmartSki prototype for biofeedback applications in skiing. Pers. Ubiquitous Comput. 2018, 22, 535–544.
- Benages Pardo, L.; Buldain Perez, D.; Orrite Uruñuela, C. Detection of Tennis Activities with Wearable Sensors. Sensors 2019, 19, 5004.
- Dadashi, F.; Millet, G.P.; Aminian, K. A Bayesian approach for pervasive estimation of breaststroke velocity using a wearable IMU. Pervasive Mob. Comput. 2015, 19, 37–46.
- Ghasemzadeh, H.; Jafari, R. Coordination Analysis of Human Movements With Body Sensor Networks: A Signal Processing Model to Evaluate Baseball Swings. IEEE Sens. J. 2011, 3, 603–610.
- Paradiso, J.A.; Hsiao, K.; Benbasat, A.Y.; Teegarden, Z. Design and implementation of expressive footwear. IBM Syst. J. 2000, 39, 511–529.
- Aylward, R.; Lovell, S.D.; Paradiso, J.A. A Compact, Wireless, Wearable Sensor Network for Interactive Dance Ensembles. In Proceedings of the International Workshop on Wearable and Implantable Body Sensor Networks, Cambridge, MA, USA, 3–5 April 2006.
- Dang, Q.K.; Pham, D.D.; Suh, Y.S. Dance training system using foot mounted sensors. In Proceedings of the 2015 IEEE/SICE International Symposium on System Integration (SII), Nagoya, Japan, 11–13 December 2015; pp. 732–737.
- Kim, Y.; Jung, D.; Park, S.; Chi, J.; Kim, T.; Lee, S. The Shadow Dancer: A New Dance Interface with Interactive Shoes. In Proceedings of the 2008 International Conference on Cyberworlds, Hangzhou, China, 22–24 September 2008; pp. 745–748.
- Tragtenberg, J.; Calegario, F.; Cabral, G.; Ramalho, G. TumTá and Pisada: Two Foot-controlled Digital Dance and Music Instruments Inspired by Popular Brazilian Traditions. In Proceedings of the 17th Brazilian Symposium on Computer Music (SBCM 2019), São João del-Rei, Brazil, 25–27 September 2019.
- Yamaguchi, T.; Ariga, A.; Kobayashi, T.; Hashimoto, S. TwinkleBall: A Wireless Musical Interface for Embodied Sound Media. In Proceedings of the New Interfaces for Musical Expression (NIME 2010), Sydney, Australia, 15–18 June 2010; pp. 116–119.
- Samprita, S.; Koshy, A.S.; Megharjun, V.N.; Talasila, V. LSTM-Based Analysis of a Hip-Hop Movement. In Proceedings of the 2020 6th International Conference on Control, Automation and Robotics (ICCAR), Singapore, 20–23 April 2020; pp. 519–524.
- Hinton-Lewis, C.W.; McDonough, E.; Moyle, G.M.; Thiel, D.V. An Assessment of Postural Sway in Ballet Dancers During First Position, Relevé and Sauté with Accelerometers. Procedia Eng. 2016, 147, 127–132.
- Thiel, D.V.; Quandt, J.; Carter, S.J.L.; Moyle, G.M. Accelerometer based performance assessment of basic routines in classical ballet. Procedia Eng. 2014, 72, 14–19.
- Stančin, S.; Tomažič, S. Dance Tempo Estimation Using a Single Leg-Attached 3D Accelerometer. Sensors 2021, 21, 8066.
- Mbientlab MMR. Available online: https://mbientlab.com/metamotionr/ (accessed on 21 December 2021).
- Alphabetical Jazz Steps 3. Available online: https://www.youtube.com/watch?v=jAIwJd2tQo0&list=PLpLDojUPSMvcYMA7jEFPidEbSD2-vNz8m (accessed on 21 December 2021).
- MATLAB; Version 9.11.0.1769968 (R2021b); The MathWorks Inc.: Natick, MA, USA, 2021.
- Stančin, S.; Tomažič, S. Time- and Computation-Efficient Calibration of MEMS 3D Accelerometers and Gyroscopes. Sensors 2014, 14, 14885–14915.
Move | (1) | (2) | (3) | (4) | (5) | (6) | (7) | (8) | (9) | (10) | (11) | (12) | (13) | (14) | (15) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
(1) | 1.00 | 0.22 | 0.45 | 0.29 | 0.39 | 0.29 | 0.37 | 0.27 | 0.35 | 0.35 | 0.32 | 0.46 | 0.27 | 0.31 | 0.31 |
(2) | | 1.00 | 0.19 | 0.21 | 0.35 | 0.47 | 0.29 | 0.44 | 0.34 | 0.28 | 0.33 | 0.33 | 0.27 | 0.30 | 0.37 |
(3) | | | 1.00 | 0.42 | 0.39 | 0.28 | 0.30 | 0.33 | 0.35 | 0.26 | 0.36 | 0.55 | 0.33 | 0.43 | 0.30 |
(4) | | | | 1.00 | 0.37 | 0.29 | 0.31 | 0.36 | 0.39 | 0.24 | 0.43 | 0.55 | 0.32 | 0.27 | 0.43 |
(5) | | | | | 1.00 | 0.33 | 0.37 | 0.37 | 0.29 | 0.26 | 0.29 | 0.48 | 0.23 | 0.48 | 0.38 |
(6) | | | | | | 1.00 | 0.38 | 0.42 | 0.33 | 0.32 | 0.44 | 0.38 | 0.24 | 0.36 | 0.44 |
(7) | | | | | | | 1.00 | 0.61 | 0.20 | 0.37 | 0.31 | 0.32 | 0.25 | 0.33 | 0.28 |
(8) | | | | | | | | 1.00 | 0.27 | 0.34 | 0.47 | 0.39 | 0.34 | 0.30 | 0.42 |
(9) | | | | | | | | | 1.00 | 0.22 | 0.29 | 0.24 | 0.30 | 0.43 | 0.31 |
(10) | | | | | | | | | | 1.00 | 0.29 | 0.35 | 0.31 | 0.27 | 0.30 |
(11) | | | | | | | | | | | 1.00 | 0.47 | 0.54 | 0.26 | 0.35 |
(12) | | | | | | | | | | | | 1.00 | 0.39 | 0.31 | 0.43 |
(13) | | | | | | | | | | | | | 1.00 | 0.19 | 0.32 |
(14) | | | | | | | | | | | | | | 1.00 | 0.43 |
(15) | | | | | | | | | | | | | | | 1.00 |
Move | (1) | (2) | (3) | (4) | (5) | (6) | (7) | (8) | (9) | (10) | (11) | (12) | (13) | (14) | (15) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
(1) | 1.00 | 0.18 | 0.28 | 0.21 | 0.23 | 0.15 | 0.23 | 0.15 | 0.26 | 0.32 | 0.25 | 0.19 | 0.10 | 0.23 | 0.25 |
(2) | | 1.00 | 0.22 | 0.10 | 0.22 | 0.23 | 0.20 | 0.23 | 0.12 | 0.21 | 0.15 | 0.22 | 0.14 | 0.29 | 0.19 |
(3) | | | 1.00 | 0.21 | 0.40 | 0.18 | 0.25 | 0.10 | 0.19 | 0.186 | 0.18 | 0.32 | 0.13 | 0.38 | 0.18 |
(4) | | | | 1.00 | 0.26 | 0.19 | 0.13 | 0.12 | 0.26 | 0.17 | 0.43 | 0.34 | 0.11 | 0.16 | 0.40 |
(5) | | | | | 1.00 | 0.23 | 0.35 | 0.14 | 0.17 | 0.20 | 0.26 | 0.33 | 0.10 | 0.34 | 0.25 |
(6) | | | | | | 1.00 | 0.33 | 0.22 | 0.17 | 0.18 | 0.18 | 0.16 | 0.13 | 0.23 | 0.20 |
(7) | | | | | | | 1.00 | 0.21 | 0.11 | 0.23 | 0.25 | 0.15 | 0.12 | 0.38 | 0.26 |
(8) | | | | | | | | 1.00 | 0.23 | 0.15 | 0.11 | 0.17 | 0.09 | 0.24 | 0.10 |
(9) | | | | | | | | | 1.00 | 0.19 | 0.19 | 0.30 | 0.20 | 0.15 | 0.26 |
(10) | | | | | | | | | | 1.00 | 0.18 | 0.21 | 0.29 | 0.17 | 0.22 |
(11) | | | | | | | | | | | 1.00 | 0.28 | 0.23 | 0.14 | 0.39 |
(12) | | | | | | | | | | | | 1.00 | 0.18 | 0.27 | 0.29 |
(13) | | | | | | | | | | | | | 1.00 | 0.12 | 0.18 |
(14) | | | | | | | | | | | | | | 1.00 | 0.19 |
(15) | | | | | | | | | | | | | | | 1.00 |
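Because the pairwise similarity measures are symmetric, the tables above list only the upper triangle of each matrix. A small sketch of recovering the full symmetric matrix, using a hypothetical 3×3 excerpt (moves 1–3 of the first table) rather than the complete data:

```python
import numpy as np

# Upper-triangular excerpt of the similarity values for moves 1-3
# (illustrative subset of the first table; zeros fill the lower triangle).
upper = np.array([
    [1.00, 0.22, 0.45],
    [0.00, 1.00, 0.19],
    [0.00, 0.00, 1.00],
])

# Mirror the strict upper triangle onto the lower triangle
# to obtain the full symmetric similarity matrix.
full = upper + np.triu(upper, k=1).T
```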
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Stančin, S.; Tomažič, S. Recognizing Solo Jazz Dance Moves Using a Single Leg-Attached Inertial Wearable Device. Sensors 2022, 22, 2446. https://doi.org/10.3390/s22072446