A Residual Deep Learning Method for Accurate and Efficient Recognition of Gym Exercise Activities Using Electromyography and IMU Sensors
Abstract
1. Introduction
- This research introduces a novel hybrid deep learning architecture, CNN-ResBiGRU, which combines a convolutional neural network (CNN), residual connections, and bidirectional gated recurrent units (BiGRUs) to identify gym workout patterns from multimodal sensor data. The proposed model captures the essential spatial and temporal features of EMG and IMU data while remaining fast and computationally efficient.
- An extensive evaluation of the CNN-ResBiGRU model was conducted on the Myogym dataset, a benchmark comprising data from 10 individuals performing 30 different gym exercises. The model achieved an accuracy of 97.29% and an F1-score of 92.68% when combining IMU and EMG data, outperforming existing deep learning approaches across various sensor configurations.
- Thorough ablation experiments were conducted to explore the impact of the individual elements of the CNN-ResBiGRU architecture. These studies isolate the contributions of the convolutional blocks in capturing spatial patterns and of the ResBiGRU block in modeling temporal relationships, particularly for EMG signals. The findings highlight the importance of each component to the model's effectiveness and robustness.
- The primary objective of this research is to investigate the cooperative interaction between IMU and EMG sensor types when identifying gym activities. Through extensive testing, this study demonstrates the enhanced classification accuracy attained by integrating IMU and EMG data. It emphasizes the importance of utilizing multiple data sources for accurate activity detection.
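The architectural details of the residual BiGRU block (layer sizes, merge scheme) are not given in this excerpt; as a minimal illustrative sketch of the core idea, the NumPy code below runs a bidirectional GRU over one multichannel sensor window and adds an identity shortcut, assuming the input and hidden dimensions match so the residual sum is well-defined. All dimensions and the sum-merge of the two directions are assumptions, not the paper's settings.

```python
import numpy as np

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU update: gates computed from input x and previous hidden state h."""
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sigmoid(x @ Wz + h @ Uz)               # update gate
    r = sigmoid(x @ Wr + h @ Ur)               # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)   # candidate state
    return (1 - z) * h + z * h_tilde

def res_bigru(seq, params_fwd, params_bwd):
    """Bidirectional GRU over a (time, channels) window with a residual
    (identity) shortcut, assuming input dim == hidden dim."""
    d = seq.shape[1]
    hf, hb = np.zeros(d), np.zeros(d)
    fwd, bwd = [], []
    for t in range(len(seq)):                  # forward pass over time
        hf = gru_step(seq[t], hf, *params_fwd)
        fwd.append(hf)
    for t in reversed(range(len(seq))):        # backward pass over time
        hb = gru_step(seq[t], hb, *params_bwd)
        bwd.append(hb)
    bwd.reverse()
    out = np.stack([f + b for f, b in zip(fwd, bwd)])  # merge directions (sum)
    return out + seq                           # residual shortcut: F(x) + x

rng = np.random.default_rng(0)
d = 8
params = lambda: tuple(rng.normal(scale=0.1, size=(d, d)) for _ in range(6))
x = rng.normal(size=(20, d))                   # 20 time steps, 8 channels
y = res_bigru(x, params(), params())
print(y.shape)                                 # (20, 8)
```

In the actual model, such a block would sit behind a CNN front-end that extracts spatial features, as Section 3.4 describes.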
2. Related Works
2.1. Sensor Modalities
2.2. Machine Learning Approaches
2.3. Deep Learning Approaches
3. Methodology
3.1. Overview of the S-HAR Framework for Gym Exercise Activity Recognition
3.2. Data Acquisition
- IMU data: The IMU data comprise measurements from a three-axis accelerometer and a three-axis gyroscope, together forming a six-dimensional feature space. Each data point represents instantaneous acceleration and angular velocity values, recorded at a sampling frequency of 50 Hz. Figure 2 illustrates a subset of IMU time-series recordings from the multi-sensor wearable dataset: panel (a) shows readings from the triaxial accelerometer, while panel (b) presents the corresponding gyroscope rotation signals.
- EMG data: The EMG data were acquired from an array of eight electromyography sensors, yielding an eight-dimensional feature space. Each channel recorded the electrical activity produced by the underlying muscles throughout the gym exercise routines. The EMG signals were sampled at 50 Hz, synchronized with the data collected from the inertial measurement units. As stated by Jung et al. [30], the Myo armband employed a WEMG-8 commercial electromyography sensor manufactured by Laxtha Co., Ltd. (Daejeon, Republic of Korea). This sensor was equipped with a 2.4 GHz wireless transmitter and incorporated an analog bandpass filter (13–430 Hz) within the electrode unit itself. Figure 3 depicts a set of EMG time-series streams recorded from arm muscles during various resistance training sessions. The Myogym dataset is a comprehensive compilation of multimodal time-series data combining inertial and EMG sensors to capture information throughout gym sessions. The rationale for employing a multi-sensor, multi-subject dataset is to support the development and evaluation of activity recognition and sensor fusion algorithms capable of autonomously identifying and classifying typical strength training and cardiovascular exercises.
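Because both modalities are sampled at 50 Hz and time-synchronized, channel-wise fusion of the 6-D IMU and 8-D EMG streams followed by sliding-window segmentation (Section 3.3.3) can be sketched as below. The 4 s window and 50% overlap are illustrative assumptions; the excerpt does not state the segmentation parameters.

```python
import numpy as np

FS = 50            # both modalities sampled at 50 Hz (as stated above)
WIN_S = 4          # hypothetical window length in seconds (not given here)
WIN = FS * WIN_S   # 200 samples per window

def fuse_and_window(imu, emg, win=WIN, stride=WIN // 2):
    """Concatenate synchronized IMU (6-D) and EMG (8-D) streams channel-wise,
    then cut overlapping fixed-length windows for the classifier."""
    assert len(imu) == len(emg), "streams must be time-aligned"
    fused = np.concatenate([imu, emg], axis=1)  # (T, 6 + 8) = (T, 14)
    windows = [fused[s:s + win] for s in range(0, len(fused) - win + 1, stride)]
    return np.stack(windows)

imu = np.random.randn(1000, 6)   # accelerometer (3 axes) + gyroscope (3 axes)
emg = np.random.randn(1000, 8)   # 8 EMG channels
w = fuse_and_window(imu, emg)
print(w.shape)                   # (num_windows, 200, 14)
```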
3.3. Data Pre-Processing
3.3.1. Data Denoising
3.3.2. Data Normalization
3.3.3. Data Segmentation
3.3.4. Data Generation
3.4. The Proposed CNN-ResBiGRU Model
3.4.1. Convolution Block
3.4.2. Residual BiGRU Block
3.5. Evaluation Metrics
- In the context of the model, accuracy refers to the proportion of action occurrences that were correctly identified. The variables TP, TN, FP, and FN denote true positives, true negatives, false positives, and false negatives, respectively.
- Precision, also known as positive predictive value, is the ratio of correctly identified positive predictions (true positives) to the total number of positive predictions.
- Recall, also known as sensitivity, quantifies the true-positive rate: the ratio of correctly predicted positive cases to the total number of actual positive cases.
- The F1-score is the harmonic mean of precision and recall, providing a balanced measure of both.
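In terms of the TP, TN, FP, and FN counts referenced above, these metrics have the standard definitions:

```latex
\mathrm{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}, \qquad
\mathrm{Precision} = \frac{TP}{TP + FP}, \qquad
\mathrm{Recall} = \frac{TP}{TP + FN},
```
```latex
F1 = 2 \cdot \frac{\mathrm{Precision} \cdot \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}
```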
4. Experiments and Results
4.1. Experimental Setting
- The NumPy and Pandas libraries were used for data management throughout the retrieval, processing, and analysis of the sensor data.
- The outcomes of data exploration and model assessment were plotted using Matplotlib and Seaborn.
- In the experiments, the Scikit-learn library was used for data sampling and dataset creation.
- TensorFlow was used to build and train the deep learning models.
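As a sketch of the sampling and dataset-creation step, the following NumPy-only code shuffles segmented windows into train/validation/test subsets. The 70/15/15 ratio and the use of a plain random (rather than subject-wise) split are assumptions for illustration, not details stated in this excerpt.

```python
import numpy as np

def split_windows(X, y, train=0.7, val=0.15, seed=42):
    """Shuffle segmented windows and split into train/val/test subsets.
    The 70/15/15 ratio is an assumption; the paper excerpt does not state it."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_tr = int(train * len(X))
    n_va = int(val * len(X))
    tr, va, te = idx[:n_tr], idx[n_tr:n_tr + n_va], idx[n_tr + n_va:]
    return (X[tr], y[tr]), (X[va], y[va]), (X[te], y[te])

X = np.random.randn(100, 200, 14)        # 100 windows of fused sensor data
y = np.random.randint(0, 30, size=100)   # 30 gym exercise classes
tr, va, te = split_windows(X, y)
print(len(tr[0]), len(va[0]), len(te[0]))  # 70 15 15
```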
4.2. Experimental Results
5. Discussion
5.1. Ablation Studies
5.1.1. Impact of Convolution Blocks
5.1.2. Impact of the ResBiGRU Blocks
5.2. Impact of Different Types of Sensors
5.3. Comparison with State-of-the-Art Models
5.4. Practical Applications of Gym Exercise Recognition
5.5. Constraints and Prospective Advancements
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Müller, P.N.; Müller, A.J.; Achenbach, P.; Göbel, S. IMU-Based Fitness Activity Recognition Using CNNs for Time Series Classification. Sensors 2024, 24, 742.
- Mekruksavanich, S.; Jitpattanakul, A. A Deep Learning Network with Aggregation Residual Transformation for Human Activity Recognition Using Inertial and Stretch Sensors. Computers 2023, 12, 141.
- Patalas-Maliszewska, J.; Pajak, I.; Krutz, P.; Pajak, G.; Rehm, M.; Schlegel, H.; Dix, M. Inertial Sensor-Based Sport Activity Advisory System Using Machine Learning Algorithms. Sensors 2023, 23, 1137.
- Concha-Pérez, E.; Gonzalez-Hernandez, H.G.; Reyes-Avendaño, J.A. Physical Exertion Recognition Using Surface Electromyography and Inertial Measurements for Occupational Ergonomics. Sensors 2023, 23, 9100.
- Mahyari, A.; Pirolli, P.; LeBlanc, J.A. Real-Time Learning from an Expert in Deep Recommendation Systems with Application to mHealth for Physical Exercises. IEEE J. Biomed. Health Inform. 2022, 26, 4281–4290.
- Morshed, M.G.; Sultana, T.; Alam, A.; Lee, Y.K. Human Action Recognition: A Taxonomy-Based Survey, Updates, and Opportunities. Sensors 2023, 23, 2182.
- Barbosa, W.A.; Leite, C.D.F.C.; Reis, C.H.O.; Machado, A.F.; Bullo, V.; Gobbo, S.; Bergamin, M.; Lima-Leopoldo, A.P.; Vancini, R.L.; Baker, J.S.; et al. Effect of Supervised and Unsupervised Exercise Training in Outdoor Gym on the Lifestyle of Elderly People. Int. J. Environ. Res. Public Health 2023, 20, 7022.
- Hussain, A.; Zafar, K.; Baig, A.R.; Almakki, R.; AlSuwaidan, L.; Khan, S. Sensor-Based Gym Physical Exercise Recognition: Data Acquisition and Experiments. Sensors 2022, 22, 2489.
- Pathan, N.S.; Talukdar, M.T.F.; Quamruzzaman, M.; Fattah, S.A. A Machine Learning based Human Activity Recognition during Physical Exercise using Wavelet Packet Transform of PPG and Inertial Sensors data. In Proceedings of the 2019 4th International Conference on Electrical Information and Communication Technology (EICT), Khulna, Bangladesh, 20–22 December 2019; pp. 1–5.
- Li, J.H.; Tian, L.; Wang, H.; An, Y.; Wang, K.; Yu, L. Segmentation and Recognition of Basic and Transitional Activities for Continuous Physical Human Activity. IEEE Access 2019, 7, 42565–42576.
- Bouchabou, D.; Nguyen, S.M.; Lohr, C.; LeDuc, B.; Kanellos, I. A Survey of Human Activity Recognition in Smart Homes Based on IoT Sensors Algorithms: Taxonomies, Challenges, and Opportunities with Deep Learning. Sensors 2021, 21, 6037.
- Aquino, G.; Costa, M.G.F.; Filho, C.F.F.C. Explaining and Visualizing Embeddings of One-Dimensional Convolutional Models in Human Activity Recognition Tasks. Sensors 2023, 23, 4409.
- Mekruksavanich, S.; Jitpattanakul, A. LSTM Networks Using Smartphone Data for Sensor-Based Human Activity Recognition in Smart Homes. Sensors 2021, 21, 1636.
- Mekruksavanich, S.; Jitpattanakul, A. Deep Convolutional Neural Network with RNNs for Complex Activity Recognition Using Wrist-Worn Wearable Sensor Data. Electronics 2021, 10, 1685.
- Webber, M.; Rojas, R.F. Human Activity Recognition With Accelerometer and Gyroscope: A Data Fusion Approach. IEEE Sens. J. 2021, 21, 16979–16989.
- Masum, A.K.M.; Bahadur, E.H.; Shan-A-Alahi, A.; Uz Zaman Chowdhury, M.A.; Uddin, M.R.; Al Noman, A. Human Activity Recognition Using Accelerometer, Gyroscope and Magnetometer Sensors: Deep Neural Network Approaches. In Proceedings of the 2019 10th International Conference on Computing, Communication and Networking Technologies (ICCCNT), Kanpur, India, 6–8 July 2019; pp. 1–6.
- Ashry, S.; Gomaa, W.; Abdu-Aguye, M.G.; El-borae, N. Improved IMU-based Human Activity Recognition using Hierarchical HMM Dissimilarity. In Proceedings of the 17th International Conference on Informatics in Control, Automation and Robotics—ICINCO, Online, 7–9 July 2020; SciTePress: Setúbal, Portugal, 2020; pp. 702–709.
- Nurhanim, K.; Elamvazuthi, I.; Izhar, L.; Capi, G.; Su, S. EMG Signals Classification on Human Activity Recognition using Machine Learning Algorithm. In Proceedings of the 2021 8th NAFOSTED Conference on Information and Computer Science (NICS), Hanoi, Vietnam, 21–22 December 2021; pp. 369–373.
- Zia ur Rehman, M.; Waris, A.; Gilani, S.O.; Jochumsen, M.; Niazi, I.K.; Jamil, M.; Farina, D.; Kamavuako, E.N. Multiday EMG-Based Classification of Hand Motions with Deep Learning Techniques. Sensors 2018, 18, 2497.
- Ding, Z.; Yang, C.; Tian, Z.; Yi, C.; Fu, Y.; Jiang, F. sEMG-Based Gesture Recognition with Convolution Neural Networks. Sustainability 2018, 10, 1865.
- Faust, O.; Hagiwara, Y.; Hong, T.J.; Lih, O.S.; Acharya, U.R. Deep learning for healthcare applications based on physiological signals: A review. Comput. Methods Programs Biomed. 2018, 161, 1–13.
- Lee, K.H.; Min, J.Y.; Byun, S. Electromyogram-Based Classification of Hand and Finger Gestures Using Artificial Neural Networks. Sensors 2022, 22, 225.
- Wang, J.; Sun, S.; Sun, Y. A Muscle Fatigue Classification Model Based on LSTM and Improved Wavelet Packet Threshold. Sensors 2021, 21, 6369.
- Xiong, D.; Zhang, D.; Zhao, X.; Zhao, Y. Deep Learning for EMG-based Human-Machine Interaction: A Review. IEEE/CAA J. Autom. Sin. 2021, 8, 512–533.
- Elamvazuthi, I.; Duy, N.; Ali, Z.; Su, S.; Khan, M.A.; Parasuraman, S. Electromyography (EMG) based Classification of Neuromuscular Disorders using Multi-Layer Perceptron. Procedia Comput. Sci. 2015, 76, 223–228.
- Cai, S.; Chen, Y.; Huang, S.; Wu, Y.; Zheng, H.; Li, X.; Xie, L. SVM-Based Classification of sEMG Signals for Upper-Limb Self-Rehabilitation Training. Front. Neurorobot. 2019, 13, 31.
- Di Nardo, F.; Morbidoni, C.; Cucchiarelli, A.; Fioretti, S. Influence of EMG-signal processing and experimental set-up on prediction of gait events by neural network. Biomed. Signal Process. Control 2021, 63, 102232.
- Nazmi, N.; Abdul Rahman, M.A.; Yamamoto, S.I.; Ahmad, S.A. Walking gait event detection based on electromyography signals using artificial neural network. Biomed. Signal Process. Control 2019, 47, 334–343.
- Koskimäki, H.; Siirtola, P.; Röning, J. MyoGym: Introducing an open gym data set for activity recognition collected using myo armband. In Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers (UbiComp ’17), New York, NY, USA, 11–15 September 2017; pp. 537–546.
- Jung, P.G.; Lim, G.; Kim, S.; Kong, K. A Wearable Gesture Recognition Device for Detecting Muscular Activities Based on Air-Pressure Sensors. IEEE Trans. Ind. Inform. 2015, 11, 485–494.
- Crema, C.; Depari, A.; Flammini, A.; Sisinni, E.; Haslwanter, T.; Salzmann, S. IMU-based solution for automatic detection and classification of exercises in the fitness scenario. In Proceedings of the 2017 IEEE Sensors Applications Symposium (SAS), Glassboro, NJ, USA, 13–15 March 2017; pp. 1–6.
- Pernek, I.; Kurillo, G.; Stiglic, G.; Bajcsy, R. Recognizing the intensity of strength training exercises with wearable sensors. J. Biomed. Inform. 2015, 58, 145–155.
- Hochreiter, S. The Vanishing Gradient Problem During Learning Recurrent Neural Nets and Problem Solutions. Int. J. Uncertain. Fuzziness Knowl. Based Syst. 1998, 6, 107–116.
- Cho, K.; van Merriënboer, B.; Bahdanau, D.; Bengio, Y. On the Properties of Neural Machine Translation: Encoder–Decoder Approaches. In Proceedings of the SSST-8, Eighth Workshop on Syntax, Semantics and Structure in Statistical Translation, Doha, Qatar, 25 October 2014; pp. 103–111.
- Chung, J.; Gulcehre, C.; Cho, K.; Bengio, Y. Empirical evaluation of gated recurrent neural networks on sequence modeling. In Proceedings of the NIPS 2014 Workshop on Deep Learning, Montreal, QC, Canada, 13 December 2014.
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778.
- Aşuroğlu, T.; Açıcı, K.; Erdaş, Ç.B.; Oğul, H. Texture of Activities: Exploiting Local Binary Patterns for Accelerometer Data Analysis. In Proceedings of the 2016 12th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS), Naples, Italy, 28 November–1 December 2016; pp. 135–138.
- Montaha, S.; Azam, S.; Rafid, A.K.M.R.H.; Ghosh, P.; Hasan, M.Z.; Jonkman, M.; De Boer, F. BreastNet18: A High Accuracy Fine-Tuned VGG16 Model Evaluated Using Ablation Study for Diagnosing Breast Cancer from Enhanced Mammography Images. Biology 2021, 10, 1347.
- de Vente, C.; Boulogne, L.H.; Venkadesh, K.V.; Sital, C.; Lessmann, N.; Jacobs, C.; Sánchez, C.I.; van Ginneken, B. Improving Automated COVID-19 Grading with Convolutional Neural Networks in Computed Tomography Scans: An Ablation Study. arXiv 2020, arXiv:2009.09725.
- Meyes, R.; Lu, M.; de Puiseau, C.W.; Meisen, T. Ablation Studies in Artificial Neural Networks. arXiv 2019, arXiv:1901.08644.
- Ojiako, K.; Farrahi, K. MLPs Are All You Need for Human Activity Recognition. Appl. Sci. 2023, 13, 11154.
- Ismail Fawaz, H.; Lucas, B.; Forestier, G.; Pelletier, C.; Schmidt, D.F.; Weber, J.; Webb, G.I.; Idoumghar, L.; Muller, P.A.; Petitjean, F. InceptionTime: Finding AlexNet for time series classification. Data Min. Knowl. Discov. 2020, 34, 1936–1962.
- Ordóñez, F.J.; Roggen, D. Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition. Sensors 2016, 16, 115.
- Wang, Z.; Yan, W.; Oates, T. Time series classification from scratch with deep neural networks: A strong baseline. In Proceedings of the 2017 International Joint Conference on Neural Networks (IJCNN), Anchorage, AK, USA, 14–19 May 2017; pp. 1578–1585.
Muscle Group | Gym Exercise Activities | Posture | One Arm, Both, or Alternate | Equipment Used
---|---|---|---|---
Middle Back | Seated cable rows | Seated | Both | Cable machine
 | One-arm dumbbell row | Bent over | One arm | Dumbbell
 | Wide-grip pulldown behind the neck | Seated | Both | Cable machine
 | Bent-over barbell row | Bent over | Both | Barbell
 | Reverse-grip bent-over row | Bent over | Both | Barbell
 | Wide-grip front pulldown | Seated | Both | Cable machine
Chest | Bench press | On back | Both | Barbell
 | Inclined dumbbell flyers | Seated, inclined | Both | Dumbbell
 | Inclined dumbbell press | Seated, inclined | Both | Dumbbell
 | Dumbbell flyers | On back | Both | Dumbbell
 | Pushups | Prone with hands and toes grounded | Both | Body weight
 | Leveraged chest press | Seated | Both | Barbell
 | Closed-grip barbell bench press | On back | Both | Barbell
Triceps | Bar skullcrusher | On back | Both | Barbell
 | Tricep pushdown | Standing | Both | Cable machine
 | Bench dip/dip | Seated, inclined | Both | Body weight
 | Overhead tricep extension | Standing | Both | Barbell
 | Tricep dumbbell kickback | Bent over | One arm | Dumbbell
Biceps | Spider curl | On stomach | Both | Barbell
 | Dumbbell alternate bicep curl | Standing | Alternate | Dumbbell
 | Inclined hammer curl | Seated, inclined | Both | Dumbbell
 | Concentration curl | Seated | One arm | Dumbbell
 | Cable curl | Standing | Both | Cable machine
 | Hammer curl | Standing | Alternate | Dumbbell
Shoulders | Upright barbell row | Standing | Both | Barbell
 | Side lateral raise | Standing | Both | Dumbbell
 | Front dumbbell raise | Standing | Alternate | Dumbbell
 | Seated dumbbell shoulder press | Seated | Both | Dumbbell
 | Car drivers | Standing | Both | Barbell plate
 | Lying rear-delt raise | On stomach | Both | Dumbbell
Recognition performance using accelerometer data only:

Model | Accuracy | Loss | F1-score
---|---|---|---
CNN | 90.08% (±0.44%) | 0.48 (±0.03) | 69.60% (±1.30%)
LSTM | 92.54% (±0.15%) | 0.29 (±0.01) | 78.58% (±0.69%)
BiLSTM | 94.22% (±0.27%) | 0.32 (±0.01) | 84.08% (±0.38%)
GRU | 92.94% (±0.19%) | 0.35 (±0.01) | 80.09% (±1.02%)
BiGRU | 95.33% (±0.27%) | 0.29 (±0.01) | 86.84% (±0.68%)
CNN-ResBiGRU | 95.67% (±0.21%) | 0.17 (±0.01) | 88.08% (±0.56%)
Recognition performance using gyroscope data only:

Model | Accuracy | Loss | F1-score
---|---|---|---
CNN | 82.94% (±0.55%) | 0.95 (±0.09) | 42.16% (±1.27%)
LSTM | 90.96% (±0.16%) | 0.51 (±0.02) | 70.82% (±0.52%)
BiLSTM | 92.15% (±0.27%) | 0.53 (±0.05) | 74.88% (±1.33%)
GRU | 90.89% (±0.39%) | 0.58 (±0.01) | 70.84% (±1.36%)
BiGRU | 93.00% (±0.21%) | 0.54 (±0.02) | 78.02% (±0.55%)
CNN-ResBiGRU | 95.58% (±0.18%) | 0.20 (±0.01) | 87.02% (±0.64%)
Recognition performance using IMU data (accelerometer + gyroscope):

Model | Accuracy | Loss | F1-score
---|---|---|---
CNN | 89.13% (±0.42%) | 0.96 (±0.10) | 64.72% (±1.19%)
LSTM | 94.29% (±0.15%) | 0.30 (±0.02) | 82.92% (±0.73%)
BiLSTM | 94.32% (±0.07%) | 0.35 (±0.01) | 82.82% (±0.35%)
GRU | 94.15% (±0.15%) | 0.34 (±0.02) | 82.60% (±0.29%)
BiGRU | 95.03% (±0.27%) | 0.35 (±0.02) | 85.22% (±0.80%)
CNN-ResBiGRU | 96.96% (±0.10%) | 0.14 (±0.01) | 91.78% (±0.42%)
Recognition performance using EMG data only:

Model | Accuracy | Loss | F1-score
---|---|---|---
CNN | 75.74% (±0.51%) | 0.93 (±0.02) | 8.00% (±1.38%)
LSTM | 78.05% (±0.40%) | 1.25 (±0.04) | 24.06% (±1.45%)
BiLSTM | 77.58% (±0.34%) | 1.83 (±0.07) | 22.04% (±0.98%)
GRU | 76.57% (±0.85%) | 1.37 (±0.03) | 18.82% (±0.69%)
BiGRU | 76.18% (±0.41%) | 2.14 (±0.06) | 16.55% (±0.45%)
CNN-ResBiGRU | 91.53% (±0.60%) | 0.34 (±0.02) | 74.49% (±1.74%)
Recognition performance using combined IMU + EMG data:

Model | Accuracy | Loss | F1-score
---|---|---|---
CNN | 86.43% (±0.54%) | 0.61 (±0.07) | 54.22% (±2.97%)
LSTM | 91.79% (±0.42%) | 0.33 (±0.01) | 73.00% (±1.55%)
BiLSTM | 91.20% (±0.31%) | 0.50 (±0.04) | 70.62% (±0.92%)
GRU | 89.71% (±0.28%) | 0.46 (±0.02) | 64.66% (±0.73%)
BiGRU | 89.93% (±0.41%) | 0.65 (±0.03) | 65.91% (±1.88%)
CNN-ResBiGRU | 97.29% (±0.20%) | 0.12 (±0.01) | 92.68% (±0.59%)
Ablation of the convolution blocks (F1-score, %):

Model | Accelerometer | Gyroscope | IMU | EMG | IMU + EMG
---|---|---|---|---|---
The proposed model without convolution blocks | 68.90% | 69.29% | 80.93% | 29.92% | 75.99%
CNN-ResBiGRU (full model) | 88.08% | 87.02% | 91.78% | 74.49% | 92.68%
Ablation of the ResBiGRU blocks (F1-score, %):

Model | Accelerometer | Gyroscope | IMU | EMG | IMU + EMG
---|---|---|---|---|---
The proposed model without ResBiGRU blocks | 82.10% | 83.78% | 91.49% | 54.21% | 92.40%
CNN-ResBiGRU (full model) | 88.08% | 87.02% | 91.78% | 74.49% | 92.68%
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Mekruksavanich, S.; Jitpattanakul, A. A Residual Deep Learning Method for Accurate and Efficient Recognition of Gym Exercise Activities Using Electromyography and IMU Sensors. Appl. Syst. Innov. 2024, 7, 59. https://doi.org/10.3390/asi7040059