Fast Wearable Sensor–Based Foot–Ground Contact Phase Classification Using a Convolutional Neural Network with Sliding-Window Label Overlapping
Abstract
1. Introduction
2. Foot–Ground Contact Phases and Labeling Method
3. Data Acquisition and Preprocessing
3.1. Training Dataset Acquisition
3.2. Data Augmentation
3.3. Standardization
3.4. Sliding-Window Label Overlapping Method
3.5. Validation Dataset Acquisition
4. CNN Model for Real-Time FGCC
Sensitivity Analysis
5. Results and Discussion
Author Contributions
Funding
Conflicts of Interest
Appendix A
Algorithm A1 Pseudocode of the four sub-phase labeling process

Interrupt Service Routine (every 10 ms):
    if read FSR0 > 10 kgf then FSR0 = True else FSR0 = False
    if read FSR1 > 10 kgf then FSR1 = True else FSR1 = False
    if read FSR2 > 10 kgf then FSR2 = True else FSR2 = False
    if (FSR0 = False and FSR1 = False and FSR2 = True)
        Output = Heel Strike
    else if (FSR0 = False and FSR1 = True and FSR2 = True) or (FSR0 = True and FSR1 = False and FSR2 = True)
        Output = Full Contact
    else if (FSR0 = True and FSR1 = True and FSR2 = False)
        Output = Heel Off
    else if (FSR0 = True and FSR1 = False and FSR2 = False)
        Output = Toe Off
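For readers who prefer an executable form, the following is a minimal Python sketch of the same decision logic. The function name, the threshold constant, and the choice to return None for uncovered sensor patterns are illustrative; the original logic runs inside a 10 ms interrupt service routine on the measurement hardware.

```python
from typing import Optional

# Algorithm A1 compares each FSR reading against a 10 kgf contact threshold.
THRESHOLD_KGF = 10.0

def label_sub_phase(fsr0: float, fsr1: float, fsr2: float) -> Optional[str]:
    """Map three FSR force readings (kgf) to a foot-ground contact sub-phase."""
    c0 = fsr0 > THRESHOLD_KGF
    c1 = fsr1 > THRESHOLD_KGF
    c2 = fsr2 > THRESHOLD_KGF

    if not c0 and not c1 and c2:
        return "Heel Strike"                     # heel sensor only
    if (not c0 and c1 and c2) or (c0 and not c1 and c2):
        return "Full Contact"
    if c0 and c1 and not c2:
        return "Heel Off"
    if c0 and not c1 and not c2:
        return "Toe Off"
    return None                                  # pattern not covered by Algorithm A1
```

Note that Algorithm A1 has no final else branch, so FSR combinations outside the four listed patterns leave the previous label unchanged; the sketch makes this explicit by returning None.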
Appendix B
Mean (±SD) of the standardized sensor channels for each label. Each segment row lists, in order, two orientation channels, three acceleration channels, three angular-velocity channels, and three rate-of-Euler-angle channels.

Label 2

| Segment | Orientation | | Acceleration | | | Angular Velocity | | | Rate of Euler Angle | | |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Foot | −0.1551 (±0.4065) | −0.0377 (±0.3778) | −0.1357 (±1.0435) | −0.0044 (±0.6404) | 0.0074 (±0.2723) | 0.0367 (±0.2364) | 0.0494 (±0.4883) | 0.0514 (±0.6035) | −0.2059 (±0.5320) | −0.00019 (±0.6639) | 0.1616 (±0.3891) |
| Shank | 0.3824 (±0.5310) | 0.2539 (±0.4975) | −0.0946 (±1.0713) | 0.4032 (±0.6129) | −0.3422 (±0.4613) | 0.0198 (±0.3701) | 0.3205 (±0.4281) | 0.3953 (±0.4091) | −0.2978 (±0.4173) | 0.1115 (±0.4251) | 0.4286 (±0.2765) |

Label 3

| Segment | Orientation | | Acceleration | | | Angular Velocity | | | Rate of Euler Angle | | |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Foot | −0.3302 (±0.4323) | −0.2600 (±0.3925) | 0.0593 (±0.8412) | −0.0069 (±0.8265) | −0.0069 (±0.6012) | 0.2146 (±0.5781) | 0.0421 (±0.5644) | −0.0461 (±0.4672) | −0.0828 (±0.6505) | 0.0063 (±0.8427) | 0.1086 (±0.4537) |
| Shank | −0.2768 (±0.5474) | −0.4322 (±0.4090) | 0.2467 (±0.8486) | −0.2761 (±0.8104) | 0.2586 (±0.8352) | 0.0909 (±0.7499) | 0.3244 (±0.5664) | 0.2431 (±0.3592) | −0.1093 (±1.0977) | −0.0776 (±0.9521) | 0.2585 (±0.3217) |

Label 4

| Segment | Orientation | | Acceleration | | | Angular Velocity | | | Rate of Euler Angle | | |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Foot | −0.5414 (±0.6849) | −0.9097 (±0.6816) | 0.0868 (±1.0554) | 0.0475 (±1.4317) | −0.0921 (±1.5240) | −0.4572 (±1.5838) | 0.1292 (±0.9869) | 0.0885 (±0.8901) | 0.4093 (±1.0212) | −0.0021 (±0.9594) | 0.0802 (±1.0908) |
| Shank | −1.0112 (±0.6641) | −1.0129 (±0.5170) | −0.1106 (±0.9957) | −0.6221 (±0.8928) | 0.5041 (±1.0838) | −0.1153 (±1.8441) | −0.0473 (±1.1490) | −0.1010 (±0.8285) | 0.6609 (±1.1596) | 0.3421 (±1.0326) | −0.1097 (±0.9633) |

Label 5

| Segment | Orientation | | Acceleration | | | Angular Velocity | | | Rate of Euler Angle | | |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Foot | 1.0371 (±1.3766) | 1.1429 (±1.1608) | 0.0018 (±1.0502) | −0.0273 (±1.0757) | 0.0821 (±1.2921) | 0.0955 (±1.1537) | −0.2158 (±1.6274) | −0.0773 (±1.6817) | −0.0241 (±1.5070) | −0.0058 (±1.4361) | −0.3793 (±1.6018) |
| Shank | 0.7715 (±1.1886) | 1.1079 (±1.0357) | −0.0841 (±1.0382) | 0.4077 (±1.2366) | −0.3574 (±1.2485) | −0.0321 (±0.6482) | −0.7134 (±1.3111) | −0.6467 (±1.6110) | −0.0981 (±0.9490) | −0.3254 (±1.3263) | −0.6997 (±1.5509) |
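The statistics above describe data that have already been standardized (Section 3.3). A minimal numpy sketch of per-channel z-score standardization over the whole training set follows, assuming that is the scheme used; the array layout and the function name are illustrative.

```python
import numpy as np

def standardize_channels(data: np.ndarray) -> np.ndarray:
    """Z-score every sensor channel over the entire training set.

    data: (n_samples, 22) array -- 11 channels (orientation x2,
    acceleration x3, angular velocity x3, rate of Euler angle x3)
    for each of the foot and shank sensors.
    """
    mean = data.mean(axis=0)          # per-channel mean
    std = data.std(axis=0)            # per-channel standard deviation
    return (data - mean) / std        # standardized channels
```

After such standardization the full data set has zero mean and unit variance per channel; the per-label means and SDs tabulated above deviate from 0 and 1 because each label covers only part of the gait cycle.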
Appendix C
Appendix D
Algorithm A2 Pseudocode for assigning the representative label to a single window composed of multiple labels

Function SLO(entire labeled data set M, SLO_width = 14, SLO_pitch = 1, SLO_ratio = N)
    (rotate 90 degrees counterclockwise to create 22 × 14 images)
    rotate M CCW 90 ← M (23 × n)
    for i = 1 : SLO_pitch : (size(M,2) − SLO_width)
        slo_image(:,:,i) = M([i : SLO_width + (i−1)], [2:23])   → save the 22 × 14 images in three dimensions
        slo_label(1,:,i) = M([i : SLO_width + (i−1)], 1)        → save the 1 × 14 labels in three dimensions
    end
    for i = 1 : 1 : size(slo_image, 3)
        if (slo_label(1, 14−(N−1):14, i) == 5)
            save mat2gray(slo_image(:,:,i)) to the folder "L_5"
        else if (slo_label(1, 14−(N−1):14, i) == 4)
            save mat2gray(slo_image(:,:,i)) to the folder "L_4"
        else if (slo_label(1, 14−(N−1):14, i) == 3)
            save mat2gray(slo_image(:,:,i)) to the folder "L_3"
        else if (slo_label(1, 14−(N−1):14, i) == 2)
            save mat2gray(slo_image(:,:,i)) to the folder "L_2"
        end
    end
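As a cross-check, here is a compact Python rendering of Algorithm A2. It assumes the labeled set M is an n × 23 array with the phase label in the first column, and it returns the window stack in memory instead of writing mat2gray images into the L_2 … L_5 folders as the MATLAB-style pseudocode does; all names are illustrative.

```python
import numpy as np

def slo(M: np.ndarray, width: int = 14, pitch: int = 1, ratio_samples: int = 7):
    """Sliding-window label overlapping (SLO), following Algorithm A2.

    M: (n_samples, 23) labeled data set; column 0 holds the phase label
       (2..5) and columns 1..22 the 22 standardized sensor channels.
    width: window length in samples (14 samples -> one 22 x 14 image).
    pitch: stride between consecutive windows.
    ratio_samples: number of trailing labels (N in Algorithm A2) that must
       agree before the window receives that representative label.
    """
    images, labels = [], []
    for i in range(0, M.shape[0] - width + 1, pitch):
        window = M[i:i + width]
        tail = window[-ratio_samples:, 0]        # most recent labels in the window
        if tail.min() == tail.max():             # unanimous trailing labels
            images.append(window[:, 1:].T)       # rotate: 14 x 22 -> 22 x 14 image
            labels.append(int(tail[0]))          # representative label 2..5
    return np.stack(images), np.array(labels)
```

For example, `slo_images, slo_labels = slo(M, ratio_samples=7)` would reproduce the 50% SLO-ratio condition (7 of 14 samples) used as level 2 in the sensitivity analysis.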
Appendix E
Input parameters and CNN model parameters (the image shape follows from the 22 × 14 SLO images of Algorithm A2; the filter size is the factor varied in the sensitivity analysis below):

| Input Parameter | | CNN Model Parameter | | | | | |
|---|---|---|---|---|---|---|---|
| SLO Width | Image Shape | Pooling Method | Layer No. | Filter Size | Drop-Out Rate | Stride Width | Activation Function |
| 14 | 22 × 14 | Average pooling | 3 | varied (see levels below) | 0.3 | 1 | ReLU |
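For orientation only, the following is a PyTorch sketch of a network consistent with the table above: three convolutional layers with stride 1 and ReLU activations, average pooling, a drop-out rate of 0.3, a 22 × 14 single-channel input, and four output classes. The feature-map counts (16/32/64), the "same" padding, the single linear head, and the use of square filters (the sensitivity analysis varies filter width and height independently) are assumptions, since the section does not specify them.

```python
import torch
import torch.nn as nn

class FGCCNet(nn.Module):
    """Sketch of a 3-layer CNN matching the tabled hyper-parameters.

    Assumptions (not given in this section): 16/32/64 feature maps,
    'same' padding, square filters, and one fully connected head.
    """
    def __init__(self, n_classes: int = 4, filter_size: int = 3):
        super().__init__()
        layers, in_ch = [], 1
        for out_ch in (16, 32, 64):                    # 3 conv layers (Layer no. = 3)
            layers += [
                nn.Conv2d(in_ch, out_ch, filter_size, stride=1,
                          padding=filter_size // 2),   # stride width = 1, 'same' size
                nn.ReLU(),                             # activation function = ReLU
                nn.AvgPool2d(2, ceil_mode=True),       # average pooling
                nn.Dropout(0.3),                       # drop-out rate = 0.3
            ]
            in_ch = out_ch
        self.features = nn.Sequential(*layers)
        # 22 x 14 input halved (ceil) three times: 22->11->6->3, 14->7->4->2
        self.classifier = nn.Linear(64 * 3 * 2, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)                     # (N, 64, 3, 2)
        return self.classifier(x.flatten(1))     # logits over the 4 phases
```

A call such as `FGCCNet(filter_size=3)(torch.randn(8, 1, 22, 14))` yields an (8, 4) tensor of class scores; filter_size values of 3, 5, or 7 correspond to the three sensitivity levels below.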
Sensitivity levels for the three factors. The SLO ratio is given as a percentage of the 14-sample window, with the corresponding number of samples in parentheses:

| Level | SLO Ratio [%] | Filter Width | Filter Height |
|---|---|---|---|
| 1 | 30 (4) | 3 | 3 |
| 2 | 50 (7) | 5 | 5 |
| 3 | 70 (10) | 7 | 7 |
Sensitivity analysis results for the nine factor-level combinations (factor levels as defined above):

| No. | SLO Ratio | Filter Width | Filter Height | Train Acc [%] | Test Acc [%] | Val Acc [%] |
|---|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 99.97 | 99.84 | 84.80 |
| 2 | 1 | 2 | 2 | 100 | 99.88 | 79.40 |
| 3 | 1 | 3 | 3 | 99.99 | 99.86 | 79.29 |
| 4 | 2 | 1 | 2 | 99.90 | 99.48 | 78.23 |
| 5 | 2 | 2 | 3 | 99.87 | 99.33 | 76.96 |
| 6 | 2 | 3 | 1 | 99.86 | 99.35 | 77.55 |
| 7 | 3 | 1 | 3 | 99.99 | 99.49 | 78.77 |
| 8 | 3 | 2 | 1 | 99.99 | 99.51 | 81.41 |
| 9 | 3 | 3 | 2 | 99.90 | 99.32 | 74.92 |
Per-phase validation accuracy (SW: swing; HS: heel strike; FC: full contact; HO: heel off):

| Phase | SW | HS | FC | HO | Total |
|---|---|---|---|---|---|
| Accuracy [%] | 82.27 | 81.61 | 82.12 | 93.18 | 84.80 |