Development of Smartphone Application for Markerless Three-Dimensional Motion Capture Based on Deep Learning Model
Abstract
1. Introduction
2. Materials and Methods
2.1. Training Dataset for Deep Learning
2.2. Execution Environment
2.3. Deep Learning for 3D Human Motion Estimation
3. Results
3.1. How to Use the TDPT for Gait Test Application
3.1.1. Shooting Direction
3.1.2. Precautions for Use
3.2. Data Processing for Gait Analysis
3.3. Verification of 3D Relative Coordinates on TDPT for Gait Test Application
4. Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Joint | TDPT-GT | VICON (×10) | r |
|---|---|---|---|
| Right Shoulder | (0.3, 35.3, 1.6) | (4.2, 14.0, 0.7) | (0.87, 0.58, −0.84) |
| Left Shoulder | (−7.3, 35.0, −6.4) | (3.8, 14.0, 2.2) | (0.90, −0.49, −0.34) |
| Right Elbow | (4.3, 8.1, 4.4) | (4.6, 11.5, 0.2) | (0.92, −0.12, −0.94) |
| Left Elbow | (−11.0, 7.3, −9.9) | (3.8, 11.4, 2.8) | (0.89, −0.36, −0.67) |
| Right Wrist | (4.6, −15.3, 8.7) | (4.4, 10.1, −0.7) | (0.95, −0.23, −0.89) |
| Left Wrist | (−17.5, −14.4, −12.0) | (3.4, 10.1, 3.4) | (0.87, −0.40, −0.82) |
| Right Hip joint | (−1.1, −27.3, 2.9) | (4.6, 8.8, 1.2) | (0.85, −0.02, −0.85) |
| Left Hip joint | (−5.9, −27.5, −2.0) | (4.3, 8.8, 2.0) | (0.79, 0.52, −0.29) |
| Right Knee | (−2.7, −79.1, 1.9) | (4.7, 4.8, 1.4) | (0.93, −0.77, −0.60) |
| Left Knee | (−8.3, −79.0, −4.0) | (4.5, 4.9, 2.1) | (0.77, 0.49, −0.19) |
| Right Ankle | (−3.3, −132.2, 4.4) | (5.2, 1.2, 1.6) | (0.84, −0.75, −0.88) |
| Left Ankle | (−7.2, −131.6, −2.4) | (4.9, 1.2, 2.2) | (0.81, 0.14, −0.61) |
| Right Toe | (−6.0, −140.3, 7.4) | (4.8, 0.7, 1.3) | (0.92, −0.66, −0.77) |
| Left Toe | (−11.1, −140.4, −0.6) | (4.4, 0.7, 2.2) | (0.74, 0.57, −0.72) |
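The per-joint correlation coefficients r reported above compare, axis by axis, the 3D relative joint coordinates estimated by the smartphone application against the VICON optical system. A minimal sketch of how such per-axis Pearson correlations can be computed from paired joint trajectories (the function name, data layout, and synthetic data are illustrative assumptions, not the authors' code):

```python
import numpy as np

def per_axis_correlation(app_xyz, vicon_xyz):
    """Pearson r per axis between two (n_frames, 3) joint trajectories.

    app_xyz, vicon_xyz: arrays holding one joint's 3D relative coordinates
    over time (hypothetical layout: one row per frame, columns x, y, z).
    """
    app = np.asarray(app_xyz, dtype=float)
    vicon = np.asarray(vicon_xyz, dtype=float)
    r = []
    for axis in range(3):
        # np.corrcoef returns the 2x2 correlation matrix; off-diagonal is r
        r.append(np.corrcoef(app[:, axis], vicon[:, axis])[0, 1])
    return tuple(r)

# Synthetic example: a smooth trajectory and a linearly rescaled copy,
# mimicking the unit-scale difference suggested by the "VICON (x10)" column.
t = np.linspace(0.0, 2.0 * np.pi, 100)
app_traj = np.stack([np.sin(t), np.cos(t), t], axis=1)
vicon_traj = 10.0 * app_traj + 0.1
rx, ry, rz = per_axis_correlation(app_traj, vicon_traj)
```

Because Pearson correlation is invariant to linear rescaling, a perfectly tracked joint would give r = 1 on each axis even when the two systems use different coordinate scales; the deviations from 1 in the table reflect genuine tracking disagreement, not unit differences.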
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Aoyagi, Y.; Yamada, S.; Ueda, S.; Iseki, C.; Kondo, T.; Mori, K.; Kobayashi, Y.; Fukami, T.; Hoshimaru, M.; Ishikawa, M.; et al. Development of Smartphone Application for Markerless Three-Dimensional Motion Capture Based on Deep Learning Model. Sensors 2022, 22, 5282. https://doi.org/10.3390/s22145282