Preliminary Validation of a Low-Cost Motion Analysis System Based on RGB Cameras to Support the Evaluation of Postural Risk Assessment
Featured Application
Abstract
1. Introduction
2. Materials and Methods
2.1. The Proposed Motion Analysis System
- New system based on the CMU model from the tf-pose-estimation project, computationally lighter than the models provided by OpenPose and therefore able to process in real time with lower CPU and GPU requirements.
- Addition of the estimation of torso rotation relative to the pelvis and of head rotation relative to the shoulders.
- Distinction between abduction, adduction, extension, and flexion in the calculation of the angles between body segments.
- Person tracking no longer based on K-Means clustering, which was computationally heavy and not very accurate.
- Modified system architecture to ensure greater modularity and the ability to work even with a single camera shot.
- Resolution: 720p.
- Distortion-free lenses: wide-angle lenses should be avoided.
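The two requirements above can be captured in a small validation check. This is a minimal sketch, not code from the paper: the 90° field-of-view cutoff used to flag a lens as "wide angle" is an illustrative assumption, since the text only states that wide-angle lenses should be avoided.

```python
# Sketch: check a camera configuration against the stated requirements
# (at least 720p, no wide-angle lens). The 90-degree FOV cutoff is an
# assumed threshold for illustration only.

def meets_capture_requirements(width: int, height: int, fov_deg: float,
                               max_fov_deg: float = 90.0) -> bool:
    """Return True if the stream is at least 1280x720 and the lens
    field of view does not exceed the wide-angle cutoff."""
    return width >= 1280 and height >= 720 and fov_deg <= max_fov_deg

print(meets_capture_requirements(1280, 720, 70.0))    # True: standard lens, 720p
print(meets_capture_requirements(1920, 1080, 120.0))  # False: wide-angle lens
```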
2.1.1. Data Collection
- x: horizontal coordinate.
- y: vertical coordinate.
- c: confidence index in the range [0, 1].
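Given keypoints in this (x, y, c) format, a joint angle between two body segments can be computed from three keypoints, discarding low-confidence detections. The sketch below is illustrative, not the paper's implementation; the 0.3 confidence threshold is an assumed value.

```python
import math

# Sketch of a joint-angle computation from three (x, y, c) keypoints.
# The vertex b is the joint itself (e.g., the elbow for keypoints 5-6-7).
# The 0.3 confidence threshold is an illustrative assumption.

def joint_angle(a, b, c, min_conf=0.3):
    """Angle at vertex b, in degrees, between segments b->a and b->c,
    or None if any keypoint's confidence is too low."""
    if min(a[2], b[2], c[2]) < min_conf:
        return None
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    if n1 == 0 or n2 == 0:
        return None  # degenerate: two keypoints coincide
    cos_t = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

# A fully extended limb (three collinear keypoints) gives 180 degrees.
print(joint_angle((0, 0, 0.9), (0, 1, 0.9), (0, 2, 0.9)))  # 180.0
```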
2.1.2. Parameter Calculation
2.2. Experimental Case Study
2.2.1. Experimental Procedure
- T-pose: the subjects stand upright, with the feet placed symmetrically and slightly apart, and the arms fully extended.
- Seated: the subjects sit on a 70 cm high stool, with the back straight, hands resting on the knees, and feet on the floor.
- Standing Relaxed: the subjects stand comfortably, facing straight ahead, with the feet placed symmetrically and slightly apart.
- Reach: the subjects stand upright, feet well apart, with the arms stretched forward, simulating the act of grasping an object placed above their head.
- Pick up: the subjects pick up a box (dimensions 30.5 cm × 21 cm × 10.5 cm, weight 5 kg) from the floor and raise it in front of them, keeping it at pelvis level.
2.2.2. Data Analysis
3. Results
4. Discussion
Study Limitations
5. Conclusions
- A large number of expensive cameras, placed around the workspace in a way that is impracticable in a real work environment.
- A preliminary calibration procedure.
- Wearable markers, whose invasiveness may compromise the quality of the measurement.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Badri, A.; Boudreau-Trudel, B.; Souissi, A.S. Occupational health and safety in the industry 4.0 era: A cause for major concern? Saf. Sci. 2018, 109, 403–411.
- European Agency for Safety and Health at Work. Work-Related Musculoskeletal Disorders: Prevalence, Costs and Demographics in the EU. EU-OSHA. Available online: https://osha.europa.eu/en/publications/msds-facts-and-figures-overview-prevalence-costs-and-demographics-msds-europe/view (accessed on 5 July 2021).
- European Commission. The 2015 Ageing Report: Economic and Budgetary Projections for the 28 EU Member States. Available online: https://ec.europa.eu/economy_finance/publications/european_economy/2015/pdf/ee3_en.pdf (accessed on 5 July 2021).
- Ilmarinen, J. Physical requirements associated with the work of aging workers in the European Union. Exp. Aging Res. 2002, 28, 7–23.
- Kenny, G.P.; Groeller, H.; McGinn, R.; Flouris, A.D. Age, human performance, and physical employment standards. Appl. Physiol. Nutr. Metab. 2016, 41, S92–S107.
- Battini, D.; Persona, A.; Sgarbossa, F. Innovative real-time system to integrate ergonomic evaluations into warehouse design and management. Comput. Ind. Eng. 2014, 77, 1–10.
- Mengoni, M.; Ceccacci, S.; Generosi, A.; Leopardi, A. Spatial Augmented Reality: An application for human work in smart manufacturing environment. Procedia Manuf. 2018, 17, 476–483.
- Vignais, N.; Miezal, M.; Bleser, G.; Mura, K.; Gorecky, D.; Marin, F. Innovative system for real-time ergonomic feedback in industrial manufacturing. Appl. Ergon. 2013, 44, 566–574.
- Lowe, B.D.; Dempsey, P.G.; Jones, E.M. Ergonomics assessment methods used by ergonomics professionals. Appl. Ergon. 2019, 81, 10.
- Ceccacci, S.; Matteucci, M.; Peruzzini, M.; Mengoni, M. A multipath methodology to promote ergonomics, safety and efficiency in agile factories. Int. J. Agil. Syst. Manag. 2019, 12, 407–436.
- Snook, S.H.; Ciriello, V.M. The design of manual handling tasks: Revised tables of maximum acceptable weights and forces. Ergonomics 1991, 34, 1197–1213.
- McAtamney, L.; Corlett, E.N. RULA: A survey method for the investigation of work-related upper limb disorders. Appl. Ergon. 1993, 24, 91–99.
- Hignett, S.; McAtamney, L. Rapid entire body assessment (REBA). Appl. Ergon. 2000, 31, 201–205.
- Moore, J.S.; Garg, A. The strain index: A proposed method to analyze jobs for risk of distal upper extremity disorders. Am. Ind. Hyg. Assoc. J. 1995, 56, 443.
- Occhipinti, E. OCRA: A concise index for the assessment of exposure to repetitive movements of the upper limbs. Ergonomics 1998, 41, 1290–1311.
- Burdorf, A.; Derksen, J.; Naaktgeboren, B.; Van Riel, M. Measurement of trunk bending during work by direct observation and continuous measurement. Appl. Ergon. 1992, 23, 263–267.
- Fagarasanu, M.; Kumar, S. Measurement instruments and data collection: A consideration of constructs and biases in ergonomics research. Int. J. Ind. Ergon. 2002, 30, 355–369.
- Altieri, A.; Ceccacci, S.; Talipu, A.; Mengoni, M. A Low Cost Motion Analysis System Based on RGB Cameras to Support Ergonomic Risk Assessment in Real Workplaces. In Proceedings of the ASME 2020 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, American Society of Mechanical Engineers Digital Collection, St. Louis, MO, USA, 17–19 August 2020.
- De Magistris, G.; Micaelli, A.; Evrard, P.; Andriot, C.; Savin, J.; Gaudez, C.; Marsot, J. Dynamic control of DHM for ergonomic assessments. Int. J. Ind. Ergon. 2013, 43, 170–180.
- Xsens. Available online: https://www.xsens.com/motion-capture (accessed on 5 July 2021).
- Vicon Blue Trident. Available online: https://www.vicon.com/hardware/blue-trident/ (accessed on 5 July 2021).
- Vicon Nexus. Available online: https://www.vicon.com/software/nexus/ (accessed on 5 July 2021).
- Optitrack. Available online: https://optitrack.com/ (accessed on 5 July 2021).
- Manghisi, V.M.; Uva, A.E.; Fiorentino, M.; Gattullo, M.; Boccaccio, A.; Evangelista, A. Automatic Ergonomic Postural Risk Monitoring on the Factory Shopfloor‒The Ergosentinel Tool. Procedia Manuf. 2020, 42, 97–103.
- Schall, M.C., Jr.; Sesek, R.F.; Cavuoto, L.A. Barriers to the Adoption of Wearable Sensors in the Workplace: A Survey of Occupational Safety and Health Professionals. Hum. Factors 2018, 60, 351–362.
- Aitpayev, K.; Gaber, J. Collision Avatar (CA): Adding collision objects for human body in augmented reality using Kinect. In Proceedings of the 2012 6th International Conference on Application of Information and Communication Technologies (AICT), Tbilisi, Georgia, 17–19 October 2012; pp. 1–4.
- Bian, Z.P.; Chau, L.P.; Magnenat-Thalmann, N. Fall detection based on skeleton extraction. In Proceedings of the 11th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry, New York, NY, USA, 2–4 December 2012; pp. 91–94.
- Chang, C.Y.; Lange, B.; Zhang, M.; Koenig, S.; Requejo, P.; Somboon, N.; Rizzo, A.A. Towards pervasive physical rehabilitation using Microsoft Kinect. In Proceedings of the 2012 6th International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth) and Workshops, San Diego, CA, USA, 21–24 May 2012; pp. 159–162.
- Farhadi-Niaki, F.; GhasemAghaei, R.; Arya, A. Empirical study of a vision-based depth-sensitive human-computer interaction system. In Proceedings of the 10th Asia Pacific Conference on Computer Human Interaction, New York, NY, USA, 28–31 August 2012; pp. 101–108.
- Villaroman, N.; Rowe, D.; Swan, B. Teaching natural user interaction using OpenNI and the Microsoft Kinect sensor. In Proceedings of the 2011 Conference on Information Technology Education, New York, NY, USA, 20–22 December 2011; pp. 227–232.
- Diego-Mas, J.A.; Alcaide-Marzal, J. Using Kinect sensor in observational methods for assessing postures at work. Appl. Ergon. 2014, 45, 976–985.
- Manghisi, V.M.; Uva, A.E.; Fiorentino, M.; Bevilacqua, V.; Trotta, G.F.; Monno, G. Real time RULA assessment using Kinect v2 sensor. Appl. Ergon. 2017, 65, 481–491.
- Marinello, F.; Pezzuolo, A.; Simonetti, A.; Grigolato, S.; Boscaro, B.; Mologni, O.; Gasparini, F.; Cavalli, R.; Sartori, L. Tractor cabin ergonomics analyses by means of Kinect motion capture technology. Contemp. Eng. Sci. 2015, 8, 1339–1349.
- Clark, R.A.; Pua, Y.H.; Fortin, K.; Ritchie, C.; Webster, K.E.; Denehy, L.; Bryant, A.L. Validity of the Microsoft Kinect for assessment of postural control. Gait Posture 2012, 36, 372–377.
- Bonnechere, B.; Jansen, B.; Salvia, P.; Bouzahouene, H.; Omelina, L.; Moiseev, F.; Sholukha, V.; Cornelis, J.; Rooze, M.; Van Sint Jan, S. Validity and reliability of the Kinect within functional assessment activities: Comparison with standard stereophotogrammetry. Gait Posture 2014, 39, 593–598.
- Plantard, P.; Auvinet, E.; Le Pierres, A.S.; Multon, F. Pose Estimation with a Kinect for Ergonomic Studies: Evaluation of the Accuracy Using a Virtual Mannequin. Sensors 2015, 15, 1785–1803.
- Patrizi, A.; Pennestrì, E.; Valentini, P.P. Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics. Ergonomics 2015, 59, 155–162.
- Plantard, P.; Shum, H.P.H.; Le Pierres, A.S.; Multon, F. Validation of an ergonomic assessment method using Kinect data in real workplace conditions. Appl. Ergon. 2017, 65, 562–569.
- Xu, X.; McGorry, R.W. The validity of the first and second generation Microsoft Kinect™ for identifying joint center locations during static postures. Appl. Ergon. 2015, 49, 47–54.
- Schroder, Y.; Scholz, A.; Berger, K.; Ruhl, K.; Guthe, S.; Magnor, M. Multiple kinect studies. Comput. Graph. 2011, 2, 6.
- Zhang, H.; Yan, X.; Li, H. Ergonomic posture recognition using 3D view-invariant features from single ordinary camera. Autom. Constr. 2018, 94, 1–10.
- Cao, Z.; Hidalgo, G.; Simon, T.; Wei, S.E.; Sheikh, Y. OpenPose: Realtime multi-person 2D pose estimation using Part Affinity Fields. IEEE Trans. Pattern Anal. Mach. Intell. 2019, 43, 172–186.
- Cao, Z.; Simon, T.; Wei, S.E.; Sheikh, Y. Realtime multi-person 2D pose estimation using part affinity fields. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 7291–7299.
- Bradski, G. The OpenCV Library. Dr. Dobb's J. Softw. Tools 2000, 25, 120–123.
- Ota, M.; Tateuchi, H.; Hashiguchi, T.; Kato, T.; Ogino, Y.; Yamagata, M.; Ichihashi, N. Verification of reliability and validity of motion analysis systems during bilateral squat using human pose tracking algorithm. Gait Posture 2020, 80, 62–67.
- Wang, H.L.; Lee, Y.J. Occupational evaluation with Rapid Entire Body Assessment (REBA) via imaging processing in field. In Proceedings of the Human Factors Society Conference, Elsinore, Denmark, 25–28 July 2019.
- Li, L.; Martin, T.; Xu, X. A novel vision-based real-time method for evaluating postural risk factors associated with musculoskeletal disorders. Appl. Ergon. 2020, 87, 103138.
- Massiris Fernández, M.; Fernández, J.Á.; Bajo, J.M.; Delrieux, C.A. Ergonomic risk assessment based on computer vision and machine learning. Comput. Ind. Eng. 2020, 149, 10.
- Ojelaide, A.; Paige, F. Construction worker posture estimation using OpenPose. Constr. Res. Congr. 2020.
- Da Silva Neto, J.G.; Teixeira, J.M.X.N.; Teichrieb, V. Analyzing embedded pose estimation solutions for human behaviour understanding. In Anais Estendidos do XXII Simpósio de Realidade Virtual e Aumentada; SBC: Porto Alegre, Brazil, 2020; pp. 30–34.
- TF-Pose. Available online: https://github.com/tryagainconcepts/tf-pose-estimation (accessed on 5 July 2021).
- Lindera. Available online: https://www.lindera.de/technologie/ (accessed on 5 July 2021).
- Obuchi, M.; Hoshino, Y.; Motegi, K.; Shiraishi, Y. Human Behavior Estimation by using Likelihood Field. In Proceedings of the International Conference on Mechanical, Electrical and Medical Intelligent System, Gunma, Japan, 4–6 December 2021.
- Agrawal, Y.; Shah, Y.; Sharma, A. Implementation of Machine Learning Technique for Identification of Yoga Poses. In Proceedings of the 2020 IEEE 9th International Conference on Communication Systems and Network Technologies (CSNT), Gwalior, India, 10–12 April 2020; pp. 40–43.
- Contini, R. Body Segment Parameters, Part II. Artif. Limbs 1972, 16, 1–19.
- Romero, J.; Kjellström, H.; Ek, C.H.; Kragic, D. Non-parametric hand pose estimation with object context. Image Vis. Comput. 2013, 31, 555–564.
- Wu, Z.; Hoang, D.; Lin, S.Y.; Xie, Y.; Chen, L.; Lin, Y.Y.; Fan, W. MM-Hand: 3D-aware multi-modal guided hand generative network for 3D hand pose synthesis. arXiv 2020, arXiv:2010.01158.
| Angle | Keypoints (Left) | Keypoints (Right) | Frontal View | Lateral View |
|---|---|---|---|---|
| Neck flexion/extension | 17-1-11 | 16-1-8 | | X |
| Shoulder abduction | 11-5-6 | 8-2-3 | X | |
| Shoulder flexion/extension | 11-5-6 | 8-2-3 | | X |
| Elbow flexion/extension angle | 5-6-7 | 2-3-4 | X | X |
| Trunk flexion/extension | 1-11-12 | 1-8-9 | | X |
| Knee bending angle | 11-12-13 | 8-9-10 | | X |
Pairs of Angles Compared between the Two Systems.

| Vicon | RGB-MAS |
|---|---|
| Average between L and R neck flexion/extension | Neck flexion/extension |
| L/R shoulder abduction/adduction, Y component | L/R shoulder abduction |
| L/R shoulder abduction/adduction, X component | L/R shoulder flexion/extension |
| L/R elbow flexion/extension | L/R elbow flexion/extension |
| Average between L and R spine flexion/extension | Trunk flexion/extension |
| L/R knee flexion/extension | L/R knee bending angle |
RMSE RGB-MAS vs. Vicon [°]

| Angle | T-Pose | Seated | Standing Relaxed | Reach |
|---|---|---|---|---|
| Neck flexion/extension | 6.83 | 16.47 | 19.21 | 9.58 |
| Left shoulder abduction | 12.66 | 45.16 | 13.07 | 45.46 |
| Right shoulder abduction | 11.64 | 50.66 | 7.93 | 43.23 |
| Left shoulder flexion/extension | 27.86 | 57.19 | 21.53 | 71.29 |
| Right shoulder flexion/extension | 33.73 | 52.90 | 57.88 | 82.93 |
| Left elbow flexion/extension | 7.13 | 16.90 | 21.15 | 27.05 |
| Right elbow flexion/extension | 5.46 | 13.19 | 22.11 | 53.30 |
| Trunk flexion/extension | 0.35 | 8.61 | 0.91 | 2.95 |
| Left knee flexion/extension | 2.39 | 46.25 | 7.38 | 24.76 |
| Right knee flexion/extension | 0.21 | 4.79 | 0.07 | 0.12 |
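The RMSE values reported above are root-mean-square differences between the per-frame angle series measured by the two systems. A minimal sketch of the metric, assuming the two streams have already been time-aligned and resampled to matching frames:

```python
import math

# Sketch of the RMSE metric used in the comparison tables: the root-mean-square
# difference between two aligned per-frame angle series, in degrees.
# Temporal alignment of the two systems' streams is assumed to be done already.

def rmse(series_a, series_b):
    if len(series_a) != len(series_b) or not series_a:
        raise ValueError("series must be non-empty and equally long")
    return math.sqrt(
        sum((a - b) ** 2 for a, b in zip(series_a, series_b)) / len(series_a)
    )

print(rmse([10.0, 20.0], [13.0, 16.0]))  # sqrt((9 + 16) / 2) = 3.5355...
```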
RMSE RGB-MAS vs. Manual [°]

| Angle | T-Pose | Seated | Standing Relaxed | Reach | Pick Up |
|---|---|---|---|---|---|
| Neck flexion/extension | 9.19 | 13.10 | 19.63 | 6.87 | 28.05 |
| Left shoulder abduction | 6.90 | 44.13 | 7.54 | 49.68 | 8.03 |
| Right shoulder abduction | 6.67 | 47.00 | 7.82 | 51.65 | 7.55 |
| Left shoulder flexion/extension | 32.62 | 53.84 | 12.03 | 82.10 | 30.43 |
| Right shoulder flexion/extension | 50.91 | 50.70 | 70.02 | 71.31 | 28.01 |
| Left elbow flexion/extension | 3.60 | 15.03 | 14.21 | 36.16 | 15.83 |
| Right elbow flexion/extension | 2.20 | 9.67 | 18.63 | 25.45 | 10.26 |
| Trunk flexion/extension | 3.48 | 30.06 | 5.46 | 20.71 | 33.46 |
| Left knee flexion/extension | 3.81 | 53.12 | 6.82 | 22.73 | 30.10 |
| Right knee flexion/extension | 3.95 | 20.69 | 1.94 | 22.63 | 29.19 |

RMSE Vicon vs. Manual [°]

| Angle | T-Pose | Seated | Standing Relaxed | Reach |
|---|---|---|---|---|
| Neck flexion/extension | 8.26 | 15.47 | 8.64 | 8.32 |
| Left shoulder abduction | 7.29 | 7.52 | 13.00 | 38.11 |
| Right shoulder abduction | 6.35 | 6.54 | 7.09 | 38.08 |
| Left shoulder flexion/extension | 21.08 | 16.19 | 19.55 | 101.52 |
| Right shoulder flexion/extension | 23.14 | 15.66 | 21.43 | 110.20 |
| Left elbow flexion/extension | 6.21 | 7.08 | 15.56 | 16.35 |
| Right elbow flexion/extension | 5.60 | 17.97 | 10.35 | 39.00 |
| Trunk flexion/extension | 3.28 | 33.60 | 4.59 | 17.93 |
| Left knee flexion/extension | 1.98 | 18.85 | 2.68 | 8.39 |
| Right knee flexion/extension | 3.75 | 21.24 | 1.90 | 22.52 |
Median

| Posture | Manual (Left) | Manual (Right) | RGB-MAS (Left) | RGB-MAS (Right) | Vicon (Left) | Vicon (Right) |
|---|---|---|---|---|---|---|
| T-Pose | 3.00 | 3.00 | 3.00 | 3.00 | 3.00 | 3.00 |
| Relaxed | 2.50 | 2.50 | 3.00 | 3.00 | 3.00 | 3.00 |
| Sit | 4.00 | 4.00 | 4.00 | 4.00 | 4.00 | 4.00 |
| Reach | 4.50 | 4.50 | 4.00 | 4.00 | 3.00 | 3.00 |
| Pickup | 5.50 | 5.50 | 6.00 | 6.00 | - | - |
RULA RMSE (SD)

| Posture | RGB-MAS vs. Manual (Left) | RGB-MAS vs. Manual (Right) | Vicon vs. Manual (Left) | Vicon vs. Manual (Right) |
|---|---|---|---|---|
| T-Pose | 0.00 (0.58) | 1.00 (0.75) | 0.41 (0.37) | 0.41 (0.37) |
| Relaxed | 0.71 (0.00) | 2.45 (0.37) | 0.82 (0.37) | 0.82 (0.37) |
| Sit | 0.58 (0.75) | 1.41 (0.76) | 0.71 (0.69) | 0.82 (0.58) |
| Reach | 1.35 (1.07) | 1.35 (0.82) | 1.78 (0.76) | 1.78 (0.76) |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Agostinelli, T.; Generosi, A.; Ceccacci, S.; Khamaisi, R.K.; Peruzzini, M.; Mengoni, M. Preliminary Validation of a Low-Cost Motion Analysis System Based on RGB Cameras to Support the Evaluation of Postural Risk Assessment. Appl. Sci. 2021, 11, 10645. https://doi.org/10.3390/app112210645