A Narrative Review on Wearable Inertial Sensors for Human Motion Tracking in Industrial Scenarios
Abstract
1. Introduction
- Cell. The robot is confined in a traditional safety cage far away from the human, so this is not a real cooperating scenario.
- Coexistence. The human and the robot work alongside each other but they do not share a workspace.
- Synchronized. The human and the robot share a workspace, but only one of the interaction partners is present in the workspace at a time.
- Cooperation. Both the human and the robot perform tasks at the same time in the shared workspace, but they do not work simultaneously on the same product or component.
- Collaboration. The human and the robot work simultaneously on the same product or component.
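The five levels above can be read as a decision tree over three properties of the scenario: whether a workspace is shared, whether human and robot act at the same time, and whether they act on the same product or component. A minimal sketch encoding that logic (the function name and boolean flags are illustrative, not from the source):

```python
from enum import Enum

class HRCLevel(Enum):
    CELL = "cell"
    COEXISTENCE = "coexistence"
    SYNCHRONIZED = "synchronized"
    COOPERATION = "cooperation"
    COLLABORATION = "collaboration"

def classify(caged: bool, shared_workspace: bool,
             simultaneous_presence: bool, same_product: bool) -> HRCLevel:
    """Map a human-robot scenario onto the five interaction levels."""
    if caged:
        return HRCLevel.CELL          # robot fenced off, no real cooperation
    if not shared_workspace:
        return HRCLevel.COEXISTENCE   # side by side, separate workspaces
    if not simultaneous_presence:
        return HRCLevel.SYNCHRONIZED  # shared workspace, one partner at a time
    if not same_product:
        return HRCLevel.COOPERATION   # simultaneous tasks, different components
    return HRCLevel.COLLABORATION     # simultaneous work on the same component
```

Each `if` mirrors one bullet of the taxonomy, so the distinctions between adjacent levels reduce to flipping a single flag.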
2. Materials and Methods
- Number of publications per year;
- Aim of the work;
- Body district involved in the motion tracking;
- Number of adopted MIMUs/IMUs;
- Presence/absence of a technology combined with MIMUs/IMUs;
- Presence/absence of a real-time analysis;
- Inclusion/exclusion of the magnetometer in the sensor fusion process.
Study | Year | Aim | Body District | Number of MIMUs/IMUs | Technology | Real-Time | Magnetometer |
---|---|---|---|---|---|---|---|
Huang C. [15] | 2020 | Risk assessment | Total body | 17 | MIMUs | Yes | Yes |
Peppoloni L. [16] | 2016 | Risk assessment | Upper limb | 3 | MIMUs + EMGs | Yes | Yes |
Giannini P. [17] | 2020 | Risk assessment | Total body | 11 | MIMUs + EMGs | Yes | Yes |
Monaco M.G.L. [18] | 2019 | Risk assessment | Upper body | 8 | MIMUs + EMGs | No | Yes |
Santos S. [19] | 2020 | Risk assessment | Upper body | 4 | MIMUs | Yes | Yes |
Humadi A. [20] | 2021 | Risk assessment | Upper body | 17 | MIMUs | No | Yes |
Peppoloni L. [21] | 2014 | Risk assessment | Upper limb | 4 | MIMUs | Yes | Yes |
Yan X. [22] | 2017 | Risk assessment | Upper body | 2 | MIMUs | Yes | Yes |
Merino G. [23] | 2019 | Risk assessment | Total body | 17 | IMUs | No | No |
Chan Y. [24] | 2022 | Risk assessment | Upper body | 6 | MIMUs | No | Yes |
Fletcher S.R. [25] | 2018 | Risk assessment | Total body | 17 | MIMUs | Yes | Yes |
Li J. [26] | 2018 | Risk assessment | Total body | 7 | MIMUs | Yes | Yes |
Caputo F. [27] | 2019 | Risk assessment | Upper body | 6 | MIMUs | No | Yes |
Nunes M.L. [28] | 2022 | Risk assessment | Upper body | 7 | MIMUs | No | Yes |
Martinez K. [29] | 2022 | Risk assessment | Total body | 9 | MIMUs | Yes | Yes |
Hubaut R. [30] | 2022 | Risk assessment | Upper body | 4 | IMUs + EMGs | No | No |
Colim A. [31] | 2021 | Risk assessment | Upper body | 11 | MIMUs | No | Yes |
Schall M.C. [32] | 2021 | Risk assessment | Upper body | 4 | IMUs | No | No |
Olivas-Padilla B. [33] | 2021 | Risk assessment | Total body | 52 | MIMUs | No | Yes |
Winiarski S. [34] | 2021 | Risk assessment | Total body | 16 | MIMUs | No | Yes |
Zhang J. [35] | 2020 | Collaborative robotics | Upper body | 5 | MIMUs + vision | Yes | Yes |
Ates G. [36] | 2021 | Collaborative robotics | Upper body | 5 | MIMUs | No | Yes |
Skulj G. [37] | 2021 | Collaborative robotics | Upper body | 5 | IMUs | Yes | No |
Wang W. [38] | 2019 | Collaborative robotics | Upper limb | 1 | IMUs + EMGs | Yes | No |
Sekhar R. [39] | 2012 | Collaborative robotics | Upper limb | 1 | IMUs | Yes | No |
Chico A. [40] | 2021 | Collaborative robotics | Upper limb | 1 | MIMUs + EMGs | Yes | Yes |
Tao Y. [41] | 2018 | Collaborative robotics | Upper limb | 6 | MIMUs | No | Yes |
Al-Yacoub A. [42] | 2020 | Collaborative robotics | Upper body | 1 | IMUs + EMGs + vision | Yes | No |
Tortora S. [43] | 2019 | Collaborative robotics | Upper limb | 2 | IMUs + EMGs | Yes | No |
Resende A. [44] | 2021 | Collaborative robotics | Upper body | 9 | MIMUs | Yes | Yes |
Amorim A. [45] | 2021 | Collaborative robotics | Upper limb | 1 | MIMUs + vision | Yes | Yes |
Pellois R. [46] | 2018 | Collaborative robotics | Upper limb | 2 | IMUs | No | No |
Grapentin A. [47] | 2020 | Collaborative robotics | Hand | 6 | IMUs | Yes | No |
Bright T. [48] | 2021 | Collaborative robotics | Hand | 15 | IMUs | No | No |
Digo E. [49] | 2022 | Collaborative robotics | Upper limb | 2 | IMUs | Yes | No |
Lin C.J. [50] | 2022 | Collaborative robotics | Upper limb | 3 | MIMUs + EMGs | Yes | Yes |
Rosso V. [51] | 2022 | Collaborative robotics | Upper limb | 1 | IMUs | No | No |
Tuli T.B. [52] | 2022 | Collaborative robotics | Upper limb | 3 | MIMUs + vision | Yes | Yes |
Tarabini M. [53] | 2018 | Tracking in industry | Upper body | 6 | MIMUs + vision | Yes | Yes |
Tarabini M. [54] | 2018 | Tracking in industry | Upper body | 6 | MIMUs + vision | No | Yes |
Caputo F. [55] | 2018 | Tracking in industry | Total body | 10 | MIMUs | No | Yes |
Digo E. [56] | 2022 | Tracking in industry | Upper body | 3 | IMUs | Yes | No |
Borghetti M. [57] | 2020 | Tracking in industry | Hand | 2 | MIMUs | No | Yes |
Bellitti P. [58] | 2019 | Tracking in industry | Hand | 2 | MIMUs | No | Yes |
Fang W. [59] | 2017 | Tracking in industry | Head | 1 | IMUs + vision | Yes | No |
Manns M. [60] | 2021 | Action recognition | Total body | 8 | MIMUs | Yes | Yes |
Al-Amin M. [61] | 2019 | Action recognition | Upper body | 2 | MIMUs + EMGs + vision | Yes | Yes |
Al-Amin M. [62] | 2022 | Action recognition | Upper limb | 2 | MIMUs | No | Yes |
Kubota A. [63] | 2019 | Action recognition | Upper limb | 1 | IMUs + EMGs + vision | No | No |
Calvo A.F. [64] | 2018 | Action recognition | Total body | 4 | MIMUs + EMGs + vision | Yes | Yes |
Antonelli M. [65] | 2021 | Action recognition | Upper body | 4 | IMUs | No | No |
Digo E. [66] | 2020 | Other | Upper body | 7 | MIMUs + vision | No | Yes |
Maurice P. [67] | 2019 | Other | Total body | 17 | MIMUs + vision | No | Yes |
Li J. [68] | 2017 | Other | Hand | 10 | MIMUs | Yes | Yes |
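The tabulated studies can be summarized along the criteria listed in Section 2. A minimal sketch of such a tally, using a small subset of the table rows copied by hand for illustration only:

```python
from collections import Counter

# A hand-copied subset of the table rows (illustrative, not the full review).
# Each tuple: (first author, year, aim, real_time)
studies = [
    ("Huang C.", 2020, "Risk assessment", True),
    ("Merino G.", 2019, "Risk assessment", False),
    ("Zhang J.", 2020, "Collaborative robotics", True),
    ("Skulj G.", 2021, "Collaborative robotics", True),
    ("Fang W.", 2017, "Tracking in industry", True),
    ("Manns M.", 2021, "Action recognition", True),
    ("Maurice P.", 2019, "Other", False),
]

# Count studies per aim and per real-time capability in this subset.
aims = Counter(aim for _, _, aim, _ in studies)
real_time = Counter("real-time" if rt else "offline" for *_, rt in studies)

print(aims)       # studies per aim in this subset
print(real_time)  # real-time vs offline analyses in this subset
```

Extending `studies` to all rows of the table reproduces the per-criterion counts discussed in Section 3.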
3. Results and Discussion
3.1. Number of Publications per Year
3.2. Aim of the Work
3.3. Involved Body District and Number of Adopted MIMUs/IMUs
3.4. Presence/Absence of a Technology Combined with MIMUs/IMUs
3.5. Presence/Absence of a Real-Time Analysis
3.6. Inclusion/Exclusion of the Magnetometer in the Sensor Fusion Process
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Conflicts of Interest
References
- Kagermann, H.; Wahlster, W.; Helbig, J. Recommendations for Implementing the Strategic Initiative INDUSTRIE 4.0—Securing the Future of German Manufacturing Industry; Forschungsunion: Berlin, Germany, 2013. [Google Scholar]
- Hermann, M.; Pentek, T.; Otto, B. Design principles for industrie 4.0 scenarios. In Proceedings of the Annual Hawaii International Conference on System Sciences, Big Island, HI, USA, 5–8 January 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 3928–3937. [Google Scholar]
- Merkel, L.; Berger, C.; Schultz, C.; Braunreuther, S.; Reinhart, G. Application-specific design of assistance systems for manual work in production. In Proceedings of the IEEE International Conference on Industrial Engineering and Engineering Management, Bangkok, Thailand, 16–19 December 2018; pp. 1189–1193. [Google Scholar]
- Korhan, O.; Memon, A.A. Introductory chapter: Work-related musculoskeletal disorders. In Work-Related Musculoskeletal Disorders; IntechOpen: London, UK, 2019. [Google Scholar]
- Kim, I.J. The Role of Ergonomics for Construction Industry Safety and Health Improvements. J. Ergon. 2017, 7, 2–5. [Google Scholar] [CrossRef]
- Roy, S.; Edan, Y. Investigating Joint-Action in Short-Cycle Repetitive Handover Tasks: The Role of Giver Versus Receiver and its Implications for Human-Robot Collaborative System Design. Int. J. Soc. Robot. 2020, 12, 973–988. [Google Scholar] [CrossRef]
- International Federation of Robotics: Executive Summary World Robotics 2021 Industrial Robots. Available online: https://ifr.org/img/worldrobotics/Executive_Summary_WR_Industrial_Robots_2021.pdf (accessed on 29 November 2022).
- Villani, V.; Pini, F.; Leali, F.; Secchi, C. Survey on human–robot collaboration in industrial settings: Safety, intuitive interfaces and applications. Mechatronics 2018, 55, 248–266. [Google Scholar] [CrossRef]
- Tsarouchi, P.; Makris, S.; Chryssolouris, G. Human–robot interaction review and challenges on task planning and programming. Int. J. Comput. Integr. Manuf. 2016, 29, 916–931. [Google Scholar] [CrossRef]
- Bauer, W.; Bender, M.; Braun, M.; Rally, P.; Scholtz, O. Lightweight Robots in Manual Assembly–Best to Start Simply; Frauenhofer-Institut für Arbeitswirtschaft und Organisation IAO: Stuttgart, Germany, 2016. [Google Scholar]
- ISO/TS 15066:2016; Robots and Robotic Devices-Collaborative Robots. ISO: Geneva, Switzerland, 2016.
- Tsarouchi, P.; Michalos, G.; Makris, S.; Athanasatos, T.; Dimoulas, K.; Chryssolouris, G. On a human–robot workplace design and task allocation system. Int. J. Comput. Integr. Manuf. 2017, 30, 1272–1279. [Google Scholar] [CrossRef]
- Digo, E.; Antonelli, M.; Cornagliotto, V.; Pastorelli, S.; Gastaldi, L. Collection and Analysis of Human Upper Limbs Motion Features for Collaborative Robotic Applications. Robotics 2020, 9, 33. [Google Scholar] [CrossRef]
- Caruso, M.; Sabatini, A.M.; Laidig, D.; Seel, T.; Knaflitz, M.; Della Croce, U.; Cereatti, A. Analysis of the Accuracy of Ten Algorithms for Orientation Estimation Using Inertial and Magnetic Sensing under Optimal Conditions: One Size Does Not Fit All. Sensors 2021, 21, 2543. [Google Scholar] [CrossRef]
- Huang, C.; Kim, W.; Zhang, Y.; Xiong, S. Development and validation of a wearable inertial sensors-based automated system for assessing work-related musculoskeletal disorders in the workspace. Int. J. Environ. Res. Public Health 2020, 17, 6050. [Google Scholar] [CrossRef]
- Peppoloni, L.; Filippeschi, A.; Ruffaldi, E.; Avizzano, C.A. A novel wearable system for the online assessment of risk for biomechanical load in repetitive efforts. Int. J. Ind. Ergon. 2016, 52, 1–11. [Google Scholar] [CrossRef]
- Giannini, P.; Bassani, G.; Avizzano, C.A.; Filippeschi, A. Wearable Sensor Network for Biomechanical Overload Assessment in Manual Material Handling. Sensors 2020, 20, 3877. [Google Scholar] [CrossRef]
- Monaco, M.G.L.; Fiori, L.; Marchesi, A.; Greco, A.; Ghibaudo, L.; Spada, S.; Caputo, F.; Miraglia, N.; Silvetti, A.; Draicchio, F. Biomechanical overload evaluation in manufacturing: A novel approach with sEMG and inertial motion capture integration. In Congress of the International Ergonomics Association; Springer: Berlin/Heidelberg, Germany, 2019; pp. 719–726. [Google Scholar]
- Santos, S.; Folgado, D.; Rodrigues, J.; Mollaei, N.; Fujão, C.; Gamboa, H. Explaining the Ergonomic Assessment of Human Movement in Industrial Contexts. In Proceedings of the 13th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2020), Valletta, Malta, 24–26 February 2020; pp. 79–88. [Google Scholar]
- Humadi, A.; Nazarahari, M.; Ahmad, R.; Rouhani, H. Instrumented Ergonomic Risk Assessment Using Wearable Inertial Measurement Units: Impact of Joint Angle Convention. IEEE Access 2021, 9, 7293–7305. [Google Scholar] [CrossRef]
- Peppoloni, L.; Filippeschi, A.; Ruffaldi, E. Assessment of task ergonomics with an upper limb wearable device. In Proceedings of the 22nd Mediterranean Conference on Control and Automation (MED 2014), Palermo, Italy, 16–19 June 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 340–345. [Google Scholar]
- Yan, X.; Li, H.; Li, A.R.; Zhang, H. Wearable IMU-based real-time motion warning system for construction workers’ musculoskeletal disorders prevention. Autom. Constr. 2017, 74, 2–11. [Google Scholar] [CrossRef]
- Merino, G.; da Silva, L.; Mattos, D.; Guimarães, B.; Merino, E. Ergonomic evaluation of the musculoskeletal risks in a banana harvesting activity through qualitative and quantitative measures, with emphasis on motion capture (Xsens) and EMG. Int. J. Ind. Ergon. 2019, 69, 80–89. [Google Scholar] [CrossRef]
- Chan, Y.S.; Teo, Y.X.; Gouwanda, D.; Nurzaman, S.G.; Gopalai, A.A.; Thannirmalai, S. Musculoskeletal modelling and simulation of oil palm fresh fruit bunch harvesting. Sci. Rep. 2022, 12, 1–13. [Google Scholar] [CrossRef]
- Fletcher, S.R.; Johnson, T.L.; Thrower, J. A Study to Trial the Use of Inertial Non-Optical Motion Capture for Ergonomic Analysis of Manufacturing Work. Proc. Inst. Mech. Eng. 2018, 232, 90–98. [Google Scholar] [CrossRef]
- Li, J.; Lu, Y.; Nan, Y.; He, L.; Wang, X.; Niu, D. A Study on Posture Analysis of Assembly Line Workers in a Manufacturing Industry. Adv. Intell. Syst. Comput. 2018, 820, 380–386. [Google Scholar] [CrossRef]
- Caputo, F.; Greco, A.; D’Amato, E.; Notaro, I.; Spada, S. Imu-based motion capture wearable system for ergonomic assessment in industrial environment. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, San Diego, CA, USA, 16–20 July 2019; Springer: Berlin/Heidelberg, Germany, 2019; pp. 215–225. [Google Scholar]
- Nunes, M.L.; Folgado, D.; Fujao, C.; Silva, L.; Rodrigues, J.; Matias, P.; Barandas, M.; Carreiro, A.; Madeira, S.; Gamboa, H. Posture Risk Assessment in an Automotive Assembly Line using Inertial Sensors. IEEE Access 2022, 10, 83221–83235. [Google Scholar] [CrossRef]
- Beltran Martinez, K.; Nazarahari, M.; Rouhani, H. K-score: A novel scoring system to quantify fatigue-related ergonomic risk based on joint angle measurements via wearable inertial measurement units. Appl. Ergon. 2022, 102, 103757. [Google Scholar] [CrossRef]
- Hubaut, R.; Guichard, R.; Greenfield, J.; Blandeau, M. Validation of an Embedded Motion-Capture and EMG Setup for the Analysis of Musculoskeletal Disorder Risks during Manhole Cover Handling. Sensors 2022, 22, 436. [Google Scholar] [CrossRef]
- Colim, A.; Cardoso, A.; Arezes, P.; Braga, A.C.; Peixoto, A.C.; Peixoto, V.; Wolbert, F.; Carneiro, P.; Costa, N.; Sousa, N. Digitalization of Musculoskeletal Risk Assessment in a Robotic-Assisted Assembly Workstation. Safety 2021, 7, 74. [Google Scholar] [CrossRef]
- Schall, M.C.; Zhang, X.; Chen, H.; Gallagher, S.; Fethke, N.B. Comparing upper arm and trunk kinematics between manufacturing workers performing predominantly cyclic and non-cyclic work tasks. Appl. Ergon. 2021, 93, 103356. [Google Scholar] [CrossRef] [PubMed]
- Olivas-Padilla, B.E.; Manitsaris, S.; Menychtas, D.; Glushkova, A. Stochastic-biomechanic modeling and recognition of human movement primitives, in industry, using wearables. Sensors 2021, 21, 2497. [Google Scholar] [CrossRef] [PubMed]
- Winiarski, S.; Chomątowska, B.; Molek-Winiarska, D.; Sipko, T.; Dyvak, M. Added Value of Motion Capture Technology for Occupational Health and Safety Innovations. Hum. Technol. 2021, 17, 235–260. [Google Scholar] [CrossRef]
- Zhang, J.; Li, P.; Zhu, T.; Zhang, W.A.; Liu, S. Human Motion Capture Based on Kinect and IMUs and Its Application to Human-Robot Collaboration. In Proceedings of the 5th IEEE International Conference on Advanced Robotics and Mechatronics (ICARM 2020), Shenzhen, China, 18–21 December 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 392–397. [Google Scholar]
- Ateş, G.; Kyrkjebø, E. Human-Robot Cooperative Lifting Using IMUs and Human Gestures. In Proceedings of the Annual Conference Towards Autonomous Robotic Systems, Lincoln, UK, 8–10 September 2021; Springer: Cham, Switzerland, 2021; pp. 88–99. [Google Scholar]
- Škulj, G.; Vrabič, R.; Podržaj, P. A wearable IMU system for flexible teleoperation of a collaborative industrial robot. Sensors 2021, 21, 5871. [Google Scholar] [CrossRef] [PubMed]
- Wang, W.; Li, R.; Diekel, Z.M.; Chen, Y.; Zhang, Z.; Jia, Y. Controlling object hand-over in human-robot collaboration via natural wearable sensing. IEEE Trans. Hum.-Mach. Syst. 2019, 49, 59–71. [Google Scholar] [CrossRef]
- Sekhar, R.; Musalay, R.K.; Krishnamurthy, Y.; Shreenivas, B. Inertial sensor based wireless control of a robotic arm. In Proceedings of the 2012 IEEE International Conference on Emerging Signal Processing Applications (ESPA 2012), Las Vegas, NV, USA, 12–14 January 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 87–90. [Google Scholar]
- Chico, A.; Cruz, P.J.; Vásconez, J.P.; Benalcázar, M.E.; Álvarez, R.; Barona, L.; Valdivieso, Á.L. Hand Gesture Recognition and Tracking Control for a Virtual UR5 Robot Manipulator. In Proceedings of the 2021 IEEE Fifth Ecuador Technical Chapters Meeting (ETCM), Cuenca, Ecuador, 12–15 October 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–6. [Google Scholar]
- Tao, Y.; Fang, Z.; Ren, F.; Wang, T.; Deng, X.; Sun, B. A Method Based on Wearable Devices for Controlling Teaching of Robots for Human-robot Collaboration. In Proceedings of the 2018 Chinese Automation Congress (CAC), Xi’an, China, 30 November–2 December 2018; IEEE: Piscataway, NJ, USA, 2019; pp. 2270–2276. [Google Scholar]
- Al-Yacoub, A.; Buerkle, A.; Flanagan, M.; Ferreira, P.; Hubbard, E.-M.; Lohse, N. Effective Human-Robot Collaboration through Wearable Sensors. In Proceedings of the IEEE Symposium on Emerging Technologies and Factory Automation (ETFA 2020), Vienna, Austria, 8–11 September 2020; pp. 651–658. [Google Scholar]
- Tortora, S.; Michieletto, S.; Stival, F.; Menegatti, E. Fast human motion prediction for human-robot collaboration with wearable interface. In Proceedings of the 2019 IEEE International Conference on Cybernetics and Intelligent Systems (CIS) and IEEE Conference on Robotics, Automation and Mechatronics (RAM), Bangkok, Thailand, 18–20 November 2019; pp. 457–462. [Google Scholar]
- Resende, A.; Cerqueira, S.; Barbosa, J.; Damasio, E.; Pombeiro, A.; Silva, A.; Santos, C. Ergowear: An ambulatory, non-intrusive, and interoperable system towards a Human-Aware Human-robot Collaborative framework. In Proceedings of the 2021 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), Santa Maria da Feira, Portugal, 28–29 April 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 56–61. [Google Scholar]
- Amorim, A.; Guimarães, D.; Mendonça, T.; Neto, P.; Costa, P.; Moreira, A.P. Robust human position estimation in cooperative robotic cells. Robot. Comput. Integr. Manuf. 2021, 67, 102035. [Google Scholar] [CrossRef]
- Pellois, R.; Brüls, O. Human arm motion tracking using IMU measurements in a robotic environment. In Proceedings of the 21st IMEKO International Symposium on Measurements in Robotics (ISMCR 2018), Mons, Belgium, 26–28 September 2018. [Google Scholar]
- Grapentin, A.; Lehmann, D.; Zhupa, A.; Seel, T. Sparse Magnetometer-Free Real-Time Inertial Hand Motion Tracking. In Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, Karlsruhe, Germany, 14–16 September 2020; pp. 94–100. [Google Scholar]
- Bright, T.; Adali, S.; Bright, G. Close human robot collaboration by means of a low-cost sensory glove for advanced manufacturing systems. In Proceedings of the International Conference on Electrical, Computer, Communications and Mechatronics Engineering (ICECCME), Mauritius, 7–8 October 2021. [Google Scholar] [CrossRef]
- Digo, E.; Cereatti, A.; Gastaldi, L.; Pastorelli, S.; Caruso, M. Modeling and kinematic optimization of the human upper limb for collaborative robotics. In Proceedings of the 4th IFToMM Italy Conference (IFIT 2022), Naples, Italy, 7–9 September 2022; Springer: Berlin/Heidelberg, Germany, 2022. [Google Scholar]
- Lin, C.J.; Peng, H.Y. A study of the human-robot synchronous control based on IMU and EMG sensing of an upper limb. In Proceedings of the 13th Asian Control Conference (ASCC 2022), Jeju, Korea, 4–7 May 2022; pp. 1474–1479. [Google Scholar] [CrossRef]
- Rosso, V.; Gastaldi, L.; Pastorelli, S. Detecting Impulsive Movements to Increase Operators’ Safety in Manufacturing. In Mechanisms and Machine Science; Springer: Berlin/Heidelberg, Germany, 2022; Volume 108, pp. 174–181. [Google Scholar] [CrossRef]
- Tuli, T.B.; Manns, M.; Zeller, S. Human motion quality and accuracy measuring method for human–robot physical interactions. Intell. Serv. Robot. 2022, 15, 503–512. [Google Scholar] [CrossRef]
- Tarabini, M.; Marinoni, M.; Mascetti, M.; Marzaroli, P.; Corti, F.; Giberti, H.; Mascagni, P.; Villa, A.; Eger, T. Real-Time Monitoring of the Posture at the Workplace Using Low Cost Sensors. In Congress of the International Ergonomics Association; Springer: Berlin/Heidelberg, Germany, 2018; pp. 678–688. [Google Scholar]
- Tarabini, M.; Marinoni, M.; Mascetti, M.; Marzaroli, P.; Corti, F.; Giberti, H.; Villa, A.; Mascagni, P. Monitoring the human posture in industrial environment: A feasibility study. In Proceedings of the 2018 IEEE Sensors Applications Symposium (SAS 2018), Seoul, Korea, 12–14 March 2018; pp. 1–6. [Google Scholar]
- Caputo, F.; D’Amato, E.; Greco, A.; Notaro, I.; Spada, S. Human posture tracking system for industrial process design and assessment. Adv. Intell. Syst. Comput. 2018, 722, 450–455. [Google Scholar] [CrossRef]
- Digo, E.; Gastaldi, L.; Antonelli, M.; Pastorelli, S.; Cereatti, A.; Caruso, M. Real-time estimation of upper limbs kinematics with IMUs during typical industrial gestures. Procedia Comput. Sci. 2022, 200, 1041–1047. [Google Scholar] [CrossRef]
- Borghetti, M.; Bellitti, P.; Lopomo, N.F.; Serpelloni, M.; Sardini, E. Validation of a modular and wearable system for tracking fingers movements. Acta IMEKO 2020, 9, 157–164. [Google Scholar] [CrossRef]
- Bellitti, P.; Bona, M.; Borghetti, M.; Sardini, E.; Serpelloni, M. Application of a Modular Wearable System to Track Workers’ Fingers Movement in Industrial Environments. In Proceedings of the 2019 IEEE International Workshop on Metrology for Industry 4.0 and IoT (MetroInd 4.0 & IoT 2019), Naples, Italy, 4–6 June 2019; pp. 137–142. [Google Scholar]
- Fang, W.; Zheng, L.; Xu, J. Self-contained optical-inertial motion capturing for assembly planning in digital factory. Int. J. Adv. Manuf. Technol. 2017, 93, 1243–1256. [Google Scholar] [CrossRef]
- Manns, M.; Tuli, T.B.; Schreiber, F. Identifying human intention during assembly operations using wearable motion capturing systems including eye focus. Procedia CIRP 2021, 104, 924–929. [Google Scholar] [CrossRef]
- Al-Amin, M.; Tao, W.; Doell, D.; Lingard, R.; Yin, Z.; Leu, M.C.; Qin, R. Action recognition in manufacturing assembly using multimodal sensor fusion. Procedia Manuf. 2019, 39, 158–167. [Google Scholar] [CrossRef]
- Al-Amin, M.; Qin, R.; Tao, W.; Doell, D.; Lingard, R.; Yin, Z.; Leu, M.C. Fusing and refining convolutional neural network models for assembly action recognition in smart manufacturing. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 2022, 236, 2046–2059. [Google Scholar] [CrossRef]
- Kubota, A.; Iqbal, T.; Shah, J.A.; Riek, L.D. Activity recognition in manufacturing: The roles of motion capture and sEMG+inertial wearables in detecting fine vs. gross motion. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 6533–6539. [Google Scholar]
- Calvo, A.F.; Holguin, G.A.; Medeiros, H. Human Activity Recognition Using Multi-modal Data Fusion. In Iberoamerican Congress on Pattern Recognition; Springer: Cham, Switzerland, 2018; pp. 946–953. [Google Scholar]
- Antonelli, M.; Digo, E.; Pastorelli, S.; Gastaldi, L. Wearable MIMUs for the identification of upper limbs motion in an industrial context of human-robot interaction. In Proceedings of the International Conference on Informatics in Control, Automation and Robotics, Alsace, France, 21–23 July 2015; Springer: Berlin/Heidelberg, Germany, 2021. [Google Scholar]
- Digo, E.; Antonelli, M.; Pastorelli, S.; Gastaldi, L. Upper Limbs Motion Tracking for Collaborative Robotic Applications. In Human Interaction, Emerging Technologies and Future Applications III (IHIET 2020); Springer: Cham, Switzerland, 2020; pp. 391–397. [Google Scholar]
- Maurice, P.; Malaisé, A.; Amiot, C.; Paris, N.; Richard, G.-J.; Rochel, O.; Ivaldi, S. Human movement and ergonomics: An industry-oriented dataset for collaborative robotics. Int. J. Rob. Res. 2019, 38, 1529–1537. [Google Scholar] [CrossRef]
- Li, J.; Wang, Z.; Jiang, Y.; Qiu, S.; Wang, J.; Tang, K. Networked gesture tracking system based on immersive real-time interaction. In Proceedings of the 2017 IEEE 21st International Conference on Computer Supported Cooperative Work in Design, CSCWD 2017, Wellington, New Zealand, 26–28 April 2017; pp. 139–144. [Google Scholar]
- Halme, R.J.; Lanz, M.; Kämäräinen, J.; Pieters, R.; Latokartano, J.; Hietanen, A. Review of vision-based safety systems for human-robot collaboration. Procedia CIRP 2018, 72, 111–116. [Google Scholar] [CrossRef]
- Ligorio, G.; Sabatini, A.M. Dealing with magnetic disturbances in human motion capture: A survey of techniques. Micromachines 2016, 7, 43. [Google Scholar] [CrossRef]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Digo, E.; Pastorelli, S.; Gastaldi, L. A Narrative Review on Wearable Inertial Sensors for Human Motion Tracking in Industrial Scenarios. Robotics 2022, 11, 138. https://doi.org/10.3390/robotics11060138