State Estimation Using a Randomized Unscented Kalman Filter for 3D Skeleton Posture
Abstract
1. Introduction
2. Related Works
2.1. Extended Kalman Filter (EKF)
Algorithm 1: Extended Kalman Filter (EKF) [23]
Input: z_k denotes the sensor-measured values of the k frames.
Output: x̂_k denotes the estimated values of the k frames.
1: Initialize at time step k = 1: initial mean x̂_0, initial covariance P_0
2: for k = 1 to N (maximum number of iterations) do
3:   Compute the Jacobian F_k (defined in Equation (10))
   Time update (prediction):
4:   Predict the state and covariance: x̂_k⁻ = f(x̂_{k−1}), P_k⁻ = F_k P_{k−1} F_kᵀ + Q
   Measurement update (correction):
5:   Compute the Kalman gain: K_k = P_k⁻ H_kᵀ (H_k P_k⁻ H_kᵀ + R)⁻¹
6:   Update the state estimate: x̂_k = x̂_k⁻ + K_k (z_k − h(x̂_k⁻))
7:   Update the covariance: P_k = (I − K_k H_k) P_k⁻
end for
return estimated data (x̂_k)
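The EKF recursion of Algorithm 1 can be sketched in NumPy as below. The process model `f`, observation model `h`, and their Jacobians are passed in as placeholders; they stand in for the paper's skeleton motion and observation models, which are not reproduced in this excerpt.

```python
import numpy as np

def ekf_step(x, P, z, f, h, F_jac, H_jac, Q, R):
    """One predict/correct cycle of the Extended Kalman Filter.

    x, P         : prior state mean and covariance
    z            : current measurement
    f, h         : (nonlinear) process and observation functions
    F_jac, H_jac : their Jacobians, evaluated at the given mean
    Q, R         : process and measurement noise covariances
    """
    # Time update (prediction): linearize f around the current mean
    F = F_jac(x)
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q

    # Measurement update (correction): linearize h around the prediction
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

For a single joint coordinate, `x` could be a position/velocity pair under a constant-velocity motion model, with `z` the noisy position reported by the depth sensor.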
2.2. Unscented Kalman Filter (UKF)
Algorithm 2: Unscented Kalman Filter (UKF) [26]
Input: z_k denotes the sensor-measured values of the k frames.
Output: x̂_k denotes the estimated values of the k frames.
1: Initialize at time step k = 1: initial mean x̂_0, initial covariance P_0
2: for k = 1 to N (total number of iterations) do
3:   Generate 2L + 1 sigma points using Equation (14)
   Time update (prediction):
4:   State prediction step: predict x̂_k⁻ and P_k⁻ by propagating the sigma points through the process model (defined in Equation (11))
5:   Regenerate 2L + 1 sigma points (defined in Equation (14)) using x̂_k⁻ and P_k⁻
6:   Measurement prediction step: predict ẑ_k and its covariance by propagating the regenerated sigma points through the observation model (defined in Equation (15))
   Measurement update (correction):
7:   Compute the cross-covariance and Kalman gain K_k, then update the state estimate x̂_k and covariance P_k
end for
return estimated data (x̂_k)
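Algorithm 2 can be sketched as follows. The sigma-point generation follows the standard scaled unscented transform; the scaling parameters `alpha`, `beta`, `kappa` are illustrative defaults, not values taken from the paper.

```python
import numpy as np

def sigma_points(x, P, alpha=1.0, beta=2.0, kappa=0.0):
    """Generate 2L+1 scaled sigma points with mean and covariance weights."""
    L = len(x)
    lam = alpha**2 * (L + kappa) - L
    S = np.linalg.cholesky((L + lam) * P)     # matrix square root
    pts = np.vstack([x, x + S.T, x - S.T])    # rows of S.T are columns of S
    Wm = np.full(2 * L + 1, 1.0 / (2 * (L + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (L + lam)
    Wc[0] = lam / (L + lam) + (1 - alpha**2 + beta)
    return pts, Wm, Wc

def unscented_transform(pts, Wm, Wc, noise_cov):
    """Weighted mean and covariance of transformed sigma points."""
    mean = Wm @ pts
    d = pts - mean
    return mean, (Wc[:, None] * d).T @ d + noise_cov

def ukf_step(x, P, z, f, h, Q, R):
    """One predict/correct cycle of the UKF (Algorithm 2)."""
    # Time update: propagate sigma points through the process model f
    pts, Wm, Wc = sigma_points(x, P)
    pts_f = np.array([f(p) for p in pts])
    x_pred, P_pred = unscented_transform(pts_f, Wm, Wc, Q)
    # Regenerate sigma points around the predicted state (step 5)
    pts, Wm, Wc = sigma_points(x_pred, P_pred)
    pts_h = np.array([h(p) for p in pts])
    z_pred, S = unscented_transform(pts_h, Wm, Wc, R)
    # Cross-covariance, Kalman gain, and correction
    C = (Wc[:, None] * (pts - x_pred)).T @ (pts_h - z_pred)
    K = C @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - z_pred)
    P_new = P_pred - K @ S @ K.T
    return x_new, P_new
```

Unlike the EKF, no Jacobians are required: the nonlinear models `f` and `h` are evaluated directly at the sigma points.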
3. Proposed 3D Skeleton Posture Estimation Using a Randomized Unscented Kalman Filter
Algorithm 3: Proposed method (RUKF)
Input: z_k denotes the sensor-measured values of the k frames.
Output: x̂_k denotes the estimated values of the k frames.
1: Initialize at time step k = 1: initial mean x̂_0, initial covariance P_0
2: for k = 1 to N (maximum number of iterations) do
3:   State covariance update function (defined in Equation (35))
   Time update (prediction):
4:   State prediction step: predict the state mean and covariance (defined in Equations (25) and (26))
5:   Measurement estimate prediction using the predicted state (from step 4)
6:   Measurement covariance update using the predicted measurement and the observation function (from step 5)
7:   Measurement covariance update function (defined in Equation (29)) (from step 6)
8:   Cross-covariance update function (defined in Equation (30))
   Measurement update (correction):
9:   Update the cross-covariance (defined in Equation (31)), the Kalman gain (defined in Equation (32)), the state estimate (defined in Equation (33)), and the state covariance (defined in Equation (34))
end for
return estimated data (x̂_k)
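Equations (25)–(35) are not reproduced in this excerpt, so the sketch below illustrates only the core randomization idea behind the randomized UKF: moment estimates from sigma-point sets rotated by random orthogonal matrices are averaged, which suppresses the direction-dependent error of a single fixed sigma-point set. This is a simplified stand-in under stated assumptions (fixed radius, uniform random rotations, `n_rot` averaging draws), not the paper's exact stochastic integration rule.

```python
import numpy as np

def rukf_moments(x, P, func, rng, n_rot=5):
    """Mean, covariance, and cross-covariance of func(x), averaged over
    n_rot randomly rotated symmetric sigma-point sets."""
    L = len(x)
    S = np.linalg.cholesky(L * P)
    m_sum, cov_sum, cross_sum = 0.0, 0.0, 0.0
    for _ in range(n_rot):
        # QR of a Gaussian matrix gives a uniformly random rotation
        Qm, Rm = np.linalg.qr(rng.normal(size=(L, L)))
        Qm = Qm * np.sign(np.diag(Rm))
        A = (S @ Qm).T                    # rows are rotated sqrt columns
        pts = np.vstack([x + A, x - A])   # 2L symmetric sigma points
        fp = np.array([func(p) for p in pts])
        m = fp.mean(axis=0)
        dz, dx = fp - m, pts - x
        m_sum = m_sum + m
        cov_sum = cov_sum + dz.T @ dz / (2 * L)
        cross_sum = cross_sum + dx.T @ dz / (2 * L)
    return m_sum / n_rot, cov_sum / n_rot, cross_sum / n_rot

def rukf_step(x, P, z, f, h, Q, R, rng, n_rot=5):
    """One predict/correct cycle of the simplified randomized UKF."""
    # Time update: averaged transform through the process model
    x_pred, P_f, _ = rukf_moments(x, P, f, rng, n_rot)
    P_pred = P_f + Q
    # Measurement update: averaged transform through the observation model
    z_pred, P_z, P_xz = rukf_moments(x_pred, P_pred, h, rng, n_rot)
    S = P_z + R
    K = P_xz @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - z_pred)
    P_new = P_pred - K @ S @ K.T
    return x_new, P_new
```

Each rotated set reproduces the prior mean and covariance exactly, so the randomization changes nothing for linear models; its benefit appears with the nonlinear skeleton kinematics, where averaging over rotations reduces higher-order approximation error.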
4. Experimental Results
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Sophie, A.; Sohaib, L.; Joelle, T.; Thierry, D. Action recognition based on 2D skeletons extracted from RGB videos. MATEC Web Conf. 2019, 277, 1–14.
- Aggarwal, J.K.; Xia, L. Human activity recognition from 3d data: A review. Pattern Recognit. Lett. 2014, 48, 70–80.
- Biswas, K.K.; Basu, S.K. Gesture recognition using Microsoft Kinect. In Proceedings of the 5th International Conference on Automation, Robotics and Applications, Wellington, New Zealand, 6–8 December 2011.
- Xia, L.; Chia-Chih, C.; Aggarwal, J.K. Human Detection Using Depth Information by Kinect. In Proceedings of the Computer Vision and Pattern Recognition (CVPR), Colorado Springs, CO, USA, 20–25 June 2011.
- Yunru, M.; Kumar, M.; Nichola, C.; Xiangbin, W.; Ye, M.; Yanxin, Z. The validity and reliability of a Kinect v2-based Gait Analysis system for children with cerebral Palsy. Sensors 2019, 19, 1–16.
- Andersen, M.; Jensen, T.; Lisouski, P.; Mortensen, A.; Hansen, M. Kinect Depth Sensor Evaluation for Computer Vision Applications; Technical Report, ECE-TR-6; Department of Electrical and Computer Engineering, Aarhus University: Aarhus, Denmark, 2012; pp. 1–13.
- Kong, L.; Xiaohui, Y.; Maharjan, A.M. A hybrid framework for automatic joint detection of human poses in depth frames. Pattern Recognit. 2018, 77, 216–225.
- Microsoft Azure Kinect Camera. Available online: https://www.microsoft.com/en-us/research/project/skeletal-tracking-on-azure-kinect (accessed on 10 April 2020).
- Mallick, T.; Pratim Das, P.; Arun Kumar, M. Characterizations of Noise in Kinect Depth images: A review. IEEE Sens. J. 2014, 14, 1731–1740.
- Ju, S.; Sen-ching, S.C. Layer Depth Denoising and Completion for Structured-Light RGB-D Cameras. In Proceedings of the 2013 IEEE Conference on Computer Vision and Pattern Recognition, Portland, OR, USA, 23–28 June 2013.
- Peter, F.; Michael, B.; Diego, R. Kinect v2 for Mobile Robot Navigation: Evaluation and Modeling. In Proceedings of the 2015 International Conference on Advanced Robotics (ICAR), Istanbul, Turkey, 27–31 July 2015.
- Camplani, M.; Salgado, L. Efficient spatio-temporal hole filling strategy for Kinect depth maps. In Proceedings of the SPIE, Burlingame, CA, USA, 30 January 2012.
- Nguyen, C.V.; Izadi, S.; Lovell, D. Modelling Kinect sensor noise for improved 3D reconstruction and tracking. In Proceedings of the 2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission, Zurich, Switzerland, 13–15 October 2012.
- Lin, B.; Su, M.; Cheng, P.; Tseng, P.; Chen, S. Temporal and spatial denoising of depth maps. Sensors 2015, 15, 18506–18525.
- Costilla-Reyes, O.; Scully, P.; Ozanyan, K.B. Temporal pattern recognition in gait activities recorded with a footprint imaging sensor system. IEEE Sens. J. 2016, 16, 8815–8822.
- Essmaeel, K.; Gallo, L.; Damiani, E.; De Pietro, G.; Dipanda, A. Temporal denoising of Kinect depth data. In Proceedings of the 2012 Eighth International Conference on Signal Image Technology and Internet Based Systems, Naples, Italy, 25–29 November 2012.
- Sergey, M.; Vatolin, D.; Yury, B.; Maxim, S. Temporal filtering for depth maps generated by Kinect depth camera. In Proceedings of the 2011 3DTV Conference: The True Vision—Capture, Transmission and Display of 3D Video (3DTV-CON), Antalya, Turkey, 16–18 May 2011.
- Simone, M.; Giancarlo, C. Joint denoising and interpolation of depth maps for MS Kinect Sensors. In Proceedings of the 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Kyoto, Japan, 25–30 March 2012.
- Rashi, C.; Dasgupta, H. An approach for noise removal on Depth Images. In Proceedings of the Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016.
- Junyi, L.; Xiaojin, G.; Jilin, L. Guided Inpainting and filtering for Kinect Depth Maps. In Proceedings of the 21st International Conference on Pattern Recognition (ICPR 2012), Tsukuba, Japan, 11–15 November 2012.
- Loumponias, K.; Vretos, N.; Daras, P.; Tsaklidis, G. Using Kalman filter and Tobit Kalman filter in order to improve the motion recorded by Kinect sensor II. In Proceedings of the 29th Panhellenic Statistics Conference, Nassau, Bahamas, 4–7 May 2016.
- Del Rincón, J.M.; Makris, D.; Uruñuela, C.O.; Nebel, J.C. Tracking human position and lower body parts using Kalman and particle filters constrained by human biomechanics. IEEE Trans. Syst. Man Cybern. Part B 2011, 41, 26–37.
- Jody, S.; Fumio, H.; John, A. Application of extended Kalman filter for improving the accuracy and smoothness of Kinect skeleton-joint estimates. J. Eng. Math. 2014, 88, 161–175.
- Amir, B.; Lora, C.; John, L.C. Hip and Trunk kinematics estimation in gait through Kalman filter using IMU data at the Ankle. IEEE Sens. J. 2018, 18, 4253–4260.
- Gustafsson, F.; Hendeby, G. Some relations between extended and unscented Kalman filters. IEEE Trans. Signal Process. 2012, 60, 545–555.
- Manuel, G. Kinematic Data Filtering with Unscented Kalman Filter-Application to Senior Fitness Tests Using the Kinect Sensor; University of Lisbon: Lisbon, Portugal, 2017; pp. 1–8. Available online: https://fenix.tecnico.ulisboa.pt (accessed on 1 March 2020).
- Moon, S.; Park, Y.; Ko, D.W.; Suh, H. Multiple Kinect Sensor Fusion for Human Skeleton Tracking Using Kalman Filtering. Int. J. Adv. Robot. Syst. 2016, 13, 1–10.
- Julier, S.J.; Uhlmann, J.K. A new extension of the Kalman filter to nonlinear systems. In Proceedings of the Signal Processing, Sensor Fusion, and Target Recognition VI, Orlando, FL, USA, 21–25 April 1997.
- Carpenter, J.; Clifford, P.; Fearnhead, P. An improved particle filter for non-linear problems. IEEE Proc. Radar Sonar Navig. 1999, 146, 2–7.
- Li, T.; Bolic, M.; Djuric, P.M. Resampling methods for particle filtering: Classification, implementation, and strategies. IEEE Signal Process. Mag. 2015, 32, 70–86.
- Elfring, J.; Torta, E.; van de Molengraft, R. Particle Filters: A Hands-On Tutorial. Sensors 2021, 21, 438.
- Gustafsson, F.; Gunnarsson, F.; Bergman, N.; Forssell, U.; Jansson, J.; Karlsson, R.; Nordlund, P.J. Particle Filters for Positioning, Navigation, and Tracking. IEEE Trans. Signal Process. 2002, 50, 425–437.
- Nobahari, H.; Zandavi, S.M.; Mohammadkarimi, H. Simplex filter: A novel heuristic filter for nonlinear systems state estimation. Appl. Soft Comput. 2016, 49, 474–484.
- Zandavi, S.M.; Vera, C. State estimation of nonlinear dynamic system using novel heuristic filter based on genetic algorithm. Soft Comput. 2018, 23, 5559–5570.
- Straka, O.; Dunik, J.; Simandl, M. Randomized Unscented Kalman Filter in Target Tracking. In Proceedings of the 2012 15th International Conference on Information Fusion, Singapore, 9–12 July 2012.
- Dunik, J.; Straka, O.; Simandl, M. The development of a Randomized Unscented Kalman Filter. IFAC Proc. Vol. 2011, 44, 8–13.
- Fei, H.; Brian, R.; William, H.; Hao, Z. Space time representation of people based on 3D skeletal data: A review. Comput. Vis. Image Underst. 2017, 158, 85–105.
- Tenorth, M.; Bandouch, J.; Beetz, M. The TUM kitchen data set of everyday manipulation activities for motion tracking and action recognition. In Proceedings of the IEEE 12th International Conference on Computer Vision workshops, Kyoto, Japan, 27 September–4 October 2009.
- Kazemi, V.; Burenius, M.; Azizpour, H.; Sullivan, J. Multi-view body part recognition with random forests. In Proceedings of the British Machine Vision Conference, Bristol, UK, 9–13 September 2013.
- Alexandre, B.; Christian, V.; Sergi Bermudez, B.; Élvio, G.; Fátima, B.; Filomena, C.; Simão, O.; Hugo, G. A dataset for the automatic assessment of functional senior fitness tests using Kinect and physiological sensors. In Proceedings of the International Conference on Technology and Innovation in Sports, Health and Wellbeing (TISHW), Vila Real, Portugal, 1–3 December 2016.
- Liu, J.; Shahroudy, A.; Perez, M.; Wang, G.; Duan, L.; Kot, A.C. NTU RGB+D 120: A Large-Scale Benchmark for 3D Human Activity Understanding. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 42, 2684–2701.
- Marco, C.; Matteo, M.; Emanuele, M. Skeleton estimation and tracking by means of depth data fusion from depth camera networks. Robot. Auton. Syst. 2018, 110, 151–159.
- George, R.; Kyria, P.; Anita, B. A method for determination of upper extremity kinematics. Gait Posture 2002, 15, 113–119.
- Ben, C.; Adeline, P.; Sion, H.; Majid, M. Skeleton-Free Body Pose Estimation from Depth Images for Movement Analysis. In Proceedings of the 2015 IEEE International Conference on Computer Vision Workshop (ICCVW), Santiago, Chile, 7–13 December 2015.
- Wei, S.; Ke, D.; Xiang, B.; Tommer, L. Exemplar-Based Human Action Pose Correction and Tagging. In Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA, 16–21 June 2012.
- Adeline, P.; Lili, T.; Sion, H.; Massimo, C.; Dima, D.; Majid, M. Online quality assessment of human movement from skeleton data. In Proceedings of the British Machine Vision Conference 2014, Nottingham, UK, 1–5 September 2014.
| Joints | EKF Mean | EKF ±SD | UKF Mean | UKF ±SD | Proposed Mean | Proposed ±SD |
|---|---|---|---|---|---|---|
| Pelvis | 1.11 | ±0.46 | 0.58 | ±0.40 | 0.35 | ±0.26 |
| Spine_Naval | 1.78 | ±0.85 | 0.72 | ±0.60 | 0.28 | ±0.29 |
| Spine_Chest | 1.31 | ±0.78 | 1.11 | ±0.81 | 0.39 | ±0.39 |
| Shoulder_Left | 1.64 | ±0.72 | 1.41 | ±1.04 | 0.35 | ±0.19 |
| Elbow_Left | 1.77 | ±0.92 | 1.55 | ±1.80 | 0.49 | ±0.26 |
| Hand_Left | 4.82 | ±3.04 | 4.78 | ±2.90 | 1.43 | ±0.86 |
| Hand_Right | 3.51 | ±1.80 | 3.18 | ±1.79 | 1.21 | ±0.77 |
| Wrist_Left | 4.23 | ±2.46 | 2.94 | ±1.91 | 1.37 | ±0.83 |
| Eye_Right | 3.42 | ±9.86 | 4.05 | ±9.68 | 1.58 | ±5.47 |
| Ear_Right | 5.83 | ±17.63 | 4.63 | ±15.6 | 1.60 | ±4.78 |
Share and Cite
Musunuri, Y.R.; Kwon, O.-S. State Estimation Using a Randomized Unscented Kalman Filter for 3D Skeleton Posture. Electronics 2021, 10, 971. https://doi.org/10.3390/electronics10080971