Experimental Procedure for the Metrological Characterization of Time-of-Flight Cameras for Human Body 3D Measurements
Abstract
1. Introduction
2. Materials and Methods
2.1. Specifications of the Evaluated Sensors
2.2. Evaluation of Error Sources and Performance
2.3. Measuring Set-Up
3. Evaluation of Temperature-Related Errors
4. Evaluation of Depth-Related Errors
4.1. Depth Amplitude Errors Evaluation
4.2. Depth Distortion Evaluation
4.3. Temporal Errors Evaluation
4.4. Overall Depth Measurement Uncertainty Evaluation
5. Application Example: 3D Reconstruction
5.1. Object Reconstruction
- Each point cloud was manually inspected to remove the elements of the scene not belonging to the cylinder. A depth filter was applied to cut off the data outside the area of interest, leaving only the point cloud of the cylinder.
- Camera performance was evaluated by comparing the measured external radius of the cylinder with the nominal one of 122 mm. The external radius was estimated individually for each acquisition by fitting a cylinder to the point cloud in MATLAB (see the sketch after this list).
- For each station, the mean value of the external radius over the 30 frames and the corresponding standard deviation were computed.
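The paper states only that the fit was "provided by the software"; the sketch below uses pcfitcylinder (Computer Vision Toolbox) as a plausible candidate for that routine. The synthetic input data, depth window, and inlier distance are illustrative assumptions, not values from the paper.

```matlab
% Minimal sketch of the per-station cylinder analysis, assuming the 30
% frames of a station are available as pointCloud objects.
% Placeholder data: a synthetic 122 mm cylinder with ~1 mm noise.
theta = rand(5000, 1) * 2 * pi;
y     = rand(5000, 1) * 600 - 300;
pts   = [122 * cos(theta), y, 1200 + 122 * sin(theta)];      % [mm]
ptClouds = repmat({pointCloud(pts + randn(5000, 3))}, 30, 1);

nominalRadius = 122;                       % nominal external radius [mm]
roi = [-inf inf -inf inf 800 1600];        % depth filter isolating the cylinder

radii = zeros(30, 1);
for k = 1:30
    pc    = ptClouds{k};                              % k-th acquired frame
    pcCyl = select(pc, findPointsInROI(pc, roi));     % keep only the cylinder
    model = pcfitcylinder(pcCyl, 5);                  % RANSAC fit, 5 mm inliers
    radii(k) = model.Radius;                          % estimated radius [mm]
end

radiusMean = mean(radii);                  % mean over the 30 frames of the station
radiusStd  = std(radii);                   % corresponding standard deviation
radiusErr  = radiusMean - nominalRadius;   % deviation from the nominal radius
```

Because pcfitcylinder is RANSAC-based, the 5 mm inlier threshold trades robustness against residual background points for sensitivity to the surface noise of the cylinder.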
5.2. Human Body Reconstruction
- Angle between the vertical axis and the left forearm.
- Angle between the vertical axis and the right forearm.
- Angle between the left forearm and the left arm.
- Angle between the right forearm and the right arm.
- Angle between the left arm and the left hand.
- Angle between the right arm and the right hand (a computation sketch for these angles follows this list).
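The section does not spell out how these angles are obtained; a minimal MATLAB sketch follows, assuming the 3D joint positions (shoulder, elbow, wrist, hand tip) have already been extracted, e.g., from a body-tracking SDK. All joint names, coordinate values, and the orientation of the vertical axis are illustrative assumptions.

```matlab
% Placeholder joint positions [mm] in the camera frame, y pointing up.
shoulderL = [-180;  400; 1500];
elbowL    = [-200;  150; 1480];
wristL    = [-190;  -80; 1450];
handTipL  = [-185; -160; 1440];

segAngle = @(u, v) acosd(dot(u, v) / (norm(u) * norm(v)));  % angle [deg]

vertical = [0; 1; 0];                      % assumed vertical axis of the frame
forearmL = wristL   - elbowL;              % left forearm segment
armL     = elbowL   - shoulderL;           % left (upper) arm segment
handL    = handTipL - wristL;              % left hand segment

angForearmVertical = segAngle(forearmL, vertical);  % vertical axis vs. forearm
angForearmArm      = segAngle(forearmL, armL);      % forearm vs. arm
angArmHand         = segAngle(armL, handL);         % arm vs. hand
% Right-side angles are computed identically from the right-side joints.
```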
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Giancola, S.; Valenti, M.; Sala, R. State-of-the-Art Devices Comparison. In A Survey on 3D Cameras: Metrological Comparison of Time-of-Flight, Structured-Light and Active Stereoscopy Technologies; SpringerBriefs in Computer Science; Springer: Cham, Switzerland, 2018; pp. 29–39.
- Horaud, R.; Hansard, M.; Evangelidis, G.; Ménier, C. An overview of depth cameras and range scanners based on time-of-flight technologies. Mach. Vis. Appl. 2016, 27, 1005–1020.
- Rapp, H.; Frank, M.; Hamprecht, F.A.; Jähne, B. A theoretical and experimental investigation of the systematic errors and statistical uncertainties of time-of-flight cameras. Int. J. Intell. Syst. Technol. Appl. 2008, 5, 402–413.
- Frank, M.; Plaue, M.; Rapp, H.; Koethe, U.; Jähne, B.; Hamprecht, F.A. Theoretical and experimental error analysis of continuous-wave time-of-flight range cameras. Opt. Eng. 2009, 48, 013602.
- He, Y.; Liang, B.; Zou, Y.; He, J.; Yang, J. Depth Errors Analysis and Correction for Time-of-Flight (ToF) Cameras. Sensors 2017, 17, 92.
- Sarbolandi, H.; Lefloch, D.; Kolb, A. Kinect range sensing: Structured-light versus Time-of-Flight Kinect. Comput. Vis. Image Underst. 2015, 139, 1–20.
- Wasenmüller, O.; Stricker, D. Comparison of Kinect V1 and V2 Depth Images in Terms of Accuracy and Precision. In Computer Vision—ACCV 2016 Workshops; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2017; Volume 10117, pp. 34–45.
- Corti, A.; Giancola, S.; Mainetti, G.; Sala, R. A metrological characterization of the Kinect V2 time-of-flight camera. Robot. Auton. Syst. 2016, 75, 584–594.
- He, Y.; Chen, S. Recent Advances in 3D Data Acquisition and Processing by Time-of-Flight Camera. IEEE Access 2019, 7, 12495–12510.
- Chen, S.; Yi, J.; Ding, H.; Wang, Z.; Min, J.; Wu, H.; Cao, S.; Mu, J. 3D Object Reconstruction with Kinect Based on QR Code Calibration. In Proceedings of the 2020 IEEE International Conference on Artificial Intelligence and Computer Applications (ICAICA), Dalian, China, 27–29 June 2020; pp. 459–463.
- He, H.; Wang, H.; Sun, L. Research on 3D point-cloud registration technology based on Kinect V2 sensor. In Proceedings of the 2018 Chinese Control and Decision Conference (CCDC), Shenyang, China, 9–11 June 2018; pp. 1264–1268.
- Shen, B.; Yin, F.; Chou, W. A 3D Modeling Method of Indoor Objects Using Kinect Sensor. In Proceedings of the 2017 10th International Symposium on Computational Intelligence and Design (ISCID), Hangzhou, China, 9–10 December 2017; Volume 1, pp. 64–68.
- Zhao, Y.; Carraro, M.; Munaro, M.; Menegatti, E. Robust multiple object tracking in RGB-D camera networks. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 6625–6632.
- Jiao, J.; Yuan, L.; Tang, W.; Deng, Z.; Wu, Q. A Post-Rectification Approach of Depth Images of Kinect v2 for 3D Reconstruction of Indoor Scenes. ISPRS Int. J. Geo-Inf. 2017, 6, 349.
- Chen, Y.; Zhang, B.; Zhou, J.; Wang, K. Real-time 3D unstructured environment reconstruction utilizing VR and Kinect-based immersive teleoperation for agricultural field robots. Comput. Electron. Agric. 2020, 175, 105579.
- Oliver, A.; Kang, S.; Wünsche, B.C.; MacDonald, B. Using the Kinect as a Navigation Sensor for Mobile Robotics. In Proceedings of the 27th Conference on Image and Vision Computing New Zealand, Dunedin, New Zealand, 26–28 November 2012; Association for Computing Machinery: New York, NY, USA, 2012; pp. 509–514.
- Popov, V.L.; Ahmed, S.A.; Shakev, N.G.; Topalov, A.V. Detection and Following of Moving Targets by an Indoor Mobile Robot using Microsoft Kinect and 2D Lidar Data. In Proceedings of the 2018 15th International Conference on Control, Automation, Robotics and Vision (ICARCV), Singapore, 18–21 November 2018; pp. 280–285.
- Lai, C.C.; Su, K.L. Development of an intelligent mobile robot localization system using Kinect RGB-D mapping and neural network. Comput. Electr. Eng. 2018, 67, 620–628.
- Henry, P.; Krainin, M.; Herbst, E.; Ren, X.; Fox, D. RGB-D mapping: Using Kinect-style depth cameras for dense 3D modeling of indoor environments. Int. J. Robot. Res. 2012, 31, 647–663.
- Caruso, L.; Russo, R.; Savino, S. Microsoft Kinect V2 vision system in a manufacturing application. Robot. Comput.-Integr. Manuf. 2017, 48, 174–181.
- Rodriguez-Garavito, C.H.; Camacho-Munoz, G.; Álvarez-Martínez, D.; Cardenas, K.V.; Rojas, D.M. 3D Object Pose Estimation for Robotic Packing Applications. In Applied Computer Sciences in Engineering; WEA 2018; Communications in Computer and Information Science; Springer: Cham, Switzerland, 2018; Volume 916, pp. 453–463.
- Nascimento, H.; Mujica, M.; Benoussaad, M. Collision Avoidance Interaction Between Human and a Hidden Robot Based on Kinect and Robot Data Fusion. IEEE Robot. Autom. Lett. 2021, 6, 88–94.
- Pasinetti, S.; Nuzzi, C.; Lancini, M.; Fornaser, A.; Docchio, F.; Sansoni, G. Development and characterization of a safety system for robotic cells based on multiple Time of Flight (TOF) cameras and point cloud analysis. In Proceedings of the 2018 IEEE International Workshop on Metrology for Industry 4.0 and IoT, Brescia, Italy, 16–18 April 2018.
- Halme, R.-J.; Lanz, M.; Kämäräinen, J.; Pieters, R.; Latokartano, J.; Hietanen, A. Review of vision-based safety systems for human-robot collaboration. Procedia CIRP 2018, 72, 111–116.
- Palmieri, P.; Melchiorre, M.; Scimmi, L.S.; Pastorelli, S.; Mauro, S. Human Arm Motion Tracking by Kinect Sensor Using Kalman Filter for Collaborative Robotics. In Advances in Italian Mechanism Science; IFToMM ITALY 2020; Mechanisms and Machine Science; Springer: Cham, Switzerland, 2021; Volume 91, pp. 326–334.
- Nuzzi, C.; Ghidini, S.; Pagani, R.; Pasinetti, S.; Coffetti, G.; Sansoni, G. Hands-Free: A robot augmented reality teleoperation system. In Proceedings of the 2020 17th International Conference on Ubiquitous Robots (UR), Kyoto, Japan, 22–26 June 2020.
- Sankar, S.; Tsai, C.-Y. ROS-Based Human Detection and Tracking from a Wireless Controlled Mobile Robot Using Kinect. Appl. Syst. Innov. 2019, 2, 5.
- Cao, Z.; Hidalgo, G.; Simon, T.; Wei, S.-E.; Sheikh, Y. OpenPose: Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields. IEEE Trans. Pattern Anal. Mach. Intell. 2019, 43, 172–186.
- Carraro, M.; Munaro, M.; Burke, J.; Menegatti, E. Real-time marker-less multi-person 3D pose estimation in RGB-depth camera networks. arXiv 2017, arXiv:1710.06235.
- Nuzzi, C.; Pasinetti, S.; Pagani, R.; Ghidini, S.; Beschi, M.; Coffetti, G.; Sansoni, G. MEGURU: A gesture-based robot program builder for Meta-Collaborative workstations. Robot. Comput.-Integr. Manuf. 2021, 68, 102085.
- Torres, S.H.M.; Kern, M.J. 7 DOF industrial robot controlled by hand gestures using Microsoft Kinect v2. In Proceedings of the 2017 IEEE 3rd Colombian Conference on Automatic Control (CCAC), Cartagena, Colombia, 18–20 October 2017; pp. 1–6.
- Ganguly, B.; Vishwakarma, P.; Biswas, S.; Rahul. Kinect Sensor Based Single Person Hand Gesture Recognition for Man-Machine Interaction. In Computational Advancement in Communication Circuits and Systems; Lecture Notes in Electrical Engineering; Springer: Singapore, 2020; Volume 575, pp. 139–144.
- Roy, G.; Bhuiya, A.; Mukherjee, A.; Bhaumik, S. Kinect Camera Based Gait Data Recording and Analysis for Assistive Robotics-An Alternative to Goniometer Based Measurement Technique. Procedia Comput. Sci. 2018, 133, 763–771.
- Pasinetti, S.; Fornaser, A.; Lancini, M.; De Cecco, M.; Sansoni, G. Assisted Gait Phase Estimation Through an Embedded Depth Camera Using Modified Random Forest Algorithm Classification. IEEE Sens. J. 2020, 20, 3343–3355.
- Pasinetti, S.; Nuzzi, C.; Covre, N.; Luchetti, A.; Maule, L.; Serpelloni, M.; Lancini, M. Validation of Marker-Less System for the Assessment of Upper Joints Reaction Forces in Exoskeleton Users. Sensors 2020, 20, 3899.
- Mettel, M.R.; Alekseew, M.; Stocklöw, C.; Braun, A. Safety Services in Smart Environments Using Depth Cameras. In Ambient Intelligence: AmI 2017; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2017; Volume 10217, pp. 80–93.
- Butaslac, I.I.; Luchetti, A.; Parolin, E.; Fujimoto, Y.; Kanbara, M.; De Cecco, M.; Kato, H. The Feasibility of Augmented Reality as a Support Tool for Motor Rehabilitation. In Augmented Reality, Virtual Reality, and Computer Graphics; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2020; Volume 12243, pp. 165–173.
- Luchetti, A.; Parolin, E.; Butaslac, I.; Fujimoto, Y.; Kanbara, M.; Bosetti, P.; De Cecco, M.; Kato, H. Stepping over Obstacles with Augmented Reality based on Visual Exproprioception. In Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Recife, Brazil, 9–13 November 2020.
- Albert, J.A.; Owolabi, V.; Gebel, A.; Brahms, C.M.; Granacher, U.; Arnrich, B. Evaluation of the Pose Tracking Performance of the Azure Kinect and Kinect v2 for Gait Analysis in Comparison with a Gold Standard: A Pilot Study. Sensors 2020, 20, 5104.
- Tölgyessy, M.; Dekan, M.; Chovanec, L.; Hubinský, P. Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2. Sensors 2021, 21, 413.
- Choo, B.; Landau, M.; DeVore, M.; Beling, P.A. Statistical Analysis-Based Error Models for the Microsoft Kinect™ Depth Sensor. Sensors 2014, 14, 17430–17450.
- Kurillo, G.; Hemingway, E.; Cheng, M.-L.; Cheng, L. Evaluating the Accuracy of the Azure Kinect and Kinect v2. Sensors 2022, 22, 2469.
- Pasinetti, S.; Hassan, M.M.; Eberhardt, J.; Lancini, M.; Docchio, F.; Sansoni, G. Performance Analysis of the PMD Camboard Picoflexx Time-of-Flight Camera for Markerless Motion Capture Applications. IEEE Trans. Instrum. Meas. 2019, 68, 4456–4471.
- Crenna, F.; Rossi, G.; Palazzo, A. Measurement of human movement under metrological controlled conditions. Acta IMEKO 2015, 4, 48–56.
- Hussmann, S.; Knoll, F.; Edeler, T. Modulation Method Including Noise Model for Minimizing the Wiggling Error of TOF Cameras. IEEE Trans. Instrum. Meas. 2014, 63, 1127–1136.
- Bamji, C.S.; Mehta, S.; Thompson, B.; Elkhatib, T.; Wurster, S.; Akkaya, O.; Payne, A.; Godbaz, J.; Fenton, M.; Rajasekaran, V.; et al. 1Mpixel 65 nm BSI 320 MHz demodulated TOF image sensor with 3.5 μm global shutter pixels and analog binning. In Proceedings of the 2018 IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, CA, USA, 11–15 February 2018.
- Foix, S.; Alenyà, G.; Torras, C. Lock-in time-of-flight (ToF) cameras: A survey. IEEE Sens. J. 2011, 11, 1917–1926.
- Fürsattel, P.; Placht, S.; Balda, M.; Schaller, C.; Hofmann, H.; Maier, A.; Riess, C. A Comparative Error Analysis of Current Time-of-Flight Sensors. IEEE Trans. Comput. Imaging 2016, 2, 27–41.
- Joint Committee for Guides in Metrology (JCGM). Guide to the Expression of Uncertainty in Measurement (GUM). 2008. Available online: https://www.bipm.org (accessed on 6 October 2022).
- Kahlmann, T.; Remondino, F.; Ingensand, H. Calibration for increased accuracy of the range imaging camera SwissRanger. In Proceedings of the ISPRS Commission V Symposium 'Image Engineering and Vision Metrology', Dresden, Germany, 25–27 September 2006; Volume 36, pp. 136–141.
- Chiabrando, F.; Chiabrando, R.; Piatti, D.; Rinaudo, F. Sensors for 3D Imaging: Metric Evaluation and Calibration of a CCD/CMOS Time-of-Flight Camera. Sensors 2009, 9, 10080–10096.
- Steiger, O.; Felder, J.; Weiss, S. Calibration of time-of-flight range imaging cameras. In Proceedings of the 2008 15th IEEE International Conference on Image Processing, San Diego, CA, USA, 12–15 October 2008; pp. 1968–1971.
- Imiya, A.; Kawamoto, K. Learning dimensionality and orientations of 3D objects. Pattern Recognit. Lett. 2001, 22, 75–83.
- Kiyomitsu, K.; Kakinuma, A.; Takahashi, H.; Kamijo, N.; Ogawa, K.; Tsumura, N. Volume measurement of the leg with the depth camera for quantitative evaluation of edema. In Advanced Biomedical and Clinical Diagnostic and Surgical Guidance Systems XV; SPIE BiOS: Bellingham, WA, USA, 2017.
- Fornaser, A.; Tomasin, P.; De Cecco, M.; Tavernini, M.; Zanetti, M. Automatic graph based spatiotemporal extrinsic calibration of multiple Kinect V2 ToF cameras. Robot. Auton. Syst. 2017, 98, 105–125.
- Martinek, M.; Grosso, R.; Greiner, G. Optimized Canonical Coordinate Frames for 3D Object Normalization. In Proceedings of the Vision, Modeling and Visualization (VMV), Magdeburg, Germany, 12–14 November 2012.
| | Kinect Azure (NFOV Unbinned) | Basler ToF 640 | Basler Blaze 101 |
|---|---|---|---|
| Resolution | 640 × 576 px | 640 × 480 px | 640 × 480 px |
| Frame rate | 30 fps | 20 fps | 30 fps |
| FoV | 75 × 65 deg | 57 × 43 deg | 67 × 51 deg |
| Working range | 0.5–3.86 m | 0.5–5.8 m | 0.5–5.5 m |
| Dimensions | 103 × 39 × 126 mm | 141.9 × 76.4 × 61.5 mm | 100 × 81 × 64 mm |
| Power | 5.9 W | 15 W | 22 W |
| Weight | 0.440 kg | 0.400 kg | 0.690 kg |
| | Warm-Up Time | Depth Amplitude | Depth Distortion | Temporal Error | Overall Uncertainty |
|---|---|---|---|---|---|
| Env. conditions | Ensure optimal temperature (i.e., 24 °C); ensure constant illumination without natural-light interference (common to all tests) | | | | |
| Reference target | Opaque target with verified planarity, especially in the central region (common to all tests) | | | | |
| Hardware set-up | Camera mounted on a support at a fixed height; ensure camera perpendicularity with respect to the target (common to all tests) | | | | |
| Measurement distances | Fixed distance; keep the camera turned off for at least 4 h before the experiment | Set of distances within the optimal working range given in the camera datasheet | Set of distances within the optimal working range given in the camera datasheet | Set of distances within the optimal working range given in the camera datasheet | Set of distances within the optimal working range given in the camera datasheet |
| Data acquisition | 1 depth frame or point cloud every 10 s, at 30 fps | 30 depth frames or point clouds at each distance, at 30 fps | 30 depth frames or point clouds at each distance, at 30 fps | 30 depth frames or point clouds at each distance, at 30 fps | 30 depth frames or point clouds at each distance, at 30 fps |
| Data analysis | Group frames into 5 min windows (30 frames each); extract a 15 × 15 px ROI around the central pixel; compute mean depth and standard deviation (Equation (1)) | Extract the target from each frame; compute the error (Equations (2)–(5)) | Extract the depth value of the central pixel from each frame; compute the error (Equations (2)–(5)) | Extract the depth value of the central pixel from each frame; compute the deviation (Equation (6)); compute the linear regression and check the quality of the fit | Extract a 20 × 20 px ROI around the central pixel; use all data points inside the ROI; compute the linear regression and check the quality of the fit |
| Data correction | Apply the bias correction to obtain the corrected depth (Equation (1)) | Ensure the reported quantity is the relative error, not the absolute depth | // | Remove outliers before applying the linear regression | Remove outliers before applying the linear regression |
| How to visualize | X-axis: time [s]; Y-axis: mean depth with corresponding standard deviation [mm]; optional secondary y-axis: relative error [%] | IR image and depth error map; X-axis: x coordinate [px] and [mm], respectively; Y-axis: y coordinate [px] and [mm], respectively; show the color bar | X-axis: distance [mm]; Y-axis: depth error [mm] | X-axis: mean depth [mm]; Y-axis: temporal deviation [mm]; show the linear regression line | X-axis: mean depth [mm]; Y-axis: deviation [mm]; show the linear regression line |
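As a concrete rendering of the warm-up column above, the following MATLAB sketch groups the acquired frames into 5 min windows, extracts the 15 × 15 px ROI around the central pixel, and computes the per-window mean depth and standard deviation. The depthFrames stack is a synthetic placeholder, and the bias correction of Equation (1) is not reproduced here.

```matlab
% Warm-up analysis: 30 frames per window = 5 min at 1 frame every 10 s.
depthFrames = 1500 + 2 * randn(120, 160, 180);   % synthetic H x W x N stack [mm]

[H, W, N] = size(depthFrames);
cx = round(W / 2); cy = round(H / 2);
half = 7;                                        % 15 x 15 px ROI half-width
roi  = depthFrames(cy-half:cy+half, cx-half:cx+half, :);

framesPerWindow = 30;
nWin = floor(N / framesPerWindow);
meanDepth = zeros(nWin, 1);
stdDepth  = zeros(nWin, 1);
for w = 1:nWin
    win = roi(:, :, (w-1)*framesPerWindow + (1:framesPerWindow));
    meanDepth(w) = mean(win(:));                 % window mean depth [mm]
    stdDepth(w)  = std(win(:));                  % window standard deviation [mm]
end

t = (1:nWin) * 300;                              % window end times [s]
errorbar(t, meanDepth, stdDepth);                % warm-up curve, as in the table
xlabel('Time [s]'); ylabel('Mean depth [mm]');
```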
| | Warm-Up Time | Depth Amplitude | Depth Distortion | Temporal Error | Overall Uncertainty |
|---|---|---|---|---|---|
| Kinect Azure * | Not needed | 2 to 5 mm | −18 to 10 mm | 0.8 to 3.2 mm | 13 mm |
| Kinect Azure DS | Not provided | Not provided | Not provided | ≤17 mm | Not provided |
| Kinect Azure [1] | 50 min | Not provided | −7 to 0 mm | 0.5 to 2 mm | Not provided |
| Kinect Azure [2] | Not provided | −2 to 0 mm | 1.1 to 12.7 mm | 0.6 to 3.7 mm | Not provided |
| Basler Blaze 101 * | 50 min | 3 to 7 mm | −11 to 9 mm | 0.5 to 3 mm | 6 mm |
| Basler Blaze 101 DS | 20 min | −5 to 5 mm | Not provided | <2 mm | Not provided |
| Basler ToF 640 * | 50 min | −10 to −6 mm | −48 to 76 mm | 0.6 to 3.8 mm | 13 mm |
| Basler ToF 640 DS | 20 min | −10 to 10 mm | Not provided | ≤8 mm | Not provided |

(*) Results measured in this work; DS: values declared in the manufacturer datasheet; [1], [2]: values reported in the cited literature.