
Breathing Analysis Using Thermal and Depth Imaging Camera Video Records

1
Department of Computing and Control Engineering, University of Chemistry and Technology in Prague, 166 28 Prague, Czech Republic
2
Faculty of Applied Informatics, Tomas Bata University in Zlín, 760 05 Zlín, Czech Republic
3
Czech Institute of Informatics, Robotics and Cybernetics, Czech Technical University in Prague, 166 36 Prague, Czech Republic
4
Faculty of Medicine in Hradec Králové, Department of Neurology, Charles University, 500 05 Hradec Kralove, Czech Republic
5
School of Electrical and Electronic Engineering, Newcastle University, Newcastle upon Tyne, NE1 7RU, UK
*
Author to whom correspondence should be addressed.
Sensors 2017, 17(6), 1408; https://doi.org/10.3390/s17061408
Submission received: 8 April 2017 / Revised: 21 May 2017 / Accepted: 13 June 2017 / Published: 16 June 2017
(This article belongs to the Special Issue Imaging Depth Sensors—Sensors, Algorithms and Applications)

Abstract
The paper is devoted to the study of facial region temperature changes using a simple thermal imaging camera and to the comparison of their time evolution with the pectoral area motion recorded by the MS Kinect depth sensor. The goal of this research is to propose the use of video records as an alternative diagnostic of breathing disorders, allowing their analysis in the home environment as well. The methods proposed include (i) specific image processing algorithms for detecting facial parts with periodic temperature changes; (ii) computational intelligence tools for analysing the associated video sequences; and (iii) digital filters and spectral estimation tools for processing the depth matrices. Machine learning applied to thermal imaging camera calibration allowed the recognition of its digital information with an accuracy close to 100% for the classification of individual temperature values. The proposed detection of breathing features was used for the monitoring of physical activities on a home exercise bike. The results include a decrease of the breath temperature and the breathing frequency after a load, with mean values of −0.16 °C/min and −0.72 bpm, respectively, for the given set of experiments. The proposed methods verify that thermal and depth cameras can be used as additional tools for the multimodal detection of breathing patterns.

1. Introduction

The use of different sensors is essential for the study of many physiological and mental activities, neurological diseases [1,2] and motion and gait disorders [3,4]. The interpretation of biomedical signals is also important for the development of assisted living technologies [5], and specific studies are related to polysomnography [6], covering many signals including breathing and motion as well as EEG and ECG records.
Special attention is paid to temperature changes of facial parts affected by emotions, mental activities, or neurological disorders. The study of the temperature distribution over different parts of the face can be used in face and emotion detection [7,8,9,10,11,12], age recognition [13], motion [14], psychophysiology [15,16], neurology [17], and stress detection [18,19].
Noninvasive methods of breathing monitoring include electrical impedance tomography, respiratory inductance plethysmography [20,21], capnography and measurement of the tracheal sound, air capacity of the lungs or thoracic and abdominal circumference changes during respiration [22,23]. Thermal imaging can be used to measure both breathing rate and exhaled air temperature to provide useful information about the body load during physical activity and to study potential symptoms of certain respiratory diseases [24].
The respiratory rate is an important indicator [25] for monitoring a person’s health. Some studies are devoted to sensing technologies in smart cities [26] that follow people’s vital signs without body instrumentation. These systems are often used for the diagnosis of neurological disorders as well. Specific research is devoted to the effect of respiration on cortical neuronal activity, which modulates sensory, motor, emotional and cognitive processes [27].
The present paper is devoted to the noninvasive analysis of the breathing rate by facial temperature distribution using a thermal imaging camera [28,29] and by thorax movement monitoring recorded by the MS Kinect depth sensor. Data obtained from these instruments are then used for multimodal breathing analysis [30] and for monitoring of physical activities. Both the sequences of the thermal images and the depth matrices are acquired on the basis of contactless measurement.
Special attention is paid to the adaptive detection of facial thermographic regions. The present paper applies specific methods for their recognition, allowing the detection of the breathing rate or the recognition of facial neurological disorders.
The proposed method of respiratory data processing is based on their statistical and numerical analysis using different functional transforms for the fast and robust estimation of desired features [20,23]. The respiratory rate estimation using chest motion analysis and facial thermographic data includes application of digital filtering and spectral analysis as well.

2. Methods

2.1. Data Acquisition

Figure 1 presents the general principle of the use of the thermal imaging camera [31] to detect temperature changes in the mouth area for the analysis of breathing. The calibration bar associated with each video frame is presented in the upper part of Figure 1a, showing the temperatures and the associated image grey levels. Figure 1a also presents the fixed region of interest (ROI) and the current position of the moving ROI. The recorded video sequence of temperature changes can then be used to analyse the time evolution of the temperature in the selected ROI.
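The per-frame calibration described above can be sketched as a linear mapping from grey levels to the temperature limits read from the calibration bar. The following is a minimal illustration, not the authors' code; the 8-bit grey range and the function name are assumptions.

```python
import numpy as np

def grey_to_temperature(frame, t_min, t_max):
    """Map 8-bit grey levels to temperatures, assuming the linear
    scale indicated by the frame's calibration bar."""
    frame = np.asarray(frame, dtype=float)
    return t_min + (frame / 255.0) * (t_max - t_min)

# Example: a frame whose calibration bar reads 20-36 degrees Celsius
frame = np.array([[0, 128, 255]])
temps = grey_to_temperature(frame, 20.0, 36.0)
```

Because the temperature limits change from frame to frame, this mapping has to be re-evaluated for every image, which is why the limits themselves must first be recognized automatically (Section 2.2).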
The block diagram of the thermal imaging camera shown in Figure 2a presents its optical systems, radiation detector, and electronics for processing and presenting images. The lens projects thermal radiation to the radiation detector, which measures its intensity. This information is then digitized and transferred to the resulting thermogram. The basic parameters of the SEEK Compact thermal camera used in this study and listed in Table 1 include its optical resolution and temperature range. The sensor responds to long-wave infrared radiation with wavelengths between 7.5 and 14 μm.
An alternative approach to breathing data acquisition using an MS Kinect depth sensor is presented in Figure 3. For the selected thorax area, the depth sensing camera evaluates a matrix whose values indicate the distances of the individual pixels from the depth sensor. A video sequence of such frames can be used to determine the time evolution of chest movements in selected regions.
The range imaging methods used in depth sensors are based on specific computational technologies that create matrices whose elements carry information about the distance of the corresponding image component from the sensor [32,33]. The device features a sensor capable of capturing depth maps using ‘Time of Flight’ technology [34]. The basic parameters of the MS Kinect used in this study are summarized in Table 1 as well.
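The reduction of each depth frame to a single breathing sample can be sketched as the mean distance inside the chest ROI. A hypothetical illustration with synthetic frames follows; the function name, ROI coordinates, and synthetic data are assumptions for illustration only.

```python
import numpy as np

def chest_distance_signal(depth_frames, roi):
    """Reduce each depth frame to the mean distance (in mm) of the
    pixels inside the chest ROI, giving one breathing sample per frame.
    `roi` is a (row_slice, col_slice) pair."""
    r, c = roi
    return np.array([frame[r, c].mean() for frame in depth_frames])

# Synthetic example: chest 800 mm away, moving +/- 5 mm while breathing
# at 0.3 Hz, sampled at 10 frames per second
frames = [np.full((424, 512), 800.0) + 5 * np.sin(2 * np.pi * 0.3 * n / 10)
          for n in range(50)]
signal = chest_distance_signal(frames, (slice(150, 250), slice(200, 300)))
```

The resulting one-dimensional signal is then processed by the same filtering and spectral tools as the thermal-camera temperature record.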

2.2. Data Processing

The sequence of images recorded by the thermal camera was acquired with changing temperature ranges associated with each video frame, as presented in Figure 1a. The adaptive recognition of these temperature ranges was performed by a two-layer neural network with sigmoidal and softmax transfer functions [3], trained to recognize individual digits with an accuracy close to 100%. The classification model is able to detect the minimal and maximal temperature values in each thermographic frame and the associated grey levels for the determination of the temperatures in each image.
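The forward pass of such a two-layer classifier can be sketched as follows; the weight matrices, layer sizes, and digit-patch size are illustrative placeholders, since the training details are not given in the text.

```python
import numpy as np

def two_layer_forward(x, W1, b1, W2, b2):
    """Forward pass of a two-layer classifier with a sigmoidal hidden
    layer and a softmax output, as used for digit recognition on the
    calibration bar (weights would come from training)."""
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))   # sigmoid hidden layer
    z = W2 @ h + b2
    e = np.exp(z - z.max())                    # numerically stable softmax
    return e / e.sum()

# Toy dimensions: a 64-pixel digit patch, 16 hidden units, 10 digit classes
rng = np.random.default_rng(0)
x = rng.random(64)
p = two_layer_forward(x, rng.standard_normal((16, 64)), np.zeros(16),
                      rng.standard_normal((10, 16)), np.zeros(10))
```

The softmax output is a probability distribution over the ten digit classes, so the recognized digit is simply the class with the largest probability.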
The accuracy of the thermal camera was tested on a flat surface with equal temperature values by the analysis of individual image frames, as presented in Figure 2b,c. For applications requiring accurate absolute temperature values, a comparison with a calibrated thermometer is presented as well.
Facial temperature values are useful for the detection of neurological disorders and for facial symmetry analysis. The time stability and precision of the thermal camera were tested on a sequence of images recorded with a sampling period of 5 s in the face area of a healthy individual, whose temperature distribution is stable over a short period of time. The area around the eyes illustrated in Figure 4a was selected, and the area of the regions in the selected temperature range of 26–28 °C presented in Figure 4c was analysed. The results in Figure 4d,e show a precision better than 7% relative to the mean temperature, which is sufficient for the given case.
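The stability measure used here, i.e. the share of ROI pixels falling inside a selected temperature range, can be sketched in a few lines; the function name and sample values are illustrative.

```python
import numpy as np

def pixels_in_range(temps, t_lo, t_hi):
    """Percentage of ROI pixels whose temperature falls in [t_lo, t_hi];
    tracking this value over a sequence of frames gives a simple
    stability/precision measure for the camera."""
    temps = np.asarray(temps)
    in_range = (temps >= t_lo) & (temps <= t_hi)
    return 100.0 * np.count_nonzero(in_range) / temps.size

# Toy ROI: 4 of 6 pixels lie inside the 26-28 degC band
roi = np.array([[25.5, 26.5, 27.0],
                [27.5, 28.5, 26.0]])
share = pixels_in_range(roi, 26.0, 28.0)
```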
The ROI for the time evolution of temperature changes was initially specified empirically from the first frame, as presented in Figure 5a. This fixed area assumed that the face maintained a stable position during the observation. To allow more flexible observations, the proposed algorithm includes the automatic detection of the area of interest using the following steps:
  • videorecording of the face area during a selected time range,
  • extraction of thermographic frames with the selected sampling frequency (of 10 Hz) and a given resolution,
  • automatic determination of temperature ranges in each thermographic frame and the adaptive calibration of each thermal image,
  • detection of the mouth area using the selected number of initial frames with the largest temperature changes and the adaptive update of this ROI for each subsequent thermal image,
  • evaluation of the mean temperature in the specified window of a changing position and size in each frame.
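The key adaptive step above, locating the area with the largest temperature changes over the initial frames, can be approximated by scoring the per-pixel temporal standard deviation and centring a window on its maximum. This is a simplified sketch under stated assumptions (fixed window size, synthetic data), not the authors' implementation.

```python
import numpy as np

def detect_breathing_roi(frames, size=20):
    """Locate the region with the largest frame-to-frame temperature
    changes (the mouth/nostril area during breathing) by scoring the
    per-pixel temporal standard deviation and centring a fixed-size
    window on its maximum."""
    stack = np.stack(frames)                  # (n_frames, rows, cols)
    score = stack.std(axis=0)                 # temporal variability map
    r, c = np.unravel_index(np.argmax(score), score.shape)
    half = size // 2
    return (slice(max(r - half, 0), r + half),
            slice(max(c - half, 0), c + half))

# Synthetic frames: a 10x10 patch around row 60, column 80 oscillates
frames = []
for n in range(30):
    f = np.full((120, 160), 26.0)
    f[55:65, 75:85] += 3 * np.sin(2 * np.pi * 0.3 * n / 10)
    frames.append(f)
rs, cs = detect_breathing_roi(frames)
```

Re-running this detection on a sliding history of frames yields the moving ROI, which follows slow head movements during the recording.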
An example of the time evolution of the mean breathing temperature in the selected mouth region is presented in Figure 1a. The mean temperature in each video frame is associated with the grey level, and the dot size at each time instant reflects the currently determined size of the mouth area.
An alternative analysis of breathing based upon thorax movement [1] used the mean value of the distance of the selected chest region from the MS Kinect depth sensor.
The analysis of multimodal records $\{x(n)\}_{n=0}^{N-1}$ of breathing obtained from the thermal imaging camera and depth sensors used similar signal processing methods. Their de-noising was performed by finite impulse response (FIR) filtering of a selected order, $M$, resulting in a new sequence $\{y(n)\}_{n=0}^{N-1}$ using the relation

$$y(n) = \sum_{k=0}^{M-1} b(k)\, x(n-k)$$

with coefficients $\{b(k)\}_{k=0}^{M-1}$ defined to form a filter of the selected type and cutoff frequencies. In the case of breathing signals, a band-pass filter was used to extract the frequency components in the range of $\langle 0.05, 1.5 \rangle$ Hz.
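A windowed-sinc design is one possible way to realize such a band-pass FIR filter; the paper does not specify the design method, so the filter order and Hamming window below are assumptions. The sketch filters a synthetic signal containing a 0.3 Hz breathing component and 4 Hz noise.

```python
import numpy as np

def bandpass_fir(x, fs, f_lo, f_hi, m=201):
    """Band-pass FIR filtering via a windowed-sinc design: the
    difference of two low-pass prototypes, shaped by a Hamming
    window; m is the number of taps. The symmetric filter together
    with mode='same' keeps the output aligned with the input."""
    n = np.arange(m) - (m - 1) / 2
    def lowpass(fc):
        h = np.sinc(2 * fc / fs * n) * (2 * fc / fs)
        return h * np.hamming(m)
    b = lowpass(f_hi) - lowpass(f_lo)       # band-pass coefficients b(k)
    return np.convolve(x, b, mode="same")

# Example: 60 s record at fs = 10 Hz with breathing at 0.3 Hz + 4 Hz noise
fs = 10.0
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 0.3 * t) + 0.5 * np.sin(2 * np.pi * 4.0 * t)
y = bandpass_fir(x, fs, 0.05, 1.5)
```

After filtering, the 0.3 Hz breathing component passes while the 4 Hz component is strongly attenuated, which is the behaviour required before spectral estimation.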
The spectral components were then calculated by the discrete Fourier transform, forming the sequence

$$Y(k) = \sum_{n=0}^{N-1} y(n) \exp\Bigl(-\mathrm{j}\, k n \tfrac{2\pi}{N}\Bigr)$$

for $k = 0, 1, \dots, N-1$, related to the frequency $f_k = \tfrac{k}{N} f_s$. For selected records 300 s long and a sampling frequency of $f_s = 10$ Hz, resulting in each record being $N = 3000$ samples long, the frequency resolution was $\tfrac{1}{N} f_s = 0.0033$ Hz, which is sufficient for the given study.
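The breathing-rate estimation from the DFT peak can be sketched as follows, restricted to the 0.05–1.5 Hz band used above; the function name is an illustrative assumption.

```python
import numpy as np

def breathing_rate_bpm(y, fs):
    """Estimate the dominant breathing frequency from the DFT peak,
    searching only the physiological band 0.05-1.5 Hz; returns
    breaths per minute with resolution fs / len(y)."""
    n = len(y)
    spectrum = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)   # f_k = k / n * fs
    band = (freqs >= 0.05) & (freqs <= 1.5)
    f_peak = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * f_peak

# Example matching the paper's setup: 300 s record at fs = 10 Hz (N = 3000)
fs = 10.0
t = np.arange(3000) / fs
y = np.sin(2 * np.pi * 0.25 * t)            # 0.25 Hz, i.e. 15 bpm
rate = breathing_rate_bpm(y, fs)
```

With N = 3000 samples, the bin spacing is 0.0033 Hz (0.2 bpm), matching the resolution stated in the text.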

3. Results

The detection of breathing features was verified during the monitoring of physical activities on a home exercise bike. Each experiment was 40 min long and included two periods of physical exercise followed by two restful periods, each of them 10 min long. A total of 25 experiments was performed by one individual in similar home conditions.
Figure 5 presents the results of the adaptive detection of the mouth area and the evolution of the mean breathing temperature in the selected time range of 30 s with a sampling frequency of 10 Hz. The adaptive recognition of the temperature range in each frame is applied in this process as well.
The resulting mean temperatures recorded by the thermal imaging camera, evaluated from the fixed and the adaptively specified moving temperature regions of interest, are presented in Figure 6a. The moving ROI is detected from temperature changes recorded over a selected number of previous frames; a history between 1 and 8 s long was used (presented by the green vertical line in Figure 6a). As a result of the adaptive algorithm, the range of temperatures recorded by the moving ROI is larger, owing to the more precisely defined area of temperature changes following a possible slow movement of the head. The mean value of the distance of the chest area from the MS Kinect sensor, recorded simultaneously, is presented in Figure 6a as well.
Figure 6b presents the comparison of the breathing frequencies estimated from the spectral components evaluated for signals recorded both by the thermal imaging camera (using the fixed and moving ROI) and the MS Kinect depth sensor. All frequencies detected by this algorithm are the same for the given frequency resolution.
Table 2 presents the results of mean temperatures and temperature ranges evaluated in the fixed region of interest selected for the thermal imaging camera video records, as well as in the adaptively changing positions and areas of this region evaluated by the proposed method. Table 2 further presents the breathing frequencies estimated from the thermal imaging camera records. Since the records are 300 s long and the frame frequency is 10 Hz, all results are accurate within a frequency resolution of 0.0033 Hz (0.2 bpm). The same frequency was evaluated by the MS Kinect depth sensor, which moreover provides information about thoracic and abdominal motion [1] during respiration.
Both thermal imaging and motion data can be used for monitoring the breathing rate during physical activity. The proposed method of adaptive specification of the breathing area and temperature range recognition was applied to the analysis of the evolution of the breathing features recorded during physical exercise and the following resting time period. Figure 7 presents the breathing temperatures recorded by the thermal imaging camera in the selected time range and the corresponding evolution of mean temperatures and breathing frequency evaluated in time windows 60 s long. Figure 7a presents the evolution of mean temperatures in the fixed and moving mouth area, showing wider temperature ranges for the moving ROI (specified in Table 2). The moving ROI position results in a more robust detection of temperature changes, allowing the motion of the head to be followed. Regression coefficients for the set of exercises 30 min long performed at room temperature and evaluated for the subsequent restful period of 7 min are summarized in Table 3. The resulting mean regression coefficients are −0.16 °C/min and −0.72 bpm in the given case. These results correspond with the physiological explanation of breathing changes.
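The regression coefficients of Table 3 correspond to least-squares slopes of a breathing feature over the resting window; a minimal sketch follows, where the function name and sample data are illustrative, not the measured records.

```python
import numpy as np

def recovery_slope(t_min, values):
    """Least-squares regression slope of a breathing feature (mean
    temperature in degC, or frequency in bpm) over the resting
    period, in units of the feature per minute."""
    slope, _intercept = np.polyfit(t_min, values, 1)
    return slope

# Example: temperature falling 0.16 degC/min over a 7 min rest period
t = np.linspace(0, 7, 8)                 # minutes after the exercise
temp = 27.5 - 0.16 * t                   # idealized, noise-free cooling
coeff = recovery_slope(t, temp)
```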
Table 4 presents mean delays of frequency and temperature changes related to the start of the physical exercise or restful period, evaluated for 32 segments 10 min long. An example of a selected test is presented in Figure 7. Physiological demand causes a faster change of the breathing frequency at the beginning of the physical activity; the increased air flow volume, on the other hand, leads to a longer transition of the breath temperature during exercise segments. Results on the delay of selected physiological functions related to changes of physical activity in cycling experiments [35] correspond to the tests specified above.

4. Conclusions

This paper proposes a method for the use of thermal and depth sensors to detect breathing features. The presentation includes a description of a machine learning method for the recognition of the temperature ranges as well as an adaptive specification of the mouth region using the sequence of thermographic images. The application is devoted to the study of selected physiological features evaluated during physical activities.
The results achieved show an example of the application of simple sensors to breathing analysis, using simple thermal imaging cameras for the detection of temperature changes and MS Kinect depth sensors for the analysis of motion in the chest area. The proposed method of thermographic region detection allows both the analysis of the breathing rate and the study of neurological problems in the facial area.
It is assumed that simple sensors can form an alternative tool for the detection of medical disorders, including sleep and breathing analysis in the home environment.

Supplementary Materials

The following are available online at https://www.mdpi.com/1424-8220/17/6/1408/s1, S1: A video record of a selected set of thermal images showing the fixed and moving ROIs with facial regions of different mean temperatures and S2: a video record of selected depth frames acquired in the selected thorax area.

Acknowledgments

The real data were kindly analysed at the Department of Neurology of the Charles University in Hradec Králové. No ethical approval was required for this study.

Author Contributions

Aleš Procházka was responsible for the mathematical and computational tools of both the thermal and depth data processing, Hana Charvátová recorded all thermal camera images and contributed to their analysis, Oldřich Vyšata interpreted the results from the medical and neurological point of view, Jakub Kopal applied selected computational methods for signal preprocessing, and Jonathon Chambers contributed to the methodology of the thermographic signal processing.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Procházka, A.; Schatz, M.; Vyšata, O. Microsoft Kinect visual and depth sensors for breathing and heart rate analysis. Sensors 2016, 16, 1–11. [Google Scholar] [CrossRef] [PubMed]
  2. Lee, J.; Hong, M.; Ryu, S. Sleep monitoring system using Kinect sensor. Int. J. Distrib. Sens. Netw. 2015, 2015. [Google Scholar] [CrossRef]
  3. Procházka, A.; Vyšata, O.; Vališ, M.; Ťupa, O.; Schatz, M.; Mařík, V. Bayesian classification and analysis of gait disorders using image and depth sensors of Microsoft Kinect. Digit. Signal Prog. 2015, 47, 169–177. [Google Scholar] [CrossRef]
  4. Procházka, A.; Vyšata, O.; Vališ, M.; Ťupa, O.; Schatz, M.; Mařík, V. Use of Image and depth sensors of the Microsoft Kinect for the detection of gait disorders. Neural Comput. Appl. 2015, 26, 1621–1629. [Google Scholar] [CrossRef]
  5. Erden, F.; Velipasalar, S.; Alkar, A.; Cetin, A. Sensors in assisted living. IEEE Signal Process. Mag. 2016, 33, 36–44. [Google Scholar] [CrossRef]
  6. Procházka, A.; Schätz, M.; Centonze, F.; Kuchyňka, J.; Vyšata, O.; Vališ, M. Extraction of breathing features using MS Kinect for sleep stage detection. Signal Image Video Process. 2016, 10, 1278–1286. [Google Scholar] [CrossRef]
  7. Appel, V.; Belini, V.; Jong, D.; Magalhães, D.; Caurin, G. Classifying emotions in rehabilitation robotics based on facial skin temperature. In Proceedings of the IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics, Sao Paulo, Brazil, 12–14 August 2014; pp. 276–281. [Google Scholar]
  8. Boccanfuso, L.; Wang, Q.; Leite, I.; Li, B.; Torres, C.; Chen, L.; Salomons, N.; Foster, C.; Barney, E.; Ahn, Y.; et al. A thermal emotion classifier for improved human–robot interaction. In Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA, 26–31 August 2016; pp. 718–723. [Google Scholar]
  9. Kwaśniewska, A.; Rumiński, J. Face detection in image sequences using a portable thermal camera. In Proceedings of the 13th Quantitative Infrared Thermography Conference, Gdansk, Poland, 4–8 July 2016. [Google Scholar]
  10. Latif, M.; Md. Yusof, H.; Sidek, S.; Rusli, N.; Fatai, S. Emotion detection from thermal facial imprint based on GLCM features. ARPN-JEAS 2016, 11, 345–350. [Google Scholar]
  11. Nguyen, H.; Kotani, K.; Chen, F.; Le, B. Estimation of human emotions using thermal facial information. In Proceedings of the SPIE—The International Society for Optical Engineering, ICGIP 2013, Hong Kong, China, 26–27 October 2013. [Google Scholar]
  12. Rahulamathavan, Y.; Phan, R.C.V.; Chambers, J.A.; Parish, D.J. Facial expression recognition in the encrypted domain based on local fisher discriminant analysis. IEEE Trans. Affect. Comput. 2013, 4, 83–92. [Google Scholar] [CrossRef]
  13. Cheong, Y.; Yap, V.; Nisar, H. A novel face detection algorithm using thermal imaging. In Proceedings of the 2014 IEEE Symposium on Computer Applications and Industrial Electronics, ISCAIE, Penang, Malaysia, 7–8 April 2014; pp. 208–213. [Google Scholar]
  14. Liu, P.; Yin, L. Spontaneous facial expression analysis based on temperature changes and head motions. In Proceedings of the 11th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition, FG 2015, Ljubljana, Slovenia, 4–8 May 2015. [Google Scholar]
  15. Cardone, D.; Pinti, P.; Merla, A. Thermal infrared imaging-based computational psychophysiology for psychometrics. Comput. Math. Method Med. 2015, 2015, 1–8. [Google Scholar] [CrossRef] [PubMed]
  16. Ioannou, S.; Gallese, V.; Merla, A. Thermal infrared imaging in psychophysiology: Potentialities and limits. Psychophysiology 2014, 51, 951–963. [Google Scholar] [CrossRef] [PubMed]
  17. Nhan, B.; Chau, T. Classifying affective states using thermal infrared imaging of the human face. IEEE Trans. Biomed. Eng. 2010, 57, 979–987. [Google Scholar] [CrossRef] [PubMed]
  18. Hong, K.; Hong, S. Real-time stress assessment using thermal imaging. Vis. Comput. 2016, 32, 1369–1377. [Google Scholar] [CrossRef]
  19. Engert, V.; Merla, A.; Grant, J.; Cardone, D.; Tusche, A.; Singer, T. Exploring the use of thermal infrared imaging in human stress research. PLoS ONE 2014, 9, e90782. [Google Scholar] [CrossRef] [PubMed]
  20. Kim, H.; Kim, J.-Y.; Im, C.-H. Fast and robust real-time estimation of respiratory rate from photoplethysmography. Sensors 2016, 16, 1494. [Google Scholar] [CrossRef] [PubMed]
  21. Zhang, X.; Ding, Q. Respiratory rate estimation from the photoplethysmogram via joint sparse signal reconstruction and spectra fusion. Biomed. Signal Process. Control 2017, 35, 1–7. [Google Scholar] [CrossRef]
  22. Hu, M.H.; Zhai, G.T.; Li, D.; Fan, Y.Z.; Chen, X.H.; Yang, X.K. Synergetic use of thermal and visible imaging techniques for contactless and unobtrusive breathing measurement. J. Biomed. Opt. 2017, 22, 1–11. [Google Scholar] [CrossRef] [PubMed]
  23. Lin, Y.-D.; Chien, Y.-H.; Chen, Y.-H. Wavelet-based embedded algorithm for respiratory rate estimation from PPG signal. Biomed. Signal Process. Control 2017, 36, 138–145. [Google Scholar] [CrossRef]
  24. Carpagnano, G.E.; Foschino-Barbaro, M.P.; Crocetta, C.; Lacedonia, D.; Saliani, V.; Zoppo, L.D.; Barnes, P.J. Validation of the exhaled breath temperature measure: Reference values in healthy subjects. Chest 2017, 151, 855–860. [Google Scholar] [CrossRef] [PubMed]
  25. Khalidi, F.Q.; Saatchi, R.; Burke, D.; Elphick, H.; Tan, S. Respiration rate monitoring methods: A review. Pediatr. Pulmonol. 2011, 46, 523–529. [Google Scholar] [CrossRef] [PubMed]
  26. Adib, F.; Mao, H.; Kabelac, Z.; Katabi, D.; Miller, R.C. Smart homes that monitor breathing and heart rate. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI 2015, Seoul, Korea, 18–23 April 2015; pp. 837–846. [Google Scholar]
  27. Heck, D.H.; McAfee, S.S.; Liu, Y.; Babajani-Feremi, A.; Rezaie, R.; Freeman, W.J.; Wheless, J.W.; Papanicolaou, A.C.; Ruszinko, M.; Sokolov, Y.; Kozma, R. Breathing as a fundamental rhythm of brain function. Front. Neural Circuits 2017, 10, 115. [Google Scholar] [CrossRef] [PubMed]
  28. Murthy, R.; Pavlidis, I.; Tsiamyrtzis, P. Touchless monitoring of breathing function. In Proceedings of the 26th Annual International Conference of the IEEE EMBS, San Francisco, CA, USA, 1–5 September 2004. [Google Scholar]
  29. Al-Obaisi, F.; Alqatawna, J.; Faris, H.; Rodan, A.; Al-Kadi, O. Pattern recognition of thermal images for monitoring of breathing function. Int. J. Control Autom. 2015, 8, 381–392. [Google Scholar] [CrossRef]
  30. Folke, M.; Cernerud, L.; Ekström, M.; Hök, B. Critical review of non-invasive respiratory monitoring in medical care. Med. Biol. Eng. Comput. 2003, 41, 377–383. [Google Scholar] [CrossRef] [PubMed]
  31. Usamentiaga, R.; Venegas, P.; Guerediaga, J.; Vega, L.; Molleda, J.; Bulnes, F.G. Infrared thermography for temperature measurement and non-destructive testing. Sensors 2014, 14, 12305–12348. [Google Scholar] [CrossRef] [PubMed]
  32. Xia, J.; Siochi, R.A. A real-time respiratory motion monitoring system using microsoft kinect sensor. Med. Phys. 2012, 39, 2682–2685. [Google Scholar] [CrossRef] [PubMed]
  33. Griessenberger, H.; Heib, D.P.J.; Kunz, A.B.; Hoedlmoser, K.; Schabus, M. Assessment of a wireless headband for automatic sleep scoring. Sleep Breath. 2013, 17, 747–752. [Google Scholar] [CrossRef] [PubMed]
  34. Kolb, A.; Barth, E.; Koch, R.; Larsen, R. Time-of-flight sensors in computer graphics. In Eurographics 2009—State of the Art Reports; Pauly, M., Greiner, G., Eds.; The Eurographics Association: Geneva, Switzerland, 2009; pp. 119–134. [Google Scholar]
  35. Charvátová, H.; Procházka, A.; Vaseghi, S.; Vyšata, O.; Vališ, M. GPS-based analysis of physical activities using positioning and heart rate cycling data. Signal Image Video Process. 2017, 11, 251–258. [Google Scholar] [CrossRef]
Figure 1. Data acquisition presenting (a) specification of the fixed and moving region of interest (ROI); (b) facial regions of different mean temperatures in the selected thermal image frame.
Figure 2. Thermal image analysis presenting (a) the block diagram of the thermal camera; (b) a selected image frame of a compact surface with equal temperature values; and (c) the distribution of values recorded by individual pixels.
Figure 3. Principle of data acquisition by an MS Kinect depth sensor, presenting a selected depth frame with the regions of interest used to detect the chest movement.
Figure 4. Thermal imaging camera accuracy analysis presenting (a) a selected thermal image with the region of interest (ROI) and the temperature bar; (b) areas specifying subregions with temperatures in the range of 26–28 °C; (c) the ROI temperature surface plot; (d) a global analysis of the percentage of thermal pixels in the selected range of 26–28 °C; and (e) percentage values of thermal pixels in the selected range of 26–28 °C detected in two selected areas and their evolution for 24 images recorded over two minutes with a sampling period of 5 s.
Figure 5. Principle of the use of the thermal imaging camera for breathing analysis, presenting the time evolution of the mean breathing temperature detected in the selected mouth region (a) during the physical exercise with the higher temperature and breathing frequency; (b) during the restful period with the lower temperature and breathing frequency.
Figure 6. Data processing presenting (a) signals recorded by the thermal imaging camera and the MS Kinect depth sensor; and (b) detection of the breathing frequency from the fixed ROI, the moving ROI, and the MS Kinect depth sensor.
Figure 7. An example of records and results evaluated during two 10 min long segments of the physical exercises followed by two 10 min long resting time periods presenting (a) the time evolution of breathing temperatures; (b) the time evolution of the mean breathing temperature in a time window 60 s long; and (c) associated breathing frequency.
Table 1. Basic parameters of the thermal camera and MS Kinect sensors used for breathing analysis.
Thermal Camera Specifications

| Feature | Description |
|---|---|
| Thermal sensor resolution | 206 × 156 |
| Detection distance | 300 m |
| Temperature range | −40–330 °C |
| Frame rate | <9 Hz |
| Microbolometer | Vanadium Oxide |
| Lens material | Chalcogenide |
| Pixel pitch | 12 μm |
| Spectral range | 7.5–14 μm |

MS Kinect Specifications

| Feature | Description |
|---|---|
| RGB stream resolution | 1920 × 1080 |
| Depth stream resolution | 512 × 424 |
| Infrared stream resolution | 512 × 424 |
| Depth range | 0.4–4 m |
| Frame rate | <30 Hz |
Table 2. Mean temperatures (T), temperature ranges (R) and evaluated breathing frequencies (F) for fixed and changing regions of interest using thermal imaging camera records 5 min long acquired in the same restful periods of different physical tests.
| Test | Fixed ROI: T (°C) | R (°C) | F (bpm) | Moving ROI: T (°C) | R (°C) | F (bpm) |
|---|---|---|---|---|---|---|
| 1 | 26.49 | 3.49 | 14.79 | 27.19 | 10.07 | 14.79 |
| 2 | 26.26 | 3.17 | 15.61 | 27.02 | 9.12 | 15.61 |
| 3 | 26.79 | 5.66 | 15.41 | 27.09 | 10.61 | 15.41 |
| 4 | 27.33 | 4.31 | 15.82 | 27.47 | 11.38 | 15.82 |
| 5 | 26.14 | 4.00 | 16.23 | 26.95 | 9.44 | 16.23 |
| 6 | 27.55 | 4.20 | 14.58 | 27.32 | 9.07 | 14.58 |
| 7 | 27.54 | 4.13 | 16.64 | 27.40 | 9.98 | 16.64 |
Table 3. Regression coefficients and the mean squared errors S of the temperature and breathing frequency decrease during the time period of 7 min after the physical exercise 30 min long recorded by the thermal image camera.
| Experiment | Temperature Reg. Coeff. (°C/min) | S (%) | Frequency Reg. Coeff. (bpm) | S (%) |
|---|---|---|---|---|
| 1 | −0.252 | 0.001 | −0.513 | 0.401 |
| 2 | −0.182 | 0.001 | −1.571 | 0.476 |
| 3 | −0.135 | 0.001 | −0.250 | 0.055 |
| 4 | −0.092 | 0.003 | −0.117 | 0.508 |
| 5 | −0.148 | 0.001 | −1.150 | 0.358 |
| Mean (STD) | −0.162 (0.059) | | −0.720 (0.619) | |
Table 4. Mean delays of frequency and temperature changes related to the change of physical activity (physical exercise or restful period) for the set of 32 records 10 min long.
| Breathing Feature | Segment | Mean Delay (s) | STD |
|---|---|---|---|
| Frequency | Load | 76 | 17 |
| Frequency | Rest | 98 | 47 |
| Temperature | Load | 188 | 59 |
| Temperature | Rest | 130 | 34 |

