Review

Sensing Systems for Respiration Monitoring: A Technical Systematic Review

EduQTech, Electrical/Electronics Engineering and Communications Department, EUP Teruel, Universidad de Zaragoza, 44003 Teruel, Spain
* Author to whom correspondence should be addressed.
Sensors 2020, 20(18), 5446; https://doi.org/10.3390/s20185446
Submission received: 9 August 2020 / Revised: 16 September 2020 / Accepted: 16 September 2020 / Published: 22 September 2020

Abstract

Respiratory monitoring is essential in sleep studies, sport training, patient monitoring, or health at work, among other applications. This paper presents a comprehensive systematic review of respiration sensing systems. After several systematic searches in scientific repositories, the 198 most relevant papers in this field were analyzed in detail. Different items were examined: sensing technique and sensor, respiration parameter, sensor location and size, general system setup, communication protocol, processing station, energy autonomy and power consumption, sensor validation, processing algorithm, performance evaluation, and analysis software. As a result, several trends and the remaining research challenges of respiration sensors were identified. Long-term evaluations and usability tests should be performed. Researchers designed custom experiments to validate the sensing systems, making it difficult to compare results. Therefore, another challenge is to have a common validation framework to fairly compare sensor performance. The implementation of energy-saving strategies, the incorporation of energy harvesting techniques, the calculation of volume parameters of breathing, or the effective integration of respiration sensors into clothing are other remaining research efforts. Addressing these and other challenges outlined in the paper is a required step to obtain a feasible, robust, affordable, and unobtrusive respiration sensing system.

Graphical Abstract

1. Introduction

Continuous monitoring of physiological variables is essential for health and well-being applications. One of the most interesting physiological variables is respiration. Breathing information is useful for health condition assessment [1]. It can help diagnose respiratory diseases, such as asthma, sleep apnea, and chronic obstructive pulmonary diseases (chronic bronchitis, emphysema, and non-reversible asthma) [2]. It is also used to identify heart failure or heart attack [3] and may serve as an indicator of changes in the nervous system, cardiovascular system, or excretory system, among others [4]. Once a disease has been diagnosed, breathing monitoring may be used during treatment or for the surveillance of patients. It also plays a relevant role in the monitoring of newborn babies. Some of them are born under delicate conditions, and this monitoring may help prevent fatalities due to infant sleep apnea [5]. Older people suffering from age-related conditions and diseases, like Parkinson's disease or dementia [6], and sedentary patients could also benefit from unobtrusive health surveillance [7].
Breathing monitoring is also applicable to the field of occupational health and safety [8]. Firstly, having breathing information from workers can be helpful in assessing work-related risks and planning preventive actions to be undertaken before a work-related disease appears. The analysis of respiratory information may also lead to the design of safer workplaces. Secondly, respiratory monitoring may help prevent work accidents and is especially useful in jobs such as piloting aircraft, operating industrial machinery, or driving cars, buses, or trains, where workers can benefit from having breathing information in real time [9].
Respiratory monitoring has also been applied in the analysis of human emotions [10,11]. Respiratory rate (RR) can be associated with emotions, such as fear, stress, anger, happiness, sadness, or surprise [12]. This can be used to prevent mental diseases and in the treatment of patients with mental disorders. Human emotions are also useful in psychological studies, for example, to assess or understand consumer and social trends [13]. They have also been applied in assessing the level of safety of drivers [14] by monitoring their emotional state. They were also used in the computer science field to improve software engineering processes, overcoming the limitations of usability questionnaires and helping to provide more personalized web experiences. For example, they can be used to obtain information about consumer behavior on websites and their interactions. Respiratory monitoring may contribute to real time recognition of emotions, which is an area of active research in the video game industry to generate dynamic gaming experiences [15]. There are also applications in the education field and e-Learning. Some emotional states have positive effects on learning processes, while others hinder them. It is possible to personalize the learning process by providing the most effective resources for each emotional state [16].
Respiratory information is also applied in the sports field to monitor the performance of athletes during their activities [17,18]. This information can be used to optimize their training or to prevent health problems. Similarly, it is used in Magnetic Resonance Imaging (MRI) machines to guarantee the good conditions of patients throughout the process [19] and to reduce their level of stress [20].
Another less common application of respiratory sensing is the evaluation of the health of combat soldiers [21,22]. This has a double utility: it provides information on the integrity of soldiers and allows field information to be collected. Breathing monitoring has also been used in emergency situations, such as search and rescue operations, in which breathing information is required in a non-contact way for faster and more effective intervention [23].
Figure 1 shows an overview of the applications of respiration monitoring.
To perform respiratory monitoring, several approaches have been proposed in the literature [24]. Monitoring systems use sensors to measure breathing parameters. There are large differences among approaches depending on sensing techniques and sensors, breathing parameters, sensor locations, system setups, communication protocols, processing stations, energy autonomy and power consumption, field of application, algorithms used to process sensor data, analysis software, and performance evaluation, among others. Given that the number of studies and approaches has increased dramatically in recent years, it may be useful to review existing systems, discussing trends, challenges, and issues in this field.
There are several existing reviews in the field of wearable sensors. For example, the survey of Mukhopadhyay et al. [25] focused on wearable sensors to monitor human activity continuously. They described the typical architecture of a human activity monitoring system based on sensors, microcontrollers, communication modules, and remote processing. The paper outlined transmission technologies and energy harvesting issues and predicted an increase in interest in wearable devices in the near future. Similarly, the work of Nag et al. [26] reviewed flexible wearable sensors to monitor physiological parameters. The study focused on the materials used to manufacture sensors based on different factors, such as application, material availability, cost, or manufacturing techniques. Different operating principles were identified: electromechanical, pressure and strain, chemical, and magnetic field-based, among others. The transmission technologies used in the sensing systems and their possible applications were also reviewed in detail. Finally, the paper identified several challenges and future opportunities. The most relevant was the expected reduction in the cost of manufacturing flexible sensing systems. However, this paper focused exclusively on flexible sensing systems, and no review of other technologies was performed. In addition, it did not specifically address respiration sensing, but instead considered sensors for any type of physiological parameter. Similarly, the reviews of Chung et al. [27] and Bandodkar et al. [28] also focused on wearable flexible sensors, but specifically targeted at sweat analysis. Meanwhile, the review of Lopez-Nava et al. [29] addressed inertial sensors for human motion analysis. Different aspects were studied: sensor type, number of sensing devices and their combination, processing algorithms, measured motion units, systems used for comparison, and number of test subjects and their age range, among others. The review identified a trend toward low-cost wearable systems.
Seshadri et al. [30] presented a work focused on wearable sensors to monitor athletes’ internal and external workload. The paper addressed wearable devices for athletes comprehensively, including physical performance, physiological and mental status, and biochemical composition. RR was considered as one more physiological parameter. In fact, sensors to measure position, motion, impact, biomechanical forces, heart rate, muscle oxygen saturation, and sleep quality were also considered. The paper concluded that wearable sensors had the potential to minimize the onset of injuries and evaluate athlete performance in real time.
Aroganam et al. [31] reviewed wearable sensors for sport applications, excluding professional sports. Communication technologies, battery life, and applications were widely discussed. The paper concluded that inertial and Global Positioning System (GPS) sensors were predominant in sport wearables. A gap was detected in user experience studies of existing devices. Meanwhile, Al-Eidan et al. [32] presented a systematic review of wrist-worn wearable sensors. They focused on user interface, interaction, and usage studies of the sensing systems. Processing techniques were also analyzed, showing high variability, ranging from machine learning techniques to threshold-based methods. Validation experiments lasted from 2 s to 14 weeks, and most of them were performed under laboratory conditions. Few studies presented real-world setups with target users. Other aspects analyzed were sampling frequencies and extracted features. Challenges of wrist-based systems were identified in relation to weight, battery life, lack of standardization, safety, user acceptance, and design.
Mansoor et al. [33] performed a review on wearable sensors for older adults. The review focused on sensor target population, sensor type, application area, data processing, and usability. Fourteen papers were analyzed. They identified barriers, such as inaccurate sensors, battery issues, restriction of movements, lack of interoperability, and low usability. The paper concluded that these technical challenges should be resolved for successful use of wearable devices.
Heikenfeld et al. [34] conducted a review on wearable sensors that interfaced with the epidermis. Wearable sensors were classified into four broad groups: mechanical, electrical, optical, and chemical. Several subgroups were identified within each category. Body-to-signal transduction, actual devices and demonstrations, and unmet challenges were discussed. The paper concluded that, in general, sensing categories had remained isolated from each other in commercial products, and strategies were still needed to easily attach and detach disposable systems.
Witte and Zarnekow [35] reviewed wearable sensors for medical applications. Ninety-seven papers were analyzed in relation to disease treatments, fields of application, vital parameters measured, and target patients. The paper identified a trend toward the monitoring of heart and mental diseases. Sensors were used for monitoring or diagnosis, collecting physical activity data, or heart rate data. The work of Pantelopoulos et al. [36] surveyed wearable biosensor systems for health monitoring. The design of multiparameter physiological sensing systems was discussed in detail. Meanwhile, the study of Liang et al. [37] addressed wearable mobile medical monitoring systems. Emphasis was placed on devices based on wireless sensing networks, and special attention was given to textile technologies. Finally, the paper of Charlton et al. [38] reviewed the estimation of RR from two different signals: the electrocardiogram (ECG) and the pulse oximetry signal (photoplethysmogram, PPG).
A recent review on contact-based sensors to measure RR was published by Massaroni et al. [24]. This paper identified seven contact-based techniques: measurement of respiratory airflow, respiratory sounds, air temperature, air humidity, air components, chest wall movements, and modulation of cardiac activity. Several possible sensors could be used for each technique. Some of the sensors identified in the review were flowmeters, anemometers, fiber optic sensors, microphones, thermistors, thermocouples, pyroelectric sensors, capacitive sensors, resistive sensors, nanocrystal and nanoparticle sensors, infrared, inductive, transthoracic, inertial, ECG sensors, and PPG sensors, among others. The paper presented a detailed description of each sensing technology, focusing on metrological properties and operating principles. Equations were provided for most sensors. In addition, the study compared the optimal techniques for clinical settings (respiratory airflow, air temperature, air components, chest wall movements, and modulation of cardiac activity), occupational settings (respiratory airflow, air components, and chest wall movements), and sport and exercise (respiratory airflow and chest wall movements). These techniques were considered optimal for controlled environments.
A previous work on respiration sensors was published by AL-Khalidi et al. [39]. This paper covered both non-contact and contact-based methods and provided a general description of several sensing techniques. On the one hand, contact-based technologies included six sensing methods: acoustic, airflow detection, chest and abdominal movement measuring, transcutaneous CO2 monitoring, oximetry probe (SpO2), and electrocardiogram-derived methods. On the other hand, non-contact technologies included radar-based detection, optical methods, thermal sensors, and thermal imaging. The paper concluded that non-contact RR monitoring had advantages over contact methods since it causes less discomfort to patients.
Three other related surveys have been published, to the best of our knowledge. The review by van Loon et al. [40] studied respiratory monitoring from a hospital perspective without analyzing technical items. The review of Rajala and Lekkala [41] focused exclusively on the film-type sensor materials polyvinylidene fluoride (PVDF) and electro-mechanical film (EMFi), while the recent review of Massaroni et al. [42] analyzed fiber Bragg grating sensors for cardiorespiratory monitoring.
In this paper, we present a survey on sensing systems for respiratory monitoring. This paper has several novelties with respect to the existing reviews in the state of the art:
  • This review is not exclusively focused on sensor metrological properties or operating principles. Instead, this paper also reviews all the different aspects involved in the design and development of a respiration sensing system: communication protocols, processing stations, energy autonomy and power consumption, general system setups, sensor location and size, breathing parameters, validation methods, details of the test experiments, processing algorithms, software used for analysis, and performance evaluation. To the best of our knowledge, this is the first review paper that analyzes all these aspects in breathing sensors.
  • This paper does not focus exclusively on RR. In addition, sensors that measure other breathing parameters are also surveyed.
  • Unlike previous reviews, this survey is systematic. Studies on respiration sensors were obtained using objective selection criteria. They were then subjected to detailed analysis.
Therefore, this paper provides a comprehensive overview of all aspects to consider in the design of respiratory sensing systems. It aims to help engineers and researchers to identify the different options at each design stage.
The structure of this review is as follows: Section 2 presents the study design, selection criteria, and organization of the review results; Section 3 describes the results of the literature search, which are classified into different groups, the items of analysis and the results of the analysis of those items for each study; Section 4 discusses the trends in respiratory monitoring, the issues in the design of respiration sensors, and the current challenges in this field, highlighting the research opportunities; and, finally, Section 5 draws some conclusions.

2. Materials and Methods

2.1. Search and Selection Procedure

A systematic search of the literature was carried out to identify relevant papers in the field of sensors for respiratory monitoring. IEEE (Institute of Electrical and Electronics Engineers) Xplore and Google Scholar were used for this review. IEEE Xplore is a reference repository for engineering studies, and Google Scholar provides a broader perspective to complement the results. Five pairs of keywords were selected to perform the searches. To identify these keywords, a preliminary study was conducted that examined key studies in this field. As a result, the five search terms selected were the following: (1) “breathing” plus “monitoring”, (2) “respiratory” plus “monitoring”, (3) “breathing” plus “sensor”, (4) “respiratory” plus “sensor”, and (5) “respiration” plus “sensor”.
To analyze the most recent research, articles from 2010 to 2019 were considered. Searches were conducted in February 2019 and repeated in March 2020. The sort by relevance of IEEE Xplore and Google Scholar was used to obtain the most relevant articles first. According to the official IEEE Xplore website, the search results are “sorted by how well they match the search query as determined by IEEE Xplore” [43]. Regarding the relevance criteria of Google Scholar, its official website points out that the rank is made by “weighting the full text of each document, where it was published, who it was written by, as well as how often and how recently it has been cited in other scholarly literature” [44]. Journals, magazines, and conferences were considered in the searches. As a result of the five searches in the two repositories, more than a million results were obtained. For each search and repository, the 100 most relevant papers were selected, resulting in 1000 studies. This number is high enough to provide a comprehensive review of the topic. The title and abstract of all these studies were examined and those not related to the subject of the review were discarded, resulting in 236 papers. Then, a second selection was made based on the content of the papers, discarding those that did not deal with sensors for respiratory monitoring. Finally, 198 papers were obtained. All of them were subjected to a detailed analysis that is presented in Section 3. Figure 2 (top) shows an overview of the selection procedure. Figure 2 (bottom) presents the PRISMA diagram that details the item selection process [45].

2.2. Organization of the Results

The search results were analyzed in detail. For that, papers were divided into two categories: wearable systems and environmental systems. This is a typical classification found in several sensor-related studies [24,46]. Wearable methods require individuals to carry the sensors, while environmental methods place them around subjects. The wearable category includes 113 studies, while the environmental category comprises the remaining 85 studies.
Different aspects of respiratory sensing systems were analyzed for each paper. The items selected can be divided into four categories (Figure 3): (1) sensor and breathing parameter, (2) data transmission and power consumption, (3) experiments performed for sensor validation, and (4) sensor measurement processing.
The category “sensor and breathing parameter” includes the following items of analysis: (1.1) sensing technique and sensor, (1.2) breathing parameter, and (1.3) sensor location and size.
Four items are included in the category “data transmission and power consumption”: (2.1) general system setup, (2.2) communication protocol, (2.3) processing station, and (2.4) energy autonomy and power consumption.
The category “sensor validation” comprises several items related to the design of experiments to validate the sensors (they are listed in Section 3.3).
Three items are included in the “sensor measurement processing” category: (4.1) performance evaluation, (4.2) processing algorithm, and (4.3) software used for the analysis.
For each category, we first describe in detail the different items of analysis, except item (1.1) “sensing technique and sensor”, which was described extensively in the review of Massaroni et al. [24]. Then, we provide the value of those items for each study selected for both categories (wearable and environmental). Results were subjected to critical analysis and discussion.

3. Results

This section has been structured around the four categories of analysis introduced in Section 2.2. First, the items of analysis and their possible values are described in detail for each category (Section 3.1.1, Section 3.2.1, Section 3.3.1 and Section 3.4.1). Then, the values of those items provided in the studies selected are analyzed and discussed (Section 3.1.2, Section 3.2.2, Section 3.3.2 and Section 3.4.2).

3.1. Sensor and Breathing Parameter

3.1.1. Items of Analysis

This category includes the following items of analysis: sensing technique and sensor, breathing parameter, and sensor location and size.

Sensing Technique and Sensor

According to the review of Massaroni et al. [24], two different dimensions can be observed in the operating principle: the technique selected to obtain respiration information and the sensor used to capture that information. For each possible technique, there are several sensors available.
To classify the papers analyzed in this review, the classification established in the work of Massaroni et al. [24] was used. It was expanded to also cover environmental breathing sensors. The techniques and sensors identified were:
  • Technique based on measurements of respiratory airflow. Possible sensors are differential flowmeters, turbine flowmeters, hot wire anemometers, photoelectric sensors, and fiber optic sensors.
  • Technique based on measurements of respiratory sounds. Possible sensors are microphones.
  • Technique based on measurements of air temperature. Possible sensors are thermistors, thermocouples, pyroelectric sensors, fiber optic sensors, infrared sensors, and cameras.
  • Technique based on measurements of air humidity. Possible sensors are capacitive sensors, resistive sensors, nanocrystal and nanoparticle sensors, impedance sensors, and fiber optic sensors.
  • Technique based on measurements of chest wall movements. Three different types of measurement were identified in this technique:
    Strain measurements: Possible sensors are resistive sensors, capacitive sensors, inductive sensors, fiber optic sensors, piezoelectric sensors, pyroelectric sensors, and triboelectric nanogenerators.
    Impedance measurements: Possible sensors are transthoracic impedance sensors.
    Movement measurements: Possible sensors are accelerometers, gyroscopes and magnetometers, frequency shift sensors, DC (direct current) generators, ultrasonic proximity sensors, cameras, optical sensors, inductive sensors, and Kinect sensors.
  • Technique based on measurements of modulation of cardiac activity. Possible sensors are ECG sensors (for biopotential measurements), PPG sensors (for light intensity measurements), radar sensors, and Wi-Fi transmitters and receivers.
Equations and details of the different sensors are included in the reference review paper [24].

Breathing Parameters

Breathing parameters are the metrics provided as output of the sensing process. Possible breathing parameters are the following:
  • Respiratory rate (RR): Number of breaths (inspiration and expiration cycles) performed by a subject in one minute (Figure 4). It is measured in breaths/min (bpm). Other metrics derived from the RR can also be calculated [10]:
    Breathing period: Duration of a breathing cycle, in seconds (s).
    Inspiratory time: Part of the breathing period that corresponds to inspiration (s). According to Figure 4A, it can be obtained as $t_b - t_a$.
    Expiratory time: Part of the breathing period that corresponds to expiration (s). According to Figure 4A, it can be obtained as $t_d - t_c$.
  • Volume parameters: Metrics that provide volume information obtained from inhaled or exhaled air during breathing. Volume metrics comprise a set of sub-metrics related to the volume of air available in the lungs [47]. Some of the metrics that could be found in the breathing sensor studies were:
    Tidal volume (TV): It is the volume of air inhaled or exhaled during normal respiration (without forcing breathing). It is measured in liters (L). From the volume versus time signal represented in Figure 4B, the TV for a given breathing period can be calculated as $TV = |V_n - V_{n-1}|$, where $V_n$ is the air volume associated with the $n$-th respiration peak or valley.
    Minute volume (MV): It is the volume of air inhaled or exhaled by a subject in one minute during normal breathing. It is measured in L/min. It can be roughly obtained from the TV and the RR as $MV = TV \cdot RR$. From the representation of Figure 4B, the MV can be calculated as $MV = \sum_{i=2}^{n} |V_{i-1} - V_i|, \;\; \forall i \in [2, n]: 2 \mid i$, where n is the number of peaks (or valleys) in the air volume curve that can be found in one minute of breathing (a minimal computational sketch of RR, TV, and MV follows this list).
    Peak inspiration flow (PIF): According to Warner and Patel [47], it is the maximum flow at which a given tidal volume breath can be delivered. It is measured in L/min. From the representation of Figure 4B, it can be obtained as $PIF = (V_b - V_a)/(t_b - t_a)$, where $(V_a, t_a)$ is the point associated with the valley in the time-volume curve before inspiration, and $(V_b, t_b)$ is the point related to the peak of inspiration at which the given tidal volume is delivered.
    Exhalation flow rate (EFR): Volume of air exhaled per time unit. It is expressed in L/s and can be calculated as $EFR = (\text{Volume of exhaled air}) / (\text{Exhalation time})$ [48]. From the representation of Figure 4B, it can be obtained as $EFR = (V_3 - V_4)/(t_4 - t_3)$, where $(V_3, t_3)$ is the point corresponding to a peak of the time-volume curve, and $(V_4, t_4)$ is the next valley of the curve.
    There are other air volume metrics, such as peak expiratory flow (maximum flow at which a given tidal volume can be exhaled; it can be obtained as $(V_c - V_d)/(t_d - t_c)$ from Figure 4B), vital capacity (volume of air expired after deep inhalation; it can be obtained as $V_e - V_f$ from Figure 4B), or forced vital capacity (same as vital capacity but maximizing the expiratory effort; it can be obtained as $V_g - V_h$ from Figure 4B), among others [49]. They have barely been used in breathing sensor studies.
    Compartmental volume: Instead of considering air volume, this metric measures the change in volume of breathing-related body parts, like chest, thorax, or abdomen [49].
  • Respiration patterns: There are studies in which the purpose is to identify patterns in the signals obtained from the recording of respiration instead of providing a particular breathing parameter. Common patterns identified are abnormal breathing [50,51,52], apnea episodes [50,51,53,54], Kussmaul’s respiration, Cheyne-Stokes breathing, Biot’s respiration, Cheyne-Stokes variant, or dysrhythmic breathing, among others [53]. There are also studies that identified the type of breathing (heavy or shallow breathing, mouth breathing, abdominal breathing, or chest breathing) [53].
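As a complement to the definitions above, the following minimal Python sketch (not taken from any of the reviewed papers; the synthetic signal, sampling rate, and detection settings are illustrative assumptions) shows how RR, TV, and MV could be derived from a sampled volume-versus-time breathing signal using simple peak and valley detection:
```python
# Illustrative sketch: RR, tidal volume, and minute volume from a sampled
# volume-vs-time breathing signal (assumed clean signal, no artifact handling).
import numpy as np
from scipy.signal import find_peaks

def breathing_metrics(volume, fs):
    """volume: air volume samples (L); fs: sampling rate (Hz)."""
    peaks, _ = find_peaks(volume)        # end-of-inspiration points
    valleys, _ = find_peaks(-volume)     # end-of-expiration points

    # Respiratory rate: breaths per minute from peak-to-peak intervals.
    periods_s = np.diff(peaks) / fs      # breathing periods (s)
    rr_bpm = 60.0 / periods_s.mean()

    # Tidal volume: mean |difference| between consecutive peaks and valleys.
    n = min(len(peaks), len(valleys))
    tv_l = np.mean(np.abs(volume[peaks[:n]] - volume[valleys[:n]]))

    mv_l_min = tv_l * rr_bpm             # rough minute volume: MV = TV * RR
    return rr_bpm, tv_l, mv_l_min

# Example: synthetic 0.5 L tidal breathing at 15 breaths/min, sampled at 50 Hz.
fs = 50
t = np.arange(0, 60, 1 / fs)
volume = 0.25 * np.sin(2 * np.pi * (15 / 60) * t) + 0.25
print(breathing_metrics(volume, fs))     # ~ (15.0, 0.5, 7.5)
```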

Sensor Location and Size

Sensor location and size play a relevant role in system usability and can determine the acceptability of the technology by its potential users [55,56]. Figure 5 shows possible locations for wearable systems. The locations are chest (diaphragm or pectoral muscle), abdomen, waist, arm, forearm, finger, mouth (including mouth mask), nose (nasal bridge, above lip or nostril), wrist, neck (suprasternal notch area), or back. Regarding environmental systems, sensors can be located at a fixed distance from the subject, can be integrated into an object commonly used by the subject (pillow, mat, mattress, etc.), or can be distributed on nodes, among others. The location of a sensor largely depends on its operating principle and the specific application.

3.1.2. Results of the Analysis

Table 1 presents the results of the analysis of the items technique, sensor, parameter, and location and size for the studies in the wearable category, while Table 2 shows the results for the environmental papers.
In relation to the sensing techniques and sensors, Figure 6 and Figure 7 show the main results for the wearable and environmental categories, respectively. Most authors chose to detect chest wall movements (60%). For the environmental category, modulation of cardiac activity was also very common [5,7,50,52,54,163,164,168,174,175,177,178,189,190,196,197,198,199,205,206,207,213,214,219,221,223,224,225,226,229,232,233]. Meanwhile, air temperature and air humidity were the second [2,64,91,104,107,116,120,128,133,145,153,154,155,156,162] and third [66,74,75,80,82,89,97,101,137,138] most widely implemented techniques in the wearable category, although far behind. In this category, fiber optic sensors were used in almost 19% of the studies, resistive sensors in 15%, accelerometers in more than 11%, and capacitive sensors in more than 9%. Great variability in sensors can be found in the studies of this category, as there is no predominant type. This contrasts with the environmental category, since radars are used in more than 33% of the studies, making them the leading technology, followed by cameras (18%) and fiber optic sensors (14%). There are types of sensors, such as magnetometers, gyroscopes, microphones, optical sensors, inductive sensors, or thermistors, whose use is very limited in both categories [62,73,76,77,81,91,95,106,107,115,119,120,131,135,156,160,162,185,193,217].
Regarding breathing parameters (Figure 8), RR was obtained in 60% of the wearable studies and in 79% of the environmental studies. It was the most widely used parameter by far. Other metrics based on the analysis of the magnitude versus time curve, such as breathing period or expiratory/inspiratory times, were barely used (2% in the wearable category) [94,103]. The representation of the volume versus time curve or the use of volumetric parameters was not common. They appeared in 10% of the studies of the wearable category [2,17,49,61,67,111,113,116,122,127,147] and in 5% of the studies of the environmental group [48,51,52,215]. Among the possible volume metrics, tidal volume was the most common in the wearable category [2,17,49,61,111,113,116,122,127], while it was found in one study of the environmental category [52]. The rest of the metrics (MV, vital capacity, peak inspiratory flow, peak expiratory flow, and compartmental volume) were used in isolated cases. A considerable number of studies detected respiratory patterns in both wearable [17,143,152,159] and environmental categories [10,19,50,51,52,53,54,180,182,194,218]. The most common approach was to detect abnormal breathing patterns to identify respiratory disorders, such as apnea. This was especially common in environmental systems.
Regarding sensor location, most wearable studies placed the sensors on the chest or abdomen (Figure 9). This was by far the most common trend. It was also common for sensors to be embedded in shirts at chest or abdomen level [21,49,59,65,69,84,85,94,108,113,123,142,143,151,235]. This was the location selected by 15% of the studies. The nose and mouth were also widely used locations to place the sensors. As a particular case of sensors placed in the nose or mouth, several researchers integrated them into a mask [66,75,80,82,92,101,107,137,156]. This contrasts with locations like fingers, waist, arms, or wrists, whose use was marginal [93,115,117,118,126,139,157].
Figure 10 shows the locations adopted in the environmental studies. On the one hand, the most common approach was to place the sensor at a fixed distance from the subject; 52% of the studies used this setup. On the other hand, Figure 10 shows that placing the sensors as nodes without precise control of the distance between the sensor and the subject was adopted by 6% of the studies. Meanwhile, 29% of the studies integrated the sensors into mats or pillows [9,19,164,165,166,169,170,173,179,182,183,186,194,201,202,203,210,211,212,217,218,220,227,230,231,236], mainly to measure breathing parameters during rest activities. The rest of the environmental locations shown in Table 2 were only used in isolated cases.

3.2. Data Transmission and Power Consumption

3.2.1. Items of Analysis

This category includes the following items of analysis: general system setup, communication protocol, processing station, and energy autonomy and power consumption.

General System Setup

Different configurations can be found in systems for respiratory monitoring depending on the data transmission architecture. Systems can be roughly divided into two categories (Figure 11): (A) those that perform data processing on a centralized processing platform and (B) those that perform data processing near the remote sensing unit.
  • Systems that perform centralized processing: Data processing is done in a centralized system that does not need to be close to the subject being monitored. The magnitude values registered by the sensors are acquired and conditioned [24] and then transmitted to a centralized processing unit. Three different approaches can be found depending on the specific point where the acquisition & conditioning module and transmission module are placed:
    The acquisition & conditioning and transmission modules are in the same package as the sensing unit (cases 1.x of Figure 11A, $x \in [1..2]$).
    The acquisition & conditioning module is in the same package as the sensing unit, but the transmission module is placed externally (cases 2.x of Figure 11A, $x \in [1..2]$).
    Both the acquisition & conditioning and transmission modules are not included in the same package as the sensing unit (cases 3.x of Figure 11A, $x \in [1..2]$).
For all three approaches, data visualization can be done in two different ways: next to the unit that processes the registered signals (cases 1.1, 2.1, and 3.1 of Figure 11A) or at a different point (cases 1.2, 2.2, and 3.2 of Figure 11A).
  • Systems that perform remote processing: Processing of breathing signals to determine the respiratory parameters of interest is performed near the subject whose breathing is being monitored. Three different setups are possible depending on whether the acquisition & conditioning module and the processing module are included in the same package as the sensing unit:
    The acquisition & conditioning circuits, the microcontroller for the processing and the data transmission module are placed in the same package as the sensing unit (cases 4.x of Figure 11B, $x \in [1..2]$).
    The acquisition & conditioning circuits are placed in the same package as the sensing unit. However, the microcontroller in charge of the processing and the data transmission module are placed in an external package, which is not compactly integrated with the sensing module (cases 5.x of Figure 11B, $x \in [1..2]$).
    The acquisition & conditioning circuits, the microprocessor and the data transmission module are placed in a different package than the sensing unit (cases 6.x of Figure 11B, $x \in [1..2]$).
Regarding data visualization, it can be done in two different ways: at the remote unit itself, without the need for data transmission (in this case, the data transmission module is not included) (cases 4.1, 5.1, and 6.1 of Figure 11B) or in a central unit (cases 4.2, 5.2, and 6.2 of Figure 11B). A minimal sketch contrasting the data flows of the two setups is shown below.
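The following Python sketch (hypothetical interfaces; FakeSensor, FakeRadio, and estimate_rr are placeholders, not components of any reviewed system) illustrates the essential difference between the two setups: in centralized processing the sensing unit transmits raw samples, while in remote processing an embedded unit computes the breathing parameter locally and transmits only the result.
```python
# Illustrative sketch of the two setups in Figure 11 (hypothetical interfaces).
from collections import deque

class FakeSensor:
    """Stand-in for the sensing + acquisition & conditioning stage."""
    def __init__(self):
        self.t = 0
    def read(self):
        self.t += 1
        return (self.t % 200) / 200.0     # dummy conditioned breathing sample

class FakeRadio:
    """Stand-in for the transmission module (Bluetooth, Zigbee, etc.)."""
    def send(self, payload):
        print("tx:", payload)

def estimate_rr(samples, fs=50):
    """Placeholder RR estimate: count rising crossings of the mean level."""
    mean = sum(samples) / len(samples)
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < mean <= b)
    return 60.0 * crossings / (len(samples) / fs)   # breaths per minute

def centralized_step(sensor, radio):
    # Figure 11A: the sensing unit only acquires and transmits raw samples;
    # RR is computed later at the central processing station.
    radio.send(sensor.read())

def remote_step(sensor, radio, buffer, window=250):
    # Figure 11B: a microcontroller near the sensor computes RR locally
    # and transmits only the compact result.
    buffer.append(sensor.read())
    if len(buffer) == window:
        radio.send(estimate_rr(list(buffer)))
        buffer.clear()

sensor, radio, buf = FakeSensor(), FakeRadio(), deque()
for _ in range(500):
    remote_step(sensor, radio, buf)       # prints one RR value per 250 samples
```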

Communication Protocol

Communication between the different modules of the system can be classified according to whether it is wired or wireless:
  • Wired transmission: All system elements (sensing, acquisition, conditioning, transmission, processing, and visualization) are physically connected. The USB (universal serial bus) protocol is the most common way of transmitting the acquired respiratory signals.
  • Wireless transmission: Subjects wear the sensing system without cable connections to other elements of the system. The transmission and reception of measurements is carried out through a wireless transmission technology. Therefore, the usability of the system increases [55]. Different transmission technologies can be found in existing studies [237]:
    Bluetooth: It is a standard and communication protocol for personal area networks. It is suitable for applications that require continuous data transmission with a medium data transmission rate (up to 1 Mbps). It uses a radio communication system, which means that the transmitting and receiving devices do not need to be in line of sight. It operates in the 2.4–2.485 GHz band with a low transmission distance (1 to 100 m, typically). There are five Bluetooth classes (1, 1.5, 2, 3, and 4). Most Bluetooth-based respiration monitoring systems use class 2 or higher. This means that the transmission distance is short (less than 10 m, in general), but the power consumption is also moderate [237].
    Wi-Fi: This technology is generally used for local area networks instead of personal area networks, like Bluetooth. It has much higher data transmission rates, and power consumption is also higher. At a typical 2.4 GHz operating frequency, it can consume a maximum of 100 mW. Wi-Fi operates in the 2.4 GHz and 5 GHz bands. In general, the transmission range is between 50 m and 100 m, although it can be greatly extended under some conditions. This technology is suitable for applications where constant high-speed data transmission is required, the transmission distance is relatively large, and power consumption is not an issue [238].
    GSM/GPRS: Global System for Mobile Communications (GSM) is a standard for mobile communication that belongs to the second generation (2G) of digital cellular networks. It requires base stations to which the mobile devices connect. The coverage range of base stations varies from a few meters to dozens of kilometers. Within this 2G technology, it is also possible to find the General Packet Radio Service (GPRS), which is data-oriented. The transmission rate of GPRS is low (around 120 kbps, although this rate is usually lower in real conditions) with a limitation of 2 W of transmission power. The frequency band of this technology is in the range of 850–1900 MHz [239].
    Zigbee: It is a specification of several high-level communication protocols. Zigbee is used for the creation of personal area networks that do not need high data transmission rates. Zigbee can operate in the industrial, scientific, and medical (ISM) radio bands, which may vary among countries. This is the reason why it generally works in the 2.4 GHz band, which is available worldwide. If the system operates in the 2.4 GHz band, its data transmission rate is 250 kbps. Devices using this technology are generally inexpensive since the required microprocessor is simple due to the low transmission rate of Zigbee. Power consumption is low since nodes can remain asleep until some information is received. It is useful for applications that do not require constant transmission. The range of transmission distance is similar to that of Bluetooth technology [237].
    Radio frequency: These modules are suitable for applications that do not need a high data transmission speed. Radio frequency modules work in the Ultra High Frequency band (433 MHz) and require a receiver-transmitter pair. They are low power and cheap, with a small module size. The communication range is from 20 to 200 m. This range depends on the input voltage of the module: at higher voltages, a greater communication distance is reached. The working voltage for this technology ranges from 3.5 to 12 V. Transmission is performed through amplitude modulation. Radio frequency requires both receivers and transmitters to incorporate a microcontroller module. Typical power consumption is up to 10 mW.
Table 3 shows a schematic comparison of some key properties of the main wireless transmission technologies used in respiration studies. A small illustrative sketch of how such properties can guide the choice of technology is shown below.
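As a complement to Table 3, the following Python sketch encodes the approximate figures quoted in this section and filters them against simple application requirements. It is illustrative only; the Wi-Fi rate is a typical 802.11g-class figure and the 433 MHz data rate is an assumption, neither taken from the text above.
```python
# Approximate figures quoted in this section (values marked "assumed" are not
# taken from the text); a naive filter by application requirements.
WIRELESS_TECH = {
    # name:          (data rate [kbps], typical range [m], band)
    "Bluetooth":      (1000,   10,    "2.4-2.485 GHz"),
    "Wi-Fi":          (54000,  100,   "2.4 / 5 GHz"),      # 802.11g-class rate (assumed)
    "GSM/GPRS":       (120,    10000, "850-1900 MHz"),
    "Zigbee":         (250,    10,    "2.4 GHz"),
    "RF (433 MHz)":   (10,     200,   "433 MHz"),          # data rate assumed
}

def candidate_links(min_rate_kbps, min_range_m):
    """Return the technologies that meet a minimum data rate and range."""
    return [name for name, (rate, rng, _band) in WIRELESS_TECH.items()
            if rate >= min_rate_kbps and rng >= min_range_m]

# Example: streaming a raw breathing signal (~50 kbps) across a room (~10 m).
print(candidate_links(min_rate_kbps=50, min_range_m=10))
# -> ['Bluetooth', 'Wi-Fi', 'GSM/GPRS', 'Zigbee']
```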

Processing Station

Another item of analysis is the platform on which the recorded signals are processed to obtain respiratory information. Several options exist in the state of the art (Figure 12):
  • PC (personal computer): The respiration sensing system is connected or linked to a local PC that performs the processing of the registered breathing signals.
  • Smartphone/Tablet: The sensing system communicates wirelessly with a smartphone application that runs the processing algorithm ubiquitously.
  • Cloud: Breathing signals are sent wirelessly to a remote server, which performs cloud computing.
  • Embedded hardware: Processing is performed directly on embedded systems, which are located in or near the sensing unit package.

Energy Autonomy and Power Consumption

Regarding the power supply, systems can be categorized according to whether (1) they harvest part of the energy required for system operation, (2) they use rechargeable batteries, or (3) they are directly connected to a power source through a cable. This section analyses the first two categories in more detail since systems connected to a power source are of less interest as they have unlimited power availability.
(1) Energy Harvesters
Few of the studies found in the systematic searches conducted for this review harvested energy [77,84,104]. However, some energy harvesting techniques have been reported experimentally in other wearable systems [240,241,242,243,244,245,246,247,248,249]. This section presents a description of these techniques and how they were implemented in the respiratory sensing systems. They were based on magnetic induction, the piezoelectric effect, triboelectric power generation, the pyroelectric effect, the thermoelectric effect, electrostatic power generation, and solar cells.
  • Magnetic induction generator: A small electric generator can be used to transform mechanical energy into electrical energy according to Faraday’s law. An electric current is induced in the generator coils by a changing magnetic field produced by the movement of the rotor due to the mechanical energy applied to it during breathing. The amount of generated voltage can be calculated according to Equation (1) [135].
$V = N \times K_1 K_2 \times \dfrac{dC_{Chest}}{dt}$,  (1)
where N is the number of turns of the coil, $C_{Chest}$ is the circumference change of the chest, $K_2$ is the proportionality constant between $C_{Chest}$ and the angular displacement, and $K_1$ is the proportionality constant between the magnetic flux and the rate of change of the angular displacement. The prototype presented by Padasdao et al. [135] attached the motor to a plastic housing with an armature fixed to the rotor gears (or shaft) (Figure 13A). A non-elastic wire was wrapped around the chest. One side of the wire was fixed to the plastic housing and the other end was attached to the armature. A piece of hard felt was fixed to the housing to help stabilize the device against the body. A spring was attached between the armature and the plastic housing to provide a restoring force to the armature. During inspiration, the non-elastic wire pulled the armature, leading to rotor rotation. During expiration, the spring pulled the armature back, leading to rotor rotation in the opposite direction. In this way, energy was harvested. In the work of Padasdao et al. [135], the electrical signal generated was used to obtain the RR instead of supplying power to the system. However, this is an example of how respiratory movements can be converted into electrical energy.
Other respiration-based energy harvesting systems can be found in the literature. The works of Delnavaz et al. [240] and Goreke et al. [241] used air flow to produce power with magnetic induction generators. On the one hand, the prototype of Delnavaz et al. [240] was made up of two fixed magnets located at the ends of a tube (opposite poles facing each other) with a free magnet inside the tube (Figure 13B). The free magnet was suspended due to the repulsive forces with the fixed magnets. A coil was wrapped around the outside of the tube. When a subject breathed into the tube, the free magnet moved around its static position. In this way, a voltage was induced in the coil since it was crossed by a variable magnetic field, which caused the magnetic induction. Experimental results showed that more than 3 µW were generated. The induced voltage in a closed circuit (U) was proportional to the magnetic flux gradient (dϕ/dx) and the velocity of the magnet (dx/dt), according to Equation (2).
$U = N \dfrac{d\phi}{dx} \dfrac{dx}{dt}$.  (2)
On the other hand, a microelectromechanical-scale turbine was presented by Goreke et al. [241]. The turbine had 12 blades on its outer contour and ball bearings around the center embedded in grooves (Figure 13C). A permanent magnet was integrated in the area between the ball bearings and the turbine blades. The entire prototype was encapsulated in a package with rectangular openings for the airflow. The prototype presented was under development and not fully implemented. The operating principle of the system could be as follows: as air flows through the rectangular openings, the blades rotate and move the turbine in such a way that its coils see a variable magnetic field generated by the fixed magnet. This generates power through magnetic induction. The maximum power generated was 370 mW.
  • Piezoelectric energy harvesting: These harvesters generate a voltage when compressed or stretched [242]. In the work of Shahhaidar et al. [242], they were embedded in a belt across the chest. Due to the low capacitance of the piezoresistive materials, the overall harvested energy was low. Therefore, this piezoresistive configuration was unable to provide the necessary energy to power the entire system. The main drawback of adopting this energy harvesting technique for respiration sensors is that the required vibration frequency is much higher than the respiration frequency. In this sense, the paper of Li et al. [243] presented a prototype based on the interaction between a piezoelectric cantilever and a magnet placed on a substrate (Figure 14B). The vertical vibration of the cantilever caused by the presence of the magnet allowed a constant amount of energy to be generated. The substrate with the magnet was attached to the subject's body (a limb joint). The movements of the subject led to substrate stretching and contraction, which caused the vibration of the piezoelectric cantilever. The energy generated was stable for different types of movements, since it was tested on different parts of the body. It has potential to be used with breathing movements, as the energy harvester worked correctly for subject movements in the frequency range of 0.5–5.0 Hz. Meanwhile, Wang et al. [244] presented a piezoelectric rubber band that could be mounted on an elastic waistband to generate electricity from the circumferential stretch caused by breathing. The paper showed a structure made up of top and bottom electrodes with two solid layers and one void layer in between (Figure 14A). They were made of composite polymeric and metallic microstructures with embedded bipolar charges. Finally, the work of Sun et al. [245] presented an energy harvester based on the piezoelectric effect that used the respiration airflow. They used piezoelectric polyvinylidene fluoride (PVDF) microbelts that oscillated under low-speed airflow to generate electrical power on the order of magnitude of µW (Figure 14C).
  • Triboelectric energy harvesting: These harvesters generate charges by rubbing two different materials (one is an electron donor and the other is an electron acceptor), resulting in the creation of a potential in the contact region [250]. One possible setup is to attach the tribo-pair to a belt to detect variations in abdominal circumference. Triboelectric generators were used in breathing studies as a means of measuring RR, but not as energy harvesters, since the power generated is too low for the power requirements of a complete respiration monitoring system that also includes a data transmission module. In the work of Zhang et al. [246], two belts (one extensible and one inextensible) were attached to each side of two materials (Figure 15A). A mechanical experiment was performed to obtain the peak voltage for different sliding amplitudes in the range of 2.5 to 30 mm, which represents the typical displacement of a breathing depth. The result of this experiment was Equation (3).
    $V_{peak} = 0.01383 \, X_{max} + 0.0092$,  (3)
    where $V_{peak}$ is the peak value of the voltage, and $X_{max}$ is the maximum sliding displacement of the tribo-pair. A similar approach was proposed by Zhang et al. [77]. They presented a tribo-pair with both sides of one material fixed to two “Z-shaped” connectors that were attached to a belt with an inextensible part and an extensible part (Figure 15B). The abdominal contraction and expansion associated with respiration caused deformation of the two “Z-shaped” connectors. This deformation led to a process of contact and separation of the tribo-pair, generating an electrical signal.
A self-powered respiratory sensor and energy harvester was also shown in the work of Vasandani et al. [247]. The working principle was very similar to the work of Zhang et al. [77] but, in this case, a prototype was built with movable and fixed supports (Figure 15C). The two materials were fixed to these two supports. The movements associated with respiration caused an angular displacement of the movable support by means of a belt and a lever mechanism, harvesting energy. The voltage obtained between the electrodes was zero in case of full contact and rose to 9.34 V for a 60° separation. The maximum area power density was 7.584 mW/m2.
  • Electrostatic energy harvesting: It is based on the change of parameters of a capacitive device, which is called an electrostatic energy harvester. Breathing may cause separation of the capacitor plates or modification of the plate area, among other effects [251]. This energy harvesting technique is not common in respiratory systems. The prototype of Seo et al. [248] showed a capacitor made of two metal electrodes and an insulating layer in between. The capacitance of the prototype varied with respiration. This was because the area of the top electrode was variable depending on the presence of a wet surface associated with respiration (Figure 16). Humid exhaled breath was cooled by the ambient air on the top surface of the insulated material. Thus, the water molecules condensed, acting as part of the upper electrode and changing the capacitance of the prototype. This condensation provided a thick layer that became part of the electrode. Then, the water naturally evaporated due to its vapor pressure and the device returned to its original state. The variable capacitance allowed the charges to circulate, harvesting electrostatic energy. The prototype presented in Reference [248] reported a generated power of 2 μW/cm2.
  • Pyroelectric energy harvesting: These harvesters are based on the reorientation of dipoles owing to temperature fluctuations [252]. Therefore, they need a temperature variation in time. Xue et al. [249] presented a prototype made of a pyroelectric component (metal coated PVDF film) covered with electrodes and mounted on the respirator of a mask at the location where air flows during breathing (Figure 17). The size of the prototype was 3.5 × 3.5 cm. The estimated current generated can be derived from the pyroelectric effect equation:
$I = A \, p \, \dfrac{dT}{dt}$,  (4)
where I is the generated current, A is the sensing area, p is the pyroelectric coefficient (approximately 27 μC/(m²·K)), and dT/dt is the variation in temperature over time. The temperature variation is due to the difference between human body temperature and ambient temperature. It is also influenced by the transformation of water vapor into exhaled gas. The pyroelectric generator is heated by expiration and cooled by inspiration. Therefore, electricity is harvested from a change in temperature over time. Peak power reached up to 8.31 μW with an external load of 50 MΩ (a worked numerical sketch of Equations (4)–(6) follows this list).
  • Thermoelectric energy harvesting: These harvesters are based on the Seebeck effect. They convert a temperature gradient into electric power. Therefore, they need a temperature variation in space [253]. A thermoelectric module is an array of p-type and n-type semiconductors. According to Nozariasbmarz et al. [252], the conversion efficiency of a thermoelectric generator can be calculated as:
$\eta = \dfrac{T_H - T_C}{T_H} \cdot \dfrac{\sqrt{1 + ZT} - 1}{\sqrt{1 + ZT} + T_C / T_H}$,  (5)
where $T_C$ and $T_H$ are the temperatures of the cold and hot sides, respectively, and ZT is the dimensionless figure of merit of the thermoelectric module. For the thermoelectric material, ZT can be calculated according to:
$ZT = \dfrac{s^2 \sigma}{k} T$,  (6)
where s is the Seebeck coefficient, σ is the electrical conductivity, k is the thermal conductivity, and T is the absolute temperature.
Thermoelectric energy harvesters are not usually considered to power respiratory sensors. In the review of Nozariasbmarz et al. [252], it was reported that several generators used the heat from the wrist for thermoelectric power generation.
  • Solar cells: This technology has also been used to power respiratory sensing systems. The energy produced by the solar cells is stored in a battery through a charge regulator that also controls the discharge of the battery to power the sensing system. The charge regulator ensures that both the battery and the sensing system are supplied with adequate voltage and current levels. Figure 18 shows an example of a sensing system powered by solar cells. Solar-powered systems have not been extensively explored in existing studies. As an exception, the work of Gorgutsa et al. [84] presented a system that reported the Received Signal Strength Indicator of a standard Bluetooth link using a hybrid-spiral antenna made of multi-material fibers. The system was integrated into a cotton shirt. They used a low-power Bluetooth module that was powered by a rechargeable battery and a solar cell on a custom printed circuit board.
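To make the magnitudes involved more tangible, the following Python sketch works through Equations (4)–(6) numerically. The sensing area and pyroelectric coefficient are the values quoted above; the temperature-change rate, skin and ambient temperatures, and thermoelectric material properties are assumptions chosen only for illustration.
```python
# Worked illustration of Eqs. (4)-(6); values marked "assumed" are illustrative
# and not taken from the reviewed studies.

# Pyroelectric current, Eq. (4): I = A * p * dT/dt
A = 0.035 * 0.035        # sensing area of the 3.5 cm x 3.5 cm film (m^2)
p = 27e-6                # pyroelectric coefficient quoted above (C / (m^2 K))
dT_dt = 2.0              # assumed breathing-induced temperature rate (K/s)
I = A * p * dT_dt
print(f"Pyroelectric current ~ {I * 1e9:.0f} nA")          # ~66 nA

# Thermoelectric figure of merit and conversion efficiency, Eqs. (5)-(6)
T_hot, T_cold = 307.0, 297.0     # assumed skin and ambient temperatures (K)
s, sigma, k = 200e-6, 1e5, 1.5   # assumed Seebeck coeff. (V/K), electrical (S/m)
                                 # and thermal (W/(m K)) conductivities
T_mean = (T_hot + T_cold) / 2
ZT = s ** 2 * sigma / k * T_mean                            # Eq. (6)
eta = ((T_hot - T_cold) / T_hot) * (((1 + ZT) ** 0.5 - 1)
       / ((1 + ZT) ** 0.5 + T_cold / T_hot))                # Eq. (5)
print(f"ZT ~ {ZT:.2f}, conversion efficiency ~ {eta * 100:.2f} %")  # ~0.81, ~0.48 %
```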
(2) Battery-Powered Systems
Battery-powered systems require, at least, a battery and a charger. These two elements should be considered in the sizing of the system. Batteries are usually one of the most limiting components in terms of space (Figure 19).
Power autonomy determines the viability of a system. The autonomy of a battery-powered respiration sensing system is obtained by calculating or measuring its battery life, which is defined as the time that a system can operate with a fully charged battery. Two different factors must be determined when performing tests to measure battery life: system operating mode and the way of measuring battery life.
Regarding system operating mode, there are essentially two different approaches:
  • Continuous operation: Battery life is measured with the breathing device operating continuously.
  • Continuous operation + inactivity periods: A typical daily use of the system is considered, which may include certain inactivity periods in which the device is in “idle” mode or even off (not used).
Regarding the way of measuring battery life, it should be noted that it depends on the type of battery used and its parameters. The main parameter of a battery is its capacity, which determines the nominal amount of charge that can be stored. It is usually expressed in mAh. As a general rule, the higher the capacity, the longer the battery life. However, capacity depends on several external factors, such as discharge rate, operating temperature, aging, and state of charge (SOC). When a battery is discharged at a low rate (low current), the energy is delivered more efficiently. Higher discharge rates (higher currents demanded by the breathing system) lead to a reduction in effective battery capacity [255]. Temperature also affects battery capacity in such a way that low temperatures decrease capacity. Aging may also decrease the capacity [256]. If a battery is not full, the SOC must also be considered. It represents the percentage of capacity that is currently available with respect to the rated capacity.
The most common and sensible approach is that tests are conducted with a new fully-charged battery that operates in the nominal temperature range and discharges within the nominal current range. Under these conditions, the nominal capacity of the battery can be considered its true capacity. Otherwise, different reduction factors (<1) should be applied to rated capacity. Therefore, different ways to measure battery life experimentally can be found in existing studies:
  • Measure of battery life directly: A battery can be considered discharged when the voltage drops below a certain value (3.6 V [257] for common small batteries). Therefore, by taking a full battery and monitoring the output voltage, it is possible to obtain battery life with expression (7).
$$\mathrm{Battery\ Life\ (h)} = \mathrm{DischargeTime} - \mathrm{InitialTime}. \qquad (7)$$
  • Measure of current consumption: Current consumption of the respiratory sensing system can be measured experimentally or estimated from the datasheets of the system components. The formula for calculating battery life is different for each operation mode:
    Continuous operation: The system is assumed to operate continuously consuming an average current value.
$$\mathrm{Battery\ Life\ (h)} = \frac{\mathrm{Capacity\ (mAh)} \cdot SOC_{factor} \cdot C_{factor} \cdot T_{factor} \cdot Age_{factor}}{OC\ \mathrm{(mA)}}, \qquad (8)$$
    where $SOC_{factor}, C_{factor}, T_{factor}, Age_{factor} \in [0, 1]$ are reduction factors of the capacity to be applied in case tests are not performed under the optimal conditions mentioned above, and $OC$ is the average value of the operating current.
    Continuous operation + inactivity periods (rough estimate): Current consumption in the operation and inactivity periods is assumed to be “constant”.
$$\mathrm{Battery\ Life\ (h)} = \frac{\mathrm{Capacity\ (mAh)} \cdot SOC_{factor} \cdot C_{factor} \cdot T_{factor} \cdot Age_{factor}}{OC\ \mathrm{(mA)} \cdot \dfrac{n_{min_{OC}}}{n_{min_{total}}} + IC\ \mathrm{(mA)} \cdot \dfrac{n_{min_{IC}}}{n_{min_{total}}}}, \qquad (9)$$
    where $IC\ \mathrm{(mA)}$ is the average value of the current consumed by the system in idle or non-active modes, $n_{min_{OC}}$ is the number of minutes that the breathing system is in operation mode during a certain period of time (for instance, one day), $n_{min_{IC}}$ is the number of minutes that the breathing system is in idle or non-active modes for the same time period, and $n_{min_{total}} = n_{min_{OC}} + n_{min_{IC}}$.
    Continuous operation + inactivity periods (fine estimate): The calculation of battery life is performed using a more accurate model. Different values of current consumption are considered in operation and inactivity modes. In this calculation, the system can adopt not only two states, but n states. Let $c = [c_1, c_2, \ldots, c_n]$ be the average current values of each of the n different states of the respiratory system considered, and $n_{min} = [n_{min_1}, n_{min_2}, \ldots, n_{min_n}]$ the number of minutes in a given period of time (for instance, one day) that the breathing system remains in each of the n possible states. The calculation can be done with Equation (10).
$$\mathrm{Battery\ Life\ (h)} = \frac{\mathrm{Capacity\ (mAh)} \cdot SOC_{factor} \cdot C_{factor} \cdot T_{factor} \cdot Age_{factor}}{\sum_{i=1}^{n} c_i \cdot \dfrac{n_{min_i}}{\sum_{j=1}^{n} n_{min_j}}}. \qquad (10)$$
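To make Equations (8)–(10) concrete, the following Python sketch estimates battery life for a hypothetical duty cycle; the capacity, reduction factors, currents, and state durations are illustrative assumptions, not values taken from any of the reviewed systems.

```python
# Hedged sketch: battery-life estimation following Equations (8)-(10).
# All numbers below are illustrative assumptions, not values from the reviewed studies.

def battery_life_hours(capacity_mAh, currents_mA, minutes_per_state,
                       soc=1.0, c_factor=1.0, t_factor=1.0, age_factor=1.0):
    """Estimate battery life (h) for a system that cycles through n states.

    currents_mA[i]       -- average current drawn in state i (mA)
    minutes_per_state[i] -- minutes per day spent in state i
    The reduction factors (all in [0, 1]) model non-ideal state of charge,
    discharge rate, temperature, and aging, as in Equation (10).
    """
    effective_capacity = capacity_mAh * soc * c_factor * t_factor * age_factor
    total_minutes = sum(minutes_per_state)
    # Weighted average current over the whole duty cycle (denominator of Equation (10)).
    avg_current = sum(c * m for c, m in zip(currents_mA, minutes_per_state)) / total_minutes
    return effective_capacity / avg_current


# Continuous operation (Equation (8)): a single state drawing 25 mA on average.
print(battery_life_hours(500, [25.0], [1440]))          # 20 h

# Operation + idle periods (Equations (9)-(10)): 8 h measuring at 25 mA,
# 16 h idle at 0.5 mA, with a derated battery (aging factor 0.9).
print(battery_life_hours(500, [25.0, 0.5], [480, 960], age_factor=0.9))
```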

3.2.2. Results of the Analysis

The previously described items were analyzed for the studies found as a result of the systematic review. These items were the use of wired or wireless data transmission, the performance of centralized or remote processing, the specific station used to carry out processing, and the energy autonomy of the prototypes. They were studied for the wearable category, as these elements are limiting factors in wearable sensing systems. However, they are less crucial in environmental systems, since most of them use wired communications and are connected to a power source.
Table 4 shows a comparison of the approaches found in the state of the art for the wearable group. The first two columns of Table 4 show the specific studies that used wired and wireless data transmission, and Figure 20 presents the percentage distribution of the type of transmission. The use of wired and wireless technologies was similar.
Figure 21 shows the distribution of wireless technologies used for data transmission. Bluetooth was the preferred technology, as it is suitable for applications that send point-to-point information over relatively short distances and require high-speed data transmission. Its main drawback is power consumption, which could be a limitation for continuous monitoring, as existing studies state that the battery life is no more than a few hours. However, in view of Table 4, this method seems suitable for many applications. Wi-Fi, radio frequency, or Zigbee were used in a limited number of studies [73,96,144,156,159]. Regarding wired transmission, the third column of Table 4 shows that USB communication was the preferred option [74,76,86,87,88,109,114,118,133,141,158].
Once measurements are transmitted, a main station processes them. Figure 22 shows the percentage distribution of the processing stations used in the studies selected in the systematic searches. PCs were the preferred processing stations, showing that most authors performed centralized processing. The use of smartphones, tablets, or cloud computing was less common, although these options appeared in 30% of the studies [2,59,65,66,67,74,76,77,84,91,98,99,101,102,107,109,116,119,122,130,132,134,143,144,156].
Regarding the energy autonomy of the systems, the use of energy harvesters was residual [84,104], which may be due to the fact that the studies presented complete systems that included data transmission and processing modules. These modules are energy demanding, and therefore energy harvesters can only serve as a complement, not as the primary power source. In this regard, many studies [2,3,17,62,65,73,84,86,87,89,91,98,99,101,114,115,116,119,131,144,145,147,162] used rechargeable batteries to power the systems. The most commonly declared battery lives were on the order of hours (Figure 23) [2,17,62,69,101,115,119], although some studies did not even provide data on this point.
A set of studies focused on minimizing power consumption by including low-power data transmission technologies. In this regard, Milici et al. used wireless transponders [91] to obtain an autonomy of more than one year, while Mahbub et al. [98] adopted Impulse Radio Ultra-Wideband (IR-UWB), which led to an autonomy of about 40 days. In general, battery life is highly dependent on the transmission technology. The works of Bhattacharya et al. [156], Puranik et al. [73], White et al. [96], Ciobotariu et al. [144], and Mitchell et al. [159] used wearable devices with Wi-Fi [73,96,144,156], Zigbee [159], or GSM/GPRS [144], with high variability in power consumption.

3.3. Validation Experiments

3.3.1. Items of Analysis

Different items were considered to analyze the validation experiments carried out in the studies:
  • Subjects: Almost all studies used volunteers to assess the respiration sensing systems. In this case, it is required to provide data, such as the number of subjects who participated in the tests and their main characteristics (age, weight, height, sex, and health status). As breathing studies generally involve humans, it is mandatory to have the approval of the competent ethical committee (following the Declaration of Helsinki [258]) to recruit the subjects to participate in the study, to inform them about the study, and to obtain their consent.
  • Activities/positions: This item refers to the specific activities or positions that volunteers who participate in the tests are asked to perform as part of the validation experiments. The most common positions adopted in existing studies are represented in Figure 24 with an example sensor.
  • Whether or not motion artifacts are included in the different activities.
  • Number and values of RRs or volume rates to be tested in the experiments.
  • Number of repetitions of the different test scenarios.
  • Duration: The designed tests (activities and positions, number of RRs or volume rates, and number of repetitions) determine the duration of the experiments.
  • Experiment design: This item refers to the strategies adopted to validate the breathing sensors. Three main methods have been found in the state of the art (Figure 25):
Artificial validation prototypes: Some studies used artificial prototypes that emulated human conditions rather than real volunteers. On the one hand, if the sensor were worn on the chest, diaphragm, or thorax, a mechanical structure that emulated human respiration could serve for validation. That was the approach adopted by Padasdao et al. [135]: a motor moved a mechanical chest to the rhythm and depth of human breathing (Figure 25A). Similarly, the work of Witt et al. [141] also used a mechanical chest driven by a stepper motor, setting the amplitude and frequency of the movements to simulate breathing activity. Another set of works [77,94,110,114,146] used machines or custom prototypes that applied traction and compression movements to simulate human respiration on strain sensors. On the other hand, if the system is to be worn in the nose or mouth, an artificial prototype can be built that emulates the airflow associated with respiration. For that, Agcayazi et al. [123] used a mannequin equipped with an inflatable cuff bladder that emulated breathing cycles, which is similar to the prototype of Koch et al. [90]. For humidity sensors, authors designed controlled humidity chambers using humidifiers and dry air compressors [74] or switches for controlling nitrogen flow and a motor to control the dispersion of water vapor [97]. Finally, other studies presented artificial validation prototypes adapted to the specific sensors used for respiration monitoring. Zito et al. [226] validated a radar sensor with a moving target that emulated the movements associated with breathing.
Validation using artificial prototypes has the advantage that different respiration or volume rates can be programmed precisely. These theoretical values can be compared with the measurements obtained with the sensor. Thus, no error can be attributed to the validation method. A typical validation workflow using this method is outlined in Figure 26. In this method, sensor measurements may be contained in matrix $A = (a_{ij}) \in \mathbb{R}^{k \times m}$, where k is the number of repetitions per parameter, and m is the number of different parameter values to evaluate. This measurement matrix A can be compared with the reference matrix $B = (b_{ij}) \in \mathbb{R}^{k \times m}$. Matrix B contains the reference values used to program the artificial validation prototype. Therefore, all the elements in a given column of B have the same value, since the jth reference parameter (column) remains the same for all repetitions ($i \in [1..k]$, rows).
Metronome as reference: When humans are involved in the validation experiments, one option is to use a metronome to set the rate of respiration that subjects must follow during the tests (Figure 25B). The advantage of this method over artificial prototypes is that the sensing system is tested with the target subjects and not with an emulation of a human chest or throat. However, its weak point is that subjects may not accurately follow the rate of the metronome. Therefore, part of the measurement error can be attributed to the test design itself rather than to the sensing system. A typical validation workflow using a metronome as a reference is summarized in Figure 27. The measurements recorded by the sensor under validation may be contained in matrix $A = (a_{defgh}) \in \mathbb{R}^{n \times k \times p \times l \times m}$, which is a five-dimensional matrix with the measured values for each subject d, repetition e, activity f, position g, and parameter value h. The reference with which A is compared is matrix $B = (b_{defgh}) \in \mathbb{R}^{n \times k \times p \times l \times m}$, which contains the reference breathing parameters set in the metronome for each subject d, repetition e, activity f, position g, and parameter value h. Therefore, B exclusively contains the values of vector $z = [z_1, z_2, \ldots, z_m]$, which are the possible settings for the metronome (Figure 27).
Validation against a reference device: The most complete way to validate a new sensor is to compare its performance with that of a reference sensor considered as a gold standard (Figure 25C). The reference sensor and the sensor under validation must be worn at the same time to obtain synchronized measurements. Having synchronized measurements allows the sensing capabilities of both sensors to be compared fairly. The sensor under test should provide measurements as close as possible to those of the reference sensor. It is important to note that the reference sensor also has a measurement error. Therefore, this error should be considered in the comparison, as it may influence the results. Respiratory values provided by the reference device may differ slightly from real values. This validation method faces several challenges. First, it is essential to synchronize both measuring instruments, and this synchronization can be difficult. Second, most commercial products do not provide information on how the final breathing parameter (RR or volume parameter) is obtained, so the comparison of measurements may not be obvious. In addition, most products do not allow selecting the refresh time window or do not even provide information about the length of this window, so it is not possible to know the set of measurements used to calculate the output respiration values. Figure 27 shows a typical block diagram of the validation method when using a reference device. The results of this validation are matrices $A = (a_{defgh}) \in \mathbb{R}^{n \times k \times p \times l \times m}$ and $C = (c_{defgh}) \in \mathbb{R}^{n \times k \times p \times l \times m}$, which contain the measured values for each subject d, repetition e, activity f, position g, and parameter h for the sensor under evaluation and for the reference sensor, respectively. A minimal sketch of the matrix-based comparison shared by these validation workflows is shown after this list.
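To illustrate that matrix-based comparison, the following Python sketch builds a reference matrix B for an artificial-prototype experiment and compares it with simulated sensor measurements A; the programmed rates, the number of repetitions, and the simulated noise are invented for illustration.

```python
# Hedged sketch of the matrix-based comparison in Figure 26 (artificial prototype):
# k repetitions x m programmed parameter values. All numbers are invented.
import numpy as np

k = 3                                             # repetitions per programmed value
rates_bpm = np.array([10.0, 15.0, 20.0, 25.0])    # rates programmed in the prototype (m = 4)

# Reference matrix B: every element of column j equals the jth programmed rate.
B = np.tile(rates_bpm, (k, 1))
# Simulated measurement matrix A (in a real experiment these come from the sensor).
A = B + np.random.normal(0.0, 0.5, size=B.shape)

# Per-rate mean absolute error, averaged over the k repetitions of each column.
per_rate_mae = np.mean(np.abs(A - B), axis=0)
for rate, err in zip(rates_bpm, per_rate_mae):
    print(f"{rate:4.1f} bpm -> MAE {err:.2f} bpm")
```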

3.3.2. Results of the Analysis

Table 5 presents the results of the analysis of different items of the validation experiments for both wearable and environmental systems. Large differences among studies were observed in all aspects of the experiments: protocol, number of subjects, positions, types of breathing, duration, and inclusion of motion artifacts.
In relation to the number of subjects involved in the tests, 71% of the studies that provided this information included 10 subjects or fewer. Only 13% of the studies included more than 20 subjects [7,10,19,64,81,132,135,147,173,175,211,232]. There was also a considerable number of studies (53) that did not provide this information; some of them did not use human subjects for sensor validation.
Regarding the duration of the experiments, most of the studies carried out short experiments of a few minutes. In fact, 58% of the studies performed tests of less than 5 minutes [69,70,81,86,87,88,96,100,102,104,111,125,136,161,163,165,171,172,173,174,193,195,196,200,212,215,221,228,229,232]. Most of the works that conducted longer tests included sleep studies [7,17,53,60,115,146,148,164,165,169,173,192,198,205,211,220,223]. Twenty-six studies reported that motion artifacts were considered during testing. They showed that the inclusion of motion artifacts in experiments greatly influenced sensor performance [2,9,17,53,61,62,66,67,68,81,108,109,117,119,131,132,135,147,157,178,187,190,196,205,210,221,225]. In relation to the activities or positions considered in the experiments, lying down and sitting were the most tested positions. Other positions or activities, like standing, walking, moving, or running, were used in a minority of studies [2,17,21,61,62,66,67,68,77,79,81,91,94,101,102,103,108,110,111,115,118,119,124,129,131,132,135,146,147,148,149,177,178,188,205,214,233,235]. Most of the studies that provided information on activities considered more than one position [2,6,9,17,21,52,53,61,62,66,67,68,77,79,85,86,87,88,94,102,108,110,115,118,119,124,129,131,132,133,135,146,147,148,157,164,165,169,171,177,178,187,196,198,205,210,211,213,214,220,221,223,225,233,235]. It was also common to test different values of the respiration parameter (for example, RRs from 10 to 22.5 bpm in the study of Vanegas et al. [254]).
In relation to the validation protocol, Figure 28 shows the distribution of the analyzed studies in the three categories introduced in Section 3.3.1: validation with an artificial prototype, metronome as reference, and validation against a reference device. A new category was created to cover studies that performed informal validation. It was called “human observation”, since an expert provided a value of the breathing parameter from direct observation of the signals recorded by the sensors. Figure 28 shows that validation using a reference device was the predominant approach (adopted by 67% of the studies that performed validation), followed by the use of an artificial validation prototype (10%) [69,74,77,90,93,97,110,114,116,119,123,135,141,146,150,226]. It is also worth noting that 53 studies presented the sensing systems without providing evidence of their validation.

3.4. Sensor Measurement Processing

3.4.1. Items of Analysis

This category includes the following items: performance evaluation, software used for the analysis, and processing algorithm. This section describes them in detail.

Performance Evaluation

The evaluation of sensor performance can be done using several figures of merit, such as absolute error, relative/percentage error, root mean square error, correlation factor, Bland-Altman plot, calculation of accuracies, or linear regression.
Absolute error (Δ): Difference between the value measured by the sensor under test (x) and the reference value (y). It is calculated according to Equation (11).
$$\Delta = x - y. \qquad (11)$$
It is more common to provide the mean absolute error (MAE), that is, the mean of the absolute values of all the errors:
$$\mathrm{MAE} = \frac{1}{n} \sum_{i=1}^{n} |x_i - y_i|, \qquad (12)$$
where n is the number of measurements obtained from the sensor under test, $x_i$ are those measurements, and $y_i$ are the corresponding reference values (the values programmed in the “artificial validation prototype” method or set in the “metronome as reference” method, or the measurements of the reference device in the “validation against a reference device” method).
Relative error (RE): Absolute error of the breathing sensor under test (Δ) divided by its reference (true) value (y). Thus, it provides an error value relative to the size of the breathing parameter being measured. It can be obtained according to Equation (13). The mean of the relative errors (MRE) can be obtained using Equation (14).
$$\mathrm{RE} = \frac{\Delta}{y}, \qquad (13)$$
$$\mathrm{MRE} = \frac{1}{n} \sum_{i=1}^{n} \frac{|x_i - y_i|}{y_i}, \qquad (14)$$
where n, xi, and yi are the same parameters as for the MAE.
If the relative error is expressed as a percentage, it is called the percentage error, although many authors also provide the relative error in percentage.
Root mean square error (RMSE): In respiration sensing studies, it is also used to compare the difference between the values measured by the sensor under analysis and the reference results. It is the square root of the mean of the squared differences and can be obtained according to Equation (15).
$$\mathrm{RMSE} = \sqrt{\frac{\sum_{i=1}^{n} (x_i - y_i)^2}{n}}, \qquad (15)$$
where n, xi, and yi are the same parameters as for the MAE.
Correlation factor: It provides a measure of the relationship between the measurements taken by the respiration sensor under test and the reference data. There are different ways to calculate this correlation factor. The Pearson correlation factor is one of the most widely used (Equation (16)).
$$\gamma_{xy} = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{\sqrt{n \sum x_i^2 - \left(\sum x_i\right)^2}\ \sqrt{n \sum y_i^2 - \left(\sum y_i\right)^2}}, \qquad (16)$$
where n, xi, and yi are the same parameters as for the MAE. A correlation factor of 1 means maximum agreement between measured and reference data (optimal case), while a factor of 0 means that there is no relationship between the datasets.
Bland-Altman analysis: It is a graphical method to compare the measurements from the breathing sensor under test with the reference breathing values. A scatter diagram is drawn with the horizontal axis representing the mean of the measured and reference values ($(x_i + y_i)/2$) and the vertical axis representing the difference between those values ($x_i - y_i$). In addition, a horizontal line is included in the plot with the mean value of all differences. Two more horizontal lines (one upper and one lower) are plotted representing the limits of agreement (±1.96 times the standard deviation of the differences). The Bland-Altman plot is useful to show relationships between the magnitude of the breathing parameter and the differences between measured values. It may also help to identify systematic errors in measurements or to detect outliers, among others. This method is especially suitable for the validation approach in which the sensor under evaluation is compared to a reference device.
Accuracy: It is the proportion of true results with respect to the total number of samples [259]. It can be used in studies of respiration sensors that identify breathing patterns within a given set of k possible patterns. It can also be applied to studies that determine the value of a breathing parameter within a discrete set of k possible values. Let $x = [x_1, x_2, \ldots, x_n]$ be the values of the n measurements taken by a respiration sensing system or the n labels of the breathing patterns recognized by the system. Suppose that, from the n different samples, m samples are correctly identified or measured, since they belong to the correct class of the k possible classes. Therefore, $(n - m)$ samples are not classified correctly. The accuracy of the breathing system can be obtained as:
$$\mathrm{Accuracy\ (\%)} = \frac{m}{n} \cdot 100. \qquad (17)$$
Linear regression: It models the relationship between the values measured by the respiration sensing system under test (dependent variable) and the reference measurements (independent variable) by fitting a linear equation. The equation to fit has the form $y = a + bx$, where y is the dependent variable, x is the independent variable, b is the slope of the line, and a is the intercept (the value of $y_i$ when $x_i = 0$). This linear fitting is performed using $x = [x_1, x_2, \ldots, x_n]$, which is the set of n reference values of the breathing parameter, and $y = [y_1, y_2, \ldots, y_n]$, which is the set of n values of the parameter measured by the sensing system under evaluation. In these conditions, the values of each $x_i$ and $y_i$ should be as close as possible $\forall i \in [1..n]$. This means that, if the match between the reference values and the measured values were perfect, the linear model would be a line with an intercept of 0 and a slope of 1.
In addition, the coefficient of determination $r^2$ can also be calculated to obtain the percentage of the variation in the values measured by the sensing system that is predictable from the variation of the reference values, according to Equation (18).
$$r^2 = 1 - \frac{SE_{res}}{SE_{\bar{y}}}, \qquad (18)$$
where $SE_{\bar{y}} = \sum_{i=1}^{n} (y_i - \bar{y})^2$ is the sum of the squares of the difference of each measured value $y_i$ with respect to the mean value of all measurements $\bar{y}$, and $SE_{res} = \sum_{i=1}^{n} \left(y_i - (a + b x_i)\right)^2$ is the sum of the squares of the difference of each measured value $y_i$ with respect to the value predicted by the model. If $SE_{res}$ is small, it means that the line is a good fit, and $r^2$ will be close to 1. Otherwise, if $SE_{res}$ is large, it means that the difference between the measured values $y_i$ and the line is large, and $r^2$ will be close to 0 (bad linear fit). If the breathing system measured exactly the same values as the reference system, $SE_{res}$ would be zero and $r^2$ would be 1, which would be the ideal case.
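As an illustration of the figures of merit described above, the following Python sketch computes the MAE, MRE, RMSE, Pearson correlation, Bland-Altman statistics, and linear regression parameters for a pair of hypothetical measurement series; the data are invented and do not come from any of the reviewed studies.

```python
# Hedged sketch: figures of merit from Equations (12)-(18) plus the Bland-Altman
# statistics, for invented data. x: sensor under test, y: reference values (bpm).
import numpy as np

x = np.array([12.1, 15.4, 18.0, 20.7, 24.9])   # sensor under test
y = np.array([12.0, 15.0, 18.5, 21.0, 25.0])   # reference device

mae  = np.mean(np.abs(x - y))                   # Equation (12)
mre  = np.mean(np.abs(x - y) / y)               # Equation (14)
rmse = np.sqrt(np.mean((x - y) ** 2))           # Equation (15)
corr = np.corrcoef(x, y)[0, 1]                  # Pearson correlation, Equation (16)

# Bland-Altman: bias (mean of differences) and 95% limits of agreement.
diffs = x - y
bias = np.mean(diffs)
loa = 1.96 * np.std(diffs, ddof=1)

# Linear regression of the measured values against the reference, and r^2 (Equation (18)).
b, a = np.polyfit(y, x, 1)                      # slope b and intercept a
se_res = np.sum((x - (a + b * y)) ** 2)
se_tot = np.sum((x - np.mean(x)) ** 2)
r2 = 1 - se_res / se_tot

print(f"MAE={mae:.2f} bpm, MRE={mre:.3f}, RMSE={rmse:.2f} bpm, r={corr:.3f}")
print(f"Bland-Altman: bias={bias:.2f} bpm, limits of agreement=({bias - loa:.2f}, {bias + loa:.2f}) bpm")
print(f"Regression: slope={b:.3f} (ideal 1), intercept={a:.3f} (ideal 0), r^2={r2:.3f}")
```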

Analysis Software

The most common tools used to analyze the measurements recorded by the sensors are:
  • MATLAB: Popular numerical computing environment and programming language that is suitable for the implementation of algorithms, matrix operations, or data plotting, among others.
  • Labview: System engineering software for applications that require testing, measurements, control, fast hardware access, and data information.
  • Others: An extensive set of tools has been used in existing studies, such as Python (high-level programming language especially focused on facilitating code readability), R (free software environment and programming language for statistical computing [260]), C# (general-purpose programming language developed by Microsoft [261]), C (general-purpose programming language that supports structured programming), OpenCV (open source software library for computer vision and machine learning [262]), Blynk (Internet of Things platform), Kubios HRV (heart rate variability analysis software for professionals and scientists), Audacity (free open-source audio software), Kinect SDK (suitable for developing gesture or voice recognition applications using Kinect sensor technology [263]), LabChart (physiological data analysis and acquisition software [264]), Acqknowledge (software to measure, transform, replay, and analyze data [265]), mobile/Android (mobile operating system), LabWindows/CVI (software development environment especially focused on measurement applications [266]), microcontroller/microprocessor (when the processing is not done in external software but directly in the same microprocessor or microcontroller that controls the sensor), or custom applications (PC applications whose native source could not be accurately determined).

Processing Algorithm

A broad set of algorithms has been used to process measurements from respiration sensors such as peak detection, maximum-minimum detection, detection of zero-crossings, threshold detection, frequency analysis, wavelet transform, or Kalman filter, among others. They are briefly described in this subsection.
Peak detection: It is based on the detection of peaks in the signals registered by the sensing system (Figure 29). If no restriction is imposed regarding peak prominence, a peak can be detected on a signal $x = [x_1, x_2, \ldots, x_n]$ according to Equation (19), where n is the number of samples in the signal. However, this method is extremely sensitive to noise and fluctuations (Figure 29A). To improve detection, it is possible to set a minimum number of surrounding samples (p) in which the values must be below the peak value (Equation (20)) to accept the detected peak (Figure 29B). Another option is to impose a strictly increasing slope on the p samples preceding the peak and/or a strictly decreasing slope on the p samples after the peak (Figure 29C), according to Equation (21).
$$x_{i-1} < x_i > x_{i+1} \quad \forall i: i \in [1, n], \qquad (19)$$
$$x_j < x_i > x_h \quad \forall j: j \in [i-p, i-1]; \quad \forall h: h \in [i+1, i+p], \qquad (20)$$
$$x_{j-1} < x_j \quad \forall j: j \in [i-p, i] \quad \text{and} \quad x_{h+1} < x_h \quad \forall h: h \in [i, i+p]. \qquad (21)$$
The peak detection method to process respiration signals has several important parameters that determine the number of detected peaks. Peaks selected according to Equations (19), (20), or (21) can be classified according to their prominence, discarding those peaks that are below a threshold value to avoid the effect of noise and fluctuations. Peak prominence can be defined as the vertical distance between the closest local minimum (in the horizontal direction) and the peak, although there are other possible definitions [267]. Let $y = [y_1, y_2, \ldots, y_n]$ be a vector containing the magnitudes of all local minima of signal x, and $b = [b_1, b_2, \ldots, b_n]$ the positions (horizontal values) of the local minima y. If $a_i$ is the position of peak i, and $b_k$ is the element of b closest to $a_i$ (the position of the local minimum closest to peak i), then a peak i will only be accepted if its prominence is above the set threshold (PP), that is, if $|x_i - y_k| \geq PP$ (Figure 29D).
Another parameter that may be used to determine the number of peaks is the distance between them. Breathing signals are low frequency (usually less than 25 bpm [254]); therefore, a threshold (TD) is generally established to discard those peaks that do not differ by at least TD from another previously detected peak (Figure 29E). Let $c = [c_1, c_2, \ldots, c_q]$ be the positions of the q peaks detected in a signal. A new peak candidate i, with position $d_i$ on the horizontal axis, will only be accepted if $|d_i - c_j| \geq TD\ \ \forall j: j \in [1, q]$.
It is also common to discard peaks that do not reach a certain level TL ($x_i < TL$) (Figure 29F) or, alternatively, to discard a new peak i if its value does not differ by at least a given threshold TV from the q peaks already detected, that is, if $|x_i - x_j| < TV\ \ \forall j: j \in [1, q]$ (Figure 29G).
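A minimal sketch of this family of peak-detection rules is shown below; it uses the prominence, distance, and height arguments of SciPy's find_peaks as stand-ins for the PP, TD, and TL thresholds described above, and the synthetic breathing signal and threshold values are illustrative assumptions.

```python
# Hedged sketch: peak detection with distance, prominence and level thresholds,
# analogous to the TD, PP and TL parameters described above. Synthetic data only.
import numpy as np
from scipy.signal import find_peaks

fs = 50.0                                   # sampling frequency (Hz), assumed
t = np.arange(0, 60, 1 / fs)                # 60 s of signal
rr_hz = 0.33                                # ~20 breaths per minute
x = np.sin(2 * np.pi * rr_hz * t) + 0.1 * np.random.randn(t.size)

peaks, props = find_peaks(
    x,
    distance=fs * 1.5,      # TD: at least 1.5 s between accepted peaks
    prominence=0.5,         # PP: minimum peak prominence
    height=0.2,             # TL: discard peaks below this level
)
rr_bpm = peaks.size / (t[-1] / 60.0)        # peaks counted per minute of signal
print(f"Detected {peaks.size} peaks -> approximately {rr_bpm:.1f} bpm")
```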
Maximum-minimum detection: A popular processing technique is to identify maximum and/or minimum points in the breathing signals (x). Massaroni et al. [103] used the maximum and minimum values to obtain the respiratory period (Tr), as well as inspiratory (Ti) and expiratory (Te) time. The process for detecting maximum and minimum points is similar to peak detection.
Zero-crossings: Technique based on the detection of the crossings of a breathing signal through a “zero” level taken as a reference. Given a respiration signal composed of n values $x = [x_1, x_2, \ldots, x_n]$, a new zero crossing at sample i is detected when inequality (22) is satisfied.
$$x_{i-1} < x_i < x_{i+1} \quad \forall i \in [1..n]. \qquad (22)$$
One of the challenges of this method is to find the “zero” level taken as a reference to detect the crossings. One possible option is to detect the maximum and minimum values in a specific window and obtain the “zero” level as the mean of those values, $(\max(x) + \min(x))/2$. However, this approach is sensitive to outliers (Figure 30A). A possible solution is to take the median of x as the “zero” level (Figure 30A). Another option is to remove 10–20% of the largest and smallest values of x, obtaining a subset of values $y \subset x$. Then, the “zero” level can be calculated as the mean of the maximum and minimum values of y.
The “zero-crossings” technique is also affected by trends or biases in the measurements. Trends may be due to movement of the sensing element or movement of the subject. It is a common phenomenon, especially in belt-attached breathing sensors. Figure 30B shows a real breathing signal with trends (blue curve) from a public dataset [254]. If the “zero” level is calculated on a signal with trends, many crossings may go undetected since the same “zero” level is not a valid reference for the entire signal. To solve this problem, it is possible to eliminate trends in the signal by subtracting the bias (Figure 30B, orange signal). Another option could be to split the signal into shorter windows and calculate a different “zero” level for each window (Figure 30C).
This technique is also sensitive to noise, since the number of zero-crossings may increase in noisy signals. Figure 30D shows how noise is confused with multiple crossings of the “zero” level in a breathing signal. This can be avoided by defining a minimum distance in the horizontal direction (TD). Let $z = [z_1, z_2, \ldots, z_q]$ be the positions on the horizontal axis of the q detected “zero-crossings”; then, a new “zero-crossing” i with position $d_i$ will only be considered if $|d_i - z_j| \geq TD\ \ \forall j: j \in [1, q]$.
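The following sketch illustrates this zero-crossing approach under the assumptions discussed above (detrending, a median-based “zero” level, and a minimum distance TD between accepted crossings); the synthetic drifting signal and the parameter values are invented for illustration.

```python
# Hedged sketch: respiratory rate from "zero"-level crossings after detrending.
# The zero level is the median of the detrended signal, and a minimum distance TD
# is imposed to reject crossings caused by noise. Synthetic data only.
import numpy as np
from scipy.signal import detrend

fs = 50.0
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 0.33 * t) + 0.02 * t + 0.05 * np.random.randn(t.size)  # drifting signal

xd = detrend(x)                         # remove the linear trend (bias)
zero_level = np.median(xd)              # robust "zero" reference
above = xd > zero_level
crossings = np.where(np.diff(above.astype(int)) == 1)[0]   # rising crossings only

td = int(fs * 1.5)                      # TD: minimum spacing between crossings (samples)
accepted = []
for c in crossings:
    if not accepted or (c - accepted[-1]) >= td:
        accepted.append(c)

rr_bpm = len(accepted) / (t[-1] / 60.0)  # one rising crossing per breathing cycle
print(f"{len(accepted)} crossings -> approximately {rr_bpm:.1f} bpm")
```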
Threshold detection: This technique is similar to “peak detection”, “maximum-minimum detection” or “zero-crossing detection”. In this case, the level to detect is not a characteristic point of the curve but a certain threshold value. The same analysis performed for the previous categories could be applied to this method.
Frequency analysis: This category includes different techniques that make use of frequency information to obtain respiration parameters. The most common approach is to use the well-known Fourier Transform (FT). Several studies detected peaks in the spectrum of respiration signals or in their power spectral density (PSD) to obtain the breathing parameters. This method depends on the time window (Figure 31A). To provide meaningful data, long time windows are desirable. However, this limits the refresh time of the system. A compromise between accuracy and refresh time is required. Figure 31A shows a breathing signal and its spectra obtained with the FT for different refresh time windows. The example respiration signal has a frequency of 0.33 Hz (20 bpm) and a sampling frequency of 50 Hz. For a 4-s time window, the maximum available resolution is $F_s/N$, that is, 0.25 Hz. Figure 31A shows that the detected frequency is in the range of 0.25–0.5 Hz. This resolution is 0.125 Hz for the 8-s time window (frequency detected in the 0.25–0.375 Hz range) and 0.0625 Hz for the 16-s time window (frequency detected in the 0.3125–0.344 Hz range). It can be seen that the wider the time window, the more accurate the results obtained with this method. However, wide time windows make it difficult to apply respiration monitoring systems to critical scenarios where instantaneous values must be provided.
This transform is also sensitive to noise fluctuations. Noise fluctuations are generally of a much higher frequency than breathing signals. Therefore, it is common to pre-filter the signals to remove frequencies that exceed those of breathing activities. Figure 31B shows the frequency spectrum of a real respiration signal without filtering and with digital low-pass filtering. It can be seen that the peak of the respiration frequency (0.33 Hz) is more separated from the rest of the spectrum values in the filtered signal (7 units for the filtered signal and 5 for the unfiltered signal). If noise levels increased, it would even be difficult to distinguish the peak associated with the respiration frequency.
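The dependence of this method on the window length can be reproduced with a short sketch such as the one below, which estimates the dominant breathing frequency from the FFT of a synthetic 0.33 Hz signal for several window sizes; the signal, noise level, and search band are illustrative assumptions.

```python
# Hedged sketch: respiratory rate from the FFT peak, showing how the window
# length limits frequency resolution (Fs/N). Synthetic 0.33 Hz signal, Fs = 50 Hz.
import numpy as np

fs = 50.0
f_breath = 0.33                                   # ~20 bpm

for window_s in (4, 8, 16, 64):
    t = np.arange(0, window_s, 1 / fs)
    x = np.sin(2 * np.pi * f_breath * t) + 0.1 * np.random.randn(t.size)

    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(t.size, d=1 / fs)

    # Restrict the search to plausible breathing frequencies (assumed band).
    band = (freqs > 0.05) & (freqs < 1.0)
    f_peak = freqs[band][np.argmax(spectrum[band])]
    print(f"{window_s:2d} s window: resolution {fs / t.size:.4f} Hz, "
          f"detected {f_peak:.3f} Hz ({f_peak * 60:.1f} bpm)")
```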
On the other hand, low frequency signal fluctuations may appear due to movements in the sensing device or movements of subjects if breathing is measured during dynamic activities, such as walking. These fluctuations must be treated to provide accurate results. They can be mathematically modeled according to Equation (23).
$$v(t) = A \left[1 + \lambda \sin(w_f t)\right] \sin(w t - \varphi), \qquad (23)$$
where w is the angular frequency of the normal breathing signal, $w_f$ represents the angular frequency of the interference-causing activity, and λ is the magnitude of that activity. Figure 31C shows an example of a real breathing signal with low frequency fluctuations. Those frequency fluctuations can lead to peaks at very low frequency values of the spectrum. As low-pass filters are generally applied, those frequencies would not be removed and could therefore be confused with the respiration parameter, which is also low frequency.
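A synthetic signal following Equation (23) can be generated as in the sketch below; the amplitude, modulation depth, and interference frequency are arbitrary illustrative values.

```python
# Hedged sketch: synthetic breathing signal with a low-frequency amplitude
# fluctuation following Equation (23). All parameter values are illustrative.
import numpy as np

fs = 50.0
t = np.arange(0, 60, 1 / fs)

A, lam, phi = 1.0, 0.3, 0.0                 # amplitude, fluctuation magnitude, phase
w  = 2 * np.pi * 0.33                       # breathing angular frequency (~20 bpm)
wf = 2 * np.pi * 0.05                       # angular frequency of the interfering activity

v = A * (1 + lam * np.sin(wf * t)) * np.sin(w * t - phi)

# The modulation spreads energy into sidebands around the breathing peak
# (at approximately 0.33 +/- 0.05 Hz), which a simple low-pass filter cannot remove.
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
spectrum = np.abs(np.fft.rfft(v))
print(f"Dominant spectral component: {freqs[np.argmax(spectrum)]:.3f} Hz")
```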
Sudden movements of subjects may also cause fluctuations in signals, which can affect the measurements. Figure 31D shows an example of a real respiration signal with fluctuations due to movements during acquisition tests. The bottom of Figure 31D shows its spectrum with a peak in the respiration frequency and other lower peaks (in red) at close frequencies due to signal fluctuations.
Other studies have also obtained breathing parameters from frequency using frequency modulation (FM) or amplitude modulation (AM).
Wavelet transform: It is used to decompose the breathing signal in such a way that a new representation can be obtained that allows a better detection of respiration peaks or crossings. It has been used in its continuous or discrete form [268]. In the continuous wavelet transform (CWT), a comparison is made between the respiration signal and an analyzing wavelet ψ. The wavelet is shifted by applying a translation factor (b) and is compressed or stretched by applying a scale factor (a). Therefore, the CWT can be calculated according to Equation (24).
$$CWT(a, b) = \int_{-\infty}^{+\infty} x(t)\, \psi_{ab}^{*}(t)\, dt, \qquad (24)$$
where x(t) is the breathing signal under analysis, and the asterisk denotes the complex conjugate [269]. The scale factor (a) has an inverse relationship with frequency (the higher the value of a, the lower the frequency, and vice versa). The translation factor (b) allows delaying (or advancing) the wavelet onset; therefore, it contains time information. In this way, the CWT can provide a kind of time-frequency representation where high frequency resolution is obtained for low frequencies and high time resolution is obtained for high frequencies. This is shown in Figure 32A, where a real respiration signal is processed with the CWT. The time-frequency representation of the processed signal is shown in Figure 32A (right). It can be seen that a low frequency value around 0.33 Hz is identified with high resolution in frequency but low resolution in time. In the example respiration signal, the frequency remains fairly constant around the value of 0.33 Hz.
A variant of the WT is the multiresolution analysis (MRA) [269]. The MRA represents the breathing signal at different resolution levels by progressively decomposing it into finer octave bands (Figure 32B). For that, the original signal is convolved with high-pass and low-pass filters that represent the prototype wavelet. The outputs of the low-pass filter are called “approximation coefficients”, while the outputs of the high-pass filter are called “detail coefficients”. Approximation coefficients are down-sampled by a factor of 2 and are again subjected to high-pass and low-pass filtering, obtaining a new set of “detail” and “approximation” coefficients. This process is repeated iteratively, resulting in different resolution levels. For a given decomposition level n, the detail coefficients contain information on a particular set of frequencies (from $f_s/2^{n}$ to $f_s/2^{n+1}$), with $f_s$ being the sampling frequency. The “approximation coefficients” of the same decomposition level contain low-frequency information in the range $f_s/2^{n+1}$–0 Hz. The number of decomposition levels of the MRA depends on the specific breathing signal, so that the band of the respiration frequencies can be correctly identified. It is also affected by the sampling frequency of the system. This decomposition process is explained graphically in Figure 32B. The original respiration signal (x) can be reconstructed from its detail and approximation coefficients as follows:
$$x = \sum_{j=1}^{l} D_j + A_l, \qquad (25)$$
where l is the number of decomposition levels. Figure 32B also shows an example of this technique applied to a breathing signal with a sampling frequency of 50 Hz. Six decomposition levels were selected to obtain five sets of detail coefficients in the ranges 25–12.5 Hz, 12.5–6.25 Hz, 6.25–3.125 Hz, 3.125–1.563 Hz, 1.563–0.781 Hz and one set of approximation coefficients in the range 0.781–0 Hz. The first and third levels of detail coefficients and the sixth level of approximation coefficients were represented in Figure 32B as an example. In this case, the level of interest was the sixth (approximation coefficients) since breathing signals are of low frequency. The Fourier Transform was performed on the coefficients of the sixth level, obtaining a clear peak at the frequency of 0.33 Hz, which matches the breathing frequency of the sample respiration signal (20 bpm).
In the work of Scalise et al. [232], the signal was decomposed into 12 levels and level 11 was considered to obtain the RR. Guo et al. [166] performed a 4-level decomposition, selecting level 3 to calculate the RR. Therefore, the wavelet transform is used to obtain the respiration signals in the desired frequency band.
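The following sketch reproduces this kind of multiresolution decomposition with the PyWavelets library; the choice of wavelet ('db4'), the number of levels, and the synthetic signal are illustrative assumptions rather than settings used in the reviewed studies.

```python
# Hedged sketch: multiresolution analysis of a breathing signal with PyWavelets.
# The wavelet ('db4') and the number of levels are illustrative choices.
import numpy as np
import pywt

fs = 50.0
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 0.33 * t) + 0.2 * np.random.randn(t.size)

levels = 6
coeffs = pywt.wavedec(x, 'db4', level=levels)     # [A6, D6, D5, D4, D3, D2, D1]

# Keep only the level-6 approximation (the lowest-frequency band, which contains
# the breathing component) and reconstruct the signal from it.
filtered = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
x_low = pywt.waverec(filtered, 'db4')[: t.size]

# The respiratory peak can then be read from the spectrum of the reconstruction.
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
spectrum = np.abs(np.fft.rfft(x_low - x_low.mean()))
print(f"Dominant frequency: {freqs[np.argmax(spectrum)]:.3f} Hz")
```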
Kalman filter: This technique has been used by several studies as a sensor fusion method. Thus, it is not a method to extract breathing parameters but to fuse measurements from different sensors. When multiple respiration sensors are available, the measurements they provide are not exactly the same. Furthermore, measurements always contain noise. The Kalman filter is used to provide a final value based on the measurements of the different sensors, the model of variation of the breathing parameter, the noise model of the sensors, and the variation model [270]. Figure 33 shows an overview of the Kalman filter algorithm adapted to the fusion of breathing sensors.
The Kalman filter has two distinct phases: prediction and update. The prediction phase estimates the state (breathing parameter) in the current time step using the state estimate from the previous time step (previous breathing parameter). The breathing parameter predicted in this phase is called the “a priori” state estimate $\hat{x}_k^-$ and is obtained according to Equation (26).
$$\hat{x}_k^- = A\, \hat{x}_{k-1}, \qquad (26)$$
where $\hat{x}_{k-1}$ is the state estimate in the previous step, in this case the previously estimated breathing parameter, and A is the state transition model. Matrix A represents the expected evolution of $\hat{x}_{k-1}$ in the next transition. As breathing does not vary much in the short term [102], a common approach is to define A as an identity matrix, so the “a priori” state estimate $\hat{x}_k^-$ is equal to the previous state $\hat{x}_{k-1}$. If respiration is not expected to be constant in the short term, A should contain the linear variation model. The “a priori” estimate covariance $P_k^-$ (Equation (27)), which is a measure of the accuracy of the “a priori” state estimate $\hat{x}_k^-$, must also be predicted. It depends on the transition model A, the value of the covariance in the previous transition $P_{k-1}$, and Q. Q is the covariance of the process noise (the noise of the $\hat{x}_k^-$ prediction model). In order to apply the Kalman filter, the process noise must follow a Gaussian distribution with zero mean and covariance Q ($\sim N(0, Q)$). Although A and Q can vary at each time step k, it is common for them to take constant values. Many methods exist to determine Q. In the breathing system presented by Yoon et al. [131], Q was a diagonal matrix (which is a common approach) of the order of $10^{-4}$.
$$P_k^- = A\, P_{k-1} A^{T} + Q. \qquad (27)$$
Once the “a priori” state estimate $\hat{x}_k^-$ has been predicted, the update phase comes into play. In the update phase, the “a priori” state estimate $\hat{x}_k^-$ is refined using the measurements $y_k$ recorded by the sensors. The result is the final value of the breathing parameter $\hat{x}_k$, which is called the “a posteriori” state estimate (Equation (28)).
$$\hat{x}_k = \hat{x}_k^- + K_k \left(y_k - H\, \hat{x}_k^-\right). \qquad (28)$$
The estimation of $\hat{x}_k$ depends on the predicted “a priori” state estimate $\hat{x}_k^-$, the measurements registered by the different breathing sensors $y_k$, and the matrices $K_k$ and H. $K_k$ is known in the Kalman filter as the optimal Kalman gain. It minimizes the “a posteriori” error covariance. A common way to calculate it is according to Equation (29).
$$K_k = P_k^- H^{T} \left(H P_k^- H^{T} + R\right)^{-1}. \qquad (29)$$
This gain depends on the “a priori” estimate covariance $P_k^-$ and two model parameters (H and R). H is the observation model that relates the measurements taken by the sensors $y_k$ to the state space $x_k$ (breathing parameter), as follows: $y_k = H x_k$. It is common that the techniques previously introduced in this section (peak detection, maximum-minimum detection, zero-crossings, threshold detection, frequency analysis, or wavelet transform) are used to directly estimate the respiration parameter from the measurements. In that case, the measurement space and the state space are the same, so H could simply be the identity matrix. If the respiration parameter (RR, for example) were not provided directly as a result of the measurements, and other parameters were given instead (such as the number of peaks, zero-crossings, etc.), matrix H would contain the equations to calculate the breathing parameter from those values. Those equations were previously introduced in this section.
R is the covariance of the observation noise (the noise associated with the measurements $y_k$). The observation noise should also follow a Gaussian distribution with zero mean and covariance R ($\sim N(0, R)$). Although H and R can vary at each time step k, it is common that they adopt constant values.
In the update phase, the covariance is also updated to obtain the “a posteriori” estimate covariance $P_k$ according to Equation (30).
$$P_k = P_k^- - K_k H P_k^-. \qquad (30)$$
As a result of the update phase, the final breathing parameter $\hat{x}_k$ is estimated, which is the output of the algorithm. However, the Kalman filter is an iterative method that recalculates $\hat{x}_k$ at each time step. Therefore, the “a posteriori” state estimate $\hat{x}_k$ at the current time step becomes the previous state estimate $\hat{x}_{k-1}$ at the next time step. The same happens with the covariance, since the “a posteriori” estimate covariance $P_k$ at the current time step becomes the previous estimate covariance $P_{k-1}$ at the next time step. In this way, the algorithm can start a new prediction process again (Figure 33). The whole process is repeated indefinitely. The output of the system at each transition is the “a posteriori” estimate of the respiration parameter $\hat{x}_k$.
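A scalar version of this fusion scheme is sketched below for two hypothetical RR sensors, with A equal to 1 (constant-rate assumption) and an observation matrix of ones, since both sensors are assumed to measure the RR directly; the noise covariances and simulated measurements are illustrative assumptions, not values from any reviewed study.

```python
# Hedged sketch: scalar Kalman filter fusing respiratory-rate estimates from two
# sensors, with A = 1 and H = [1, 1]^T as in the constant-rate assumption above.
# Noise covariances and the simulated measurements are illustrative only.
import numpy as np

np.random.seed(0)
true_rr = 18.0                                    # breaths per minute (assumed constant)
n_steps = 50
y1 = true_rr + np.random.normal(0, 1.5, n_steps)  # noisier sensor
y2 = true_rr + np.random.normal(0, 0.8, n_steps)  # less noisy sensor

A = 1.0                                           # state transition (RR assumed constant)
H = np.array([[1.0], [1.0]])                      # both sensors observe the RR directly
Q = 1e-4                                          # process noise covariance
R = np.diag([1.5 ** 2, 0.8 ** 2])                 # observation noise covariance

x_hat, P = 15.0, 1.0                              # initial estimate and covariance
for k in range(n_steps):
    # Prediction (Equations (26)-(27)).
    x_prior = A * x_hat
    P_prior = A * P * A + Q
    # Update (Equations (28)-(30)).
    y_k = np.array([[y1[k]], [y2[k]]])
    S = H * P_prior @ H.T + R                     # innovation covariance (2x2)
    K = P_prior * H.T @ np.linalg.inv(S)          # Kalman gain (Equation (29))
    innovation = y_k - H * x_prior
    x_hat = x_prior + (K @ innovation).item()     # a posteriori estimate (Equation (28))
    P = P_prior - (K @ H).item() * P_prior        # a posteriori covariance (Equation (30))

print(f"Fused estimate after {n_steps} steps: {x_hat:.2f} bpm (true value {true_rr} bpm)")
```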

3.4.2. Results of the Analysis

Table 6 presents the results of the analysis for the wearable studies and Table 7 shows the results of the environmental studies. The second column of each table includes the specific data processing techniques used in each study. Figure 34 represents the number of works that use the different processing methods for wearable and environmental studies. The category “custom algorithm” was added to refer to processing algorithms that cannot be classified in any other group, as they are specific to the sensor used for respiration monitoring. It can be seen that “peak detection” in respiration signals and “frequency analysis” using the Fourier Transform were some of the most widely used methods by both wearable and environmental studies. The sum of the percentages of use of techniques based on the detection of levels in their different forms (peaks, maximum and minimum values, zero crossings, or thresholds) was 42% for the wearable category and 33% for the environmental systems. In the environmental category, the variability in data processing methods was much greater than in the wearable category, as a large number of studies applied “custom algorithms”. The use of wavelet decomposition or the Kalman filter was residual [102,131,165,166,193,212,232].
Figure 35 shows the figures of merit used to provide a value of sensor performance for the wearable and environmental studies. The categories “graphical comparison” and “graphical monitoring”, which can be considered informal metrics, were added to the list of evaluation metrics of Section 3.4.1. The category “graphical comparison” refers to studies that visually compared the performance of the sensing system under evaluation with a reference system but did not use an objective metric. The category “graphical monitoring” indicates that measurements from sensors were plotted, but no formal metric was calculated and no quantitative comparison was made. Figure 35 shows that “absolute error”, “relative/percentage error”, “Bland-Altman plot”, and “correlation coefficient” were the preferred formal metrics for wearable and environmental systems. The use of “root mean square error” [48,52,95,107,115,117,147,164,170,171,187,198], “linear regression” [68,161,167,183,209], and “accuracy” [7,76,133,161,179,190,193,207,216] was limited. Furthermore, the percentage of studies that provided an “informal” figure of merit was much higher for the wearable category (45%) than for the environmental group (17%). Therefore, a stronger assessment can be seen in environmental studies. In general, validation results show low error values and a high correlation with reference devices. The details for the different studies are included in the fourth column of Table 5 and Table 6. Fifty-two percent of the studies that used relative or percentage errors provided a value of less than 5%, while only 12% reported a value greater than 10%. Correlation coefficients greater than 0.95 were provided by 46% of the studies [19,52,61,112,126,149,176,180,208,215,222,228,232]. Regarding absolute error, 78% of the studies that calculated the RR as the breathing parameter provided a value of less than 2 bpm [5,19,64,115,124,132,135,172,175,176,177,182,189,195,201,207,218,224]. No study reported an absolute error value greater than 4 bpm. In relation to the Bland-Altman analysis, the mean of the differences was less than 0.2 bpm in 49% of the studies that provided data on this metric [48,66,68,69,78,94,95,112,115,126,167,170,172,180,195,200,210,228].
Regarding the tools for measurement processing, Figure 36 shows the distribution of use of the different tools for the wearable and environmental respiration monitoring systems. MATLAB was the preferred software for both types of systems, since it was adopted by half of the studies, while NI Labview was the second most used tool, as it appeared in 20–30% of the works [2,3,19,63,71,85,86,87,88,127,129,131,138,142,146,153,163,201,218,219,220,223,225]. The choice of the remaining tools depended heavily on the specific sensor used to capture the data. For instance, Audacity, as a sound processing tool, could only be used in microphone-based respiration monitoring [162]; OpenCV, as a computer vision library, was suitable for respiration monitoring through images [187,216]. Therefore, the use of the rest of the tools was residual.

4. Discussion

Respiratory monitoring has been actively investigated in recent years, as can be deduced from the high number of studies included in this systematic review of the literature. While monitoring breathing in hospital or controlled environments poses fewer problems, the main research challenge is to monitor breathing for a long period in the user’s daily environment.
Following the approaches of previous works [24], two different sets of systems for respiratory monitoring were identified. On the one hand, wearable systems have the advantages that they can be used in any environment, either indoors or outdoors, are generally easy to install and, in most cases, inexpensive. However, the level of obtrusiveness can determine the acceptability of this type of system. Some sensors, such as those designed to be worn on the face or neck, are more obtrusive. Other wearable sensing technologies are less invasive, such as those worn on the chest, abdomen, arms, or wrist. This might be one of the reasons why the detection of chest movements is the predominant approach. Another reason could be that the chest seems to be the part of the body that presents the greatest variations in its state as a result of respiratory activity. However, most of these technologies require users to wear a belt on the chest or abdomen, electrodes that make contact with the skin, or tight clothing to detect the movement of the thorax [254]. These restrictions might cause discomfort in the long term or, in extreme cases, even skin problems. The proposal of Teichmann et al. [56] is original since the sensor is carried in a pocket of a shirt that does not need to be tight. This represents an advantage over other approaches, although some users may find the lack of integration into clothing uncomfortable. Future research can go in that direction. A common drawback of wearable systems is that they are heavily affected by artifacts caused by non-breathing movements. This leads to larger measurement errors, which can even compromise the viability of the sensing systems in extreme cases.
On the other hand, environmental sensors have the advantages that they are non-invasive, data transmission can be done over wired communication, and battery life does not limit their operation. However, their scope is restricted to a particular area. Any change in the environment (for example, the relocation of furniture) can modify the detection capabilities. Additionally, some technologies, such as computer vision, present privacy concerns, which may affect user acceptance [271]. Environmental sensors seem suitable for home or hospital applications, but not for continuous monitoring of moving subjects.
In fact, usability is a big challenge in respiratory monitoring. Several authors have highlighted this fact [10,19,51,52,115,159,187,254]. Despite this, we identified a clear gap in the literature, since it was not possible to find any usability analysis of the sensors implemented in the existing studies. Future research should also focus on usability. For that, well-known usability tests can be applied to evaluate the level of acceptance of the technology by its potential users. For example, the Unified Theory of Acceptance and Use of Technology (UTAUT) model [272,273] may serve. This model was previously applied to evaluate smart wearable devices [274], including health care [275,276] and m-health devices [277]. Other parameters, such as size or weight, can also affect the adoption of the technology. These parameters have only been provided in a limited number of studies [2,3,17,21,49,62,64,68,72,85,93,94,97,98,99,103,105,108,110,113,114,116,119,120,124,126,135,136,138,141,143,144,146,148,161]. Future works should also consider size and weight as important factors in the design of sensing systems, subjecting them to evaluation.
Regarding the type of sensors used for respiratory monitoring, fiber optic sensors prevailed in the wearable category. This may be due to their insensitivity to electromagnetic fields, their high resistance to water and corrosion, and their compact size and low weight [125]. This technology also allows monitoring different types of physiological parameters simultaneously [278]. In addition, resistive sensors and accelerometers were the second and third most widely used technologies. This might be explained by the fact that they are suitable for detecting movement variations, and their design is simpler than that of other technologies, such as capacitive, pyroelectric, or piezoelectric sensors, among others. In relation to the environmental category, most researchers designed radar-based sensors. Cameras were also widely used. The great development of computer vision technology in recent years [279] makes the detection of chest movements through video images technically feasible. However, cameras present privacy concerns, which may be why radar sensors are the preferred non-contact technology. Radar systems are also small in size, low cost, and simple in structure, which provides advantages in installation and operation [280]. The researchers who decided to integrate the sensors into everyday objects again opted for fiber optic technology and sensors based on the measurement of resistance changes. This could be due to the advantages of these technologies, which have been mentioned before. However, the use of non-object-embedded environmental systems was the predominant approach, as they do not require users to be in permanent contact with an object, which broadens their range of applications.
Comparing the performance of sensors is a challenge. It is difficult to compare the performance of different studies, since there is no standardized test to validate the sensors. Authors designed customized experiments with great differences among them. Many aspects were defined differently: the type and values of the respiratory parameters considered, the positions of the subjects during the tests, the number of human participants involved in the experiments, their characteristics, or the duration of the experiments. Differences were also found in the inclusion of motion artifacts and in the mechanical devices that simulated respiration, among others. A consequence of this is that performances provided by existing studies are not comparable, since they were obtained under different test conditions. Therefore, a future research effort is to design a common evaluation framework. This framework should include quiet and rapid breathing and different postures, such as standing, lying, or sitting. Experiments should include motion artifacts since they affect sensor performance, as shown by several studies [53,108,157,158]. Additionally, they should involve a number of users high enough to obtain significant results. In general, the number of subjects participating in existing studies remains low.
In view of the results shown in Section 3.3.2, it is a fact that existing studies carry out short experiments to validate the sensors. However, little attention is paid to their long-term behavior. The effect of temperature, sensor aging, or the characteristics of the carrier subjects (such as height or weight) on the sensing systems has not been actively explored. A sensor that works well in a laboratory environment might not work as well in a real setting when used for a long time. If there were errors in the measurements, this would cause frequent recalibrations of the sensors. Therefore, a research challenge is to test the behavior of respiratory monitoring systems in more realistic environments. The declared performances in laboratory or controlled settings are high. The challenge is to prove that they are equally high in real-world usage. Sensor aging might be a problem in terms of system performance. However, it is less critical in terms of cost, as the replacement of sensing parts is generally affordable due to their typically low price.
Regarding the declared performances, sensors should be validated against reference devices. This is the approach adopted by most of the studies. Other validation methods, such as the use of metronomes or artificial prototypes, are less common since, unlike reference devices, they are not well-established systems that can be acquired by the scientific community to replicate experiments or compare results. There is less consensus with respect to the figure of merit used to determine the accuracy of the sensing systems. The relative, absolute, and RMS errors, the slope of the linear regression line, and the correlation factor have all been considered. It is also common to apply the Bland-Altman analysis [281], which not only provides information on the differences between the measurements of both sensors but also shows how those differences vary with the magnitude of the measurements. In addition, the standard deviation of the differences is used to obtain the upper and lower limits of agreement. The high variability of the figures of merit makes it difficult to compare the studies. Another issue of respiratory monitoring is that the acceptable margin of error is not clearly defined, so it is difficult to determine whether a new sensing system is in agreement with reference devices. This may be a consequence of the lack of a common experimental framework, since the margin of error depends on the specific experiments carried out; for example, the acceptable error may differ between slow and rapid breathing. Ideally, both the mean of differences and the limits of agreement in the Bland-Altman analysis should be provided. As complementary information, it would also be interesting to report the mean absolute or relative errors, or the correlation factor. This would facilitate the comparison of system performance among studies. However, this is not the most common approach, and only a limited number of studies have adopted it [10,44,57,60,61,65,78,92,100,109,112,114,122,123,124,132,163,165,166,167,168,169,176,178,191,196,205,206,211,224,228].
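As a reference for readers less familiar with the method, the sketch below computes the mean of differences and the 95% limits of agreement of a Bland-Altman analysis. It assumes two time-aligned arrays of respiratory-rate estimates (in bpm), one from the sensor under test and one from a reference device; the array contents shown are synthetic.

```python
# Minimal Bland-Altman sketch: bias (mean of differences) and 95% limits of
# agreement between a sensor under test and a reference device.
import numpy as np

def bland_altman(sensor_rr, reference_rr):
    sensor_rr = np.asarray(sensor_rr, dtype=float)
    reference_rr = np.asarray(reference_rr, dtype=float)
    diff = sensor_rr - reference_rr              # per-measurement differences
    mean = (sensor_rr + reference_rr) / 2.0      # per-measurement means (x-axis of the plot)
    bias = diff.mean()                           # mean of differences (MOD)
    sd = diff.std(ddof=1)                        # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd), mean, diff

# Synthetic example values (bpm):
bias, loa, _, _ = bland_altman([12.1, 15.8, 20.3, 25.1], [12.0, 16.0, 20.0, 25.0])
print(f"bias = {bias:.2f} bpm, limits of agreement = {loa[0]:.2f} to {loa[1]:.2f} bpm")
```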
Additionally, the parameter to be sensed varies among studies. The most common breathing parameter obtained by existing studies is the RR. However, several studies calculate volume parameters, which are useful for many applications. Some studies provide both [2,49,52,61,116,122,147], although the calculation of volume parameters is a challenge since it depends on the specific technology. An approximation for a capacitive textile force sensor can be found in the work of Hoffmann et al. [17]. However, this is still an open research topic. Obtaining an accurate estimate of volume parameters using the sensing techniques presented in this review is not easy, especially for wearable systems.
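To make the volume calculation concrete, the sketch below illustrates the general principle of numerically integrating a calibrated airflow signal over one breath to approximate tidal volume. It assumes that the sensing system already provides such a flow estimate in L/s; obtaining that estimate from, for example, chest movement is exactly the technology-specific challenge noted above, and all signal values here are synthetic.

```python
# Approximate tidal volume by integrating the inspiratory part of a
# calibrated airflow signal (L/s). Signal and sampling rate are placeholders.
import numpy as np

def tidal_volume_liters(flow_lps, fs):
    inspiration = np.clip(flow_lps, 0.0, None)   # keep positive (inspiratory) flow only
    return float(np.sum(inspiration) / fs)       # rectangular-rule integration, in liters

# One synthetic 4 s breath sampled at 50 Hz: half-sine inspiration, then expiration.
fs = 50.0
t = np.arange(0.0, 4.0, 1.0 / fs)
flow = 0.4 * np.sin(2 * np.pi * t / 4.0)         # peak flow of 0.4 L/s
print(f"estimated tidal volume: {tidal_volume_liters(flow, fs):.2f} L")
```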
Regarding processing algorithms, it can be concluded that the detection of peaks, maximum and minimum values, thresholds, or zero crossings was effective in determining respiration parameters. Frequency analysis also provided good results. This aspect seems to have been successfully resolved in existing studies. The use of other processing techniques is marginal, as they generally require more computing resources, are more complex, and are highly dependent on the specific design of the sensing system.
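For illustration, the sketch below implements the two dominant approaches reported in the reviewed studies, peak detection and frequency analysis, on a uniformly sampled respiration signal. The sampling rate, the minimum breath period, and the search band are assumptions chosen for the example, not values taken from any particular study.

```python
# Respiratory-rate estimation: time-domain peak detection vs. FFT-based
# frequency analysis. Parameter values below are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

def rr_from_peaks(x, fs):
    """RR (bpm) from the mean interval between detected breath peaks."""
    peaks, _ = find_peaks(x, distance=int(1.5 * fs))   # assume breaths at least 1.5 s apart
    if len(peaks) < 2:
        return float("nan")
    return 60.0 / (np.mean(np.diff(peaks)) / fs)

def rr_from_fft(x, fs):
    """RR (bpm) from the dominant spectral component within 0.1-1.0 Hz."""
    x = x - np.mean(x)                                  # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= 0.1) & (freqs <= 1.0)              # 6-60 breaths per minute
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

# Synthetic 15 bpm (0.25 Hz) breathing signal, 60 s at 50 Hz, with mild noise:
fs = 50.0
t = np.arange(0.0, 60.0, 1.0 / fs)
x = np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.randn(t.size)
print(rr_from_peaks(x, fs), rr_from_fft(x, fs))
```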
Wired and wireless transmission are used equally, as the type of transmission is usually determined by the type of sensor. Within wireless systems, Bluetooth was the preferred option. This may be explained by the fact that most sensing systems communicate with a smartphone/tablet or PC that is close to the sensing unit. In fact, PC processing is the main trend, which may be a consequence of most studies presenting laboratory prototypes that are far from usable portable systems. In general, authors do not give much thought to the computational resources that the processing algorithms require, as they perform centralized offline processing on a PC using numerical computing software such as MATLAB. This could compromise the real-time operation of the systems when they run continuously. Future research efforts should focus on designing suitable processing techniques that run ubiquitously in real time on the microprocessor of the sensing unit itself or on smartphones.
Power consumption is crucial in wearable respiratory monitoring. Most studies did not provide information on power consumption or battery life. In addition, there was no consensus on the measurement procedure or on the energy parameters that should be reported. In this context, it is very difficult to compare the power consumption reported by different studies fairly. For example, battery life varies greatly depending on factors such as the data transmission procedure (continuous, intermittent, or without transmission), the sensor operating time (non-stop, several hours a day, etc.), the monitoring visualization (real-time display, no visualization, etc.), or whether the processing of the measurements is included in the power study, among others. When a study provides the battery life of a respiratory sensor, not only the capacity of the particular battery used in the study should be provided, but also other characteristics that affect its performance: depth of discharge, cycle lifetime, or C-rate [282]. It would also be useful for researchers to indicate the power consumption of each component of the system and not just the total autonomy of the device [98]. This would allow the identification of the critical components and facilitate the comparison of different systems. Given that autonomy is a limiting factor, strategies to reduce power consumption are required [135]. However, only a few works have implemented them. Several respiratory monitoring studies focused on the transmission technology, since it is usually the most demanding module [91,98]; for example, technologies such as Wi-Fi or Bluetooth consume more energy than Impulse Radio Ultra-Wideband. Another strategy adopted was down-sampling the data to reduce the computational load [119]. The limited number of works that adopted energy-saving strategies may be a symptom that researchers are more focused on validating their sensors than on real-world applications. This can also be inferred from the short battery life indicated by most studies that include this information (typically less than 12 h). Furthermore, the use of energy harvesting techniques in respiratory monitoring is another open research question, since the number of studies that implemented them is still small [77,84,104,135,240,241,242,243,244,245,246,247,248,250,251,252,253]; most of them presented laboratory experiments instead of functional prototypes.
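A simple duty-cycle calculation, shown below, illustrates why the transmission strategy dominates autonomy. The battery capacity and the active and sleep currents are placeholder values chosen for the example, not measurements from any of the reviewed systems.

```python
# Rough battery-life estimate under duty cycling. All numbers are
# illustrative placeholders, not values reported by the reviewed studies.
def battery_life_hours(capacity_mah, active_ma, sleep_ma, duty_cycle):
    avg_ma = duty_cycle * active_ma + (1.0 - duty_cycle) * sleep_ma
    return capacity_mah / avg_ma

# Continuous radio streaming vs. transmitting a short summary 2% of the time:
print(battery_life_hours(600, active_ma=15.0, sleep_ma=0.5, duty_cycle=1.00))  # ~40 h
print(battery_life_hours(600, active_ma=15.0, sleep_ma=0.5, duty_cycle=0.02))  # ~760 h
```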
An ideal breathing sensor should be mobile, easy to use, imperceptible, and immune to body motion [48,119,135]. Several authors agree that a system covering all these aspects should be successfully integrated into clothing [21,59,65,69,84,85,94,103,108,113,119,123,142,143,151]. This is a consequence of the fact that long-term home monitoring entails direct contact between the patient and the system. However, this poses several problems, such as the adaptation of the sensing system to different clothing sizes, the integration of the energy supply, or the washability of the sensors, among others. In fact, this is an open research question.

5. Conclusions

This paper presents a systematic review of sensors for respiratory monitoring, filling a gap in the state of the art since, to the best of our knowledge, no published reviews analyze respiratory sensors from a comprehensive point of view. As a result of several searches, an overwhelming number of studies was found. They were sorted by relevance and, finally, 198 studies were selected for detailed examination. They were classified into two groups: wearable and environmental sensors. Several aspects were analyzed: sensing techniques, sensors, breathing parameters, sensor location and size, general system setups, communication protocols, processing stations, energy autonomy, sensor validation experiments, processing algorithms, performance evaluation, and software used for the analysis. As a result, detection of chest movements was identified as the most common technique, implemented with fiber optic sensors in wearable systems and radar sensors in environmental systems. The RR was the most common breathing parameter, obtained in 68% of the studies. Most of the studies performed centralized measurement processing on a PC using MATLAB software. Bluetooth was by far the prevalent communication technology (60% of the wearable studies adopted it), and almost all wireless respiration sensing systems were battery powered. The most common validation approach was to use a reference device to perform tests on real subjects. Furthermore, a high percentage of studies obtained the breathing parameter after performing frequency analysis or peak detection on the measurements. Meanwhile, the most common figures of merit selected to provide evidence of sensor performance were absolute and relative errors, Bland-Altman analysis, and correlation coefficients.
This review also identified future research challenges. One of them is the need to define a common framework to validate the sensors, since each set of authors carried out their own experiments, which makes it difficult to compare sensor performances. Similarly, measurements of power consumption were made under different conditions, and a common measurement procedure is required to compare sensor autonomies fairly. There are also no long-term evaluations studying the effect of aging, environmental conditions, or the characteristics of the subjects on sensor performance.
Usability tests are also lacking in existing studies. Similarly, the figure of merit used to report sensor performance varies from one study to another; the Bland-Altman analysis was identified as the most appropriate method to validate the sensors against reference devices. Other remaining research efforts are the implementation of energy-saving and energy-harvesting strategies, the application of respiratory sensors to real-world scenarios, and the calculation of volume parameters with the different sensing technologies.
Finally, several authors highlighted the integration of respiratory monitoring sensors into clothing as a promising approach. This remains a future research effort, as it presents several challenges on the path to a feasible, long-term, and unobtrusive solution.

Author Contributions

Conceptualization, R.I. and I.P.; methodology, E.V. and R.I.; formal analysis, E.V. and R.I.; investigation, E.V., R.I. and I.P.; resources, E.V.; data curation, E.V. and R.I.; writing—original draft preparation, R.I.; writing—review and editing, R.I. and E.V.; supervision, R.I.; project administration, I.P.; funding acquisition, I.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Union and the Gobierno de Aragón, grant number “Programa Operativo FEDER Construyendo Europa desde Aragon T49_20R”, the Universidad de Zaragoza and the Centro Universitario de la Defensa de Zaragoza, grant number UZCUD2019-TEC-02, and the Consejo Nacional de Ciencia y Tecnología CONACyT, grant number 709365.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Analysis of techniques, sensors, breathing parameters, and sensor locations and sizes for studies of the wearable category published before 2018.
Study | Technique | Sensor | Measured Parameter | Location | Size
Agcayazi 2017 [123]Chest wall movementsCapacitive-Chest (shirt)-
Aileni 2017 [134]Chest wall movementsResistiveRRChest-
Basra 2017 [145]Air temperaturePyroelectricRRNose (nostril)-
Bhattacharya 2017 [156]Air temperatureThermistorRRMouth mask-
Das 2017 [162]Air temperatureThermistor-Nose (near)
Mouth (near)
-
Fajkus 2017 [83]Chest wall movementsFiber opticRRChest-
Gorgutsa 2017 [84]Chest wall movementsFrequency shiftMonitoring of breathingChest (shirt)-
Guay 2017 [85]Chest wall movementsFrequency shiftMonitoring of breathingChest (shirt)20 × 10 cm
Kam 2017 [86,87,88]Chest wall movementsFiber opticMonitoring of breathingChest-
Kano 2017 [89]Air humidityNanocrystal and nanoparticles-Nose (near)
Mouth (near)
-
Koch 2017 [90]Chest wall movementsResistiveMonitoring of breathingAbdomen-
Milici 2017 [91]Air temperatureThermistorRR
Monitoring of respiratory diseases
Nose (near)
Mouth (near)
-
Nakazumi 2017 [92]Respiratory air flowPhotoelectricMonitoring of breathingMouth mask (diving)-
Park 2017 [93]Abdomen movementsCapacitiveMonitoring of breathingWaist20 × 10 × 1 mm (electrode)
Presti 2017 [94]Chest wall movementsFiber opticRR
Respiratory period
Chest and
abdomen (shirt, front and back)
1.5 cm
Valipour 2017 [95]Respiratory soundsMicrophoneRRNose (near)
Mouth (near)
-
White 2017 [96]Chest wall movementsCapacitiveRRChest (below left pectoral muscle)-
Yan 2017 [97]Air humidityNanocrystal and nanoparticles Monitoring of breathingMouth (4 cm away)5 × 2 × 1 mm
Mahbub 2016–2017 [98,99]Chest wall movementsPiezoelectric RRChest1.6 × 1.6 cm
Chethana 2016 [100]Chest wall movementsFiber opticRRChest (interspace of pulmonic area)
Güder 2016 [101]Air humidityNanocrystal and nanoparticles Monitoring of breathingMouth mask-
Lepine 2016 [102]Chest wall movementsAccelerometer
ECG
RRChest-
Massaroni 2016 [103]Chest wall movementsFiber optic Respiratory periodChest and
abdomen (textile)
1 cm
Massaroni 2016 [49]Chest wall movementsFiber optic RR
TV
Compartmental volume
Chest and
abdomen (shirt)
10 × 10 cm
Moradian 2016 [104]Air temperaturePyroelectricMonitoring of breathingNose (below)-
Nag 2016 [105]Chest wall movementsCapacitiveMonitoring of breathingChest (diaphragm)50 mm2
Nam 2016 [106]Respiratory soundsMicrophoneRRNose (near)
Mouth (near)
-
Raji 2016 [107]Air temperatureThermistorRR
Monitoring of respiratory diseases
Mouth mask-
Ramos-García 2016 [108]Chest wall movementsResistiveRRChest (shirt)23 × 4 cm
Rotariu 2016 [109]Chest wall movementsPiezoelectricRR
Monitoring of respiratory diseases
Chest-
Atalay 2015 [110]Chest wall movementsResistiveRRChest
Abdomen
2.7 × 9.3 cm
Ciocchetti 2015 [111]Chest wall movementsFiber opticTVChest-
Estrada 2015 [112]Chest wall movementsAccelerometerRRChest-
Gargiulo 2015 [113]Chest wall/ abdomen movementsResistiveTVChest and
abdomen (shirt)
5 × 7cm (4 units)
Grlica 2015 [114]Chest wall movementsCapacitive-Chest4.5 × 1.7 cm
Hernandez 2015 [115]Chest wall movementsAccelerometer GyroscopeRRWrist-
Jiang 2015 [116]Air temperaturePyroelectric MV, peak inspiratory flow, RR, TVNose (below)5 × 25 × 100 mm
Karlen 2015 [117]Modulation cardiac activityPPGRRFinger (on sensor)-
Kazmi 2015 [118]Modulation cardiac activityPPGRRFinger (on sensor)-
Metshein 2015 [21]Modulation cardiac activityECGRRChest (electrode shirt)8 × 17cm (electrode surface)
Teichmann 2015 [119]Chest wall movementsTransthoracic impedance RRChest (pocket of shirt)10 × 8 cm
Wei 2015 [120]Air temperatureOther (micro electro-mechanical sensor)Respiration detectionNose (5 cm away)1.8 × 2.4 mm
Yang 2015 [3]Chest wall movementsFiber opticRRChest70 cm (belt)
Bifulco 2014 [121]Chest wall movementsPiezoelectricMonitoring of breathingChest-
Fekr 2014 [122]Chest wall movementsAccelerometerRR
TV
Chest (middle sternum region)-
Hesse 2014 [124]Chest wall movementsResistiveRRChest3 × 1.96 × 2.8 cm
Krehel 2014 [125]Chest wall movementsFiber opticRRChest (different regions)-
Min 2014 [126]Chest wall movementsCapacitiveRRWaist83 × 3.86 × 0.135 cm
Petrovic 2014 [127]Chest wall movementsFiber optic MV
TV
Chest (lower third of the thorax)-
Sanchez 2014 [128]Air temperatureFiber optic Monitoring of breathingNose (near)
Mouth (near)
-
Wo 2014 [129]Abdomen movementsFiber optic RRAbdomen-
Yang 2014 [130]Chest wall movementsCapacitiveRRChest
Abdomen
10 × 1 cm
Yoon 2014 [131]Chest wall movementsAccelerometer GyroscopeRRChest-
Chan 2013 [132]Chest wall movements
Modulation cardiac activity
Accelerometer
ECG
RRChest-
Huang 2013 [133]Air temperaturePyroelectric RRNose (near)
Mouth (near)
-
Kundu 2013 [161]Chest wall/ abdomen movementsCapacitiveRRChest and abdomen (anterior and posterior region)9 × 13 cm
16 × 16 cm
5 × 5 cm
Padasdao 2013 [135]Chest wall movementsDC generatorRRChestCoin size
Cao 2012 [2]Air temperaturePyroelectricRR, MV, peak inspiration flow, TVNose (near)
Mouth (near)
7 × 4.5 × 1.8 cm
Chiu 2012 [136]Chest wall movementsPiezoelectric RRChest48 × 19 × 4 mm
Favero 2012 [137]Air humidityNanocrystal and nanoparticles Monitoring of breathingMouth mask (3 cm from nose)
Mathew 2012 [138]Air humidityNanocrystal and nanoparticles Monitoring of breathingNose (5 cm away)1 mm length
Scully 2012 [139]Modulation cardiac activityPPGRRFinger (on sensor)-
Trobec 2012 [140]Modulation cardiac activityECGRRChest (different regions)-
Witt 2012 [141]Chest wall movementsFiber optic Monitoring of breathingChest1 cm (elastic part)
Zieba 2012 [142]Chest wall movementsResistiveRRChest (shirt)-
Carlos 2011 [143]Chest wall movementsResistiveRR
Coughing events
Chest and abdomen (shirt)5.3 × 3.2 cm
Ciobotariu 2011 [144]Chest wall movementsPiezoelectricRR
Monitoring of respiratory diseases
Chest28 cm
Guo 2011 [146]Chest wall movementsResistiveRRChest10 × 0.25 cm
Hoffmann 2011 [17]Chest wall movementsCapacitiveRespiration pattern
TV (among others)
Chest3 × 3 cm
Liu 2011 [147]Chest wall movementsResistiveRR, MVAbdomen-
Liu 2011 [148]Abdomen movementsAccelerometerRRAbdomen23 mm diameter
Mann 2011 [149]Chest wall movementsAccelerometerRR
Respiratory disease monitoring
Neck (Midclavicular line and lower costal margin intersection)-
Ono 2011 [150]Chest wall movementsAccelerometerRRChest-
Silva 2011 [151]Chest wall movementsFiber optic RRChest (shirt)-
Yang 2011 [152]Chest wall movementsCapacitiveRespiration patternChest
Abdomen
-
Yoo 2010–2011 [153,154,155] Air temperature
Abdomen movements
Fiber optic Monitoring of breathingNose (below)
Abdomen
-
Ansari 2010 [157]Modulation cardiac activityECGRRArm
Forearm
-
De Jonckheere 2010 [158]Chest wall movementsFiber opticMonitoring of breathingChest
Abdomen
-
Mitchell 2010 [159]Chest wall movementsResistiveRespiration patternChest
Abdomen
-
Zhang 2010 [160]Chest wall movementsFrequency shiftRespiration signalChest-
Table A2. Analysis of sensing techniques, sensors, breathing parameters, and sensor location and size for studies of the environmental category published before 2018.
Study | Technique | Sensor | Measured Parameter | Location | Size
Azimi 2017 [183]Chest wall movementsPressure (piezoelectric)RRMat80 × 90 cm
Cho 2017 [184]Air temperatureCameraRR
Respiratory pattern
-7.2 × 2.6 × 1.8 cm
Leicht 2017 [185]Chest wall movementsInductiveRROthers (vehicle seatbelt)-
Li 2017 [186]Chest wall movementsFiber opticRRMat-
Li 2017 [187]Chest wall movementsCameraRRDistance from subject (1.4 m above)-
Prathosh 2017 [10]Chest wall movementsCameraRR
Respiratory pattern
Distance from subject (0.91 m away)-
Prochazka 2017 [188]Air temperatureCameraRRDistance from subject (in front of face)-
Tataraidze 2017 [7]Modulation cardiac activityRadarRR
Movement activity periods
Distance from subject-
Wang 2017 [189]Modulation cardiac activityRadarRRDistance from subject (5–55 cm away)-
Heldt 2016 [5]Modulation cardiac activityRadarRRDistance from subject (15–50 cm away)-
Kukkapalli 2016 [190]Modulation cardiac activityRadarRROthers (neck pendant)-
Prochazka 2016 [191]Chest wall movementsKinectRRDistance from subject (in front of face)-
Tveit 2016 [192]Chest wall movementsCameraRRDistance from subject (above)-
Ushijima 2016 [6]Chest wall movementsKinectRespiration detectionDistance from subject (0.8–4 m away)-
Erden 2015 [193]Chest wall movementsPyroelectric
Vibration
RRDistance from subject (20–100 cm above, pyroelectric)
Mat (vibration)
-
Huang 2015 [54]Modulation cardiac activityRadarRespiratory patternsDistance from subject (1 m away)-
Liu 2015 [194]Chest wall movementsPiezoelectricRR
Respiratory pattern
Mat250 × 125 cm
Pereira 2015 [195]Air temperatureCameraRRDistance from subject (1.5–2 m away)-
Ravichandran 2015 [196]Modulation cardiac activityWi-Fi transmitter and receiverRR
Falls
Distance from subject (0.9–4.3 m away)
Sasaki 2015 [197]Modulation cardiac activityRadarRRDistance from subject (1–2 m above, diagonally)-
Zakrzewski 2015 [198]Modulation cardiac activityRadarRRDistance from subject (1.5 m above)-
Arlotto 2014 [199]Modulation cardiac activityRadarBreathing monitoringDistance from subject (30 cm away)
Bernacchia 2014 [200]Chest wall movementsKinectRRDistance from subject (120 cm above)-
Bernal 2014 [51]Chest wall movementsCameraRespiratory pattern, peak inspiratory and expiratory flow, vital capacityDistance from subject (above)-
Chen 2014 [201]Chest wall movementsFiber opticRRMat25 × 20 cm
Lee 2014 [52]Modulation cardiac activityRadarRR
TV
Respiratory pattern
Distance from subject (80 cm away)-
Luis 2014 [202]Chest wall movementsCapacitiveRespiration signalMat (chest region)Different electrode sizes:
9 × 24 cm; 14 × 25 cm; 22 × 4 cm; 10 × 10 cm
Mukai 2014 [203]Chest wall movementsCapacitiveRRMat (chest region)478 × 478 × 3.5 mm
Nukaya 2014 [204]Chest wall movementsPiezoelectricBreathing monitoringOthers (below a neonatal bed)
Patwari 2014 [205]Modulation cardiac activityWi-Fi transmitter and receiverRRNodes-
Patwari 2014 [206]Modulation cardiac activityWi-Fi transmitter and receiverRR
Amplitude and phase
Nodes-
Shao 2014 [48]Chest wall movementsCameraRR
Exhalation flow rate
Distance from subject (0.5 m away)-
Taheri 2014 [207]Modulation cardiac activityRadarRRDistance from subject (beside bed)-
Wang 2014 [53]Chest wall movementsCameraRespiratory pattern--
Bartula 2013 [208]Chest wall movementsCameraRRDistance from subject (beside bed)-
Chen 2013 [209]Respiratory airflowFiber opticOxygen concentrationOthers (inside measurement instrument)-
Dziuda 2013 [210]Chest wall movementsFiber opticRRMat220 × 95 × 1.5 mm
Klap 2013 [211]Chest wall movementsPiezoelectricRRMat (chest region)-
Lau 2013 [19]Chest wall movementsFiber opticRR Respiratory patternMat-
Nijsure 2013 [50]Modulation cardiac activityRadarRespiratory monitoring
Respiratory pattern
Distance from subject (1–3 m away)-
Sprager 2013 [212]Chest wall movementsFiber opticRRMat
Vinci 2013 [213]Modulation cardiac activityRadarRRDistance from subject (1 m above)
Yavari 2013 [214]Modulation cardiac activityRadarRespiration detectionDistance from subject (1–1.5 m away)-
Aoki 2012 [215]Chest wall movementsKinectExhalation flow rateDistance from subject (1.2 m away, 1.1 m height)-
Boccanfuso 2012 [216]Air temperatureInfraredRRDistance from subject-
Bruser 2012 [217]Chest wall movementsOpticalRespiratory activityMat200 × 90 cm
Chen 2012 [218]Chest wall movementsFiber optic RR
Respiratory pattern
Mat-
Dziuda 2012 [9,236]Chest wall movementsFiber opticRRMat (pneumatic cushion)36 cm diameter, 7 cm height
Gu 2012 [219]Modulation cardiac activityRadarBreathing monitoringDistance from subject (50 cm above)5 × 5 cm
Lokavee 2012 [220]Chest wall movementsResistiveRRMat, pillow-
Shimomura 2012 [221]Modulation cardiac activityRadarRRDistance from subject (2 m above)-
Xia 2012 [222]Chest wall movementsKinectRespiration signalDistance from subject (above)-
Lai 2011 [223]Modulation cardiac activityRadarRR
Respiration amplitude
Distance from subject (1 m away)3 × 4 cm (antenna)
Otsu 2011 [224]Modulation cardiac activityRadarRRDistance from subject (0.8 m above)
Postolache 2011 [225]Modulation cardiac activityRadarRROthers (embedded in wheelchair)-
Zito 2011 [226]Modulation cardiac activityRadarBreathing monitoringDistance from subject (25–40 cm from chest)
Heise 2010 [227]Chest wall movementsResistiveRRMat130 × 7.6 cm
Min 2010 [228]Chest wall movementsUltrasonic (proximity)RRDistance from subject (1 m away)-
Mostov 2010 [229]Modulation cardiac activityRadarRRDistance from subject (2 m away)10 × 10 × 5 cm
Nishiyama 2010 [230,231]Chest wall movementsFiber opticRespiration monitoringMat (bed, chest region)-
Scalise 2010 [232]Modulation cardiac activityRadarRRDistance from subject (1.5 m away, perpendicular)-
Silvious 2010 [233]Modulation cardiac activityRadarRRNodes (transmitter: 11.3 m from subject;
receiver: 4.3 m from subject)
-
Tan 2010 [234]Chest wall movementsCameraRRDistance from subject (0.5–1 m away)-
Table A3. Analysis of transmission technology, processing station, and energy autonomy for studies in the wearable category published before 2018.
Study | Wireless Transmission | Wired Transmission | Processing Station | Battery Capacity | Battery Life (Type Battery)
Agcayazi 2017 [123]Bluetooth (low energy)----
Aileni 2017 [134]Bluetooth-PC, smartphone, tablet device--
Basra 2017 [145]-Serial communication
LCD integrated
Microcontroller, PC, mixed signal oscilloscope9 V, 500 mAh-
Bhattacharya 2017 [156]Wi-Fi-Smartphone, cloud storage--
Das 2017 [162]-Mono audio jackPC9 V, 500 mAh-
Fajkus 2017 [83]-InterrogatorPC--
Gorgutsa 2017 [84]Bluetooth (low energy)-PC, tablet device-(Rechargeable battery and 630 mV solar cell)
Guay 2017 [85]-GPIB interfacePC--
Kam 2017 [86,87,88]-USBPC-(5 V DC power bank)
Kano 2017 [89]Bluetooth--9 V, 500 mAh-
Koch 2017 [90]-----
Milici 2017 [91]Backscattered field-Cloud storage330 h>1 year
Nakazumi 2017 [92]-DAQ (Arduino)PC--
Park 2017 [93]-DAQPC--
Presti 2017 [94]-InterrogatorPC--
Valipour 2017 [95]Radio-frequency transceiver-PC--
White 2017 [96]Wi-Fi-PC--
Yan 2017 [97]-----
Mahbub 2016–2017 [98,99]Impulse radio ultra-wide band-Cloud storage600 mAh40 days
Chethana 2016 [100]-Interrogator---
Güder 2016 [101]Bluetooth-Tablet device, smartphone, cloud storage2600 mAh9 h
Lepine 2016 [102]Bluetooth (low energy)-Smartphone--
Massaroni 2016 [103]-InterrogatorPC--
Massaroni 2016 [49]-InterrogatorPC--
Moradian 2016 [104]Passive ultra-high-frequency RFID-Oscilloscope-(Self-powered passive sensor)
Nag 2016 [105]-----
Nam 2016 [106]-Data storagePC--
Raji 2016 [107]Radio-frequency-Cloud storage--
Ramos-García 2016 [108]-Serial communicationPC--
Rotariu 2016 [109]-USBTablet device--
Atalay 2015 [110]-----
Ciocchetti 2015 [111]-InterrogatorPC--
Estrada 2015 [112]-Data storagePC--
Gargiulo 2015 [113]-----
Grlica 2015 [114]-USBPC2000 mAh>25 days
Hernandez 2015 [115]-Data storagePC-6–9 h
Jiang 2015 [116]Bluetooth (low energy)-PC, smartphone-(Rechargeable battery, wireless charger)
Karlen 2015 [117]-Data storagePC--
Kazmi 2015 [118]-USBPC--
Metshein 2015 [21]-----
Teichmann 2015 [119]Bluetooth-Smartphone2.95 Wh2.23 h
Wei 2015 [120]-----
Yang 2015 [3]Bluetooth-PC--
Bifulco 2014 [121]-----
Fekr 2014 [122]Bluetooth (low energy)-PC, smartphone, tablet device, cloud storage--
Hesse 2014 [124]-Data storagePC--
Krehel 2014 [125]-InterrogatorPC--
Min 2014 [126]--PC--
Petrovic 2014 [127]-InterrogatorPC--
Sanchez 2014 [128]-SpectrometerPC--
Wo 2014 [129]-DAQPC--
Yang 2014 [130]Bluetooth (low energy)-Smartphone-24 h
Yoon 2014 [131]Bluetooth-PC-(Li-polymer battery)
Chan 2013 [132]Bluetooth (low energy)-Smartphone-(Coin battery)
Huang 2013 [133]-USBPC--
Kundu 2013 [161]-----
Padasdao 2013 [135]-----
Cao 2012 [2]Bluetooth-Smartphone1000 mAh>10 h
Chiu 2012 [136]--PC--
Favero 2012 [137]-Interrogator---
Mathew 2012 [138]-InterrogatorPC--
Scully 2012 [139]-Data storagePC--
Trobec 2012 [140]--PC--
Witt 2012 [141]-USBPC--
Zieba 2012 [142]--PC--
Carlos 2011 [143]Bluetooth-PC, cloud storage--
Ciobotariu 2011 [144]Wi-Fi, GSM-Tablet device-5 weeks
Guo 2011 [146]Bluetooth-PC--
Hoffmann 2011 [17]Bluetooth-PC590 mAh12 h
Liu 2011 [147]-Data Storage (SD card)PC370 mAh54 h
Liu 2011 [148]Radio-frequency (transceiver)-Base station--
Mann 2011 [149]--PC--
Ono 2011 [150]-Data storagePC--
Silva 2011 [151]-Interrogator---
Yang 2011 [152]Bluetooth-PC--
Yoo 2010–2011 [153,154,155] -DAQPC--
Ansari 2010 [157]-----
De Jonckheere 2010 [158]-USBPC--
Mitchell 2010 [159]Zigbee-PC--
Zhang 2010 [160]Bluetooth-PC--
Table A4. Analysis of the processing algorithm, performance evaluation, and software for the studies of the wearable category published before 2018.
Study | Algorithm | Performance Evaluation | Performance Value | Analysis Software
Agcayazi 2017 [123]-Graphical monitoring--
Aileni 2017 [134]----
Basra 2017 [145]Frequency analysisGraphical monitoring-MATLAB
Bhattacharya 2017 [156]Threshold detectionGraphical comparison-Blynk
Das 2017 [162]-Graphical monitoring-Audacity
Fajkus 2017 [83]Frequency analysisRelative error3.9%MATLAB
Gorgutsa 2017 [84]Received signal strength indicatorGraphical monitoring-Custom application
Guay 2017 [85]-Graphical monitoring-Labview
Kam 2017 [86,87,88]-Relative error<4.08%Labview
MATLAB
Kano 2017 [89]-Graphical monitoring--
Koch 2017 [90]Custom algorithmGraphical monitoring--
Milici 2017 [91]Peak detectionGraphical comparison--
Nakazumi 2017 [92]-Graphical monitoring--
Park 2017 [93]-Graphical monitoring-MATLAB
Presti 2017 [94]Max-min detectionBland-Altman analysis0.006–0.008 bpm (BA MOD)MATLAB
Valipour 2017 [95]-Root mean square error
Bland-Altman analysis
1.26 bpm (RMSE)
0.1 bpm (BA, MOD)
-
White 2017 [96]Frequency analysis--MATLAB
Yan 2017 [97]-Graphical monitoring--
Mahbub 2016–2017 [98,99]-Graphical monitoring--
Chethana 2016 [100]Frequency analysis---
Güder 2016 [101]-Graphical monitoring--
Lepine 2016 [102]Kalman filterAbsolute error2.11–5.98 bpmMATLAB
Massaroni 2016 [103]Max-min detectionBland-Altman analysis
Percentage error
<0.14 s (BA, MOD)
1.14% (PE)
-
Massaroni 2016 [49]Max-min detection
Custom algorithm
Relative error−1.59% (RE for RR)
14% (RE for TV)
MATLAB
Moradian 2016 [104]-Graphical monitoring--
Nag 2016 [105]-Graphical monitoring--
Nam 2016 [106]Frequency analysisMean relative error<1%MATLAB
Raji 2016 [107]Threshold detectionRoot mean square error1.7–2 bpm-
Ramos-García 2016 [108]Peak detection
Frequency analysis
Correlation factor0.41-
Rotariu 2016 [109]Peak detection--LabWindows/CVI
Atalay 2015 [110]Frequency analysis- -
Ciocchetti 2015 [111]Peak detection
Custom algorithm
Manual verification
Correlation factor
Percentage error
Bland-Altman analysis
0.87 (correlation)
8.3% (PE)
<0.002 s (BA, mean difference, time between peaks)
MATLAB
Estrada 2015 [112]Peak detection
Custom Algorithm
Correlation factor
Bland-Altman analysis
0.8–0.97 (correlation)
−0.01 bpm (BA, MOD)
MATLAB
Gargiulo 2015 [113]-Relative error<10%MATLAB
Grlica 2015 [114]-Graphical monitoring-C#
Hernandez 2015 [115]Max-min detection
Frequency analysis
Bland-Altman analysis
Mean absolute error
Root mean square error
0.15 bpm (BA, MOD)
0.38 bpm (MAE)
1.25 bpm (RMSE)
-
Jiang 2015 [116]Custom algorithmRespiration simulation--
Karlen 2015 [117]Custom algorithmBland-Altman analysis
Root mean square error
6.01 bpm (RMSE)MATLAB
Kazmi 2015 [118]-Graphical comparison-Easy pulse analyzer
Cool term software
Kuvios HRV
Metshein 2015 [21]-Graphical comparison--
Teichmann 2015 [119]Frequency analysis (Frequency modulation)
Custom algorithm
Graphical comparison-Microcontroller
Wei 2015 [120]----
Yang 2015 [3]Manual verificationGraphical comparison-Labview
Bifulco 2014 [121]----
Fekr 2014 [122]Numeric integration algorithmCorrelation factor
Relative error
0.85 (correlation)
0.2% (RE)
-
Hesse 2014 [124]Peak detectionMean absolute error0.32 bpmMicrocontroller
Krehel 2014 [125]-Correlation factor
Bland-Altman analysis
±3 bpm (BA)MATLAB
Min 2014 [126]Peak detection
Custom Algorithm
Correlation factor
Bland-Altman analysis
0.98 (correlation)
0.0015 bpm (BA, MOD)
MATLAB
Petrovic 2014 [127]-Mean relative error
Bland-Altman analysis
8.7% (RE-MV-)
10.5% (RE-TV-)
−1 (BA, MOD)
MATLAB
Labview
Sanchez 2014 [128]----
Wo 2014 [129]Frequency analysis--Labview
Yang 2014 [130]-Graphical monitoring--
Yoon 2014 [131]Kalman filterRelative error7.3%MATLAB
Labview
Chan 2013 [132]-Absolute error<2 bpm-
Huang 2013 [133]-Accuracy98.8%-
Kundu 2013 [161]-Accuracy
Coefficient of determination
100% (acc)
0.906 (r2)
-
Padasdao 2013 [135]Frequency analysis (Fast Fourier Transform)Bland-Altman analysis
Mean absolute error
0.23–0.48 bpm (BA, MOD)
0.00027 bpm (MAE)
-
Cao 2012 [2]Peak detectionGraphical comparison-Labview
Chiu 2012 [136]Frequency analysisPaired t-testNo statistical difference with referenceMATLAB
Favero 2012 [137]----
Mathew 2012 [138]Zero-crossing detectionGraphical monitoring-Labview
Scully 2012 [139]Frequency analysis (Frequency modulation)Graphical comparison-MATLAB
Trobec 2012 [140]ECG-derived algorithm--MATLAB
Witt 2012 [141]-Graphical comparison--
Zieba 2012 [142]Manual verification--Labview
Carlos 2011 [143]-Graphical monitoring-Custom application
Ciobotariu 2011 [144]Max-min detection
Custom algorithm
Graphical comparison-C#
Guo 2011 [146]Peak detectionSimulation
Graphical comparison
-Labview
Hoffmann 2011 [17]Custom algorithmCorrelation factor
Relative error
0.92 (correlation)-
Liu 2011 [147]Empirical Mode DecompositionMean percentage error
Root mean square error
6.1%, 14.6% (MPE)
4.1 bpm, 9.8 bpm (RMSE)
-
Liu 2011 [148]Principal Component Analysis
Frequency analysis
Relative error10%-
Mann 2011 [149]Threshold detectionCorrelation factor0.97-
Ono 2011 [150]Custom algorithmDisplacement comparison-Objective-C
Silva 2011 [151]Frequency analysisGraphical comparison--
Yang 2011 [152]----
Yoo 2010–2011 [153,154,155] -Graphical monitoring-Labview
Ansari 2010 [157]Frequency analysis---
De Jonckheere 2010 [158]-Graphical comparison
Bland-Altman analysis
--
Mitchell 2010 [159]Manual verificationGraphical comparison-Physput
Zhang 2010 [160]-Biofeedback (audiovisual feedback signal)-MATLAB
Table A5. Analysis of the processing algorithm, performance evaluation, and software for the studies of the environmental category published before 2018.
Study | Algorithm | Performance Evaluation | Performance Value | Analysis Software
Azimi 2017 [183]Peak detection
Custom algorithm
Linear regression0.968 and 1.0223 (slope)-
Cho 2017 [184]Custom algorithm---
Leicht 2017 [185]-Graphical comparison-MATLAB
Li 2017 [186]Frequency analysisGraphical monitoring-MATLAB
Li 2017 [187]Custom algorithmRoot mean square error1.12 bpmOpenCV
Prathosh 2017 [10]Custom algorithmCorrelation factor
Bland-Altman analysis
0.94 (correlation)
0.88 bpm (BA, MOD)
-
Prochazka 2017 [188]Neural Network---
Tataraidze 2017 [7]Peak detection
Custom algorithm
Accuracy97%MATLAB
Wang 2017 [189]Frequency analysis (Short-Time Fourier Transform)Absolute error0.11–0.33 bpm-
Heldt 2016 [5]Threshold detectionMean absolute error1.2 bpmAcknowledge
Kukkapalli 2016 [190]Peak detectionAccuracy>95%
Prochazka 2016 [191]Custom algorithmRelative error0.06–0.26%-
Tveit 2016 [192]Phase-based respiration detection
Pixel-intensity detection
Manual verification
Relative error
7.21–11.57%-
Ushijima 2016 [6]-Graphical comparison--
Erden 2015 [193]Wavelet transform
Empirical mode decomposition
Accuracy93%-
Huang 2015 [54]Peak detection
Threshold detection
---
Liu 2015 [194]Peak detection
Threshold detection
Relative error1.8–5.7%-
Pereira 2015 [195]Custom algorithmCorrelation factor
Mean Absolute Error
Bland-Altman analysis
0.92 (correlation)
0.53 bpm (MAE)
0.025 bpm (BA, MOD)
MATLAB
Ravichandran 2015 [196]Zero-crossing detection
Frequency analysis
Linear predictive coding
Least-squares harmonic analysis
Absolute error2.16 bpm-
Sasaki 2015 [197]-Relative error3%-
Zakrzewski 2015 [198]Linear/ non-Linear demodulationMean squared error-MATLAB
Arlotto 2014 [199]-Graphical comparison--
Bernacchia 2014 [200]Custom algorithmCorrelation factor
Bland-Altman analysis
0.96 (correlation)
0 bpm (BA, MOD)
MATLAB
Bernal 2014 [51]Peak detection
Zero-crossing detection
Custom algorithm
Graphical comparison--
Chen 2014 [201]Peak detectionAbsolute error
Relative error
1.65 bpm (AE)
9.9% (RE)
Labview
Lee 2014 [52]Frequency analysisCorrelation factor
Root mean square error
0.90–0.976 (correlation)
0.0038–0.076 bpm (RMSE)
MATLAB
Luis 2014 [202]Custom algorithm--MATLAB
Mukai 2014 [203]Frequency analysis---
Nukaya 2014 [204]-Scatterplot--
Patwari 2014 [205]Frequency analysisRelative error1 bpm-
Patwari 2014 [206]Custom algorithmRelative error0.1 to 0.4 bpm-
Shao 2014 [48]Custom algorithmCorrelation factor
Bland-Altman analysis
Root mean square error
0.93 (correlation)
0.02 bpm (BA, MOD)
1.2 bpm (RMSE)
MATLAB
Taheri 2014 [207]Custom algorithmMean absolute error
Accuracy
0.93–1.77 bpm (MAE)
79–89% (Acc)
-
Wang 2014 [53]Threshold detectionOthers (confusion matrix)94%-
Bartula 2013 [208]Custom algorithmCorrelation factor0.98-
Chen 2013 [209]Frequency analysisLinear regression
Bland-Altman analysis
0.999 (r2)-
Dziuda 2013 [210]Max detectionRelative error
Bland-Altman analysis
<8% (RE)
0 bpm (BA, MOD)
Custom application
Klap 2013 [211]Proprietary algorithmsRelative error4–8% (RE)-
Lau 2013 [19]Peak detectionCorrelation factor
Mean absolute error
0.971 (correlation)
2 bpm (MAE)
Labview
Nijsure 2013 [50]Custom algorithmCorrelation factor0.814-
Sprager 2013 [212]Wavelet transformRelative error7.37 ± 7.20%MATLAB
Vinci 2013 [213]Frequency analysisGraphical comparison-MATLAB
Yavari 2013 [214]-Graphical comparison--
Aoki 2012 [215]Custom algorithmCorrelation factor
Bland-Altman plot
0.98 (correlation)Kinect SDK
Boccanfuso 2012 [216]Sinusoidal curve-fitting functionAccuracy-OpenCV
Bruser 2012 [217]----
Chen 2012 [218]Frequency analysisMean absolute error2 bpmLabview
Dziuda 2012 [9,236]Max-min detectionRelative error12%C#
Gu 2012 [219]-Graphical comparison-Labview
Lokavee 2012 [220]-Graphical comparison-Labview
Shimomura 2012 [221]Frequency analysisRelative error1.61%-
Xia 2012 [222]-Correlation factor0.958–0.978-
Lai 2011 [223]Multipeak detectionCorrelation factor0.5–0.83MATLAB
Labview
Otsu 2011 [224]Custom algorithmAbsolute error0.19 bpm-
Postolache 2011 [225]Peak detection--Labview
Android app
Zito 2011 [226]-Graphical comparison--
Heise 2010 [227]Zero-crossing detection---
Min 2010 [228]Envelop detection
Zero-crossing detection
Correlation factor
Bland-Altman analysis
0.93–0.98 (correlation)
−0.002–0.006 bpm (BA, MOD)
MATLAB
Mostov 2010 [229]Custom algorithm---
Nishiyama 2010 [230,231]-Graphical monitoring--
Scalise 2010 [232]Wavelet transformCorrelation factor
Bland-Altman analysis
0.98 (correlation)
13 ms (BAP, MOD)
-
Silvious 2010 [233]-Graphical comparison--
Tan 2010 [234]Custom algorithmGraphical comparison-Custom application

References

  1. Milenković, A.; Otto, C.; Jovanov, E. Wireless sensor networks for personal health monitoring: Issues and an implementation. Comput. Commun. 2006, 29, 2521–2533. [Google Scholar] [CrossRef]
  2. Cao, Z.; Zhu, R.; Que, R.-Y. A wireless portable system with microsensors for monitoring respiratory diseases. IEEE Trans. Biomed. Eng. 2012, 59, 3110–3116. [Google Scholar]
  3. Yang, X.; Chen, Z.; Elvin, C.S.M.; Janice, L.H.Y.; Ng, S.H.; Teo, J.T.; Wu, R. Textile Fiber Optic Microbend Sensor Used for Heartbeat and Respiration Monitoring. IEEE Sens. J. 2015, 15, 757–761. [Google Scholar] [CrossRef]
  4. Subbe, C.P.; Kinsella, S. Continuous monitoring of respiratory rate in emergency admissions: Evaluation of the RespiraSenseTM sensor in acute care compared to the industry standard and gold standard. Sensors 2018, 18, 2700. [Google Scholar] [CrossRef] [Green Version]
  5. Heldt, G.P.; Ward, R.J., III. Evaluation of Ultrasound-Based Sensor to Monitor Respiratory and Nonrespiratory Movement and Timing in Infants. IEEE Trans. Biomed. Eng. 2016, 63, 619–629. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  6. Ushijima, T.; Satake, J. Development of a Breathing Detection Robot for a Monitoring System. In Proceedings of the Joint 8th International Conference on Soft Computing and Intelligent Systems (SCIS) and 17th International Symposium on Advanced Intelligent Systems (ISIS), Sapporo, Japan, 25–28 August 2016; pp. 790–795. [Google Scholar]
  7. Tataraidze, A.; Anishchenko, L.; Korostovtseva, L.; Bochkarev, M.; Sviryaev, Y.; Alborova, I. Detection of movement activity and breathing cycles on bioradiolocation signals. In Proceedings of the IEEE International Conference on Microwaves, Antennas, Communications and Electronic Systems (COMCAS), Tel Aviv, Israel, 13–15 November 2017; pp. 1–4. [Google Scholar]
  8. Gatti, U.C.; Schneider, S.; Migliaccio, G.C. Physiological condition monitoring of construction workers. Autom. Constr. 2014, 44, 227–233. [Google Scholar] [CrossRef]
  9. Dziuda, L.; Skibniewski, F.W.; Krej, M.; Lewandowski, J. Monitoring Respiration and Cardiac Activity Using Fiber Bragg Grating-Based Sensor. IEEE Trans. Biomed. Eng. 2012, 59, 1934–1942. [Google Scholar] [CrossRef] [PubMed]
  10. Prathosh, A.P.; Praveena, P.; Mestha, L.K.; Bharadwaj, S. Estimation of Respiratory Pattern From Video Using Selective Ensemble Aggregation. IEEE Trans. Signal Process. 2017, 65, 2902–2916. [Google Scholar] [CrossRef] [Green Version]
  11. Liu, X.; Wang, Q.; Liu, D.; Wang, Y.; Zhang, Y.; Bai, O.; Sun, J. Human emotion classification based on multiple physiological signals by wearable system. Technol. Health Care 2018, 26, 459–469. [Google Scholar] [CrossRef] [Green Version]
  12. Homma, I.; Masaoka, Y. Breathing rhythms and emotions. Exp. Physiol. 2008, 93, 1011–1021. [Google Scholar] [CrossRef] [Green Version]
  13. Ćosić, D. Neuromarketing in market research. Interdiscip. Descr. Complex Syst. INDECS 2016, 14, 139–147. [Google Scholar] [CrossRef] [Green Version]
  14. Kowalczuk, Z.; Czubenko Michałand Merta, T. Emotion monitoring system for drivers. IFAC-PapersOnLine 2019, 52, 200–205. [Google Scholar] [CrossRef]
  15. Granato, M. Emotions Recognition in Video Game Players Using Physiological Information. Ph.D. Thesis, Università degli studi di Milano, Milano, Italy, 2019. [Google Scholar]
  16. Kołakowska, A.; Landowska, A.; Szwoch, M.; Szwoch, W.; Wrobel, M.R. Emotion recognition and its applications. In Human-Computer Systems Interaction: Backgrounds and Applications 3; Springer: Berlin/Heidelberg, Germany, 2014; pp. 51–62. [Google Scholar]
  17. Hoffmann, T.; Eilebrecht, B.; Leonhardt, S. Respiratory Monitoring System on the Basis of Capacitive Textile Force Sensors. IEEE Sens. J. 2011, 11, 1112–1119. [Google Scholar] [CrossRef]
  18. Sperlich, B.; Aminian, K.; Düking, P.; Holmberg, H.-C. Wearable Sensor Technology for Monitoring Training Load and Health in the Athletic Population. Front. Physiol. 2019, 10. [Google Scholar] [CrossRef]
  19. Lau, D.; Chen, Z.; Teo, J.T.; Ng, S.H.; Rumpel, H.; Lian, Y.; Yang, H.; Kei, P.L. Intensity-Modulated Microbend Fiber Optic Sensor for Respiratory Monitoring and Gating During MRI. IEEE Trans. Biomed. Eng. 2013, 60, 2655–2662. [Google Scholar] [CrossRef]
  20. Krebber, K.; Lenke, P.; Liehr, S.; Schukar, M.; Wendt, M.; Witt, J.; Wosniok, A. Technology and applications of smart technical textiles based on fiber optic sensors. Adv. Photonics Renew. Energy 2008. [Google Scholar] [CrossRef]
  21. Metshein, M. A device for measuring the electrical bioimpedance with variety of electrode placements for monitoring the breathing and heart rate. In Proceedings of the 26th Irish Signals and Systems Conference (ISSC), Carlow, Ireland, 24–25 June 2015; pp. 1–4. [Google Scholar]
  22. Friedl, K.E. Military applications of soldier physiological monitoring. J. Sci. Med. Sport 2018, 21, 1147–1153. [Google Scholar] [CrossRef] [Green Version]
  23. Scalise, L.; Mariani Primiani, V.; Russo, P.; De Leo, A.; Shahu, D.; Cerri, G. Wireless sensing for the respiratory activity of human beings: Measurements and wide-band numerical analysis. Int. J. Antennas Propag. 2013, 2013, 396459. [Google Scholar] [CrossRef] [Green Version]
  24. Massaroni, C.; Nicolò, A.; Lo Presti, D.; Sacchetti, M.; Silvestri, S.; Schena, E. Contact-based methods for measuring respiratory rate. Sensors 2019, 19, 908. [Google Scholar] [CrossRef] [Green Version]
  25. Mukhopadhyay, S.C. Wearable Sensors for Human Activity Monitoring: A Review. IEEE Sens. J. 2015, 15, 1321–1330. [Google Scholar] [CrossRef]
  26. Nag, A.; Mukhopadhyay, S.C.; Kosel, J. Wearable Flexible Sensors: A Review. IEEE Sens. J. 2017, 17, 3949–3960. [Google Scholar] [CrossRef] [Green Version]
  27. Chung, M.; Fortunato, G.; Radacsi, N. Wearable flexible sweat sensors for healthcare monitoring: A review. J. R. Soc. Interface 2019, 16, 20190217. [Google Scholar] [CrossRef]
  28. Bandodkar, A.J.; Jeang, W.J.; Ghaffari, R.; Rogers, J.A. Wearable sensors for biochemical sweat analysis. Annu. Rev. Anal. Chem. 2019, 12, 1–22. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  29. López-Nava, I.H.; Muñoz-Meléndez, A. Wearable Inertial Sensors for Human Motion Analysis: A Review. IEEE Sens. J. 2016, 16, 7821–7834. [Google Scholar] [CrossRef]
  30. Seshadri, D.R.; Li, R.T.; Voos, J.E.; Rowbottom, J.R.; Alfes, C.M.; Zorman, C.A.; Drummond, C.K. Wearable sensors for monitoring the internal and external workload of the athlete. NPJ Digit. Med. 2019, 2, 71. [Google Scholar] [CrossRef] [PubMed]
  31. Aroganam, G.; Manivannan, N.; Harrison, D. Review on Wearable Technology Sensors Used in Consumer Sport Applications. Sensors 2019, 19, 1983. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  32. Al-Eidan, R.M.; Al-Khalifa, H.; Al-Salman, A.M. A Review of Wrist-Worn Wearable: Sensors, Models, and Challenges. J. Sens. 2018, 2018, 5853917. [Google Scholar] [CrossRef] [Green Version]
  33. Baig, M.M.; Afifi, S.; GholamHosseini, H.; Mirza, F. A Systematic Review of Wearable Sensors and IoT-Based Monitoring Applications for Older Adults—A Focus on Ageing Population and Independent Living. J. Med. Syst. 2019, 43, 233. [Google Scholar] [CrossRef] [PubMed]
  34. Heikenfeld, J.; Jajack, A.; Rogers, J.; Gutruf, P.; Tian, L.; Pan, T.; Li, R.; Khine, M.; Kim, J.; Wang, J.; et al. Wearable sensors: Modalities, challenges, and prospects. Lab Chip 2018, 18, 217–248. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  35. Witte, A.-K.; Zarnekow, R. Transforming Personal Healthcare through Technology—A Systematic Literature Review of Wearable Sensors for Medical Application. In Proceedings of the 52nd Hawaii International Conference on System Sciences, Grand Wailea, Maui, 8–11 January 2019; pp. 3848–3857. [Google Scholar]
  36. Pantelopoulos, A.; Bourbakis, N.G. A Survey on Wearable Sensor-Based Systems for Health Monitoring and Prognosis. Healthc. Inform. Res. 2010, 40, 1–12. [Google Scholar] [CrossRef] [Green Version]
  37. Liang, T.; Yuan, Y.J. Wearable Medical Monitoring Systems Based on Wireless Networks: A Review. IEEE Sens. J. 2016, 16, 8186–8199. [Google Scholar] [CrossRef]
  38. Charlton, P.H.; Birrenkott, D.A.; Bonnici, T.; Pimentel, M.A.F.; Johnson, A.E.W.; Alastruey, J.; Tarassenko, L.; Watkinson, P.J.; Beale, R.; Clifton, D.A. Breathing Rate Estimation From the Electrocardiogram and Photoplethysmogram: A Review. IEEE Sens. J. 2018, 11, 2–20. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  39. AL-Khalidi, F.Q.; Saatchi, R.; Burke, D.; Elphick, H.; Tan, S. Respiration rate monitoring methods: A review. Pediatr. Pulmonol. 2011, 46, 523–529. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  40. Van Loon, K.; Zaane, B.; Bosch, E.J.; Kalkman, C.; Peelen, L.M. Non-Invasive Continuous Respiratory Monitoring on General Hospital Wards: A Systematic Review. PLoS ONE 2015, 10, e0144626. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  41. Rajala, S.; Lekkala, J. Film-Type Sensor Materials PVDF and EMFi in Measurement of Cardiorespiratory Signals—A Review. IEEE Sens. J. 2012, 12, 439–446. [Google Scholar] [CrossRef]
  42. Massaroni, C.; Zaltieri, M.; Lo Presti, D.; Nicolò, A.; Tosi, D.; Schena, E. Fiber Bragg Grating Sensors for Cardiorespiratory Monitoring: A Review. IEEE Sens. J. 2020, 1. [Google Scholar] [CrossRef]
  43. IEEE Xplore Search Results Page. Available online: https://ieeexplore.ieee.org/Xplorehelp/searching-ieee-xplore/search-results-page (accessed on 8 September 2020).
  44. Google Scholar About. Available online: https://scholar.google.com/intl/es/scholar/about.html (accessed on 8 September 2020).
  45. PRISMA. Transparent Reporting of Systematic Reviews and Meta-Analyses. Available online: http://prisma-statement.org/ (accessed on 14 September 2020).
  46. Igual, R.; Medrano, C.; Plaza, I. Challenges, issues and trends in fall detection systems. Biomed. Eng. Online 2013, 12. [Google Scholar] [CrossRef] [Green Version]
  47. Warner, M.A.; Patel, B. Mechanical ventilation. In Benumof and Hagberg’s Airway Management; Elsevier: Amsterdam, The Netherlands, 2013; pp. 981–997. [Google Scholar]
  48. Shao, D.; Yang, Y.; Liu, C.; Tsow, F.; Yu, H.; Tao, N. Noncontact Monitoring Breathing Pattern, Exhalation Flow Rate and Pulse Transit Time. IEEE Trans. Biomed. Eng. 2014, 61, 2760–2767. [Google Scholar] [CrossRef]
  49. Massaroni, C.; Saccomandi, P.; Formica, D.; Lo Presti, D.; Caponero, M.A.; Di Tomaso, G.; Giurazza, F.; Muto, M.; Schena, E. Design and Feasibility Assessment of a Magnetic Resonance-Compatible Smart Textile Based on Fiber Bragg Grating Sensors for Respiratory Monitoring. IEEE Sens. J. 2016, 16, 8103–8110. [Google Scholar] [CrossRef]
  50. Nijsure, Y.; Tay, W.P.; Gunawan, E.; Wen, F.; Yang, Z.; Guan, Y.L.; Chua, A.P. An Impulse Radio Ultrawideband System for Contactless Noninvasive Respiratory Monitoring. IEEE Trans. Biomed. Eng. 2013, 60, 1509–1517. [Google Scholar] [CrossRef]
  51. Bernal, E.A.; Mestha, L.K.; Shilla, E. Non contact monitoring of respiratory function via depth sensing. In Proceedings of the IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI), Valencia, Spain, 1–4 June 2014; pp. 101–104. [Google Scholar]
  52. Lee, Y.S.; Pathirana, P.N.; Steinfort, C.L.; Caelli, T. Monitoring and Analysis of Respiratory Patterns Using Microwave Doppler Radar. IEEE J. Transl. Eng. Health Med. 2014, 2, 1–12. [Google Scholar] [CrossRef] [PubMed]
  53. Wang, C.; Hunter, A.; Gravill, N.; Matusiewicz, S. Unconstrained Video Monitoring of Breathing Behavior and Application to Diagnosis of Sleep Apnea. IEEE Trans. Biomed. Eng. 2014, 61, 396–404. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  54. Huang, X.; Sun, L.; Tian, T.; Huang, Z.; Clancy, E. Real-time non-contact infant respiratory monitoring using UWB radar. In Proceedings of the 16th IEEE International Conference on Communication Technology, Hangzhou, China, 18–21 October 2015; pp. 493–496. [Google Scholar]
  55. Igual, R.; Plaza, I.; Medrano, C.; Rubio, M.A. Personalizable smartphone-based system adapted to assist dependent people. J. Ambient Intell. Smart Environ. 2014, 6, 569–593. [Google Scholar] [CrossRef]
  56. Igual, R.; Plaza, I.; Martín, L.; Corbalan, M.; Medrano, C. Guidelines to Design Smartphone Applications for People with Intellectual Disability: A Practical Experience. In Ambient Intelligence—Software and Applications; Advances in Intelligent Systems and Computing; Springer: Heidelberg, Germany, 2013; Volume 219, ISBN 9783319005652. [Google Scholar]
  57. Aitkulov, A.; Tosi, D. Optical fiber sensor based on plastic optical fiber and smartphone for measurement of the breathing rate. IEEE Sens. J. 2019, 19, 3282–3287. [Google Scholar] [CrossRef]
  58. Aitkulov, A.; Tosi, D. Design of an all-POF-fiber smartphone multichannel breathing sensor with camera-division multiplexing. IEEE Sens. Lett. 2019, 3, 1–4. [Google Scholar] [CrossRef]
  59. Balasubramaniyam, H.; Vignesh, M.S.; Abhirami, A.; Abanah, A. Design and Development of a IoT based Flexible and Wearable T-Shirt for Monitoring Breathing Rate. In Proceedings of the 3rd International Conference on Computing Methodologies and Communication (ICCMC), Erode, India, 27–29 March 2019; pp. 376–379. [Google Scholar]
  60. Bricout, A.; Fontecave-Jallon, J.; Colas, D.; Gerard, G.; Pépin, J.-L.; Guméry, P.-Y. Adaptive Accelerometry Derived Respiration: Comparison with Respiratory Inductance Plethysmography during Sleep. In Proceedings of the 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 6714–6717. [Google Scholar]
  61. Chu, M.; Nguyen, T.; Pandey, V.; Zhou, Y.; Pham, H.N.; Bar-Yoseph, R.; Radom-Aizik, S.; Jain, R.; Cooper, D.M.; Khine, M. Respiration rate and volume measurements using wearable strain sensors. NPJ Digit. Med. 2019, 2, 1–9. [Google Scholar] [CrossRef]
  62. Elfaramawy, T.; Fall, C.L.; Arab, S.; Morissette, M.; Lellouche, F.; Gosselin, B. A Wireless Respiratory Monitoring System Using a Wearable Patch Sensor Network. IEEE Sens. J. 2019, 19, 650–657. [Google Scholar] [CrossRef]
  63. Fajkus, M.; Nedoma, J.; Martinek, R.; Brablik, J.; Vanus, J.; Novak, M.; Zabka, S.; Vasinek, V.; Hanzlikova, P.; Vojtisek, L. MR fully compatible and safe FBG breathing sensor: A practical solution for respiratory triggering. IEEE Access 2019, 7, 123013–123025. [Google Scholar] [CrossRef]
  64. Hurtado, D.E.; Abusleme, A.; Chávez, J.A.P. Non-invasive continuous respiratory monitoring using temperature-based sensors. J. Clin. Monit. Comput. 2019, 1–9. [Google Scholar] [CrossRef]
  65. Jayarathna, T.; Gargiulo, G.D.; Breen, P.P. Polymer sensor embedded, IOT enabled t-shirt for long-term monitoring of sleep disordered breathing. In Proceedings of the IEEE 5th World Forum on Internet of Things (WF-IoT), Limerick, Ireland, 15–18 April 2019; pp. 139–143. [Google Scholar]
  66. Kano, S.; Yamamoto, A.; Ishikawa, A.; Fujii, M. Respiratory rate on exercise measured by nanoparticle-based humidity sensor. In Proceedings of the 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 3567–3570. [Google Scholar]
  67. Karacocuk, G.; Höflinger, F.; Zhang, R.; Reindl, L.M.; Laufer, B.; Möller, K.; Röell, M.; Zdzieblik, D. Inertial sensor-based respiration analysis. IEEE Trans. Instrum. Meas. 2019, 68, 4268–4275. [Google Scholar] [CrossRef]
  68. Massaroni, C.; Nicolò, A.; Girardi, M.; La Camera, A.; Schena, E.; Sacchetti, M.; Silvestri, S.; Taffoni, F. Validation of a wearable device and an algorithm for respiratory monitoring during exercise. IEEE Sens. J. 2019, 19, 4652–4659. [Google Scholar] [CrossRef]
  69. Massaroni, C.; Di Tocco, J.; Presti, D.L.; Longo, U.G.; Miccinilli, S.; Sterzi, S.; Formica, D.; Saccomandi, P.; Schena, E. Smart textile based on piezoresistive sensing elements for respiratory monitoring. IEEE Sens. J. 2019, 19, 7718–7725. [Google Scholar] [CrossRef]
  70. Nguyen, T.-V.; Ichiki, M. MEMS-Based Sensor for Simultaneous Measurement of Pulse Wave and Respiration Rate. Sensors 2019, 19, 4942. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  71. Presti, D.L.; Massaroni, C.; Di Tocco, J.; Schena, E.; Carnevale, A.; Longo, U.G.; D’Abbraccio, J.; Massari, L.; Oddo, C.M.; Caponero, M.A. Single-plane neck movements and respiratory frequency monitoring: A smart system for computer workers. In Proceedings of the 2019 II Workshop on Metrology for Industry 4.0 and IoT (MetroInd4. 0&IoT), Naples, Italy, 4–6 June 2019; pp. 167–170. [Google Scholar]
  72. Presti, D.L.; Massaroni, C.; D’Abbraccio, J.; Massari, L.; Caponero, M.; Longo, U.G.; Formica, D.; Oddo, C.M.; Schena, E. Wearable system based on flexible FBG for respiratory and cardiac monitoring. IEEE Sens. J. 2019, 19, 7391–7398. [Google Scholar] [CrossRef]
  73. Puranik, K.A.; Kanthi, M. Wearable Device for Yogic Breathing. In Proceedings of the 2019 Amity International Conference on Artificial Intelligence (AICAI), Dubai, UAE, 4–6 February 2019; pp. 605–610. [Google Scholar]
  74. Soomro, A.M.; Jabbar, F.; Ali, M.; Lee, J.-W.; Mun, S.W.; Choi, K.H. All-range flexible and biocompatible humidity sensor based on poly lactic glycolic acid (PLGA) and its application in human breathing for wearable health monitoring. J. Mater. Sci. Mater. Electron. 2019, 30, 9455–9465. [Google Scholar] [CrossRef]
  75. Xiao, S.; Nie, J.; Tan, R.; Duan, X.; Ma, J.; Li, Q.; Wang, T. Fast-response ionogel humidity sensor for real-time monitoring of breathing rate. Mater. Chem. Front. 2019, 3, 484–491. [Google Scholar] [CrossRef]
  76. Yuasa, Y.; Suzuki, K. Wearable device for monitoring respiratory phases based on breathing sound and chest movement. Adv. Biomed. Eng. 2019, 8, 85–91. [Google Scholar] [CrossRef]
  77. Zhang, H.; Zhang, J.; Hu, Z.; Quan, L.; Shi, L.; Chen, J.; Xuan, W.; Zhang, Z.; Dong, S.; Luo, J. Waist-wearable wireless respiration sensor based on triboelectric effect. Nano Energy 2019, 59, 75–83. [Google Scholar] [CrossRef]
  78. Dan, G.; Zhao, J.; Chen, Z.; Yang, H.; Zhu, Z. A Novel Signal Acquisition System for Wearable Respiratory Monitoring. IEEE Access 2018, 6, 34365–34371. [Google Scholar] [CrossRef]
  79. Koyama, Y.; Nishiyama, M.; Watanabe, K. Smart Textile Using Hetero-Core Optical Fiber for Heartbeat and Respiration Monitoring. IEEE Sens. J. 2018, 18, 6175–6180. [Google Scholar] [CrossRef]
  80. Malik, S.; Ahmad, M.; Punjiya, M.; Sadeqi, A.; Baghini, M.S.; Sonkusale, S. Respiration Monitoring Using a Flexible Paper-Based Capacitive Sensor. In Proceedings of the 2018 IEEE Sensors, New Delhi, India, 28–31 October 2018; pp. 1–4. [Google Scholar]
  81. Martin, A.; Voix, J. In-Ear Audio Wearable: Measurement of Heart and Breathing Rates for Health and Safety Monitoring. IEEE Trans. Biomed. Eng. 2018, 65, 1256–1263. [Google Scholar] [CrossRef] [PubMed]
  82. Pang, Y.; Jian, J.; Tu, T.; Yang, Z.; Ling, J.; Li, Y.; Wang, X.; Qiao, Y.; Tian, H.; Yang, Y.; et al. Wearable humidity sensor based on porous graphene network for respiration monitoring. Biosens. Bioelectron. 2018, 116, 123–129. [Google Scholar] [CrossRef] [PubMed]
  83. Fajkus, M.; Nedoma, J.; Martinek, R.; Vasinek, V.; Nazeran, H.; Siska, P. A non-invasive multichannel hybrid fiber-optic sensor system for vital sign monitoring. Sensors 2017, 17, 111. [Google Scholar] [CrossRef] [PubMed]
  84. Gorgutsa, S.; Bellemare-Rousseau, S.; Guay, P.; Miled, A.; Messaddeq, Y. Smart T-shirt with wireless respiration sensor. In Proceedings of the 2017 IEEE Sensors, Glasgow, UK, 29 October–1 November 2017; pp. 1–3. [Google Scholar]
  85. Guay, P.; Gorgutsa, S.; LaRochelle, S.; Messaddeq, Y. Wearable contactless respiration sensor based on multi-material fibers integrated into textile. Sensors 2017, 17, 1050. [Google Scholar] [CrossRef] [PubMed]
  86. Kam, W.; Mohammed, W.S.; Leen, G.; O’Sullivan, K.; O’Keeffe, M.; O’Keeffe, S.; Lewis, E. All plastic optical fiber-based respiration monitoring sensor. In Proceedings of the 2017 IEEE Sensors, Glasgow, UK, 29 October–1 November 2017; pp. 1–3. [Google Scholar]
  87. Kam, W.; Mohammed, W.S.; Leen, G.; O’Keeffe, M.; O’Sullivan, K.; O’Keeffe, S.; Lewis, E. Compact and Low-Cost Optical Fiber Respiratory Monitoring Sensor Based on Intensity Interrogation. J. Light. Technol. 2017, 35, 4567–4573. [Google Scholar] [CrossRef] [Green Version]
  88. Kam, W.; Mohammed, W.S.; O’Keeffe, S.; Lewis, E. Portable 3-D Printed Plastic Optical Fibre Motion Sensor for Monitoring of Breathing Pattern and Respiratory Rate. In Proceedings of the IEEE 5th World Forum on Internet of Things (WF-IoT), Limerick, Ireland, 15–18 April 2019; pp. 144–148. [Google Scholar]
  89. Kano, S.; Fujii, M. Battery-powered wearable respiration sensor chip with nanocrystal thin film. In Proceedings of the 2017 IEEE Sensors, Glasgow, UK, 29 October–1 November 2017; pp. 1–3. [Google Scholar]
  90. Koch, E.; Dietzel, A. Stretchable sensor array for respiratory monitoring. In Proceedings of the 19th International Conference on Solid-State Sensors, Actuators and Microsystems (TRANSDUCERS), Kaohsiung, Taiwan, 18–22 June 2017; pp. 2227–2230. [Google Scholar]
  91. Milici, S.; Lorenzo, J.; Lázaro, A.; Villarino, R.; Girbau, D. Wireless Breathing Sensor Based on Wearable Modulated Frequency Selective Surface. IEEE Sens. J. 2017, 17, 1285–1292. [Google Scholar] [CrossRef]
  92. Nakazumi, R.; Inoue, M.; Yoshimi, T.; Fuchiyama, S.; Tsuchiya, H. Development of a respiration monitoring sensor for diving worker. In Proceedings of the 56th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE), Kanazawa, Japan, 19–22 September 2017; pp. 206–208. [Google Scholar]
  93. Park, S.W.; Das, P.S.; Chhetry, A.; Park, J.Y. A Flexible Capacitive Pressure Sensor for Wearable Respiration Monitoring System. IEEE Sens. J. 2017, 17, 6558–6564. [Google Scholar] [CrossRef]
  94. Presti, D.L.; Massaroni, C.; Formica, D.; Saccomandi, P.; Giurazza, F.; Caponero, M.A.; Schena, E. Smart Textile Based on 12 Fiber Bragg Gratings Array for Vital Signs Monitoring. IEEE Sens. J. 2017, 17, 6037–6043. [Google Scholar] [CrossRef]
  95. Valipour, A.; Abbasi-Kesbi, R. A heartbeat and respiration rate sensor based on phonocardiogram for healthcare applications. In Proceedings of the Iranian Conference on Electrical Engineering (ICEE), Tehran, Iran, 2–4 May 2017; pp. 45–48. [Google Scholar]
  96. White, N.M.; Ash, J.; Wei, Y.; Akerman, H. A Planar Respiration Sensor Based on a Capaciflector Structure. IEEE Sens. Lett. 2017, 1, 1–4. [Google Scholar] [CrossRef]
  97. Yan, H.; Zhang, L.; Yu, P.; Mao, L. Sensitive and fast humidity sensor based on a redox conducting supramolecular ionic material for respiration monitoring. Anal. Chem. 2017, 89, 996–1001. [Google Scholar] [CrossRef]
  98. Mahbub, I.; Wang, H.; Islam, S.K.; Pullano, S.A.; Fiorillo, A.S. A low power wireless breathing monitoring system using piezoelectric transducer. In Proceedings of the IEEE International Symposium on Medical Measurements and Applications (MeMeA), Benevento, Italy, 15–18 May 2016; pp. 1–5. [Google Scholar]
  99. Mahbub, I.; Pullano, S.A.; Wang, H.; Islam, S.K.; Fiorillo, A.S.; To, G.; Mahfouz, M.R. A low-power wireless piezoelectric sensor-based respiration monitoring system realized in CMOS process. IEEE Sens. J. 2017, 17, 1858–1864. [Google Scholar] [CrossRef]
  100. Chethana, K.; Guru Prasad, A.S.; Omkar, S.N.; Asokan, S. Fiber bragg grating sensor based device for simultaneous measurement of respiratory and cardiac activities. J. Biophotonics 2017, 10, 278–285. [Google Scholar] [CrossRef] [PubMed]
  101. Güder, F.; Ainla, A.; Redston, J.; Mosadegh, B.; Glavan, A.; Martin, T.J.; Whitesides, G.M. Paper-based electrical respiration sensor. Angew. Chem. Int. Ed. 2016, 55, 5727–5732. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  102. Lepine, N.N.; Tajima, T.; Ogasawara, T.; Kasahara, R.; Koizumi, H. Robust respiration rate estimation using adaptive Kalman filtering with textile ECG sensor and accelerometer. In Proceedings of the 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; pp. 3797–3800. [Google Scholar]
  103. Massaroni, C.; Ciocchetti, M.; Di Tomaso, G.; Saccomandi, P.; Caponero, M.A.; Polimadei, A.; Formica, D.; Schena, E. Design and preliminary assessment of a smart textile for respiratory monitoring based on an array of Fiber Bragg Gratings. In Proceedings of the of the 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; pp. 6054–6057. [Google Scholar]
  104. Moradian, S.; Abdolvand, R. MEMS-based passive wireless respiration profile sensor. In Proceedings of the 2016 IEEE Sensors, Orlando, FL, USA, 30 October–3 November 2016; pp. 1–3. [Google Scholar]
  105. Nag, A.; Mukhopadhyay, S.C.; Kosel, J. Flexible carbon nanotube nanocomposite sensor for multiple physiological parameter monitoring. Sens. Actuators A Phys. 2016, 251, 148–155. [Google Scholar] [CrossRef] [Green Version]
  106. Nam, Y.; Reyes, B.A.; Chon, K.H. Estimation of Respiratory Rates Using the Built-in Microphone of a Smartphone or Headset. IEEE J. Biomed. Health Inform. 2016, 20, 1493–1501. [Google Scholar] [CrossRef] [PubMed]
  107. Raji, A.; Kanchana Devi, P.; Golda Jeyaseeli, P.; Balaganesh, N. Respiratory monitoring system for asthma patients based on IoT. In Proceedings of the Online International Conference on Green Engineering and Technologies (IC-GET), Coimbatore, India, 19 November 2016; pp. 1–6. [Google Scholar]
  108. Ramos-Garcia, R.I.; Da Silva, F.; Kondi, Y.; Sazonov, E.; Dunne, L.E. Analysis of a coverstitched stretch sensor for monitoring of breathing. In Proceedings of the 10th International Conference on Sensing Technology (ICST), Nanjing, China, 11–13 November 2016; pp. 1–6. [Google Scholar]
  109. Rotariu, C.; Cristea, C.; Arotaritei, D.; Bozomitu, R.G.; Pasarica, A. Continuous respiratory monitoring device for detection of sleep apnea episodes. In Proceedings of the IEEE 22nd International Symposium for Design and Technology in Electronic Packaging (SIITME), Oradea, Romania, 20–23 October 2016; pp. 106–109. [Google Scholar]
  110. Atalay, O.; Kennon, W.R.; Demirok, E. Weft-Knitted Strain Sensor for Monitoring Respiratory Rate and Its Electro-Mechanical Modeling. IEEE Sens. J. 2015, 15, 110–122. [Google Scholar] [CrossRef]
  111. Ciocchetti, M.; Massaroni, C.; Saccomandi, P.; Caponero, M.A.; Polimadei, A.; Formica, D.; Schena, E. Smart textile based on fiber bragg grating sensors for respiratory monitoring: Design and preliminary trials. Biosensors 2015, 5, 602–615. [Google Scholar] [CrossRef] [Green Version]
  112. Estrada, L.; Torres, A.; Sarlabous, L.; Jané, R. Respiratory signal derived from the smartphone built-in accelerometer during a Respiratory Load Protocol. In Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 6768–6771. [Google Scholar]
  113. Gargiulo, G.D.; Gunawardana, U.; O’Loughlin, A.; Sadozai, M.; Varaki, E.S.; Breen, P.P. A wearable contactless sensor suitable for continuous simultaneous monitoring of respiration and cardiac activity. J. Sens. 2015, 2015, 151859. [Google Scholar] [CrossRef] [Green Version]
  114. Grlica, J.; Martinović, T.; Džapo, H. Capacitive sensor for respiration monitoring. In Proceedings of the IEEE Sensors Applications Symposium (SAS), Zadar, Croatia, 13–15 April 2015; pp. 1–6. [Google Scholar]
  115. Hernandez, J.; McDuff, D.; Picard, R.W. Biowatch: Estimation of heart and breathing rates from wrist motions. In Proceedings of the 9th International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth), Istanbul, Turkey, 20–23 May 2015; pp. 169–176. [Google Scholar]
  116. Jiang, P.; Zhao, S.; Zhu, R. Smart Sensing Strip Using Monolithically Integrated Flexible Flow Sensor for Noninvasively Monitoring Respiratory Flow. Sensors 2015, 15, 31738–31750. [Google Scholar] [CrossRef]
  117. Karlen, W.; Garde, A.; Myers, D.; Scheffer, C.; Ansermino, J.M.; Dumont, G.A. Estimation of Respiratory Rate From Photoplethysmographic Imaging Videos Compared to Pulse Oximetry. IEEE J. Biomed. Health Inform. 2015, 19, 1331–1338. [Google Scholar] [CrossRef]
  118. Kazmi, S.A.; Shah, M.H.; Khan, S.; Khalifa, O.O. Respiratory rate (RR) based analysis of PPG signal for different physiological conditions. In Proceedings of the International Conference on Smart Sensors and Application (ICSSA), Kuala Lumpur, Malaysia, 26–28 May 2015; pp. 166–171. [Google Scholar]
  119. Teichmann, D.; De Matteis, D.; Bartelt, T.; Walter, M.; Leonhardt, S. A Bendable and Wearable Cardiorespiratory Monitoring Device Fusing Two Noncontact Sensor Principles. IEEE J. Biomed. Health Inform. 2015, 19, 784–793. [Google Scholar] [CrossRef] [PubMed]
  120. Wei, C.; Lin, Y.; Chen, T.; Lin, R.; Liu, T. Respiration Detection Chip with Integrated Temperature-Insensitive MEMS Sensors and CMOS Signal Processing Circuits. IEEE Trans. Biomed. Circuits Syst. 2015, 9, 105–112. [Google Scholar] [CrossRef] [PubMed]
  121. Bifulco, P.; Gargiulo, G.D.; d’Angelo, G.; Liccardo, A.; Romano, M.; Clemente, F.; Cesarelli, M.; et al. Monitoring of respiration, seismocardiogram and heart sounds by a PVDF piezo film sensor. Measurement 2014, 11, 786–789. [Google Scholar]
  122. Fekr, A.R.; Radecka, K.; Zilic, Z. Tidal volume variability and respiration rate estimation using a wearable accelerometer sensor. In Proceedings of the 4th International Conference on Wireless Mobile Communication and Healthcare—Transforming Healthcare Through Innovations in Mobile and Wireless Technologies (MOBIHEALTH), Athens, Greece, 3–5 November 2014; pp. 1–6. [Google Scholar]
  123. Agcayazi, T.; Yokus, M.A.; Gordon, M.; Ghosh, T.; Bozkurt, A. A stitched textile-based capacitive respiration sensor. In Proceedings of the 2017 IEEE Sensors, Glasgow, UK, 29 October–1 November 2017; pp. 1–3. [Google Scholar]
  124. Hesse, M.; Christ, P.; Hörmann, T.; Rückert, U. A respiration sensor for a chest-strap based wireless body sensor. In Proceedings of the 2014 IEEE Sensors, Valencia, Spain, 2–5 November 2014; pp. 490–493. [Google Scholar]
  125. Krehel, M.; Schmid, M.; Rossi, R.; Boesel, L.; Bona, G.-L.; Scherer, L. An Optical Fibre-Based Sensor for Respiratory Monitoring. Sensors 2014, 14, 13088–13101. [Google Scholar] [CrossRef] [Green Version]
  126. Min, S.D.; Yun, Y.; Shin, H. Simplified Structural Textile Respiration Sensor Based on Capacitive Pressure Sensing Method. IEEE Sens. J. 2014, 14, 3245–3251. [Google Scholar] [CrossRef]
  127. Petrović, M.D.; Petrovic, J.; Daničić, A.; Vukčević, M.; Bojović, B.; Hadžievski, L.; Allsop, T.; Lloyd, G.; Webb, D.J. Non-invasive respiratory monitoring using long-period fiber grating sensors. Biomed. Opt. Express 2014, 5, 1136–1144. [Google Scholar] [CrossRef] [Green Version]
  128. Sanchez, P.; Zamarreno, C.R.; Hernaez, M.; Matias, I.R.; Arregui, F.J. Exhaled breath optical fiber sensor based on LMRs for respiration monitoring. In Proceedings of the 2014 IEEE Sensors, Valencia, Spain, 2–5 November 2014; pp. 1142–1145. [Google Scholar]
  129. Wo, J.; Wang, H.; Sun, Q.; Shum, P.P.; Liu, D. Noninvasive respiration movement sensor based on distributed Bragg reflector fiber laser with beat frequency interrogation. J. Biomed. Opt. 2014, 19, 17003. [Google Scholar] [CrossRef]
  130. Yang, C.M.; Yang, T.L.; Wu, C.C.; Hung, S.H.; Liao, M.H.; Su, M.J.; Hsieh, H.C. Textile-based capacitive sensor for a wireless wearable breath monitoring system. In Proceedings of the IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 11–14 January 2014; pp. 232–233. [Google Scholar]
  131. Yoon, J.-W.; Noh, Y.-S.; Kwon, Y.-S.; Kim, W.-K.; Yoon, H.-R. Improvement of dynamic respiration monitoring through sensor fusion of accelerometer and gyro-sensor. J. Electr. Eng. Technol. 2014, 9, 334–343. [Google Scholar] [CrossRef] [Green Version]
  132. Chan, A.M.; Selvaraj, N.; Ferdosi, N.; Narasimhan, R. Wireless patch sensor for remote monitoring of heart rate, respiration, activity, and falls. In Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; pp. 6115–6118. [Google Scholar]
  133. Huang, Y.; Huang, K. Monitoring of breathing rate by a piezofilm sensor using pyroelectric effect. In Proceedings of the 1st International Conference on Orange Technologies (ICOT), Tainan, Taiwan, 12–16 March 2013; pp. 99–102. [Google Scholar]
  134. Aileni, R.M.; Pasca, S.; Strungaru, R.; Valderrama, C. Biomedical signal acquisition for respiration monitoring by flexible analog wearable sensors. In Proceedings of the 2017 E-Health and Bioengineering Conference (EHB), Sinaia, Romania, 22–24 June 2017; pp. 81–84. [Google Scholar]
  135. Padasdao, B.; Shahhaidar, E.; Stickley, C.; Boric-Lubecke, O. Electromagnetic Biosensing of Respiratory Rate. IEEE Sens. J. 2013, 13, 4204–4211. [Google Scholar] [CrossRef]
  136. Chiu, Y.-Y.; Lin, W.-Y.; Wang, H.-Y.; Huang, S.-B.; Wu, M.-H. Development of a piezoelectric polyvinylidene fluoride (PVDF) polymer-based sensor patch for simultaneous heartbeat and respiration monitoring. Sens. Actuators A Phys. 2013, 189, 328–334. [Google Scholar] [CrossRef]
  137. Favero, F.C.; Pruneri, V.; Villatoro, J. Microstructured optical fiber interferometric breathing sensor. J. Biomed. Opt. 2012, 17, 37006. [Google Scholar] [CrossRef] [PubMed]
  138. Mathew, J.; Semenova, Y.; Farrell, G. A miniature optical breathing sensor. Biomed. Opt. Express 2012, 3, 3325–3331. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  139. Scully, C.G.; Lee, J.; Meyer, J.; Gorbach, A.M.; Granquist-Fraser, D.; Mendelson, Y.; Chon, K.H. Physiological Parameter Monitoring from Optical Recordings With a Mobile Phone. IEEE Trans. Biomed. Eng. 2012, 59, 303–306. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  140. Trobec, R.; Rashkovska, A.; Avbelj, V. Two proximal skin electrodes—A body sensor for respiration rate. Sensors 2012, 12, 13813–13828. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  141. Witt, J.; Narbonneau, F.; Schukar, M.; Krebber, K.; De Jonckheere, J.; Jeanne, M.; Kinet, D.; Paquet, B.; Depre, A.; D’Angelo, L.T.; et al. Medical Textiles with Embedded Fiber Optic Sensors for Monitoring of Respiratory Movement. IEEE Sens. J. 2012, 12, 246–254. [Google Scholar] [CrossRef]
  142. Zięba, J.; Frydrysiak, M.; Błaszczyk, J. Textronic clothing with resistance textile sensor to monitoring frequency of human breathing. In Proceedings of the IEEE International Symposium on Medical Measurements and Applications Proceedings, Budapest, Hungary, 18–19 May 2012; pp. 1–6. [Google Scholar]
  143. Carlos, R.; Coyle, S.; Corcoran, B.; Diamond, D.; Tomas, W.; Aaron, M.; Stroiescu, F.; Daly, K. Web-based sensor streaming wearable for respiratory monitoring applications. In Proceedings of the 2011 IEEE Sensors, Limerick, Ireland, 28–31 October 2011; pp. 901–903. [Google Scholar]
  144. Ciobotariu, R.; Rotariu, C.; Adochiei, F.; Costin, H. Wireless breathing system for long term telemonitoring of respiratory activity. In Proceedings of the 7th International Symposium on Advanced Topics in Electrical (ATEE), Bucharest, Romania, 12–14 May 2011; pp. 1–4. [Google Scholar]
  145. Basra, A.; Mukhopadhayay, B.; Kar, S. Temperature sensor based ultra low cost respiration monitoring system. In Proceedings of the 9th International Conference on Communication Systems and Networks (COMSNETS), Bengaluru, India, 4–8 January 2017; pp. 530–535. [Google Scholar]
  146. Guo, L.; Berglin, L.; Li, Y.J.; Mattila, H.; Mehrjerdi, A.K.; Skrifvars, M. ’Disappearing Sensor’-Textile Based Sensor for Monitoring Breathing. In Proceedings of the International Conference on Control, Automation and Systems Engineering (CASE), Singapore, 30–31 July 2011; pp. 1–4. [Google Scholar]
  147. Liu, S.; Gao, R.X.; Freedson, P.S. Non-invasive respiration and ventilation prediction using a single abdominal sensor belt. In Proceedings of the 2011 IEEE Signal Processing in Medicine and Biology Symposium (SPMB), New York, NY, USA, 10 December 2011; pp. 1–5. [Google Scholar]
  148. Liu, G.-Z.; Guo, Y.-W.; Zhu, Q.-S.; Huang, B.-Y.; Wang, L. Estimation of respiration rate from three-dimensional acceleration data based on body sensor network. Telemed. E-Health 2011, 17, 705–711. [Google Scholar] [CrossRef]
  149. Mann, J.; Rabinovich, R.; Bates, A.; Giavedoni, S.; MacNee, W.; Arvind, D.K. Simultaneous Activity and Respiratory Monitoring Using an Accelerometer. In Proceedings of the 2011 International Conference on Body Sensor Networks, Dallas, TX, USA, 23–25 May 2011; pp. 139–143. [Google Scholar]
  150. Ono, T.; Takegawa, H.; Ageishi, T.; Takashina, M.; Numasaki, H.; Matsumoto, M.; Teshima, T. Respiratory monitoring with an acceleration sensor. Phys. Med. Biol. 2011, 56, 6279–6289. [Google Scholar] [CrossRef]
  151. Silva, A.F.; Carmo, J.; Mendes, P.M.; Correia, J.H. Simultaneous cardiac and respiratory frequency measurement based on a single fiber Bragg grating sensor. Meas. Sci. Technol. 2011, 22, 75801. [Google Scholar] [CrossRef] [Green Version]
  152. Yang, C.M.; Yang, T.; Wu, C.C.; Chu, N.N.Y. A breathing game with capacitive textile sensors. In Proceedings of the IEEE International Games Innovation Conference (IGIC), Orange, CA, USA, 2–3 November 2011; pp. 134–136. [Google Scholar]
  153. Yoo, W.J.; Jang, K.W.; Seo, J.K.; Heo, J.Y.; Moon, J.S.; Jun, J.H.; Park, J.-Y.; Lee, B. Development of optical fiber-based respiration sensor for noninvasive respiratory monitoring. Opt. Rev. 2011, 18, 132–138. [Google Scholar] [CrossRef]
  154. Yoo, W.-J.; Jang, K.-W.; Seo, J.-K.; Heo, J.-Y.; Moon, J.-S.; Park, J.-Y.; Lee, B.-S. Development of respiration sensors using plastic optical fiber for respiratory monitoring inside MRI system. J. Opt. Soc. Korea 2010, 14, 235–239. [Google Scholar] [CrossRef] [Green Version]
  155. Yoo, W.; Jang, K.; Seo, J.; Heo, J.Y.; Moon, J.S.; Lee, B.; Park, J.-Y. Development of Nasal-cavity-and Abdomen-attached Fiber-optic Respiration Sensors. J. Korean Phys. Soc. 2010, 57, 1550–1554. [Google Scholar]
  156. Bhattacharya, R.; Bandyopadhyay, N.; Kalaivani, S. Real time Android app based respiration rate monitor. In Proceedings of the International Conference of Electronics, Communication and Aerospace Technology (ICECA), Coimbatore, India, 20–22 April 2017; Volume 1, pp. 709–712. [Google Scholar]
  157. Ansari, S.; Belle, A.; Najarian, K.; Ward, K. Impedance plethysmography on the arms: Respiration monitoring. In Proceedings of the IEEE International Conference on Bioinformatics and Biomedicine Workshops (BIBMW), Hong Kong, China, 18 December 2010; pp. 471–472. [Google Scholar]
  158. De Jonckheere, J.; Jeanne, M.; Narbonneau, F.; Witt, J.; Paquet, B.; Kinet, D.; Kreber, K.; Logier, R. OFSETH: A breathing motions monitoring system for patients under MRI. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology, Buenos Aires, Argentina, 31 August–4 September 2010; pp. 1016–1019. [Google Scholar]
  159. Mitchell, E.; Coyle, S.; O’Connor, N.E.; Diamond, D.; Ward, T. Breathing Feedback System with Wearable Textile Sensors. In Proceedings of the International Conference on Body Sensor Networks, Singapore, 7–9 June 2010; pp. 56–61. [Google Scholar]
  160. Zhang, Z.; Wu, H.; Wang, W.; Wang, B. A smartphone based respiratory biofeedback system. In Proceedings of the 3rd International Conference on Biomedical Engineering and Informatics, Yantai, China, 16–18 October 2010; Volume 2, pp. 717–720. [Google Scholar]
  161. Kundu, S.K.; Kumagai, S.; Sasaki, M. A wearable capacitive sensor for monitoring human respiratory rate. Jpn. J. Appl. Phys. 2013, 52, 04CL05. [Google Scholar] [CrossRef]
  162. Das, T.; Guha, S.; Banerjee, N.; Basak, P. Development of thermistor based low cost high sensitive respiration rate measurement system using audio software with audio input. In Proceedings of the Third International Conference on Biosignals, Images and Instrumentation (ICBSII), Chennai, India, 16–18 March 2017; pp. 1–3. [Google Scholar]
  163. Al-Wahedi, A.; Al-Shams, M.; Albettar, M.A.; Alawsh, S.; Muqaibel, A. Wireless Monitoring of Respiration and Heart Rates Using Software-Defined-Radio. In Proceedings of the 2019 16th International Multi-Conference on Systems, Signals & Devices (SSD), Istanbul, Turkey, 21–24 March 2019; pp. 529–532. [Google Scholar]
  164. Chen, Y.; Kaneko, M.; Hirose, S.; Chen, W. Real-time Respiration Measurement during Sleep Using a Microwave Sensor. In Proceedings of the 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 3791–3794. [Google Scholar]
  165. Gunaratne, P.; Tamura, H.; Yoshida, C.; Sakurai, K.; Tanno, K.; Takahashi, N.; Nagata, J. A Study on Breathing and Heartbeat Monitoring System during Sleeping Using Multi-Piezoelectric Elements. In Proceedings of the 2019 Moratuwa Engineering Research Conference (MERCon), Moratuwa, Sri Lanka, 3–5 June 2019; pp. 382–387. [Google Scholar]
  166. Guo, S.; Zhao, X.; Matsuo, K.; Liu, J.; Mukai, T. Unconstrained Detection of the Respiratory Motions of Chest and Abdomen in Different Lying Positions Using a Flexible Tactile Sensor Array. IEEE Sens. J. 2019, 19, 10067–10076. [Google Scholar] [CrossRef]
  167. Isono, S.; Nozaki-Taguchi, N.; Hasegawa, M.; Kato, S.; Todoroki, S.; Masuda, S.; Iida, N.; Nishimura, T.; Noto, M.; Sato, Y. Contact-free unconstraint respiratory measurements with load cells under the bed in awake healthy volunteers: Breath-by-breath comparison with pneumotachography. J. Appl. Physiol. 2019, 126, 1432–1441. [Google Scholar] [CrossRef]
  168. Ivanovs, A.; Nikitenko, A.; Di Castro, M.; Torims, T.; Masi, A.; Ferre, M. Multisensor low-cost system for real time human detection and remote respiration monitoring. In Proceedings of the Third IEEE International Conference on Robotic Computing (IRC), Naples, Italy, 25–27 February 2019; pp. 254–257. [Google Scholar]
  169. Joshi, R.; Bierling, B.; Feijs, L.; van Pul, C.; Andriessen, P. Monitoring the respiratory rate of preterm infants using an ultrathin film sensor embedded in the bedding: A comparative feasibility study. Physiol. Meas. 2019, 40, 45003. [Google Scholar] [CrossRef]
  170. Krej, M.; Baran, P.; Dziuda, Ł. Detection of respiratory rate using a classifier of waves in the signal from a FBG-based vital signs sensor. Comput. Methods Programs Biomed. 2019, 177, 31–38. [Google Scholar] [CrossRef]
  171. Lorato, I.; Bakkes, T.; Stuijk, S.; Meftah, M.; De Haan, G. Unobtrusive respiratory flow monitoring using a thermopile array: A feasibility study. Appl. Sci. 2019, 9, 2449. [Google Scholar] [CrossRef] [Green Version]
  172. Massaroni, C.; Lo Presti, D.; Formica, D.; Silvestri, S.; Schena, E. Non-contact monitoring of breathing pattern and respiratory rate via RGB signal measurement. Sensors 2019, 19, 2758. [Google Scholar] [CrossRef] [Green Version]
  173. Park, S.; Choi, H.; Yang, H.C.; Yoon, J.; Shin, H. Force-Sensing-Based Unobtrusive System for Awakening and Respiration Rate Analysis during Sleep. IEEE Sens. J. 2019, 19, 1917–1924. [Google Scholar] [CrossRef]
  174. Walterscheid, I.; Biallawons, O.; Berens, P. Contactless Respiration and Heartbeat Monitoring of Multiple People Using a 2-D Imaging Radar. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 3720–3725. [Google Scholar]
  175. Wang, T.; Zhang, D.; Wang, L.; Zheng, Y.; Gu, T.; Dorizzi, B.; Zhou, X. Contactless respiration monitoring using ultrasound signal with off-the-shelf audio devices. IEEE Internet Things J. 2018, 6, 2959–2973. [Google Scholar] [CrossRef]
  176. Xu, X.; Yu, J.; Chen, Y.; Zhu, Y.; Kong, L.; Li, M. BreathListener: Fine-grained Breathing Monitoring in Driving Environments Utilizing Acoustic Signals. In Proceedings of the 17th Annual International Conference on Mobile Systems, Applications, and Services, Breckenridge, CO, USA, 17–20 June 2019; pp. 54–66. [Google Scholar]
  177. Yang, Y.; Cao, J.; Liu, X.; Liu, X. Multi-Breath: Separate Respiration Monitoring for Multiple Persons with UWB Radar. In Proceedings of the IEEE 43rd Annual Computer Software and Applications Conference (COMPSAC), Milwaukee, WI, USA, 15–19 July 2019; Volume 1, pp. 840–849. [Google Scholar]
  178. Chen, C.; Han, Y.; Chen, Y.; Lai, H.; Zhang, F.; Wang, B.; Liu, K.J.R. TR-BREATH: Time-Reversal Breathing Rate Estimation and Detection. IEEE Trans. Biomed. Eng. 2018, 65, 489–501. [Google Scholar] [CrossRef] [PubMed]
  179. Chen, S.; Wu, N.; Ma, L.; Lin, S.; Yuan, F.; Xu, Z.; Li, W.; Wang, B.; Zhou, J. Noncontact heartbeat and respiration monitoring based on a hollow microstructured self-powered pressure sensor. ACS Appl. Mater. Interfaces 2018, 10, 3660–3667. [Google Scholar] [CrossRef] [PubMed]
  180. Massaroni, C.; Schena, E.; Silvestri, S.; Taffoni, F.; Merone, M. Measurement system based on RGB camera signal for contactless breathing pattern and respiratory rate monitoring. In Proceedings of the IEEE International Symposium on Medical Measurements and Applications (MeMeA), Rome, Italy, 11–13 June 2018; pp. 1–6. [Google Scholar]
  181. Massaroni, C.; Lo Presti, D.; Saccomandi, P.; Caponero, M.A.; D’Amato, R.; Schena, E. Fiber Bragg Grating Probe for Relative Humidity and Respiratory Frequency Estimation: Assessment During Mechanical Ventilation. IEEE Sens. J. 2018, 18, 2125–2130. [Google Scholar] [CrossRef]
  182. Sadek, I.; Seet, E.; Biswas, J.; Abdulrazak, B.; Mokhtari, M. Nonintrusive Vital Signs Monitoring for Sleep Apnea Patients: A Preliminary Study. IEEE Access 2018, 6, 2506–2514. [Google Scholar] [CrossRef]
  183. Azimi, H.; Soleimani Gilakjani, S.; Bouchard, M.; Bennett, S.; Goubran, R.A.; Knoefel, F. Breathing signal combining for respiration rate estimation in smart beds. In Proceedings of the IEEE International Symposium on Medical Measurements and Applications (MeMeA), Rochester, MN, USA, 7–10 May 2017; pp. 303–307. [Google Scholar]
  184. Cho, Y.; Bianchi-Berthouze, N.; Julier, S.J.; Marquardt, N. ThermSense: Smartphone-based breathing sensing platform using noncontact low-cost thermal camera. In Proceedings of the 2017 7th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), San Antonio, TX, USA, 23–26 October 2017; pp. 83–84. [Google Scholar]
  185. Leicht, L.; Vetter, P.; Leonhardt, S.; Teichmann, D. The PhysioBelt: A safety belt integrated sensor system for heart activity and respiration. In Proceedings of the 2017 IEEE International Conference on Vehicular Electronics and Safety (ICVES), Vienna, Austria, 27–28 June 2017; pp. 191–195. [Google Scholar]
  186. Li, K.; Xu, W.; Zhan, N.; Wan, K.; Yu, C.; Yu, C. Non-wearable respiration monitoring based on Mach-Zehnder interferometer. In Proceedings of the Conference on Lasers and Electro-Optics Pacific Rim (CLEO-PR), Singapore, 31 July–4 August 2017; pp. 1–2. [Google Scholar]
  187. Li, M.H.; Yadollahi, A.; Taati, B. Noncontact Vision-Based Cardiopulmonary Monitoring in Different Sleeping Positions. IEEE J. Biomed. Health Inform. 2017, 21, 1367–1375. [Google Scholar] [CrossRef]
  188. Procházka, A.; Charvátová, H.; Vyšata, O.; Kopal, J.; Chambers, J. Breathing Analysis Using Thermal and Depth Imaging Camera Video Records. Sensors 2017, 17, 1408. [Google Scholar] [CrossRef] [PubMed]
  189. Wang, X.; Huang, R.; Mao, S. SonarBeat: Sonar Phase for Breathing Beat Monitoring with Smartphones. In Proceedings of the 26th International Conference on Computer Communication and Networks (ICCCN), Vancouver, BC, Canada, 31 July–3 August 2017; pp. 1–8. [Google Scholar]
  190. Kukkapalli, R.; Banerjee, N.; Robucci, R.; Kostov, Y. Micro-radar wearable respiration monitor. In Proceedings of the 2016 IEEE Sensors, Orlando, FL, USA, 30 October–3 November 2016; pp. 1–3. [Google Scholar]
  191. Procházka, A.; Schätz, M.; Vyšata, O.; Vališ, M. Microsoft Kinect Visual and Depth Sensors for Breathing and Heart Rate Analysis. Sensors 2016, 16, 996. [Google Scholar] [CrossRef] [Green Version]
  192. Tveit, D.M.; Engan, K.; Austvoll, I.; Meinich-Bache, Ø. Motion based detection of respiration rate in infants using video. In Proceedings of the IEEE International Conference on Image Processing (ICIP), Phoenix, AZ, USA, 25–28 September 2016; pp. 1225–1229. [Google Scholar]
  193. Erden, F.; Alkar, A.Z.; Cetin, A.E. Contact-free measurement of respiratory rate using infrared and vibration sensors. Infrared Phys. Technol. 2015, 73, 88–94. [Google Scholar] [CrossRef]
  194. Liu, J.J.; Huang, M.; Xu, W.; Zhang, X.; Stevens, L.; Alshurafa, N.; Sarrafzadeh, M. BreathSens: A Continuous On-Bed Respiratory Monitoring System With Torso Localization Using an Unobtrusive Pressure Sensing Array. IEEE J. Biomed. Health Inform. 2015, 19, 1682–1688. [Google Scholar] [CrossRef]
  195. Pereira, C.B.; Yu, X.; Blazek, V.; Leonhardt, S. Robust remote monitoring of breathing function by using infrared thermography. In Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 4250–4253. [Google Scholar]
  196. Ravichandran, R.; Saba, E.; Chen, K.-Y.; Goel, M.; Gupta, S.; Patel, S.N. WiBreathe: Estimating respiration rate using wireless signals in natural settings in the home. In Proceedings of the IEEE International Conference on Pervasive Computing and Communications (PerCom), St. Louis, MO, USA, 23–27 March 2015; pp. 131–139. [Google Scholar]
  197. Sasaki, E.; Kajiwara, A. Multiple Respiration Monitoring by Stepped-FM UWB Sensor. In Proceedings of the 2015 IEEE International Conference on Computational Intelligence Communication Technology, Ghaziabad, India, 13–14 February 2015; pp. 406–409. [Google Scholar]
  198. Zakrzewski, M.; Vehkaoja, A.; Joutsen, A.S.; Palovuori, K.T.; Vanhala, J.J. Noncontact Respiration Monitoring During Sleep with Microwave Doppler Radar. IEEE Sens. J. 2015, 15, 5683–5693. [Google Scholar] [CrossRef]
  199. Arlotto, P.; Grimaldi, M.; Naeck, R.; Ginoux, J.-M. An ultrasonic contactless sensor for breathing monitoring. Sensors 2014, 14, 15371–15386. [Google Scholar] [CrossRef] [PubMed]
  200. Bernacchia, N.; Scalise, L.; Casacanditella, L.; Ercoli, I.; Marchionni, P.; Tomasini, E.P. Non contact measurement of heart and respiration rates based on KinectTM. In Proceedings of the 2014 IEEE International Symposium on Medical Measurements and Applications (MeMeA), Lisboa, Portugal, 11–12 June 2014; pp. 1–5. [Google Scholar]
  201. Chen, Z.; Lau, D.; Teo, J.T.; Ng, S.H.; Yang, X.; Kei, P.L. Simultaneous measurement of breathing rate and heart rate using a microbend multimode fiber optic sensor. J. Biomed. Opt. 2014, 19, 1–11. [Google Scholar] [CrossRef] [PubMed]
  202. Luis, J.; Roa Romero, L.M.; Galan, J.A.; Naranjo, D.; Estudillo-Valderrama, M.; Barbarov-Rostán, G.; Rubia-Marcos, C. Design and Implementation of a Smart Sensor for Respiratory Rate Monitoring. Sensors 2014, 14, 3019–3032. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  203. Mukai, T.; Matsuo, K.; Kato, Y.; Shimizu, A.; Guo, S. Determination of locations on a tactile sensor suitable for respiration and heartbeat measurement of a person on a bed. In Proceedings of the 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Chicago, IL, USA, 26–30 August 2014; pp. 66–69. [Google Scholar]
  204. Nukaya, S.; Sugie, M.; Kurihara, Y.; Hiroyasu, T.; Watanabe, K.; Tanaka, H. A noninvasive heartbeat, respiration, and body movement monitoring system for neonates. Artif. Life Robot. 2014, 19, 414–419. [Google Scholar] [CrossRef]
  205. Patwari, N.; Brewer, L.; Tate, Q.; Kaltiokallio, O.; Bocca, M. Breathfinding: A Wireless Network That Monitors and Locates Breathing in a Home. IEEE J. Sel. Top. Signal Process. 2014, 8, 30–42. [Google Scholar] [CrossRef] [Green Version]
  206. Patwari, N.; Wilson, J.; Ananthanarayanan, S.; Kasera, S.K.; Westenskow, D.R. Monitoring Breathing via Signal Strength in Wireless Networks. IEEE Trans. Mob. Comput. 2014, 13, 1774–1786. [Google Scholar] [CrossRef] [Green Version]
  207. Taheri, T.; Sant’Anna, A. Non-invasive breathing rate detection using a very low power ultra-wide-band radar. In Proceedings of the IEEE International Conference on Bioinformatics and Biomedicine (BIBM), Belfast, UK, 2–5 November 2014; pp. 78–83. [Google Scholar]
  208. Bartula, M.; Tigges, T.; Muehlsteff, J. Camera-based system for contactless monitoring of respiration. In Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; pp. 2672–2675. [Google Scholar]
  209. Chen, R.; Formenti, F.; Obeid, A.; Hahn, C.E.W.; Farmery, A. A fibre-optic oxygen sensor for monitoring human breathing. Physiol. Meas. 2013, 34, N71–N81. [Google Scholar] [CrossRef]
  210. Dziuda, Ł.; Krej, M.; Skibniewski, F.W. Fiber Bragg Grating Strain Sensor Incorporated to Monitor Patient Vital Signs during MRI. IEEE Sens. J. 2013, 13, 4986–4991. [Google Scholar] [CrossRef]
  211. Klap, T.; Shinar, Z. Using piezoelectric sensor for continuous-contact-free monitoring of heart and respiration rates in real-life hospital settings. In Proceedings of the Computing in Cardiology, Zaragoza, Spain, 22–25 September 2013; pp. 671–674. [Google Scholar]
  212. Šprager, S.; Zazula, D. Detection of heartbeat and respiration from optical interferometric signal by using wavelet transform. Comput. Methods Programs Biomed. 2013, 111, 41–51. [Google Scholar] [CrossRef]
  213. Vinci, G.; Lindner, S.; Barbon, F.; Mann, S.; Hofmann, M.; Duda, A.; Weigel, R.; Koelpin, A. Six-port radar sensor for remote respiration rate and heartbeat vital-sign monitoring. IEEE Trans. Microw. Theory Tech. 2013, 61, 2093–2100. [Google Scholar] [CrossRef]
  214. Yavari, E.; Jou, H.; Lubecke, V.; Boric-Lubecke, O. Doppler radar sensor for occupancy monitoring. In Proceedings of the IEEE Topical Conference on Power Amplifiers for Wireless and Radio Applications, Santa Clara, CA, USA, 20 January 2013; pp. 145–147. [Google Scholar]
  215. Aoki, H.; Miyazaki, M.; Nakamura, H.; Furukawa, R.; Sagawa, R.; Kawasaki, H. Non-contact respiration measurement using structured light 3-D sensor. In Proceedings of the SICE Annual Conference (SICE), Akita, Japan, 20–23 August 2012; pp. 614–618. [Google Scholar]
  216. Boccanfuso, L.; O’Kane, J.M. Remote measurement of breathing rate in real time using a high precision, single-point infrared temperature sensor. In Proceedings of the 4th IEEE RAS EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), Rome, Italy, 24–27 June 2012; pp. 1704–1709. [Google Scholar]
  217. Brüser, C.; Kerekes, A.; Winter, S.; Leonhardt, S. Multi-channel optical sensor-array for measuring ballistocardiograms and respiratory activity in bed. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA, 28 August–1 September 2012; pp. 5042–5045. [Google Scholar]
  218. Chen, Z.; Teo, J.T.; Ng, S.H.; Yang, X. Plastic optical fiber microbend sensor used as breathing sensor. In Proceedings of the 2012 IEEE Sensors, Taipei, Taiwan, 28–31 October 2012; pp. 1–4. [Google Scholar]
  219. Gu, C.; Li, R.; Zhang, H.; Fung, A.Y.C.; Torres, C.; Jiang, S.B.; Li, C. Accurate respiration measurement using DC-coupled continuous-wave radar sensor for motion-adaptive cancer radiotherapy. IEEE Trans. Biomed. Eng. 2012, 59, 3117–3123. [Google Scholar] [PubMed]
  220. Lokavee, S.; Puntheeranurak, T.; Kerdcharoen, T.; Watthanwisuth, N.; Tuantranont, A. Sensor pillow and bed sheet system: Unconstrained monitoring of respiration rate and posture movements during sleep. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (SMC), Seoul, Korea, 14–17 October 2012; pp. 1564–1568. [Google Scholar]
  221. Shimomura, N.; Otsu, M.; Kajiwara, A. Empirical study of remote respiration monitoring sensor using wideband system. In Proceedings of the 6th International Conference on Signal Processing and Communication Systems, Gold Coast, Australia, 12–14 December 2012; pp. 1–5. [Google Scholar]
  222. Xia, J.; Siochi, R.A. A real-time respiratory motion monitoring system using KINECT: Proof of concept. Med. Phys. 2012, 39, 2682–2685. [Google Scholar] [CrossRef] [PubMed]
  223. Lai, J.C.Y.; Xu, Y.; Gunawan, E.; Chua, E.C.; Maskooki, A.; Guan, Y.L.; Low, K.; Soh, C.B.; Poh, C. Wireless Sensing of Human Respiratory Parameters by Low-Power Ultrawideband Impulse Radio Radar. IEEE Trans. Instrum. Meas. 2011, 60, 928–938. [Google Scholar] [CrossRef]
  224. Otsu, M.; Nakamura, R.; Kajiwara, A. Remote respiration monitoring sensor using stepped-FM. In Proceedings of the IEEE Sensors Applications Symposium, San Antonio, TX, USA, 22–24 February 2011; pp. 155–158. [Google Scholar]
  225. Postolache, O.; Girão, P.S.; Postolache, G.; Gabriel, J. Cardio-respiratory and daily activity monitor based on FMCW Doppler radar embedded in a wheelchair. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011; pp. 1917–1920. [Google Scholar]
  226. Zito, D.; Pepe, D.; Mincica, M.; Zito, F.; Tognetti, A.; Lanata, A.; De Rossi, D. SoC CMOS UWB pulse radar sensor for contactless respiratory rate monitoring. IEEE Trans. Biomed. Circuits Syst. 2011, 5, 503–510. [Google Scholar] [CrossRef]
  227. Heise, D.; Skubic, M. Monitoring pulse and respiration with a non-invasive hydraulic bed sensor. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology, Buenos Aires, Argentina, 31 August–4 September 2010; pp. 2119–2123. [Google Scholar]
  228. Min, S.D.; Kim, J.K.; Shin, H.S.; Yun, Y.H.; Lee, C.K.; Lee, M. Noncontact Respiration Rate Measurement System Using an Ultrasonic Proximity Sensor. IEEE Sens. J. 2010, 10, 1732–1739. [Google Scholar] [CrossRef]
  229. Mostov, K.; Liptsen, E.; Boutchko, R. Medical applications of shortwave FM radar: Remote monitoring of cardiac and respiratory motion. Med. Phys. 2010, 37, 1332–1338. [Google Scholar] [CrossRef] [Green Version]
  230. Nishiyama, M.; Miyamoto, M.; Watanabe, K. Respiration rhythm monitoring in sleep based on weight movement using hetero-core fiber optic sensors. In Proceedings of the ICCAS 2010, Gyeonggi-do, Korea, 27–30 October 2010; pp. 205–208. [Google Scholar]
  231. Nishiyama, M.; Miyamoto, M.; Watanabe, K. Respiration and body movement analysis during sleep in bed using hetero-core fiber optic pressure sensors without constraint to human activity. J. Biomed. Opt. 2011, 16, 17002. [Google Scholar] [CrossRef] [Green Version]
  232. Scalise, L.; Marchionni, P.; Ercoli, I. Optical method for measurement of respiration rate. In Proceedings of the IEEE International Workshop on Medical Measurements and Applications, Ottawa, ON, Canada, 30 April–1 May 2010; pp. 19–22. [Google Scholar]
  233. Silvious, J.; Tahmoush, D. UHF measurement of breathing and heartbeat at a distance. In Proceedings of the IEEE Radio and Wireless Symposium (RWS), New Orleans, LA, USA, 10–14 January 2010; pp. 567–570. [Google Scholar]
  234. Tan, K.S.; Saatchi, R.; Elphick, H.; Burke, D. Real-time vision based respiration monitoring system. In Proceedings of the 7th International Symposium on Communication Systems, Networks Digital Signal Processing (CSNDSP 2010), Newcastle upon Tyne, UK, 21–23 July 2010; pp. 770–774. [Google Scholar]
  235. Brady, S.; Dunne, L.E.; Tynan, R.; Diamond, D.; Smyth, B.; O’Hare, G.M.P. Garment-based monitoring of respiration rate using a foam pressure sensor. In Proceedings of the Ninth IEEE International Symposium on Wearable Computers (ISWC’05), Osaka, Japan, 18–21 October 2005; pp. 214–215. [Google Scholar]
  236. Dziuda, L.; Skibniewski, F.; Rozanowski, K.; Krej, M.; Lewandowski, J. Fiber-optic sensor for monitoring respiration and cardiac activity. In Proceedings of the 2011 IEEE Sensors, Limerick, Ireland, 28–31 October 2011; pp. 413–416. [Google Scholar]
  237. Gutierrez Pascual, M.D. Indoor Location Systems Based on Zigbee Networks. Bachelor’s Thesis, Faculty of Computing, Karlskrona, Sweden, 2012. [Google Scholar]
  238. Wei, Y.-H.; Leng, Q.; Han, S.; Mok, A.K.; Zhang, W.; Tomizuka, M. RT-WiFi: Real-time high-speed communication protocol for wireless cyber-physical control applications. In Proceedings of the IEEE 34th Real-Time Systems Symposium, Vancouver, BC, Canada, 3–6 December 2013; pp. 140–149. [Google Scholar]
  239. Bettstetter, C.; Vogel, H.-J.; Eberspacher, J. GSM phase 2+ general packet radio service GPRS: Architecture, protocols, and air interface. IEEE Commun. Surv. 1999, 2, 2–14. [Google Scholar] [CrossRef]
  240. Delnavaz, A.; Voix, J. Electromagnetic micro-power generator for energy harvesting from breathing. In Proceedings of the IECON 38th Annual Conference on IEEE Industrial Electronics Society, Montreal, QC, Canada, 25–28 October 2012; pp. 984–988. [Google Scholar]
  241. Goreke, U.; Habibiabad, S.; Azgin, K.; Beyaz, M.I. A MEMS turbine prototype for respiration harvesting. Proc. J. Phys. Conf. Ser. 2015, 660, 12059. [Google Scholar] [CrossRef] [Green Version]
  242. Shahhaidar, E.; Padasdao, B.; Romine, R.; Stickley, C.; Boric-Lubecke, O. Piezoelectric and electromagnetic respiratory effort energy harvesters. In Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; pp. 3439–3442. [Google Scholar]
  243. Li, K.; He, Q.; Wang, J.; Zhou, Z.; Li, X. Wearable energy harvesters generating electricity from low-frequency human limb movement. Microsyst. Nanoeng. 2018, 4, 1–13. [Google Scholar] [CrossRef]
  244. Wang, J.-J.; Su, H.-J.; Hsu, C.-I.; Su, Y.-C. Composite piezoelectric rubber band for energy harvesting from breathing and limb motion. Proc. J. Phys. Conf. Ser. 2014, 557, 12022. [Google Scholar] [CrossRef] [Green Version]
  245. Sun, C.; Shi, J.; Bayerl, D.J.; Wang, X. PVDF microbelts for harvesting energy from respiration. Energy Environ. Sci. 2011, 4, 4508–4512. [Google Scholar] [CrossRef]
  246. Zhang, Z.; Zhang, J.; Zhang, H.; Wang, H.; Hu, Z.; Xuan, W.; Dong, S.; Luo, J. A Portable Triboelectric Nanogenerator for Real-Time Respiration Monitoring. Nanoscale Res. Lett. 2019, 14, 354. [Google Scholar] [CrossRef] [PubMed]
  247. Vasandani, P.; Gattu, B.; Wu, J.; Mao, Z.-H.; Jia, W.; Sun, M. Triboelectric nanogenerator using microdome-patterned PDMS as a wearable respiratory energy harvester. Adv. Mater. Technol. 2017, 2, 1700014. [Google Scholar] [CrossRef]
  248. Seo, M.-H.; Choi, D.-H.; Han, C.-H.; Yoo, J.-Y.; Yoon, J.-B. An electrostatic energy harvester exploiting variable-area water electrode by respiration. In Proceedings of the 28th IEEE International Conference on Micro Electro Mechanical Systems (MEMS), Estoril, Portugal, 18–22 January 2015; pp. 126–129. [Google Scholar]
  249. Xue, H.; Yang, Q.; Wang, D.; Luo, W.; Wang, W.; Lin, M.; Liang, D.; Luo, Q. A wearable pyroelectric nanogenerator and self-powered breathing sensor. Nano Energy 2017, 38, 147–154. [Google Scholar] [CrossRef]
  250. Fan, F.-R.; Tian, Z.-Q.; Wang, Z.L. Flexible triboelectric generator. Nano Energy 2012, 1, 328–334. [Google Scholar] [CrossRef]
  251. Aljadiri, R.T.; Taha, L.Y.; Ivey, P. Electrostatic energy harvesting systems: A better understanding of their sustainability. J. Clean Energy Technol. 2017, 5, 409–416. [Google Scholar] [CrossRef] [Green Version]
  252. Nozariasbmarz, A.; Collins, H.; Dsouza, K.; Polash, M.H.; Hosseini, M.; Hyland, M.; Liu, J.; Malhotra, A.; Ortiz, F.M.; Mohaddes, F.; et al. Review of wearable thermoelectric energy harvesting: From body temperature to electronic systems. Appl. Energy 2020, 258, 114069. [Google Scholar] [CrossRef]
  253. Enescu, D. Thermoelectric Energy Harvesting: Basic Principles and Applications. In Green Energy Advances; IntechOpen: London, UK, 2019. [Google Scholar]
  254. Vanegas, E.; Igual, R.; Plaza, I. Piezoresistive Breathing Sensing System with 3D Printed Wearable Casing. J. Sens. 2019, 2019. [Google Scholar] [CrossRef] [Green Version]
  255. Doerffel, D.; Sharkh, S.A. A critical review of using the Peukert equation for determining the remaining capacity of lead-acid and lithium-ion batteries. J. Power Sources 2006, 155, 395–400. [Google Scholar] [CrossRef]
  256. Kirchev, A. Battery management and battery diagnostics. In Electrochemical Energy Storage for Renewable Sources and Grid Balancing; Elsevier: Amsterdam, The Netherlands, 2015; pp. 411–435. [Google Scholar]
  257. Lee, S.; Kim, J.; Lee, J.; Cho, B.H. State-of-charge and capacity estimation of lithium-ion battery using a new open-circuit voltage versus state-of-charge. J. Power Sources 2008, 185, 1367–1373. [Google Scholar] [CrossRef]
  258. World Medical Association. World Medical Association Declaration of Helsinki: Ethical principles for medical research involving human subjects. Bull. World Health Organ. 2001, 79, 373–374. [Google Scholar]
  259. Zhu, W.; Zeng, N.; Wang, N. Sensitivity, Specificity, Accuracy, Associated Confidence Interval and ROC Analysis with Practical SAS ® Implementations. Health Care Life Sci. 2010, 19, 67. [Google Scholar]
  260. The R Foundation. The R Project for Statistical Computing. Available online: https://www.r-project.org/ (accessed on 20 March 2020).
  261. Microsoft C#. Available online: https://docs.microsoft.com/es-es/dotnet/csharp/ (accessed on 3 August 2020).
  262. OpenCV Team. OpenCV. Available online: https://opencv.org/about/ (accessed on 20 March 2020).
  263. Microsoft Kinect for Windows. Available online: https://www.microsoft.com/en-us/download/details.aspx?id=40278 (accessed on 20 March 2020).
  264. ADINSTRUMENTS. LabChart Lightning. Available online: https://www.adinstruments.com/products/labchart (accessed on 20 March 2020).
  265. Biopac Systems Inc. Acqknowledge Software. Available online: https://www.biopac.com/product/acqknowledge-software/ (accessed on 20 March 2020).
  266. National Instruments LabWindows/CVI. 2020. Available online: http://sine.ni.com/nips/cds/view/p/lang/es/nid/11104 (accessed on 18 September 2020).
  267. MathWorks Peak Analysis. Available online: https://es.mathworks.com/help/signal/examples/peak-analysis.html (accessed on 1 July 2020).
  268. Mallat, S. A Wavelet Tour of Signal Processing; Academic Press: Boston, MA, USA, 2009; ISBN 978-0-12-374370-1. [Google Scholar]
  269. Jawerth, B.; Sweldens, W. An overview of wavelet based multiresolution analyses. SIAM Rev. 1994, 36, 377–412. [Google Scholar] [CrossRef] [Green Version]
  270. Welch, G.; Bishop, G. An Introduction to the Kalman Filter; Citeseer: University Park, PA, USA, 1995. [Google Scholar]
  271. Chen, A.T.-Y.; Biglari-Abhari, M.; Wang, K.I.-K. Trusting the Computer in Computer Vision: A Privacy-Affirming Framework. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, Honolulu, HI, USA, 21–26 July 2017. [Google Scholar]
  272. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User Acceptance of Information Technology: Toward a Unified View. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef] [Green Version]
  273. Sundaravej, T. Empirical Validation of Unified Theory of Acceptance and Use of Technology Model. J. Glob. Inf. Technol. Manag. 2010, 13, 5–27. [Google Scholar]
  274. Moon, Y.-J.; Hwang, Y.-H.; Cho, S. An Empirical Study of Impacts of User Intention for Smart Wearable Devices and Use Behavior. Adv. Multimed. Ubiquitous Eng. 2016, 354, 357–365. [Google Scholar] [CrossRef]
  275. Wolbring, G.; Leopatra, V. Sensors: Views of Staff of a Disability Service Organization. J. Pers. Med. 2013, 3, 23–39. [Google Scholar] [CrossRef] [Green Version]
  276. Gao, Y.; Li, H.; Luo, Y. An empirical study of wearable technology acceptance in healthcare. Ind. Manag. Data Syst. 2015, 115, 1704–1723. [Google Scholar] [CrossRef]
  277. Alam, M.Z.; Hu, W.; Barua, Z. Using the UTAUT model to determine factors affecting acceptance and use of mobile health (mHealth) services in Bangladesh. J. Stud. Soc. Sci. 2018, 3, 137–172. [Google Scholar]
  278. Lo Presti, D.; Romano, C.; Massaroni, C.; Abbraccio, J.D.; Massari, L.; Caponero, M.A.; Oddo, C.M.; Formica, D.; Schena, E. Cardio-Respiratory Monitoring in Archery Using a Smart Textile Based on Flexible Fiber Bragg Grating Sensors. Sensors 2019, 19, 3581. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  279. Igual, R.; Medrano, C.; Plaza, I.; Orrite, C. Adaptive tracking algorithms to improve the use of computing resources. IET Comput. Vis. 2013, 7, 415–424. [Google Scholar] [CrossRef]
  280. Kim, J.D.; Lee, W.H.; Lee, Y.; Lee, H.J.; Cha, T.; Kim, S.H.; Song, K.-M.; Lim, Y.-H.; Cho, S.H.; Cho, S.H.; et al. Non-contact respiration monitoring using impulse radio ultrawideband radar in neonates. R. Soc. Open Sci. 2019, 6, 190149. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  281. Giavarina, D. Understanding Bland Altman analysis. Biochem. Med. 2015, 25, 141–151. [Google Scholar] [CrossRef] [Green Version]
  282. Barré, A.; Deguilhem, B.; Grolleau, S.; Gérard, M.; Suard, F.; Riu, D. A review on lithium-ion battery ageing mechanisms and estimations for automotive applications. J. Power Sources 2013, 241, 680–689. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Most common application fields of sensing systems to monitor breathing.
Figure 2. Literature search results and selection procedure (top). PRISMA diagram (bottom).
Figure 3. Analysis structure.
Figure 4. Graphical explanation of the different breathing parameters. Signal (A) could come directly from the ADC (analog-to-digital converter) of the sensing system, although it may also represent physical respiration magnitudes. The figure is a general representation that is not tied to any specific sensing system; the same applies to signal (B).
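To make the link between a sampled signal like (A) and the rate parameters concrete, the minimal sketch below (not drawn from any specific reviewed system) estimates the respiratory rate by peak detection on a uniformly sampled chest-movement signal; the sampling rate, prominence threshold, and synthetic test waveform are illustrative assumptions.

```python
"""Minimal sketch: respiratory rate from a sampled respiration signal (cf. Figure 4).
All parameters are illustrative assumptions, not values from the reviewed studies."""
import numpy as np
from scipy.signal import find_peaks


def respiratory_rate_bpm(signal: np.ndarray, fs: float) -> float:
    """Estimate breaths per minute by counting inhalation peaks.

    Peaks are forced to be at least 1 s apart (distance=fs), assuming the
    breathing frequency stays below roughly 1 Hz.
    """
    signal = signal - np.mean(signal)  # remove the DC offset of the ADC output
    peaks, _ = find_peaks(signal, distance=int(fs), prominence=0.3 * np.std(signal))
    duration_min = len(signal) / fs / 60.0
    return len(peaks) / duration_min


# Usage with a synthetic 0.25 Hz (15 breaths/min) waveform sampled at 50 Hz for 60 s
fs = 50.0
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.randn(t.size)
print(f"Estimated rate: {respiratory_rate_bpm(x, fs):.1f} breaths/min")
```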
Figure 5. Most common sensor locations for respiration monitoring. The sensors shown are for contextualization purposes.
Figure 6. Distribution of sensing techniques (left) and sensors (right) used in the studies of the wearable category.
Figure 7. Distribution of sensing techniques (left) and sensors (right) used in the studies of the environmental category.
Figure 8. Number of studies obtaining the different respiratory parameters for the wearable (top) and environmental (bottom) categories.
Figure 9. Distribution of sensor location for the wearable studies.
Figure 10. Distribution of sensor location for the environmental studies.
Figure 11. Representation of possible setups of respiratory sensing systems: (A) data processing performed on a centralized processing platform; (B) data processing performed near the remote sensing unit.
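As a rough illustration of the practical difference between the two setups, the sketch below compares the hourly data volume of streaming raw samples to a centralized platform (setup A) against reporting a single respiratory-rate value per analysis window from the remote unit (setup B); the sampling rate, sample size, and window length are hypothetical figures chosen only for the comparison.

```python
# Hypothetical data-volume comparison for the two setups of Figure 11.
FS_HZ = 50               # assumed sensor sampling rate
BYTES_PER_SAMPLE = 2     # assumed ADC sample size
WINDOW_S = 30            # assumed analysis window for on-node processing
BYTES_PER_REPORT = 4     # assumed size of one respiratory-rate report

raw_stream_per_hour = FS_HZ * BYTES_PER_SAMPLE * 3600          # setup (A): raw samples
edge_reports_per_hour = (3600 // WINDOW_S) * BYTES_PER_REPORT  # setup (B): one value per window

print(f"Centralized processing (A): {raw_stream_per_hour / 1024:.1f} KiB/h")
print(f"Processing near the sensor (B): {edge_reports_per_hour} B/h")
```

Under these assumptions, setup (B) transmits roughly three orders of magnitude less data, which is why on-node processing is often preferred when radio transmission dominates the energy budget.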
Figure 12. Representation of possible setups of respiratory sensing systems.
Figure 13. Schemes of energy harvesting using magnetic induction generation: (A) DC generator activated by chest movements (figure inspired by Reference [135]), (B) tube with fixed and free magnets moved by airflow (figure inspired by Reference [240]), and (C) turbine moved by airflow (figure inspired by Reference [241]).
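For context (this relation is not reproduced from the cited works), all three schemes exploit electromagnetic induction: the open-circuit voltage available to the harvester follows Faraday's law, i.e., it depends on how quickly the breathing-driven motion of the chest, magnet, or turbine changes the magnetic flux linked by the coil,

$$\varepsilon = -N\,\frac{\mathrm{d}\Phi_B}{\mathrm{d}t},$$

where N is the number of coil turns and Φ_B the magnetic flux through a single turn. Since breathing motion is slow (typically well below 1 Hz), the achievable flux-change rate, and hence the harvested power, is inherently limited.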
Figure 14. Piezoelectric energy harvesters. Three possible configurations are shown: (A) power generation based on compression or stretching movements associated with breathing (figure inspired by Reference [244]), (B) energy harvesting based on vibration amplified by a magnet (figure inspired by Reference [243]), and (C) technique using low speed airflow (figure inspired by Reference [245]).
Figure 15. Setups for triboelectric energy harvesting. Three possible configurations are shown: (A) flat belt-attached setup (figure inspired by Reference [246]), (B) Z-shaped connector (figure inspired by Reference [77]), and (C) movable and fixed supports (figure inspired by Reference [247]).
Figure 16. Electrostatic energy harvesting based on the variation of the area of the upper electrode owing to humidity of the exhaled air (figure inspired by Reference [248]).
Figure 17. Schematic of a pyroelectric energy harvester using a mask-mounted breathing prototype (figure inspired by Reference [253]).
Figure 18. Example of a solar-powered system composed of a solar module, a charge regulator, and a microcontroller. The charge regulator receives an input voltage from the solar cell in the range of 0.3 V to 6 V and manages the charge of the battery (at constant voltage and current). The battery is connected in parallel to the internal voltage regulator of the system's microcontroller.
Figure 19. Charge regulator and battery (low capacity, 150 mAh) integrated into the sensing prototype developed by Vanegas et al. [254], slightly modified. The sensor used in that prototype (a force-sensitive resistor) is included separately for size comparison. Units: cm.
Figure 20. Number of studies adopting wired or wireless data transmission in respiration sensing systems.
Figure 21. Number of respiratory monitoring studies that considered different types of communication technologies.
Figure 22. Number of studies adopting the different processing units.
Figure 23. Distribution of battery lives reported in the respiratory monitoring studies.
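As a rough orientation for the battery lives shown in Figure 23, the autonomy of a sensing node in hours is approximately the cell capacity in milliamp-hours divided by the mean current draw in milliamps. The sketch below illustrates this relation; the capacity and current figures are illustrative assumptions, not values taken from any reviewed study.

```python
# Rough battery-life estimate for a wearable respiration node.
# All figures below are hypothetical, chosen only to illustrate the calculation.
battery_capacity_mah = 150           # small cell, similar in size to low-capacity prototypes
avg_current_ma = {
    "sampling only": 2.0,            # assumed duty-cycled sensor + microcontroller
    "BLE streaming": 8.0,            # assumed low-power radio streaming
    "Wi-Fi streaming": 45.0,         # assumed, Wi-Fi is typically the costliest option
}

for mode, current_ma in avg_current_ma.items():
    hours = battery_capacity_mah / current_ma   # life [h] = capacity [mAh] / mean current [mA]
    print(f"{mode:>15}: {hours:6.1f} h ({hours / 24:.1f} days)")
```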
Figure 24. Common positions/activities to validate the breathing sensors (sitting, standing, lying down, walking, running, and sleeping). Chest sensor used as an example.
Figure 25. Representation of different validation approaches: (A) use of artificial validation prototypes, (B) validation using a metronome, and (C) validation using a reference device.
Figure 26. Flow diagram of a typical validation procedure using artificial prototypes.
Figure 27. Flow chart for the validation of a respiration sensor using the methods “metronome as reference” and “validation against a reference device”.
Figure 28. Number of studies that adopted the different validation approaches.
Figure 29. Peak detection of a sample respiration signal obtained from the public breathing dataset published in Reference [254]. (A) Peak detection of a noisy signal without filtering. (B) Peak detection imposing a restriction of p surrounding samples (accepted peak in green). (C) Example of a peak accepted (left, green) and a peak discarded (right, red) when applying the slope restriction. (D) Example of a peak reaching (green) and not reaching (red) the minimum prominence level PP required to be considered a valid peak. (E) Example of two peaks (red) not fulfilling the minimum horizontal distance restriction TD. (F) Example of a peak (red) not fulfilling the minimum vertical level restriction and two peaks that surpass level TL (green). (G) Example of two peaks discarded (red) for not differing by the imposed tidal volume (TV) level from a detected peak (green).
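To make the restrictions of Figure 29 concrete, the sketch below counts breaths with a peak detector that applies the minimum horizontal distance (TD), prominence (PP), and vertical level (TL) constraints. It is a minimal illustration built on scipy.signal.find_peaks and a synthetic test signal, not the exact algorithm of any reviewed study.

```python
import numpy as np
from scipy.signal import find_peaks

def breaths_per_minute(signal, fs, min_period_s=1.5, prominence=None, level=None):
    """Count breaths by peak detection with restrictions similar to Figure 29.

    fs           : sampling frequency in Hz
    min_period_s : minimum horizontal distance between peaks (TD), in seconds
    prominence   : minimum peak prominence (PP), in signal units
    level        : minimum vertical level (TL); peaks below it are discarded
    """
    peaks, _ = find_peaks(
        signal,
        distance=int(min_period_s * fs),   # TD restriction
        prominence=prominence,             # PP restriction
        height=level,                      # TL restriction
    )
    duration_min = len(signal) / fs / 60.0
    return len(peaks) / duration_min

# Usage with a synthetic breathing-like signal at 0.33 Hz (about 20 bpm):
fs = 50.0
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 0.33 * t) + 0.1 * np.random.randn(t.size)
print(breaths_per_minute(x, fs, prominence=0.5))   # roughly 20 bpm expected
```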
Figure 30. Zero-crossings method exemplified on a real signal obtained from the public breathing dataset of Vanegas et al. [254]. (A) Effect of outliers in the signal on the calculation of the “zero level”. (B) Example of a signal with trends and the result of applying a de-trending process. (C) Example of using different “zero levels” in a signal with trends. (D) Example of a noisy signal in which several zero-crossings are detected when only one of them (green) should have been considered.
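The zero-crossings method of Figure 30 reduces to three steps: remove trends, choose a robust “zero level”, and count rising crossings. The sketch below is a minimal illustration under the assumption that the median is used as the zero level (less sensitive to outliers than the mean, as in panel A); noisy signals such as the one in panel (D) would additionally require low-pass filtering or a minimum spacing between accepted crossings.

```python
import numpy as np
from scipy.signal import detrend

def rr_from_zero_crossings(signal, fs, zero_level=None):
    """Estimate respiratory rate (bpm) from rising zero-crossings (Figure 30)."""
    x = detrend(signal)                    # remove slow trends before thresholding (panel B)
    if zero_level is None:
        zero_level = np.median(x)          # robust "zero level" (panel A)
    above = x > zero_level
    rising = np.flatnonzero(~above[:-1] & above[1:])   # indices of upward crossings
    duration_min = len(x) / fs / 60.0
    return len(rising) / duration_min      # one rising crossing per breathing cycle
```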
Figure 31. Frequency analysis of sample real respiratory signals obtained from a public dataset [254]. (A) Effect of the time window (4 s, 8 s, and 16 s) on the frequency calculation. The true frequency is 0.33 Hz (3 s period) and the sampling frequency is 50 Hz. Results for the 16 s time window (Table A3, 0.3125–0.344 Hz) are closest to the true value. (B) Effect of noise on frequency detection: noisy signal and its spectrum (B.1); filtered signal and its spectrum (B.2). (C) Example of a breathing signal with low-frequency fluctuations. (D) Example of a breathing signal with fluctuations due to movements of the subject, and its spectrum.
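The window-length effect in Figure 31A follows from the spectral resolution of the discrete Fourier transform, which is the inverse of the window duration. The sketch below reproduces the idea on a synthetic 0.33 Hz signal sampled at 50 Hz; it is an illustrative example, not the processing chain of any specific study.

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the dominant spectral frequency (Hz) of a breathing segment."""
    x = signal - np.mean(signal)                 # remove DC so the respiratory peak dominates
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1 / fs)
    return freqs[np.argmax(spectrum)]

fs = 50.0
t = np.arange(0, 16, 1 / fs)
x = np.sin(2 * np.pi * 0.33 * t)                 # 0.33 Hz (about 20 bpm) test signal

for window_s in (4, 8, 16):                      # spectral resolution is 1/window_s Hz
    segment = x[: int(window_s * fs)]
    f = dominant_frequency(segment, fs)
    print(f"{window_s:2d} s window -> {f:.3f} Hz ({f * 60:.1f} bpm)")
```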
Figure 32. Wavelet transform. (A) 2D representation of the continuous wavelet transform (CWT) (right) of an example signal (left) taken from a dataset of real respiration signals [254] (RR of 20 bpm, i.e., 0.33 Hz, and sampling frequency of 50 Hz). (B) Multiresolution analysis (MRA) decomposition process (top). The lower part shows an example of the MRA applied to the signal above (A, left). A six-level decomposition was applied using the ‘Haar’ wavelet. Two detail levels and the sixth approximation level are represented. The spectrum of the approximation coefficients (level 6) was obtained.
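A minimal sketch of the two wavelet analyses in Figure 32, written with the PyWavelets package (an assumption for illustration; several reviewed studies used MATLAB's wavelet tools instead): a continuous wavelet transform with a Morlet wavelet for the time-frequency map, and a six-level Haar decomposition whose approximation coefficients retain the slow respiratory rhythm.

```python
import numpy as np
import pywt   # PyWavelets, used here only as an illustrative tool

fs = 50.0
t = np.arange(0, 30, 1 / fs)
x = np.sin(2 * np.pi * 0.33 * t) + 0.2 * np.random.randn(t.size)   # ~20 bpm test signal

# Continuous wavelet transform (Figure 32A): time-frequency map of the signal.
scales = np.arange(16, 256)
coefs, freqs = pywt.cwt(x, scales, "morl", sampling_period=1 / fs)
ridge = freqs[np.argmax(np.abs(coefs).mean(axis=1))]   # scale with the most energy
print("CWT ridge frequency:", round(float(ridge), 3), "Hz")

# Multiresolution analysis (Figure 32B): six-level decomposition with the Haar wavelet.
coeffs = pywt.wavedec(x, "haar", level=6)
approx_level6 = coeffs[0]    # slow components, where the respiratory rhythm lives
detail_level1 = coeffs[-1]   # fastest details, mostly noise
print("approximation (level 6) length:", approx_level6.size)
```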
Figure 33. Kalman filter algorithm for the fusion of different respiration sensors.
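Figure 33 summarizes sensor fusion with a Kalman filter. As a generic illustration (not the filter of any particular reviewed study), the sketch below fuses two per-window respiratory rate estimates with a scalar filter whose state is the underlying rate; the process and measurement noise values are assumptions.

```python
import numpy as np

def fuse_rr_kalman(rr_sensor_a, rr_sensor_b, r_a=1.0, r_b=2.0, q=0.05):
    """Fuse two per-window RR estimates (bpm) with a scalar Kalman filter.

    The state is the underlying RR, assumed roughly constant between windows
    (process noise q); r_a and r_b are the assumed measurement variances of
    the two sensors.
    """
    x, p = rr_sensor_a[0], 1.0                  # initial state and covariance
    fused = []
    for za, zb in zip(rr_sensor_a, rr_sensor_b):
        p += q                                  # predict: RR nearly constant, add process noise
        for z, r in ((za, r_a), (zb, r_b)):     # sequentially update with each sensor
            k = p / (p + r)                     # Kalman gain
            x += k * (z - x)
            p *= (1 - k)
        fused.append(x)
    return np.array(fused)

# Example: a noisier chest-band estimate and a steadier airflow estimate around 15 bpm.
a = 15 + np.random.randn(20) * 1.5
b = 15 + np.random.randn(20) * 0.8
print(fuse_rr_kalman(a, b, r_a=1.5**2, r_b=0.8**2))
```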
Figure 34. Number of studies using different processing algorithms for the wearable (left) and environmental (right) categories.
Figure 35. Number of studies using the different figures of merit to determine sensor performance for the wearable (top) and environmental (bottom) categories.
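Several of the figures of merit in Figure 35, and reported in Tables 6 and 7, can be computed from paired estimated and reference rates: mean absolute error, RMSE, relative error, correlation, and the Bland-Altman mean of differences (MOD) with its 95% limits of agreement. The sketch below is a minimal illustration; the example values are hypothetical.

```python
import numpy as np

def performance_metrics(rr_estimated, rr_reference):
    """Common figures of merit for respiratory rate estimates (values in bpm unless noted)."""
    est = np.asarray(rr_estimated, dtype=float)
    ref = np.asarray(rr_reference, dtype=float)
    diff = est - ref
    return {
        "MAE (bpm)": np.mean(np.abs(diff)),
        "RMSE (bpm)": np.sqrt(np.mean(diff**2)),
        "relative error (%)": 100 * np.mean(np.abs(diff) / ref),
        "correlation": np.corrcoef(est, ref)[0, 1],
        # Bland-Altman: mean of differences (MOD) and half-width of the 95% limits of agreement
        "BA MOD (bpm)": np.mean(diff),
        "BA LoA (bpm)": 1.96 * np.std(diff, ddof=1),
    }

# Usage: estimates from the sensor under test vs. a reference device (hypothetical values).
estimated = [12.1, 15.4, 18.2, 20.9, 24.5]
reference = [12.0, 15.0, 18.0, 21.0, 25.0]
print(performance_metrics(estimated, reference))
```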
Figure 36. Number of studies using the different processing tools for the wearable (top) and environmental (bottom) categories.
Table 1. Analysis of techniques, sensors, breathing parameters, and sensor locations and sizes for studies of the wearable category.
| Study ¹ | Technique | Sensor | Measured Parameter | Location | Size |
| --- | --- | --- | --- | --- | --- |
| Aitkulov 2019 [57,58] | Chest wall movements | Fiber optic | RR | Chest | - |
| Balasubramaniyam 2019 [59] | Chest wall movements | Resistive | RR | Abdomen (shirt) | - |
| Bricout 2019 [60] | Chest wall movements | Accelerometer | RR | Chest, abdomen | - |
| Chu 2019 [61] | Chest wall movements | Resistive | RR, TV | Chest | - |
| Elfaramawy 2019 [62] | Chest wall/abdomen movements | Accelerometer, gyroscope, magnetometer | RR | Chest, abdomen | 26.67 × 65.53 mm |
| Fajkus 2019 [63] | Respiratory air flow | Fiber optic | RR | Nose (nasal oxygen cannula) | - |
| Hurtado 2019 [64] | Air temperature | Pyroelectric | RR | Nose (below) | 30 × 16 × 20 mm |
| Jayarathna 2019 [65] | Chest wall movements | Resistive | RR | Chest (shirt) | - |
| Kano 2019 [66] | Air humidity | Nanocrystal and nanoparticles | RR | Mouth mask | - |
| Karacocuk 2019 [67] | Chest wall movements | Accelerometer, gyroscope | MV | Chest (front and back) | - |
| Massaroni 2019 [68] | Respiratory air flow (pressure) | Differential pressure | RR | Nose (nostril) | 36 mm diameter (PCB) |
| Massaroni 2019 [69] | Chest wall movements | Resistive | RR | Chest and abdomen (shirt, front and back) | - |
| Nguyen 2019 [70] | Respiratory air flow (vibration) | Differential pressure | RR | Nose (nasal bridge) | - |
| Presti 2019 [71] | Respiratory air flow | Fiber optic | RR | Cervical spine | 90 × 24 × 1 mm |
| Presti 2019 [72] | Chest/abdomen movements | Fiber optic | RR | Chest | - |
| Puranik 2019 [73] | Chest wall movements | Gyroscope | Monitoring of breathing | Chest, abdomen | - |
| Soomro 2019 [74] | Air humidity | Impedance | Monitoring of breathing | Nose (below) | - |
| Xiao 2019 [75] | Air humidity | Resistive | Monitoring of breathing | Mouth mask (2–3 cm from nose) | - |
| Yuasa 2019 [76] | Respiratory sounds, chest wall movements | Microphone, optical | RR | Chest (adhesive gel) | - |
| Zhang 2019 [77] | Chest wall movements | Triboelectric nanogenerator | RR | Abdomen | - |
| Dan 2018 [78] | Chest wall movements | Accelerometer | RR, respiratory phase | Neck (suprasternal notch area) | - |
| Koyama 2018 [79] | Chest wall movements | Fiber optic | RR | Abdomen (cardigan, garment) | - |
| Malik 2018 [80] | Air humidity | Capacitive | Monitoring of breathing | Mouth mask | - |
| Martin 2018 [81] | Respiratory sounds | Microphone | RR | Head (inside ear) | - |
| Pang 2018 [82] | Air humidity | Nanocrystal and nanoparticles | Monitoring of breathing | Mouth mask | - |
Table 2. Analysis of sensing techniques, sensors, breathing parameters, and sensor location and size for studies of the environmental category.
| Study ¹ | Technique | Sensor | Measured Parameter | Location | Size |
| --- | --- | --- | --- | --- | --- |
| Al-Wahedi 2019 [163] | Modulation cardiac activity | Radar | RR | Distance from subject (20–75 cm away) | - |
| Chen 2019 [164] | Modulation cardiac activity | Radar | RR | Mat (below bed) | - |
| Gunaratne 2019 [165] | Chest wall movements | Piezoelectric | RR | Mat | 7 × 7 cm (each sensor) |
| Guo 2019 [166] | Chest wall movements | Capacitive | RR | Mat | - |
| Isono 2019 [167] | Chest wall movements | Piezoelectric | RR | Others (under bed legs) | - |
| Ivanovs 2019 [168] | Chest wall movements, modulation cardiac activity | Camera, radar | Respiration detection | - | - |
| Joshi 2019 [169] | Chest wall movements | Capacitive | RR | Mat (below baby mattress) | 580 × 300 × 0.4 mm |
| Krej 2019 [170] | Chest wall movements | Fiber optic | RR | Mat | - |
| Lorato 2019 [171] | Air temperature | Camera | RR | Distance from subject (side and front, 10–50 cm away) | - |
| Massaroni 2019 [172] | Chest wall movements | Camera | RR | Distance from subject (1.2 m away) | - |
| Park 2019 [173] | Chest wall movements | Piezoelectric | RR | User's mat (chest region) | 40 × 750 × 0.25 mm |
| Walterscheid 2019 [174] | Modulation cardiac activity | Radar | RR | Distance from subject (3.3–4.2 m away) | - |
| Wang 2019 [175] | Modulation cardiac activity | Radar | RR | Distance from subject (50 cm away) | - |
| Xu 2019 [176] | Respiratory sounds | Microphone | RR | Others (instrument panel of vehicle) | - |
| Yang 2019 [177] | Modulation cardiac activity | Radar | RR | Distance from subject (1.5 m height, 0–3 m away) | - |
| Chen 2018 [178] | Modulation cardiac activity | Wi-Fi transmitter and receiver | RR, respiration detection | Nodes | - |
| Chen 2018 [179] | Chest wall movements | Piezoelectric | RR | Mat | 2 × 35 cm |
| Massaroni 2018 [180] | Chest wall movements | Camera | RR, respiratory pattern | Distance from subject (1.2 m away) | - |
| Massaroni 2018 [181] | Chest wall movements | Fiber optic | RR | Others (inside ventilator duct) | 3 cm |
| Sadek 2018 [182] | Chest wall movements | Fiber optic | RR, respiratory pattern | Mat | 20 × 50 cm |
1 Note: The analysis for studies published before 2018 [5,6,7,9,10,19,48,50,51,52,53,54,183,184,185,186,187,188,189,190,191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,207,208,209,210,211,212,213,214,215,216,217,218,219,220,221,222,223,224,225,226,227,228,229,230,231,232,233,234] is included in Appendix A (Table A2).
Table 3. Comparison of the main transmission technologies used in respiratory monitoring systems [237].
| Technology | Operating Bandwidth | Transmission Speed | Power Consumption ¹ | Range (m) | Hardware Complexity ¹ |
| --- | --- | --- | --- | --- | --- |
| Bluetooth | 2.4 GHz | 1 Mbps | - | 1–100 | - |
| Zigbee | 2.4 GHz (valid worldwide) | 250 kbps at the 2.4 GHz band | - | 10–100 | - |
| Wi-Fi | 2.4–5 GHz generally | Up to 1 Gbps | - | 50–100 | - |
| GSM/GPRS | 850–1900 MHz | 120 kbps | - | 100 m–several kilometers | - |
| Radio frequency | 433 MHz | 4 kbps | - | 20–200 | - |
¹ Power consumption and hardware complexity are rated in the original table with qualitative icons.
Table 4. Analysis of transmission technology, processing station, and energy autonomy for studies in the wearable category.
| Study ¹ | Wireless Transmission | Wired Transmission | Processing Station | Battery Capacity | Battery Life (Battery Type) |
| --- | --- | --- | --- | --- | --- |
| Aitkulov 2019 [57,58] | - | Data storage | - | - | - |
| Balasubramaniyam 2019 [59] | Internet connection | - | Cloud storage, PC, smartphone | - | - |
| Bricout 2019 [60] | - | - | - | - | - |
| Chu 2019 [61] | Bluetooth | - | PC | - | - |
| Elfaramawy 2019 [62] | Radio frequency | - | PC | 3.7 V, 100 mAh | 6 h (Li-ion battery) |
| Fajkus 2019 [63] | - | Interrogator, DAQ (data acquisition) | PC | - | - |
| Hurtado 2019 [64] | - | - | - | - | - |
| Jayarathna 2019 [65] | Bluetooth (low energy), SD card | - | PC, smartphone, cloud storage | 600 mAh | 5 days (Li-ion battery) |
| Kano 2019 [66] | Bluetooth | - | Smartphone | 3 V | (cell battery) |
| Karacocuk 2019 [67] | Bluetooth | - | PC, smartphone | - | - |
| Massaroni 2019 [68] | Bluetooth | - | PC | 3.6 V, 650 mAh | 8 h (Li-polymer battery) |
| Massaroni 2019 [69] | Bluetooth | - | - | - | - |
| Nguyen 2019 [70] | - | - | - | - | - |
| Presti 2019 [71] | - | Interrogator | PC | - | - |
| Presti 2019 [72] | - | Interrogator | PC | - | - |
| Puranik 2019 [73] | Wi-Fi | - | - | 3.7 V, 1020 mAh | (Li-ion battery) |
| Soomro 2019 [74] | - | USB | PC, smartphone | - | - |
| Xiao 2019 [75] | - | - | PC | - | - |
| Yuasa 2019 [76] | - | USB | Smartphone | - | - |
| Zhang 2019 [77] | - | - | Smartphone, PC | - | - |
| Dan 2018 [78] | - | - | - | - | - |
| Koyama 2018 [79] | - | Interrogator, DAQ | PC | - | - |
| Malik 2018 [80] | - | DAQ | - | - | - |
| Martin 2018 [81] | - | - | PC | - | - |
| Pang 2018 [82] | - | - | - | - | - |
Table 5. Analysis of validation experiments for the studies in the wearable and environmental categories.
| Validation Parameter | Category | Number of Studies |
| --- | --- | --- |
| Number of subjects | 1 | 34 |
| | 2 to 5 | 40 |
| | 6 to 10 | 30 |
| | 11 to 20 | 22 |
| | >20 | 19 |
| | Not specified | 63 |
| Duration | <5 min | 47 |
| | >5 min | 34 |
| Activities considered | Sitting | 59 |
| | Standing | 25 |
| | Lying down | 66 |
| | Sleeping | 16 |
| | Walking, running, moving | 28 |
| | Motion artifacts | 27 |
Table 6. Analysis of the processing algorithm, performance evaluation, and software for the studies of the wearable category.
| Study ¹ | Algorithm | Performance Evaluation | Performance Value | Analysis Software |
| --- | --- | --- | --- | --- |
| Aitkulov 2019 [57,58] | Frequency analysis | Graphical comparison | - | - |
| Balasubramaniyam 2019 [59] | - | - | - | MATLAB |
| Bricout 2019 [60] | Adaptive reconstruction | Correlation factor | 0.64–0.74 | - |
| Chu 2019 [61] | Peak detection | Bland-Altman analysis, correlation factor | 0.99 (correlation) | MATLAB |
| Elfaramawy 2019 [62] | Peak detection | - | - | MATLAB |
| Fajkus 2019 [63] | Peak detection | Relative error, Bland-Altman analysis | 3.9% (RE) | LabVIEW |
| Hurtado 2019 [64] | Zero-crossing detection | Relative error, Bland-Altman analysis | 0.4 bpm (BA, mean of difference, MOD) | - |
| Jayarathna 2019 [65] | Peak detection | - | - | - |
| Kano 2019 [66] | Peak detection | Correlation coefficient, Bland-Altman analysis | 0.88 (correlation), 0.026 bpm (BA, MOD) | - |
| Karacocuk 2019 [67] | Frequency analysis | Correlation | - | MATLAB, microprocessor |
| Massaroni 2019 [68] | Custom algorithm | Relative error, linear regression, Bland-Altman analysis | 4.03% (RE), 0.91–0.97 (r²), −0.06 (BA, MOD) | MATLAB |
| Massaroni 2019 [69] | Peak detection | Bland-Altman analysis | 0.05 bpm (BA, MOD) | - |
| Nguyen 2019 [70] | Frequency analysis | - | - | - |
| Presti 2019 [71] | Peak detection | Percentage error | <4.71% (PE) | MATLAB, LabVIEW |
| Presti 2019 [72] | Peak detection | - | - | - |
| Puranik 2019 [73] | - | - | - | - |
| Soomro 2019 [74] | - | - | - | - |
| Xiao 2019 [75] | - | Graphical comparison | - | - |
| Yuasa 2019 [76] | Peak detection | Accuracy | 61.3–65.6% | MATLAB |
| Zhang 2019 [77] | Frequency analysis | - | - | - |
| Dan 2018 [78] | Zero-crossing detection | Bland-Altman analysis | 0.01–0.02 bpm (BA, MOD) | - |
| Koyama 2018 [79] | Frequency analysis | Absolute error | 4 bpm | Python |
| Malik 2018 [80] | - | Graphical monitoring | - | Python |
| Martin 2018 [81] | Custom algorithm | Mean absolute error, mean relative error, Bland-Altman analysis | 2.7 bpm (MAE), 30.9% (MRE), 2.4 bpm (BA, MOD) | MATLAB |
| Pang 2018 [82] | - | Graphical monitoring | - | - |
Table 7. Analysis of the processing algorithm, performance evaluation, and software for the studies of the environmental category.
| Study ¹ | Algorithm | Performance Evaluation | Performance Value | Analysis Software |
| --- | --- | --- | --- | --- |
| Al-Wahedi 2019 [163] | Frequency analysis | Manual verification, relative error | 4–14% (RE) | LabVIEW |
| Chen 2019 [164] | Zero-crossing detection | Mean squared error | 1.23 bpm | - |
| Gunaratne 2019 [165] | Wavelet transform, fuzzy logic | Relative error | 6.2% | - |
| Guo 2019 [166] | Wavelet transform | Cross-correlation | 0.76–0.85 | - |
| Isono 2019 [167] | Custom algorithm | Linear regression, Bland-Altman analysis | 0.969 (r²), 0.07–0.17 bpm (BA, MOD) | LabChart, MATLAB |
| Ivanovs 2019 [168] | Neural networks | Others | - | - |
| Joshi 2019 [169] | - | Correlation factor, root mean square error, Bland-Altman analysis | 0.74 (correlation), 4.7 bpm (RMSE), −0.36 (BA, MOD) | - |
| Krej 2019 [170] | Machine learning methods | Root mean square error, Bland-Altman analysis | 1.48 bpm (RMSE), 0.16 bpm (BA, MOD) | C#, R |
| Lorato 2019 [171] | Frequency analysis | Root mean square error, Bland-Altman analysis | 1.59 bpm (RMSE) | MATLAB |
| Massaroni 2019 [172] | Custom algorithm | Absolute error, standard error, percentage error, Bland-Altman analysis | 0.39 bpm (AE), 0.02 bpm (SE), 0.07% (PE), −0.01 bpm (BA, MOD) | MATLAB |
| Park 2019 [173] | Frequency analysis | Accuracy, Bland-Altman analysis | 99.4% (Acc) | MATLAB |
| Walterscheid 2019 [174] | Peak detection | Graphical comparison | - | - |
| Wang 2019 [175] | Custom algorithm | Absolute error, relative error | 0.3 bpm (AE), 2% (RE) | MATLAB |
| Xu 2019 [176] | Custom algorithm | Absolute error, correlation factor | 0.11 bpm (AE), 0.95 (correlation) | - |
| Yang 2019 [177] | Custom algorithm | Absolute error | 0.3–0.6 bpm | - |
| Chen 2018 [178] | Custom algorithm | Accuracy | 98.65% | - |
| Chen 2018 [179] | Frequency analysis | Graphical comparison | - | Mobile app |
| Massaroni 2018 [180] | Threshold detection, zero-crossing detection, custom algorithm | Correlation factor, Bland-Altman analysis, percentage error, others | 0.97 (correlation), 0.01 bpm (BA, MOD), 5.5% (PE) | MATLAB |
| Massaroni 2018 [181] | Peak detection | Relative error | 2% | MATLAB |
| Sadek 2018 [182] | Peak detection, custom algorithm | Correlation factor, Bland-Altman analysis, mean absolute error | 0.78 (correlation), 0.38 bpm (MAE) | - |
1 Note: The analysis for studies published before 2018 [5,6,7,9,10,19,48,50,51,52,53,54,183,184,185,186,187,188,189,190,191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,207,208,209,210,211,212,213,214,215,216,217,218,219,220,221,222,223,224,225,226,227,228,229,230,231,232,233,234] is included in Appendix A (Table A5).
