Article

A Sensor-Based Drone for Pollutants Detection in Eco-Friendly Cities: Hardware Design and Data Analysis Application

by Roberto De Fazio 1, Leonardo Matteo Dinoi 1, Massimo De Vittorio 1,2 and Paolo Visconti 1,*

1 Department of Innovation Engineering, University of Salento, 73100 Lecce, Italy
2 Center for Biomolecular Nanotechnologies, Italian Technology Institute IIT, 73010 Arnesano, Italy
* Author to whom correspondence should be addressed.
Electronics 2022, 11(1), 52; https://doi.org/10.3390/electronics11010052
Submission received: 17 November 2021 / Revised: 17 December 2021 / Accepted: 22 December 2021 / Published: 24 December 2021

Abstract

The increase in produced waste is a symptom of inefficient resource usage; waste should instead be better exploited as a source of energy and materials. The air pollution generated by waste affects a large part of the population living in and around the main urban areas. This paper presents a mobile sensor node for monitoring air and noise pollution; the developed system is installed on an RC drone, enabling large areas to be monitored quickly. It relies on a Raspberry Pi Zero W board and a wide set of sensors (i.e., NO2, CO, NH3, CO2, VOCs, PM2.5, and PM10) to sample the environmental parameters at regular time intervals. A proper classification algorithm was developed to quantify the traffic level from the noise level (NL) acquired by the onboard microphone. Additionally, the drone is equipped with a camera and implements a visual recognition algorithm (Fast R-CNN) to detect waste fires and mark their location by a GPS receiver. Furthermore, the firmware for managing the sensing unit operation was developed, as well as the power supply section. In particular, the node’s consumption was analysed in two use cases, and the battery capacity needed to power the designed device was sized. The onfield tests demonstrated the proper operation of the developed monitoring system. Finally, a cloud application was developed to remotely monitor the information acquired by the sensor-based drone and upload it to a remote database.

1. Introduction

Human civilization and globalization are the primary culprits of the constant change in the global environment, mainly through air and water pollution, global warming, ozone depletion, acid rain, natural resource depletion, overpopulation, waste disposal, deforestation, and biodiversity loss. Almost all of these processes result from the unsustainable use of natural resources; waste is a growing environmental, social, and economic problem for all modern economies. The air pollution generated by waste affects a large part of the population living in and around the main urban areas. Landfill management is a problem deeply felt by government authorities, given the enormous environmental impact of the pollutants released as naturally evaporated gases or substances generated by self-combustion or man-made fires, which are potentially dangerous for human health once diffused into the atmosphere.
According to the World Health Organization (WHO), air pollution is the most considerable environmental risk to health in the European Union (EU) [1]. Each year in the EU, it causes about 400,000 premature deaths and hundreds of billions of euros in health-related external costs. People in urban areas are particularly exposed; particulate matter, nitrogen dioxide, and ground-level ozone are the air pollutants responsible for most early deaths. These concepts are summarized in the initial section of the Special Report 23/2018, titled “Air pollution: Our health still insufficiently protected” and published by the European Court of Auditors, which stresses that the pollution problem can no longer be ignored. To address this problem, the European Parliament and the Council of the European Union adopted Directive (EU) 2016/2284 on the reduction of national emissions of certain atmospheric pollutants, amending Directive 2003/35/EC and repealing Directive 2001/81/EC [2]. This act establishes the emission reduction commitments for the member states’ anthropogenic atmospheric emissions of sulfur dioxide (SO2), nitrogen oxides (NOx), non-methane volatile organic compounds (NMVOC), ammonia (NH3), and fine particulate matter (PM2.5). Additionally, the directive requires that national air pollution control programs be drawn up, adopted, and implemented and that pollutant emissions and their impacts be monitored and reported. For air pollution measurements, a sampling height between 3 and 10 m must be considered; at this altitude, the vertical mixing is homogeneous and representative of pollutants transported from neighbouring sources. Several monitoring methods and approaches are available for air pollution, including diffusion tubes [3,4], bubbler samplers [5], gas chromatography (GC) analyzers [6,7], remote optical/long-path analyzers [8,9], photochemical and optical sensor systems [10,11], etc. However, the sampling method must be selected according to several parameters and requirements, including analyte typology, sampling duration and frequency, portability, maintenance needs, costs, etc.
Sensor networks, comprising multiple sensor nodes distributed at strategic points, represent a valid approach for monitoring air quality with high precision in a relatively short time interval [12]. Dam et al. developed a wearable air quality sensor that analyzes personal exposure to pollution [13]. The designed device, called EnviroSensor, is a low-cost, open-source, mobile air-quality monitor that gathers real-time air quality data (ozone—O3, PM, and CO concentrations). A dashboard was designed to display and analyse the air quality data, along with a mobile application for connecting the sensors and enabling real-time data sharing. Additionally, Dhingra et al. presented a three-phase air pollution monitoring system featuring high sensitivity and precision [14]. The proposed IoT device comprises multiple gas sensors, an Arduino board, and a Wi-Fi module to collect environmental parameters and send them to a cloud server, which stores the incoming data, accessible through a custom Android application called IoT-Mobair. Wearable and portable devices are also finding application in the environmental monitoring field. Specifically, Teriús-Padrón et al. worked on a wearable device that acquires the PM concentration and sends the collected data to other devices using Wi-Fi and Bluetooth Low Energy (BLE) connections [15]. The device is placed into a case with three caps to fix it to a belt, pants, bags, or an armband. Moreover, a custom user interface shows the real-time Air Quality Index (AQI) level, calculated from the received PM data using the EPA (Environmental Protection Agency) formula reported in [16]:
$$ I_p = \frac{I_{Hi} - I_{Lo}}{BP_{Hi} - BP_{Lo}} \left( C_p - BP_{Lo} \right) + I_{Lo}, $$
where $I_p$ is the index for the pollutant $p$, $C_p$ its truncated average concentration over 24 h based on 1 h measurements, $BP_{Hi}$ and $BP_{Lo}$ the concentration breakpoints greater than or equal to and lower than or equal to $C_p$, respectively, and $I_{Hi}$ and $I_{Lo}$ the corresponding AQI levels.
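For illustration, the minimal Python sketch below applies this linear interpolation to a truncated 24 h PM2.5 average; the breakpoint table is an illustrative excerpt, and the official values should be taken from the EPA tables referenced in [16].

```python
# Sketch of the EPA AQI linear interpolation; the PM2.5 breakpoints below
# are an illustrative excerpt, not the authoritative EPA table.
PM25_BREAKPOINTS = [
    # (BP_Lo, BP_Hi, I_Lo, I_Hi)
    (0.0, 12.0, 0, 50),
    (12.1, 35.4, 51, 100),
    (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200),
]

def aqi(c_p: float, breakpoints=PM25_BREAKPOINTS) -> int:
    """Return the index I_p for a truncated 24 h average concentration c_p."""
    for bp_lo, bp_hi, i_lo, i_hi in breakpoints:
        if bp_lo <= c_p <= bp_hi:
            return round((i_hi - i_lo) / (bp_hi - bp_lo) * (c_p - bp_lo) + i_lo)
    raise ValueError("concentration outside the tabulated breakpoints")

print(aqi(10.2))  # about 42 for a PM2.5 average of 10.2 ug/m3
```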
Similarly, P. Arroyo et al. developed a portable system for air quality measurements in outdoor environments, detecting the NO2, NO, CO, O3, PM2.5, and PM10 concentrations, along with the temperature, humidity, and location [17]. The device comprises electrochemical gas sensors and optical particulate sensors, as well as a GSM module to transmit the acquired data, using the MQTT (Message Queuing Telemetry Transport) communication protocol, to a cloud platform for storage and processing. In [18], a novel Wireless Sensor Network (WSN) was introduced to detect pollutants, such as CO, NO2, PM10, PM1, PM2.5, and O3, produced by urban transport and domestic heating systems. The proposed WSN, based on sensor nodes called AIRBOX, has been installed in several hotspots and on public buses [18]. In [19], the authors described an air quality monitoring system for urban scenarios applied to citizens’ garments and their bikes. In particular, the sensor node, equipped with CO, SO2, and NO2 electrochemical gas sensors, was based on a 16-bit MSP430BT5190 microcontroller and a CC256x Bluetooth chip. The tests demonstrated good accuracy in temperature, humidity, and pressure measurements, within 1 °C, 2% RH, and 2 hPa, respectively; for the electrochemical sensors, a 0.6 ppm accuracy was obtained [19].
F. Tsow et al. developed a portable/wearable volatile organic toxicant sensor equipped with Bluetooth connectivity [20]. It is based on an innovative tuning fork sensor for detecting the VOC concentration. A dedicated oscillator drives each fork sensor; the resulting oscillations are integrated over a set number of cycles and compared with a high-frequency clock generated by a crystal oscillator. The sensor outputs are digitized and transmitted to a host device such as a smartphone and/or laptop.
Several scientific works in the literature employ a drone as the main element for acquiring air quality data. For instance, in [21], the authors introduced a novel drone for detecting air quality parameters in a given location, constructing a 3D map of the air quality measurements. Furthermore, in [22], the authors developed an Environmental Drone (E-drone) to gather information about the concentrations of air pollutants (i.e., CO, CO2, SO2, NH3, PM, O3, and NO2) in a specific place. The device implements onboard pollution abatement solutions; the drone comprises a 500 mL tank containing a solution for decreasing the NO2 level, dispersed when an excessive NO2 concentration is detected. Custom software gathers the data from multiple E-drones, drawing an Air Quality Health Index (AQHI) map for environmental analysis purposes. Similarly, Q. Gu et al. described an unmanned aerial vehicle (UAV) for air monitoring applications to obtain high-resolution and punctual profiling of air pollution [23]. The drone was equipped with low-cost microsensors measuring the air concentrations of particulate matter and NO2. A fusion servlet, implemented on a NanoPi NEO Air board, fuses the data from the PM and NO2 sensors and the flight controller, providing an aggregated data output.
Traffic is a critical issue in densely populated urban areas, requiring careful supervision to avoid zones of high congestion, which result in high pollutant levels with consequent risks to human health [24]. The scientific community has addressed this problem, finding new solutions for monitoring traffic based on environmental parameters. Notably, the noise level (NL) is a good indicator for forecasting the traffic level since it depends on the traffic volume and speed, the vehicle mix, the road surface, and the structure of the surrounding area. By gathering data related to the noise level and correlating them with other environmental parameters, a good estimation of the traffic level can be inferred.
In this paper, a monitoring system based on a low-cost drone is presented, equipped with a series of photochemical and optical sensors for monitoring the primary sources of pollution present in an urban scenario. The monitoring system is based on a Raspberry Pi Zero W board, which acquires and processes the data from the sensors and stores them in the internal memory (SD card), along with alarm flags related to the overcoming of specific threshold values. Furthermore, the sensing unit is equipped with a wide set of sensors to detect the concentrations of dangerous gaseous species and particulates (i.e., NO2, CO, NH3, CO2, VOCs, PM2.5, and PM10), as well as the noise levels [25]. Additionally, the drone comprises an IR camera supported by a visual recognition algorithm to detect fires of hazardous materials and simultaneously capture their location by an onboard GNSS (Global Navigation Satellite System) receiver [26]. The presented device is intended to monitor the pollutant species in urban or suburban environments, along with the traffic level, correlated with the noise level detected by an onboard sound sensing module. Thus, the smart drone can easily monitor the traffic load in restricted city areas, allowing real-time management of moving vehicles on different city streets. Specifically, we employed a simple data fusion algorithm to combine the noise level acquired by the onboard microphone module and the air concentrations of gaseous species strictly correlated to vehicular traffic, such as NO2 and PM2.5. The developed solution offers numerous advantages over fixed monitoring systems available on the market, including portability, low cost, customization, and a wide operating range.
This paper aims to develop a low-cost sensor-based drone for pollutant detection and visual recognition of waste fires, enabling quick supervision of large areas and the identification of pollution sources. Additionally, a consumption analysis is presented to design the sensing unit’s supply section. Furthermore, characterization and testing of the proposed sensor-based drone in different operating conditions were carried out, demonstrating the correct operation of the developed system. Finally, a cloud-based application for remote monitoring of the historical data acquired by the sensor-based drone is introduced; it relies on a local application, enabling the operator to upload the data to the remote database, and a mobile application to monitor the acquired measurements.

2. Materials and Methods

In this section, the architecture of the proposed sensing unit is first introduced, along with its integration with the Phantom 3 drone. Afterwards, the specifications of each component constituting the sensing unit onboard the drone are discussed, along with the threshold values set by the regulations for each pollutant. Lastly, a simple data fusion technique is presented to determine the traffic load in an urban scenario by combining the noise level and pollutant measurements.

2.1. Architecture of the Developed Pollution Monitoring Drone

Figure 1 shows a 3D rendering of the proposed mobile monitoring system, constituted by the Phantom 3 drone (manufactured by DJI, Shenzhen, China) carrying the sensing section that acquires the environmental parameters (1 in Figure 1). This section is placed into a plastic box with air intakes (5 mm diameter), enabling the gas sensors to be exposed to the incoming air forced inside the case by the drone movement (2 in Figure 1). Moreover, an IR camera is placed on the box front section, supported by a recognition algorithm to detect fires of hazardous materials and simultaneously capture their location by an onboard GNSS receiver (3 in Figure 1) [27]. Moreover, the sensing section is equipped with an electret microphone for detecting the noise pollution level in the overflown area. The monitoring system core is the mobile sensor node mounted on the previously chosen drone (4 in Figure 1). A Raspberry Pi Zero board acquires and processes the sensors’ signals and implements a visual recognition algorithm to recognise waste fires.
The performance of every gas sensor type is affected by both the analyte flow rate and direction, causing variations of their properties and thus inducing measurement errors if not compensated [28,29]. In the developed sensor-based drone, the airflow changes due to variations of the drone speed and the turbulence created by the rotation of the drone blades. Two precautions were used to mitigate the possible negative effects on the parameter acquisition. The first is the application of a 40 µm stainless steel mesh layer behind the air inlets placed in front of the plastic case. This mesh acts as a diffusion barrier, reducing the air velocity into the case and thus protecting the stability of the sensing part; it also prevents the entry of foreign bodies into the plastic box. The second is to keep the drone velocity relatively low and stable (<10 km/h) during measurements.
Specifically, the system is equipped with two particulate detection sensors (ZH03A/ZH03B, manufactured by Winsen Electronics Technology, Zhengzhou, China) based on optical technology, able to detect the concentrations of fine dust (PM2.5 and PM10). Since these sensors provide analog outputs, the motherboard must be equipped with two external ADCs (Analog-to-Digital Converters) (model ADS1115, manufactured by Texas Instruments, Dallas, TX, USA) interfaced through an I2C bus. To make the system polyvalent, it was decided to equip it with sensors capable of detecting a wide range of gaseous species. Notably, the system includes a MOx technology sensor (CCS811, manufactured by AMS Technologies, Premstätten, Austria) to detect TVOCs (Total Volatile Organic Compounds) and the CO2 emitted in all applications that require combustion of fossil fuels.
Moreover, the system is equipped with an additional sensor based on MOx technology (MiCS6814, manufactured by SGX Sensortech, Neuchâtel, Switzerland) for detecting nitrogen dioxide, ammonia, and carbon monoxide, gaseous species attributable to activities in an urban or industrial environment. The mobile sensor node is also provided with an audio detection system (model MAX4466, manufactured by Maxim Integrated, San Jose, CA, USA) based on an electret microphone and an adjustable low-power gain stage to detect the level of noise pollution present in the overflown area. Once an area with pollution levels beyond the thresholds defined by current regulations, or the presence of fires, has been identified, the system stores its position using a low-consumption GNSS receiver (model NEO M8N, manufactured by U-Blox, Thalwil, Switzerland). Additionally, the drone is equipped with an IR camera (model OV5647, manufactured by OmniVision Technologies, Santa Clara, CA, USA) to detect waste fires and take photos of areas where an abnormal level of pollutants has been detected, improving detection reliability (Figure S1).
The connections between the Raspberry motherboard and the various components used are depicted in Figure 2. The two ADS1115 ADCs are interfaced with the microcontroller board using the I2C bus, configured with different addresses. The analog signals supplied by the microphone sensor (MAX4466) and the PM10 laser dust sensor (ZH03A) are connected to the analog inputs of the first ADC (ADC1), whereas the second ADC (ADC2) converts the analog signals supplied by the MOx sensor (MiCS 6814) and the PM2.5 sensor (ZH03B). Furthermore, the CCS811 MOx sensor is interfaced with the Raspberry board through the I2C bus, whereas the GNSS receiver (NEO M8) uses the UART (Universal Asynchronous Receiver Transmitter) interface, sending NMEA packets containing the drone position. The GNSS receiver is used to acquire the coordinates of the locations where the pollutant concentrations are greater than the WHO limits, as detailed in Section 3.1. Finally, the IR camera (OV5647) is connected to the Raspberry board through the Camera Serial Interface Type 2 (CSI-2).
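As a sketch of how the Raspberry Pi could poll the two converters over I2C, the snippet below uses the Adafruit CircuitPython ADS1x15 library; the I2C addresses (0x48/0x49) and the channel-to-sensor mapping are assumptions made for illustration and should be checked against the actual wiring of Figure 2.

```python
# Sketch of reading the two ADS1115 converters over I2C (assumes the
# adafruit-circuitpython-ads1x15, board and busio packages are installed;
# the addresses 0x48/0x49 and the channel mapping are assumptions).
import board
import busio
import adafruit_ads1x15.ads1115 as ADS
from adafruit_ads1x15.analog_in import AnalogIn

i2c = busio.I2C(board.SCL, board.SDA)

adc1 = ADS.ADS1115(i2c, address=0x48)   # microphone + one dust sensor
adc2 = ADS.ADS1115(i2c, address=0x49)   # MiCS6814 + the other dust sensor

mic_ch = AnalogIn(adc1, ADS.P0)
pm10_ch = AnalogIn(adc1, ADS.P1)
no2_ch = AnalogIn(adc2, ADS.P0)
pm25_ch = AnalogIn(adc2, ADS.P1)

print("Microphone voltage:", mic_ch.voltage, "V")
print("PM10 channel raw value:", pm10_ch.value)
```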
The developed sensor-based drone can operate in a wide range of environmental conditions, excluding those with strong wind (>35 km/h) and heavy rain, to avoid drone instability. Additionally, water vapour represents the main disturbing factor for the performance of gas sensors, affecting the sensitivity and calibration of the employed gas sensors and thus inducing measurement errors if not compensated [30]. At present, no compensation has been applied to the measurements acquired from the gas sensors; as a future development, a firmware compensation of the acquired gas concentrations based on air humidity measurements will be implemented.
Furthermore, the Phantom 3 drone features a transmission distance of up to 0.5 miles (1 km), depending on the environmental conditions. This distance allows easily reaching inaccessible and dangerous places (e.g., the central area of a landfill) while keeping the operator at a safe distance. As aforementioned, a relatively low sampling height (between 3 and 10 m) has been considered, since a vertical homogeneity of gas species is obtained at these altitudes. Moreover, the presented device is extremely resistant to environmental agents (humidity, temperature, chemical species, etc.) and mechanical stress, both from the point of view of the RC drone and of the sensing section, suitably protected by a proper plastic cover.
Table S1 summarizes the threshold values, provided by the WHO (World Health Organization), of each pollutant component referred to a specific exposition time. The threshold values represent the maximum concentrations above which pollutants are highly harmful to humans and the environment [31].

2.2. Description of Used Devices and Sensors: Technical Features and Functionalities

The core of the sensing unit equipping the drone is the Raspberry Pi Zero W board; it is the smallest board of the Raspberry Pi line, featuring Wi-Fi and Bluetooth capabilities. Specifically, the board includes the Broadcom BCM2835 SoC, based on the ARM11 architecture running at a 1 GHz clock and featuring 512 MB of RAM. Moreover, the board is equipped with 40 GPIO (General Purpose Input-Output) pins and a wide range of interfaces (UART, I2C, USB, CSI), enabling high versatility of use.
Two external ADCs are employed to acquire the analog signals provided by the sensors and modules; the ADS1115 is a high-precision 16-bit ADC with four multiplexed channels [32]. Its four multiplexed inputs are usable as four single-ended or two fully differential inputs, with a programmable gain from 2/3× to 16× to amplify small signals and acquire them with higher resolution (Figure 3). Additionally, the sensing unit is equipped with two laser dust sensors for monitoring the environmental particulate (PM2.5 and PM10) concentrations. The ZH03A/ZH03B laser dust sensors measure the amount of particulate matter in a unit volume of air. In particular, the ZH03B sensor can detect coarse particles with a diameter lower than or equal to 10 μm (PM10), whereas the ZH03A module can detect fine particles with a diameter of 2.5 μm or less (PM2.5) [33]. They feature a 5 V working voltage, a current absorption lower than 120 mA, and a response time (T90) lower than 90 s, making the concentration data available through the UART interface.
Additionally, the system comprises a CCS811 gas sensor to monitor the air concentrations of TVOC and CO2; it is an ultra-low-power digital sensor that integrates a metal oxide (MOx) gas sensor to detect CO2 in the 400–8192 ppm range and Volatile Organic Compounds (VOCs) in the 0–1187 ppb range [34]. The sensor has a small microcontroller to manage the heater power, acquire the sensor voltage, and provide the measurements through an I2C interface; it features a 1.8–3.6 V supply voltage, a 30 mA maximum supply current, and an 80 mW maximum power consumption. Similarly, a MiCS-6814 sensor is included in the developed drone; it is a compact MOS sensor with three metal oxide semiconductor sensing elements (RED sensor, OX sensor, NH3 sensor) able to detect several gas species [35]. The detectable gases are nitrogen dioxide (NO2) in the range of 0.05–10 ppm, carbon monoxide (CO) in the range of 1–1000 ppm, and ammonia (NH3) in the range of 1–500 ppm. The device has a 5 V supply voltage, an 88 mW maximum heater power dissipation, and an 8 mW maximum sensitive-layer power dissipation (Figure 3).
The drone is equipped with a microphone sensor module based on a capsule microphone and a MAX4466-based amplifier, featuring a low operating voltage and excellent power-supply noise rejection. The module has a 2.5–5.5 V supply voltage, a 24 µA supply current, a 600 kHz operating bandwidth, and an adjustable gain. Furthermore, an IR camera module is mounted in front of the plastic case containing the sensing unit; it is based on the OV5647 CMOS (Complementary Metal Oxide Semiconductor) image sensor, providing a 2592 × 1944 video output using Omni BSI (back-illuminated sensor) technology [36]. In addition, the camera is sensitive to infrared radiation, operating similarly to a night vision camera when external light sources such as IR LED illuminators are used. The device is interfaced with the Raspberry Pi board using a short ribbon cable connected to the board processor via the CSI bus, a high-bandwidth link that carries pixel data from the camera to the processor.
Lastly, a NEO M8 GNSS receiver is integrated inside the drone for acquiring the coordinates of zones where anomalous values of detected parameters or a waste fire are detected. The module has 2.5 m horizontal position accuracy, 10 Hz maximum update rate, −166 dBm sensitivity, and 29 s (cold start) time to first fix.
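To give an idea of how the NMEA stream from the NEO M8 could be read on the Raspberry Pi, the sketch below uses the pyserial and pynmea2 packages; the serial port name and baud rate are assumptions and must match the actual UART configuration.

```python
# Sketch of reading and parsing NMEA sentences from the GNSS receiver
# (assumes the pyserial and pynmea2 packages; the port and baud rate are
# assumptions to be adapted to the actual UART configuration).
import serial
import pynmea2

with serial.Serial("/dev/serial0", baudrate=9600, timeout=1) as port:
    while True:
        line = port.readline().decode("ascii", errors="ignore").strip()
        if line.startswith("$GPGGA") or line.startswith("$GNGGA"):
            msg = pynmea2.parse(line)  # GGA sentence: position and altitude
            print("lat:", msg.latitude, "lon:", msg.longitude, "alt:", msg.altitude)
            break
```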
As shown in Figure 1, the sensing section is installed inside a plastic case made of ABS (Acrylonitrile Butadiene Styrene) by 3D printing, with dimensions of 7.8 cm × 6 cm × 6 cm and a weight of 15 g. The case cover includes a flange, enabling its connection to the joint used for mounting the onboard camera (Figure 1). The entire sensing unit weighs 294 g, including the power supply section and the battery. The drone with the sensing section installed has a total weight of 1350 g, allowing for easy transport.
As previously discussed, the proposed work aims to develop a low-cost tool for monitoring environmental concentrations of pollutants; considering this goal, the total cost of the developed sensor-based drone is about EUR 780, where most of the cost is attributable to the drone, whereas the cost of the sensing unit is around EUR 80. Table 1 shows the breakdown of the costs of the various components that make up the developed device. Compared with commercial monitoring systems with similar capabilities [37], the designed device is surely cheaper and more functional, while also ensuring easier portability.

2.3. Data Fusion Technique for Monitoring the Traffic Level from Noise and Pollutants Data

The presented air monitoring system represents a valid solution for the rapid and punctual monitoring of the traffic level in smart city scenarios through a suitable data fusion algorithm. In particular, the acquisition section measures the voltage signal provided by the microphone module over a one-minute interval; the microphone module is set to provide a +15 gain on the signal generated by the microphone. The voltage level is then converted into a corresponding dB level, using an empirical function deduced from a characterization of the used module:
$$ \mathrm{NL_{dB}} = \frac{\bar{V} + 0.2674}{0.03534\ \mathrm{V}}\ \mathrm{dB}, $$
Since the noise measurements are performed at an altitude other than zero, the noise level measurement is compensated using a corrective factor derived from Stokes’ law, assuming a reference attenuation coefficient α = 0.005 dB/m (@ 70% relative humidity, 1000 Hz frequency, 18 °C temperature) and considering the altitude measurement provided by the onboard GNSS module. Thus, the NL at zero altitude is obtained from the value measured by the drone as:
$$ \mathrm{NL_{dB}}(0) = \mathrm{NL_{dB}}(d) + \alpha \cdot d, $$
where $d$ is the measurement altitude.
Furthermore, the collected noise level (NL) is combined with the NO2 and PM2.5 concentrations acquired by the onboard sensors, defining a Traffic Level Index (TLI) as:
$$ \mathrm{TLI} = k \cdot \left[ a \cdot \mathrm{NL_{dB}} + b \cdot \mathrm{PM_{2.5}} + c \cdot \mathrm{NO_2} \right], $$
where $a\ [\mathrm{dB^{-1}}]$, $b\ [(\mathrm{\mu g/m^3})^{-1}]$, and $c\ [\mathrm{ppm^{-1}}]$ are combination coefficients, and $k$ is a normalization parameter. These parameters are set to 0.25 $\mathrm{dB^{-1}}$ (a), 24 $(\mathrm{\mu g/m^3})^{-1}$ (b), 825 $\mathrm{ppm^{-1}}$ (c), and 0.5 (k), respectively.
The following classification rule is employed to discern the traffic condition according to TLI value:
$$ \begin{cases} \mathrm{TLI} \le 60 & \text{LOW TRAFFIC} \\ 60 < \mathrm{TLI} \le 105 & \text{MODERATE TRAFFIC} \\ 105 < \mathrm{TLI} \le 200 & \text{INTENSE TRAFFIC} \\ \mathrm{TLI} > 200 & \text{EXCESSIVE TRAFFIC} \end{cases}, $$
where the threshold values are determined according to the limitations imposed by the European or international regulations on the parameters above (Table S1).
According to the computed TLI, the drone can classify in real time the traffic load into four different conditions (LOW, MODERATE, INTENSE, and EXCESSIVE), combining the noise level and the pollutant measurements and providing a quick indicator that traffic managers can use to balance the traffic load over a wider area.
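To summarize the processing chain of this subsection, the following Python sketch implements the voltage-to-dB conversion, the altitude compensation, and the TLI classification using the coefficients reported above; the function names and the example inputs are illustrative assumptions.

```python
# Sketch of the noise conversion, altitude compensation, and TLI
# classification; the coefficients come from the text, while function names
# and the sample inputs are illustrative assumptions.
ALPHA = 0.005                        # dB/m attenuation coefficient
A, B, C, K = 0.25, 24.0, 825.0, 0.5  # combination/normalization coefficients

def noise_level_db(v_mean: float) -> float:
    """Empirical voltage-to-dB conversion of the microphone output."""
    return (v_mean + 0.2674) / 0.03534

def noise_at_ground(nl_db_at_d: float, d: float) -> float:
    """Compensate the noise level measured at altitude d (in metres)."""
    return nl_db_at_d + ALPHA * d

def traffic_level_index(nl_db: float, pm25: float, no2: float) -> float:
    """TLI combining noise [dB], PM2.5 [ug/m3], and NO2 [ppm]."""
    return K * (A * nl_db + B * pm25 + C * no2)

def classify(tli: float) -> str:
    if tli <= 60:
        return "LOW TRAFFIC"
    if tli <= 105:
        return "MODERATE TRAFFIC"
    if tli <= 200:
        return "INTENSE TRAFFIC"
    return "EXCESSIVE TRAFFIC"

# Hypothetical sample: 1.5 V mean microphone signal acquired at 8 m altitude
nl0 = noise_at_ground(noise_level_db(1.5), d=8.0)
print(classify(traffic_level_index(nl0, pm25=10.2, no2=0.06)))
```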

3. Results

In this section, the sensing unit’s firmware for coordinating the acquisition of environmental and traffic conditions is first introduced; then, the results of onfield tests on the presented sensor-based drone are reported to verify the correct operation of all the system’s functionalities.

3.1. Description of the Sensing Unit’s Firmware for Managing the Data’s Acquisition and Processing

This section describes the flowchart of the firmware implemented on the Raspberry Pi Zero board integrated into the proposed mobile sensor node to manage the acquisition of environmental parameters and detect possible waste fires. The sensing unit’s firmware was developed in Python 3, fully compatible with the used computational platform.
The first step is the declaration and initialization of all the useful variables, among which three arrays are fundamental:
  • THRESHOLD array, containing the threshold values of the monitored pollutants, defined by the European and national regulations, as described in Section 2.1. Additionally, a noise threshold value equal to 85 dB is set, corresponding to the maximum noise value tolerated by the human ear [38].
  • DATA array, containing the values measured by all the sensors present in the drone, compared later with the threshold values.
  • FLAGS array, which, unlike the two previous arrays, is a boolean array, i.e., it can contain only the values 0 and 1. It contains the comparison results between the measured value and threshold; a “1” in the i-th position indicates that the value of the i-th pollutant component is above the threshold, whereas “0” indicates a value below the threshold.
Afterwards, two nested cycles are started; the outer one lasts 60 s and is used to evaluate the traffic level, whereas the inner one has a 5 s period for synchronizing the acquisition and storing of the environmental parameters (Figure 4). In particular, every 5 s, an acquisition cycle is started; the first step is resetting the DATA and FLAGS arrays, since they store the previous measurements (except on the first acquisition). Afterwards, the acquisition step begins by measuring the gas concentrations and the noise level from the eight sensors (ZH03A, ZH03B, CCS811, MiCS 6814, and MAX 4466 sensors). Specifically, each iteration provides for selecting the i-th channel, acquiring the corresponding signal, and storing this information in the i-th element of the DATA array. Then, the measurements (DATA[i]) are individually compared with the corresponding thresholds (THRESHOLD[i]) by a for loop constituted by eight iterations moving over the DATA[] and THRESHOLD[] array elements; if the i-th value is greater than the limit, a “1” is annotated in the boolean FLAGS array.
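The acquisition cycle just described can be sketched in Python as follows; read_sensor() and acquire_gps() are hypothetical placeholders for the actual driver routines, and the threshold values shown are illustrative, not the regulatory limits of Table S1.

```python
# Sketch of the 5 s acquisition cycle and threshold crosscheck; read_sensor()
# and acquire_gps() are hypothetical placeholders for the real driver calls,
# and THRESHOLD holds illustrative values, not the limits of Table S1.
import random
import time

N_SENSORS = 8
THRESHOLD = [25.0, 50.0, 10.0, 4.0, 1000.0, 500.0, 0.2, 85.0]  # illustrative

def read_sensor(i: int) -> float:
    # placeholder: replace with the actual ADC/I2C/UART driver call
    return random.uniform(0.0, THRESHOLD[i] * 1.2)

def acquire_gps():
    # placeholder: replace with the NMEA parsing routine of the GNSS receiver
    return (40.335011, 18.130470)

while True:
    data = [0.0] * N_SENSORS       # reset DATA at each cycle
    flags = [0] * N_SENSORS        # reset FLAGS at each cycle
    for i in range(N_SENSORS):
        data[i] = read_sensor(i)               # acquire the i-th channel
        if data[i] > THRESHOLD[i]:             # compare with the limit
            flags[i] = 1
    if sum(flags) >= 1:                        # at least one species over threshold
        position = acquire_gps()
        # ...store position and timestamp, then run the fire-detection routine
    time.sleep(5)                              # 5 s sampling period
```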
Later, the sum of all the flags is calculated, and if at least one species is over threshold, the drone acquires the GPS coordinates. Then, the system looks for waste fires; in this case, only the data from two sensors are considered to detect and localize possible waste fires by monitoring the combustion gases produced. In general, waste combustion produces the so-called macro-pollutants, such as acid gases, nitrogen oxides, and unburnt gases, and the so-called micro-pollutants, such as heavy metals and organochlorine compounds. Carbon monoxide (CO) is formed in combustion reactions in the absence of sufficient oxygen and represents the primary combustion indicator. Other compounds generated during a fire event are nitrogen oxides, particularly NO2, a toxic, irritating gas with a reddish color and a pungent odour, produced by the combustion of nitrogen-containing materials, such as nitrocellulose and organic nitrates.
Furthermore, in the presence of materials containing nitrogen (wool, silk, acrylic materials, etc.), the formation of ammonia from combustion can be detected. Finally, the so-called “greenhouse gases” cannot fail to be mentioned, in particular carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O). Therefore, the two considered sensors are the CCS811, for monitoring CO2 and TVOC, and the MiCS6814, for monitoring CO, NO2, and NH3 [39]. If at least one of the five values mentioned above is greater than the corresponding threshold value, or in other words, if the sum of the FLAGS array values is greater than or equal to one, the IR camera is activated, followed by the launching of a recognition algorithm (Figure 4).
Specifically, the Fast Region-based Convolutional Network (Fast R-CNN) method is employed for object detection [40]; the method accepts the input picture and a set of object proposals. At first, the method computes multiple convolution and max pooling layers over the input picture, generating a feature map. Later, a fixed-length vector is derived for each object proposal from the feature map. A sequence of fully connected layers processes each vector, feeding two sibling output layers; the first generates a softmax probability estimate over the K object classes plus a background class, while the second produces four real numbers for each of the K object classes, defining a bounding box for each of them. In particular, a Python implementation of Fast R-CNN was integrated into the developed firmware, trained using a set of annotated images containing fires as well as corresponding negatives. Finally, the iteration is terminated by updating the start_time variable with the current instant (start_time = current_time) to restart the described process and repeat the sampling of the polluting quantities every 5 s (Figure 4).
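The authors’ fire-trained Fast R-CNN model is not publicly available; as an approximate stand-in, the sketch below runs the off-the-shelf Faster R-CNN detector from torchvision (a successor of Fast R-CNN), where the fire class index, score threshold, and image path are assumptions that would come from the actual fine-tuning.

```python
# Illustrative detection step using torchvision's off-the-shelf Faster R-CNN
# (a successor of Fast R-CNN); the fire class index, score threshold, and
# image path are assumptions, not the authors' fine-tuned model.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

FIRE_CLASS = 1          # assumed class index after fine-tuning on fire images
SCORE_THRESHOLD = 0.7   # assumed confidence threshold

def detect_fire(image_path: str) -> bool:
    """Return True if at least one confident 'fire' detection is found."""
    img = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        out = model([img])[0]   # dict with 'boxes', 'labels', 'scores'
    hits = (out["labels"] == FIRE_CLASS) & (out["scores"] > SCORE_THRESHOLD)
    return bool(hits.any())

print(detect_fire("ir_frame.jpg"))
```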
Additionally, every 60 s, the algorithm calculates the mean noise level over the considered observation interval and, thus, the TLI value according to Equation (3). According to the obtained TLI, the traffic load is classified into four categories following the classification rule (6), storing the corresponding GPS coordinates and a timestamp. Furthermore, in the conditions of intense (105 < TLI ≤ 200) and excessive (TLI > 200) traffic, the drone acquires two pictures with different levels of zoom (1× and 20×), allowing direct observation of the traffic state.
Once the drone has finished patrolling, the user can extract the SD card mounted on the Raspberry Pi Zero board, download the acquired data as .csv files for the numerical values and JPG files for the captured pictures, and process them to detect any anomalies, followed by subsequent field checks. Additionally, the acquired data were uploaded to a cloud application realized on the IBM Cloud platform, allowing users to remotely monitor the acquired environmental parameters and traffic conditions, as detailed in Section 3.2.
Before each measurement, the drone’s gas sensors were tested by comparing the environmental concentration values of the gases and particulates acquired on the ground with those from calibrated portable instruments. Specifically, the multi-parametric detector RS-9680 (manufactured by RS Pro, Corby, UK) was used to verify the PM2.5, PM10, and TVOC concentrations, whereas the detector KANE101 (manufactured by Kane International Limited Inc., Welwyn Garden City, UK) was used to verify the CO and CO2 levels. Furthermore, the handheld detectors model GAXT-D-DL (manufactured by Frontline Safety Inc., Glasgow, UK) and NH3 Responder (manufactured by CTI Inc., Columbia, MO, USA) were used for detecting the NO2 and NH3 air concentrations, respectively. A good agreement between the measurements obtained with the portable detectors and those obtained from the sensors onboard the drone was observed, with a maximum deviation of ±5%, demonstrating the proper operation of all the integrated sensors.

3.2. Power Consumption Measurements and Power Supply Section Design

In the following section, the consumption characterisation of the sensing section is presented to size the battery capacity, given a fixed energy autonomy; moreover, the power supply section of the proposed mobile sensor node is developed.
Table 2 summarizes the current values absorbed by each component obtained during the laboratory tests, depending on the operative modality (i.e., active mode or sleep/dormancy mode). The obtained values agree with the data reported in the datasheets of the respective components.
After identifying the current consumption of each component, to perform adequate battery sizing, it is necessary to calculate the total current consumed during the drone operation. In particular, two specific cases must be considered; the first one represents the best case, whereas the second is a more realistic use case.
The first case refers to an optimal situation; in particular, during an entire in-flight gas monitoring session, the acquired values never exceeded the threshold values defined in Section 2.1. The overall current consumptions of the sensing section during the active and sleep modes are 487.08 mA ($I_{ACTIVE}$) and 241.04 mA ($I_{SLEEP}$), respectively. Therefore, the weighted average ($\bar{I}_{Patrol\ flight}$) is calculated according to the time duration of the two phases to obtain an average current value during the drone operation; specifically, the acquisition period lasts about 20 ms and the sleep period 4.98 s, as demonstrated by the experimental tests.
$$ \bar{I}_{\mathrm{Patrol\ flight}} = a_1 \times I_{\mathrm{ACTIVE}} + a_2 \times I_{\mathrm{SLEEP}} = \frac{20\ \mathrm{ms}}{5\ \mathrm{s}} \times 487.08\ \mathrm{mA} + \frac{4.98\ \mathrm{s}}{5\ \mathrm{s}} \times 241.04\ \mathrm{mA} = 242.02\ \mathrm{mA}, $$
The second case refers to a more realistic use case. In particular, during a patrol flight lasting 30 min, equivalent to the autonomy of the designated drone, the sensors detected ten times the value of at least one gas concentration above the set threshold. In response, the system promptly activated the GNSS module to acquire the GPS coordinates and took a photo with the onboard camera module. In detail, the current absorbed by the entire system when the GNSS module is active during the tracking phase is 260.56 mA ($I_{GPS}$), while during the picture acquisition it is 370.56 mA ($I_{Photo}$) (Figure 5).
In this second case, the mean current consumption ($\bar{I}_{Patrol\ flight\ camera\ \&\ GPS}$) is calculated knowing the mean current during the drone patrolling ($\bar{I}_{Patrol\ flight}$), the current values during the GPS tracking ($I_{GPS}$) and picture acquisition ($I_{Photo}$), and the corresponding time duration of each phase. In particular, the acquisition of the GPS coordinates lasts 20 s, and the capturing and storing of a picture by the onboard camera requires 1.9 s. The three weights that multiply the current values, called b1, b2, and b3, are obtained by dividing the time to which each consumption refers by the autonomy time declared by the drone manufacturer, i.e., 30 min.
$$ \bar{I}_{\mathrm{Patrol\ flight\ camera\ \&\ GPS}} = b_1 \times \bar{I}_{\mathrm{Patrol\ flight}} + b_2 \times I_{\mathrm{Photo}} + b_3 \times I_{\mathrm{GPS}} = \frac{1581\ \mathrm{s}}{1800\ \mathrm{s}} \times 242.02\ \mathrm{mA} + \frac{19\ \mathrm{s}}{1800\ \mathrm{s}} \times 370.56\ \mathrm{mA} + \frac{200\ \mathrm{s}}{1800\ \mathrm{s}} \times 260.56\ \mathrm{mA} = 245.43\ \mathrm{mA}, $$
In conclusion, the last step is the sizing of the battery to guarantee an autonomy of the mobile sensor node equal to that of the drone (i.e., 30 min). The two quantities required to select and size the battery used to power the mobile sensor node are the nominal voltage and the capacity. The nominal voltage must be at least 5 V, as all the components used are powered either at 5 V or 3.3 V; therefore, a two-cell Li-Po battery was chosen, featuring a 7.4 V nominal voltage. The overall charge required in the worst-case scenario previously discussed is given by:
$$ Q_{\mathrm{Patrol\ flight\ camera\ \&\ GPS}} = \frac{\bar{I}_{\mathrm{Patrol\ flight\ camera\ \&\ GPS}} \times \Delta t}{1 - M} = \frac{245.43\ \mathrm{mA} \times 0.5\ \mathrm{h}}{0.7} = 175.30\ \mathrm{mAh}, $$
where a 30% discharge margin ($M$) on the overall battery capacity has been considered. We can conclude that a two-cell Li-Po battery with a minimum capacity of 175.30 mAh is required to ensure 0.5 h of autonomy for the developed sensing unit for monitoring environmental pollutants. For this purpose, a two-cell Li-Po battery featuring a 380 mAh capacity and a 7.4 V nominal voltage (model VZKT5088, manufactured by YuanHan Co., Jiangsu, China) was selected, as described below (Figure 6).
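As a numerical cross-check of the consumption averaging and battery sizing above, the short script below reproduces the calculations with the currents and durations reported in the text.

```python
# Numerical check of the consumption averaging and battery sizing; the
# currents and durations are those reported in the text.
I_ACTIVE, I_SLEEP = 487.08, 241.04        # mA
I_GPS, I_PHOTO = 260.56, 370.56           # mA

# Best case: weighted average over the 5 s acquisition cycle
i_patrol = (0.020 / 5) * I_ACTIVE + (4.98 / 5) * I_SLEEP

# Realistic case: 10 GPS fixes (20 s each) and 10 photos (1.9 s each) in 30 min
t_gps, t_photo, t_total = 10 * 20, 10 * 1.9, 1800
t_patrol = t_total - t_gps - t_photo
i_mission = (t_patrol * i_patrol + t_photo * I_PHOTO + t_gps * I_GPS) / t_total

# Battery capacity with a 30% discharge margin for 0.5 h of autonomy
capacity_mAh = i_mission * 0.5 / 0.7
print(f"{i_patrol:.2f} mA, {i_mission:.2f} mA, {capacity_mAh:.2f} mAh")
# -> approximately 242 mA, 245 mA, and 175 mAh
```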
The battery is connected to two adjustable buck DC-DC converters (model MP2307, manufactured by Monolithic Power Systems Co., Kirkland, WA, USA); the first one scales down the battery voltage from 7.4 V to 5 V to supply the MiCS6814 sensor, the NEO M8N GPS receiver, the ZH03A and ZH03B laser dust sensors, and the Raspberry Pi Zero W board, to which the OV5647 camera is connected via the CSI interface. The second buck converter reduces the 5 V provided by the first converter to the 3.3 V used to feed the CCS811 sensor, the MAX4466 microphone sensor, and the two ADS1115 16-bit ADCs (Figure 6).

3.3. Onfield Tests of Developed Sensor-Based Drone for Monitoring Environmental Parameters and Traffic Conditions

Figure 7 depicts two application scenarios describing how and where the proposed mobile monitoring system can be applied. In particular, the first figure shows the drone that identifies a fire in a suburban area (Figure 7a), whereas the second one shows a fire in a landfill (Figure 7b).
To test the presented sensor-based drone, we carried out different measurement campaigns to evaluate the correct operation of the entire detection system. The measurement campaigns were carried out considering a drone height from the ground between 8 m and 10 m and maintaining a cruising speed below 10 km/h, for the reasons detailed in Section 2.1. To carry out the tests, the monitored area was divided according to a grid with a 20 m pitch; a simple way to generate such a waypoint grid is sketched below. The drone lingered within each quadrant for about 1 min, following a circular trajectory; at the end of this period, the drone was moved to the adjacent quadrant until the entire monitored area was covered. Within the height range from 3 m to 10 m, no significant variation of the acquired parameters with the acquisition altitude was observed, confirming the hypothesis of a homogeneous distribution of the pollutants within the aforementioned height range. Nevertheless, a gradual reduction in the environmental concentration of CO2, NO2, and particulate was observed for acquisitions carried out beyond the indicated range (>10 m), probably because their density is greater than that of air, resulting in the accumulation of these species at low altitude. Furthermore, no significant variations were observed in the sensor measurements with the trajectory described by the drone.
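For illustration, the following sketch generates the 20 m pitch survey grid as a list of latitude/longitude waypoints around a reference point; the flat-Earth degree conversion and the area extent are simplifying assumptions.

```python
# Sketch of generating a waypoint grid with a 20 m pitch over a rectangular
# area anchored at a reference point; the equirectangular approximation and
# the area extent are simplifying assumptions for illustration.
import math

LAT0, LON0 = 40.335011, 18.130470   # reference corner (coordinates from the text)
PITCH_M = 20.0                       # grid pitch in metres
WIDTH_M, HEIGHT_M = 100.0, 100.0     # assumed extent of the monitored area

METERS_PER_DEG_LAT = 111_320.0
meters_per_deg_lon = METERS_PER_DEG_LAT * math.cos(math.radians(LAT0))

waypoints = []
y = 0.0
while y <= HEIGHT_M:
    x = 0.0
    while x <= WIDTH_M:
        waypoints.append((LAT0 + y / METERS_PER_DEG_LAT,
                          LON0 + x / meters_per_deg_lon))
        x += PITCH_M
    y += PITCH_M

print(len(waypoints), "waypoints, first:", waypoints[0])
```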
Specifically, the first measurement campaign was carried out in a suburban area (Ecotekne Campus, Lecce, Italy), away from possible pollutant sources. Figure 8a depicts the data acquired during the measuring campaign for the seven considered gaseous and particulate quantities. The drone flew over the area for 20 min, using a 5 s sampling period, as described in Section 3.1; later, the data acquired by the drone were downloaded and analyzed. As evident, the detected values are all below the set thresholds (Table S1); thus, the drone did not acquire any picture or GPS coordinates.
The second test campaign was carried out near a particularly busy state road (SS694, Lecce, Italy), using the previously described modalities. As evident, a rapid increase in the CO2, PM2.5, and PM10 concentrations occurred when the drone flew near the road. Specifically, the CO2 and PM2.5 concentrations overcame the thresholds set according to the WHO directives, reaching peak values of 882 ppm and 10.23 µg/m3, respectively (Figure 8b). In these conditions, as previously described, the drone acquired the specific position of the place where the thresholds were exceeded (40.335011N, 18.130470E). Since the sum of the CO2, TVOC, NH3, NO2, and CO flags equalled 1, the drone acquired a picture and launched the visual recognition algorithm, which did not detect any fire.
Furthermore, another test campaign was carried out to prove the capability of the developed sensor-based drone to detect waste fires. We prepared a setup constituted by a camping stove placed in an outdoor environment and modified the detection firmware to trigger the position and visual acquisition when only the CO2 concentration exceeds a reduced threshold value of 500 ppm. The tests were performed in the evening to test the detection effectiveness of the IR camera and recognition algorithm. The drone was flown over the stove at a low altitude, and the correct detection of the stove position and the acquisition of the IR picture of the area were verified. Figure 9 depicts the trend of the CO2 concentration acquired by the drone during the third test campaign. As evident, the CO2 concentration overcame the set threshold (500 ppm) when the drone passed over the camping stove, acquiring the IR picture of the observed area.

4. Discussion

This section analyses the experimental results previously reported, highlighting the novelties and potentialities of the developed air and land monitoring system. Later, a discussion on the power consumption of the developed system is presented, leading to the design of the corresponding power supply section.
The previously presented results show the correct operation of the different modules and sensors included in the developed air monitoring system. Specifically, the three test campaigns demonstrated the effectiveness of the implemented firmware, which carries out a crosscheck on the acquired gas species, triggering an alarm routine only when the acquired parameters overcome the corresponding thresholds. In Figure 8b, the fast response of the sensors when the drone flew over the busy road is clear, allowing the identification of the areas with high pollutant levels. The sensing unit stored the GPS coordinates when the CO2 and PM2.5 concentrations overcame the set threshold values (880 ppm and 10 µg/m3). In this condition, the drone acquired a picture and launched the visual recognition algorithm, which did not detect any fire.
The Fast R-CNN algorithm was successfully tested in conjunction with the IR module for detecting waste fires when anomalous environmental parameters are detected. As evident from Figure 9, once the detected CO2 overcomes the set threshold (i.e., 500 ppm), the visual recognition algorithm is launched to detect the presence of a waste fire. When a fire is detected, the system marks its presence by setting a flag and storing the acquired picture on the SD card.
Additionally, a cloud application was developed for uploading the data provided by the drone patrolling. Specifically, it is based on a local application realized with Microsoft Visual Studio®, used by the operator to upload the data to the IBM Cloud platform by exploiting the MQTT client package. The GPS coordinates, timestamps, and environmental data of places with anomalous environmental parameters are uploaded in the form of a .csv file and stored in a remote database (Cloudant NoSQL Database). Figure 10a reports the loading of sample data contained in a .csv file onto the remote database. The places with anomalous values are marked on a map, allowing the operator to check the correct loading of the data.
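As an indication of how such an upload could be scripted, the snippet below publishes the rows of a .csv anomaly file as MQTT messages using the paho-mqtt package (1.x API); the broker address, topic, and payload format are assumptions and do not reflect the authors’ actual IBM Cloud configuration.

```python
# Sketch of publishing anomaly records over MQTT (assumes the paho-mqtt 1.x
# package); the broker, topic, and CSV column names are assumptions.
import csv
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"      # placeholder for the IBM Cloud endpoint
TOPIC = "drone/pollution/anomalies"

client = mqtt.Client()
client.connect(BROKER, 1883, keepalive=60)

with open("anomalies.csv", newline="") as f:
    for row in csv.DictReader(f):  # e.g., lat, lon, timestamp, species, value
        client.publish(TOPIC, json.dumps(row), qos=1)

client.disconnect()
```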
Furthermore, by clicking on the marks, a list view is opened showing the acquired parameters, along with the corresponding timestamp. A mobile application allows users to access the data stored in the remote database and display them on an interactive map (Figure 10b). After login, the user can monitor the places with anomalous parameters, consulting the corresponding environmental parameters and timestamp, allowing a deep understanding of the polluting phenomena in the considered places.

5. Conclusions

This manuscript presents the development of a sensor-based drone for monitoring air and noise pollution. In particular, a Phantom 3 drone is equipped with a Raspberry Pi Zero W board and a wide set of sensors (i.e., NO2, CO, NH3, CO2, VOCs, PM2.5, and PM10) to acquire environmental data. When a detected value exceeds the fixed threshold, the drone stores the GPS coordinates and a timestamp, enabling the tracing of historical pollution data in the considered area. Additionally, a proper classification algorithm was developed to estimate the traffic level according to the noise level acquired by the integrated microphone. Furthermore, an IR camera supported by a Fast R-CNN algorithm allows the recognition of waste fires even in dark conditions. Additionally, power consumption measurements were carried out in two different use cases to suitably design the sensing unit’s supply section. Onfield tests were performed to verify all the functionalities of the developed monitoring system, proving its operation in different operative scenarios. The tests demonstrated the proper operation of the developed sensor-based drone in all the considered scenarios, making it a useful tool for performing environmental monitoring in a non-invasive and economical way. Finally, a cloud application was developed for uploading and monitoring the data regarding the places and times where excessive levels of pollutants occurred. In this way, a deeper understanding of pollutant trends can be obtained, enabling the implementation of countermeasures to reduce pollution levels.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/electronics11010052/s1, Figure S1: Block diagram of the electronic section for the environmental pollutants monitoring; Table S1: Summarizing table with threshold values of the main polluting species, provided by the WHO (World Health Organization) [28].

Author Contributions

Conceptualization, R.D.F. and P.V.; methodology, R.D.F. and L.M.D.; software, L.M.D. and R.D.F.; validation, M.D.V.; data curation, R.D.F. and P.V.; writing—original draft preparation, R.D.F. and L.M.D.; writing—review and editing, P.V. and R.D.F.; supervision, M.D.V. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data of our study are available upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Air Pollution: Our Health Still Insufficiently Protected. Available online: https://www.eca.europa.eu/Lists/ECADocuments/SR18_23/SR_AIR_QUALITY_EN.pdf (accessed on 9 April 2021).
  2. Directive (EU) 2016/2284 of the European Parliament and of the Council. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016L2284&from=ITA (accessed on 9 April 2021).
  3. Cape, J.N. The Use of Passive Diffusion Tubes for Measuring Concentrations of Nitrogen Dioxide in Air. Crit. Rev. Anal. Chem. 2009, 39, 289–310. [Google Scholar] [CrossRef] [Green Version]
  4. Rowell, A.; Terry, M.E.; Deary, M.E. Comparison of diffusion tube–measured nitrogen dioxide concentrations at child and adult breathing heights: Who are we monitoring for? Air Qual. Atmos. Health 2021, 14, 27–36. [Google Scholar] [CrossRef]
  5. Filho, J.P.; Costa, M.A.M.; Cardoso, A.A. A Micro-impinger Sampling Device for Determination of Atmospheric Nitrogen Dioxide. Aerosol Air Qual. Res. 2019, 19, 2597–2603. [Google Scholar] [CrossRef]
  6. Soo, J.-C.; Lee, E.G.; LeBouf, R.F.; Kashon, M.L.; Chisholm, W.; Harper, M. Evaluation of a portable gas chromatograph with photoionization detector under variations of VOC concentration, temperature, and relative humidity. J. Occup. Environ. Hyg. 2018, 15, 351–360. [Google Scholar] [CrossRef]
  7. Watson, N.; Davies, S.; Wevill, D. Air Monitoring: New Advances in Sampling and Detection. Sci. World J. 2012, 11, 2582–2598. [Google Scholar] [CrossRef]
  8. Geiko, P.P.; Smirnov, S.S.; Samokhvalov, I.V. Long Path Detection of Atmospheric Pollutants by UV DOAS Gas-Analyzer. In Proceedings of the 25th International Symposium on Atmospheric and Ocean Optics: Atmospheric Physics, Novosibirsk, Russia, 1–5 July 2019; SPIE: Novosibirsk, Russia, 2019; Volume 11208, pp. 596–600. [Google Scholar]
  9. Yoshii, Y.; Kuze, H.; Takeuchi, N. Long-path measurement of atmospheric NO2 with an obstruction flashlight and a charge-coupled-device spectrometer. Appl. Opt. 2003, 42, 4362–4368. [Google Scholar] [CrossRef]
  10. Perevoshchikova, M.; Sandoval-Romero, G.E.; Argueta-Diaz, V. Developing an optical sensor for local monitoring of air pollution in México. J. Opt. Technol. 2009, 76, 274–278. [Google Scholar] [CrossRef]
Figure 1. 3D image of the proposed mobile monitoring system.
Figure 2. Connections between the devices of the sensing section and the Raspberry Pi Zero W board.
Figure 3. Modules constituting the sensing unit integrated inside the drone: ADS1115 ADC module (a), ZH03A/B laser dust sensors (b), CCS811 gas sensor (c), MiCS-6814 sensor (d), microphone sensor module (e), IR camera module (f), and NEO M8 GNSS receiver module (g).
Figure 4. Flowchart of the firmware executed by the sensor-based drone for pollutants detection and waste fire localization.
Figure 5. Time trend of the absorbed current during a patrol flight using the camera and GPS receiver.
Figure 6. Physical connection scheme for supplying the various peripheral devices included in the developed sensing section.
Figure 7. Pictures describing two application scenarios: in a suburban area (a); in a landfill (b).
Figure 8. Time trend of the air concentrations of gaseous and particulate species acquired by the developed drone with a sampling period of 5 s: first test campaign in a suburban area (a) and close to a very busy state road (b).
Figure 9. CO2 trend acquired during the third test campaign, carried out to verify the functionality of the IR camera and the visual recognition algorithm.
Figure 10. Screenshots of the local application used to upload the drone’s data to the cloud platform (a) and of the mobile application for monitoring the measurements acquired by the drone (b).
Table 1. Breakdown of costs of the various components constituting the sensor-based drone.
Component | Indicative Cost (EUR)
Phantom 3 Drone | 699
Raspberry Pi Zero W Board | 12
ADS1115 Breakout board | 3
ZH03A/B Sensor module | 8
CCS811 Breakout board | 10
MiCS6814 Breakout board | 16
MAX4466 Microphone module | 3
OV5647 IR Camera module | 12
NEO 8M GNSS receiver | 5
Battery | 5
MP 207 DC/DC Converter | 0.5
Battery protection board | 0.3
Table 2. Supply current values of the different components included in the developed mobile sensor node.
Component | Operative Modality | Supply Current
Raspberry Pi Zero W | Idling | 120 mA
Raspberry Pi Zero W | Loading LXDE | 160 mA
Raspberry Pi Zero W | Shoot 1080p video (Raspberry + Pi Camera) | 230 mA
ADS1115 | Active mode (TA = 25 °C) | 150 μA (TYP)/200 μA (MAX)
ADS1115 | Power-down (TA = 25 °C) | 0.5 μA (TYP)/2 μA (MAX)
MiCS6814 | Heating current | 32 mA (RED sensor), 26 mA (OX sensor), 30 mA (NH3 sensor)
CCS811 | During measuring | 26 mA
CCS811 | Sleep mode | 19 µA
ZH03A/ZH03B | Working current | <120 mA
ZH03A/ZH03B | Dormancy current | <10 mA
MAX4466 | Active mode (TA = 25 °C) | 24 μA (TYP)/48 μA (MAX)
MAX4466 | Shutdown (TA = 25 °C) | 5 nA (TYP)/50 nA (MAX)
NEO M8N | Acquisition | 32 mA
NEO M8N | Tracking mode | 30 mA
NEO M8N | Power save mode | 13 mA
OV5647 | Dormancy current | 20 µA
OV5647 | Working current | 110 mA
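As a worked example of how the values in Table 2 can be combined to size the battery supplying the sensing section, the short Python sketch below sums a worst-case set of active-mode currents taken from the table and estimates the capacity required for a given mission duration. The 20-minute flight time and the 80% usable-capacity margin are illustrative assumptions introduced here for the example, not figures taken from the paper.

```python
# Back-of-envelope sizing of the sensing-section battery from the supply
# currents listed in Table 2. The current values come from the table; the
# flight time and the usable-capacity margin are illustrative assumptions.

active_currents_mA = {
    # The 230 mA row for the Raspberry Pi Zero W already includes the
    # Pi Camera, so the OV5647 is not added a second time.
    "Raspberry Pi Zero W + camera (1080p video)": 230,
    "ADS1115 (active, max)": 0.2,
    "MiCS6814 heaters (RED + OX + NH3)": 32 + 26 + 30,
    "CCS811 (measuring)": 26,
    "ZH03A/B (working, upper bound)": 120,
    "MAX4466 (active, max)": 0.048,
    "NEO M8N (acquisition)": 32,
}

total_mA = sum(active_currents_mA.values())  # worst-case draw, about 496 mA

flight_time_h = 20 / 60      # assumed 20 min patrol flight
usable_fraction = 0.8        # assume only 80% of nominal capacity is usable

required_mAh = total_mA * flight_time_h / usable_fraction

print(f"Worst-case current draw: {total_mA:.0f} mA")
print(f"Required battery capacity for a {flight_time_h * 60:.0f} min flight: "
      f"{required_mAh:.0f} mAh")
```

Under these assumptions the sketch suggests a capacity of roughly 200 mAh for the sensing section alone; any real sizing would add further margin for current peaks and battery ageing.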