Article

Intercomparison of PurpleAir Sensor Performance over Three Years Indoors and Outdoors at a Home: Bias, Precision, and Limit of Detection Using an Improved Algorithm for Calculating PM2.5

Lance Wallace
Independent Researcher, Santa Rosa, CA 95409, USA
Sensors 2022, 22(7), 2755; https://doi.org/10.3390/s22072755
Submission received: 5 February 2022 / Revised: 29 March 2022 / Accepted: 31 March 2022 / Published: 2 April 2022
(This article belongs to the Section Environmental Sensing)

Abstract

Low-cost particle sensors are now used worldwide to monitor outdoor air quality. However, they have only been in wide use for a few years. Are they reliable? Does their performance deteriorate over time? Are the algorithms for calculating PM2.5 concentrations provided by the sensor manufacturers accurate? We investigate these questions using continuous measurements from four PurpleAir monitors (8 sensors) under normal conditions inside and outside a home for 1.5–3 years. A recently developed algorithm (called ALT-CF3) is compared to the two existing algorithms (CF1 and CF_ATM) provided by Plantower, the manufacturer of the PMS 5003 sensors used in PurpleAir PA-II monitors. Results: The Plantower CF1 algorithm lost 25–50% of all indoor data, due in part to the practice of assigning zero to all concentrations below a threshold. None of these data were lost using the ALT-CF3 algorithm. Approximately 92% of all data showed precision better than 20% using the ALT-CF3 algorithm, but only approximately 45–75% of data achieved that level using the Plantower CF1 algorithm. The limits of detection (LODs) using the ALT-CF3 algorithm were mostly under 1 µg/m3, compared to approximately 3–10 µg/m3 using the Plantower CF1 algorithm. The percentage of observations exceeding the LOD was 53–92% for the ALT-CF3 algorithm, but only 16–44% for the Plantower CF1 algorithm. At the low indoor PM2.5 concentrations found in many homes, the Plantower algorithms appear poorly suited.

1. Introduction

In recent years, a revolution in developing small low-cost particle sensors has occurred. A variety of sensors have been evaluated in multiple laboratory studies [1,2,3,4,5,6]. Their behavior at outdoor sites has also been studied extensively; a selection of these studies is provided in [7,8,9,10,11,12]. The US Environmental Protection Agency has provided guidance on their use in outdoor settings [13,14]. However, fewer studies have focused on their use indoors [15,16,17,18,19,20,21].
Indoor studies are particularly relevant, since most people spend most of their time indoors [22,23]. To date, most epidemiology studies have been able to use only outdoor measurements to estimate human exposure; indoor exposures may or may not be estimated, but there are seldom any measured data [24,25,26]. Some studies such as the Harvard 6-City Study or the EPA’s Particle TEAM study have been able to use personal and indoor monitors to provide an estimate of the total personal or indoor exposure to particles [27,28,29,30,31]. However, these studies have been limited to relatively short-term exposures because of the expense of acquiring, setting up, and taking down expensive research-grade instruments in homes.
Now, it has finally become possible to determine long-term exposure to particles, since the new sensors are normally quiet and inconspicuous and can operate continuously without the need to maintain, clean, or manually download the data they collect. Downloading is made particularly easy by the PurpleAir company, Draper, Utah, USA (https://www2.purpleair.com/, accessed on 30 March 2022), which maintains a web-accessible database of all data except that for users who have requested privacy.
The overriding concern in using low-cost particle sensors is their reliability and accuracy. Accuracy is a function of bias and precision. However, bias can be corrected; precision cannot.
The manufacturer of the PMS 5003 sensors used in PurpleAir PA-II monitors is the Plantower company (http://www.plantower.com/en/, accessed on 30 March 2022). Plantower employs two proprietary algorithms (CF1 and CF_ATM) to estimate PM2.5. For PM2.5 concentrations less than approximately 28 µg/m3, which constitute the vast majority of measured concentrations in many nations, the two algorithms give identical results [19]. Plantower provides no information about the composition, or indeed the very existence, of a calibration aerosol. Nor does the company provide any other data about the calculations involved in translating observed numbers of particles in six size categories into a mass concentration estimate. We have shown that both of these algorithms are seriously flawed. We developed a transparent and reproducible algorithm (ALT-CF3) that was shown to outperform the Plantower algorithms with respect to bias, precision, limit of detection, and distribution fitting [19,32]. Our results suggest that the Plantower algorithms overestimate PM2.5 concentrations by approximately 40–50%, as has also been found by other investigations [1,3,8,11,32]. However, this bias can be corrected. What is much more serious is the Plantower problem with precision.
We show in this paper that estimates of precision may be severely impacted by the flawed Plantower practice of assigning a value of zero to concentrations not reaching a certain predefined threshold. In particular, we show that indoor concentrations in a home with reasonably typical indoor activities such as cooking, cleaning, and other particle-producing activities fall below this threshold for such a large fraction of time that estimates of both precision and PM2.5 concentration are limited to a greatly reduced fraction of all observations. We also use our 18 month or 3 year data collection periods to investigate the question of how the performance may change over extended periods. We also compare the LODs of the Plantower and ALT-CF3 algorithms, and show the fraction of outdoor and indoor measurements falling below the LODs for each algorithm.

2. Materials and Methods

All data were collected inside or outside a home with two residents. The indoor PurpleAir monitors were placed on a desk, dresser, or cart approximately 1.0 m high (Figure S1). The outdoor monitor was hung from a bracket approximately 2 m above the ground and 15 cm from the home (Figure S2). The home is a detached 1-story building of approximately 400 m3 volume located in the city of Santa Rosa, CA, USA. For the 18-month period from 10 January 2019 to 18 June 2020, two indoor PurpleAir monitors collected data every 80 s (later, every 2 min). For the next 18-month period (18 June 2020–14 January 2022), two additional PurpleAir monitors collected both indoor and outdoor data.
Heating is supplied by a natural gas furnace, and the temperature is normally set to 72 °F (22 °C). A gas stove and electric oven are used for cooking on approximately 5 or 6 days a week. A central air conditioner is used at times in the summer. The central fan is normally on, and the main filter is equipped with electrically charged wires to attract particles. The air exchange rate of the home was measured on multiple occasions and was generally in the range from 0.2 to 0.3 air changes per hour (80–120 m3/h). Two blower door tests confirmed that the home was well constructed, with a low baseline air change rate. Windows are normally closed except in summer, when the air conditioner is not being used.
Some experiments were performed in a closed room involving single puffs from a vaping pen containing marijuana liquid. These experiments resulted in increased PM2.5 levels for less than 4% of the time. Additional experiments were performed using three co-located research-grade SidePak AM510 optical particle monitors (TSI Inc., Shoreview, MN, USA; https://tsi.com/home/, accessed on 30 March 2022). The SidePak monitors were equipped with a cleaned and greased PM2.5 impactor. Flow rates were set at 1.7 Lpm using a TSI 4140 inline flow meter. Approximately 18 typical indoor sources were examined. Cooking sources such as broccoli, chicken, and hamburger were heated using miniature pans (cast iron, stainless steel, and coated pans) on a laboratory hotplate (Cimarec Model SP-131015, Thermo Fisher Scientific, Waltham, MA, USA; https://www.thermofisher.com/us/en/home.html, accessed on 30 March 2022) to temperatures sufficient to create particle numbers above background. In some cases, extended heating times were used to produce blackened shrimp or burnt toast (Figure S3).

The ALT-CF3 Algorithm

The central method of estimating PM2.5 concentrations from PurpleAir monitors was explored in two papers [19,32]. The method is well known and has been used for many years by persons working with optical particle monitors or companies manufacturing optical monitors such as Climet (http://www.climet.com/, accessed on 30 March 2022), MetOne (https://metone.com/, accessed on 30 March 2022), and TSI (https://tsi.com/, accessed on 30 March 2022). Optical particle monitors often provide particle counts for several size categories, 0.3–0.5 µm, 0.5–1 µm, and 1–2.5 µm. There are 3 additional larger categories but they do not contribute to PM2.5. Given the number N of particles in a size category, one can estimate the total volume of the particles by choosing a diameter d within the range of the size category, and calculating the total volume V associated with that diameter d using the equation V = Nπ d3/6. The diameter d is typically either the arithmetic or geometric mean of the boundaries of the size category. For example, the diameter of the 0.3–0.5 µm category is either the arithmetic mean (0.4 µm) or the geometric mean (0.37 µm). Once the total volume of all of the desired size categories is determined, multiplying by the density of the aerosol mixture provides the mass (e.g., PM1 if only the two smallest categories are employed, or PM2.5 if the three smallest size categories are used). In the case of the PurpleAir monitors, the chosen diameter was the geometric mean and the density chosen for the particles was that of water (1 g/cm3) [19,32]. The density of typical PM2.5 particles depends on local conditions, but is often taken to be in the range of 1–2 g/cm3. The actual choice of the density is unimportant, because the ultimate calibration of the monitors will depend on comparisons with reference monitors, and that comparison will give the calibration factor (CF) required to bring the PurpleAir estimate into agreement with the reference monitor.
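For concreteness, the calculation just described can be sketched in a few lines of code. The sketch below is illustrative only (the authoritative description of ALT-CF3 is in [19,32]); it assumes the cumulative particle counts per deciliter of air (particles of at least 0.3, 0.5, 1.0, and 2.5 µm) reported by the PMS 5003, and the bin coefficients follow directly from the geometric-mean diameters and unit conversions above, with the nominal density of 1 g/cm3 and the empirical calibration factor of 3.

```python
import math

# Boundaries (um) of the three smallest Plantower size bins used for PM2.5.
BIN_EDGES_UM = [(0.3, 0.5), (0.5, 1.0), (1.0, 2.5)]
DENSITY_G_CM3 = 1.0   # nominal density assumed in [19,32]
CF = 3.0              # empirical ALT-CF3 calibration factor

def alt_cf3_pm25(n_ge_03, n_ge_05, n_ge_10, n_ge_25, cf=CF):
    """Estimate PM2.5 (ug/m3) from cumulative particle counts per deciliter.

    Arguments are the counts of particles >= 0.3, 0.5, 1.0, and 2.5 um
    per 0.1 L of air, as reported by the Plantower PMS 5003.
    """
    # Convert cumulative counts into counts within each size bin.
    bin_counts = [n_ge_03 - n_ge_05, n_ge_05 - n_ge_10, n_ge_10 - n_ge_25]
    pm25 = 0.0
    for (lo, hi), n in zip(BIN_EDGES_UM, bin_counts):
        d = math.sqrt(lo * hi)          # geometric mean diameter (um)
        v = math.pi / 6.0 * d ** 3      # volume of one particle (um^3)
        # counts per dL -> per m3 (x 1e4); um^3 at 1 g/cm3 -> 1e-6 ug per particle
        pm25 += n * 1e4 * v * DENSITY_G_CM3 * 1e-6
    return cf * pm25

# Hypothetical example: 300, 80, 10, 1 particles/dL in the four channels.
print(round(alt_cf3_pm25(300, 80, 10, 1), 2))
```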
The first application of this method to PurpleAir monitors was provided in 2020 [19]. In that paper, PM2.5 aerosols produced by vaping marijuana liquid were measured using two PurpleAir PA-II monitors (four Plantower sensors) co-located with three SidePak (TSI) and two Piezobalance (Kanomax, Osaka, Japan, https://kanomax.biz/asia/, accessed on 30 March 2022) research-grade monitors. The SidePaks and Piezobalances were calibrated by comparison with a gravimetric system employing a pump, filter, and microbalance at Stanford University [19]. The ultimate calibration factor for the PurpleAir monitors was 3.0 (SE = 0.015), based on 47 experiments carried out over one year (Figure S4 in the Supporting Information). This CF applies to the indoor aerosol mixture encountered during the experiments, which took place in a room of a detached house and included PM2.5 from typical household activities together with the PM2.5 produced by vaping marijuana liquid.
The next application of the calibration method took place in 2021 [32]. In this study, 33 PurpleAir sites within 500 m of 27 EPA regulatory (Federal Reference Method (FRM) or Federal Equivalent Method (FEM)) monitors were compared over extended periods of time (177,000 hourly average measurements). The same method of calculating PM2.5 as in [19] was adopted. Since this was an ALTernative to the Plantower algorithms, it was given the name ALT-CF3. Four different analytical approaches resulted in an estimated CF of 3.05 (SE 0.05). The ALT-CF3 method can be applied to any and all operating PurpleAir monitors accessible on either the PurpleAir API website (https://api.purpleair.com/, accessed on 30 March 2022) or the main PurpleAir map website (https://www2.purpleair.com/, accessed on 30 March 2022). The ALT-CF3 algorithm is presently the only one of several alternative methods available on those sites that does not depend on the Plantower algorithms.
It is important to note that the good performance of the Plantower sensors is not necessarily evidence that the particle counts are accurate. In fact, there is much evidence that the counts are NOT accurate (e.g., [2]). There is even evidence from the ALT-CF3 results that the particle numbers are probably underestimated. That is because a multiplier of 3 must be applied to a calculation using an assumed density of 1 g/cm3, whereas PM2.5 is expected to have a density on the order of 1–1.5 g/cm3. If so, the multiplier of 3 needed to match the reference monitors must indicate an underestimate of particle numbers by approximately a factor of 2 or 3. This is an example of the way that a bias, if reasonably constant, can be corrected to provide good accuracy, but only if the precision is good, as it is for the ALT-CF3 algorithm.
In the following sections, to save space (particularly in tables), we will often use “CF3” as a synonym for “ALT-CF3”.
PurpleAir PA-II monitors include two independent Plantower PMS 5003 sensors, which we identify as a and b. We define precision for each monitor as the absolute difference between the two PM2.5 readings divided by their sum: abs(a − b)/(a + b). The monitors employed in this study are identified by the numbers 1–4. For example, the two sensors in monitor 1 are identified as 1a and 1b.
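As a minimal sketch (not the author's processing code), the pairwise precision and the 0.2 cutoff used throughout this paper could be computed as follows:

```python
def pair_precision(a, b):
    """Precision of one PM2.5 reading pair: |a - b| / (a + b)."""
    total = a + b
    return abs(a - b) / total if total > 0 else float("nan")

def filter_by_precision(pairs, cutoff=0.2):
    """Keep only (a, b) pairs whose precision is better than the cutoff."""
    return [(a, b) for a, b in pairs if pair_precision(a, b) < cutoff]

# Example: the second pair (2.0 vs. 3.5) has precision 0.27 and is dropped.
print(filter_by_precision([(4.1, 4.4), (2.0, 3.5)]))
```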

3. Results

Monitors 1 and 2 collected data for 3 years. Monitor 1 was indoors the entire time; monitor 2 was indoors for 2 years and outdoors for 1 year. A total of 829,907 observations were made; some were 80 s averages, most were 2 min averages. Only the ALT-CF3 values with a precision better than 20% (under 0.2) were accepted. This left 92% of the data (763,102 observations).
Monitors 3 and 4 collected data over the last 18 months of this study (18 June 2020 to 14 January 2022). Monitor 3 was outdoors for all but one month; monitor 4 was indoors the entire time. There were 406,310 observations in total, and 370,906 observations remained after limiting the data to those with a precision under 0.2. There were 353,256 matched pairs of indoor–outdoor PM2.5 measurements. They appeared to be close to log-normally distributed, spanning approximately 5 orders of magnitude from 0.01 to 450 μg/m3 (Figure 1).
A log-normal distribution would form a straight line on the graph. The departure from log-normal behavior at the higher concentrations may be due to the wildfires affecting this northern California site in 2021. There appears to be a fairly constant fraction relating indoor to outdoor concentrations.
The same observations were plotted using the Plantower CF1 algorithm (Figure 2). More than 50,000 indoor measurements (17% of the total) and 14,000 outdoor measurements fell below the cutoff value of 0.01 µg/m3, at which the Plantower algorithm assigns a value of zero. This accounts for the missing data between approximately −5 and −2 normal probability standard deviations.
It should be noted that although the Plantower CF1 cutoff is at 0.01 µg/m3, the ALT-CF3 values for these same observations are 20–40 times higher than 0.01 µg/m3 at a Z-score of −2. The ALT-CF3 algorithm never returns a value of zero, since particles are always present in the smallest size category (0.3–0.5 µm). Because of the lack of information about the Plantower CF1 algorithm, it is not known why so many observations are assigned a value of zero.
It can also be seen that the distribution of concentrations using the Plantower CF1 algorithm is affected by the many zeros and cannot be fitted with a log-normal curve.
The PM2.5 measurements for all monitors and both time periods are supplied in Tables S1 and S2 in the Supporting Information (SI). Mean indoor PM2.5 values ranged from 3.6 to 5.7 µg/m3. These values are quite comparable to those found in a previous study including 91 PurpleAir indoor monitors averaged over times from 796 to 13,564 h. The observed means ranged from a median of 3.4 µg/m3 to a 75th percentile value of 5.5 µg/m3 [15].

3.1. Relative Bias

In this section, we compare the bias relative to the mean of all eight sensors, first for the ALT-CF3 algorithm and then for the Plantower CF1 algorithm. We look at both indoor and outdoor measurements. The bias relative to the mean of ALT-CF3 estimates of PM2.5 (µg/m3) for sensors a and b was calculated for the final 18 month period when all 4 monitors (8 sensors) were operating. The mean (SE) overall bias was 3.0% (0.7%), ranging from 0.5% to 5% (Table S3).
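The exact averaging scheme behind Tables S3 and S4 is not spelled out here; one plausible reading, computing each sensor's fractional deviation from the grand mean of all concurrently operating sensors, is sketched below (illustrative only):

```python
import numpy as np

def relative_bias(readings):
    """Relative bias of each sensor vs. the all-sensor mean.

    readings: 2-D array (n_observations x n_sensors) of concurrent PM2.5 values.
    Returns one fractional bias per sensor, e.g. 0.03 = 3% above the grand mean.
    """
    sensor_means = np.nanmean(readings, axis=0)   # mean of each sensor column
    grand_mean = np.nanmean(sensor_means)         # mean over all sensors
    return (sensor_means - grand_mean) / grand_mean
```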
The same calculations of bias for the Plantower CF1 algorithm resulted in the loss of more than 100,000 observations for the indoor monitors 1 and 4 due to the large number of reported zeros for the PM2.5 observations. Both indoor and outdoor readings from monitor 2 lost approximately 50% of all observations. Monitor 3 lost more than 75,000 observations (Table S4).
There was also an effect of increased mean values, even beyond the 40% overestimate often noted for the Plantower CF1 algorithm. The mean indoor values were approximately 100% higher than for the ALT-CF3 algorithm, and the mean outdoor values were more than 200% higher. The mean (SE) overall bias was 4.0% (1.5%), ranging from 0.6% to 8.2%. These bias calculations are only slightly worse for the Plantower CF1 algorithm, because they are “helped” by not having to consider the thousands of deleted observations, which would otherwise lead to different (probably higher) estimates of bias.
The absolute bias of these four monitors with respect to regulatory FEM or FRM monitors could not be determined over the three years of this study, since no regulatory monitor was nearby. However, a previous study compared 33 PurpleAir outdoor monitors to 27 nearby regulatory monitors [32]. The bias (ratio of PurpleAir monitor PM2.5 using the ALT-CF3 calibration to regulatory monitor PM2.5) had a median value of 0.96 (IQR 0.77 to 1.21).

Comparison with FEM Bias

The US EPA operates a program determining the bias of selected Federal Equivalent Method (FEM) PM2.5 results by setting up a side-by-side gravimetric monitor employing the Federal Reference Method (FRM) (https://www.epa.gov/outdoor-air-quality-data/pm25-continuous-monitor-comparability-assessments, accessed on 30 March 2022). The bias of the continuous FEM monitor with respect to the collocated FRM monitor is calculated over a 3 year period. A sample of 61 reports from the state of California showed a mean absolute bias of 20.5% (SE 3.2%) for the FEM monitor compared to the FRM monitor. Of these, 41 calculated biases were positive, with a mean (SE) of +27.3% (4.5%). The median bias ratio was 1.21 (IQR 1.04 to 1.25). These results suggest that the range of the absolute bias of the 33 PurpleAir monitors employed in [32] using the ALT-CF3 algorithm is comparable to the absolute bias of the 61 FEM monitors compared to FRM monitors in the EPA comparability assessments.

3.2. Precision

For the entire 3 year period, median precision for monitors 1 and 2 was between 4.6% and 5.7% using the ALT-CF3 algorithm (Table 1). The Plantower CF1 median precision for the same data ranged between 8.4% and 20.5% with, however, more than 100,000 fewer measurements for monitor 1 and 50,000 fewer for monitor 2 due to the excessive number of zeros reported by the CF1 algorithm.
Applying the precision cutoff of 0.2 for the Plantower CF1 measurements resulted in an additional loss of 25% of the remaining data for monitor 1 and 45–50% of the remaining data for monitor 2. Overall, the loss of data using the Plantower CF1 algorithm amounted to 277,488 (36%) measurements for monitor 1 and 387,991 (52%) of all measurements for monitor 2 (Figure 3).
The loss of observations over a 3 year period for the Plantower CF1 algorithm compared to the ALT-CF3 algorithm was 36% for monitor 1 and 52% for monitor 2.
For the 18 month second period employing monitors 3 and 4, the loss of data due to employing the CF1 algorithm was similar for the indoor data at 33%, but improved for the outdoor data (20%) (Table S5).

3.3. PM2.5 Concentrations of Zero

From the 3 year and 18 month studies, the Plantower CF1 algorithm reported PM2.5 concentrations of zero for approximately 60,000 to 160,000 indoor measurements (12% to 23% of the total) and 10,000 to 35,000 outdoor measurements (4–14% of the total) (Table 2). The ALT-CF3 algorithm, by contrast, reports no concentrations of zero for the same set of observations. This is because there are never occasions when the number of particles in the smallest size category (0.3–0.5 µm) falls to zero.
The prevalence of zeros produced by the Plantower CF1 and CF_ATM algorithms in indoor data, ranging from 12 to 23% of the observations, is due entirely to the Plantower decision to define all values below a certain cutoff as zero. The cutoff appears to be 0.01 µg/m3 as defined by both the Plantower CF1 and CF_ATM algorithms. However, we found that only a minute percentage (15 of 50,000) of these same observations were below 0.01 µg/m3 as defined by the ALT-CF3 algorithm. Instead, the ALT-CF3 values for these same observations ranged up to 30 µg/m3, although most were below 1 µg/m3. We can find no apparent reason for this result, which is certainly not reflected by the particle numbers reported in the three size categories. This loss of data will be impossible to recover for any investigator using either the Plantower CF1 or CF_ATM algorithm, no matter what further modifications they apply to the Plantower algorithms. Statisticians do not approve of replacing concentrations below the LOD with zero, the LOD itself, or half the LOD [33]. In this case, the cutoff is in fact far below the LOD, as discussed in Section 3.5.

3.4. Variation of Precision over Time

We regressed the measured precision against time of operation (3 years for monitors 1 and 2; 18 months for monitors 3 and 4), treating indoor and outdoor measurements separately. Results were split evenly, showing increased precision in three cases and decreased precision in the remaining three cases (Table 3).

3.5. Limit of Detection (LOD)

A definition for the LOD for the case of analyzing a physicochemical sample can be found in many publications (e.g., [34]). The definition envisions analyzing several samples expected to have concentrations near the LOD. If the results of analyzing the several samples shows that the mean is more than 3 times the standard deviation, then the mean value is considered to be near (somewhat above) the LOD.
For the case of continuous sampling, a different definition is needed. One approach was advanced in [18]. In this definition, the LOD occurs at the lowest mean value µ above which more than 95% of mean values exceed their standard deviations σ by more than a factor of 3 (µ/σ > 3). In practice, this requires identifying all cases with µ/σ < 3, sorting by ascending µ, and then counting the number of cases with µ/σ < 3 in, say, blocks of 100. When a block of 100 is reached with fewer than 5 values of µ/σ < 3, a candidate for the LOD has been found within that block. However, further exploration at higher mean concentrations may show a new block of 100 with 5 or more cases of µ/σ < 3, at which point that new block contains a new (higher) candidate for the LOD. The search ends when all the data have been explored, but in practice it ends much earlier, when there are increasingly great “distances” between blocks of 100 containing 5 or more values of µ/σ < 3.
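A rough sketch of this block-of-100 search is given below. It assumes that a per-observation mean µ and standard deviation σ (for example, from the paired a and b sensor readings) are already available; the exact implementation in [18] may differ in details.

```python
import numpy as np

def continuous_lod(mu, sigma, block=100, max_fail=5, ratio=3.0):
    """Estimate the LOD by the block-search rule described above.

    mu, sigma: per-observation mean and standard deviation (e.g., of the
    a and b sensor readings). A block of `block` observations (sorted by
    ascending mu) "fails" when `max_fail` or more of them have mu/sigma < ratio.
    The LOD candidate is the top of the highest failing block.
    """
    order = np.argsort(mu)
    mu, sigma = np.asarray(mu, float)[order], np.asarray(sigma, float)[order]
    fails = mu < ratio * sigma            # cases with mu/sigma < 3
    lod = 0.0
    for start in range(0, len(mu) - block + 1, block):
        if fails[start:start + block].sum() >= max_fail:
            lod = mu[start + block - 1]   # new (higher) LOD candidate
        # otherwise keep the previous candidate; later blocks may still raise it
    return lod
```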
LODs were determined for all four monitors. For monitor 2, which spent time indoors and outdoors, the LOD was calculated separately for each location. The LODs calculated for the ALT-CF3 algorithm ranged from 0.6 to 1.3 µg/m3, compared to from 2.9 to 9.9 µg/m3 for the Plantower CF1 algorithm (Table 4). Approximately 53–92% of the data exceeded the CF3 LODs, but only 16–44% of the data exceeded the Plantower CF1 LODs (Figure 4).

3.6. Comparison with Co-Located Research-Grade SidePak Monitors

Typical indoor particle sources were examined by the three PA-II PurpleAir monitors co-located with three SidePak AM510 monitors. These sources included cooking oils (butter, olive oil, and coconut oil); vegetables (asparagus, broccoli, and red pepper); meats (blackened shrimp, chicken, and hamburger); wooden kitchen matches; dust or SVOCs collected over more than a year on coated or stainless steel pans or on glass Petri dishes; and burnt toast. Outdoor aerosol was also studied; however, during the time of study the outdoor air was very clean, so the CFs were not tested over a reasonable range of concentrations. Most of the cooking sources were placed on three cleaned miniature pans: cast iron, coated, and stainless steel. Before cleaning, the pans were heated on the hotplate to temperatures of 400–500 °C to drive off the collected dust. Temperatures were gradually increased for each source until particles began to be observed. The hotplate was then turned off without further increase, except in the cases when blackened foods were desired (e.g., blackened shrimp, broccoli, and burnt toast).
The Plantower and ALT-CF3 algorithms were applied to the results, requiring all PurpleAir precision to be better than 20%. This requirement reduced the original 2300 2 min averages by approximately 8% to 2158 valid observations for the ALT-CF3 algorithm, and by almost 50% to 1209 valid observations for the Plantower CF1 algorithm. A particularly striking loss of data occurred for burning wooden kitchen matches: from 256 measurements for the ALT-CF3 algorithm down to 16 measurements for the Plantower CF1 algorithm, a loss of approximately 95% of all data. This loss was not due to the Plantower CF1 assignment of zero to measurements below the threshold, since there were only 15 such cases, the concentrations in general being quite high.
The ratios of each of the two algorithms to the SidePak results for all 17 sources are shown in Figure 5. The CFs for the ALT-CF3 algorithm are typically near 0.4. This would be expected for an aerosol mixture with a density near 1 g/cm3, because the SidePak is calibrated with Arizona Road Dust having a density of 2.6 g/cm3 (1/2.6 ~ 0.4). Although the CF depends on other characteristics (refractive index, RH, and particle composition), density may in many instances be the controlling factor. The main exception to the general CF of 0.4 was the ignition of paraffin wax, which resulted in a short-lived fire and produced an estimated CF of only 0.16. However, this is well explained by the fact that the SidePaks are able to handle occupational-level high concentrations (which in this case reached a maximum of 12 mg/m3), whereas the PurpleAir monitors have an upper limit of approximately 1 mg/m3. In keeping with the general tendency of the CF1 algorithm to overpredict PM2.5 concentrations, the CFs reported by the CF1 algorithm are higher (mean approximately 0.7) and more variable (0.6–0.8). The error bars are also larger for the CF1 algorithm.
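As an illustration only (the paper does not give its exact error-propagation formula), the CF for one source and its propagated standard error could be computed from the co-located time series using the standard first-order approximation for a ratio of means:

```python
import numpy as np

def cf_with_propagated_se(purpleair, sidepak):
    """Ratio of mean PurpleAir to mean SidePak PM2.5 for one source,
    with a first-order propagated standard error for the ratio."""
    pa, sp = np.asarray(purpleair, float), np.asarray(sidepak, float)
    m_pa, m_sp = pa.mean(), sp.mean()
    se_pa = pa.std(ddof=1) / np.sqrt(len(pa))
    se_sp = sp.std(ddof=1) / np.sqrt(len(sp))
    cf = m_pa / m_sp
    se_cf = cf * np.sqrt((se_pa / m_pa) ** 2 + (se_sp / m_sp) ** 2)
    return cf, se_cf
```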

3.7. Limitations

Only four monitors at a single location were tested in this study. Other long-term studies in other locations would be desirable.

4. Discussion

All major findings regarding the Plantower CF1 algorithm stand on their own and do not depend on the calibration factor of 3 used in the ALT-CF3 algorithm. These findings for the Plantower CF1 algorithm include (1) the assignment of zero to values smaller than a cutoff concentration; (2) the resulting loss of data, which increases at lower concentrations such as those encountered indoors; (3) a resulting decline in precision; (4) a high LOD in the 3–10 µg/m3 range, leading to major portions of datasets falling below the detection limit.
The generally good accuracy shown by the PurpleAir monitors is not an indication of the accuracy of the particle counts themselves. In fact, there is evidence that the particle counts are underestimated by a factor of approximately 2 [19]. However, because the precision of the sensors is good to excellent, this bias appears to be correctable sufficiently to give good results when compared to reference monitors.
Regarding the question of possible decline over time of sensor response, our 3 year period showed improved precision for three sensor/locations and declining precision for the other three sensor/locations, so presented little evidence for a general decline.
The comparisons with the research-grade SidePak indicated considerable uniformity and stability of the ALT-CF3 algorithm, with the PurpleAir/SidePak ratio not far from 0.4 for 16 of the 17 sources. The 17th source (the fire due to paraffin wax ignition) produced concentrations exceeding the PurpleAir upper limit of approximately 1 mg/m3, and therefore the observed ratio of 0.16 is not a true exception to that observation.
The comparisons with the research-grade SidePak monitors do not in themselves have any bearing on the accuracy of the PurpleAir monitors, since that would require concurrent filter collection followed by gravimetric methods. This cannot reasonably be performed over a period of an hour or so since not enough weight would collect on the filters. However, several studies have measured SidePak CFs for different particle sources. Perhaps the most important source to be studied to determine calibration factors is outdoor air, even when indoor air is also being studied. This is because air of ambient origin is normally a substantial contributor to the total indoor aerosol mixture. Unfortunately, the study most focused on establishing calibration factors for the SidePak for outdoor air found the CFs in 18 cases over several months to vary from 0.31 to 1.05 [35]. The authors concluded that perhaps outdoor air is so changeable in composition, refractive index, density, and size distributions that no single CF will be sufficient to characterize the SidePak response to outdoor aerosol.
A slope of 3.38 was found for the SidePak based on comparisons with reference monitors for ambient air [36]. This would translate to a CF of 0.29 (1/3.38) if the intercept were close to zero, but in fact the intercept was 5.8 µg/m3, so the slope might change if forced through zero.
A study of 1400 buildings taken from the PurpleAir sensor network in two California areas during times of wildfires calculated infiltration ratios for days influenced or not influenced by fires [37]. The ratios were lower during wildfire days indicating that residents took action to protect themselves from the worsened outdoor air. The authors identified and removed periods when indoor sources were evident. They identified 16 regulatory monitoring sites within 5 km of their selected outdoor PurpleAir sites and found a calibration factor of 0.53 for the Plantower CF1 algorithm. This suggests that the CF1 algorithm overestimates PM2.5 concentrations by nearly a factor of 2, somewhat more than other measures of approximately 40–50% overestimates. However, it is in close agreement with the CF of 0.48 found in another study of wildfire smoke [38].
A laboratory study of responses to 24 common indoor sources for 7 selected low-cost monitors including PurpleAir PA-II monitors compared to two research monitors was performed [4]. Although the authors did not attempt to determine calibration factors for the monitors, they did estimate the densities of the aerosol mixtures, which would affect the calibration factors. The densities varied widely, which would indicate that the CFs would also vary widely according to the dominant indoor source at any time. It may be worth quoting their final paragraph:
“The evaluated versions of the AirBeam, AirVisual, Foobot, and Purple Air II monitors were of sufficient accuracy and reliability in detecting large sources that they appear suitable for measurement-based control to reduce exposures to PM2.5 mass in homes. The logical next steps in evaluating these monitors are to study their performance in occupied homes and to quantify their performance after months of deployment.”
This study is a first step toward carrying out their recommendation.

5. Conclusions

At typical levels of both indoor and outdoor PM2.5, the Plantower CF1 and CF_ATM algorithms lose an unacceptably large fraction of observations due to the choice of setting a threshold below which observations are assigned a value of zero. This practice is not supported by statistical theory. It causes the loss of substantial amounts of data (12–23% in our three-year study). It also increases the already high overestimates of PM2.5, since the lower concentrations are deleted from the dataset. No models employing either of the two Plantower algorithms can restore these values. The ALT-CF3 algorithm loses no data, since it depends on particle counts that never go to zero. The ALT-CF3 algorithm also has improved precision and limits of detection.
Mean precision using the CF3 algorithm is 5–6%, compared to 8–20% for the Plantower CF1 algorithm. In addition, there is a very great loss of data due in part to the choice to substitute zero for all values below a certain threshold. If an upper limit is set for the precision, the mean precision for both algorithms is improved, but again the loss of data was in the hundreds of thousands of observations for the Plantower CF1 algorithm.
With respect to the LOD, there was a very large difference between the two algorithms, with the LODs for the Plantower CF1 algorithm so high (3–10 µg/m3) that more than half of all data (56–84%) was below the LOD. By contrast, the ALT-CF3 algorithm produced LODs generally below 1 µg/m3, resulting in more than half of all data (53–92%) above the LOD.
Since the PM2.5 measurements using the ALT-CF3 algorithm are readily available on the PurpleAir map site and the API site, there is no bar to choosing the ALT-CF3 algorithm for future studies employing the PurpleAir data from any time period.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/s22072755/s1, Figure S1. Indoor PurpleAir monitors 1, 3, & 4. These are mounted on a cart and are 1.0 m high. Behind them are three SidePak monitors used as reference instruments during 47 experiments during the 3-year study. Figure S2. Outdoor PurpleAir monitor attached to bracket mounted 2.0 m from the ground. Figure S3. Miniature pans used to heat various foods. Coated pan at top, cast iron at bottom, stainless steel at right. Figure S4. Regression of PurpleAir monitor (CF3) against SidePak monitor (CF0.44). Source: data from Wallace, L., Ott, W., Zhao, T., Cheng, K.-C., and Hildemann, L. (2020). Secondhand exposure from vaping marijuana: Concentrations, emissions, and exposures determined using both research-grade and low-cost monitors, Atmospheric Environment X, https://doi.org/10.1016/j.aeaoa.2020.100093 (accessed on 2 February 2022). Table S1. Estimates of PM2.5 concentration (µg/m3) comparing the ALT-CF3 algorithm to the Plantower CF1 algorithm. Time period: 10 January 2019 to 14 January 2022. Monitors 1 & 2. Table S2. Estimates of PM2.5 concentration (µg/m3) comparing the ALT-CF3 algorithm to the Plantower CF1 algorithm. Time period: 18 June 2020 to 14 January 2022. Monitors 3 & 4. Table S3. Mean PM2.5 concentrations (µg/m3) and relative bias for the ALT-CF3 algorithm. Table S4. Mean PM2.5 concentrations and relative bias for the Plantower CF1 algorithm. Table S5. Precision comparing the ALT-CF3 algorithm to the Plantower CF1 algorithm. Time period: 18 June 2020 to 14 January 2022. Monitors 3 & 4.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are available from the author on request via [email protected].

Acknowledgments

The PurpleAir company has made studies such as this possible through their maintenance of a globally accessible free database, an important contribution to science.

Conflicts of Interest

The author declares no conflict of interest.

References

1. AQ-SPEC. Field Evaluation Purple Air PM Sensor. 2016. Available online: http://www.aqmd.gov/docs/default-source/aq-spec/field-evaluations/purpleair---field-evaluation.pdf (accessed on 30 March 2022).
2. He, M.; Kuerbanjiang, N.; Dhaniyala, S. Performance characteristics of the low-cost Plantower PMS optical sensor. Aerosol Sci. Technol. 2020, 54, 232–241.
3. Kelly, K.E.; Whitaker, J.; Petty, A.; Widmer, C.; Dybwad, A.; Sleeth, D.; Martin, R.; Butterfield, A. Ambient and laboratory evaluation of a low-cost particulate matter sensor. Environ. Pollut. 2017, 221, 491–500.
4. Singer, B.C.; Delp, W.W. Response of consumer and research grade indoor air quality monitors to residential sources of fine particles. Indoor Air 2018, 28, 624–639.
5. Tryner, J.; Quinn, C.; Windom, B.C.; Volckens, J. Design and evaluation of a portable PM2.5 monitor featuring a low-cost sensor in line with an active filter sampler. Environ. Sci. Process. Impacts 2019, 21, 1403–1415.
6. Wang, Z.; Delp, W.W.; Singer, B.C. Performance of low-cost indoor air quality monitors for PM2.5 and PM10 from residential sources. Build. Environ. 2020, 171, 106654.
7. Bi, J.; Wildani, A.; Chang, H.H.; Liu, Y. Incorporating low-cost sensor measurements into high-resolution PM2.5 modeling at a large spatial scale. Environ. Sci. Technol. 2020, 54, 2152–2162.
8. Gupta, P.; Doraiswamy, P.; Levy, R.; Pikelnaya, O.; Maibach, J.; Feenstra, B.; Polidori, A.; Kiros, F.; Mills, K.C. Impact of California fires on local and regional air quality: The role of a low-cost sensor network and satellite observations. GeoHealth 2018, 2, 172–181.
9. Levy Zamora, M.; Xiong, F.; Gentner, D.; Kerkez, B.; Kohrman-Glaser, J.; Koehler, K. Field and laboratory evaluations of the low-cost Plantower particulate matter sensor. Environ. Sci. Technol. 2018, 53, 838–849.
10. Magi, B.I.; Cupini, C.; Francis, J.; Green, M.; Hauser, C. Evaluation of PM2.5 measured in an urban setting using a low-cost optical particle counter and a Federal Equivalent Method Beta Attenuation Monitor. Aerosol Sci. Technol. 2019, 54, 147–159.
11. Sayahi, T.; Butterfield, A.; Kelly, K.E. Long-term field evaluation of the Plantower PMS low-cost particulate matter sensors. Environ. Pollut. 2019, 245, 932–940.
12. Zusman, M.; Schumacher, C.S.; Gassett, A.J.; Spalt, E.W.; Austin, E.; Larson, T.V.; Carvlin, G.C.; Seto, E.; Kaufman, J.D.; Sheppard, L. Calibration of low-cost particulate matter sensors: Model development for a multi-city epidemiological study. Environ. Int. 2020, 134, 105329.
13. US EPA. 2017. Available online: https://www.epa.gov/air-sensor-toolbox/how-use-air-sensors-air-sensor-guidebook (accessed on 30 March 2022).
14. Jayaratne, R.; Liu, X.; Ahn, K.H.; Asumadu-Sakyi, A.; Fisher, G.; Gao, J.; Mabon, A.; Mazaheri, M.; Mullins, B.; Nyaku, M.; et al. Low-cost PM2.5 sensors: An assessment of their suitability for various applications. Aerosol Air Qual. Res. 2020, 20, 520–532.
15. Bi, J.; Wallace, L.; Sarnat, J.A.; Liu, Y. Characterizing outdoor infiltration and indoor contribution of PM2.5 with citizen-based low-cost monitoring data. Environ. Pollut. 2021, 276, 116763.
16. Kaduwela, A.P.; Kaduwela, A.P.; Jrade, E.; Brusseau, M.; Morris, S.; Morris, J.; Risk, V. Development of a low-cost air sensor package and indoor air quality monitoring in a California middle school: Detection of a distant wildfire. J. Air Waste Manag. Assoc. 2019, 69, 1015–1022.
17. Klepeis, N.E.; Bellettiere, J.; Hughes, S.C.; Nguyen, B.; Berardi, V.; Liles, S.; Obayashi, S.; Hofstetter, C.R.; Blumberg, E.; Hovell, M.F. Fine particles in homes of predominantly low-income families with children and smokers: Key physical and behavioral determinants to inform indoor-air-quality interventions. PLoS ONE 2017, 12, e0177718.
18. Wallace, L.A.; Wheeler, A.; Kearney, J.; Van Ryswyk, K.; You, H.; Kulka, R.; Rasmussen, P.; Brook, J.; Xu, X. Validation of continuous particle monitors for personal, indoor, and outdoor exposures. J. Expo. Sci. Environ. Epidemiol. 2010, 21, 49–64.
19. Wallace, L.A.; Ott, W.R.; Zhao, T.; Cheng, K.-C.; Hildemann, L. Secondhand exposure from vaping marijuana: Concentrations, emissions, and exposures determined using both research-grade and low-cost monitors. Atmos. Environ. X 2020, 8, 100093. Available online: https://www.sciencedirect.com/science/article/pii/S2590162120300332?via%3Dihub (accessed on 9 January 2022).
20. Wang, K.; Chen, F.E.; Au, W.; Zhao, Z.; Xia, Z.-L. Evaluating the feasibility of a personal particle exposure monitor in outdoor and indoor microenvironments in Shanghai, China. Int. J. Environ. Health Res. 2019, 29, 209–220.
21. Zheng, T.; Bergin, M.H.; Johnson, K.K.; Tripathi, S.N.; Shirodkar, S.; Landis, M.S.; Sutaria, R.; Carlson, D.E. Field evaluation of low-cost particulate matter sensors in high- and low-concentration environments. Atmos. Meas. Tech. 2018, 11, 4823–4846.
22. Badura, M.; Batog, P.; Drzeniecka-Osiadacz, A.; Modzel, P. Evaluation of low-cost sensors for ambient PM2.5 monitoring. J. Sens. 2018, 2018, 5096540.
23. Klepeis, N.E.; Nelson, W.C.; Ott, W.R.; Robinson, J.; Tsang, A.M.; Switzer, P.; Behar, J.V.; Hern, S.; Engelmann, W. The National Human Activity Pattern Survey (NHAPS): A resource for assessing exposure to environmental pollutants. J. Expo. Sci. Environ. Epidemiol. 2001, 11, 231–252.
24. Apte, J.S.; Brauer, M.; Cohen, A.J.; Ezzati, M.; Pope, C.A., III. Ambient PM2.5 reduces global and regional life expectancy. Environ. Sci. Technol. Lett. 2018, 5, 546–551.
25. Cohen, A.J.; Brauer, M.; Burnett, R.; Anderson, H.R.; Frostad, J.; Estep, K.; Balakrishnan, K.; Brunekreef, B.; Dandona, L.; Dandona, R.; et al. Estimates and 25-year trends of the global burden of disease attributable to ambient air pollution: An analysis of data from the Global Burden of Diseases Study 2015. Lancet 2017, 389, 1907–1918.
26. Hystad, P.; Larkin, A.; Rangarajan, S.; Al Habib, K.F.; Avezum, Á.; Calik, K.B.T.; Chifamba, J.; Dans, A.; Diaz, R.; du Plessis, J.L.; et al. Associations of outdoor fine particulate air pollution and cardiovascular disease in 157 436 individuals from 21 high-income, middle-income, and low-income countries (PURE): A prospective cohort study. Lancet Planet. Health 2020, 4, e235–e245.
27. Dockery, D.W.; Pope, C.A.; Xu, X.; Spengler, J.D.; Ware, J.H.; Fay, M.E.; Ferris, B.G.; Speizer, F.E. An association between air pollution and mortality in six U.S. cities. N. Engl. J. Med. 1993, 329, 1753–1759.
28. Özkaynak, H.; Xue, J.; Spengler, J.; Wallace, L.; Pellizzari, E.; Jenkins, P. Personal exposure to airborne particles and metals: Results from the Particle TEAM study in Riverside, California. J. Expo. Anal. Environ. Epidemiol. 1996, 6, 57–78.
29. Kearney, J.; Wallace, L.; MacNeill, M.; Xu, X.; Van Ryswyk, K.; You, H.; Kulka, R.; Wheeler, A. Residential indoor and outdoor ultrafine particles in Windsor, Ontario. Atmos. Environ. 2011, 45, 7583–7593.
30. Wallace, L.; Williams, R.; Rea, A.; Croghan, C. Continuous weeklong measurements of personal exposures and indoor concentrations of fine particles for 37 health-impaired North Carolina residents for up to four seasons. Atmos. Environ. 2006, 40, 399–414.
31. Wallace, L.; Williams, R.; Suggs, J.; Jones, P. Estimating Contributions of Outdoor Fine Particles to Indoor Concentrations and Personal Exposures: Effects of Household and Personal Activities; APM-214; Office of Research and Development, Research Triangle Park: Durham, NC, USA, 2006.
32. Wallace, L.; Bi, J.; Ott, W.R.; Sarnat, J.A.; Liu, Y. Calibration of low-cost PurpleAir outdoor monitors using an improved method of calculating PM2.5. Atmos. Environ. 2021, 256, 118432. Available online: https://www.sciencedirect.com/science/article/abs/pii/S135223102100251X (accessed on 2 February 2022).
33. Helsel, D. Much ado about next to nothing: Incorporating nondetects in science. Ann. Occup. Hyg. 2010, 54, 257–262.
34. International Organization for Standardization (ISO). Capability of Detection; Report No. ISO 11843-1; ISO: Geneva, Switzerland, 1997.
35. Jiang, R.; Acevedo-Bolton, V.; Cheng, K.; Klepeis, N.; Ott, W.R.; Hildemann, L. Determination of response of real-time SidePak AM510 monitor to secondhand smoke, other common indoor aerosols, and outdoor aerosol. J. Environ. Monit. 2011, 13, 1695–1702.
36. Zhu, K.; Zhang, J.; Lioy, P. Evaluation and comparison of continuous fine particulate matter monitors for measurement of ambient aerosols. J. Air Waste Manag. Assoc. 2007, 57, 1499–1506.
37. Liang, Y.; Sengupta, D.; Campmier, M.J.; Lunderberg, D.M.; Apte, J.S.; Goldstein, A.H. Wildfire smoke impacts on indoor air quality assessed using crowdsourced data in California. Proc. Natl. Acad. Sci. USA 2021, 118, e2106478118.
38. Delp, W.W.; Singer, B.C. Wildfire smoke adjustment factors for low-cost and professional PM2.5 monitors with optical sensors. Sensors 2020, 20, 3683.
Figure 1. Indoor and outdoor PM2.5 concentrations (N = 353,256) over 18 months using the ALT-CF3 algorithm. The three middle points centered on the median at 0 provide the interquartile range (25th and 75th percentiles).
Figure 2. Same observations as in Figure 1 using the Plantower CF1 algorithm. Many measurements have been assigned a value of zero and cannot be shown on the logarithmic graph.
Figure 3. Total observations remaining after applying an upper precision limit of 0.2 (20%).
Figure 4. Percent of observations exceeding the LOD compared for the ALT-CF3 and Plantower CF1 algorithms. Monitor/Location shown on x-axis.
Figure 5. Ratios of the ALT-CF3 and Plantower CF1 PM2.5 estimates with the co-located SidePak estimates for 17 sources. Error bars are propagated standard errors.
Table 1. Precision compared using the ALT-CF3 and Plantower CF1 algorithms. Time period: 10 January 2019 to 14 January 2022.
Monitor/Location | Valid N | Mean | Std. Err. | Lower Quartile | Median | Upper Quartile | 90th Percentile | Max
ALT-CF3 algorithm (using precision cutoff of 0.2):
Monitor 1 indoors | 763,102 | 0.064 | 0.000055 | 0.025 | 0.053 | 0.094 | 0.14 | 0.2
Monitor 2 indoors | 499,296 | 0.067 | 0.000068 | 0.027 | 0.057 | 0.097 | 0.14 | 0.2
Monitor 2 outdoors | 242,663 | 0.058 | 0.000093 | 0.021 | 0.046 | 0.084 | 0.13 | 0.2
Plantower CF1 algorithm (using ALT-CF3 cutoff of 0.2):
Monitor 1 indoors | 647,757 | 0.192 | 0.000334 | 0.034 | 0.084 | 0.20 | 0.57 | 1
Monitor 2 indoors | 448,867 | 0.337 | 0.000495 | 0.072 | 0.205 | 0.51 | 1 | 1
Monitor 2 outdoors | 234,814 | 0.293 | 0.000631 | 0.065 | 0.172 | 0.41 | 0.92 | 1
Plantower CF1 algorithm (using precision cutoff of 0.2):
Monitor 1 indoors | 486,614 | 0.067 | 0.000074 | 0.025 | 0.055 | 0.10 | 0.15 | 0.2
Monitor 2 indoors | 224,877 | 0.081 | 0.000118 | 0.033 | 0.071 | 0.13 | 0.17 | 0.2
Monitor 2 outdoors | 129,081 | 0.082 | 0.000157 | 0.033 | 0.073 | 0.13 | 0.17 | 0.2
Table 2. Number of PM2.5 concentrations reported as zero by the Plantower CF1 algorithm for monitors 1 and 2 over a 3 year period and monitors 3 and 4 over an 18 month period.
Sensor | Location | N Obs. | N Zeros | Fraction = 0
1a | Indoors | 815,558 | 165,732 | 0.20
1b | Indoors | 817,696 | 164,399 | 0.20
2a | Indoors | 530,781 | 63,867 | 0.12
2b | Indoors | 558,322 | 130,263 | 0.23
4a | Indoors | 406,059 | 61,444 | 0.15
4b | Indoors | 406,068 | 69,435 | 0.17
2a | Outdoors | 252,532 | 10,324 | 0.04
2b | Outdoors | 253,439 | 35,374 | 0.14
3a | Outdoors | 363,786 | 23,516 | 0.06
3b | Outdoors | 363,783 | 18,757 | 0.05
Table 3. Variation of precision over time for monitors 1 and 2 (3 years) and 3 and 4 (18 months).
Columns 1 IN, 2 IN, and 2 OUT cover the 3 year period (10 January 2019 to 14 January 2022); columns 3 OUT, 3 IN, and 4 IN cover the 18 month period (18 June 2020 to 14 January 2022).
Monitor | 1 IN | 2 IN | 2 OUT | 3 OUT | 3 IN | 4 IN
Location | Indoors | Indoors | Outdoors | Outdoors | Indoors | Indoors
N | 763,102 | 499,296 | 242,663 | 356,484 | 42,204 | 370,906
Intercept | −0.28 | −0.33 | 0.61 | −0.27 | 1.6 | 0.1
SE (Int.) | 0.007 | 0.010 | 0.040 | 0.019 | 0.039 | 0.022
Slope | 7.8 × 10^−6 | 9.0 × 10^−6 | −1.2 × 10^−5 | 7.4 × 10^−6 | −3.4 × 10^−5 | −8.7 × 10^−7
SE (slope) | 1.7 × 10^−7 | 2.3 × 10^−7 | 9.1 × 10^−7 | 4.3 × 10^−7 | 8.8 × 10^−7 | 4.8 × 10^−7
R2 (adj.) | 0.0028 | 0.00319 | 0.00076 | 0.00082 | 0.034 | 0.00006
SE of estimate | 0.048 | 0.048 | 0.046 | 0.042 | 0.032 | 0.050
F-value | 2181 | 1599 | 186 | 296 | 1500 | 3.2
z | 47 | 40 | −14 | 17 | −39 | −2
p-value | 0 | 0 | 0 | 0 | 0 | 0.072
Starting precision | 0.060 | 0.062 | 0.083 | 0.054 | 0.058 | 0.068
Ending precision | 0.068 | 0.072 | 0.070 | 0.058 | 0.038 | 0.067
Relative annual increase (%) | 4.8 | 5.3 | −5.3 | 5.3 | −22.6 | −0.49
Table 4. PM2.5 LODs (µg/m3) calculated for the ALT-CF3 and Plantower CF1 algorithms. Number and percent of observations greater than the LOD.
Sensor | Location | Valid N | CF3 LOD | Obs. with CF3 > LOD | % Obs. with CF3 > LOD | CF1 LOD | Obs. with CF1 > LOD | % Obs. with CF1 > LOD
1 | Indoors | 406,108 | 0.99 | 233,900 | 58 | 2.9 | 177,908 | 44
2 | Outdoors | 253,454 | 0.92 | 203,384 | 80 | 9.9 | 39,487 | 16
2 | Indoors | 146,229 | 0.72 | 110,674 | 76 | 3.2 | 44,289 | 30
3 | Outdoors | 363,797 | 0.6 | 334,973 | 92 | 4.4 | 156,850 | 43
4 | Indoors | 406,092 | 1.32 | 215,872 | 53 | 5.3 | 79,371 | 20
