Article

Cloud Classification in Wide-Swath Passive Sensor Images Aided by Narrow-Swath Active Sensor Data

School of Electronics and Information Engineering, Beihang University, No. 37 Xueyuan Road, Beijing 100191, China
* Author to whom correspondence should be addressed.
Remote Sens. 2018, 10(6), 812; https://doi.org/10.3390/rs10060812
Submission received: 29 April 2018 / Revised: 16 May 2018 / Accepted: 18 May 2018 / Published: 23 May 2018
(This article belongs to the Section Atmospheric Remote Sensing)

Abstract

Distinguishing between different cloud types is challenging because of the complexity and diversity of cloud coverage, which is a significant clutter source affecting target detection and identification in the images of space-based infrared sensors. In this paper, a novel strategy for cloud classification in wide-swath passive sensor images is developed, aided by narrow-swath active sensor data. The strategy consists of three steps: orbit registration, selection of the most matching donor pixel, and cloud type assignment for each recipient pixel. A new criterion for orbit registration is proposed to improve the matching accuracy. The most matching donor pixel is selected via the Euclidean distance and the sum of squared radiance relative differences between the recipient and the potential donor pixels. Each recipient pixel is then assigned the cloud type of its most matching donor. Cloud classification of Moderate Resolution Imaging Spectroradiometer (MODIS) images is performed with the aid of data from the Cloud Profiling Radar (CPR). The results are compared with the CloudSat product 2B-CLDCLASS, as well as with those obtained using the method of the International Satellite Cloud Climatology Project (ISCCP), demonstrating the superior classification performance of the proposed strategy.

Graphical Abstract

1. Introduction

Clouds are one of the largest modulators of the earth’s radiation and a significant, complex clutter source for space-based infrared sensors [1,2,3,4]. Cloud effects are complicated and depend on the coverage fraction, type, phase, and spatio-temporal variations in vertical structure and horizontal distribution [5]. The detection and classification of cloud types are essential issues in remote sensing applications [4,6,7,8,9]. However, cloud classification remains one of the most difficult problems in spaceborne earth observation, because cloud signatures not only differ significantly among cloud types but are also affected by multilayer clouds, which occur frequently and lead to significant errors in estimating cloud effects [5,6,10,11,12,13,14,15,16,17,18,19,20]. Cloud detection has received considerable attention, whereas the recognition of different cloud types has been studied much less.
Passive observation data sets from satellites and ground-based observers have been widely used to study cloud types and cloud layer overlap [1,2,8,20]. The International Satellite Cloud Climatology Project (ISCCP) has focused on the geographical distributions and long-term variations of cloud types [21,22,23,24,25]. Based on the single-layer assumption that there is only one cloud layer in the field of view (FOV), ISCCP classifies clouds into nine types by setting thresholds on cloud top pressure (CTP) and cloud optical thickness (COT) [21,22,23,25]. Multilayer cloud detection has been studied by a few researchers, based on surface weather reports and on measurements from ground-based cloud radar and passive sensors [12,19,20,26,27]. However, passive sensors generally cannot penetrate clouds, and therefore cannot classify multilayer clouds effectively or provide the vertical overlap structure of cloud layers [12,19,20,28]. Thus, clouds are only roughly classified as single-layer or multilayer, without more specific types.
Considering the limitations of using only passive sensors such as the Moderate Resolution Imaging Spectroradiometer (MODIS), active sensors like the Cloud Profiling Radar (CPR) on CloudSat [29,30,31] have been regarded as the most powerful tools for characterizing cloud coverage and the three-dimensional (3D, across-track, along-track, and height) cloud structure [5,17,27,32,33,34,35,36]. CloudSat data products, such as 2B-GEOPROF and 2B-CLDCLASS, have been widely used to investigate the 3D distribution and structure of clouds [6,7,17,35]. 2B-CLDCLASS provides eight cloud types: high clouds (High) (cirrus and cirrostratus), altostratus (As), altocumulus (Ac), stratus (St), stratocumulus (Sc), cumulus (Cu), nimbostratus (Ns), and deep convective clouds (Deep). Nevertheless, since these active sensors cannot perform across-track scanning, measurements are provided only along a narrow, nadir or near-nadir, two-dimensional (2D, along-track and height) track [12,23,26,27].
To deal with their respective limitations, the combination of passive and active sensors is a promising way to classify the 3D distribution of cloud types [23,31,37,38]. However, the collaborative use of data from different sensors is a challenge, as different sensors generally have different spatial resolutions and an interval between their passing times [39]. In this case, orbit registration is implemented in the combination so as to find the corresponding pixels between the data from different sensors. Data from MODIS on Aqua and CPR on CloudSat are frequently adopted and can readily be integrated, because Aqua leads CloudSat by only about 55 s, providing nearly simultaneous and collocated cloud observations [31,40,41].
In this paper, a novel strategy is proposed to classify clouds, including single-layer and multilayer clouds, in wide-swath passive sensor images with the aid of narrow-swath active sensor data, following the similar radiance matching (SRM) hypothesis [18,33] that any two cloudy pixels can be assumed to have the same type if they are in proximity to one another and have sufficiently similar radiances. The pixels in the passive sensor images and the columns on the active sensor track are called recipient and potential donor pixels, respectively. The strategy consists of orbit registration, selection of the most matching donor pixel, and cloud type assignment for each recipient pixel. A new criterion for orbit registration is proposed to improve the matching accuracy. The most matching donor pixel is selected via the Euclidean distance and the sum of squared radiance relative differences between the recipient and the potential donor pixels. Each recipient pixel is then assigned the cloud type of its most matching donor pixel.
The remainder of this paper is organized as follows. Section 2 briefly describes the sensors and data used in this work. The SRM hypothesis is examined in Section 3. Section 4 details the cloud classification process, including the proposed orbit registration, the selection of the most matching donor pixel, and the cloud type assignment for each recipient pixel. The results and analyses are presented in Section 5, including the hypothesis analysis, orbit registration, and cloud classification results. Section 6 discusses the accuracy of the proposed cloud classification strategy. Section 7 concludes the paper. The boundary selection of the orbit registration criterion is presented in Appendix A.

2. Sensors and Data

2.1. Sensors

2.1.1. MODIS

The MODIS [34,41] was a whisk-broom scanning imaging radiometer aboard the Aqua satellite, which has a mean orbit altitude of 705 km; it measured radiances in 36 spectral bands ranging from 0.4 μm to 14.24 μm at three spatial resolutions of 250 m, 500 m, and 1 km. It covered 1354 km across track, with a scan angle of up to 55°, and 2030 km along track at nadir during five minutes. The MODIS swath acquired during five minutes was called a granule and was stored as one data file. Images and data products, including the geolocation product MYD03 and the cloud property products MYD021KM and MYD06, could be browsed and downloaded from the MODIS website [13,14,42].

2.1.2. CPR

The CPR was a 94-GHz nadir-looking radar aboard CloudSat [40,43], which closely trailed Aqua by an average delay of about 55 s, leading to continuous near-coincident measurements. A single CPR profile had a horizontal footprint of approximately 1.3 km × 1.7 km (across track × along track) and consisted of 120 vertical sampling bins over a 30 km vertical observation window, resulting in a vertical resolution of 250 m [34,43]. As it moved along the track, the CPR detected clouds and provided a detailed vertical distribution of the various cloud types for each profile. The available CloudSat products included cloud property products such as 2B-CLDCLASS [7,40] and auxiliary products like MODIS-AUX [41,44].

2.2. Data

The data used in this work were MYD06 [13,14], MYD021KM [42], and MYD03 [42] from MODIS, as well as MODIS-AUX [44] and 2B-CLDCLASS [7] from CPR [40]. The data from MODIS and CPR were available at http://modis-atmos.gsfc.nasa.gov/ and http://www.cloudsat.cira.colostate.edu/, respectively. Figure 1 shows the concept diagram of the proposed strategy. The flowchart is detailed in the following subsections.
In this work, the proposed classification strategy was applied to the MODIS scene observed from 02:40 to 02:45 on 1 June 2015, as shown in Figure 2 [42].

2.2.1. MODIS Data

● MYD06
The MYD06 [13,14] provided the cloud mask, CTP, COT, brightness temperature (BT), and multilayer flag (MLF) used by the method of ISCCP [21,24,25]. ISCCP [21,24,25] uses thresholds on CTP and COT to classify clouds into cumulus (Cu), stratocumulus (Sc), stratus (St), altocumulus (Ac), altostratus (As), nimbostratus (Ns), cirrus, cirrostratus, or deep convective clouds (Deep). The MODIS cloud mask [14] was stored as decimal integers that classify pixels into four classes: cloudy, clear, probably clear, and uncertain. The class of each pixel was determined by converting the decimal integers into binary values and comparing them with each flag bit by bit. In this work, all pixels except the clear pixels were roughly treated as cloudy pixels for further classification (a short sketch of this decoding and of the ISCCP thresholding follows this list).
● MYD021KM
The level 1 calibrated radiances [42] in four bands [33] (band 1: 0.62–0.67 μm; band 7: 2.105–2.155 μm; band 29: 8.4–8.7 μm; band 32: 11.77–12.27 μm) were used in the proposed cloud classification strategy.
● MYD03
MYD03 [42] was used to provide the MODIS latitude and longitude at 1 km resolution for the orbit registration in Section 4.1. The boundary selection of the orbit registration criterion was validated using MYD03 data spanning days 152 to 177 of 2015.
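As an illustration of the bit-by-bit cloud-mask decoding and the ISCCP-style thresholding described above, the following Python sketch shows one possible implementation. The bit layout of the cloud-mask byte and the CTP/COT boundaries (440/680 hPa and 3.6/23) are commonly cited conventions rather than values taken from this paper, so they should be treated as assumptions.

```python
import numpy as np

def decode_cloud_mask(mask_bytes):
    """Decode the 2-bit cloudiness field of MODIS cloud-mask bytes.

    Assumes the common MOD35 convention that bits 1-2 hold
    00 = cloudy, 01 = uncertain, 10 = probably clear, 11 = confident clear.
    """
    flag = (np.asarray(mask_bytes, dtype=np.uint8) >> 1) & 0b11
    labels = np.array(["cloudy", "uncertain", "probably clear", "clear"])
    return labels[flag]

def isccp_type(ctp_hpa, cot):
    """Assign one of the nine ISCCP classes from CTP/COT thresholds (assumed values)."""
    if ctp_hpa < 440:       # high clouds
        return "cirrus" if cot < 3.6 else ("cirrostratus" if cot < 23 else "deep convective")
    if ctp_hpa < 680:       # mid-level clouds
        return "altocumulus" if cot < 3.6 else ("altostratus" if cot < 23 else "nimbostratus")
    return "cumulus" if cot < 3.6 else ("stratocumulus" if cot < 23 else "stratus")  # low clouds

# Everything that is not confidently clear is treated as cloudy, as in the text above.
cloudy = decode_cloud_mask([0b00000001, 0b00000111]) != "clear"
```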

2.2.2. CPR Data

● MODIS-AUX
The latitude and longitude from MODIS-AUX [44] were used to provide the most matching MODIS latitude and longitude for each CPR pixel on the track, in order to validate the effectiveness of the orbit registration.
● 2B-CLDCLASS
2B-CLDCLASS [7,40] was used to provide the CPR latitude and longitude for the orbit registration. 2B-CLDCLASS was also used to provide the vertical bin height and the cloud scenario, so as to verify the SRM hypothesis. The vertical bin height was used to calculate the cloud top height (CTH) and cloud base height (CBH). The cloud scenario [7,40] was stored as a 16-bit integer and had to be converted to binary; the cloud type was then determined bit by bit, by comparing the binary value with the cloud flag. The 2B-CLDCLASS data measured in May 2015 for the region around China (0–60° N, 70–140° E) were randomly selected to verify the SRM hypothesis.
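A minimal Python sketch of this decoding step is given below. The assumption that the cloud-type code occupies bits 1-4 of the 16-bit cloud scenario word, the bin-centre interpretation of the height array, and the array names are illustrative; they are not taken from the 2B-CLDCLASS format specification.

```python
import numpy as np

CLOUD_TYPES = ["clear", "Ci", "As", "Ac", "St", "Sc", "Cu", "Ns", "Deep"]

def decode_scenario(scenario):
    """Extract the cloud-type code from 2B-CLDCLASS cloud-scenario words.

    Assumes the type code sits in bits 1-4 of the 16-bit integer,
    giving values 0 (clear) to 8 (deep convection).
    """
    code = (np.asarray(scenario, dtype=np.uint16) >> 1) & 0b1111
    code = np.minimum(code, 8)          # guard against codes outside the assumed range
    return np.array(CLOUD_TYPES)[code]

def cth_cbh(bin_height_km, cloudy_mask, half_bin_km=0.125):
    """CTH = top of the highest cloudy bin, CBH = base of the lowest cloudy bin.

    bin_height_km: bin-centre heights of one CPR profile (250 m bins assumed),
    cloudy_mask: boolean flags marking the cloudy bins of that profile.
    """
    heights = np.asarray(bin_height_km, dtype=float)[np.asarray(cloudy_mask, dtype=bool)]
    if heights.size == 0:
        return np.nan, np.nan
    return heights.max() + half_bin_km, heights.min() - half_bin_km
```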

3. Similar Radiance Matching Hypothesis

To verify the similar radiance matching (SRM) hypothesis, the homogeneity or similarity of each cloud type [18,45] was analyzed. Four statistical parameters were calculated for both the recipient and the donor pixels with the same type, that is, the standard deviations (SDs) of the cloud base height differences (CBHD) and the cloud top height differences (CTHD), and the means of the CBHD and CTHD. Figure 3 illustrates the flowchart of the statistical analysis process, which consisted of three major steps.
Firstly, the study area and the corresponding CloudSat orbits were chosen, and 2B-CLDCLASS [7,40] was downloaded to provide the cloud height and cloud scenario. Secondly, the pixels were classified according to the cloud types from the 2B-CLDCLASS cloud scenario. Finally, for an arbitrary cloud type, the distances between pixels and the SDs of the CBHD and CTHD were calculated. The homogeneity or similarity of each cloud type was determined following the statement in [18,45], that is, if the fitted curves of the SDs of the CBHD and CTHD varied little with distance, the clouds shared the same geometric, microphysical, and radiative properties. Similar to [18,45], binned values of the SDs of the CBHD and CTHD, and of the means of the CBHD and CTHD, were calculated at a set of distances between any two pixels with the same cloud type, and their dependence on distance was analyzed through these binned values. Note that the CBH and CTH in this work were, respectively, the base height of the lowest cloudy bin and the top height of the highest cloudy bin in a single CPR profile [7,40].
To further illustrate the statistical analysis process, the pixels of a CPR orbit were assumed to be distributed as shown in Figure 4. Each circle stood for one pixel on the CPR orbit, and the pixels with the same cloud type were shown as orange circles. The height distribution of the orange pixels was denoted by the rectangles. ΔlCPR was the distance between two adjacent CPR pixels. A pixel pair was composed of two orange pixels, and the distance of each pixel pair was calculated; for instance, the distance d of the pixel pair (2,3) was 2ΔlCPR. The SDs of the CBHD at each distance d were then calculated. All of the pixels on the orbit track in the study region were processed for each cloud type using the aforementioned procedure. In the same manner, the SDs of the CTHD and the means of the CBHD and CTHD were obtained.
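A compact Python sketch of this pairwise statistic is given below: for all pixel pairs of one cloud type along a CPR orbit, the CBH differences are binned by pair distance and the standard deviation is computed per bin. The 20 km bin width matches Section 5.1; the array names and the CPR pixel-spacing argument are placeholders.

```python
import numpy as np

def binned_cbhd_sd(along_track_index, cbh_km, dl_cpr_km, bin_width_km=20.0, max_dist_km=500.0):
    """SD of CBH differences versus pair distance for the pixels of one cloud type.

    along_track_index: profile numbers of the pixels with the analysed type,
    cbh_km: their cloud-base heights, dl_cpr_km: spacing of adjacent CPR profiles.
    Returns the bin centres and the per-bin standard deviation of the CBH differences.
    """
    idx = np.asarray(along_track_index, dtype=float)
    cbh = np.asarray(cbh_km, dtype=float)
    i, j = np.triu_indices(idx.size, k=1)            # all unordered pixel pairs
    dist = np.abs(idx[i] - idx[j]) * dl_cpr_km       # pair distance d = n * dl_CPR
    diff = cbh[i] - cbh[j]                           # CBHD of each pair
    edges = np.arange(0.0, max_dist_km + bin_width_km, bin_width_km)
    which = np.digitize(dist, edges) - 1             # 0-based bin index per pair
    sds = np.array([diff[which == b].std() if np.any(which == b) else np.nan
                    for b in range(edges.size - 1)])
    return 0.5 * (edges[:-1] + edges[1:]), sds
```

The same routine applied to the CTH values yields the CTHD statistics shown later in Figure 10.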

4. Cloud Classification Strategy

A novel cloud classification strategy was proposed based on the SRM hypothesis [28,33]. The strategy consisted of three major steps, that is, the orbit registration, selection of the most matching donor pixel, and cloud type assignment for each recipient pixel, as shown in Figure 5.
In Figure 5, the inputs were the passive sensor images and the active sensor profile data, while the output was the cloud-type classification image of the recipient pixels. The cloud types in the passive sensor images were determined by ISCCP and by the MYD06 cloud mask. The cloud types of the active sensor profile came from 2B-CLDCLASS. Only when the numbers of cloud types in the active and passive sensor images were not equal were the passive-sensor cloud types used, for those pixels whose types were not included in the active sensor granule. According to the distribution of cloud types [6,11], the cloud occurrence frequency [17], and the horizontal dimension of each cloud type [46], there was a higher than 80% probability that the active and passive sensor images contained the same number of cloud types, and about a 20% probability that the types of some recipient pixels were not included in the active sensor granule, in which case the types from the passive sensor were used.
The first two steps are detailed in the following subsections. The third step was a simple assignment process, that is, the cloud type of the donor pixel was used as that of the recipient pixel. Consequently, multilayer cloud classification was also achieved, together with the overlapped type structure.
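The type-assignment logic of this and the preceding paragraph can be summarized in a short Python sketch. The function and variable names are illustrative, and the ISCCP-derived type of the recipient pixel and the donor type from 2B-CLDCLASS are assumed to be available already.

```python
def assign_cloud_type(recipient_isccp_type, donor_type, types_on_track):
    """Final type of a recipient pixel.

    If the recipient's passive-sensor (ISCCP) type does not occur anywhere along the
    active-sensor track, fall back to that passive type (about 20% of cases per the
    text above); otherwise copy the type of the most matching donor pixel, which may
    be a multilayer class from 2B-CLDCLASS.
    """
    if recipient_isccp_type not in types_on_track:
        return recipient_isccp_type
    return donor_type
```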
The relationship of the relevant pixels is presented in Figure 6. In Figure 6, the line denoted the track of the active sensor, while the grids were pixels of the passive sensor. Each green grid represented the most matching passive-sensor pixel (i, 0) for a pixel of the active sensor along the track. A green grid was also a potential donor pixel, denoted as pixel (m, 0), where (m, 0) belongs to the set of on-track matched pixels {(i, 0)}. The black grid stood for an off-track recipient pixel (i, j).

4.1. Orbit Registration Process

The cost function and process provided in our previous work [38] were used and are detailed in this subsection, where a rough-to-fine process was implemented step by step to register the orbits pixel by pixel between the wide swaths of the passive sensor and the narrow traces of the active sensor. A new criterion was proposed to improve the matching accuracy.
Figure 7 plots a MODIS granule and the overlapped CPR track to demonstrate the process, where the sparse sampling pixels of the granule were plotted to reveal the details. The circles represented the original MODIS granule. The overlapped asterisks were the selected MODIS granules. The dotted line denoted the truncated CloudSat track.
The process was composed of three major steps. Firstly, according to the ranges of latitude and longitude of the MODIS granule, the CPR orbit track was truncated, drawn as the dotted line in Figure 7. Secondly, the ranges of latitude and longitude of the truncated CPR track were calculated, and the constrained MODIS region, shown as the green region in Figure 7, was selected by the range of the overlapping truncated CPR track; the pixels in the selected region were the potential MODIS pixels. Thirdly, for each CPR pixel along the truncated track, the most matching pixel was sought among the potential MODIS pixels in the selected MODIS region. The third step was repeated until the most matching MODIS pixel on the track had been found for every CPR pixel.
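The rough selection in the first two steps can be sketched in Python as simple bounding-box tests. The array names are placeholders, and the sketch assumes the granule does not cross the date line, so plain latitude/longitude ranges are sufficient.

```python
import numpy as np

def truncate_and_select(modis_lat, modis_lon, cpr_lat, cpr_lon):
    """Rough registration: truncate the CPR track to the MODIS granule extent,
    then keep only the MODIS pixels inside the extent of the truncated track."""
    modis_lat, modis_lon = np.asarray(modis_lat), np.asarray(modis_lon)
    cpr_lat, cpr_lon = np.asarray(cpr_lat), np.asarray(cpr_lon)
    # Step 1: truncate the CPR track by the granule's latitude/longitude ranges.
    on_granule = ((cpr_lat >= modis_lat.min()) & (cpr_lat <= modis_lat.max()) &
                  (cpr_lon >= modis_lon.min()) & (cpr_lon <= modis_lon.max()))
    trk_lat, trk_lon = cpr_lat[on_granule], cpr_lon[on_granule]
    # Step 2: constrain the MODIS region by the range of the truncated track.
    in_region = ((modis_lat >= trk_lat.min()) & (modis_lat <= trk_lat.max()) &
                 (modis_lon >= trk_lon.min()) & (modis_lon <= trk_lon.max()))
    return (trk_lat, trk_lon), in_region     # potential MODIS pixels: modis_lat[in_region]
```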
For each CPR pixel on the truncated track, the most matching MODIS pixel was determined from the pixels in the selected green region, shown in Figure 7, using the square deviation $\delta_{dif}$ given by [47] or our proposed cost function $\delta_{err}$ in [38], as follows:

$\delta_{dif} = \left( \varphi_{M}^{lat} - \varphi_{CS}^{lat} \right)^{2} + \left( \lambda_{M}^{long} - \lambda_{CS}^{long} \right)^{2}$, (1)

$\delta_{err} = \left( \frac{\varphi_{M}^{lat} - \varphi_{CS}^{lat}}{\varphi_{CS}^{lat}} \right)^{2} + \left( \frac{\lambda_{M}^{long} - \lambda_{CS}^{long}}{\lambda_{CS}^{long}} \right)^{2}$, (2)

where $\delta_{dif}$ denotes the sum of the squared absolute errors, $\delta_{err}$ denotes the sum of the squared relative errors, $\varphi_{M}^{lat}$ and $\lambda_{M}^{long}$ are the geodetic latitude and longitude of the MODIS pixel in degrees, respectively, and $\varphi_{CS}^{lat}$ and $\lambda_{CS}^{long}$ are the geodetic latitude and longitude of the CPR pixel in degrees, respectively.
The pixel with the smallest $\delta_{dif}$ was determined as the most matching pixel on the track when Equation (1) was used.
Figure 8 plots the most matching MODIS pixels as well as the truncated CPR track, where the dotted line is the track of the CPR pixels and the solid line denotes the track of the most matching MODIS pixels obtained using Equation (1). It was found that, when Equation (1) was used in the higher latitude domain (e.g., latitudes larger than 50°), $\delta_{dif}$ could be identical between a single MODIS pixel and several CPR pixels along the track. This phenomenon was obvious at the two ends of the truncated CPR track, as shown in Figure 8.
To deal with this phenomenon, the most matching MODIS pixel on the track was sought using Equation (2) with a given threshold of 10⁻⁸ [38], which depended on the average order of magnitude of the relative differences of latitude and longitude between two adjacent MODIS pixels. The pixel whose $\delta_{err}$ was the smallest and below the specified threshold was determined as the most matching MODIS pixel on the track.
A large number of experiments were carried out on the boundary selection between Equations (1) and (2) over the whole latitude domain. In this work, a new criterion was proposed: Equation (1) was used to seek the most matching MODIS pixel on the track when the latitude was not larger than 40°; otherwise, Equation (2) was used. The details are presented in Appendix A.
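A minimal Python sketch of the proposed criterion is given below, combining Equation (1) for latitudes of at most 40° with Equation (2) and the 10⁻⁸ threshold from [38] otherwise. Taking the absolute value of the latitude at the 40° boundary and the array names are assumptions made for illustration.

```python
import numpy as np

def most_matching_modis_pixel(cpr_lat, cpr_lon, modis_lat, modis_lon, threshold=1e-8):
    """Index of the MODIS pixel (within the selected region) matching one CPR pixel.

    modis_lat / modis_lon: 1-D arrays of the candidate pixels of the green region.
    Equation (1), squared absolute errors, is applied when |latitude| <= 40 deg;
    Equation (2), squared relative errors with a threshold, is applied otherwise.
    """
    modis_lat = np.asarray(modis_lat, dtype=float)
    modis_lon = np.asarray(modis_lon, dtype=float)
    if abs(cpr_lat) <= 40.0:                                    # assumed: boundary on |latitude|
        delta = (modis_lat - cpr_lat) ** 2 + (modis_lon - cpr_lon) ** 2          # Eq. (1)
        return int(np.argmin(delta))
    delta = (((modis_lat - cpr_lat) / cpr_lat) ** 2 +
             ((modis_lon - cpr_lon) / cpr_lon) ** 2)                             # Eq. (2)
    best = int(np.argmin(delta))
    return best if delta[best] < threshold else None            # no candidate below the threshold
```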

4.2. Most Matching Donor Pixel Selection

For the second step of the classification process, shown in Figure 5, the most matching donor pixel was selected for each recipient pixel, based on a control function of the radiance and the Euclidean distance. The pixels in the passive sensor images and the columns on the active sensor track were called the recipient and potential donor pixels, respectively. The radiances and position information of the passive pixels collocated with the active sensor track were used as those of the potential CPR donor pixels. The cloud type of the most matching CPR donor pixel was finally used as the cloud type of the recipient pixel.
As shown in Figure 6, for each recipient pixel $(i, j)$, $j \in [-J, -1] \cup [1, J]$, in the passive sensor granule, the most matching donor pixel was searched along the track of the active sensor by calculating the control function in [28,33,48], as follows:

$F(i, j; m) = \sum_{k=1}^{K} \left( \frac{r_{k}(i, j) - r_{k}(m, 0)}{r_{k}(i, j)} \right)^{2}; \quad m \in [i - m_{1}, i + m_{2}]$, (3)

where $r_{k}(i, j)$ and $r_{k}(m, 0)$ denote the radiances of the recipient pixel $(i, j)$ and the potential donor pixel $(m, 0)$ at the $k$th spectral band, respectively, and $K$ is the number of spectral bands. The essence of $F$ is the sum of the squared radiance relative errors over the $K$ spectral bands. Four MODIS bands are used, with $m_{1} = m_{2} = 200$ [33]. $m$ is the index of the potential donor pixel $(m, 0)$ along the CPR track, with $m \in [i - m_{1}, i + m_{2}]$.
The Euclidean distance between the recipient pixel $(i, j)$ and the potential donor pixel $(m, 0)$ is denoted as $D(i, j; m)$ and calculated by the following:

$D(i, j; m) = \Delta L \sqrt{(i - m)^{2} + j^{2}}$, (4)

where $\Delta L$ is the horizontal resolution of the passive sensor; here, $\Delta L = 1$ km for MODIS.
Then, $F(i, j; m)$ is ordered as an ascending sequence $A_{index}$:

$A_{index} = \{ \min F(i, j; m), \ldots, \max F(i, j; m) \}; \quad m \in [1, m_{1} + m_{2} + 1]$, (5)

The CPR pixel nearest to the recipient pixel was selected as the most matching donor from the first 3% of pixels [33] in the ascending sequence $A_{index}$.
When clouds were classified for recipient pixels farther away from the track, more potential donor pixels were searched so as to ensure the validity of the matching. Thus, the parameters $m_{1}$ and $m_{2}$ are given by the following:

$m_{1} = m_{2} = \begin{cases} 200; & D_{m} \le 30 \text{ km} \\ 200 + \left| D_{m} / \Delta L \right|; & D_{m} > 30 \text{ km} \end{cases}$, (6)

where $D_{m}$ denotes the distance between the recipient pixel and the track of the active sensor. Since pixel $(i, 0)$ is the matching MODIS pixel for the CPR pixel on the track, $D_{m} = j \times \Delta L$ for the recipient pixel $(i, j)$.
Finally, the cloud type of the most matching donor pixel was used. The process was repeated until the cloud types of all of the recipient pixels had been determined.
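The selection step of Equations (3)-(6) can be sketched in Python as follows. The layout of the radiance arrays, the variable names, and the tie-breaking by Euclidean distance among the best 3% of candidates follow the description above; anything beyond that (e.g., the exact array shapes) is an assumption for illustration.

```python
import numpy as np

def select_donor(i, j, recip_rad, donor_rad, donor_index, delta_l_km=1.0):
    """Most matching donor pixel for the recipient pixel (i, j).

    recip_rad: radiances of the recipient pixel at the K bands, shape (K,).
    donor_rad: radiances of the on-track potential donors, shape (N, K).
    donor_index: along-track indices m of those donors, shape (N,).
    Returns the position (into the donor arrays) of the chosen donor.
    """
    recip_rad = np.asarray(recip_rad, dtype=float)
    donor_rad = np.asarray(donor_rad, dtype=float)
    donor_index = np.asarray(donor_index)

    d_m = abs(j) * delta_l_km                                           # distance to the track
    half = 200 if d_m <= 30.0 else 200 + int(abs(d_m / delta_l_km))     # Eq. (6)
    cand = np.where(np.abs(donor_index - i) <= half)[0]                 # m in [i - m1, i + m2]

    f = np.sum(((recip_rad - donor_rad[cand]) / recip_rad) ** 2, axis=1)  # Eq. (3)
    order = cand[np.argsort(f)]                                           # ascending A_index, Eq. (5)
    top = order[: max(1, int(np.ceil(0.03 * order.size)))]                # first 3% of the sequence

    dist = delta_l_km * np.sqrt((donor_index[top] - i) ** 2 + j ** 2)     # Eq. (4)
    return int(top[np.argmin(dist)])                                      # nearest of the best matches
```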

5. Results

5.1. SRM Hypothesis Analysis Results

Referring to the characteristics of the eight cloud types defined in the CloudSat 2B-CLDCLASS algorithm [46], Cu is isolated, with a horizontal dimension of about 1 km, Deep and St have moderate horizontal dimensions of about 10 km and 100 km, respectively, and the other cloud types have comparatively large horizontal dimensions of about 10³ km. For overall consideration, the pixel pairs whose distances were within 500 km were counted in the statistics.
Table 1 lists the means and SDs of CTH and CBH for the eight cloud types that were selected in this study [7,48].
In Table 1, since the CBH equals the CTH minus the cloud thickness, the means of the CBH were obtained as the means of the CTH minus the means of the cloud thickness in [48]. The SDs of the CBH in Table 1 were the means of the SDs of the CBH from [7], while the SDs of the CBH were also calculated from the SDs of the CTH and of the cloud thickness from [48], using the arithmetic property of variances.
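For clarity, the variance arithmetic used here can be written out explicitly. Treating the cloud thickness $H$ and the CTH as independent quantities, which is an assumption made for this reconstruction, $CBH = CTH - H$ gives

$\overline{CBH} = \overline{CTH} - \overline{H}, \qquad \sigma_{CBH} = \sqrt{\sigma_{CTH}^{2} + \sigma_{H}^{2}}.$

With the Table 1 values for Ci, for instance, the mean CBH of 10.41 km corresponds to a mean thickness of 12.77 − 10.41 = 2.36 km.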
The SDs of the CTHD and CBHD and the means of the CTHD and CBHD were calculated in bins of 20 km. The statistics of the binned values are shown by the bars and curves in Figure 9 and Figure 10. Figure 9a,b show the SDs of the CBHD along with the distances between the recipient and donor pixels in May 2015, in the region 0–60° N and 70–140° E, and Figure 9c,d show the means of the CBHD, where (a) and (c) are for Ci, As, Ac, and St, and (b) and (d) are for Sc, Cu, Ns, and Deep. Figure 10 is the same as Figure 9, except for the CTHD.
In Figure 9, St existed only over a small range of about 20 km because of its natural properties, that is, St was sparse and had a small horizontal dimension. The SDs and means of the CBHD for St were smaller than 0.5 km, less than one standard deviation of the St CBH. The SDs of the CBHD of every cloud type except Deep were smaller than 1 km. When the distances were less than 400 km, the SDs of the CBHD of Cu, Sc, and Ns were less than 0.5 km, while the means of the CBHD for Cu, Sc, and Ns were smaller than 1 km, and the SDs and means for Deep were less than 1.5 km. Similar to the statistical results in [18], these variations were less than one standard deviation for each type in Table 1, which leads to the local observation that the CBH is generally a good predictor of the bases of other clouds of the same type in the region.
Similar conclusions were obtained from Figure 10. All of the SDs of the CTHD, except for Deep, were smaller than 1.2 km when the distances were less than 300 km. When the distances were less than 300 km, the SDs and means of the CTHD for Ci, As, and Ac were less than their respective standard deviations, and most of the means of the CTHD for Sc and Ns were within one respective standard deviation. However, the means of the CTHD for Cu and Deep were relatively large, indicating that their CTH was less representative of the surrounding clouds of the same type. When the distances were less than about 300 km, all of the SDs and means of the CTHD, except for Deep, were consistent with the range of the standard deviation of the CTH listed in Table 1.
Consequently, a local observation of the CBH and CTH, within a distance range of 300 km, provides a generally good predictor of the CBH and CTH of other clouds of the same type in the region. Similar conclusions were obtained from the results for other regions, which are not presented here for conciseness. Thus, it is feasible to assume that any two cloudy pixels have the same cloud type if they are close to each other and have sufficiently similar radiances.

5.2. Orbit Registration

The estimated most matching pixels according to Equation (2) are shown in Figure 11 and compared with the pixels of the truncated CPR track. The dotted line denotes the set of CPR track pixels, and the solid line is the set of the estimated most matching MODIS pixels obtained using Equation (2). From Figure 11, it can be seen that the orbit registration result using Equation (2) coincided well with the CPR track data.
The estimated latitudes and longitudes were compared with those of the MODIS-AUX data [41,42] from the CloudSat product. Figure 12 compares the estimated latitudes and longitudes with those from MODIS-AUX for the granule shown in Figure 7. Figure 12a,b are the scattered pixel distribution and the relative differences, respectively, between the estimated latitudes and the MODIS latitudes from MODIS-AUX. Figure 12c,d are the scattered pixel distribution and the relative differences, respectively, between the estimated longitudes and the MODIS longitudes from MODIS-AUX.
It can be seen in Figure 12a that the estimated latitudes coincided well with the MODIS latitudes from the MODIS-AUX data. Figure 12b shows that the relative differences were below 1.5 × 10⁻⁴, lower than about 73% of all of the relative errors of the latitudes between two neighboring CPR pixels. A similar conclusion can be drawn from Figure 12c, where the estimated longitudes coincided well with the MODIS longitudes from the MODIS-AUX data. Figure 12d shows that the relative differences were below 2 × 10⁻⁴, lower than about 78% of all of the relative errors of the longitudes between two neighboring CPR pixels. For the range shown in Figure 12, the estimated results coincided well with the MODIS-AUX data for both latitude and longitude. Similar conclusions were obtained by analyzing other data, which are not presented here for conciseness.

5.3. Cloud Classification

The proposed strategy was applied to 60 pixels on each side of the track in the cross-track direction of the randomly selected MODIS scene from 02:40 to 02:45 on 1 June 2015, which resulted in a classified cloud field covering about 121 km across track and 2030 km along track. To validate the proposed classification strategy, the classified results were compared with those obtained by ISCCP [21,24,25], as well as with the CloudSat product 2B-CLDCLASS [7,40,43,48]. The comparisons are given in Figure 13, where (a) is the cloud-type classification result of the proposed strategy, (b) is obtained by the method of ISCCP [21,24,25], and (c) shows the cloud types on the track from the CloudSat product 2B-CLDCLASS [7,40]. The arrowhead indicates the CPR motion direction in (c). The different cloud types are labelled with different colors. Note that Ci and Cs could not be distinguished, since CloudSat classifies them together as the high cloud class.
Comparing Figure 13a with Figure 13b, multilayer clouds were identified by the proposed strategy, whereas they could not be distinguished by the method of ISCCP, which is based on the single-layer assumption. It can be seen from Figure 13c that multilayer clouds (Ci/Cs over low-level clouds, MLice) appeared in the spatial spans from profile/pixel number (Nop) 800 to 700 and from Nop 100 to 1 along the ground track. The pixels located in the same spatial spans in Figure 13a were classified as MLice by the proposed strategy, which demonstrated the correctness of the multilayer cloud classification. Furthermore, comparing Figure 13a with Figure 13c, the proposed strategy could also classify clouds off the CPR track.
The pixels located at the bottom and the upper part of Figure 13a,b were classified by both methods into Sc and Ci clouds, respectively, and the textures of the clouds were similar on the whole. A few discrepancies existed because deviations were probably introduced by the ISCCP method, in which the cloud characteristics are retrieved under a plane-parallel atmosphere assumption. For instance, the pixels in the spatial spans from Nop 1600 to 1400, from Nop 1400 to 1300, and from Nop 1300 to 1200 were correctly classified by the proposed strategy and 2B-CLDCLASS as Ac, As, and Ac, respectively, whereas inconsistent types were determined by ISCCP.
There were about four major spatial spans in Figure 13a,c with consistent classified cloud types, that is, from Nop 1600 to 1200, from Nop 800 to 700, from Nop 800 to 100, and from Nop 100 to 1. Most of the pixels on the track were classified into the same types by both of the proposed strategy and 2B-CLDCLASS.

6. Discussion

To quantitatively illustrate the classification performance of the proposed strategy, the classified results were compared pixel by pixel with the cloud types from the CloudSat product 2B-CLDCLASS [7,40] for the recipients on the track. Figure 14 compares the results of the proposed strategy with the data from 2B-CLDCLASS, under the condition that the cloud types of the potential CPR pixels with the same position information could be used as those of the recipient pixels. Figure 14a plots, pixel by pixel, the cloud types obtained by the proposed strategy and the types given in 2B-CLDCLASS, where the horizontal axis is the number of pixels on the CPR track and the vertical axis presents the cloud types, that is, clear, Ci/Cs, As, Ac, St, Sc, Cu, Ns, Deep, multilayer ice cloud (MLice), multilayer water cloud (MLwater), and fill value (Fill). Figure 14b shows the statistical distribution of the different cloud types from the proposed strategy and 2B-CLDCLASS.
From Figure 14a,b, it could be seen that the classification results in this work on the CPR track coincided well with the data that was given in 2B-CLDCLASS. The overall matching degree was about 96.8%. Very few pixels were categorized into different types.
Similar to Figure 14, Figure 15 presents the results under the condition that the cloud type of the potential CPR pixel with the same position information was not used for the recipient pixels. In this case, the matching degree was about 86%. Note that the condition used in Figure 15 is universal, as it is also applicable to recipient pixels off the track, and the same matching degree is expected when a similar condition is set. The matching degree depends on the number of cloud types along the CPR track: the more abundant the cloud types, the smaller the classification error. The strategy is also applicable to other MODIS scenes, and similar conclusions were obtained from the results for other regions, which are not presented here for conciseness. According to the SRM hypothesis analysis in Section 5.1, 300 km is a reasonable choice for the farthest distance of recipient pixels away from the CPR track.
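For reference, the matching degree reported above is simply the fraction of on-track recipient pixels whose assigned type agrees with 2B-CLDCLASS; a trivial Python sketch with placeholder array names is given below.

```python
import numpy as np

def matching_degree(strategy_types, cldclass_types):
    """Fraction of on-track recipient pixels whose assigned type equals the 2B-CLDCLASS type."""
    return float(np.mean(np.asarray(strategy_types) == np.asarray(cldclass_types)))

# e.g. matching_degree(types_from_strategy, types_from_2b_cldclass)
# yields about 0.968 and 0.86 for the two conditions discussed above.
```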

7. Conclusions

This paper proposes a strategy to classify both single-layer and multilayer clouds in wide-swath passive sensor images with the aid of narrow-swath active sensor data, following the SRM hypothesis. Different from the classification based on the single-layer hypothesis used by ISCCP, multilayer cloud detection is also achieved, together with the overlapped type structure. Data from MODIS and CPR are used to validate the proposed strategy. A rough-to-fine, pixel-by-pixel orbit registration process is implemented to enable the combination of the passive and active sensors, and a new criterion for the orbit registration is proposed to improve the matching accuracy. Comparative results show that the estimated latitudes and longitudes coincide well with those from the MODIS-AUX data, with relative differences below 1.5 × 10⁻⁴ and 2 × 10⁻⁴, respectively. The proposed classification strategy is applied to a MODIS scene covering 121 km × 2030 km. The classification results coincide well with the data in 2B-CLDCLASS, with an overall accuracy of about 86%. It is concluded that the proposed strategy exhibits a superior performance for cloud classification in remote sensing images.

Author Contributions

H.W. performed the studies, designed the algorithms and the experiments, and wrote the manuscript. X.X. proposed the work, revised the manuscript, and gave specific advice during the studies.

Acknowledgments

The authors would like to thank the MODIS and CloudSat science team for providing excellent and accessible data products that made this study possible. The authors also would like to express their sincere gratitude to the anonymous reviewers for their valuable comments and suggestions, which led to great improvement of the authors’ work and the original manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Boundary Selection of the Orbit Registration Criterion

A detailed description of the boundary selection of the orbit registration criterion is presented here. The first orbit of CPR on 13 July 2013 was randomly chosen to illustrate the orbital characteristics. Figure A1 gives the relationship curves of the latitude and longitude differences between two adjacent pixels versus latitude. In Figure A1, the latitude differences and longitude differences are plotted as red circles and blue points, respectively.
As can be seen, when the latitudes are not higher than about 30°, the difference curves are nearly parallel to the X-axis. In detail, since the CPR ground track is nearly linear, the latitude differences between two adjacent pixels are constant for latitudes less than about 45°, while the longitude differences between two adjacent pixels are constant for latitudes less than about 30°; this is no longer the case when the latitudes are higher than 30°. The MODIS ground track has the same property. Consequently, according to error theory, the pixels can be treated as the same measurand, and the absolute error is suitable, when their latitudes are less than about 30°, whereas the pixels need to be treated as different measurands, and the relative error is suitable, when their latitudes are higher than about 30°.
To further validate boundary selection of the orbit registration criterion, more analyses are given about the orbital characteristics of CloudSat and Aqua satellites.
Figure A1. Relationship curves of the latitude and longitude differences between two adjacent pixels versus latitude, where the circles are the latitude differences and the points are the longitude differences.
Along a meridian, a change $d\varphi$ in geodetic latitude corresponds to an elliptical arc $dL_{M}$, while along a parallel, a change $d\lambda$ in longitude corresponds to a circular arc $dL_{P}$, as computed by (A1) and (A2) [40,42,49]:

$dL_{M} = \rho \, d\varphi$, (A1)

$dL_{P} = N \cos\varphi \, d\lambda$, (A2)

with

$\rho = \frac{N (1 - e^{2})}{1 - e^{2} \sin^{2}\varphi}, \qquad N = \frac{a}{\sqrt{1 - e^{2} \sin^{2}\varphi}}$,

where $\rho$ is the meridian radius of curvature, $N$ is the great normal (prime vertical radius of curvature), $\varphi$ is the geodetic latitude, and $\lambda$ is the geodetic longitude. $e$ is the eccentricity, $e = 0.000115$ for CloudSat and $e = 0.000187$ for Aqua; $a$ is the semi-major axis, $a = 7077.718262$ km for CloudSat and $a = 7077.797363$ km for Aqua [40,42,49].
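The arc lengths of (A1) and (A2) can be evaluated with a few lines of Python using the semi-major axes and eccentricities quoted above; this sketch only reproduces the kind of comparison shown in Figure A2 and is not the code used for the paper.

```python
import numpy as np

def arc_lengths_km(lat_deg, a_km, e, dphi_deg=1.0, dlam_deg=1.0):
    """Meridian arc dL_M = rho*d(phi) and parallel arc dL_P = N*cos(phi)*d(lambda), Eqs. (A1)-(A2)."""
    phi = np.radians(lat_deg)
    N = a_km / np.sqrt(1.0 - e**2 * np.sin(phi)**2)          # great normal
    rho = N * (1.0 - e**2) / (1.0 - e**2 * np.sin(phi)**2)   # meridian radius of curvature
    return rho * np.radians(dphi_deg), N * np.cos(phi) * np.radians(dlam_deg)

# Differences between the CloudSat and Aqua orbital shells (values quoted in the text)
for lat in (0.0, 30.0, 45.0, 60.0):
    cs = arc_lengths_km(lat, 7077.718262, 0.000115)
    aq = arc_lengths_km(lat, 7077.797363, 0.000187)
    print(lat, cs[0] - aq[0], cs[1] - aq[1])   # the differences are negligibly small
```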
Figure A2 shows the difference curves of the orbital characteristics of Aqua and CloudSat. In Figure A2, both $dL_{M}$ and $dL_{P}$ of Aqua differ from those of CloudSat by an insignificant amount; thus, the differences in the orbital characteristics can be ignored.
Figure A2. Difference curves of the orbital characteristics of Aqua and CloudSat: (a) $dL_{M}$ difference between Aqua and CloudSat with latitude, when $d\varphi = 1°$; and (b) $dL_{P}$ difference between Aqua and CloudSat with latitude, when $d\lambda = 1°$.
As shown in (A1) and (A2), $dL_{P}$ and $dL_{M}$ are proportional to the longitude difference $d\lambda$ and the latitude difference $d\varphi$, respectively.
Figure A3 gives statistical charts of the latitude and longitude differences between two adjacent pixels along the CloudSat ground track for a randomly chosen granule (granule 48817 of 2015). As Figure A3 shows, 95% of the latitude differences between two adjacent pixels along the CloudSat ground track are lower than 0.0096°, and 95% of the longitude differences are lower than 0.0315°. Figure A4 plots (a) $0.5(dL_{M} + dL_{P})$, (b) $\sqrt{dL_{M}^{2} + dL_{P}^{2}}$, and (c) $\sqrt{dL_{M}^{2} + dL_{P}^{2}}/\varphi$ for the 95th-percentile latitude and longitude differences between two adjacent pixels along the CloudSat ground track, $d\varphi = 0.0096°$ and $d\lambda = 0.0315°$.
It can be seen from Figure A4 that, as the latitude increases, $dL_{M} + dL_{P}$ decreases for the same $d\lambda$ and $d\varphi$, and the corresponding $d\lambda$ and $d\varphi$ increase for the same $dL_{M} + dL_{P}$. Considering that the influence of $dL_{M} + dL_{P}$ on $d\lambda$ and $d\varphi$ varies with latitude, the influences of latitude and longitude need to be introduced into the orbit registration. The inflexion point of $\sqrt{dL_{M}^{2} + dL_{P}^{2}}$ versus $\varphi$ emerges at a latitude of approximately 45°; that is to say, the behaviour of $\sqrt{dL_{M}^{2} + dL_{P}^{2}}$ is different for $\varphi > 45°$ and $\varphi < 45°$. When $\varphi \le 40°$, the variation of $\sqrt{dL_{M}^{2} + dL_{P}^{2}}$ as a function of $\varphi$ is less than 1 km, which is less than the resolution of a single MODIS pixel. On the other hand, when $\varphi > 40°$, the curve of $\sqrt{dL_{M}^{2} + dL_{P}^{2}}/\varphi$ as a function of $\varphi$ tends to flatten. On overall consideration, it is reasonable to use Equation (1) to seek the most matching MODIS pixel when $\varphi \le 40°$, whereas Equation (2) is used when $\varphi > 40°$.
Figure A3. Estimated cumulative density function (CDF) of the latitude differences and longitude differences between two adjacent pixels along the randomly chosen CloudSat ground track (granule 48817 of 2015): (a) latitude differences and (b) longitude differences.
Figure A4. CloudSat orbital characteristic curves with latitude: (a) $0.5(dL_{M} + dL_{P})$, (b) $\sqrt{dL_{M}^{2} + dL_{P}^{2}}$, and (c) $\sqrt{dL_{M}^{2} + dL_{P}^{2}}/\varphi$ for the 95th-percentile latitude and longitude differences between two adjacent pixels along the CloudSat ground track.

References

  1. Hughes, M.J.; Hayes, D.J. Automated detection of cloud and cloud shadow in single-date Landsat imagery using neural networks and spatial post-processing. Remote Sens. 2014, 6, 4907–4926. [Google Scholar] [CrossRef]
  2. Musial, J.P.; Hüsler, F.; Sütterlin, M.; Neuhaus, C.; Wunderle, S. Daytime low stratiform cloud detection on AVHRR imagery. Remote Sens. 2014, 6, 5124–5150. [Google Scholar] [CrossRef] [Green Version]
  3. Hollstein, A.; Segl, K.; Guanter, L.; Brell, M.; Enesco, M. Ready-to-use methods for the detection of clouds, cirrus, snow, shadow, water and clear sky pixels in sentinel-2 MSI images. Remote Sens. 2016, 8, 666. [Google Scholar] [CrossRef]
  4. Li, H.; Zheng, H.; Han, C.; Wang, H.; Miao, M. Onboard spectral and spatial cloud detection for hyperspectral remote sensing images. Remote Sens. 2018, 10, 152. [Google Scholar] [CrossRef]
  5. Li, J.; Yi, Y.; Minnis, P.; Huang, J.; Yan, H.; Ma, Y.; Wang, W.; Ayers, J.K. Radiative effect differences between multi-layered and single-layer clouds derived from CERES, CALIPSO and CloudSat data. J. Quant. Spectrosc. Radiat. Transf. 2011, 112, 361–375. [Google Scholar] [CrossRef]
  6. Behrangi, A.; Casey, S.P.F.; Lambrigtsen, B.H. Three-dimensional distribution of cloud types over the USA and surrounding areas observed by CloudSat. Int. J. Remote Sens. 2012, 33, 4856–4870. [Google Scholar] [CrossRef]
  7. Wang, Z.; Sassen, K. Level 2 Cloud Scenario Classification Product Process Description and Interface Control Document. Available online: http://irina.eas.gatech.Edu/EAS_Fall2008/CloudSat_ATBD_L2_cloud_clas.pdf (accessed on 8 March 2018).
  8. Parmes, E.; Rauste, Y.; Molinier, M.; Andersson, K.; Seitsonen, L. Automatic cloud and shadow detection in optical satellite imagery without using thermal bands—Application to Suomi NPP VIIRS images over Fennoscandia. Remote Sens. 2017, 9, 806. [Google Scholar] [CrossRef]
  9. Tan, K.; Zhang, Y.; Tong, X. Cloud extraction from Chinese high resolution satellite imagery by probabilistic latent semantic analysis and object-based machine learning. Remote Sens. 2016, 8, 963. [Google Scholar] [CrossRef]
  10. Frey, R.; Baum, B.; Heidinger, A.; Ackerman, S.; Maddux, B.; Menzel, P. MODIS CTP (MOD06) Webinar #7. Available online: http://modis-atmos.gsfc.nasa.gov/sites/default/files/ModAtmo/MODIS_C6_Cloud_Top_Products_Menzel.pdf (accessed on 8 March 2018).
  11. Heidinger, A.K.; Pavolonis, M.J. Global daytime distribution of overlapping cirrus cloud from NOAA’s Advanced very High Resolution Radiometer. J. Clim. 2005, 18, 4772–4784. [Google Scholar] [CrossRef]
  12. Joiner, J.; Vasilkov, A.P.; Bhartia, P.K.; Wind, G.; Platnick, S.; Menzel, W.P. Detection of multi-layer and vertically-extended clouds using A-train sensors. Atmos. Meas. Tech. 2010, 3, 233–247. [Google Scholar] [CrossRef]
  13. Menzel, W.P.; Frey, R.A.; Baum, B.A. Cloud Top Properties and Cloud Phase Algorithm Theoretical Basis Document. Available online: https://modis-atmos.gsfc.nasa.gov/_docs/MOD06-ATBD_2015_05_01.pdf (accessed on 8 March 2018).
  14. Platnick, S.; King, M.D.; Meyer, K.G.; Wind, G.; Amarasinghe, N.; Marchant, B.; Arnold, G.T.; Zhang, Z.; Hubanks, P.A.; Ridgway, B.; et al. MODIS Cloud Optical Properties: User Guide for the Collection 6 Level-2 MOD06/MYD06 Product and Associated Level-3 Datasets. Available online: https://modis-atmos.gsfc.nasa.gov/_docs/C6MOD06OPUserGuide.pdf (accessed on 8 March 2018).
  15. Spinhirne, J.D.; Palm, S.P.; Hart, W.D.; Hlavka, D.L.; Welton, E.J. Cloud and aerosol measurements from GLAS: Overview and initial results. Geophys. Res. Lett. 2005, 32, L22S03. [Google Scholar] [CrossRef]
  16. Spinhirne, J.D.; Palm, S.P.; Hart, W.D. Antarctica cloud cover for October 2003 from GLAS satellite lidar profiling. Geophys. Res. Lett. 2005, 32, L22S05. [Google Scholar] [CrossRef]
  17. Kato, S.; Sun-Mack, S.; Miller, W.F.; Rose, F.G.; Chen, Y.; Minnis, P.; Wielicki, B.A. Relationships among cloud occurrence frequency, overlap, and effective thickness derived from CALIPSO and CloudSat merged cloud vertical profiles. J. Geophys. Res. Atmos. 2010, 115, D00H28. [Google Scholar] [CrossRef]
  18. Miller, S.D.; Forsythe, J.M.; Partain, P.T.; Haynes, J.M.; Bankert, R.L.; Sengupta, M.; Mitrescu, C.; Hawkins, J.D.; Vonder Haar, T.H. Estimating three-dimensional cloud structure via statistically blended satellite observations. J. Appl. Meteorol. Clim. 2014, 53, 437–455. [Google Scholar] [CrossRef]
  19. Minnis, P.; Sun-Mack, S.; Chen, Y.; Yi, H.; Huang, J.; Nguyen, L.; Khaiyer, M.M. Detection and retrieval of multi-layered cloud properties using satellite data. In Proceedings of the SPIE Europe International Symposium on Remote Sensing, Remote Sensing of Clouds and the Atmosphere X, Bruges, Belgium, 19–22 September 2005. [Google Scholar]
  20. Wind, G.; Platnick, S.; King, M.D.; Hubanks, P.A.; Pavolonis, M.J.; Heidinger, A.K.; Yang, P.; Baum, B.A. Multilayer cloud detection with the MODIS near-infrared water vapor absorption band. J. Appl. Meteorol. Clim. 2010, 49, 2315–2332. [Google Scholar] [CrossRef]
  21. Golea, V. International Satellite Cloud Climatology Project. Available online: http://isccp.giss.nasa.gov/cloudtypes.html#DIAGRAM (accessed on 8 March 2018).
  22. Marchand, R.; Ackerman, T.; Smyth, M.; Rossow, W.B. A review of cloud top height and optical depth histograms from MISR, ISCCP and MODIS. J. Geophys. Res. Atmos. 2010, 115, D16206. [Google Scholar] [CrossRef]
  23. Pincus, R.; Platnick, S.; Ackerman, S.A.; Hemler, R.S.; Hofmann, R.J.P. Reconciling simulated and observed views of clouds: MODIS, ISCCP and the limits of instrument simulators. J. Clim. 2012, 25, 4699–4720. [Google Scholar] [CrossRef]
  24. Rossow, W.B.; Garder, L.C.; Lu, P.J.; Walker, A. International Satellite Cloud Climatology Project (ISCCP) Documentation of Cloud Data; WMO/TD 266 (rev.); World Climate Research Programme: Geneva, Switzerland, 1991; pp. 1–85. [Google Scholar]
  25. Rossow, W.B.; Schiffer, R.A. Advances in understanding clouds from ISCCP. Bull. Am. Meteorol. Soc. 1999, 80, 2261–2287. [Google Scholar] [CrossRef]
  26. Sun-Mack, S.; Minnis, P.; Chen, Y.; Gibson, S.; Yi, Y.; Trepte, Q.; Wielicki, B.; Kato, S.; Winker, D. Integrated cloud-aerosol-radiation product using CERES, MODIS, CALIPSO and CloudSat data. In Proceedings of the SPIE Europe Conference on the Remote Sensing of Clouds and the Atmosphere, Florence, Italy, 25 October 2007; pp. 1–12. [Google Scholar] [CrossRef]
  27. Sun-Mack, S.; Minnis, P.; Kato, S.; Chen, Y.; Yi, Y.; Gibson, S.; Heck, P.; Winker, D.; Ayers, K. Enhanced cloud algorithm from collocated CALIPSO, CloudSat and MODIS global boundary layer lapse rate studies. In Proceedings of the IEEE International Geoscience & Remote Sensing Symposium, Honolulu, HI, USA, 25–30 July 2010; pp. 201–204. [Google Scholar]
  28. Sun, X.J.; Li, H.R.; Barker, H.W.; Zhang, R.W.; Zhou, Y.B.; Liu, L. Satellite-based estimation of cloud-base heights using constrained spectral radiance matching. Q. J. R. Meteorol. Soc. 2016, 142, 224–232. [Google Scholar] [CrossRef]
  29. Austin, R.T.; Heymsfield, A.J.; Stephens, G.L. Retrieval of ice cloud microphysical parameters using the CloudSat millimeter-wave radar and temperature. J. Geophys. Res. 2009, 114, D00A23. [Google Scholar] [CrossRef]
  30. Leptoukh, G.; Kempler, S.; Smith, P.; Savtchenko, A.; Kummerer, R.; Gopalan, A.; Farley, J.; Chen, A. A-train data depot: Integrating and exploring data along the A-train tracks. In Proceedings of the IEEE International Geoscience & Remote Sensing Symposium, Barcelona, Spain, 23–28 July 2007; Volume 100, pp. 1118–1121. [Google Scholar] [CrossRef]
  31. Savtchenko, A.; Kummerer, R.; Smith, P.; Gopalan, A.; Kempler, S.; Leptoukh, G. A-train data depot: Bringing atmospheric measurements together. IEEE Trans. Geosci. Remote Sens. 2008, 46, 2788–2795. [Google Scholar] [CrossRef]
  32. Barker, D.M.; Huang, W.; Guo, Y.R.; Bourgeois, A.J.; Xiao, Q.N. A three-dimensional variational data assimilation system for MM5: Implementation and initial results. Mon. Weather Rev. 2004, 132, 897–914. [Google Scholar] [CrossRef]
  33. Barker, H.W.; Jerg, M.P.; Wehr, T.; Kato, S.; Donovan, D.P.; Hogan, R.J. A 3D cloud-construction algorithm for the EarthCARE satellite mission. Q. J. R. Meteorol. Soc. 2011, 137, 1042–1058. [Google Scholar] [CrossRef]
  34. Chan, M.A.; Comiso, J.C. Arctic cloud characteristics as derived from MODIS, CALIPSO and CloudSat. J. Clim. 2013, 26, 3285–3306. [Google Scholar] [CrossRef]
  35. Luo, Y.; Zhang, R.; Wang, H. Comparing occurrences and vertical Structures of hydrometeors between eastern China and the Indian monsoon region using CloudSat/CALIPSO Data. J. Clim. 2009, 22, 1052–1064. [Google Scholar] [CrossRef]
  36. Zeng, S.; Riedi, J.; Trepte, C.R.; Winker, D.M.; Hu, Y.X. Study of global cloud droplet number concentration with A-train satellites. Atmos. Chem. Phys. 2014, 14, 7125–7134. [Google Scholar] [CrossRef]
  37. Young, A.H.; Bates, J.J.; Curry, J.A. Application of cloud vertical structure from CloudSat to investigate MODIS-derived cloud properties of cirriform, anvil, and deep convective clouds. J. Geophys. Res. Atmos. 2013, 118, 4689–4699. [Google Scholar] [CrossRef]
  38. Wang, H.; Xu, X. Orbit registration between wide swaths of passive sensors and narrow tracks of active sensors. In Proceedings of the 13th IEEE International Conference on Signal Processing, Chengdu, China, 6–10 November 2016; pp. 209–214. [Google Scholar]
  39. Zhang, Z.; Li, D.; Liu, S.; Xiao, B.; Cao, X. Cross-domain ground-based cloud classification based on transfer of local features and discriminative metric learning. Remote Sens. 2018, 10, 8. [Google Scholar] [CrossRef]
  40. Stephens, G.L.; Vane, D.G.; TeBockhorst, D. CloudSat-Instrument: Home. Available online: http://www.cloudsat.cira.colostate.edu/ (accessed on 8 March 2018).
  41. Delanoë, J.; Hogan, R.J. Combined CloudSat-CALIPSO-MODIS retrievals of the properties of ice clouds. J. Geophys. Res. Atmos. 2010, 115, 1–17. [Google Scholar] [CrossRef]
  42. Maccherone, B.; Frazier, S. MODIS Level 1 Data, Geolocation, Cloudmask and Atmosphere Product. Available online: http://modis-atmos.gsfc.nasa.gov/ (accessed on 8 March 2018).
  43. Stephens, G.L.; Vane, D.G.; Tanelli, S.; Im, E.; Durden, S.; Rokey, M.; Reinke, D.; Partain, P.; Mace, G.G.; Austin, R. CloudSat mission: Performance and early science after the first year of operation. J. Geophys. Res. Atmos. 2008, 113, D00A18. [Google Scholar] [CrossRef]
  44. Partain, P. Cloudsat MODIS-AUX Auxiliary Data Process Description and Interface Control Document; Colorado State University: Fort Collins, CO, USA, 2007. [Google Scholar]
  45. Wang, S.; Yao, Z.; Han, Z.; Zhao, Z. Feasibility analysis of extending the spatial coverage of cloud-base height from CloudSat. Meteorol. Mon. 2012, 38, 210–219. [Google Scholar]
  46. Sassen, K.; Wang, Z. Classifying clouds around the globe with the CloudSat radar: 1-year of results. Geophys. Res. Lett. 2008, 35, L04805. [Google Scholar] [CrossRef]
  47. Xiong, X. The Research of Specific Cloud Properties Based on MODIS Remote Sensing Data. Master’s Thesis, University of Electronic Science and Technology of China, Chengdu, China, 2015. [Google Scholar]
  48. Ham, S.H.; Kato, S.; Barker, H.W.; Rose, F.G.; Sun-Mack, S. Effects of 3-D clouds on atmospheric transmission of solar radiation: Cloud type dependencies inferred from A-train satellite data. J. Geophys. Res. Atmos. 2014, 119, 943–963. [Google Scholar] [CrossRef]
  49. Capderou, M. Handbook of Satellite Orbits-from Kepler to GPS; Lyle, S., Translator; Springer: Cham, Switzerland, 2014; pp. 25–52. ISBN 978-3-319-03416-4. [Google Scholar]
Figure 1. Concept diagram of the proposed strategy.
Figure 2. Moderate Resolution Imaging Spectroradiometer (MODIS) RGB (Red, Green, and Blue) scene from 02:40 to 02:45, on 1 June 2015 [42].
Figure 3. Flowchart of the statistical analysis process. CTH—cloud top height; CTHD—CTH differences; CBH—cloud base height; CBHD—CBH differences.
Figure 4. Hypothetical distribution of CloudSat pixels of an arbitrary cloud type.
Figure 5. Flow chart of cloud classification based on similar radiance matching (SRM).
Figure 6. Relationship of relative pixels.
Figure 7. MODIS swath during five minutes, from 02:40 to 02:45, on 1 June 2015, and the overlapped CloudSat track, where the sparse sampling pixels of the granule have been plotted to reveal clear details. The circles represent the original MODIS granule, the overlapped asterisks are the selected MODIS granule, and the dotted line denotes the truncated CloudSat track.
Figure 8. Most matching MODIS pixels on the track and the truncated Cloud Profiling Radar (CPR) track for the granule shown in Figure 7 using Equation (1).
Figure 9. SDs (a,b) and mean values (c,d) of CBHD, along with the distance between recipient and donor pixels in May 2015, in the region 0~60° N and 70–140° E, where (a,c) are for cirrus (Ci), altostratus (As), altocumulus (Ac), and stratus (St), and (b,d) are for stratocumulus (Sc), cumulus (Cu), nimbostratus (Ns), and deep convective clouds (Deep).
Figure 10. SDs (a,b) and mean values (c,d) of CTHD, along with the distance between recipient and donor pixels in May 2015, in the region 0~60° N and 70–140° E, where (a,c) are for Ci, As, Ac, and St, and (b,d) are for Sc, Cu, Ns, and Deep.
Figure 11. The most matching MODIS pixels and the CPR pixels on truncated track for the granule shown in Figure 7 using Equation (2) [38].
Figure 12. Scattered pixels distribution and relative differences about estimated latitudes, longitudes and those from MODIS-AUX for the granule shown in Figure 7 [38]. (a,b) are for latitude, while (c,d) are for longitude.
Figure 13. Classified result comparisons: (a) the proposed strategy, (b) the method of ISCCP [21,24,25], and (c) 2B-CLDCLASS [7,40] on the track. Different cloud types are labeled with different colors of the color bar. MLice means ice clouds overlap low-level clouds, whereas MLwater means water clouds lie above. MODIS moves from bottom to top in (a,b). The arrowhead indicates the CPR motion direction in (c).
Figure 14. Results of the proposed technique and cloud types from 2B-CLDCLASS [7,40] under the condition that the cloud types of the potential CPR pixels with the same position information could be used as those of the recipient pixels. (a) is the scattered distribution, where the points are the cloud types of the proposed technique and the circles stand for the types from 2B-CLDCLASS; (b) is the statistical distribution, where “Matched” means that the same type was determined by both methods and “Unmatched” means that different types were assigned.
Figure 15. Results of the proposed technique and cloud types from 2B-CLDCLASS [7,40] under the condition that the cloud types of the potential CPR pixels with the same position information were not used as those of the recipient pixels. (a) is the scattered distribution, where the points are the cloud types of the proposed technique and the circles stand for the types from 2B-CLDCLASS; (b) is the statistical distribution, where “Matched” means that the same type was determined by both methods and “Unmatched” means that different types were assigned.
Table 1. Means and SDs of cloud top height (CTH) and cloud base height (CBH) for the eight cloud types selected in this study *.
Cloud Types                    CTH (km)        CBH (km)
Cirrus (Ci)                    12.77 ± 2.27    10.41 ± 2.64
Altostratus (As)               6.36 ± 2.66     4.01 ± 3.01
Altocumulus (Ac)               4.31 ± 1.63     3.14 ± 1.41
Stratus (St)                   1.06 ± 0.65     0.71 ± 0.54
Stratocumulus (Sc)             1.66 ± 0.81     0.88 ± 0.68
Cumulus (Cu)                   2.19 ± 1.62     0.81 ± 1.42
Nimbostratus (Ns)              4.43 ± 2.10     0.47 ± 2.34
Deep convective clouds (Dc)    5.42 ± 1.90     0.56 ± 1.54
* Mean and standard deviation are given as (mean) ± (one standard deviation).
