Figure 1.
The study area at Xiaotangshan, Beijing, China (OMIS true color synthesis image, R = 699.2 nm, G = 565.4 nm, B = 465.0 nm).
Figure 2.
The study area at Doña Ana County, NM, USA.
Figure 3.
The study area at Qinghai Lake basin, Qinghai Province, China.
Figure 4.
The evaluation scheme for the HCW-SSC method (Euclidean distance (ED), spectral angle cosine (SAC), spectral correlation coefficient (SCC), and spectral information divergence (SID)).
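For reference, the four similarity measures named in Figure 4 can each be computed between a pixel spectrum and a library spectrum. The sketch below is a minimal NumPy illustration of their standard definitions, not the paper's implementation; the function names and the random example spectra are ours.

```python
import numpy as np

def euclidean_distance(x, y):
    """ED: straight-line distance between two spectra (a dissimilarity)."""
    return np.linalg.norm(x - y)

def spectral_angle_cosine(x, y):
    """SAC: cosine of the angle between two spectra (1 = identical shape)."""
    return np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))

def spectral_correlation_coefficient(x, y):
    """SCC: Pearson correlation between two spectra."""
    xc, yc = x - x.mean(), y - y.mean()
    return np.dot(xc, yc) / (np.linalg.norm(xc) * np.linalg.norm(yc))

def spectral_information_divergence(x, y, eps=1e-12):
    """SID: symmetric Kullback-Leibler divergence of the normalized spectra."""
    p = np.clip(x / (x.sum() + eps), eps, None)
    q = np.clip(y / (y.sum() + eps), eps, None)
    return np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))

# Example: compare a pixel spectrum against a library spectrum (random stand-ins).
pixel = np.random.rand(224)      # e.g., a 224-band hyperspectral pixel
reference = np.random.rand(224)  # e.g., a resampled library spectrum
print(euclidean_distance(pixel, reference),
      spectral_angle_cosine(pixel, reference),
      spectral_correlation_coefficient(pixel, reference),
      spectral_information_divergence(pixel, reference))
```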
Figure 5.
A schematic diagram showing the rule of the similarity measure.
Figure 6.
Classification maps produced by four different methods and the field measurement (spectral angle cosine (SAC), spectral information divergence (SID), spectral information divergence–spectral angle cosine (SID-SAC), and a new hybrid changing-weight classification method with a filter feature selection (HCW-SSC)).
Figure 7.
Classification maps produced by five different methods and the baseline map (Euclidean distance (ED), spectral angle cosine (SAC), Euclidean distance–spectral angle cosine (ED-SAC), random forest (RF), and a new hybrid changing-weight classification method with a filter feature selection (HCW-SSC)).
Figure 8.
Classification maps produced by five different methods and the baseline map (spectral information divergence (SID), spectral correlation coefficient (SCC), spectral information divergence–spectral correlation coefficient (SID-SCC), random forest (RF), and a new hybrid changing-weight classification method with a filter feature selection (HCW-SSC)).
Figure 9.
The time consumption of different kernels for different image sizes (Euclidean distance (ED), spectral angle cosine (SAC), Euclidean distance–spectral angle cosine (ED-SAC), a new hybrid changing-weight classification method with a filter feature selection (HCW-SSC), and random forest (RF)).
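Figure 9 compares runtime against image size. The following is a minimal, assumed timing harness of the kind such a comparison requires (per-pixel matching against reference spectra, shown here with SAC only); the image sizes, band count, and function names are illustrative and are not the paper's benchmark code.

```python
import time
import numpy as np

def sac(pixels, ref):
    """Spectral angle cosine between every pixel (rows) and one reference spectrum."""
    return pixels @ ref / (np.linalg.norm(pixels, axis=1) * np.linalg.norm(ref) + 1e-12)

def classify_image(image, references, measure):
    """Assign each pixel the class of the most similar reference spectrum."""
    h, w, b = image.shape
    pixels = image.reshape(-1, b)
    scores = np.stack([measure(pixels, ref) for ref in references], axis=1)
    return scores.argmax(axis=1).reshape(h, w)

references = [np.random.rand(128) for _ in range(5)]  # 5 hypothetical classes, 128 bands
for size in (64, 128, 256):                           # growing image sizes, as in Figure 9
    image = np.random.rand(size, size, 128)
    start = time.perf_counter()
    classify_image(image, references, sac)
    print(f"{size} x {size}: {time.perf_counter() - start:.3f} s")
```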
Table 1.
The details of data used in this paper.
| Name | Spatial Resolution | Spectral Resolution | Spectral Domain | Location (Based on WGS 84) |
|---|---|---|---|---|
| Standard spectral libraries | NA (fieldwork) | 0.2 nm or 1 nm | 400–2500 nm | Pasadena, CA, USA |
| OMIS | 2.8 m | 10 nm | 400–12,500 nm | Beijing, China (40°10′57″ N, 116°26′36″ E) |
| AVIRIS | 20 m | 10 nm | 357–2576 nm | NM, USA (32°28′16.6″ N, 106°54′23.7″ W) |
| Hyperion | 30 m | 10 nm | 426–2356 nm | Qinghai, China (37°25′54″ N, 100°11′44″ E) |
Table 2.
The details of spectral libraries.
| Spectral Library | Wavelength Range | Spectral Resolution in the Visible Region | Spectral Resolution in the Infrared Region |
|---|---|---|---|
| USGS vegetation | 0.4–2.5 μm | 0.2 nm | 0.5 nm |
| USGS mineral | 0.4–2.5 μm | 0.2 nm | 0.5 nm |
| Chris Elvidge green | 0.4–2.5 μm | 1 nm | 4 nm |
| Chris Elvidge dry | 0.4–2.5 μm | 1 nm | 4 nm |
Table 3.
Overall accuracy and F1 score of each similarity measure based on the standard spectral libraries (Euclidean distance (ED), spectral angle cosine (SAC), Euclidean distance–spectral angle cosine (ED-SAC), and a new hybrid changing-weight classification method with a filter feature selection (HCW-SSC)).
| Similarity Measure | ED | SAC | ED-SAC | HCW-SSC |
|---|---|---|---|---|
| Overall accuracy (%) | 87.50 | 91.25 | 93.75 | 97.50 |
| F1 score | 0.888 | 0.943 | 0.958 | 0.974 |
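The overall accuracy and F1 score in Table 3 follow their standard confusion-matrix definitions. A minimal sketch, assuming a macro-averaged F1 over classes (the averaging convention is not stated here) and using hypothetical labels:

```python
import numpy as np

def overall_accuracy(y_true, y_pred):
    """Fraction of samples whose predicted class matches the reference class."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean(y_true == y_pred))

def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 scores (one plausible reading of 'F1 score')."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    f1s = []
    for c in np.unique(y_true):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall) if precision + recall else 0.0)
    return float(np.mean(f1s))

# Hypothetical reference labels and classifier output for eight spectra, four classes.
y_true = [0, 0, 1, 1, 2, 2, 3, 3]
y_pred = [0, 1, 1, 1, 2, 2, 3, 0]
print(overall_accuracy(y_true, y_pred), macro_f1(y_true, y_pred))
```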
Table 4.
Overall accuracy and kappa coefficient of each similarity measure based on OMIS HSI (spectral angle cosine (SAC), spectral information divergence (SID), spectral information divergence–spectral angle cosine (SID-SAC), and a new hybrid changing-weight classification method with a filter feature selection (HCW-SSC)).
| Similarity Measure | SAC | SID | SID-SAC | HCW-SSC |
|---|---|---|---|---|
| Overall accuracy (%) | 62.69 | 75.69 | 88.27 | 93.21 |
| Kappa coefficient | 0.5989 | 0.7563 | 0.8698 | 0.9245 |
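The kappa coefficients reported in Table 4 and the following tables are Cohen's kappa, which discounts the agreement expected by chance. A minimal sketch of the standard computation from a confusion matrix (the matrix values below are hypothetical):

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix (rows: reference, columns: predicted)."""
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    observed = np.trace(confusion) / n                             # observed agreement (= OA)
    expected = np.sum(confusion.sum(0) * confusion.sum(1)) / n**2  # chance agreement
    return (observed - expected) / (1.0 - expected)

# Hypothetical three-class confusion matrix.
cm = [[50, 3, 2],
      [4, 45, 6],
      [1, 5, 44]]
print(cohens_kappa(cm))
```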
Table 5.
Overall accuracy and kappa coefficient of each similarity measure based on AVIRIS HSI (Euclidean distance (ED), spectral angle cosine (SAC), Euclidean distance–spectral angle cosine (ED-SAC), random forest (RF), and a new hybrid changing-weight classification method with a filter feature selection (HCW-SSC)).
| Similarity Measure | ED | SAC | ED-SAC | RF | HCW-SSC |
|---|---|---|---|---|---|
| Overall accuracy (%) | 51.23 | 63.76 | 73.12 | 68.27 | 79.24 |
| Kappa coefficient | 0.4249 | 0.5323 | 0.7225 | 0.6628 | 0.8044 |
Table 6.
Overall accuracy and kappa coefficient of each similarity measure based on Hyperion HSI (spectral information divergence (SID), spectral correlation coefficient (SCC), spectral information divergence–spectral correlation coefficient (SID-SCC), random forest (RF), and a new hybrid changing-weight classification method with a filter feature selection (HCW-SSC)).
| Similarity Measure | SID | SCC | SID-SCC | RF | HCW-SSC |
|---|---|---|---|---|---|
| Overall accuracy (%) | 54.82 | 62.39 | 56.34 | 51.21 | 81.23 |
| Kappa coefficient | 0.3934 | 0.4363 | 0.3623 | 0.4255 | 0.7234 |
Table 7.
Overall accuracy and kappa coefficient of each hybrid kernel based on OMIS HSI (the gray area contains the results from Table 4; Euclidean distance–spectral correlation coefficient (ED-SCC), Euclidean distance–spectral angle cosine (ED-SAC), spectral correlation coefficient–spectral information divergence (SCC-SID), spectral angle cosine–spectral correlation coefficient (SAC-SCC), Euclidean distance–spectral information divergence (ED-SID), spectral information divergence–spectral angle cosine (SID-SAC), and a new hybrid changing-weight classification method with a filter feature selection (HCW-SSC)). The numbers in bold indicate the most important results.
| Similarity Measure | ED-SCC | ED-SAC | SCC-SID | SAC-SCC | ED-SID | SID-SAC | HCW-SSC |
|---|---|---|---|---|---|---|---|
| Overall accuracy (%) | 63.22 | 58.82 | 61.39 | 69.24 | 66.17 | 88.27 | 93.21 |
| Kappa coefficient | 0.6215 | 0.4932 | 0.4342 | 0.7014 | 0.6138 | 0.8698 | 0.9245 |
Table 8.
Weight of selected similarity measures based on standard spectral libraries (Euclidean distance (ED) and spectral angle cosine (SAC)).
| Weight | USGS Vegetation | Chris Elvidge Dry | Chris Elvidge Green | USGS Mineral |
|---|---|---|---|---|
| ED | 0.312 | 0.329 | 0.343 | 0.246 |
| SAC | 0.688 | 0.671 | 0.657 | 0.754 |
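Table 8 reports only the selected weights; the exact combination rule of HCW-SSC is defined in the methods section. Purely as an illustration, assuming the hybrid score is a weighted blend of the individual measures after mapping ED to a similarity, the weights could be applied as follows (the mapping and function names are our assumptions, not the paper's formula):

```python
def hybrid_score(ed, sac, w_ed, w_sac):
    """Illustrative changing-weight blend of ED and SAC (assumed form, not the paper's exact rule).

    ED is a dissimilarity, so it is first mapped to a similarity in (0, 1];
    SAC is already a similarity and is used as-is.
    """
    ed_sim = 1.0 / (1.0 + ed)  # assumed monotone mapping of ED to a similarity
    return w_ed * ed_sim + w_sac * sac

# Example with the USGS mineral weights from Table 8 (w_ED = 0.246, w_SAC = 0.754).
print(hybrid_score(ed=0.35, sac=0.97, w_ed=0.246, w_sac=0.754))
```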
Table 9.
Overall accuracy of a semisupervised kernel with K-means on three different datasets.
| | Urban (Hypercube) | Kennedy Space Center (AVIRIS) | DC Mall (Airborne Sensor) |
|---|---|---|---|
| Number of bands | 210 | 176 | 210 |
| Number of classes | 4 | 13 | 7 |
| Overall accuracy of the semisupervised kernel | 93.48% | 80.37% | 99.17% |
Table 10.
Overall accuracy of support vector machines (SVM), extended morphological profiles (EMPs), joint sparse representation (JSR), edge-preserving filtering (EPF), 3D-CNN, CNN with pixel–pair features (CNN-PPF), Gabor-CNN, Siamese CNN (S-CNN), 3D-generative adversarial network (3D-GAN), and the deep feature fusion network (DFFN) on three different datasets.
| | Salinas (AVIRIS) | University of Pavia (ROSIS-03 Sensor) | Houston (Airborne Sensor) |
|---|---|---|---|
| Number of bands | 204 | 115 | 144 |
| Number of classes | 16 | 9 | 15 |
| Overall accuracy | | | |
| SVM | 0.7513 | 0.8813 | 0.8985 |
| EMP | 0.7815 | 0.9504 | 0.9707 |
| JSR | 0.7528 | 0.9349 | 0.9307 |
| EPF | 0.7824 | 0.9688 | 0.9700 |
| 3D-CNN | 0.8013 | 0.9502 | 0.9695 |
| CNN-PPF | 0.7991 | 0.969 | 0.9404 |
| Gabor-CNN | 0.8114 | 0.9662 | 0.9734 |
| S-CNN | 0.8052 | 0.9743 | 0.9510 |
| 3D-GAN | 0.7616 | 0.9697 | 0.9793 |
| DFFN | 0.8328 | 0.9808 | 0.9967 |
Table 11.
Overall accuracy of support vector machines (SVM), k-nearest neighbors (kNN), classification and regression trees (CART), and naïve Bayes on three different datasets.
| | Indian Pines (AVIRIS) | Salinas (AVIRIS) | University of Pavia (ROSIS-03 Sensor) |
|---|---|---|---|
| Number of bands | 224 | 224 | 103 |
| Number of classes | 9 | 16 | 9 |
| Overall accuracy | | | |
| SVM | 0.7875 | 0.8994 | 0.8992 |
| kNN | 0.6733 | 0.8502 | 0.7942 |
| CART | 0.6301 | 0.8389 | 0.7406 |
| Naïve Bayes | 0.5292 | 0.7873 | 0.6776 |