Hyperspectral Image Classification Based on a Least Square Bias Constraint Additional Empirical Risk Minimization Nonparallel Support Vector Machine
Abstract
1. Introduction
- (1) The selection of experimental software tools and the hardware conditions of the experiments.
- (2) A brief introduction to the BC-AERM-NSVM algorithm and a detailed description of the LS-BC-AERM-NSVM algorithm model.
- (3) The evaluation indices used to assess the experimental results.
2. Materials and Methods
2.1. Software Description
2.2. Data
2.2.1. Indian Pines Dataset
2.2.2. Kennedy Space Center Dataset
2.2.3. Pavia University Dataset
2.2.4. Salinas Dataset
2.3. Bias Constraint Additional Empirical Risk Minimization Nonparallel Support Vector Machine
2.4. Least Square Bias Constraint Additional Empirical Risk Minimization Nonparallel Support Vector Machine
2.4.1. Linear Case
2.4.2. Nonlinear Case
2.5. Application of Algorithms in Hyperspectral Image Classification
Algorithm 1: The Classification Process of the LSBAENSVM Algorithm Model for a Hyperspectral Dataset

Step 1: The categories of the hyperspectral dataset are combined in pairs to obtain binary classification tasks.
Step 2: The hyperparameters c1, c2, c3, and c4 of the LSBAENSVM model are set.
Step 3: Each binary classification task is trained using LSBAENSVM. First, the parameters set in Step 2 are used to solve Formulas (22) and (24), where c1 and c3 are the two parameters of Formula (22), and c2 and c4 are the two parameters of Formula (24). Then, the normal vectors and offsets of the two decision hyperplanes are obtained with (21) and (23), yielding the classifier models.
Step 4: For each classifier model trained in Step 3, the category of a new sample is predicted by Formula (25); all predicted categories are recorded, and the sample is assigned to the category with the most votes.
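To make the pairwise training-and-voting pipeline of Algorithm 1 concrete, the following is a minimal Python sketch of the one-versus-one scheme. Formulas (21)–(25) are not reproduced in this outline, so the per-pair solver below is a generic regularized least-squares stand-in (`fit_pair`, our naming), not the authors' LSBAENSVM solver; only the pairing, training, and majority-vote prediction structure follows the algorithm.

```python
import numpy as np
from itertools import combinations

def fit_pair(X, y_signed, reg=1e-3):
    # Stand-in for Steps 2-3: a single regularized least-squares separator.
    # In the paper, Formulas (21)-(24) with hyperparameters c1-c4 would yield
    # two nonparallel decision hyperplanes here instead.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append bias column
    A = Xb.T @ Xb + reg * np.eye(Xb.shape[1])       # normal equations
    return np.linalg.solve(A, Xb.T @ y_signed)      # [normal vector; offset]

def train_one_vs_one(X, y):
    # Step 1 + Step 3: one binary task per unordered pair of classes.
    models = {}
    for a, b in combinations(np.unique(y), 2):
        mask = (y == a) | (y == b)
        y_signed = np.where(y[mask] == a, 1.0, -1.0)
        models[(a, b)] = fit_pair(X[mask], y_signed)
    return models

def predict_by_voting(models, X_new):
    # Step 4: every pairwise model votes; the majority class wins.
    classes = sorted({c for pair in models for c in pair})
    idx = {c: i for i, c in enumerate(classes)}
    votes = np.zeros((X_new.shape[0], len(classes)), dtype=int)
    Xb = np.hstack([X_new, np.ones((X_new.shape[0], 1))])
    for (a, b), w in models.items():
        score = Xb @ w                              # signed decision value
        votes[score >= 0, idx[a]] += 1
        votes[score < 0, idx[b]] += 1
    return np.array(classes)[votes.argmax(axis=1)]
```

For a dataset with k classes this scheme trains k(k − 1)/2 binary models, e.g., 120 models for the 16 Indian Pines classes.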
2.6. Accuracy Assessment
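The results in Section 3 are reported as overall accuracy (OA) and the Kappa coefficient, both computed from the confusion matrix in the standard way. A minimal sketch, with function names of our own choosing (tabulated values correspond to these quantities multiplied by 100):

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    # Rows index the reference class, columns the predicted class.
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

def overall_accuracy(cm):
    # OA: fraction of samples on the diagonal (correctly classified).
    return np.trace(cm) / cm.sum()

def kappa(cm):
    # Cohen's Kappa: observed agreement corrected for chance agreement.
    n = cm.sum()
    p_o = np.trace(cm) / n                           # observed agreement (= OA)
    p_e = (cm.sum(axis=0) @ cm.sum(axis=1)) / n**2   # chance agreement
    return (p_o - p_e) / (1 - p_e)
```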
3. Results
3.1. Indian Pines Dataset
3.2. Kennedy Space Center Dataset
3.3. Pavia University Dataset
3.4. Salinas Dataset
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Class | Samples | 10% Train | 20% Train | 30% Train | 40% Train
---|---|---|---|---|---
Alfalfa | 54 | 5 | 11 | 16 | 22 |
Corn-notill | 1434 | 143 | 287 | 430 | 574 |
Corn-mintill | 834 | 83 | 167 | 250 | 334 |
Corn | 234 | 23 | 47 | 70 | 94 |
Grass-pasture | 497 | 50 | 99 | 149 | 199 |
Grass-trees | 747 | 75 | 149 | 224 | 299 |
Grass-pasture-mowed | 26 | 3 | 5 | 8 | 11 |
Hay-windrowed | 489 | 49 | 98 | 147 | 196 |
Oats | 20 | 2 | 4 | 6 | 8 |
Soybean-notill | 968 | 97 | 194 | 290 | 388 |
Soybean-mintill | 2468 | 247 | 494 | 740 | 988 |
Soybean-clean | 614 | 61 | 123 | 184 | 246 |
Wheat | 212 | 21 | 42 | 64 | 85 |
Woods | 1294 | 129 | 259 | 388 | 518 |
Buildings-Grass-Trees-Drives | 300 | 30 | 60 | 90 | 120 |
Stone-Steel-Towers | 95 | 10 | 19 | 29 | 38 |
Total | 10,286 | 1029 | 2057 | 3086 | 4120 |
Train Rate | Metric | SVM | TWSVM | BAENSVM | LSSVM | LSTWSVM | LSBAENSVM
---|---|---|---|---|---|---|---
10% | OA | 82.42 | 82.79 | 82.84 | 84.08 | 83.49 | 84.18
10% | Kappa | 81.16 | 81.50 | 81.70 | 82.88 | 82.69 | 82.95
20% | OA | 87.10 | 87.02 | 87.61 | 88.63 | 88.21 | 88.78
20% | Kappa | 86.06 | 86.07 | 86.70 | 87.44 | 87.38 | 87.78
30% | OA | 89.41 | 89.43 | 89.79 | 90.24 | 90.09 | 90.41
30% | Kappa | 88.48 | 88.68 | 88.96 | 89.43 | 89.41 | 89.50
40% | OA | 90.15 | 90.02 | 90.57 | 91.15 | 90.65 | 91.26
40% | Kappa | 89.28 | 89.39 | 89.76 | 90.49 | 89.89 | 90.57
Class | Samples | 10% Train | 20% Train | 30% Train | 40% Train
---|---|---|---|---|---
Scrub | 761 | 77 | 153 | 229 | 305 |
Willow swamp | 243 | 25 | 49 | 73 | 98 |
CP hammock | 256 | 26 | 52 | 77 | 103 |
Slash pine | 252 | 26 | 51 | 76 | 102 |
Oak/Broadleaf | 161 | 17 | 33 | 49 | 65 |
Hardwood | 229 | 23 | 46 | 69 | 92 |
Swamp | 105 | 11 | 21 | 32 | 42 |
Graminoid marsh | 431 | 44 | 87 | 130 | 173 |
Spartina marsh | 520 | 52 | 104 | 156 | 208 |
Cattail marsh | 404 | 41 | 81 | 122 | 162 |
Salt marsh | 419 | 42 | 84 | 126 | 168 |
Mud flats | 503 | 51 | 101 | 151 | 202 |
Water | 527 | 53 | 106 | 159 | 211 |
Total | 4811 | 488 | 968 | 1449 | 1931 |
Train Rate | Metric | SVM | TWSVM | BAENSVM | LSSVM | LSTWSVM | LSBAENSVM
---|---|---|---|---|---|---|---
10% | OA | 91.84 | 91.19 | 92.15 | 92.63 | 91.54 | 92.86
10% | Kappa | 90.93 | 90.73 | 91.36 | 92.23 | 91.03 | 92.40
20% | OA | 93.28 | 92.59 | 93.45 | 93.82 | 92.64 | 94.03
20% | Kappa | 92.75 | 92.23 | 92.95 | 93.31 | 92.29 | 93.46
30% | OA | 94.08 | 93.70 | 94.30 | 94.58 | 93.97 | 94.76
30% | Kappa | 93.50 | 93.51 | 93.90 | 94.02 | 93.45 | 94.24
40% | OA | 94.59 | 94.31 | 94.82 | 94.91 | 94.28 | 95.17
40% | Kappa | 94.02 | 94.03 | 94.49 | 94.42 | 93.96 | 94.74
Class | Samples | Train Dataset 1 | Train Dataset 2 | Train Dataset 3 | Train Dataset 4 |
---|---|---|---|---|---
Asphalt | 6631 | 200 | 300 | 400 | 500 |
Meadows | 18,649 | 200 | 300 | 400 | 500 |
Gravel | 2099 | 200 | 300 | 400 | 500 |
Trees | 3064 | 200 | 300 | 400 | 500 |
Painted metal sheets | 1345 | 200 | 300 | 400 | 500 |
Bare Soil | 5029 | 200 | 300 | 400 | 500 |
Bitumen | 1330 | 200 | 300 | 400 | 500 |
Self-Blocking Bricks | 3682 | 200 | 300 | 400 | 500 |
Shadows | 947 | 200 | 300 | 400 | 500 |
Total | 42,776 | 1800 | 2700 | 3600 | 4500 |
Train Samples per Class | Metric | SVM | TWSVM | BAENSVM | LSSVM | LSTWSVM | LSBAENSVM
---|---|---|---|---|---|---|---
200 | OA | 91.35 | 91.68 | 91.87 | 91.82 | 91.46 | 92.50
200 | Kappa | 88.50 | 89.35 | 89.34 | 89.40 | 88.88 | 90.20
300 | OA | 91.75 | 91.95 | 92.36 | 92.20 | 92.14 | 92.81
300 | Kappa | 88.94 | 88.58 | 89.87 | 89.88 | 89.64 | 90.55
400 | OA | 92.57 | 91.89 | 92.73 | 92.60 | 92.71 | 93.12
400 | Kappa | 89.93 | 89.29 | 90.27 | 90.20 | 90.41 | 90.78
500 | OA | 92.83 | 92.32 | 93.13 | 92.89 | 93.07 | 93.42
500 | Kappa | 90.18 | 89.74 | 90.72 | 90.49 | 90.74 | 91.07
Class | Samples | Train Dataset 1 | Train Dataset 2 | Train Dataset 3 | Train Dataset 4 |
---|---|---|---|---|---
Brocoli_green_weeds_1 | 2009 | 200 | 300 | 400 | 500 |
Brocoli_green_weeds_2 | 3726 | 200 | 300 | 400 | 500 |
Fallow | 1976 | 200 | 300 | 400 | 500 |
Fallow_rough_plow | 1394 | 200 | 300 | 400 | 500 |
Fallow_smooth | 2678 | 200 | 300 | 400 | 500 |
Stubble | 3959 | 200 | 300 | 400 | 500 |
Celery | 3579 | 200 | 300 | 400 | 500 |
Grapes_untrained | 11,271 | 200 | 300 | 400 | 500 |
Soil_vinyard_develop | 6203 | 200 | 300 | 400 | 500 |
Corn_senesced_green_weeds | 3278 | 200 | 300 | 400 | 500 |
Lettuce_romaine_4wk | 1068 | 200 | 300 | 400 | 500 |
Lettuce_romaine_5wk | 1927 | 200 | 300 | 400 | 500 |
Lettuce_romaine_6wk | 916 | 200 | 300 | 400 | 500 |
Lettuce_romaine_7wk | 1070 | 200 | 300 | 400 | 500 |
Vinyard_untrained | 7268 | 200 | 300 | 400 | 500 |
Vinyard_vertical_trellis | 1807 | 200 | 300 | 400 | 500 |
Total | 54,129 | 1800 | 2700 | 3600 | 4500 |
Train Samples per Class | Metric | SVM | TWSVM | BAENSVM | LSSVM | LSTWSVM | LSBAENSVM
---|---|---|---|---|---|---|---
200 | OA | 91.13 | 91.08 | 91.37 | 91.34 | 91.01 | 91.69
200 | Kappa | 90.19 | 90.12 | 90.39 | 90.35 | 89.97 | 90.75
300 | OA | 92.01 | 91.98 | 92.40 | 92.22 | 92.13 | 92.57
300 | Kappa | 91.14 | 91.06 | 91.51 | 91.30 | 91.29 | 91.69
400 | OA | 92.43 | 92.30 | 92.55 | 92.58 | 92.25 | 92.87
400 | Kappa | 91.52 | 91.37 | 91.63 | 91.71 | 91.32 | 92.01
500 | OA | 92.50 | 92.44 | 92.64 | 92.72 | 92.21 | 92.92
500 | Kappa | 91.57 | 91.52 | 91.67 | 91.77 | 91.25 | 92.03
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).