SFS-AGGL: Semi-Supervised Feature Selection Integrating Adaptive Graph with Global and Local Information
Abstract
1. Introduction
2. Related Work
2.1. Notations
2.2. Sparse Representation
2.3. Constructing Graph Methods
2.4. Label Propagation Algorithm
2.5. The Graph-Based Semi-Supervised Sparse Feature Selection
3. The Proposed Method
3.1. Methodology Model
3.1.1. SFS Model
3.1.2. Global and Local Adaptive Graph Learning (AGGL) Model
3.1.3. Objective Function
3.2. Model Optimization
3.3. Algorithm Description
Algorithm 1: SFS-AGGL
Input: sample matrix X; label matrix Y; parameters α, β, θ, λ
Output: feature projection matrix W; predictive labeling matrix F; similarity matrix S
1: Initialization: set the initial non-negative matrices W0 and S0;
2: Compute the matrices U and E according to Equations (12) and (19), and compute D and H from S0 and W0;
3: Repeat
4: Update W according to Equation (27);
5: Update F according to Equation (33);
6: Update S according to Equation (39);
7: Update the matrices D and H according to Siter and Witer;
8: Update iter ← iter + 1;
9: Until the objective function converges
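For reference, the alternating structure of Algorithm 1 can be sketched in Python as follows. The closed-form update rules of Equations (12), (19), (27), (33) and (39) are not reproduced in this extract, so the three inner updates below are simple surrogates (a ridge-regression step, one clamped label-propagation step, and a Gaussian kNN graph refresh) rather than the paper's actual rules; the convergence test on the change of W, the helper `knn_similarity`, and the argument names are likewise illustrative assumptions.

```python
import numpy as np

def knn_similarity(Z, k=5, sigma=1.0):
    """Gaussian kNN similarity graph over the rows of Z (illustrative helper)."""
    d2 = np.square(Z[:, None, :] - Z[None, :, :]).sum(-1)   # pairwise squared distances
    S = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(S, 0.0)
    drop = np.argsort(-S, axis=1)[:, k:]                     # everything except the k nearest
    np.put_along_axis(S, drop, 0.0, axis=1)
    return (S + S.T) / 2.0                                   # symmetrize

def sfs_aggl_sketch(X, Y_l, n_labeled, alpha=0.99, lam=0.01, k=5, max_iter=50, tol=1e-4):
    """Loop skeleton of Algorithm 1.

    X   : (n, d) sample matrix with the labeled samples first.
    Y_l : (n_labeled, c) one-hot label matrix.
    The three inner updates are surrogates, NOT Equations (27), (33) and (39).
    """
    n, d = X.shape
    c = Y_l.shape[1]

    # Step 1: initialize W0, the label estimate F and the graph S0.
    rng = np.random.default_rng(0)
    W = np.abs(rng.standard_normal((d, c)))
    F = np.zeros((n, c))
    F[:n_labeled] = Y_l
    S = knn_similarity(X, k=k)
    # Step 2 (matrices U, E, D, H from Equations (12) and (19)) is omitted here.

    for _ in range(max_iter):
        W_old = W
        # Surrogate for Equation (27): ridge regression of F onto X.
        W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ F)
        # Surrogate for Equation (33): one label-propagation step, labeled rows clamped.
        P = S / np.maximum(S.sum(axis=1, keepdims=True), 1e-12)
        F = alpha * (P @ F) + (1.0 - alpha) * F
        F[:n_labeled] = Y_l
        # Surrogate for Equation (39): refresh the graph in the projected space X W.
        S = knn_similarity(X @ W, k=k)
        # Steps 8-9: stop once W has stabilized.
        if np.linalg.norm(W - W_old) / max(np.linalg.norm(W_old), 1e-12) < tol:
            break
    return W, F, S

# Features are then ranked by the row norms of W, e.g.
#   scores = np.linalg.norm(W, axis=1); top_m = np.argsort(-scores)[:m]
```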
3.4. Computational Complexity and Convergence Analysis
3.4.1. Computational Complexity Analysis
3.4.2. Proof of Convergence
4. Experiment and Analysis
4.1. Description of the Comparison Methods
4.2. Classification Experiments
4.2.1. Classification Datasets
4.2.2. Evaluation Metric
4.2.3. Experimental Setup for Classification Task
4.2.4. Analysis of Classification Results
4.3. Clustering Experiments
4.3.1. Clustering Datasets
4.3.2. Evaluation Metrics
4.3.3. Experimental Setup for Clustering
4.3.4. Analysis of Clustering Results
4.4. Convergence and Runtime Analysis
5. Conclusions and Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Wen, J.; Yang, S.; Wang, C.D.; Jiang, Y.; Li, R. Feature-splitting Algorithms for Ultrahigh Dimensional Quantile Regression. J. Econom. 2023, 2023, 105426.
- Lue, X.; Long, L.; Deng, R.; Meng, R. Image feature extraction based on fuzzy restricted Boltzmann machine. Measurement 2022, 204, 112063.
- Sheikhpour, R.; Sarram, M.A.; Gharaghani, S.; Chahooki, M.A.Z. A survey on semi-supervised feature selection methods. Pattern Recognit. 2017, 64, 141–158.
- Mafarja, M.; Qasem, A.; Heidari, A.A.; Aljarah, I.; Faris, H.; Mirjalili, S. Efficient hybrid nature-inspired binary optimizers for feature selection. Cogn. Comput. 2020, 12, 150–175.
- Huang, G.Y.; Hung, C.Y.; Chen, B.W. Image feature selection based on orthogonal ℓ2,0 norms. Measurement 2022, 199, 111310.
- Cai, J.; Luo, J.; Wang, S.; Yang, S. Feature selection in machine learning: A new perspective. Neurocomputing 2018, 300, 70–79.
- Solorio-Fernández, S.; Carrasco-Ochoa, J.A.; Martínez-Trinidad, J.F. A systematic evaluation of filter Unsupervised Feature Selection methods. Expert Syst. Appl. 2020, 162, 113745.
- Bhadra, T.; Bandyopadhyay, S. Supervised feature selection using integration of densest subgraph finding with floating forward–backward search. Inf. Sci. 2021, 566, 1–18.
- Mann, G.S.; McCallum, A. Generalized Expectation Criteria for Semi-Supervised Learning with Weakly Labeled Data. J. Mach. Learn. Res. 2010, 11, 955–984.
- Hou, C.; Nie, F.; Li, X.; Yi, D.; Wu, Y. Joint embedding learning and sparse regression: A framework for unsupervised feature selection. IEEE Trans. Cybern. 2013, 44, 793–804.
- Wang, L.; Jiang, S.; Jiang, S. A feature selection method via analysis of relevance, redundancy, and interaction. Expert Syst. Appl. 2021, 183, 115365.
- Dokeroglu, T.; Deniz, A.; Kiziloz, H.E. A comprehensive survey on recent metaheuristics for feature selection. Neurocomputing 2022, 494, 2966.
- Nie, F.; Zhu, W.; Li, X. Structured graph optimization for unsupervised feature selection. IEEE Trans. Knowl. Data Eng. 2019, 33, 1210–1222.
- Zhao, Z.; Liu, H. Semi-supervised feature selection via spectral analysis. In Proceedings of the 2007 SIAM International Conference on Data Mining; Society for Industrial and Applied Mathematics, Minneapolis, MN, USA, 26–28 April 2007; pp. 641–646.
- Toğaçar, M.; Ergen, B.; Cömert, Z. Classification of flower species by using features extracted from the intersection of feature selection methods in convolutional neural network models. Measurement 2020, 158, 107703.
- Chen, X.; Song, L.; Hou, Y.; Shao, G. Efficient semi-supervised feature selection for VHR remote sensing images. In Proceedings of the 2016 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Beijing, China, 10–15 July 2016; pp. 1500–1503.
- Peng, S.; Lu, J.; Cao, J.; Peng, Q.; Yang, Z. Adaptive graph regularization method based on least square regression for clustering. Signal Process. Image Commun. 2023, 114, 116938.
- Chang, X.; Nie, F.; Yang, Y.; Huang, H. A convex formulation for semi-supervised multi-label feature selection. In Proceedings of the AAAI Conference on Artificial Intelligence, Québec City, QC, Canada, 27–31 July 2014; Volume 28.
- Chen, X.; Yuan, G.; Nie, F.; Huang, J.Z. Semi-supervised feature selection via rescaled linear regression. In Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, Melbourne, Australia, 19–25 August 2017; pp. 1525–1531.
- Chen, X.; Chen, R.; Wu, Q.; Nie, F.; Yang, M.; Mao, R. Semi-supervised feature selection via structured manifold learning. IEEE Trans. Cybern. 2021, 52, 5756–5766.
- Liu, Z.; Lai, Z.; Ou, W.; Zhang, K.; Zheng, R. Structured optimal graph based sparse feature extraction for semi-supervised learning. Signal Process. 2020, 170, 107456.
- Akbar, S.; Hayat, M.; Tahir, M.; Chong, K.T. cACP-2LFS: Classification of anticancer peptides using sequential discriminative model of KSAAP and two-level feature selection approach. IEEE Access 2020, 8, 131939–131948.
- Bakir-Gungor, B.; Hacilar, H.; Jabeer, A.; Nalbantoglu, O.U.; Aran, O.; Yousef, M. Inflammatory bowel disease biomarkers of human gut microbiota selected via ensemble feature selection methods. PeerJ 2022, 10, e13205.
- Ahmed, N.; Rafiq, J.I.; Islam, M.R. Enhanced human activity recognition based on smartphone sensor data using hybrid feature selection model. Sensors 2020, 20, 317.
- López, D.; Ramírez-Gallego, S.; García, S.; Xiong, N.; Herrera, F. BELIEF: A distance-based redundancy-proof feature selection method for Big Data. Inf. Sci. 2021, 558, 124–139.
- Chen, X.; Yuan, G.; Wang, W.; Nie, F.; Chang, X.; Huang, J.Z. Local adaptive projection framework for feature selection of labeled and unlabeled data. IEEE Trans. Neural Netw. Learn. Syst. 2019, 29, 6362–6373.
- Cheng, B.; Yang, J.; Yan, S.; Fu, Y.; Huang, T.S. Learning with l1-graph for image analysis. IEEE Trans. Image Process. 2009, 19, 858–866.
- Liu, G.; Lin, Z.; Yan, S.; Sun, J.; Yu, Y.; Ma, Y. Robust recovery of subspace structures by low-rank representation. IEEE Trans. Pattern Anal. Mach. Intell. 2012, 35, 171–184.
- Singh, R.P.; Ojha, D.; Jadon, K.S. A Survey on Various Representation Learning of Hypergraph for Unsupervised Feature Selection. In Data, Engineering and Applications: Select Proceedings of IDEA 2021; Springer: Berlin/Heidelberg, Germany, 2022; pp. 71–82.
- Elhamifar, E.; Vidal, R. Sparse subspace clustering: Algorithm, theory, and applications. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 2765–2781.
- Zhong, G.; Pun, C.M. Subspace clustering by simultaneously feature selection and similarity learning. Knowl.-Based Syst. 2020, 193, 105512.
- Wan, Y.; Sun, S.; Zeng, C. Adaptive similarity embedding for unsupervised multi-view feature selection. IEEE Trans. Knowl. Data Eng. 2020, 33, 3338–3350.
- Shang, R.; Song, J.; Jiao, L.; Li, Y. Double feature selection algorithm based on low-rank sparse non-negative matrix factorization. Int. J. Mach. Learn. Cybern. 2020, 11, 1891–1908.
- Zhu, J.; Jang-Jaccard, J.; Liu, T.; Zhou, J. Joint spectral clustering based on optimal graph and feature selection. Neural Process. Lett. 2021, 53, 257–273.
- Sha, Y.; Faber, J.; Gou, S.; Liu, B.; Li, W.; Schramm, S.; Stoecker, H.; Steckenreiter, T.; Vnucec, D.; Wetzstein, N.; et al. An acoustic signal cavitation detection framework based on XGBoost with adaptive selection feature engineering. Measurement 2022, 192, 110897.
- Zhu, P.; Hou, X.; Tang, K.; Liu, Y.; Zhao, Y.P.; Wang, Z. Unsupervised feature selection through combining graph learning and ℓ2,0-norm constraint. Inf. Sci. 2023, 622, 68–82.
- Mei, S.; Zhao, W.; Gao, Q.; Yang, M.; Gao, X. Joint feature selection and optimal bipartite graph learning for subspace clustering. Neural Netw. 2023, 164, 408–418.
- Zhou, P.; Du, L.; Li, X.; Shen, Y.D.; Qian, Y. Unsupervised feature selection with adaptive multiple graph learning. Pattern Recognit. 2020, 105, 107375.
- Bai, X.; Zhu, L.; Liang, C.; Li, J.; Nie, X.; Chang, X. Multi-view feature selection via nonnegative structured graph learning. Neurocomputing 2020, 387, 110–122.
- Zhou, P.; Chen, J.; Du, L.; Li, X. Balanced spectral feature selection. IEEE Trans. Cybern. 2022, 53, 4232–4244.
- Miao, J.; Yang, T.; Sun, L.; Fei, X.; Niu, L.; Shi, Y. Graph regularized locally linear embedding for unsupervised feature selection. Pattern Recognit. 2022, 122, 108299.
- Xie, G.B.; Chen, R.B.; Lin, Z.Y.; Gu, G.S.; Yu, J.R.; Liu, Z.; Cui, J.; Lin, L.; Chen, L. Predicting lncRNA–disease associations based on combining selective similarity matrix fusion and bidirectional linear neighborhood label propagation. Brief. Bioinform. 2023, 24, bbac595.
- Sheikhpour, R.; Sarram, M.A.; Gharaghani, S.; Chahooki, M.A.Z. A robust graph-based semi-supervised sparse feature selection method. Inf. Sci. 2020, 531, 13–30.
- Li, Z.; Tang, J. Semi-supervised local feature selection for data classification. Sci. China Inf. Sci. 2021, 64, 192108.
- Jiang, B.; Wu, X.; Zhou, X.; Liu, Y.; Cohn, A.G.; Sheng, W.; Chen, H. Semi-supervised multiview feature selection with adaptive graph learning. IEEE Trans. Neural Netw. Learn. Syst. 2022, 1–15.
- Shang, R.; Zhang, X.; Feng, J.; Li, Y.; Jiao, L. Sparse and low-dimensional representation with maximum entropy adaptive graph for feature selection. Neurocomputing 2022, 485, 57–73.
- Lai, J.; Chen, H.; Li, T.; Yang, X. Adaptive graph learning for semi-supervised feature selection with redundancy minimization. Inf. Sci. 2022, 609, 465–488.
- Lai, J.; Chen, H.; Li, W.; Li, T.; Wan, J. Semi-supervised feature selection via adaptive structure learning and constrained graph learning. Knowl.-Based Syst. 2022, 251, 109243.
- Luo, T.; Hou, C.; Nie, F.; Tao, H.; Yi, D. Semi-supervised feature selection via insensitive sparse regression with application to video semantic recognition. IEEE Trans. Knowl. Data Eng. 2018, 30, 1943–1956.
- Zhu, R.; Dornaika, F.; Ruichek, Y. Learning a discriminant graph-based embedding with feature selection for image categorization. Neural Netw. 2019, 111, 35–46.
- Favati, P.; Lotti, G.; Menchi, O.; Romani, F. Construction of the similarity matrix for the spectral clustering method: Numerical experiments. J. Comput. Appl. Math. 2020, 375, 112795.
- Qu, J.; Zhao, X.; Xiao, Y.; Chang, X.; Li, Z.; Wang, X. Adaptive Manifold Graph representation for Two-Dimensional Discriminant Projection. Knowl.-Based Syst. 2023, 266, 110411.
- Ma, Z.; Wang, J.; Li, H.; Huang, Y. Adaptive graph regularized non-negative matrix factorization with self-weighted learning for data clustering. Appl. Intell. 2023, 53, 28054–28073.
- Yang, S.; Wen, J.; Zhan, X.; Kifer, D. ET-lasso: A new efficient tuning of lasso-type regularization for high-dimensional data. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Anchorage, AK, USA, 4–8 August 2019; pp. 607–616.
- Huang, S.; Xu, Z.; Wang, F. Nonnegative matrix factorization with adaptive neighbors. In Proceedings of the 2017 International Joint Conference on Neural Networks (IJCNN), Anchorage, AK, USA, 14–19 May 2017; pp. 486–493.
- Zhou, W.; Wu, C.; Yi, Y.; Luo, G. Structure preserving non-negative feature self-representation for unsupervised feature selection. IEEE Access 2017, 5, 8792–8803.
- Shang, R.; Zhang, W.; Lu, M.; Jiao, L.; Li, Y. Feature selection based on non-negative spectral feature learning and adaptive rank constraint. Knowl.-Based Syst. 2022, 236, 107749.
- Martinez, A.; Benavente, R. The AR Face Database: CVC Technical Report; Computer Vision Center: Barcelona, Spain, 1998; Volume 24.
- Sim, T.; Baker, S.; Bsat, M. The CMU pose, illumination, and expression (PIE) database. In Proceedings of the Fifth IEEE International Conference on Automatic Face Gesture Recognition, Washington, DC, USA, 20–21 May 2002; pp. 53–58.
- Zhang, L.; Zhang, L.; Zhang, D.; Zhu, H. Online finger-knuckle-print verification for personal authentication. Pattern Recognit. 2010, 43, 2560–2571.
- Samaria, F.S.; Harter, A.C. Parameterisation of a stochastic model for human face identification. In Proceedings of the 1994 IEEE Workshop on Applications of Computer Vision, Seattle, WA, USA, 21–23 June 1994; pp. 138–142.
- Nene, S.A.; Nayar, S.K.; Murase, H. Columbia Object Image Library (COIL-20); Columbia University: New York, NY, USA, 1996.
- Yi, Y.; Lai, S.; Li, S.; Dai, J.; Wang, W.; Wang, J. RRNMF-MAGL: Robust regularization non-negative matrix factorization with multi-constraint adaptive graph learning for dimensionality reduction. Inf. Sci. 2023, 640, 119029.
- Blake, C.L.; Merz, C.J. UCI Repository of Machine Learning Databases; Department of Information and Computer Science, University of California: Irvine, CA, USA, 1998; p. 55.
- Li, Z.; Tang, C.; Zheng, X.; Liu, X.; Zhang, W.; Zhu, E. High-order correlation preserved incomplete multi-view subspace clustering. IEEE Trans. Image Process. 2022, 31, 2067–2080.
| Notation | Description | Notation | Description |
|---|---|---|---|
| | Sample matrix | | Zero matrix |
| | Labeled sample matrix | | Sample dimension |
| | Unlabeled sample matrix | | Sample size |
| | Label matrix | | Number of selected features |
| | Predictive labeling matrix | | Number of categories |
| | Weighting matrix | | Number of labeled samples |
| | Local adaptation matrix | | Infinitely large number |
| | Weighting matrix | | Matrix dot product |
| | Unit matrix | | Trace of a matrix |
| Matrix | Formula | Time Complexity |
|---|---|---|
| W | | |
| F | | |
| S | | |

| Method | Number of Variables | Algorithm Complexity |
|---|---|---|
| RLSR [19] | 2 | |
| FDEFS [50] | 3 | |
| GS3FS [43] | 4 | |
| S2LFS [44] | 3 | |
| AGLRM [47] | 4 | |
| ASLCGLFS [48] | 4 | |
| SFS-AGGL | 3 | |
Dataset | Image Size | Number of Classes | Samples per Class | P1 | P2 | Type
---|---|---|---|---|---|---|
AR | 32 × 32 | 100 | 14 | 7 | 7 | Face |
CMU PIE | 32 × 32 | 68 | 24 | 12 | 12 | Face |
Extended YaleB | 32 × 32 | 38 | 64 | 20 | 44 | Face |
ORL | 32 × 32 | 40 | 10 | 7 | 3 | Face |
COIL20 | 32 × 32 | 20 | 72 | 20 | 52 | Object |
Dataset | {d, t, α, β, θ, λ} |
---|---|
AR | {400, 200, 1, 0.01, 0.01, 0.001} |
CMU PIE | {200, 200, 10, 0.1, 0.001, 0.001} |
Extended YaleB | {300, 100, 10, 0.1, 10, 0.001} |
ORL | {500, 100, 1000, 0.1, 0.001, 0.001} |
COIL20 | {150, 100, 0.1, 0.1, 10, 1} |
Method | AR | CMU PIE | Extended YaleB | ORL | COIL20 |
---|---|---|---|---|---|
NNSAFS | 63.90 ± 2.12 (400) | 85.29 ± 0.64 (500) | 62.58 ± 1.39 (300) | 92.17 ± 1.81 (500) | 93.56 ± 1.32 (100) |
SPNFSR | 64.50 ± 0.91 (200) | 86.22 ± 1.02 (300) | 64.02 ± 1.95 (300) | 92.83 ± 1.81 (500) | 94.21 ± 1.42 (400) |
RLSR | 64.37 ± 1.58 (500) | 84.66 ± 1.25 (500) | 64.57 ± 0.87 (300) | 95.67 ± 1.75 (500) | 93.73 ± 1.25 (500) |
FDEFS | 63.51 ± 1.29 (500) | 85.85 ± 1.06 (500) | 65.01 ± 1.00 (500) | 96.25 ± 1.37 (500) | 94.35 ± 1.42 (450) |
GS3FS | 63.90 ± 1.37 (450) | 85.83 ± 0.68 (500) | 61.85 ± 1.18 (500) | 96.25 ± 1.48 (450) | 93.38 ± 1.35 (500) |
S2LFS | 64.20 ± 1.48 (500) | 87.50 ± 0.85 (500) | 64.67 ± 0.82 (500) | 96.42 ± 1.62 (400) | 94.95 ± 1.18 (500) |
AGLRM | 64.39 ± 1.58 (450) | 86.90 ± 0.72 (450) | 61.89 ± 1.13 (500) | 96.08 ± 1.42 (500) | 95.09 ± 1.48 (200) |
ASLCGLFS | 67.07 ± 1.62 (250) | 87.71 ± 1.12 (150) | 64.36 ± 1.19 (400) | 96.25 ± 1.37 (500) | 95.31 ± 1.13 (100) |
SFS-AGGL | 68.03 ± 1.58 (400) | 88.97 ± 1.11 (200) | 66.35 ± 1.22 (300) | 96.42 ± 1.31 (500) | 95.80 ± 1.16 (500)
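Each entry in the table above is the mean classification accuracy ± standard deviation, with the number of selected features at which it was obtained in parentheses. The sketch below illustrates one way such an evaluation is commonly carried out, assuming features are ranked by the ℓ2 row norms of the learned projection matrix W and scored with a 1-nearest-neighbour classifier; the classifier choice and the function/variable names are assumptions rather than the paper's exact protocol.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def accuracy_with_selected_features(W, X_train, y_train, X_test, y_test, num_features):
    """Keep the `num_features` features with the largest l2 row norms of W
    and report 1-NN classification accuracy on the test split."""
    scores = np.linalg.norm(W, axis=1)          # one relevance score per original feature
    keep = np.argsort(-scores)[:num_features]   # indices of the top-ranked features
    clf = KNeighborsClassifier(n_neighbors=1)
    clf.fit(X_train[:, keep], y_train)
    return clf.score(X_test[:, keep], y_test)

# Repeating this over random splits and feature counts (e.g. 50, 100, ..., 500)
# yields mean ± std accuracies and the best-performing feature count, as in the table.
```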
Method | AR | CMU PIE | Extended YaleB | ORL | COIL20 |
---|---|---|---|---|---|
RLSR vs. SFS-AGGL | 3.14 × 10−5 | 4.40 × 10−8 | 6.97 × 10−4 | 7.03 × 10−1 | 6.24 × 10−4 |
FDEFS vs. SFS-AGGL | 7.82 × 10−7 | 1.03 × 10−6 | 0.74 × 10−2 | 9.44 × 10−1 | 1.12 × 10−2 |
GS3FS vs. SFS-AGGL | 3.36 × 10−6 | 6.90 × 10−8 | 5.95 × 10−8 | 9.36 × 10−1 | 2.14 × 10−4 |
S2LFS vs. SFS-AGGL | 1.29 × 10−5 | 1.10 × 10−3 | 9.62 × 10−4 | 9.53 × 10−1 | 6.23 × 10−2 |
AGLRM vs. SFS-AGGL | 1.55 × 10−5 | 1.96 × 10−5 | 5.10 × 10−8 | 8.84 × 10−1 | 1.23 × 10−1 |
ASLCGLFS vs. SFS-AGGL | 9.87 × 10−2 | 7.50 × 10−3 | 8.17 × 10−4 | 9.44 × 10−1 | 1.76 × 10−1 |
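The table above reports p-values from pairwise significance tests between each competing method and SFS-AGGL over repeated runs. The specific test is not restated in this extract; the snippet below illustrates the idea with a paired t-test (scipy.stats.ttest_rel) on hypothetical per-split accuracies.

```python
import numpy as np
from scipy import stats

# Accuracies of a baseline and of SFS-AGGL over the same repeated splits
# (hypothetical values, for illustration only).
acc_baseline = np.array([64.4, 63.9, 64.8, 64.1, 64.5])
acc_sfs_aggl = np.array([68.1, 67.6, 68.3, 67.9, 68.2])

t_stat, p_value = stats.ttest_rel(acc_baseline, acc_sfs_aggl)   # paired t-test
print(f"p-value = {p_value:.2e}")   # p < 0.05 suggests a statistically significant difference
```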
Dataset | Number of Samples | Dimension | Number of Classes
---|---|---|---|
ORL | 400 | 1024 | 40 |
COIL20 | 1440 | 1024 | 20 |
Libras Movement | 360 | 89 | 15 |
Landsat | 296 | 36 | 6 |
Method | ACC | NMI | Purity | ARI | F-Score | Precision | Recall |
---|---|---|---|---|---|---|---|
RLSR | 62.79 ± 2.89 (500) | 81.04 ± 1.83 (500) | 66.93 ± 2.19 (500) | 49.88 ± 3.78 (100) | 51.13 ± 3.64 (100) | 44.28 ± 4.33 (100) | 60.75 ± 2.65 (500) |
FDEFS | 62.82 ± 3.69 (200) | 81.27 ± 1.59 (100) | 67.25 ± 3.08 (100) | 50.13 ± 3.71 (100) | 51.37 ± 3.60 (100) | 44.55 ± 3.85 (100) | 60.88 ± 3.64 (50) |
GS3FS | 62.21 ± 1.55 (50) | 80.99 ± 0.74 (50) | 66.18 ± 1.33 (50) | 49.86 ± 1.58 (50) | 51.11 ± 1.53 (50) | 44.17 ± 1.79 (50) | 60.79 ± 2.32 (150) |
S2LFS | 61.93 ± 3.35 (350) | 80.62 ± 1.45 (350) | 66.82 ± 2.37 (350) | 48.55 ± 3.61 (350) | 49.82 ± 3.51 (350) | 43.55 ± 3.60 (350) | 58.74 ± 4.44 (400) |
AGLRM | 64.21 ± 3.70 (50) | 81.84 ± 1.89 (50) | 68.00 ± 3.14 (50) | 51.16 ± 4.40 (50) | 52.36 ± 4.26 (50) | 45.80 ± 4.72 (50) | 61.29 ± 4.00 (50) |
ASLCGLFS | 58.32 ± 3.68 (250) | 78.56 ± 2.33 (250) | 63.32 ± 3.10 (250) | 44.22 ± 5.03 (250) | 45.62 ± 4.85 (250) | 39.37 ± 5.58 (300) | 54.62 ± 3.95 (300) |
SFS-AGGL | 67.96 ± 2.30 (250) | 84.17 ± 1.50 (400) | 71.89 ± 1.94 (500) | 56.89 ± 3.34 (400) | 57.95 ± 3.25 (400) | 50.69 ± 3.47 (400) | 67.71 ± 3.11 (400) |
Method | ACC | NMI | Purity | ARI | F-Score | Precision | Recall |
---|---|---|---|---|---|---|---|
RLSR | 60.45 ± 3.98 (150) | 72.19 ± 2.00 (250) | 63.27 ± 3.19 (50) | 50.84 ± 3.42 (300) | 53.42 ± 3.19 (300) | 48.67 ± 3.97 (300) | 59.38 ± 2.56 (50) |
FDEFS | 58.52 ± 3.44 (50) | 70.67 ± 2.62 (400) | 61.55 ± 3.23 (50) | 48.14 ± 4.12 (50) | 50.93 ± 3.85 (50) | 45.32 ± 4.36 (50) | 58.42 ± 3.27 (150) |
GS3FS | 59.98 ± 2.85 (150) | 72.31 ± 1.41 (250) | 63.38 ± 2.52 (150) | 50.23 ± 1.84 (250) | 52.86 ± 1.70 (250) | 47.65 ± 2.60 (250) | 59.47 ± 1.61 (250) |
S2LFS | 58.42 ± 3.90 (250) | 70.19 ± 3.15 (250) | 61.58 ± 3.61 (250) | 46.40 ± 5.23 (450) | 49.41 ± 4.74 (450) | 42.64 ± 6.35 (250) | 59.72 ± 2.47 (450) |
AGLRM | 59.85 ± 4.34 (150) | 72.17 ± 2.59 (150) | 63.17 ± 4.12 (150) | 50.03 ± 4.46 (150) | 52.71 ± 4.15 (150) | 47.16 ± 5.17 (150) | 60.05 ± 3.18 (300) |
ASLCGLFS | 60.02 ± 3.59 (50) | 71.52 ± 1.67 (50) | 62.95 ± 3.35 (50) | 50.05 ± 2.56 (50) | 52.67 ± 2.34 (50) | 48.11 ± 3.81 (100) | 58.55 ± 1.90 (50) |
SFS-AGGL | 61.88 ± 3.70 (350) | 73.30 ± 1.78 (500) | 64.67 ± 3.62 (350) | 52.37 ± 1.80 (500) | 54.85 ± 1.69 (500) | 50.36 ± 3.16 (200) | 62.16 ± 2.28 (500) |
Method | ACC | NMI | Purity | ARI | F-Score | Precision | Recall |
---|---|---|---|---|---|---|---|
RLSR | 47.50 ± 2.21 (40) | 60.07 ± 2.13 (56) | 50.00 ± 1.85 (56) | 30.04 ± 2.88 (56) | 34.82 ± 2.69 (56) | 31.38 ± 2.56 (56) | 39.22 ± 3.70 (56) |
FDEFS | 46.33 ± 3.22 (32) | 60.36 ± 2.88 (32) | 50.22 ± 2.60 (24) | 30.73 ± 3.77 (72) | 35.58 ± 3.41 (72) | 31.60 ± 3.65 (32) | 41.28 ± 3.95 (72) |
GS3FS | 46.72 ± 3.24 (80) | 60.57 ± 1.97 (56) | 50.94 ± 2.24 (80) | 31.20 ± 2.68 (56) | 36.01 ± 2.53 (56) | 31.79 ± 2.32 (80) | 41.93 ± 3.79 (56) |
S2LFS | 46.56 ± 1.92 (80) | 59.95 ± 1.12 (64) | 50.72 ± 1.34 (64) | 30.28 ± 1.80 (80) | 35.13 ± 1.82 (80) | 30.97 ± 1.17 (80) | 40.82 ± 4.28 (80) |
AGLRM | 46.00 ± 2.89 (56) | 60.35 ± 1.32 (56) | 50.72 ± 1.87 (72) | 30.80 ± 1.92 (56) | 35.62 ± 1.88 (56) | 31.42 ± 1.45 (56) | 41.33 ± 4.05 (56) |
ASLCGLFS | 46.28 ± 3.41 (40) | 59.72 ± 2.30 (40) | 50.17 ± 2.33 (40) | 29.93 ± 2.97 (40) | 34.84 ± 2.77 (40) | 30.60 ± 2.68 (40) | 41.04 ± 4.67 (80) |
SFS-AGGL | 49.22 ± 2.88 (72) | 62.33 ± 2.34 (72) | 53.11 ± 2.70 (72) | 33.04 ± 2.65 (56) | 37.75 ± 2.46 (56) | 33.57 ± 3.13 (72) | 44.42 ± 4.83 (80) |
Method | ACC | NMI | Purity | ARI | F-Score | Precision | Recall |
---|---|---|---|---|---|---|---|
RLSR | 48.30 ± 2.10 (6) | 45.97 ± 1.02 (18) | 50.59 ± 1.88 (18) | 33.94 ± 1.46 (27) | 47.53 ± 1.91 (27) | 38.53 ± 1.49 (3) | 63.72 ± 8.11 (27) |
FDEFS | 47.89 ± 2.65 (30) | 45.68 ± 1.59 (30) | 50.60 ± 2.30 (30) | 33.49 ± 1.83 (30) | 47.10 ± 1.99 (24) | 38.12 ± 1.25 (30) | 63.03 ± 6.57 (18) |
GS3FS | 49.10 ± 1.88 (15) | 46.34 ± 1.05 (30) | 51.37 ± 1.82 (21) | 34.02 ± 1.33 (21) | 47.69 ± 1.56 (21) | 38.23 ± 1.33 (30) | 64.44 ± 6.57 (21) |
S2LFS | 47.81 ± 2.63 (15) | 45.99 ± 1.27 (30) | 49.86 ± 2.53 (15) | 34.03 ± 0.82 (15) | 47.46 ± 1.24 (15) | 38.83 ± 1.60 (30) | 62.37 ± 7.02 (15) |
AGLRM | 49.06 ± 3.14 (9) | 45.83 ± 1.40 (15) | 50.97 ± 3.01 (9) | 34.03 ± 1.62 (27) | 47.23 ± 2.13 (27) | 39.39 ± 1.48 (15) | 60.78 ± 8.67 (27) |
ASLCGLFS | 48.79 ± 1.78 (30) | 46.51 ± 1.41 (18) | 50.59 ± 2.03 (24) | 34.77 ± 0.83 (21) | 48.33 ± 0.99 (21) | 38.67 ± 1.17 (30) | 65.35 ± 4.89 (21) |
SFS-AGGL | 51.02 ± 1.99 (12) | 47.21 ± 1.18 (18) | 52.81 ± 1.76 (12) | 35.49 ± 1.23 (27) | 49.04 ± 1.23 (18) | 40.17 ± 1.32 (30) | 69.26 ± 4.51 (15) |
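The clustering tables report ACC, NMI, Purity, ARI, F-score, Precision and Recall, again with the number of selected features in parentheses. The sketch below shows one standard way to compute these scores from ground-truth labels and predicted cluster assignments; deriving the pairwise Precision/Recall/F-score from the pair confusion matrix is an assumption about the definitions used, and the helper requires scikit-learn ≥ 0.24.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.metrics import normalized_mutual_info_score, adjusted_rand_score
from sklearn.metrics.cluster import contingency_matrix, pair_confusion_matrix

def clustering_scores(y_true, y_pred):
    """ACC, NMI, Purity, ARI and pairwise F-score/Precision/Recall."""
    cont = contingency_matrix(y_true, y_pred)        # classes (rows) x clusters (columns)

    # ACC: best one-to-one matching between clusters and classes (Hungarian algorithm).
    rows, cols = linear_sum_assignment(-cont)
    acc = cont[rows, cols].sum() / cont.sum()

    # Purity: each cluster is credited with its majority class.
    purity = cont.max(axis=0).sum() / cont.sum()

    nmi = normalized_mutual_info_score(y_true, y_pred)
    ari = adjusted_rand_score(y_true, y_pred)

    # Pairwise Precision/Recall/F-score from the pair confusion matrix.
    (tn, fp), (fn, tp) = pair_confusion_matrix(y_true, y_pred)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_score = 2 * precision * recall / (precision + recall)
    return {"ACC": acc, "NMI": nmi, "Purity": purity, "ARI": ari,
            "F-score": f_score, "Precision": precision, "Recall": recall}
```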
Method | AR | Extended YaleB | CMU PIE | ORL | COIL20 |
---|---|---|---|---|---|
RLSR | 28.3706 | 28.8286 | 27.0407 | 27.0545 | 23.2237 |
FDEFS | 154.3282 | 190.8570 | 210.9357 | 55.6699 | 71.4754 |
GS3FS | 41.5550 | 44.3966 | 49.7174 | 28.2427 | 27.0868 |
S2LFS | 52.7492 | 52.1703 | 54.4472 | 50.3727 | 47.9846 |
AGLRM | 11.7974 | 13.1744 | 15.9342 | 3.4292 | 4.5080 |
ASLCGLFS | 2690.2222 | 2477.4907 | 3735.1528 | 126.7054 | 340.7762 |
SFS-AGGL | 15.5277 | 13.7843 | 17.4384 | 5.5204 | 6.7449 |
SFS-AGGL (GPU) | 10.2069 | 9.3128 | 11.0843 | 3.3713 | 3.9999
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).