Beam-Influenced Attribute Selector for Producing Stable Reduct
Abstract
1. Introduction
2. Preliminaries
2.1. Attribute Reduction
- (1)
- A satisfies the constraint;
- (2)
- for any proper subset B of A, B does not satisfy the constraint.
- (1)
- Forward greedy searching. In each iteration, one or more appropriate attributes are selected based on the evaluations of the candidate attributes. After sufficiently many iterations, a satisfactory reduct can be generated.
- (2)
- Backward greedy searching. In each iteration, one or more inferior attributes are removed from the set of raw attributes based on the evaluations. After sufficiently many iterations, a justifiable reduct can also be obtained.
- (3)
- Forward-backward greedy searching. In this strategy, both the forward and backward strategies are employed to seek a reasonable reduct. For example, a potential reduct is first generated by forward searching; the redundant attributes in that potential reduct are then removed by backward searching.
Algorithm 1: Forward-backward greedy searching for attribute reduction (FBGSAR).
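The forward-backward strategy above can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: `fitness` is an assumed placeholder for any attribute-subset evaluation (e.g., approximation quality), with higher values meaning a better subset, and `threshold` stands for the constraint a reduct must satisfy.

```python
def fbgsar(attributes, fitness, threshold):
    """Hypothetical sketch of forward-backward greedy searching.

    Forward phase: grow the candidate reduct one best attribute at a
    time until the constraint is met. Backward phase: drop attributes
    whose removal still leaves the constraint satisfied.
    """
    reduct = set()
    # Forward greedy searching: add the most promising attribute per iteration.
    while fitness(reduct) < threshold and len(reduct) < len(attributes):
        best = max(set(attributes) - reduct,
                   key=lambda a: fitness(reduct | {a}))
        reduct.add(best)
    # Backward greedy searching: remove redundant attributes.
    for a in sorted(reduct):
        if fitness(reduct - {a}) >= threshold:
            reduct.discard(a)
    return reduct
```

With a toy fitness that only rewards attributes 0 and 1, the sketch returns exactly `{0, 1}` and discards the irrelevant candidates.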
2.2. Measurement of Stability
3. Beam-Influenced Selector for Attribute Reduction
3.1. Ensemble Selector for Attribute Reduction
- (1)
- A single fitness function may lead to poor adaptability. For example, a reduct generated over a single granularity may fail to qualify as a reduct over other granularities, such as the slightly finer or coarser granularity [41] generated by a slight data perturbation.
- (2)
- A single fitness function may result in poorer learning performance. For instance, samples in different classes possess distinct characteristics [31] that tend to optimize class-specific measurements. Nevertheless, revealing the differences among these characteristics with merely one fitness function is quite challenging.
- (1)
- Homogeneous fitness function set: the set of fitness functions is constructed using the same evaluation criterion. For instance, the fitness function set can be defined based on the approximation qualities of different rough set models; the generated reduct can then better adapt to different models.
- (2)
- Heterogeneous fitness function set: the set of fitness functions is built using different evaluation criteria. For example, the fitness function set can be defined based on several different measures of a rough set model, such as approximation quality and entropy; the derived reduct will then better adapt to different constraints.
Algorithm 2: Ensemble selector-based attribute reduction (ESAR).
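The core of an ensemble selector is that several fitness functions jointly decide which attribute to pick. The sketch below is a simplified illustration under the assumption that each fitness function casts one vote for its own best candidate; the function names are illustrative, not the paper's API.

```python
from collections import Counter

def ensemble_select(candidates, fitness_functions):
    """Hypothetical ensemble selector: each fitness function votes for
    its preferred candidate attribute; the attribute with the most
    votes is selected."""
    votes = Counter(max(candidates, key=f) for f in fitness_functions)
    return votes.most_common(1)[0][0]
```

For example, if two of three fitness functions rank attribute `'b'` highest, `'b'` is selected even though the third function prefers `'a'`.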
- (1)
- They rely heavily on the distribution of samples. Take the classical ensemble selector proposed by Yang and Yao [21] as an example: each fitness function is constructed from the samples with the same label. Therefore, the performance of the fitness functions degrades if the sample distribution is seriously imbalanced or there are only a few sample categories.
- (2)
- They rely heavily on the selection of appropriate features. Take the selector proposed by Jiang et al. [42] as an example: only the optimal attribute is selected by each fitness function. Therefore, some candidate attributes with potential importance are ignored, which means that some attributes with strong adaptability are difficult to identify.
3.2. Beam-Influenced Selector-Based Attribute Reduction
- (1)
- Divide the set of raw data into n groups in terms of the samples;
- (2)
- Construct different fitness functions based on the n groups of samples;
- (3)
- Evaluate each candidate attribute with the different fitness functions, and add the top-ranked attributes under each fitness function into the multiset T;
- (4)
- Select the attribute b with the maximal frequency of occurrence in the multiset T.
Algorithm 3: Beam-influenced selector-based attribute reduction (BISAR).
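Steps (1)-(4) of the beam-influenced selector can be sketched as follows. This is an illustrative reading under stated assumptions: `make_fitness` is a hypothetical factory that builds one fitness function per sample group, and `k` plays the role of the beam width (how many top attributes each fitness function contributes to T).

```python
from collections import Counter

def beam_influenced_select(samples, candidates, make_fitness, n, k):
    """Hypothetical sketch of the beam-influenced selector."""
    groups = [samples[i::n] for i in range(n)]       # step (1): n sample groups
    fitness_fns = [make_fitness(g) for g in groups]  # step (2): one fn per group
    multiset_t = Counter()
    for f in fitness_fns:                            # step (3): top-k per fn
        ranked = sorted(candidates, key=f, reverse=True)
        multiset_t.update(ranked[:k])
    # step (4): the attribute occurring most often in the multiset T
    return multiset_t.most_common(1)[0][0]
```

Because every fitness function contributes k attributes rather than one, an attribute that is merely second-best under several groups can still win the frequency vote, which is exactly how the beam mitigates the single-best-attribute drawback noted above.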
4. Experimental Analysis
4.1. Data Sets and Configuration
4.2. Experimental Setup
- (1)
- Feature noise. Given the raw data and a noise ratio, the injection is realized by randomly selecting the corresponding proportion of features and replacing the values of these features with random numbers in [0, 1].
- (2)
- Label noise. Given the raw data and a noise ratio, the injection is realized by randomly selecting the corresponding proportion of samples and replacing the labels of these samples randomly.
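The two injection schemes above can be sketched as follows. This is a minimal sketch under the assumptions stated in the text (feature values scaled to [0, 1]); the function names, the seeded generator, and the row-list data layout are illustrative choices, not the paper's implementation.

```python
import random

def inject_feature_noise(data, ratio, rng=random.Random(0)):
    """Replace a randomly chosen fraction of feature columns with
    uniform random values in [0, 1]."""
    n_features = len(data[0])
    noisy_cols = set(rng.sample(range(n_features), int(ratio * n_features)))
    return [[rng.random() if j in noisy_cols else v
             for j, v in enumerate(row)] for row in data]

def inject_label_noise(labels, ratio, classes, rng=random.Random(0)):
    """Randomly relabel a fraction of the samples."""
    idx = rng.sample(range(len(labels)), int(ratio * len(labels)))
    out = list(labels)
    for i in idx:
        out[i] = rng.choice(classes)  # a relabeled sample may keep its label
    return out
```

Note that label noise may leave some selected samples unchanged when the randomly drawn label coincides with the original one, so the effective noise level is at most the nominal ratio.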
- (1)
- Attribute group based attribute reduction (AGAR) [15];
- (2)
- Dissimilarity based attribute reduction (DAR) [45];
- (3)
- Data-guidance based attribute reduction (DGAR) [42];
- (4)
- Ensemble selector-based attribute reduction (ESAR) [21];
- (5)
- Forward-backward greedy searching based attribute reduction (FBGSAR) [15].
4.3. Comparisons of Stability-Based Reducts
- Compared with the reducts generated by AGAR, DAR, DGAR, ESAR and FBGSAR, the reduct obtained by our proposed strategy possesses higher stability in most cases over the raw data. Take the "LSVT Voice Rehabilitation (ID: 6)" data set as an example: the stabilities of the reducts obtained by AGAR, DAR, DGAR, ESAR, FBGSAR and our proposed strategy are 0.1103, 0.2194, 0.1133, 0.1380, 0.1133 and 0.6188, respectively. It is obvious that a reduct with higher stability can be effectively generated by our BISAR.
- Whether label noise data or feature noise data are considered, the reduct obtained by our proposed strategy always possesses high stability. Take the "LSVT Voice Rehabilitation (ID: 6)" data set as an example: over the label noise data, the stabilities of the reducts obtained by AGAR, DAR, DGAR, ESAR, FBGSAR and our proposed strategy are 0.0600, 0.1926, 0.0650, 0.0838, 0.0792 and 0.5738, respectively; over the feature noise data, the values are 0.0787, 0.1938, 0.0851, 0.1070, 0.1127 and 0.6256. It is not difficult to conclude that our proposed strategy adapts better to data with label noise or feature noise.
4.4. Comparisons of Classification Performances
5. Conclusions, Limitations, and Future Research
- The selection of hyperparameters will be further optimized using parameter optimization approaches;
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
AGAR | attribute group based attribute reduction |
BIS | beam-influenced selector |
BISAR | beam-influenced selector-based attribute reduction |
DAR | dissimilarity based attribute reduction |
DGAR | data-guidance based attribute reduction |
ESAR | ensemble selector-based attribute reduction |
FBGSAR | forward-backward greedy searching based attribute reduction |
KNN | k-nearest neighbor |
SVM | support vector machine |
References
- Xu, W.H.; Yu, J.H. A novel approach to information fusion in multi-source datasets: A granular computing viewpoint. Inf. Sci. 2017, 378, 410–423. [Google Scholar] [CrossRef]
- Emani, C.K.; Cullot, N.; Nicolle, C. Understandable big data: A survey. Comput. Sci. Rev. 2015, 17, 70–81. [Google Scholar] [CrossRef]
- Xu, W.H.; Li, W.T. Granular computing approach to two-way learning based on formal concept analysis in fuzzy datasets. IEEE Trans. Cyber. 2016, 46, 366–379. [Google Scholar] [CrossRef] [PubMed]
- Yuan, K.H.; Xu, W.H.; Li, W.T.; Ding, W.Q. An incremental learning mechanism for object classification based on progressive fuzzy three-way concept. Inf. Sci. 2022, 584, 127–147. [Google Scholar] [CrossRef]
- Elaziz, M.A.; Abualigah, L.; Yousri, D.; Oliva, D.; Al-Qaness, M.A.A.; Nadimi-Shahraki, M.H.; Ewees, A.A.; Lu, S.; Ibrahim, R.A. Boosting atomic orbit search using dynamic-based learning for feature selection. Mathematics 2021, 9, 2786. [Google Scholar] [CrossRef]
- Khurma, R.A.; Aljarah, I.; Sharieh, A.; Elaziz, M.A.; Damaševičius, R.; Krilavičius, T. A review of the modification strategies of the nature inspired algorithms for feature selection problem. Mathematics 2022, 10, 464. [Google Scholar] [CrossRef]
- Li, J.D.; Liu, H. Challenges of feature selection for big data analytics. IEEE Intell. Syst. 2017, 32, 9–15. [Google Scholar] [CrossRef] [Green Version]
- Pérez-Martín, A.; Pérez-Torregrosa, A.; Rabasa, A.; Vaca, M. Feature selection to optimize credit banking risk evaluation decisions for the example of home equity loans. Mathematics 2020, 8, 1971. [Google Scholar] [CrossRef]
- Cai, J.; Luo, J.W.; Wang, S.L.; Yang, S. Feature selection in machine learning: A new perspective. Neurocomputing 2018, 300, 70–79. [Google Scholar] [CrossRef]
- Li, Y.; Li, T.; Liu, H. Recent advances in feature selection and its applications. Knowl. Inf. Syst. 2017, 53, 551–577. [Google Scholar] [CrossRef]
- Pawlak, Z. Rough Sets: Theoretical Aspects of Reasoning about Data; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1992. [Google Scholar]
- Ju, H.R.; Yang, X.B.; Yu, H.L.; Li, T.J.; Yu, D.J.; Yang, J.Y. Cost-sensitive rough set approach. Inf. Sci. 2016, 355–356, 282–298. [Google Scholar] [CrossRef]
- Liu, D.; Yang, X.; Li, T.R. Three-way decisions: Beyond rough sets and granular computing. Int. J. Mach. Learn. Cybern. 2020, 11, 989–1002. [Google Scholar] [CrossRef]
- Wang, C.Z.; Huang, Y.; Shao, M.W.; Fan, X.D. Fuzzy rough set-based attribute reduction using distance measures. Knowl. Based Syst. 2019, 164, 205–212. [Google Scholar] [CrossRef]
- Chen, Y.; Liu, K.Y.; Song, J.J.; Fujita, H.; Yang, X.B.; Qian, Y.H. Attribute group for attribute reduction. Inf. Sci. 2020, 535, 64–80. [Google Scholar] [CrossRef]
- Liu, K.Y.; Yang, X.B.; Yu, H.L.; Fujita, H.; Chen, X.J.; Liu, D. Supervised information granulation strategy for attribute reduction. Int. J. Mach. Learn. Cybern. 2020, 11, 2149–2163. [Google Scholar] [CrossRef]
- Liu, K.Y.; Yang, X.B.; Yu, H.L.; Mi, J.S.; Wang, P.X.; Chen, X.J. Rough set based semi-supervised feature selection via ensemble selector. Knowl. Based Syst. 2019, 165, 282–296. [Google Scholar] [CrossRef]
- Qian, Y.H.; Liang, J.Y.; Pedrycz, W.; Dang, C.Y. Positive approximation: An accelerator for attribute reduction in rough set theory. Artif. Intell. 2010, 174, 597–618. [Google Scholar] [CrossRef] [Green Version]
- Du, W.; Cao, Z.B.; Song, T.C.; Li, Y.; Liang, Y.C. A feature selection method based on multiple kernel learning with expression profiles of different types. BioData Min. 2017, 10, 4. [Google Scholar] [CrossRef] [Green Version]
- Goh, W.W.B.; Wong, L. Evaluating feature-selection stability in next generation proteomics. J. Bioinform. Comput. Biol. 2016, 14, 1650029. [Google Scholar] [CrossRef] [Green Version]
- Yang, X.B.; Yao, Y.Y. Ensemble selector for attribute reduction. Appl. Soft Comput. 2018, 70, 1–11. [Google Scholar] [CrossRef]
- Wu, W.Z.; Leung, Y. A comparison study of optimal scale combination selection in generalized multi-scale decision tables. Int. J. Mach. Learn. Cybern. 2020, 11, 961–972. [Google Scholar] [CrossRef]
- Wu, W.Z.; Qian, Y.H.; Li, T.J.; Gu, S.M. On rule acquisition in incomplete multi-scale decision tables. Inf. Sci. 2017, 378, 282–302. [Google Scholar] [CrossRef]
- Chen, Z.; Liu, K.Y.; Yang, X.B.; Fujitae, H. Random sampling accelerator for attribute reduction. Int. J. Approx. Reason. 2022, 140, 75–91. [Google Scholar] [CrossRef]
- Freitag, M.; Al-Onaizan, Y. Beam search strategies for neural machine translation. In Proceedings of the First Workshop on Neural Machine Translation, Vancouver, BC, Canada, 4 August 2017; pp. 56–60. [Google Scholar]
- Hu, Q.H.; Yu, D.R.; Xie, Z.X. Neighborhood classifiers. Expert Syst. Appl. 2008, 34, 866–876. [Google Scholar] [CrossRef]
- Wang, C.Z.; Hu, Q.H.; Wang, X.Z.; Chen, D.G.; Qian, Y.H.; Dong, Z. Feature selection based on neighborhood discrimination index. IEEE Trans. Neural Netw. Learn. Syst. 2018, 29, 2986–2999. [Google Scholar] [CrossRef] [PubMed]
- Zhang, X.; Mei, C.L.; Chen, D.G.; Li, J.H. Feature selection in mixed data: A method using a novel fuzzy rough set-based information entropy. Pattern Recognit. 2016, 56, 1–15. [Google Scholar] [CrossRef]
- Chen, Y.; Song, J.J.; Liu, K.Y.; Lin, Y.J.; Yang, X.B. Combined accelerator for attribute reduction: A sample perspective. Math. Probl. Eng. 2020, 2020, 2350627. [Google Scholar] [CrossRef]
- Jiang, Z.H.; Liu, K.Y.; Yang, X.B.; Yu, H.L.; Fujita, H.; Qian, Y.H. Accelerator for supervised neighborhood based attribute reduction. Int. J. Approx. Reason. 2020, 119, 122–150. [Google Scholar] [CrossRef]
- Xu, S.P.; Yang, X.B.; Yu, H.L.; Yu, D.J.; Yang, J.Y.; Tsang, E.C.C. Multi-label learning with label-specific feature reduction. Knowl. Based Syst. 2016, 104, 52–61. [Google Scholar] [CrossRef]
- Wu, W.Z. Attribute reduction based on evidence theory in incomplete decision systems. Inf. Sci. 2008, 178, 1355–1371. [Google Scholar] [CrossRef]
- Quafafou, M. α-RST: A generalization of rough set theory. Inf. Sci. 2000, 124, 301–316. [Google Scholar] [CrossRef]
- Skowron, A.; Rauszer, C. The discernibility matrices and functions in information systems. In Intelligent Decision Support: Handbook of Applications and Advances of the Rough Sets Theory; Springer: Dordrecht, The Netherlands, 1992; Volume 11, pp. 331–362. [Google Scholar]
- Zhang, W.X.; Wei, L.; Qi, J.J. Attribute reduction theory and approach to concept lattice. Sci. China F Inf. Sci. 2005, 48, 713–726. [Google Scholar] [CrossRef]
- Yan, W.W.; Chen, Y.; Shi, J.L.; Yu, H.L.; Yang, X.B. Ensemble and quick strategy for searching Reduct: A hybrid mechanism. Information 2021, 12, 25. [Google Scholar] [CrossRef]
- Xia, S.Y.; Zhang, Z.; Li, W.H.; Wang, G.Y.; Giem, E.; Chen, Z.Z. GBNRS: A novel rough set algorithm for fast adaptive attribute reduction in classification. IEEE Trans. Knowl. Data Eng. 2020, 34, 1231–1242. [Google Scholar] [CrossRef]
- Yao, Y.Y.; Zhao, Y.; Wang, J. On reduct construction algorithms. Trans. Comput. Sci. II 2008, 5150, 100–117. [Google Scholar]
- Qian, Y.H.; Wang, Q.; Cheng, H.H.; Liang, J.Y.; Dang, C.Y. Fuzzy-rough feature selection accelerator. Fuzzy Sets Syst. 2015, 258, 61–78. [Google Scholar] [CrossRef]
- Yang, X.B.; Qi, Y.; Yu, H.L.; Song, X.N.; Yang, J.Y. Updating multigranulation rough approximations with increasing of granular structures. Knowl. Based Syst. 2014, 64, 59–69. [Google Scholar] [CrossRef]
- Liu, K.Y.; Yang, X.B.; Fujita, H.; Liu, D.; Yang, X.; Qian, Y.H. An efficient selector for multi-granularity attribute reduction. Inf. Sci. 2019, 505, 457–472. [Google Scholar] [CrossRef]
- Jiang, Z.H.; Dou, H.L.; Song, J.J.; Wang, P.X.; Yang, X.B.; Qian, Y.H. Data-guided multi-granularity selector for attribute reduction. Appl. Intell. 2021, 51, 876–888. [Google Scholar] [CrossRef]
- Xu, W.H.; Yuan, K.H.; Li, W.T. Dynamic updating approximations of local generalized multigranulation neighborhood rough set. Appl. Intell. 2022. [Google Scholar] [CrossRef]
- Ba, J.; Liu, K.Y.; Ju, H.R.; Xu, S.P.; Xu, T.H.; Yang, X.B. Triple-G: A new MGRS and attribute reduction. Int. J. Mach. Learn. Cybern. 2022, 13, 337–356. [Google Scholar] [CrossRef]
- Rao, X.S.; Yang, X.B.; Yang, X.; Chen, X.J.; Liu, D.; Qian, Y.H. Quickly calculating reduct: An attribute relationship based approach. Knowl. Based Syst. 2020, 200, 106041. [Google Scholar] [CrossRef]
- Borah, P.; Gupta, D. Functional iterative approaches for solving support vector classification problems based on generalized Huber loss. Neural Comput. Appl. 2020, 32, 9245–9265. [Google Scholar] [CrossRef]
- Borah, P.; Gupta, D. Unconstrained convex minimization based implicit Lagrangian twin extreme learning machine for classification (ULTELMC). Appl. Intell. 2020, 50, 1327–1344. [Google Scholar] [CrossRef]
- Adhikary, D.D.; Gupta, D. Applying over 100 classifiers for churn prediction in telecom companies. Multimed. Tools Appl. 2021, 80, 35123–35144. [Google Scholar] [CrossRef]
- Zhou, H.F.; Wang, X.Q.; Zhu, R.R. Feature selection based on mutual information with correlation coefficient. Appl. Intell. 2021. [Google Scholar] [CrossRef]
- Karakatič, S. EvoPreprocess: Data preprocessing framework with nature-inspired optimization algorithms. Mathematics 2020, 8, 900. [Google Scholar] [CrossRef]
- Karakatič, S.; Fister, I.; Fister, D. Dynamic genotype reduction for narrowing the feature selection search space. In Proceedings of the 2020 IEEE 20th International Symposium on Computational Intelligence and Informatics (CINTI), Budapest, Hungary, 5–7 November 2020; pp. 35–38. [Google Scholar]
- Yan, D.W.; Chi, G.T.; Lai, K.K. Financial distress prediction and feature selection in multiple periods by lassoing unconstrained distributed lag non-linear models. Mathematics 2020, 8, 1275. [Google Scholar] [CrossRef]
ID | Data Sets | Samples | Attributes | Decision Classes |
---|---|---|---|---|
1 | Breast Cancer Wisconsin (Diagnostic) | 569 | 30 | 2 |
2 | Breast Tissue | 106 | 9 | 6 |
3 | Congressional Voting Records | 435 | 16 | 2 |
4 | Forest Type Mapping | 523 | 27 | 4 |
5 | Ionosphere | 315 | 34 | 2 |
6 | LSVT Voice Rehabilitation | 126 | 256 | 2 |
7 | Lymphography | 98 | 18 | 3 |
8 | Madelon | 2600 | 500 | 2 |
9 | Musk (Version 1) | 476 | 166 | 2 |
10 | Parkinsons | 195 | 23 | 7 |
11 | Parkinson Speech Dataset with Multiple Types of Sound Recordings | 1208 | 26 | 2 |
12 | QSAR biodegradation | 1055 | 41 | 2 |
13 | Sonar | 208 | 60 | 2 |
14 | Statlog (Heart) | 270 | 13 | 2 |
15 | Synthetic Control Chart Time Series | 600 | 60 | 6 |
16 | Wine | 178 | 13 | 3 |
ID | AGAR | DAR | DGAR | ESAR | FBGSAR | BISAR |
---|---|---|---|---|---|---|
1 | 0.5116 | 0.6466 | 0.6446 | 0.5295 | 0.5830 | 0.7501 |
2 | 0.7708 | 0.8025 | 0.8336 | 0.9329 | 0.7882 | 0.9522 |
3 | 0.4259 | 0.6850 | 0.4091 | 0.6378 | 0.4116 | 0.6158 |
4 | 0.6150 | 0.7408 | 0.6306 | 0.7412 | 0.6316 | 0.8098 |
5 | 0.2807 | 0.3978 | 0.3669 | 0.4608 | 0.3340 | 0.6511 |
6 | 0.1103 | 0.2194 | 0.1133 | 0.1380 | 0.1133 | 0.6188 |
7 | 0.3868 | 0.6086 | 0.4620 | 0.5804 | 0.3790 | 0.7739 |
8 | 0.1938 | 0.5297 | 0.2558 | 0.1617 | 0.2863 | 0.2903 |
9 | 0.1539 | 0.3790 | 0.1835 | 0.1644 | 0.1720 | 0.4745 |
10 | 0.6410 | 0.7520 | 0.6732 | 0.7808 | 0.7015 | 0.9089 |
11 | 0.7343 | 0.8704 | 0.7937 | 0.7795 | 0.7897 | 0.8321 |
12 | 0.6175 | 0.8238 | 0.7584 | 0.7340 | 0.7504 | 0.8561 |
13 | 0.1176 | 0.2981 | 0.1360 | 0.1773 | 0.1360 | 0.4064 |
14 | 0.6271 | 0.7977 | 0.7885 | 0.7137 | 0.7279 | 0.8033 |
15 | 0.2442 | 0.4187 | 0.3343 | 0.5237 | 0.3503 | 0.3575 |
16 | 0.4448 | 0.6289 | 0.4940 | 0.5229 | 0.5821 | 0.6984 |
avg | 0.4297 | 0.5999 | 0.4924 | 0.5362 | 0.4835 | 0.6750 |
ID | AGAR | DAR | DGAR | ESAR | FBGSAR | BISAR |
---|---|---|---|---|---|---|
1 | 0.5848 | 0.7555 | 0.6281 | 0.6193 | 0.6262 | 0.8081 |
2 | 0.7636 | 0.8302 | 0.7845 | 0.9234 | 0.7816 | 0.9458 |
3 | 0.7114 | 0.7385 | 0.7082 | 0.7228 | 0.7190 | 0.8988 |
4 | 0.6567 | 0.7937 | 0.6703 | 0.7292 | 0.6828 | 0.8493 |
5 | 0.4669 | 0.5529 | 0.4509 | 0.5955 | 0.4665 | 0.8205 |
6 | 0.0600 | 0.1926 | 0.0650 | 0.0838 | 0.0792 | 0.5738 |
7 | 0.3840 | 0.5854 | 0.4474 | 0.4809 | 0.4648 | 0.8300 |
8 | 0.1577 | 0.4950 | 0.2119 | 0.1310 | 0.2023 | 0.2668 |
9 | 0.1066 | 0.3240 | 0.1345 | 0.1494 | 0.1372 | 0.5461 |
10 | 0.6341 | 0.7588 | 0.7009 | 0.8062 | 0.6779 | 0.9239 |
11 | 0.7193 | 0.8477 | 0.7675 | 0.7773 | 0.7747 | 0.8538 |
12 | 0.6279 | 0.7981 | 0.6976 | 0.7188 | 0.6961 | 0.8670 |
13 | 0.1342 | 0.3135 | 0.1434 | 0.1814 | 0.1585 | 0.4770 |
14 | 0.6935 | 0.8111 | 0.7675 | 0.7788 | 0.7881 | 0.8927 |
15 | 0.1679 | 0.3861 | 0.2522 | 0.3840 | 0.2597 | 0.3887 |
16 | 0.5738 | 0.6975 | 0.5854 | 0.6395 | 0.5768 | 0.8134 |
avg | 0.4652 | 0.6175 | 0.5010 | 0.5451 | 0.5057 | 0.7347 |
ID | AGAR | DAR | DGAR | ESAR | FBGSAR | BISAR |
---|---|---|---|---|---|---|
1 | 0.4297 | 0.5841 | 0.5316 | 0.4826 | 0.4832 | 0.7231 |
2 | 0.6788 | 0.7523 | 0.7252 | 0.8747 | 0.6940 | 0.9172 |
3 | 0.3373 | 0.5795 | 0.3602 | 0.5081 | 0.3810 | 0.5808 |
4 | 0.5410 | 0.7313 | 0.5629 | 0.6856 | 0.5901 | 0.7971 |
5 | 0.2081 | 0.2960 | 0.2529 | 0.3771 | 0.2445 | 0.5329 |
6 | 0.0787 | 0.1938 | 0.0851 | 0.1070 | 0.1127 | 0.6256 |
7 | 0.3359 | 0.5344 | 0.3956 | 0.5449 | 0.3918 | 0.7752 |
8 | 0.0988 | 0.4505 | 0.1116 | 0.0572 | 0.1012 | 0.1652 |
9 | 0.0906 | 0.3071 | 0.1104 | 0.1255 | 0.1104 | 0.4379 |
10 | 0.5575 | 0.6977 | 0.6098 | 0.7375 | 0.5988 | 0.8958 |
11 | 0.6398 | 0.8258 | 0.7298 | 0.7222 | 0.7209 | 0.7797 |
12 | 0.5395 | 0.7564 | 0.6436 | 0.6642 | 0.6498 | 0.8194 |
13 | 0.1001 | 0.2304 | 0.1114 | 0.1165 | 0.1148 | 0.4163 |
14 | 0.5740 | 0.7204 | 0.6593 | 0.6568 | 0.6500 | 0.7690 |
15 | 0.1659 | 0.3550 | 0.2281 | 0.4777 | 0.2168 | 0.3288 |
16 | 0.3641 | 0.5585 | 0.3889 | 0.4736 | 0.4165 | 0.6743 |
avg | 0.3588 | 0.5358 | 0.4067 | 0.4757 | 0.4048 | 0.6399 |
ID | AGAR and BISAR | DAR and BISAR | DGAR and BISAR | ESAR and BISAR | FBGSAR and BISAR |
---|---|---|---|---|---|
1 | 5.3081 | 7.6417 | 1.5479 | 4.9864 | |
2 | 3.9832 | 6.7574 | 4.6827 | 8.0126 | 8.1705 |
3 | 6.6815 | 7.8760 | 7.8321 | 5.4804 | 7.9626 |
4 | 9.7384 | 3.6480 | 1.0530 | 9.3183 | 6.0143 |
5 | 2.2178 | 6.7765 | 7.5774 | 3.6372 | 3.2931 |
6 | 6.0148 | 6.7765 | 9.1266 | 7.5774 | 2.6898 |
7 | 6.7478 | 3.4070 | 2.0047 | 5.2376 | 6.1004 |
8 | 2.3413 | 6.7765 | 3.2348 | 5.0907 | 5.2499 |
9 | 2.0616 | 6.7765 | 6.6737 | 2.0616 | 4.5401 |
10 | 7.8552 | 8.1003 | 7.0110 | 2.9521 | 2.1571 |
11 | 1.3303 | 4.3165 | 2.7454 | 1.6648 | 2.0711 |
12 | 1.9943 | 3.1036 | 4.6986 | 2.2273 | 7.1081 |
13 | 9.1266 | 6.7765 | 2.3557 | 3.7051 | 9.1266 |
14 | 1.1590 | 1.1982 | 6.4561 | 2.9223 | 6.2163 |
15 | 2.7986 | 6.7765 | 4.2488 | 5.5605 | 2.1841 |
16 | 1.1582 | 1.6236 | 1.2265 | 1.0373 | 1.7824 |
ID | AGAR and BISAR | DAR and BISAR | DGAR and BISAR | ESAR and BISAR | FBGSAR and BISAR |
---|---|---|---|---|---|
1 | 5.9907 | 1.7500 | 1.5377 | 9.7693 | 8.3396 |
2 | 2.9400 | 1.0400 | 3.6085 | 4.9537 | 1.2440 |
3 | 6.7669 | 6.7600 | 6.7669 | 1.3731 | 3.3273 |
4 | 1.7612 | 4.6500 | 2.3147 | 6.2200 | 2.0407 |
5 | 5.2507 | 1.4400 | 3.7020 | 1.6571 | 1.9177 |
6 | 4.5390 | 3.7500 | 3.4156 | 1.0646 | 9.1728 |
7 | 6.7956 | 2.9400 | 1.2346 | 2.9249 | 1.6669 |
8 | 1.2941 | 9.1300 | 4.6792 | 6.9166 | 6.9166 |
9 | 1.2346 | 5.9000 | 5.2269 | 1.7933 | 1.4359 |
10 | 1.3304 | 9.8700 | 2.3045 | 1.0113 | 4.1420 |
11 | 7.7061 | 2.2900 | 1.6659 | 3.7020 | 7.5699 |
12 | 1.2922 | 1.1400 | 6.2126 | 9.2428 | 2.7360 |
13 | 3.4958 | 3.1500 | 1.1045 | 7.9720 | 2.1393 |
14 | 6.4965 | 8.6200 | 4.5414 | 1.1829 | 5.0924 |
15 | 9.7480 | 7.5600 | 3.9662 | 6.7193 | 6.7193 |
16 | 3.9425 | 4.1000 | 3.7796 | 3.1556 | 5.5013 |
ID | AGAR and BISAR | DAR and BISAR | DGAR and BISAR | ESAR and BISAR | FBGSAR and BISAR |
---|---|---|---|---|---|
1 | 5.0907 | 6.7900 | 2.2270 | 2.2270 | 3.1517 |
2 | 2.5837 | 3.3700 | 1.4726 | 8.7024 | 7.2803 |
3 | 3.4156 | 9.0300 | 1.2009 | 2.3883 | 4.3147 |
4 | 3.6309 | 1.0200 | 5.1049 | 4.3202 | 1.1045 |
5 | 5.2269 | 7.4100 | 8.5974 | 2.9598 | 3.6636 |
6 | 6.7860 | 5.8700 | 6.7860 | 1.2009 | 6.7956 |
7 | 6.7956 | 1.6600 | 6.7956 | 1.8074 | 1.2345 |
8 | 3.6048 | 6.8000 | 8.1032 | 1.3761 | 9.1266 |
9 | 6.7956 | 3.3700 | 6.7956 | 1.3312 | 1.5469 |
10 | 1.2203 | 1.5800 | 4.5815 | 7.7089 | 3.0553 |
11 | 7.7118 | 2.8500 | 3.1517 | 1.1045 | 9.7480 |
12 | 3.3798 | 6.0100 | 4.3184 | 3.0480 | 1.6098 |
13 | 1.2009 | 2.5600 | 3.0691 | 3.6388 | 1.7824 |
14 | 2.9249 | 1.4000 | 1.9533 | 1.4149 | 9.1266 |
15 | 7.5788 | 5.7900 | 1.7939 | 1.0141 | 2.9598 |
16 | 1.8074 | 3.6000 | 2.9249 | 6.7470 | 1.2365 |
ID | AGAR | DAR | DGAR | ESAR | FBGSAR | BISAR |
---|---|---|---|---|---|---|
1 | 0.9747 | 0.9401 | 0.9765 | 0.9782 | 0.9746 | 0.9770 |
2 | 0.6291 | 0.6152 | 0.6273 | 0.6441 | 0.6600 | 0.6364 |
3 | 0.9045 | 0.9563 | 0.9172 | 0.8644 | 0.8782 | 0.9084 |
4 | 0.8577 | 0.8494 | 0.8571 | 0.8780 | 0.8744 | 0.8807 |
5 | 0.9097 | 0.8741 | 0.9141 | 0.8883 | 0.8827 | 0.9070 |
6 | 0.8212 | 0.8356 | 0.8132 | 0.8492 | 0.8132 | 0.8144 |
7 | 0.7580 | 0.7540 | 0.7700 | 0.6811 | 0.7047 | 0.7330 |
8 | 0.6015 | 0.6344 | 0.5994 | 0.5776 | 0.6078 | 0.5842 |
9 | 0.7702 | 0.7287 | 0.7651 | 0.7182 | 0.7333 | 0.8013 |
10 | 0.0867 | 0.0703 | 0.0892 | 0.0631 | 0.0751 | 0.0846 |
11 | 0.6548 | 0.6895 | 0.6557 | 0.6659 | 0.6683 | 0.6621 |
12 | 0.8439 | 0.8204 | 0.8366 | 0.8350 | 0.8370 | 0.8498 |
13 | 0.7400 | 0.6963 | 0.7351 | 0.7534 | 0.7351 | 0.8176 |
14 | 0.7972 | 0.7413 | 0.8070 | 0.7809 | 0.7917 | 0.7846 |
15 | 0.8443 | 0.7909 | 0.8079 | 0.6478 | 0.7794 | 0.7515 |
16 | 0.9194 | 0.9506 | 0.9306 | 0.9243 | 0.9426 | 0.8954 |
avg | 0.7571 | 0.7467 | 0.7564 | 0.7343 | 0.7474 | 0.7555 |
ID | AGAR | DAR | DGAR | ESAR | FBGSAR | BISAR |
---|---|---|---|---|---|---|
1 | 0.9082 | 0.8768 | 0.9088 | 0.8987 | 0.9036 | 0.9069 |
2 | 0.4914 | 0.5786 | 0.4941 | 0.5805 | 0.5832 | 0.4823 |
3 | 0.8441 | 0.8613 | 0.8414 | 0.7934 | 0.7860 | 0.8536 |
4 | 0.8169 | 0.8043 | 0.8098 | 0.8259 | 0.8200 | 0.8249 |
5 | 0.8393 | 0.8076 | 0.8440 | 0.8307 | 0.8179 | 0.8400 |
6 | 0.6844 | 0.7168 | 0.6920 | 0.7228 | 0.6984 | 0.6848 |
7 | 0.6925 | 0.6965 | 0.7155 | 0.6863 | 0.6784 | 0.7120 |
8 | 0.5285 | 0.5684 | 0.5239 | 0.5211 | 0.5245 | 0.5212 |
9 | 0.6886 | 0.6751 | 0.6833 | 0.6696 | 0.6733 | 0.7192 |
10 | 0.0941 | 0.0764 | 0.0969 | 0.0903 | 0.0990 | 0.0887 |
11 | 0.6193 | 0.6383 | 0.6211 | 0.6266 | 0.6248 | 0.6243 |
12 | 0.7736 | 0.7525 | 0.7722 | 0.7610 | 0.7641 | 0.7719 |
13 | 0.6646 | 0.6251 | 0.6510 | 0.6663 | 0.6610 | 0.7090 |
14 | 0.7263 | 0.7020 | 0.7272 | 0.7172 | 0.7261 | 0.7215 |
15 | 0.7548 | 0.7147 | 0.7163 | 0.6141 | 0.6742 | 0.6539 |
16 | 0.8689 | 0.9161 | 0.8674 | 0.8560 | 0.8763 | 0.8360 |
avg | 0.6872 | 0.6882 | 0.6853 | 0.6788 | 0.6819 | 0.6844 |
ID | AGAR | DAR | DGAR | ESAR | FBGSAR | BISAR |
---|---|---|---|---|---|---|
1 | 0.9734 | 0.9360 | 0.9732 | 0.9742 | 0.9741 | 0.9733 |
2 | 0.5750 | 0.5943 | 0.5682 | 0.5732 | 0.5782 | 0.5677 |
3 | 0.9061 | 0.9586 | 0.9053 | 0.8811 | 0.8910 | 0.8991 |
4 | 0.8531 | 0.8441 | 0.8533 | 0.8740 | 0.8681 | 0.8667 |
5 | 0.9049 | 0.8601 | 0.9074 | 0.8810 | 0.8690 | 0.8996 |
6 | 0.8108 | 0.8236 | 0.8192 | 0.8292 | 0.8316 | 0.8040 |
7 | 0.7505 | 0.7365 | 0.7580 | 0.7026 | 0.7121 | 0.7250 |
8 | 0.6079 | 0.6246 | 0.6160 | 0.5663 | 0.6069 | 0.5781 |
9 | 0.7398 | 0.7117 | 0.7331 | 0.7018 | 0.7078 | 0.7682 |
10 | 0.0879 | 0.0759 | 0.0915 | 0.0787 | 0.0897 | 0.0856 |
11 | 0.6465 | 0.6786 | 0.6503 | 0.6638 | 0.6649 | 0.6514 |
12 | 0.8337 | 0.8084 | 0.8294 | 0.8264 | 0.8261 | 0.8334 |
13 | 0.7215 | 0.6805 | 0.7146 | 0.7217 | 0.7212 | 0.7873 |
14 | 0.7889 | 0.7450 | 0.7948 | 0.7802 | 0.7870 | 0.7774 |
15 | 0.8234 | 0.7723 | 0.7988 | 0.6032 | 0.7584 | 0.7286 |
16 | 0.9174 | 0.9464 | 0.9220 | 0.9163 | 0.9351 | 0.8886 |
avg | 0.7463 | 0.7373 | 0.7459 | 0.7234 | 0.7388 | 0.7396 |
ID | AGAR | DAR | DGAR | ESAR | FBGSAR | BISAR |
---|---|---|---|---|---|---|
1 | 0.9744 | 0.9419 | 0.9752 | 0.9854 | 0.9848 | 0.9779 |
2 | 0.5895 | 0.4952 | 0.5909 | 0.4495 | 0.4468 | 0.5909 |
3 | 0.9090 | 0.9540 | 0.9103 | 0.9057 | 0.9126 | 0.9080 |
4 | 0.8404 | 0.8491 | 0.8417 | 0.8805 | 0.8758 | 0.8555 |
5 | 0.8853 | 0.8441 | 0.8907 | 0.8720 | 0.8607 | 0.8886 |
6 | 0.8504 | 0.8720 | 0.8556 | 0.8696 | 0.8556 | 0.8148 |
7 | 0.8000 | 0.7410 | 0.8125 | 0.7189 | 0.7195 | 0.8595 |
8 | 0.5799 | 0.5606 | 0.5768 | 0.5724 | 0.5824 | 0.5775 |
9 | 0.7020 | 0.6687 | 0.7000 | 0.6044 | 0.6237 | 0.6635 |
10 | 0.0469 | 0.0718 | 0.0487 | 0.0941 | 0.0918 | 0.0269 |
11 | 0.6107 | 0.6527 | 0.6112 | 0.6689 | 0.6692 | 0.6124 |
12 | 0.8270 | 0.8092 | 0.8190 | 0.8176 | 0.8124 | 0.8265 |
13 | 0.6888 | 0.6680 | 0.7034 | 0.7112 | 0.7034 | 0.7456 |
14 | 0.8593 | 0.7691 | 0.8587 | 0.8085 | 0.8139 | 0.8407 |
15 | 0.8470 | 0.8138 | 0.8368 | 0.7283 | 0.8313 | 0.8183 |
16 | 0.9386 | 0.9489 | 0.9429 | 0.9203 | 0.9497 | 0.9203 |
avg | 0.7468 | 0.7288 | 0.7484 | 0.7255 | 0.7334 | 0.7454 |
ID | AGAR | DAR | DGAR | ESAR | FBGSAR | BISAR |
---|---|---|---|---|---|---|
1 | 0.9602 | 0.9197 | 0.9607 | 0.9707 | 0.9706 | 0.9557 |
2 | 0.4091 | 0.4657 | 0.4077 | 0.4041 | 0.4009 | 0.4059 |
3 | 0.8910 | 0.9102 | 0.8876 | 0.8885 | 0.8660 | 0.9049 |
4 | 0.8278 | 0.8459 | 0.8242 | 0.8648 | 0.8582 | 0.8354 |
5 | 0.8736 | 0.8240 | 0.8676 | 0.8434 | 0.8261 | 0.8834 |
6 | 0.7476 | 0.8116 | 0.7412 | 0.7780 | 0.7396 | 0.7200 |
7 | 0.7550 | 0.7210 | 0.7645 | 0.6974 | 0.7142 | 0.7860 |
8 | 0.5557 | 0.5519 | 0.5512 | 0.5449 | 0.5539 | 0.5461 |
9 | 0.6598 | 0.6595 | 0.6619 | 0.5837 | 0.5819 | 0.6415 |
10 | 0.0815 | 0.1154 | 0.0913 | 0.1179 | 0.1174 | 0.0756 |
11 | 0.6042 | 0.6492 | 0.6034 | 0.6623 | 0.6628 | 0.6045 |
12 | 0.7889 | 0.7873 | 0.7863 | 0.7870 | 0.7929 | 0.7866 |
13 | 0.6300 | 0.6073 | 0.6293 | 0.6310 | 0.6432 | 0.6488 |
14 | 0.8070 | 0.7570 | 0.8069 | 0.7670 | 0.7707 | 0.7967 |
15 | 0.8031 | 0.7631 | 0.7888 | 0.6982 | 0.7627 | 0.7446 |
16 | 0.9280 | 0.9481 | 0.9309 | 0.9034 | 0.9211 | 0.9014 |
avg | 0.7077 | 0.7086 | 0.7065 | 0.6964 | 0.6989 | 0.7023 |
ID | AGAR | DAR | DGAR | ESAR | FBGSAR | BISAR |
---|---|---|---|---|---|---|
1 | 0.7390 | 0.6640 | 0.7188 | 0.7956 | 0.7902 | 0.7738 |
2 | 0.3377 | 0.4005 | 0.3441 | 0.3741 | 0.3427 | 0.3882 |
3 | 0.6971 | 0.6214 | 0.6953 | 0.5797 | 0.5637 | 0.7218 |
4 | 0.5337 | 0.5800 | 0.5284 | 0.6423 | 0.6084 | 0.6119 |
5 | 0.7654 | 0.6521 | 0.7650 | 0.6723 | 0.6517 | 0.7861 |
6 | 0.6944 | 0.8016 | 0.6960 | 0.7080 | 0.6976 | 0.7720 |
7 | 0.6310 | 0.5945 | 0.6045 | 0.6368 | 0.6163 | 0.7425 |
8 | 0.5004 | 0.4803 | 0.5024 | 0.4882 | 0.4895 | 0.4994 |
9 | 0.5895 | 0.5894 | 0.5895 | 0.4842 | 0.4842 | 0.5894 |
10 | 0.0710 | 0.1023 | 0.0754 | 0.1179 | 0.1208 | 0.0600 |
11 | 0.5909 | 0.5975 | 0.5909 | 0.5892 | 0.5892 | 0.5909 |
12 | 0.6777 | 0.6682 | 0.6775 | 0.6303 | 0.6303 | 0.6776 |
13 | 0.6020 | 0.5910 | 0.5890 | 0.5795 | 0.5766 | 0.5971 |
14 | 0.6411 | 0.6663 | 0.6424 | 0.5365 | 0.5511 | 0.6296 |
15 | 0.4345 | 0.4601 | 0.4027 | 0.3283 | 0.3836 | 0.3737 |
16 | 0.5657 | 0.5703 | 0.5483 | 0.5429 | 0.5180 | 0.5697 |
avg | 0.5670 | 0.5650 | 0.5606 | 0.5441 | 0.5384 | 0.5865 |
ID | Classifier | AGAR and BISAR | DAR and BISAR | DGAR and BISAR | ESAR and BISAR | FBGSAR and BISAR |
---|---|---|---|---|---|---|
1 | KNN | 2.0644 | 5.2573 | 2.1889 | 9.8911 | 1.6004 |
SVM | 2.5723 | 6.0814 | 3.9437 | 1.2039 | 1.0070 | |
2 | KNN | 3.7135 | 4.8863 | 4.6827 | 4.6921 | 3.1837 |
SVM | 1.5427 | 5.2465 | 4.6827 | 6.6519 | 7.2475 | |
3 | KNN | 3.6985 | 6.8796 | 6.8796 | 6.8796 | 6.8796 |
SVM | 3.3772 | 4.6827 | 4.6827 | 4.6827 | 4.6827 | |
4 | KNN | 2.8056 | 1.1806 | 1.0481 | 3.9018 | 3.2251 |
SVM | 2.0648 | 3.5596 | 8.5592 | 3.9834 | 9.0919 | |
5 | KNN | 5.2376 | 1.3910 | 6.3108 | 6.6904 | 1.9399 |
SVM | 6.1545 | 2.8044 | 6.9196 | 1.8908 | 4.9346 | |
6 | KNN | 6.1605 | 3.4279 | 8.3896 | 9.5712 | 8.3896 |
SVM | 1.7159 | 3.1024 | 1.4745 | 1.0865 | 1.4745 | |
7 | KNN | 1.9103 | 1.0344 | 4.7130 | 6.2748 | 3.1920 |
SVM | 2.0188 | 6.9204 | 6.6483 | 4.4311 | 1.2975 | |
8 | KNN | 2.7326 | 1.7193 | 2.1841 | 5.4271 | 9.0892 |
SVM | 2.5021 | 1.3308 | 3.2344 | 7.2508 | 1.2307 | |
9 | KNN | 1.4860 | 1.0259 | 5.0986 | 1.1885 | 1.0278 |
SVM | 8.0246 | 7.7617 | 2.2257 | 1.2221 | 3.6325 | |
10 | KNN | 9.8914 | 7.1531 | 4.8874 | 2.8783 | 1.2076 |
SVM | 1.5778 | 1.8814 | 1.2125 | 1.8659 | 1.8443 | |
11 | KNN | 8.6173 | 1.3389 | 7.9774 | 9.5978 | 4.6637 |
SVM | 4.5248 | 7.6986 | 6.4350 | 5.6611 | 5.5477 | |
12 | KNN | 2.9258 | 3.8833 | 8.9614 | 5.2758 | 8.6188 |
SVM | 8.6024 | 7.4837 | 9.8615 | 9.1377 | 6.8006 | |
13 | KNN | 1.7761 | 7.8634 | 1.0860 | 1.7656 | 1.0860 |
SVM | 8.6270 | 1.3444 | 5.9977 | 3.5839 | 5.9977 | |
14 | KNN | 2.7905 | 2.8985 | 7.6348 | 6.9469 | 2.4437 |
SVM | 4.8118 | 2.7764 | 1.3714 | 1.4023 | 2.5560 | |
15 | KNN | 6.2426 | 8.1017 | 3.2047 | 2.6753 | 5.6459 |
SVM | 7.6417 | 6.0715 | 2.7909 | 1.7788 | 3.3682 | |
16 | KNN | 1.3842 | 2.5667 | 1.5250 | 2.6838 | 3.2508 |
SVM | 2.3925 | 9.1071 | 5.0916 | 9.3508 | 1.7683 |
ID | Classifier | AGAR and BISAR | DAR and BISAR | DGAR and BISAR | ESAR and BISAR | FBGSAR and BISAR |
---|---|---|---|---|---|---|
1 | KNN | 9.6758 | 2.5577 | 7.6583 | 3.8397 | 2.8503 |
SVM | 3.4925 | 1.0400 | 3.2929 | 1.5927 | 1.0273 | |
2 | KNN | 6.8455 | 4.3762 | 7.0463 | 4.0877 | 2.2126 |
SVM | 8.9230 | 2.2852 | 9.5680 | 8.8155 | 7.0451 | |
3 | KNN | 3.9625 | 2.9725 | 2.9281 | 2.0463 | 6.7193 |
SVM | 3.8144 | 1.5830 | 8.4363 | 1.4276 | 8.8909 | |
4 | KNN | 2.5016 | 1.0159 | 3.7157 | 4.7338 | 6.1668 |
SVM | 2.7268 | 2.2176 | 8.0775 | 1.2155 | 4.6850 | |
5 | KNN | 9.7838 | 8.1514 | 8.2844 | 2.1257 | 9.3528 |
SVM | 2.7204 | 1.6557 | 1.5516 | 8.1531 | 8.3383 | |
6 | KNN | 9.8919 | 2.7375 | 6.4524 | 1.9909 | 3.9377 |
SVM | 9.0453 | 1.0315 | 1.2586 | 3.1860 | 1.2260 | |
7 | KNN | 9.2983 | 3.5002 | 8.4958 | 4.6513 | 5.6343 |
SVM | 7.3803 | 1.8509 | 2.3371 | 1.3708 | 4.0766 | |
8 | KNN | 3.5059 | 5.4754 | 1.0000E+00 | 9.0308 | 8.3920 |
SVM | 9.0892 | 2.7923 | 3.9408 | 8.3923 | 1.8054 | |
9 | KNN | 3.0484 | 2.5975 | 9.1734 | 7.3834 | 1.4397 |
SVM | 8.8203 | 7.6362 | 9.0786 | 5.2252 | 1.0909 | |
10 | KNN | 4.6370 | 5.2357 | 2.7107 | 9.1346 | 1.2847 |
SVM | 3.8501 | 4.6004 | 5.1025 | 8.0404 | 4.3220 | |
11 | KNN | 1.2970 | 6.2163 | 4.6488 | 5.4273 | 9.8921 |
SVM | 8.4962 | 3.3218 | 8.2833 | 6.6344 | 6.6344 | |
12 | KNN | 5.9769 | 5.3213 | 8.6035 | 1.5152 | 1.0743 |
SVM | 4.2473 | 4.8179 | 7.6597 | 4.9023 | 3.5743 | |
13 | KNN | 1.0142 | 1.3953 | 1.7673 | 6.2598 | 4.6850 |
SVM | 2.9740 | 3.7131 | 1.7173 | 3.0999 | 6.3558 | |
14 | KNN | 5.7874 | 2.3835 | 2.3886 | 8.6031 | 1.9816 |
SVM | 1.5925 | 1.1591 | 2.6119 | 4.6492 | 5.2485 | |
15 | KNN | 2.0334 | 1.3479 | 1.2893 | 1.6770 | 2.5585 |
SVM | 1.1865 | 3.7202 | 5.1451 | 5.3378 | 3.1685 | |
16 | KNN | 1.0943 | 5.1395 | 1.2186 | 2.2854 | 6.2524 |
SVM | 9.5507 | 4.5217 | 1.5728 | 7.1467 | 8.2662 |
ID | Classifier | AGAR and BISAR | DAR and BISAR | DGAR and BISAR | ESAR and BISAR | FBGSAR and BISAR |
---|---|---|---|---|---|---|
1 | KNN | 7.0398 | 6.5411 | 9.1365 | 5.0717 | 4.8974 |
 | SVM | 1.5925 | 4.6673 | 1.2498 | 3.0531 | 4.6977 |
2 | KNN | 4.5642 | 1.5516 | 8.9223 | 8.4943 | 5.7795 |
 | SVM | 1.7171 | 7.7621 | 2.1819 | 8.4971 | 4.1681 |
3 | KNN | 3.5397 | 6.3219 | 1.1612 | 5.6754 | 6.3220 |
 | SVM | 6.8154 | 8.2261 | 4.9579 | 1.0394 | 2.3960 |
4 | KNN | 1.3036 | 1.7477 | 1.0307 | 1.0418 | 3.6405 |
 | SVM | 9.0967 | 4.4830 | 9.0967 | 3.9382 | 9.4603 |
5 | KNN | 1.6233 | 7.0537 | 3.3053 | 2.0454 | 7.6509 |
 | SVM | 7.3434 | 3.5158 | 3.7827 | 9.5953 | 3.0538 |
6 | KNN | 7.0443 | 3.0340 | 3.4998 | 1.9289 | 1.9333 |
 | SVM | 3.4680 | 5.4426 | 6.1335 | 2.3396 | 5.4144 |
7 | KNN | 2.4799 | 3.0220 | 5.5464 | 6.7339 | 2.2780 |
 | SVM | 2.8125 | 2.4634 | 8.9355 | 2.0081 | 5.8136 |
8 | KNN | 7.6403 | 3.7933 | 1.4364 | 7.6390 | 2.6533 |
 | SVM | 4.6566 | 4.7933 | 1.0859 | 1.1641 | 1.0113 |
9 | KNN | 1.3807 | 4.8136 | 4.8708 | 2.3429 | 3.9666 |
 | SVM | 1.0000 | 6.1469 | 1.0000 | 1.1052 | 1.1052 |
10 | KNN | 6.7298 | 1.1530 | 3.7819 | 3.2856 | 7.1431 |
 | SVM | 7.0818 | 2.3496 | 2.9515 | 2.2516 | 1.8580 |
11 | KNN | 1.5929 | 5.1822 | 5.9774 | 6.5521 | 1.9495 |
 | SVM | 1.0344 | 1.4745 | 2.1841 | 2.8500 | 2.1841 |
12 | KNN | 7.8647 | 3.3646 | 4.2449 | 2.9079 | 1.6734 |
 | SVM | 7.7617 | 5.2499 | 3.2348 | 6.7298 | 4.2488 |
13 | KNN | 8.6853 | 8.8203 | 2.4109 | 6.8995 | 1.8640 |
 | SVM | 5.5521 | 8.8090 | 8.5793 | 2.6073 | 2.1336 |
14 | KNN | 1.3292 | 2.6390 | 4.5064 | 4.4819 | 1.9365 |
 | SVM | 7.4115 | 1.6209 | 6.2268 | 6.5337 | 9.1050 |
15 | KNN | 7.5328 | 3.2544 | 5.5354 | 8.0248 | 2.9395 |
 | SVM | 4.2488 | 1.0751 | 5.4275 | 2.5027 | 8.0763 |
16 | KNN | 9.9096 | 7.4222 | 2.6588 | 4.5521 | 1.1188 |
 | SVM | 9.6750 | 7.5551 | 9.6755 | 8.8125 | 5.5859 |
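Paired per-dataset results of the kind tabulated above are commonly assessed with a paired significance test across the benchmark datasets. As an illustrative sketch only — this excerpt does not state which test produced these entries — the following plain-Python Wilcoxon signed-rank test compares two attribute selectors' per-dataset classification accuracies; the accuracy values and variable names are hypothetical, and the p-value uses a normal approximation without a tie correction.

```python
import math

def wilcoxon_signed_rank(a, b):
    """Paired Wilcoxon signed-rank test.

    Returns (W, p) where W = min(W+, W-) and p is a two-sided p-value
    from the normal approximation (no tie correction).
    """
    # Signed differences; zero differences are discarded by convention.
    diffs = [x - y for x, y in zip(a, b) if x != y]
    n = len(diffs)
    if n == 0:
        return 0.0, 1.0
    # Rank the absolute differences, averaging ranks over ties.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for r, d in zip(ranks, diffs) if d > 0)
    w_minus = sum(r for r, d in zip(ranks, diffs) if d < 0)
    w = min(w_plus, w_minus)
    # Null distribution of W: mean n(n+1)/4, variance n(n+1)(2n+1)/24.
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w - mean) / sd
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return w, p

# Hypothetical per-dataset accuracies (%) for two selectors.
acc_selector = [86, 91, 78, 88, 93, 81, 84, 90]
acc_baseline = [83, 90, 74, 85, 94, 79, 80, 87]
w, p = wilcoxon_signed_rank(acc_selector, acc_baseline)  # W = 1.5 for this data
```

A small p-value here would indicate that the two selectors' accuracies differ systematically across the datasets rather than by chance; integer accuracy percentages are used in the example so that tied differences are detected exactly.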
Share and Cite
Yan, W.; Ba, J.; Xu, T.; Yu, H.; Shi, J.; Han, B. Beam-Influenced Attribute Selector for Producing Stable Reduct. Mathematics 2022, 10, 553. https://doi.org/10.3390/math10040553