Literature Review on Hybrid Evolutionary Approaches for Feature Selection
Abstract
1. Introduction
- The initial work [18] reviewed only a limited number of papers (10 in total) published in 2020–2021. To provide a more comprehensive overview of the field, it is important to include in the review the relevant research on hybrid FS published from 2009 to 2022.
- The current review paper broadens the scope of our research across multiple domains, covering a wide range of metaheuristic approaches and wrapper classifiers.
- The literature review presented in the current paper reflects the rapidly evolving nature of research in the field of FS; staying up to date with the latest developments is essential for providing accurate and relevant information to readers.
- Therefore, we considered it important to prepare the current updated and extended review paper, which should be of interest to researchers in the FS domain.
2. Related Work
- reducing overfitting and eliminating redundant data,
- improving accuracy and reducing misleading results, and
- reducing the ML algorithm training time, lowering the algorithm complexity, and speeding up the training process.
1. Searching Techniques: To obtain the best features with the highest accuracy, search approaches must be applied in an FS process. Exhaustive search, heuristic search, and evolutionary computation are a few popular search methods. An exhaustive search is explored in a few works [19,20]. Numerous heuristic and greedy strategies, such as sequential forward selection (SFS) [22] and sequential backward selection (SBS) [23], have been used for FS. However, both SFS and SBS suffer from the "nesting effect": once a feature is selected in SFS it cannot be discarded later, and once a feature is discarded in SBS it cannot be selected again, so earlier decisions cannot be revised in later stages of the FS process. A compromise between these two approaches is to apply SFS l times and then SBS r times [24]. Such a scheme reduces the nesting effect, but the correct values of l and r must be determined carefully. Sequential backward and forward floating methods were presented to avoid this problem [22]. A two-layer cutting plane approach was recently suggested in [23] to evaluate the best subsets of characteristics. In [24], an exhaustive FS search with backtracking and a heuristic search was proposed.
Various EC approaches have been proposed in recent years to successfully tackle the challenges of FS problems. Some of them are differential evolution (DE) [25], genetic algorithms (GAs) [26], grey wolf optimization (GWO) [27,28], ant colony optimization (ACO) [29,30,31], binary Harris hawks optimization (BHHO) [32,33] and improved BHHO (IBHHO) [34], binary ant lion optimization (BALO) [35,36], the salp swarm algorithm (SSA) [37], the dragonfly algorithm (DA) [38], the multiverse algorithm (MVA) [39], Jaya optimization algorithms such as FS based on the Jaya optimization algorithm (FSJaya) [40] and FS based on the adaptive Jaya algorithm (AJA) [41], the grasshopper optimization algorithm (GOA) and its binary versions [42], binary teaching learning-based optimization (BTLBO) [43], harmony search (HS) [44], and the vortex search algorithm (VSA) [45]. All these techniques have been applied to perform FS on various types of datasets, and they have been demonstrated to achieve high optimization rates and to increase the CA. EC techniques require no domain knowledge and make no assumption about whether the training dataset is linearly separable. Another valuable aspect of EC methods is that their population-based process can deliver several solutions in one cycle. However, EC approaches often entail considerable computational costs because they typically involve a large number of evaluations. The stability of an EC approach is also a critical concern, as the respective algorithms often pick different features in different runs. Further research is required, as the growing number of features in large-scale datasets raises computational costs and decreases the consistency of EC algorithms [13] in certain real-world FS activities. A high-level description of the most used EC algorithms is given below, following a minimal sketch of the greedy SFS wrapper discussed above.
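A minimal sketch of the greedy SFS wrapper, assuming a NumPy feature matrix X, labels y, and an illustrative KNN classifier with 5-fold cross-validation (a toy illustration of the technique, not code from any surveyed paper):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def sfs(X, y, n_select):
    """Sequential forward selection: greedily add the feature that most
    improves cross-validated accuracy."""
    selected, remaining = [], list(range(X.shape[1]))
    clf = KNeighborsClassifier(n_neighbors=5)
    while len(selected) < n_select:
        scores = {f: cross_val_score(clf, X[:, selected + [f]], y, cv=5).mean()
                  for f in remaining}
        best = max(scores, key=scores.get)
        selected.append(best)    # once added, a feature is never removed:
        remaining.remove(best)   # this is exactly the nesting effect
    return selected
```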
- Genetic Algorithm (GA): A GA [46] is a metaheuristic inspired by natural selection that belongs to the larger class of evolutionary algorithms in computer science and operations research. GAs rely on biologically inspired operators, such as mutation, crossover, and selection, to develop high-quality solutions to optimization and search challenges. Mimicking the mechanisms that govern biological evolution, a GA can tackle both constrained and unconstrained optimization problems by repeatedly adjusting a population of candidate solutions. A one-generation sketch is given below.
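A toy, one-generation sketch of these operators over bit-string feature masks (a 1 bit keeps the corresponding feature); the tournament size, crossover point, and mutation rate are illustrative assumptions:

```python
import random

def ga_generation(pop, fitness, p_mut=0.01):
    """One GA generation: binary tournament selection, one-point crossover,
    and bit-flip mutation over bit-string individuals."""
    scored = [(fitness(ind), ind) for ind in pop]
    def tournament():
        a, b = random.sample(scored, 2)
        return max(a, b, key=lambda pair: pair[0])[1]
    children = []
    while len(children) < len(pop):
        p1, p2 = tournament(), tournament()
        cut = random.randrange(1, len(p1))            # one-point crossover
        child = p1[:cut] + p2[cut:]
        child = [1 - g if random.random() < p_mut else g for g in child]
        children.append(child)
    return children
```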
- Particle Swarm Optimization (PSO): PSO is a bioinspired algorithm that is straightforward to use when looking for the best alternative in the solution space. It differs from other optimization techniques in that it requires only the objective function and is unaffected by the gradient or any differential form of the objective. It also has a small number of hyperparameters. Kennedy and Eberhart proposed PSO in 1995 [47]. Sociobiologists think that a school of fish or a flock of birds moving in a group "may profit from the experience of all other members", as stated in the original publication. In other words, while a bird is flying around looking for food at random, all of the birds in the flock can share what they find and help the entire flock get the best hunt possible. While we imitate the movement of a flock of birds, we can also assume that each bird is helping us locate the best solution in a high-dimensional solution space, with the flock's best solution being the best solution in the space. This is a heuristic approach because we can never be certain that the true global optimum has been found, and it rarely is. However, we frequently discover that the PSO solution is very close to the global optimum. The canonical update rule is sketched below.
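The inertia-weighted update rule commonly used for PSO can be written compactly as below; w, c1, and c2 are illustrative defaults rather than values prescribed by the surveyed papers, and binary PSO variants for FS map the continuous positions through a transfer (e.g., sigmoid) function:

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One PSO step: each particle is pulled toward its personal best (pbest)
    and the swarm's global best (gbest), with random weights r1, r2."""
    r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v
```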
- Grey Wolf Optimizer (GWO): Mirjalili et al. [48] presented GWO as a new metaheuristic in 2014. The grey wolf's social order and hunting mechanisms inspired the algorithm. Four types of wolves, corresponding to the levels of the social hierarchy, are considered when creating GWO:
– The α wolf: the solution having the best fitness value;
– the β wolf: the solution having the second-best fitness value;
– the δ wolf: the solution having the third-best fitness value; and
– the ω wolves: all other solutions.
As a result, the algorithm's hunting mechanism is guided by the three fittest wolves, α, β, and δ. The remaining wolves are regarded as ω and follow them. Grey wolves follow a set of well-defined steps during hunting: encircling, hunting, and attacking; the hierarchy-guided position update is sketched after the HHO description below.
- Harris Hawks Optimization (HHO): Heidari and his team introduced HHO as a new metaheuristic algorithm in 2019 [49]. HHO models the prey investigation, surprise pounce, and diverse assault techniques used by Harris hawks in nature. Hawks represent candidate solutions in HHO, whereas the prey represents the best solution. The Harris hawks use their keen vision to track the target and then conduct a surprise pounce to seize the prey they have spotted. In general, HHO is divided into two phases: exploration and exploitation. The algorithm switches from exploration to exploitation, and the search behaviour is then adjusted, depending on the fleeing prey's energy.
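A compact reading of the hierarchy-guided GWO position update described above (an illustrative sketch of the standard equations in [48], not code from any surveyed hybrid); the coefficient a conventionally decays linearly from 2 to 0 over the iterations:

```python
import numpy as np

def gwo_step(wolves, alpha, beta, delta, a):
    """Move each wolf toward the mean of the positions suggested by the
    three leaders (alpha, beta, delta), per the standard GWO equations."""
    updated = []
    for x in wolves:
        candidates = []
        for leader in (alpha, beta, delta):
            r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
            A, C = 2 * a * r1 - a, 2 * r2      # coefficient vectors
            D = np.abs(C * leader - x)         # distance to the leader
            candidates.append(leader - A * D)  # position the leader suggests
        updated.append(np.mean(candidates, axis=0))
    return updated
```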
2. Criteria for Evaluation: The common evaluation criteria for wrapper FS techniques are the classification efficiency and effectiveness achieved with the selected attributes. Decision trees (DTs), support vector machines (SVMs), naive Bayes (NB), k-nearest neighbor (KNN), artificial neural networks (ANNs), and linear discriminant analysis (LDA) are just a few examples of common classifiers that have been used as wrappers in FS applications [50,51,52]. In the domain of filter approaches, measurements from a variety of disciplines have been incorporated, particularly information theory, correlation estimates, distance metrics, and consistency criteria [53]. Individual feature evaluation, relying on a particular aspect, is a basic filter approach in which only the top-tier features are selected [50]. Relief [54] is a distinctive case in which a distance metric is applied to assess the significance of features. Filter methods are often computationally inexpensive, but they do not consider attribute relationships, which often leads to complications for repetitive feature sets, such as microarray gene data, where the genes are intrinsically correlated [21,53]. To overcome these issues, proper filter measurements must be performed to choose a suitable subset of relevant features that represents the whole feature set. Wang et al. [55] recently published a distance measure that assesses the difference between the chosen feature space and the space spanned by all features in order to locate a subset of features that approximates all features. Peng et al. [56] introduced the minimum redundancy maximum relevance (MRMR) approach based on mutual information, and such measures have also been adopted in EC approaches because of their powerful exploration capability [57,58]; a small MRMR-style sketch is given after this item. A unified selection approach was proposed by Mao and Tsang [23], which optimizes multivariate performance measures but also results in an enormous search space for high-dimensional data, a problem that requires strong heuristic search methods for finding the best output. There are several relatively straightforward statistical methods, such as the t-test, logistic regression (LR), hierarchical clustering, and classification and regression trees (CART), which can be applied jointly to produce better classification results [59]. Recently, the authors of [60] applied sparse LR to FS problems involving millions of features. Min et al. [24] developed a rough-set-based procedure to solve FS tasks under budget and schedule constraints. Many experiments show that most filter mechanisms are inefficient for cases with vast numbers of features [61].
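As a small, concrete filter example, the sketch below approximates the greedy MRMR idea of Peng et al. [56]: relevance is mutual information with the class label, while redundancy is approximated here by absolute Pearson correlation (a simplification assumed for brevity; the original MRMR uses mutual information for both terms):

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def mrmr(X, y, n_select):
    """Greedily pick features maximising (relevance - mean redundancy)."""
    relevance = mutual_info_classif(X, y)          # MI with the class label
    corr = np.abs(np.corrcoef(X, rowvar=False))    # feature-feature |r|
    selected = [int(np.argmax(relevance))]
    while len(selected) < n_select:
        candidates = [f for f in range(X.shape[1]) if f not in selected]
        scores = [relevance[f] - corr[f, selected].mean() for f in candidates]
        selected.append(candidates[int(np.argmax(scores))])
    return selected
```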
3. Number of Objectives: Single-objective (SO) optimization frameworks are techniques which combine the classifier's accuracy and the number of features into a single optimization function. In contrast, multiobjective (MO) optimization approaches entail techniques designed to find and balance the tradeoffs among alternatives. In an SO situation, a solution's superiority over other solutions is determined by comparing the resulting fitness values, while in MO optimization the dominance notion is employed to obtain the best results [62]. In particular, to determine the significance of the derived feature sets in an MO situation, multiple criteria need to be optimized by considering different parameters. MO strategies thus may be used to solve challenging problems involving multiple conflicting goals [63], and MO optimization comprises fitness functions that minimize or maximize multiple conflicting goals. For example, a typical MO problem with minimization functions can be expressed mathematically as follows:
minimize F(x) = [f_1(x), f_2(x), …, f_m(x)], subject to x ∈ Ω,
where m is the number of objectives and Ω is the feasible search space. Finding the balance among the competing objectives is the process that identifies the dominance of an MO optimization approach. For example, a solution x_1 dominates another solution x_2 in a minimization problem if and only if f_i(x_1) ≤ f_i(x_2) for every objective i and f_j(x_1) < f_j(x_2) for at least one objective j. A code sketch of this dominance test is given after this item.
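The dominance relation above translates directly into code. A minimal sketch, assuming every objective is minimised, e.g., f = (error rate, number of selected features):

```python
def dominates(f1, f2):
    """Return True if objective vector f1 Pareto-dominates f2, i.e., f1 is no
    worse in every objective and strictly better in at least one
    (all objectives minimised)."""
    return (all(a <= b for a, b in zip(f1, f2))
            and any(a < b for a, b in zip(f1, f2)))

# Example: a subset with a lower error rate and the same size dominates.
assert dominates((0.10, 5), (0.12, 5))
assert not dominates((0.10, 7), (0.12, 5))  # conflicting objectives: neither dominates
```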
3. A Brief Survey
Search Procedure
1. What search approaches were utilised to find the best features?
2. What search algorithms were utilised to choose the best features for classification?
3. What hybrid search approaches were utilised to choose the best features for classification?
- Inclusion Criteria:
– Research articles on hybrid evolutionary FS must have been published between 2009 and 2022.
– Only research published in peer-reviewed venues is included.
– If a study was published in more than one venue, the most complete version is selected for inclusion.
– Only related works utilised for classification are included.
- Exclusion Criteria:
– Research articles published prior to 2009 are not included.
– Papers unrelated to the search topic are rejected.
– Only papers written in English are considered; papers in other languages are excluded.
4. Analysis and Discussion
- The fitness value focused only on the error rate and not on the number of features (P5 [68]).
- The hybrid methods take longer to execute (P35 [17]).
- The capability of newly developed methods has not been thoroughly explored, particularly in terms of their scalability, and therefore additional research is suggested for FS in high-dimensional real-world applications.
- Since computation complexity is one of the key issues in most hybrid approaches for FS, it is recommended that more appropriate measures to reduce computational complexity should be proposed. Two key considerations must be weighed in order to do so: (1) more efficient ways to perform searching in the large solution spaces and (2) faster evaluation tools.
- The FS priorities, such as computational burden and space complexity, can indeed be viewed in combination with the two main objectives of the hybrid FS problem (i.e., exploration and exploitation).
- Proposing new methodologies that smooth the fitness landscape will significantly reduce the problem's complexity and motivate the development of more effective search strategies.
- Most of the existing studies in the literature used only one fitness function. However, FS can be viewed as an MO problem and thus, the application of hybridization in multiobjective FS tasks is an open research domain for researchers.
- As hybrid FS techniques are time-consuming compared to other approaches, employing parallel processing during the FS phase is also an area of research to be explored; a minimal sketch is given after this list.
- Most of the abovementioned articles are wrapper-based; however, the optimal solutions generated by wrapper approaches are less generic. Therefore, a hybrid–hybrid approach (i.e., hybridising filter and wrapper criteria while mixing evolutionary techniques) for FS is a challenging research domain.
- Feature selection plays a vital role in the biomedical area due to the high dimensionality of the data. However, very few works (22%) explored their techniques in this field. Therefore, the application of hybrid FS techniques to biomedical data is a promising research area for the future.
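As a minimal illustration of the parallel-processing suggestion above, wrapper fitness evaluations are independent across a population, so they can be distributed over worker processes; the fitness callable here is a hypothetical placeholder and must be a picklable top-level function:

```python
from multiprocessing import Pool

def evaluate_population(population, fitness):
    """Evaluate candidate feature subsets in parallel. In wrapper FS, each
    fitness call trains and validates a classifier, which dominates the
    runtime, so process-level parallelism gives near-linear speedups."""
    with Pool() as pool:
        return pool.map(fitness, population)

# Usage (inside an `if __name__ == "__main__":` guard on platforms that
# spawn processes): scores = evaluate_population(masks, my_fitness)
```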
5. Conclusions and Future Work
Author Contributions
Funding
Conflicts of Interest
References
1. Piri, J.; Mohapatra, P.; Dey, R. Fetal Health Status Classification Using MOGA—CD Based Feature Selection Approach. In Proceedings of the IEEE International Conference on Electronics, Computing and Communication Technologies (CONECCT), Bangalore, India, 2–4 July 2020; pp. 1–6.
2. Bhattacharyya, T.; Chatterjee, B.; Singh, P.K.; Yoon, J.H.; Geem, Z.W.; Sarkar, R. Mayfly in Harmony: A New Hybrid Meta-Heuristic Feature Selection Algorithm. IEEE Access 2020, 8, 195929–195945.
3. Piri, J.; Mohapatra, P. Exploring Fetal Health Status Using an Association Based Classification Approach. In Proceedings of the IEEE International Conference on Information Technology (ICIT), Bhubaneswar, India, 19–21 December 2019; pp. 166–171.
4. Piri, J.; Mohapatra, P.; Acharya, B.; Gharehchopogh, F.S.; Gerogiannis, V.C.; Kanavos, A.; Manika, S. Feature Selection Using Artificial Gorilla Troop Optimization for Biomedical Data: A Case Analysis with COVID-19 Data. Mathematics 2022, 10, 2742.
5. Jain, D.; Singh, V. Diagnosis of Breast Cancer and Diabetes using Hybrid Feature Selection Method. In Proceedings of the 5th International Conference on Parallel, Distributed and Grid Computing (PDGC), Solan, India, 20–22 December 2018; pp. 64–69.
6. Mendiratta, S.; Turk, N.; Bansal, D. Automatic Speech Recognition using Optimal Selection of Features based on Hybrid ABC-PSO. In Proceedings of the IEEE International Conference on Inventive Computation Technologies (ICICT), Coimbatore, India, 26–27 August 2016; Volume 2, pp. 1–7.
7. Naik, A.; Kuppili, V.; Edla, D.R. Binary Dragonfly Algorithm and Fisher Score Based Hybrid Feature Selection Adopting a Novel Fitness Function Applied to Microarray Data. In Proceedings of the International IEEE Conference on Applied Machine Learning (ICAML), Bhubaneswar, India, 27–28 September 2019; pp. 40–43.
8. Monica, K.M.; Parvathi, R. Hybrid FOW—A Novel Whale Optimized Firefly Feature Selector for Gait Analysis. Pers. Ubiquitous Comput. 2021, 1–13.
9. Azmi, R.; Pishgoo, B.; Norozi, N.; Koohzadi, M.; Baesi, F. A Hybrid GA and SA Algorithms for Feature Selection in Recognition of Hand-printed Farsi Characters. In Proceedings of the IEEE International Conference on Intelligent Computing and Intelligent Systems, Xiamen, China, 29–31 October 2010; Volume 3, pp. 384–387.
10. Al-Tashi, Q.; Abdulkadir, S.J.; Rais, H.M.; Mirjalili, S.; Alhussian, H. Approaches to Multi-Objective Feature Selection: A Systematic Literature Review. IEEE Access 2020, 8, 125076–125096.
11. Brezočnik, L.; Fister, I.; Podgorelec, V. Swarm Intelligence Algorithms for Feature Selection: A Review. Appl. Sci. 2018, 8, 1521.
12. Venkatesh, B.; Anuradha, J. A Review of Feature Selection and Its Methods. Cybern. Inf. Technol. 2019, 19, 3–26.
13. Abd-Alsabour, N. A Review on Evolutionary Feature Selection. In Proceedings of the IEEE European Modelling Symposium, Pisa, Italy, 21–23 October 2014; pp. 20–26.
14. Wolpert, D.H.; Macready, W.G. No Free Lunch Theorems for Optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82.
15. Cheng, M.Y.; Prayogo, D. Symbiotic Organisms Search: A New Metaheuristic Optimization Algorithm. Comput. Struct. 2014, 139, 98–112.
16. Singh, N.; Son, L.H.; Chiclana, F.; Magnot, J.P. A New Fusion of Salp Swarm with Sine Cosine for Optimization of Non-Linear Functions. Eng. Comput. 2020, 36, 185–212.
17. Piri, J.; Mohapatra, P.; Singh, H.K.R.; Acharya, B.; Patra, T.K. An Enhanced Binary Multiobjective Hybrid Filter-Wrapper Chimp Optimization Based Feature Selection Method for COVID-19 Patient Health Prediction. IEEE Access 2022, 10, 100376–100396.
18. Piri, J.; Mohapatra, P.; Dey, R.; Panda, N. Role of Hybrid Evolutionary Approaches for Feature Selection in Classification: A Review. In Proceedings of the International Conference on Metaheuristics in Software Engineering and its Application, Marrakech, Morocco, 27–30 October 2022; pp. 92–103.
19. Blum, A.; Langley, P. Selection of Relevant Features and Examples in Machine Learning. Artif. Intell. 1997, 97, 245–271.
20. Liu, H.; Motoda, H. Feature Selection for Knowledge Discovery and Data Mining; The Springer International Series in Engineering and Computer Science; Springer: Berlin/Heidelberg, Germany, 1998; Volume 454.
21. Guyon, I.; Elisseeff, A. An Introduction to Variable and Feature Selection. J. Mach. Learn. Res. 2003, 3, 1157–1182.
22. Pudil, P.; Novovicová, J.; Kittler, J. Floating Search Methods in Feature Selection. Pattern Recognit. Lett. 1994, 15, 1119–1125.
23. Mao, Q.; Tsang, I.W. A Feature Selection Method for Multivariate Performance Measures. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 2051–2063.
24. Min, F.; Hu, Q.; Zhu, W. Feature Selection with Test Cost Constraint. Int. J. Approx. Reason. 2014, 55, 167–179.
25. Vivekanandan, T.; Iyengar, N.C.S.N. Optimal Feature Selection using a Modified Differential Evolution Algorithm and its Effectiveness for Prediction of Heart Disease. Comput. Biol. Med. 2017, 90, 125–136.
26. Sahebi, G.; Movahedi, P.; Ebrahimi, M.; Pahikkala, T.; Plosila, J.; Tenhunen, H. GeFeS: A Generalized Wrapper Feature Selection Approach for Optimizing Classification Performance. Comput. Biol. Med. 2020, 125, 103974.
27. Al-Tashi, Q.; Rais, H.; Jadid, S. Feature Selection Method Based on Grey Wolf Optimization for Coronary Artery Disease Classification. In Proceedings of the International Conference of Reliable Information and Communication Technology, Kuala Lumpur, Malaysia, 23–24 July 2018; pp. 257–266.
28. Too, J.; Abdullah, A.R. Opposition based Competitive Grey Wolf Optimizer for EMG Feature Selection. Evol. Intell. 2021, 14, 1691–1705.
29. Aghdam, M.H.; Ghasem-Aghaee, N.; Basiri, M.E. Text Feature Selection using Ant Colony Optimization. Expert Syst. Appl. 2009, 36, 6843–6853.
30. Erguzel, T.T.; Tas, C.; Cebi, M. A Wrapper-based Approach for Feature Selection and Classification of Major Depressive Disorder-Bipolar Disorders. Comput. Biol. Med. 2015, 64, 127–137.
31. Huang, H.; Xie, H.; Guo, J.; Chen, H. Ant Colony Optimization-based Feature Selection Method for Surface Electromyography Signals Classification. Comput. Biol. Med. 2012, 42, 30–38.
32. Piri, J.; Mohapatra, P. An Analytical Study of Modified Multi-objective Harris Hawk Optimizer towards Medical Data Feature Selection. Comput. Biol. Med. 2021, 135, 104558.
33. Too, J.; Abdullah, A.R.; Saad, N.M. A New Quadratic Binary Harris Hawk Optimization for Feature Selection. Electronics 2019, 8, 1130.
34. Zhang, Y.; Liu, R.; Wang, X.; Chen, H.; Li, C. Boosted Binary Harris Hawks Optimizer and Feature Selection. Eng. Comput. 2021, 37, 3741–3770.
35. Emary, E.; Zawbaa, H.M.; Hassanien, A.E. Binary Ant Lion Approaches for Feature Selection. Neurocomputing 2016, 213, 54–65.
36. Piri, J.; Mohapatra, P.; Dey, R. Multi-objective Ant Lion Optimization Based Feature Retrieval Methodology for Investigation of Fetal Wellbeing. In Proceedings of the 3rd IEEE International Conference on Inventive Research in Computing Applications (ICIRCA), Coimbatore, India, 21–23 September 2021; pp. 1732–1737.
37. Hegazy, A.E.; Makhlouf, M.A.A.; El-Tawel, G.S. Improved Salp Swarm Algorithm for Feature Selection. J. King Saud Univ.-Comput. Inf. Sci. 2020, 32, 335–344.
38. Mafarja, M.M.; Aljarah, I.; Heidari, A.A.; Faris, H.; Fournier-Viger, P.; Li, X.; Mirjalili, S. Binary Dragonfly Optimization for Feature Selection using Time-varying Transfer Functions. Knowl. Based Syst. 2018, 161, 185–204.
39. Sreejith, S.; Nehemiah, H.K.; Kannan, A. Clinical Data Classification using an Enhanced SMOTE and Chaotic Evolutionary Feature Selection. Comput. Biol. Med. 2020, 126, 103991.
40. Das, H.; Naik, B.; Behera, H.S. A Jaya Algorithm based Wrapper Method for Optimal Feature Selection in Supervised Classification. J. King Saud Univ.-Comput. Inf. Sci. 2020, 34, 3851–3863.
41. Tiwari, V.; Jain, S.C. An Optimal Feature Selection Method for Histopathology Tissue Image Classification using Adaptive Jaya Algorithm. Evol. Intell. 2021, 14, 1279–1292.
42. Haouassi, H.; Merah, E.; Rafik, M.; Messaoud, M.T.; Chouhal, O. A New Binary Grasshopper Optimization Algorithm for Feature Selection Problem. J. King Saud Univ.-Comput. Inf. Sci. 2022, 34, 316–328.
43. Mohan, A.; Nandhini, M. Optimal Feature Selection using Binary Teaching Learning based Optimization Algorithm. J. King Saud Univ.-Comput. Inf. Sci. 2022, 34, 329–341.
44. Dash, R. An Adaptive Harmony Search Approach for Gene Selection and Classification of High Dimensional Medical Data. J. King Saud Univ.-Comput. Inf. Sci. 2021, 33, 195–207.
45. Gharehchopogh, F.S.; Maleki, I.; Dizaji, Z.A. Chaotic Vortex Search Algorithm: Metaheuristic Algorithm for Feature Selection. Evol. Intell. 2022, 15, 1777–1808.
46. Mitchell, M. An Introduction to Genetic Algorithms; MIT Press: Cambridge, MA, USA, 1998.
47. Kennedy, J.; Eberhart, R. Particle Swarm Optimization. In Proceedings of the IEEE International Conference on Neural Networks (ICNN), Perth, WA, Australia, 27 November–1 December 1995; pp. 1942–1948.
48. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
49. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.M.; Chen, H. Harris Hawks Optimization: Algorithm and Applications. Future Gener. Comput. Syst. 2019, 97, 849–872.
50. Liu, H.; Zhao, Z. Manipulating Data and Dimension Reduction Methods: Feature Selection. In Encyclopedia of Complexity and Systems Science; Springer: Berlin/Heidelberg, Germany, 2009; pp. 5348–5359.
51. Liu, H.; Motoda, H.; Setiono, R.; Zhao, Z. Feature Selection: An Ever Evolving Frontier in Data Mining. In Proceedings of the 4th International Workshop on Feature Selection in Data Mining (FSDM), Hyderabad, India, 21 June 2010; Volume 10, pp. 4–13.
52. Xue, B.; Zhang, M.; Browne, W.N. Particle Swarm Optimization for Feature Selection in Classification: A Multi-Objective Approach. IEEE Trans. Cybern. 2013, 43, 1656–1671.
53. Dash, M.; Liu, H. Feature Selection for Classification. Intell. Data Anal. 1997, 1, 131–156.
54. Kira, K.; Rendell, L.A. A Practical Approach to Feature Selection. In Proceedings of the 9th International Workshop on Machine Learning (ML), San Francisco, CA, USA, 1–3 July 1992; Morgan Kaufmann: Burlington, MA, USA, 1992; pp. 249–256.
55. Wang, S.; Pedrycz, W.; Zhu, Q.; Zhu, W. Subspace Learning for Unsupervised Feature Selection via Matrix Factorization. Pattern Recognit. 2015, 48, 10–19.
56. Peng, H.; Long, F.; Ding, C.H.Q. Feature Selection Based on Mutual Information: Criteria of Max-Dependency, Max-Relevance, and Min-Redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 2005, 27, 1226–1238.
57. Cervante, L.; Xue, B.; Zhang, M.; Shang, L. Binary Particle Swarm Optimisation for Feature Selection: A Filter based Approach. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC), Brisbane, Australia, 10–15 June 2012; pp. 1–8.
58. Ünler, A.; Murat, A.E.; Chinnam, R.B. mr2PSO: A Maximum Relevance Minimum Redundancy Feature Selection Method based on Swarm Intelligence for Support Vector Machine Classification. Inf. Sci. 2011, 181, 4625–4641.
59. Tan, N.C.; Fisher, W.G.; Rosenblatt, K.P.; Garner, H.R. Application of Multiple Statistical Tests to Enhance Mass Spectrometry-based Biomarker Discovery. BMC Bioinform. 2009, 10, 144.
60. Tan, M.; Tsang, I.W.; Wang, L. Minimax Sparse Logistic Regression for Very High-Dimensional Feature Selection. IEEE Trans. Neural Netw. Learn. Syst. 2013, 24, 1609–1622.
61. Zhai, Y.; Ong, Y.; Tsang, I.W. The Emerging "Big Dimensionality". IEEE Comput. Intell. Mag. 2014, 9, 14–26.
62. Thiele, L.; Miettinen, K.; Korhonen, P.J.; Luque, J.M. A Preference-Based Evolutionary Algorithm for Multi-Objective Optimization. Evol. Comput. 2009, 17, 411–436.
63. Bui, L.T.; Alam, S. Multi-Objective Optimization in Computational Intelligence: Theory and Practice; IGI Global: Hershey, PA, USA, 2008.
64. Al-Wajih, R.; Abdulkadir, S.J.; Alhussian, H.; Aziz, N.; Al-Tashi, Q.; Mirjalili, S.; Alqushaibi, A. Hybrid Binary Whale with Harris Hawks for Feature Selection. Neural Comput. Appl. 2022, 34, 19377–19395.
65. Ajibade, S.S.M.; Ahmad, N.B.B.; Zainal, A. A Hybrid Chaotic Particle Swarm Optimization with Differential Evolution for Feature Selection. In Proceedings of the IEEE Symposium on Industrial Electronics & Applications (ISIEA), Kristiansand, Norway, 9–13 November 2020; pp. 1–6.
66. Ahmed, S.; Ghosh, K.K.; Singh, P.K.; Geem, Z.W.; Sarkar, R. Hybrid of Harmony Search Algorithm and Ring Theory-Based Evolutionary Algorithm for Feature Selection. IEEE Access 2020, 8, 102629–102645.
67. Bezdan, T.; Zivkovic, M.; Bacanin, N.; Chhabra, A.; Suresh, M. Feature Selection by Hybrid Brain Storm Optimization Algorithm for COVID-19 Classification. J. Comput. Biol. 2022, 29, 515–529.
68. Lee, C.; Le, T.; Lin, Y. A Feature Selection Approach Hybrid Grey Wolf and Heap-Based Optimizer Applied in Bearing Fault Diagnosis. IEEE Access 2022, 10, 56691–56705.
69. Thawkar, S. Feature Selection and Classification in Mammography using Hybrid Crow Search Algorithm with Harris Hawks Optimization. Biocybern. Biomed. Eng. 2022, 42, 1094–1111.
70. El-Kenawy, E.S.; Eid, M. Hybrid Gray Wolf and Particle Swarm Optimization for Feature Selection. Int. J. Innov. Comput. Inf. Control 2020, 16, 831–844.
71. Al-Tashi, Q.; Abdulkadir, S.J.; Rais, H.M.; Mirjalili, S.; Alhussian, H. Binary Optimization Using Hybrid Grey Wolf Optimization for Feature Selection. IEEE Access 2019, 7, 39496–39508.
72. Jia, H.; Xing, Z.; Song, W. A New Hybrid Seagull Optimization Algorithm for Feature Selection. IEEE Access 2019, 7, 49614–49631.
73. Jia, H.; Li, J.; Song, W.; Peng, X.; Lang, C.; Li, Y. Spotted Hyena Optimization Algorithm with Simulated Annealing for Feature Selection. IEEE Access 2019, 7, 71943–71962.
74. Aziz, M.A.E.; Ewees, A.A.; Ibrahim, R.A.; Lu, S. Opposition-based Moth-flame Optimization Improved by Differential Evolution for Feature Selection. Math. Comput. Simul. 2020, 168, 48–75.
75. Arora, S.; Singh, H.; Sharma, M.; Sharma, S.; Anand, P. A New Hybrid Algorithm Based on Grey Wolf Optimization and Crow Search Algorithm for Unconstrained Function Optimization and Feature Selection. IEEE Access 2019, 7, 26343–26361.
76. Tawhid, M.A.; Dsouza, K.B. Hybrid Binary Bat Enhanced Particle Swarm Optimization Algorithm for Solving Feature Selection Problems. Appl. Comput. Inform. 2018, 16, 117–136.
77. Rajamohana, S.P.; Umamaheswari, K. Hybrid Approach of Improved Binary Particle Swarm Optimization and Shuffled Frog Leaping for Feature Selection. Comput. Electr. Eng. 2018, 67, 497–508.
78. Elaziz, M.E.A.; Ewees, A.A.; Oliva, D.; Duan, P.; Xiong, S. A Hybrid Method of Sine Cosine Algorithm and Differential Evolution for Feature Selection. In Proceedings of the 24th International Conference on Neural Information Processing (ICONIP), Guangzhou, China, 14–18 November 2017; Volume 10638, pp. 145–155.
79. Mafarja, M.M.; Mirjalili, S. Hybrid Whale Optimization Algorithm with Simulated Annealing for Feature Selection. Neurocomputing 2017, 260, 302–312.
80. Menghour, K.; Souici-Meslati, L. Hybrid ACO-PSO Based Approaches for Feature Selection. Int. J. Intell. Eng. Syst. 2016, 9, 65–79.
81. Hafez, A.I.; Hassanien, A.E.; Zawbaa, H.M.; Emary, E. Hybrid Monkey Algorithm with Krill Herd Algorithm Optimization for Feature Selection. In Proceedings of the 11th IEEE International Computer Engineering Conference (ICENCO), Cairo, Egypt, 29–30 December 2015; pp. 273–277.
82. Nemati, S.; Basiri, M.E.; Ghasem-Aghaee, N.; Aghdam, M.H. A Novel ACO-GA Hybrid Algorithm for Feature Selection in Protein Function Prediction. Expert Syst. Appl. 2009, 36, 12086–12094.
83. Chuang, L.; Yang, C.; Yang, C. Tabu Search and Binary Particle Swarm Optimization for Feature Selection Using Microarray Data. J. Comput. Biol. 2009, 16, 1689–1703.
84. Kumar, L.; Bharti, K.K. A Novel Hybrid BPSO-SCA Approach for Feature Selection. Nat. Comput. 2021, 20, 39–61.
85. Moslehi, F.; Haeri, A. A Novel Hybrid Wrapper-filter Approach based on Genetic Algorithm, Particle Swarm Optimization for Feature Subset Selection. J. Ambient Intell. Humaniz. Comput. 2020, 11, 1105–1127.
86. Zawbaa, H.M.; Emary, E.; Grosan, C.; Snásel, V. Large-dimensionality Small-instance Set Feature Selection: A Hybrid Bio-inspired Heuristic Approach. Swarm Evol. Comput. 2018, 42, 29–42.
87. Abualigah, L.M.; Diabat, A. A Novel Hybrid Antlion Optimization Algorithm for Multi-objective Task Scheduling Problems in Cloud Computing Environments. Clust. Comput. 2021, 24, 205–223.
88. Adamu, A.; Abdullahi, M.; Junaidu, S.B.; Hassan, I.H. An Hybrid Particle Swarm Optimization with Crow Search Algorithm for Feature Selection. Mach. Learn. Appl. 2021, 6, 100108.
89. Thawkar, S. A Hybrid Model using Teaching-learning-based Optimization and Salp Swarm Algorithm for Feature Selection and Classification in Digital Mammography. J. Ambient Intell. Humaniz. Comput. 2021, 12, 8793–8808.
90. Houssein, E.H.; Hosney, M.E.; Elhoseny, M.; Oliva, D.; Mohamed, W.M.; Hassaballah, M. Hybrid Harris Hawks Optimization with Cuckoo Search for Drug Design and Discovery in Chemoinformatics. Sci. Rep. 2020, 10, 1–22.
91. Hussain, K.; Neggaz, N.; Zhu, W.; Houssein, E.H. An Efficient Hybrid Sine-cosine Harris Hawks Optimization for Low and High-dimensional Feature Selection. Expert Syst. Appl. 2021, 176, 114778.
92. Al-Wajih, R.; Abdulkadir, S.J.; Aziz, N.; Al-Tashi, Q.; Talpur, N. Hybrid Binary Grey Wolf with Harris Hawks Optimizer for Feature Selection. IEEE Access 2021, 9, 31662–31677.
93. Shunmugapriya, P.; Kanmani, S. A Hybrid Algorithm using Ant and Bee Colony Optimization for Feature Selection and Classification (AC-ABC Hybrid). Swarm Evol. Comput. 2017, 36, 27–36.
94. Zorarpaci, E.; Özel, S.A. A Hybrid Approach of Differential Evolution and Artificial Bee Colony for Feature Selection. Expert Syst. Appl. 2016, 62, 91–103.
95. Jona, J.B.; Nagaveni, N. Ant-cuckoo Colony Optimization for Feature Selection in Digital Mammogram. Pak. J. Biol. Sci. 2014, 17, 266–271.
96. Abdmouleh, Z.; Gastli, A.; Ben-Brahim, L.; Haouari, M.; Al-Emadi, N.A. Review of Optimization Techniques applied for the Integration of Distributed Generation from Renewable Energy Sources. Renew. Energy 2017, 113, 266–280.
Searching Techniques | |
---|---|
ABC | Artificial Bee Colony Algorithm |
ACO | Ant Colony Optimization |
AFSA | Artificial Fish-Swarm Algorithm |
AJA | Adaptive Jaya Algorithm |
ALO | Ant Lion Optimization |
Ant–Cuckoo | Ant Colony Optimization-Cuckoo Search |
ASO | Atom Search Optimization |
BALO | Binary Ant Lion Optimization |
BBBC | Big Bang Big Crunch |
BGWO | Binary Grey Wolf Optimization |
BGWOPSO | Binary Grey Wolf Optimization-Particle Swarm Optimization |
BHHO | Binary Harris Hawks Optimization |
BPSO | Binary Particle Swarm Optimization |
BSA | Backtracking Optimization Search Algorithm |
BSO | Brain Storm Optimization |
BTLBO | Binary Teaching Learning-Based Optimization |
CFO | Central Force Optimization |
ChOA | Chimp Optimization Algorithm |
CRO | Chemical Reaction Optimization |
CS | Cuckoo Search |
CSA | Crow Search Algorithm |
CSO | Curved Space Optimization |
CSS | Charged System Search |
DA | Dragonfly Algorithm |
DE | Differential Evolution |
DPO | Dolphin Partner Optimization |
DSA | Differential Search Algorithm |
FA | Firefly Algorithm |
FLA | Frog Leaping Algorithm |
FSJaya | Feature Selection Based on the Jaya Optimization Algorithm |
GA | Genetic Algorithm |
GOA | Grasshopper Optimization Algorithm |
GSA | Gravitational Search Algorithm |
GSO | Group Search Optimizer |
GWO | Grey Wolf Optimization |
HBBEPSO | Hybrid Binary Bat Enhanced Particle Swarm Optimization Algorithm |
HBO | Heap-Based Optimizer |
HBPSOSCA | Hybrid Binary Particle Swarm Optimization and Sine Cosine Algorithm |
HHO | Harris Hawks Optimization |
HS | Harmony Search |
IBHHO | Improved Binary Harris Hawks Optimization |
ISA | Interior Search Algorithm |
JA | Jaya Algorithm |
KHA | Krill Herd Algorithm |
LCA | League Championship Algorithm |
MA | Monkey Algorithm |
MAKHA | Monkey–Krill Herd Algorithm |
MBA | Mine Blast Algorithm |
MFO | Moth–Flame Optimization |
MOChOA | Multiobjective Chimp Optimization |
MPA | Marine Predators Algorithm |
MVA | Multiverse Algorithm |
PSO | Particle Swarm Optimization |
QE | Queen Bee Evolution |
RTEA | Ring Theory-Based Evolutionary Algorithm |
RTHS | Ring Theory-Based Harmony Search |
SA | Simulated Annealing |
SBS | Sequential Backward Selection |
SCA | Sine Cosine Algorithm |
SDO | Supply–Demand-Based Optimization |
SFLA | Shuffled Frog Leaping Algorithm |
SFS | Sequential Forward Selection |
SHO | Spotted Hyena Optimization |
SHO-SA | Spotted Hyena Optimization-Simulated Annealing |
SOA | Seagull Optimization Algorithm |
SPO | Stochastic Paint Optimizer |
SSA | Salp Swarm Algorithm |
TEO | Thermal Exchange Optimizer |
TLBO | Teaching Learning-Based Optimization |
TS | Tabu Search |
VSA | Vortex Search Algorithm |
WOA | Whale Optimization Algorithm |
Machine Learning Algorithms | |
ANN | Artificial Neural Network |
CART | Classification And Regression Tree |
DT | Decision Tree |
KNN | k-Nearest Neighbor |
LDA | Linear Discriminant Analysis |
LR | Logistic Regression |
NB | Naive Bayes |
RF | Random Forest |
SVM | Support Vector Machine |
Performance Metrics | |
ACC | Accuracy |
AUROC | Area Under the Receiver Operating Characteristic |
BF | Best Fitness |
CA | Classification Accuracy |
CE | Classification Error |
FDR | False Discovery Rate |
FNR | False Negative Rate |
FPR | False Positive Rate |
FSc | F-Score |
IGD | Inverted Generational Distance |
MCC | Matthews Correlation Coefficient |
MF | Mean Fitness |
MSE | Mean Square Error |
NFE | Number of Function Evaluations |
NPV | Negative Predictive Value |
NSF | Number of Selected Features |
PA | Predictive Accuracy |
PPV | Positive Predictive Value |
PR | Precision |
RE | Recall |
RT | Running Time |
SN | Sensitivity |
SP | Specificity |
SR | Selection Ratio |
STD | Standard Deviation |
WF | Worst Fitness |
Parameters | |
DBI | Davies–Bouldin Index |
DI | Dunn Index |
SI | Silhouette Index |
Others | |
EC | Evolutionary Computation |
FS | Feature Selection |
ML | Machine Learning |
MO | Multiobjective |
MRMR | Minimum Redundancy Maximum Relevance |
OBL | Opposition-Based Learning |
SO | Single Objective |
WRS | Wilcoxon’s Rank Sum |
Paper | Year | Aim | Experimental Evaluation | Assessment Metrics |
---|---|---|---|---|
P1 [64] | 2022 | The intention of this article is to design a simplified and functional hybrid algorithm for FS by combining the simplicity of the WOA with the stochastic nature of HHO. | Experimental findings on 18 benchmark datasets reveal that the proposed hybrid method enhances the performance of the conventional WOA in terms of ACC, selected feature count, and execution time. | CA, MF, BF, WF, NSF, and RT. |
P2 [65] | 2020 | This study attempts to combine DE and chaotic dynamic weight particle swarm optimization (CHPSO) in an effort to enhance CHPSO. | According to the simulation outputs, CHPSO-DE outperforms other solutions at solving the FS challenge in practice. | Average NFE. |
P3 [66] | 2020 | This study presented a novel hybrid FS model called RTHS, which is based on the HS metaheuristic and RTEA. | The RTHS approach was applied on 18 standard datasets from the UCI Machine Learning Repository and compared with 10 popular evolutionary FS methods. The findings indicated that the RTHS approach is more effective than the considered approaches. | CA, PR, RE, FSc, and AUROC. |
P4 [67] | 2022 | The aim of this research is to use a unique hybridised wrapper-based brain storm optimization-firefly algorithm (BSO-FA) strategy in order to enhance the FS technique and produce improved classification outcomes on standard UCI datasets, including a publicly available dataset with COVID-19 patient health data. | On 21 UCI datasets, the suggested approach is assessed and contrasted with 11 optimization techniques. The proposed approach is also used for a dataset related to coronavirus disease. The observed experimental findings support the robustness of the suggested hybrid model. In comparison to other methods in the literature, it effectively decreases and chooses the features and also produces better CA. | CA and MF. |
P5 [68] | 2022 | The objective of this study is to apply a hybridised grey wolf optimization-heap-based optimizer (GWO-HBO) methodology as a wrapper for the FS phase of a fault-diagnosis system. | The suggested approach is validated on four separate datasets to ensure its efficiency. The proposed approach is compared to three methods, namely BGWO, BPSO, and GA, and the test results may attest to its predictability. | CA. |
P6 [69] | 2022 | This work aims at designing a hybrid optimization technique based on the CSA and the HHO for selecting features and mass categorization in digital mammograms. | The strategy was tested by using 651 mammograms. Experimental comparisons with the conventional CSA and HHO methods showed that the new CSAHHO method performs better. | CA, SN, SP, FPR, FNR, FSc, and Kappa coefficient. |
P7 [70] | 2020 | The purpose of this article is to acquire the balance between exploitation and exploration by mixing the advantages of GWO and PSO. | Seventeen UCI datasets are used to measure the suggested optimizer’s consistency, dependability, and stability. | Average error, average NSF, MF, BF, WF, STD of fitness, and RT. |
P8 [71] | 2019 | In order to determine the appropriate trait subgroup and resolve FS issues, this article suggests a hybrid PSO and GWO. | The study’s results highlight that the BGWOPSO framework is superior in computation time, PR, and FS. The findings of the BGWOPSO procedure have shown that it is easier than other approaches to monitor the compromise between exploratory and exploitative behaviours. | Average CA, NSF, MF, BF, WF, and RT. |
P9 [72] | 2019 | This study provides three hybrid structures for the FS task based on TEO and SOA. | The simulation outcomes demonstrated that the suggested hybrid SOA models enhance classification efficiency, decrease CPU time, and select the salient features. | RT, average NSF, and CA. |
P10 [73] | 2019 | This study provides two separate hybrid versions of the spotted hyena optimization (SHO) for FS problems. In the first version (SHOSA-1), the SA is embedded in the SHO algorithm. In the second version (SHOSA-2), the SA is used to enhance the ultimate solution obtained by the SHO algorithm. | The findings of the tests revealed that the SHOSA-1 strategy improves recognition rate and reduces the number of chosen features in relation to other wrapper methods. Experiments also demonstrated that SHOSA-1 has excellent success (compared to SHOSA-2) in the spatial search and choice of feature characteristics. | Average CA, NSF, MF, STD, RT, SN, and SP. |
P11 [74] | 2019 | In this paper, the OBL concept is integrated with a DE technique and an MFO approach in order to boost the capacity of the MFO algorithm for generating an optimal attribute array. | The findings clearly reveal that the presented algorithm is more efficient, selecting a smaller set of features and requiring minimal CPU time in comparison to other benchmark evolutionary approaches. | MF, STD, average RT, selection ratio (SR), CA, FSc, PR, RE, and WRS. |
P12 [75] | 2019 | This article presents a hybrid GWOCSA that efficiently blends the strengths of both GWO and the crow search optimizer (CSO) to provide optimal solutions for the most efficient global operation. | The experimental findings suggest that the GWOCSA has improved fitness optimization and performed at a higher convergence speed compared to the other FS methodologies to solve the FS problem and achieved more satisfactory optimization results in fewer iterations. This demonstrates the potential of the model to solve difficult issues in real-world large datasets. | CA, average NSF, MF, STD, and WRS. |
P13 [76] | 2018 | This paper proposes a unique hybrid method for the FS problems known as the HBBEPSO. | The outcomes from testing the HBBEPSO demonstrate the possibility of using the recommended hybrid strategy to determine the ideal variable combination. | MF, BF, WF, STD, average SR, and average FSc. |
P14 [77] | 2018 | This article utilizes a hybrid method of discrete PSO and the SFLA to reduce the feature dimension and choose optimal parameter subsets. | The simulation outputs indicate that the suggested hybrid approach is good enough to provide an optimized attribute subset and obtain high CA values. | CA, PR, and RE. |
P15 [78] | 2017 | The DE operators are used as local search techniques in this work to address the difficulties in SCA. | The outcomes of the experiments indicate that the new technique performs better than the alternatives on the basis of success metrics and predictive analysis. | CE, FSc, MF, BF, WF, STD, SR, and RT. |
P16 [79] | 2017 | In this paper, hybridized frameworks are introduced to construct FS models based on the WOA. | The proposed hybrid models combine SA with WOA. The derived experimentation results have shown that the performance and the capacity of the hybrid WOA approach for choosing the most informational features and for searching the feature space are improved compared with individual wrapper approaches. | CA, NSF, MF, BF, and WF. |
P17 [6] | 2016 | This article presents a hybrid evolutionary optimization technique called artificial bee colony-particle swarm optimization (ABC-PSO) for optimum selection and retrieval of features. | The findings show that the overall efficiency of the method is very good and that the suggested hybrid method is well suited for voice recognition when implemented in the MATLAB environment. | CA, SN, SP, PPV, NPV, FPR, FDR, and MCC. |
P18 [80] | 2016 | This article suggests nature-based hybrid techniques for FS. The techniques are based on two strategies for swarm intelligence: ACO and PSO. | The experimental findings conclude that the proposed approaches have better efficiency for reducing the NSF and also in terms of CA. | CA and NSF. |
P19 [81] | 2015 | This paper proposes the use of a hybrid MA for FS combined with the KHA. | The test results reveal that the proposed MAKHA technique can easily find an optimal or almost an optimal set of combination of attributes by minimizing the objective function and achieving sufficient efficiency to increase the accuracy of feature classification. | BF, MF, WF, and CE. |
P20 [9] | 2010 | This research suggests a hybrid approach for FS based on GA and SA. | The FS results were improved by correcting the SA in the creation of the next generation (by considering two maximum and minimum thresholds). | CA and NSF. |
P21 [82] | 2009 | This article suggests a new FS framework which combines GA with ACO to increase and improve search capabilities in protein structure forecasting. | The testing results show the superiority of the proposed hybrid method (compared to ACO and GA) and also demonstrate the low computational complexity of the suggested hybrid approach. | Predictive accuracy (PA). |
P22 [83] | 2009 | In this paper, TS is combined with binary PSO to select an optimal feature vector in FS. | Testing results from applying the method on 11 classification problems taken from the literature show that this approach simplifies features effectively. This method has the ability to obtain higher CA and to use fewer features compared to other FS methods. | CA. |
P23 [84] | 2019 | This study suggests the combination of BPSO and SCA. The aim of the approach is to perform FS and cluster analysis by employing a cross-breed approach of SCA to BPSO. | The experimental findings (on 10 benchmark test functions and seven datasets taken from the UCI repository) show that the suggested HBPSOSCA approach generally performs better than other FS approaches. | MF, BF, WF, Average NSF, SI, DI, and DBI. |
P24 [85] | 2019 | This research presents a hybrid filter-wrapper approach for the collection of attribute subsets, focused on the hybridization of GA and PSO. The method utilizes an ANN in the fitness/objective function. | The experimental findings on five datasets showed that the suggested hybrid approach achieves a higher PR of classification in comparison to other competitor techniques. | Average NSF, average CA, best ACC, and average RT. |
P25 [86] | 2018 | This paper presents a hybrid of two methods, ALO and GWO, that combines the strength of learning well from few instances with the decent selection of characteristics from a very wide range, thus maintaining a high PR in the classification results. | Datasets with around 50,000 characteristics and fewer than 200 examples were utilized to measure the accuracy of the system. The test findings compare favourably with GA and PSO. | MF, BF, WF, STD, CMSE, average NSF, average Fisher score, and WRS. |
P26 [87] | 2019 | This paper proposes a new hybrid ALO with elitism-based DE to tackle task-scheduling problems in cloud environments. | The experimental results showed that for larger search spaces, the modified-ALO (MALO) approach converged faster, proving it ideal for massive task scheduling jobs. The statistical t-tests were used to analyse the data, indicating that MALO significantly improved the results. | Degree of imbalance, size of tasks, makespan, and RT. |
P27 [88] | 2021 | The purpose of this research is to perform FS by fusing an improved CSA method with PSO. | With the use of 15 datasets from the UCI, the presented technique is compared to four well-known optimization techniques, namely PSO, binary PSO, CSA, and chaotic CSA. Distinct performance indicators were applied in the tests by using KNN as classifier. This hybrid approach was found to perform better than cutting-edge techniques. | MF, BF, WF, and STD of fitness. |
P28 [89] | 2021 | The main goal of this approach is to shorten the selected feature vector by combining the TLBO and SSA techniques, which can also increase the classifier's predictability. | A total of 651 breast cancer screenings were processed by the hybrid approach, and the outputs demonstrate that TLBO-SSA performs better than TLBO. The strength of this metaheuristic approach was further evaluated on a UCI dataset, where the TLBO-SSA results demonstrated its superiority over GA. | SE, SP, CA, FSc, Kappa coeff, FPR, and FNR. |
P29 [90] | 2020 | In order to increase the initial HHO’s effectiveness for collecting chemical descriptors and chemical composites, this work combined HHO, CS, and chaotic maps. | Some UCI datasets and two chemical datasets are considered to validate the presented solution. Comprehensive experimental and computational analysis showed that the proposed approach has achieved many desired solutions over other competing solutions. | CA, SE, SP, RE, PR, and FSc. |
P30 [91] | 2021 | This article proposes a hybrid optimal strategy that includes SCA in HHO. By adjusting the candidate solutions adaptively, SCA attempts to tackle the ineffective identification of HHO and to prevent stagnation situations in HHO. | The recommended approach was evaluated on 16 datasets, including some with more than 15,000 attributes, as well as on the CEC'17 computational optimization trials, and it was contrasted with SCA, HHO, and other existing methods. The detailed experimental and statistical evaluations showed that the suggested HHO hybrid variant produced effective results without extra computational cost. | Average CA, MF, average NSF, SR, average RT, and STD. |
P31 [92] | 2021 | This article suggests a hybrid GWO-HHO-based FS technique. | In comparison to GWO, PSO, HHO, and GA, the accuracy of the suggested hybrid approach was tested and evaluated on 18 UCI datasets. The approach performed better than the GWO. | Average CA, MF, BF, WF, average NSF, and average RT. |
P32 [93] | 2017 | This research suggests a new AC-ABC hybrid technique that incorporates the FS characteristics of ACO and ABC. By employing hybridization, the stagnation behavior of the ants is removed, as are the lengthy global searches for initial solutions by the employed bees. | The suggested method was evaluated on 13 UCI datasets. Experimental findings revealed the positive characteristics of the proposed technique, which was found to achieve a high accuracy rate and optimum selection of features. | NSF, CA, and RT. |
P33 [94] | 2016 | In this paper, a hybrid approach that merges the ABC optimizer with DE is recommended for FS in classification problems. | The approach was tested by using 15 UCI datasets and was compared with ABC and DE based FS, and also with gain, chi-square and correlation based FS. The empirical outputs of this study indicate that the new technique selects informative features for classification that increase the classifier’s efficiency and accuracy. | FSc, NSF, and RT. |
P34 [95] | 2014 | In this paper, a novel hybrid evolutionary technique called ant–cuckoo, produced by the fusion of the ACO and CS methods, is introduced for performing FS in digital mammograms. | The tests are carried out on the miniMIAS database of mammograms. The efficiency of the ant–cuckoo method was compared with the ACO and PSO algorithms. The findings indicated that FS optimization with the hybrid ant–cuckoo method was more accurate than that achieved by the individual FS approaches. | SN, SP, CA, and AUROC. |
P35 [17] | 2022 | By using a MOChOA-based FS technique, this method seeks to identify pertinent parameters for forecasting the health status of COVID-19 patients. | By contrasting this strategy with five other existing FS procedures on nine distinct datasets, its efficacy is demonstrated. | Average NSF, average CA, average RT, and IGD. |
Paper | Search Method | Fitness Function | Means of Hybridization |
---|---|---|---|
P1 [64] | WOA, HHO | Fitness = α·ER + β·(F_p/F_a), where β = 1 − α, ER: error rate, F_p: #picked factors and F_a: #actual factors. | The exploration technique of HHO is embedded in the WOA to increase the randomness of the optimum-solution search, which is otherwise based on the humpback whale's exploitative manner. |
P2 [65] | Chaotic PSO (CPSO), DE. | NFE. | In order to prevent the decay normally observed with CPSO, the DE approach is combined with CPSO. As the swarm begins to deteriorate, DE is used to provide the momentum required for the particles to travel through the search area and thereby escape from local optima. |
P3 [66] | HS, RTEA | Fitness(R) = ω·E(R) + (1 − ω)·(|R|/|F|), where R: array of chosen attributes, E(R): error rate with the reduced feature string, F: original feature array and ω ∈ [0, 1]. | HS and RTEA have been hybridised by following the pipeline model. |
P4 [67] | BSO and FA | Fitness = α·ε + β·(|S|/|T|), where ε: classifier's error rate, |S|: size of the chosen attribute string, |T|: count of total features, α lies in between 0 and 1 and β = 1 − α. | To reduce the drawbacks of the conventional BSO, this new architecture combines the best elements of BSO's great exploration and FA's exceptional exploitation: if the cycle counter is odd, the FA search mechanism is used for a location change; otherwise, the original BSO is used for solution improvement. |
P5 [68] | GWO and HBO | Fitness = NT/(NT + NF), where NT: #truly predicted instances, NF: #instances that are falsely predicted. | The best solution obtained from GWO is stored as a record. If the new solution generated by the HBO is overly similar to the stored record, crossover is applied; if, after crossover, the new solution is still the same as the record, mutation is performed. |
P6 [69] | CSA and HHO | Fitness(x_i) = α·γ(x_i) + β·(1 − FS/N), where Fitness(x_i): fitness value of solution x_i, γ(x_i): performance of the classifier, and FS/N: #features selected/#total features. | A probability parameter decides whether solutions are updated by CSA or by HHO. |
P7 [70] | GWO and PSO | Fitness = α·E(D) + β·(s/f), where E(D): CE, s: #selected features, f: #features and α, β are constants. | Starting with an arbitrary selection of solutions, the optimization process begins. After determining the fitness function for each individual for each iteration, the first three leaders are given the names alpha, beta, and delta. After that, the population is equally split into two classes, with the first class following GWO operations and the second class following PSO processes. In this manner, the search space is thoroughly examined for potential points, and these points are then utilised by the potent PSO and GWO. |
P8 [71] | GWO-PSO | $F = \alpha \cdot E_{KNN} + \beta \cdot \frac{|S|}{|T|}$, where $\alpha \in [0, 1]$ and $\beta = 1 - \alpha$, $E_{KNN}$: KNN’s error rate, $|S|$: length of the chosen feature vector and $|T|$: length of the actual feature vector. | The basic principle of PSOGWO is to combine the exploitation capability of PSO with the exploration capability of GWO so as to obtain the strengths of both optimizers, where the locations of the first three agents are modified rather than following the normal calculation, exploitation, and discovery of the grey wolf. |
P9 [72] | SOA, TEO | $F = \alpha \cdot \gamma_R(D) + \beta \cdot \frac{|R|}{|C|}$, where $\gamma_R(D)$: CE, $|R|$: length of the chosen substring, $|C|$: length of the whole feature set, $\alpha \in [0, 1]$ and $\beta = 1 - \alpha$. | Three hybrid ways to manage FS tasks based on SOA and TEO are proposed in this paper. In the first method, one of the two algorithms is selected via a roulette wheel to update the position. In the second approach, SOA optimization is followed by TEO. The final technique uses the TEO heat-exchange formulation to boost the SOA style of attack. |
P10 [73] | SA, SHO algorithm | $F = \alpha \cdot \gamma_R(D) + \beta \cdot \frac{|R|}{|C|}$, where $\gamma_R(D)$: CE, $|R|$: length of the chosen substring, $|C|$: length of the whole feature set, $\alpha \in [0, 1]$ and $\beta = 1 - \alpha$. | Two hybrid systems to enhance the use of the SHO model are presented in this article. In the first hybrid model, SA is used as a part of SHO, and the SHO and SA techniques are executed once per iteration. In the second architecture, the SHO model is first applied to seek out the optimum solution, followed by SA to find the new best solution. |
P11 [74] | MFO, DE | $F = \rho \cdot Err + (1 - \rho) \cdot \frac{|S|}{Dim}$, where Err: error of the classifier, $|S|$: count of chosen attributes, Dim: whole feature count and $\rho$: a random value between 0 and 1. | The suggested model utilizes the OBL principle to generate initial solutions, and the DE operators to boost the operational capabilities of MFO. |
P12 [75] | CSA, GWO | Fitness $F = \alpha \cdot \gamma_R(D) + \beta \cdot \frac{|R|}{|C|}$, where $\gamma_R(D)$: CE, $|R|$: length of the chosen substring, $|C|$: length of the whole feature set, $\alpha \in [0, 1]$ and $\beta = 1 - \alpha$. | The CSA integrates a control parameter into its location-change equation. This parameter plays a key part in reaching the global optimum: a large value of this factor results in global exploration, while a small value results in a local search. In the suggested GWOCSA, a greater value of the parameter is used to exploit the CSA’s outstanding exploration quality. |
P13 [76] | PSO, bat algorithm | $F = \alpha \cdot E + \beta \cdot \frac{|R|}{|C|}$, where E: classifier’s error rate, $|R|$: size of the chosen attribute string, $|C|$: count of total features, $\alpha$ lies between 0 and 1 and $\beta = 1 - \alpha$. | The suggested method keeps the velocity vectors of the bats and the particles separate, which calls for a new design: the personal and global best solutions are not updated after the BBA step alone, but only after a full round of the PSO. |
P14 [77] | Binary PSO (BPSO), frog leaping algorithm (FLA) | $F(x) = Acc_{NB}(x)$, where $Acc_{NB}$: NB’s accuracy and x: feature subset. | The population, which includes an optimized BPSO-extracted feature subset, is given as an input to the FLA under the presented hybrid system. |
P15 [78] | SCA-DE | $F = \alpha \cdot E_{LR} + (1 - \alpha) \cdot \frac{|S|}{D}$, where $E_{LR}$: LR’s error rate, $|S|$: count of picked features, D: total feature count and $\alpha \in [0, 1]$. | The DE operators are applied as a local search strategy to help the SCA escape local optima. |
P16 [79] | SA, WOA | $F = \alpha \cdot \gamma_R(D) + \beta \cdot \frac{|R|}{|C|}$, where $\gamma_R(D)$: CE, $|R|$: length of the chosen substring, $|C|$: length of the whole feature set, $\alpha \in [0, 1]$ and $\beta = 1 - \alpha$. | Two methods to address the FS problem are employed in this article. In WOASA-1, the SA algorithm operates as an operator within the WOA; WOASA-2 uses SA to enhance the optimal solution discovered by WOA. |
P17 [6] | ABC, PSO | $F = \omega \cdot P(S) + (1 - \omega) \cdot \frac{T - |S|}{T}$, where P(S): classifier performance with the subset S, T: total attribute count, $|S|$: size of the attribute subset and $\omega$: weight given to the quality of the classification. | Each employed bee produces a new food source and exploits the better one. Each onlooker bee selects a source based on the quality of its solution, creates a new food source and exploits the better one. After deciding which source is to be abandoned and reassigning its employed bee as a scout, PSO is used instead of scout bees to hunt for new sources. |
P18 [80] | ACO, PSO | $F_p^t = E(X_p^t)$, where $E(X_p^t)$: CE with the attribute vector chosen by particle p at the t-th iteration. | Here, both ACO and PSO are executed simultaneously by each individual. The value of the subset chosen by each ant and particle is estimated by the classifier, and the best one is picked for the next generation. |
P19 [81] | MA with KHA | $F = E + c \cdot \frac{N_f}{N_t}$, where F: fitness function considering the number of features $N_f$, $N_t$: total feature count, E: CE, c: constant. | The suggested MAKHA hybrid technique employs the foraging operation and physical random diffusion with crossover and mutation, and uses the somersault and watch–jump processes from the MA. |
P20 [9] | GA, SA | Percentage of recognition of the Bayesian classifier. | SA is used to select the chromosomes for the next generation (a minimal sketch of the SA acceptance rule follows the classifier description table below). |
P21 [82] | GA-ACO | CA | In the proposed hybrid method, ACO utilizes the GA’s crossover and mutation strategy, which drives the ants to explore near the optimum solution. The mechanism is iterated again after the pheromone update. |
P22 [83] | TS-BPSO | The KNN with leave-one-out cross-validation (LOOCV) and the SVM with one-vs-rest (OVR) serve as estimators of the TS and BPSO objective functions. | In the suggested hybrid architecture, BPSO acts as a local optimization technique for the Tabu search method. |
P23 [84] | SCA, Binary PSO (BPSO) | SI. | The methodology combines the BPSO with the SCA to increase the search capabilities and find a close-to-optimum global solution. In this context, the SCA improves the movement of a particle in the BPSO. |
P24 [85] | GA, PSO | ANN | After producing the initial population and determining the cost of each solution, three moves are applied per generation and repeated until the given number of generations is reached. The GA and PSO share these three moves: two steps are taken concurrently in the GA, while the PSO takes only a single move. |
P25 [86] | ALO, GWO | $F = E + c \cdot \frac{N_f}{N_t}$, where F: fitness function considering the number of features $N_f$, $N_t$: total feature count, E: CE, c: constant. | The suggested hybrid approach updates the ants by applying the essence of ALO, and updates the ant lions that deserve to converge more quickly with the help of the GWO concept. |
P26 [87] | ALO, DE | RT mapped to $[0, 1]$, where RT: required runtime, $N_{vm}$: #virtual machines, $N_t$: #tasks. | In each iteration, the ant lions are updated by using DE operators. |
P27 [88] | PSO and CSA | $F = \alpha \cdot E + (1 - \alpha) \cdot \frac{|S|}{|T|}$, where E: classifier’s error rate, $|S|$: size of the subset, and $|T|$: #total features. | This strategy targets only a few chosen crows with the greatest feeds during the hybridization process, improving on the original CSA’s random following of every crow. The next step is to apply the OBL approach to create the crows’ opposite locations and update their locations in the PSO. This is done so that the result generated by each method can explore the search space, in turn, without interfering with the other. |
P28 [89] | SSA, TLBO | $F(x_i) = \omega \cdot P(x_i) + (1 - \omega) \cdot \left(1 - \frac{S}{N}\right)$, where $F(x_i)$: fitness value of $x_i$, $P(x_i)$: performance of the classifier, S/N: #features selected/#total features. | During the teaching and learning phase, the population update is accomplished by using either the TLBO methodology or the SSA. |
P29 [90] | HHO-CS | $F = \gamma \cdot R + (1 - \gamma) \cdot \frac{|S|}{C}$, where R: CE, C: total attribute count, $|S|$: size of the subset and $\gamma$: weight given to the performance of the classification. | The CHHO–CS algorithm takes advantage of the merits of the CS approach for controlling the HHO position vectors. CS attempts to determine the optimal solution after each iteration T. If the fitness of the new solution derived from HHO is better than that of the current solution, the new solution is adopted; otherwise, the older one stays intact. |
P30 [91] | SCA-HHO | $F = \alpha \cdot E_{KNN} + (1 - \alpha) \cdot \frac{|S|}{D}$, where $E_{KNN}$: error by KNN, $|S|$: count of attributes picked, D: actual #features. | SCA and HHO are paired so that the discovery task is executed by SCA and the exploitation by HHO. |
P31 [92] | GWO-HHO | $F = \alpha \cdot ER + (1 - \alpha) \cdot \frac{N_s}{N_t}$, where $\alpha$ lies between 0 and 1, ER: error, $N_s$: #picked factors and $N_t$: #actual factors. | Exploration is carried out by HHO, while exploitation is done by GWO. |
P32 [93] | ACO-ABC | $F(S)$: value of the objective for the corresponding attribute set S. | The ants use the bees’ exploitation to decide the best ant and the optimal attribute substring; the bees incorporate the attribute substring that the ants create as a food source. |
P33 [94] | DE-ABC | Weighted average F-measure from J48. | If the fitness probability exceeds a given threshold, DE mutation is performed; otherwise, the ABC neighbourhood-solution creation procedure is followed. |
P34 [95] | ACO-CS | Mean square error (MSE) of SVM. | ACO is an excellent evolutionary strategy, but its disadvantage is that the ants move in the direction of high pheromone density, which slows down the operation. CS is therefore used to perform the local scan of the ACO. |
P35 [17] | ChOA and HHO | Two objectives based on MI and PCC, where MI: mutual information and PCC: Pearson correlation coefficient. | Hybrid solutions are created based on the ChOA and HHO solutions. Then, the best solutions among the ChOA, HHO, and hybrid solutions are treated as the current solutions, and ChOA is used to update the positions. |
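Several of the hybrids above (e.g., P4 [67], P9 [72], and P28 [89]) interleave two metaheuristics via a simple switching rule, such as iteration parity or a roulette wheel. The following Python sketch illustrates only this generic parity-switching skeleton under stated assumptions: the `sphere` objective and the two toy `explore`/`exploit` operators are placeholders for a real fitness function and for the actual BSO/FA (or SOA/TEO, SSA/TLBO) update rules, not an implementation of any surveyed algorithm.

```python
# Illustrative skeleton of parity-based alternation between two operators,
# as in the odd/even switching of P4 [67]. All operators here are toy stand-ins.
import numpy as np

rng = np.random.default_rng(seed=1)

def sphere(x: np.ndarray) -> float:          # stand-in fitness (minimize)
    return float(np.sum(x * x))

def explore(pop: np.ndarray) -> np.ndarray:  # placeholder "exploration" move
    return pop + rng.normal(0.0, 0.5, pop.shape)

def exploit(pop: np.ndarray, best: np.ndarray) -> np.ndarray:
    return pop + rng.random(pop.shape) * (best - pop)  # pull toward global best

pop = rng.uniform(-5.0, 5.0, size=(20, 10))  # 20 candidates, 10 dimensions
best = min(pop, key=sphere).copy()

for t in range(1, 101):
    # parity switch: odd iteration -> exploration, even -> exploitation
    pop = explore(pop) if t % 2 == 1 else exploit(pop, best)
    cand = min(pop, key=sphere)
    if sphere(cand) < sphere(best):          # greedy retention of the best
        best = cand.copy()

print(f"best fitness after 100 iterations: {sphere(best):.6f}")
```

Any of the alternation strategies in the table can be recovered from this skeleton by replacing the parity test with a roulette-wheel draw or a probability threshold.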
Paper | Classifier | Dataset | Application Domain |
---|---|---|---|
P1 [64] | KNN | Breast Cancer, Breast EW, Congress EW, Exactly, Exactly 2, Heart EW, Ionosphere EW, Krvskp EW, Lymphography, M of N, Penglung EW, Sonar EW, Spect EW, Tic-tac-toe, Vote, Waveform EW, Wine EW, Zoo. | For FS (Miscellaneous). |
P2 [65] | — | Eight benchmark functions. | For FS (Miscellaneous). |
P3 [66] | KNN, NB, RF | UCI (Zoo, Breast Cancer, Breast EW, Congress EW, Exactly, Exactly 2, Heart EW, Ionosphere, KrvskpEW, Lymphography EW, M of N, Penglung EW, Sonar EW, Spect EW, Tic-tac-toe, Vote, Waveform EW, Wine EW). | It focuses on FS (Biology, Politics, Game, Physics, Chemistry, Electromagnetic). |
P4 [67] | KNN | Breast Cancer, Tic-tac-toe, Zoo, Wine EW, Spect EW, Sonar EW, Ionosphere EW, Heart EW, Congress EW, Krvskp EW, Waveform EW, Exactly, Exactly 2, M of N, Vote, Breast EW, Semeion, Clean 1, Clean 2, Lymphography, Penglung EW. | FS for COVID-19 classification (Medical). |
P5 [68] | KNN | BreastEW, Ionosphere, PenglungEW, Segmentation, Sonar, Vehicle, Bearing dataset, CWRU, and MFPT benchmark dataset. | FS for Fault Diagnosis (Engineering). |
P6 [69] | ANN | 651 mammograms obtained from the Digital Database for Screening Mammography (DDSM). | FS and classification in mammography (Medical). |
P7 [70] | KNN | Hepatitis, Ionosphere, Vertebral, Seeds, Parkinson, Australian, Blood, Breast Cancer, Diabetes, Lymphography, Ring, Titanic, Twonorm, WaveformEW, Tic-Tac-Toe, M of N. | For enhancing FS (Miscellaneous). |
P8 [71] | KNN | UCI (Zoo, Breast Cancer, Breast EW, Congress EW, Exactly, Exactly 2, Heart EW, Ionosphere, KrvskpEW, Lymphography EW, M of N, Penglung EW, Sonar EW, Spect EW, Tic-tac-toe, Vote, Waveform EW, Wine EW). | For handling FS tasks (Miscellaneous). |
P9 [72] | KNN | UCI (Iris, Wine, Glass, Diabetes, Heartstatlog, Ionosphere, Sonar, Vehicle, Balance Scale, CMC, Cancer, Seed, Blood, Aggregation, Vowel, WBC, Bupa, Jain, Thyroid, WDBC). | For FS (miscellaneous). |
P10 [73] | KNN | UCI (BreastCW, Congressional, ConnectBench, Dermatology, Drug_consumption, Glass, Heart, Hepatitis, Horse-colic, ILPD, Ionosphere, Primary-tumor, Seeds, Soybean, Spambase, SPECT Heart, SteelPF, Thoracic Surgery, Tic-tac-toe, Zoo). | To solve FS issues (Miscellaneous). |
P11 [74] | KNN | UCI (WBDC, Hepatitis, Heart, Sonar, Lymphography, Clean 1, Breastcancer, Clean 2, Waveform, Ionosphere). | To enhance the FS (Galaxies Classification). |
P12 [75] | KNN | UCI (Zoo, Breast Cancer, Congress EW, Exactly, Ionosphere, M of N, Penglung EW, Sonar EW, Vote, Wine EW, Exactly 2, Heart EW, Tic-tac-toe, Waveform EW, Krvskp EW, Lymphography EW, Spect EW, Clean 1, Clean 2, Semeion). | For producing promising candidate solutions to reach the global optimum efficiently, applicable to solving real-world complex problems and FS (Miscellaneous). |
P13 [76] | KNN | UCI (Zoo, Breast Cancer, Breast EW, Congress, Exactly, Ionosphere, M of N, Sonar EW, Wine EW, Exactly 2, Heart EW, Tic-tac-toe, Waveform EW, Lymphography EW, Spect EW, Dermatology, Krvskp EW, Echocardiogram, Hepatitis, Lung Cancer). | To solve FS problems (Miscellaneous). |
P14 [77] | NB | The dataset consists of 1600 reviews of 20 well-known Chicago hotels, organized as 800 positive reviews (400 truthful, 400 deceptive) and 800 negative reviews (400 truthful, 400 deceptive). | It helps with the identification of fake reviews and the discarding of irrelevant ones, efficiently classifying reviews into spam and ham. |
P15 [78] | LR | UCI (Breast, SPECT, Ionosphere, Wine, Congress, Sensor, Clean 1, Clean 2) | To solve FS problems (Miscellaneous). |
P16 [79] | KNN | UCI (Zoo, Breast Cancer, Breast EW, Congress EW, Exactly, Exactly 2, Heart EW, Ionosphere, KrvskpEW, Lymphography EW, M of N, Penglung EW, Sonar EW, Spect EW, Tic-tac-toe, Vote, Waveform EW, Wine EW). | To design different FS techniques (Miscellaneous). |
P17 [6] | SVM | Three types of datasets are used: 1. 100 recorded speech signals of fruits type, 2. 80 recorded speech signals of animals, and 3. 120 recorded combined speech signals. | Feature selection for automatic speech recognition. |
P18 [80] | NB | UCI Machine Learning Repository (SpamBase, BreastCancer, German, Hepatitis, Liver, Musk). | To improve the CA and enhance the FS (Miscellaneous). |
P19 [81] | KNN | UCI (Zoo, Breast Cancer, Breast EW, Congress EW, Exactly, Exactly 2, Heart EW, Ionosphere, KrvskpEW, Lymphography EW, M of N, Penglung EW, Sonar EW, Spect EW, Tic-tac-toe, Vote, Waveform EW, Wine EW). | To enhance the FS and to increase the classification performance (Miscellaneous). |
P20 [9] | Bayesian | Handwritten Farsi characters, with 100 samples for each of 33 characters. | For the identification of handprinted Farsi characters. |
P21 [82] | KNN | GPCR-PROSITE dataset, ENZYME-PROSITE dataset. | For FS in protein function prediction. |
P22 [83] | KNN, SVM | Tumors 9, Tumors 11, Brain Tumor 1, Tumors 14, Brain Tumor 2, Leukemia 1, Leukemia 2, Lung Cancer, SRBCT, Prostate Tumor and diffuse large B-cell lymphoma datasets. | To improve gene selection in medical diagnosis. |
P23 [84] | — | Ionosphere, Breast Cancer Wisconsin, Connectionist Bench, Statlog, Parkinson, 9_Tumors, Leukemia2. | To solve FS problems (Miscellaneous). |
P24 [85] | Five nearest neighbors (5-NN) | Hill-Valley, Gas 6, Musk 1, Madelon, Isolet 5, Lung. | To solve the FS problem in high-dimensional datasets (Miscellaneous). |
P25 [86] | KNN | UCI (Zoo, Breast Cancer, Breast EW, Congress EW, Exactly, Exactly 2, Heart EW, Ionosphere, KrvskpEW, Lymphography EW, M of N, Penglung EW, Sonar EW, Spect EW, Tic-tac-toe, Vote, Waveform EW, Wine EW). | To select significant features from datasets (Bioinformatics). |
P26 [87] | — | Synthetic and real trace datasets. | To solve task scheduling problems in cloud computing environments. |
P27 [88] | KNN | Wine, Dermatology, Heart, Ionosphere, Lung cancer, Thoracic surgery, Hepatitis, Parkinson, Phishing website, Qsar biodegradation, Absenteeism at work, Divorce, WPBC, Risk factor cervical cancer, WDBC. | For the FS task (Miscellaneous). |
P28 [89] | ANN | Digital Database for Screening Mammography (DDSM), Breast Cancer Wisconsin (WBC) dataset. | To solve the FS (Medical). |
P29 [90] | SVM | BreastCancer, KCL, WineEW, WDBC, LungCancer, Diabetic, Stock, Scene, Lymphography, and Parkinson. | For chemical descriptor selection and chemical compound activities (Chemical Engineering). |
P30 [91] | KNN | Exactly, Exactly 2, Lymphography, Spect EW, Congress EW, Ionosphere EW, Vote, Wine EW, Breast EW, Brain Tumors 1, Tumors 11, Leukemia 2, SRBCT, DLBCL, Prostate Tumors and Tumors 14. | To boost the FS process (Miscellaneous). |
P31 [92] | KNN | Breast Cancer, Breast EW, Congress EW, Exactly, Exactly 2, Heart EW, Ionosphere EW, Krvskp EW, Lymphography, M of N, Penglung EW, Sonar EW, Spect EW, Tic-tac-toe, Vote, Waveform EW, Wine EW and Zoo. | For the FS task (Miscellaneous). |
P32 [93] | DT (J48) | Heart-Cleveland, Dermatology, Hepatitis, Lung Cancer, Lymphography, Pima Indian Diabetes, Iris, Breast Cancer W, Diabetes, Heart-Stalog, Thyroid, Sonar, Gene. | For FS in classification (Miscellaneous). |
P33 [94] | J48 | Autos, Breast-w, Car, Glass, Heart-C, Dermatology, Hepatitis, Thoracic-Surgery, Lymph, Credit-g, Sonar, Ionosphere, Liver-Disorders, Vote, Zoo. | For the FS tasks in classification (Miscellaneous). |
P34 [95] | SVM with RBF kernel | miniMIAS data with 100 mammograms (50-normal, 50-abnormal). | For FS in digital mammogram (Medical). |
P35 [17] | KNN | Lymphography, Diabetic, Cardiotocography, Cervical Cancer, Lung Cancer, Arrhythmia, Parkinson, Colon Tumor, Leukemia and three COVID-19 datasets. | To enhance FS for COVID-19 prediction (Medical). |
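As the table above shows, KNN is by far the most common wrapper classifier in the surveyed hybrids, and most of the fitness functions listed earlier are variants of a weighted sum of classification error and subset-size ratio. The Python sketch below is a minimal, hedged illustration of that generic wrapper fitness; the breast-cancer dataset, the weight $\alpha = 0.99$, the 5-fold cross-validation, and $k = 5$ neighbours are illustrative assumptions rather than settings prescribed by any single reviewed paper.

```python
# Minimal sketch of the generic weighted wrapper fitness:
# F = alpha * error + (1 - alpha) * |S| / |T|  (lower is better).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def wrapper_fitness(mask: np.ndarray, X: np.ndarray, y: np.ndarray,
                    alpha: float = 0.99) -> float:
    """Weighted sum of KNN cross-validation error and feature-subset ratio."""
    if not mask.any():                      # an empty subset is invalid
        return 1.0
    knn = KNeighborsClassifier(n_neighbors=5)
    acc = cross_val_score(knn, X[:, mask], y, cv=5).mean()
    error = 1.0 - acc
    return alpha * error + (1.0 - alpha) * mask.sum() / mask.size

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(seed=0)
mask = rng.random(X.shape[1]) > 0.5         # a random candidate feature subset
print(f"fitness = {wrapper_fitness(mask, X, y):.4f}")
```

In a full hybrid, a metaheuristic would propose many such binary masks per iteration and keep the one with the lowest fitness.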
Paper | Classifier | Description |
---|---|---|
P3 [66] | RF (Random Forest) | RF is an ensemble algorithm made up of several decision trees. It builds each individual tree by using bagging and feature randomness, attempting to generate an uncorrelated forest of trees whose collective forecast is more reliable than that of any individual tree. |
P1 [64], P3 [66], P4 [67], P5 [68], P7 [70], P8 [71], P9 [72], P10 [73], P11 [74], P12 [75], P13 [76], P16 [79], P19 [81], P21 [82], P22 [83], P24 [85], P25 [86], P27 [88], P30 [91], P31 [92], P35 [17] | KNN | KNN is a straightforward classifier that stores all available samples and categorizes new samples based on a similarity metric. It is usually used to classify a piece of data according to how its neighbors are classified. |
P3 [66], P14 [77], P18 [80] | NB | An NB model assumes that the presence of one attribute in a class has no influence on the presence of any other attribute. |
P17 [6], P22 [83], P29 [90], P34 [95] | SVM | SVM is a supervised ML model for two-group classification tasks. After being provided with sets of labeled training examples for each class, an SVM can identify the class of a new instance. |
P15 [78] | LR | The LR classification method assigns samples to a discrete set of classes; the logistic sigmoid is utilised to convert the model’s outcome into a probability value. |
P20 [9] | Bayesian | The Bayesian classification model forecasts membership probabilities for every group, such as the likelihood that a certain record belongs to a certain class. The class with the greatest likelihood is deemed the most probable. |
P32 [93], P33 [94] | DT, J48 classifier | J48 is the implementation of Ross Quinlan’s C4.5 algorithm, which generates a DT and extends his earlier ID3 algorithm. The DTs generated by C4.5 can be used for classification; hence, C4.5 is also called a statistical classifier. |
P6 [69], P28 [89] | ANN | ANN models are used in AI to simulate the networks of neurons that make up the human brain, enabling computer programs to process information and make judgments in a human-like manner. To develop an ANN, an artificial model is designed and programmed to operate analogously to a network of interconnected neurons and brain cells. |
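For the SA-based hybrids discussed earlier (e.g., P20 [9], P10 [73], and P16 [79]), the key ingredient is the simulated-annealing acceptance rule, which occasionally accepts a worse solution so the search can escape local optima. The minimal sketch below assumes a geometric cooling schedule and illustrative parameter values; it is a generic illustration of the rule, not the exact procedure of any reviewed paper.

```python
# Minimal sketch of the SA acceptance rule: worse candidates are accepted
# with probability exp(-delta / T), which decays as the temperature T cools.
import math
import random

def sa_accept(current_fit: float, candidate_fit: float, temperature: float) -> bool:
    """Minimization: always accept improvements; accept worse moves
    with probability exp(-(candidate - current) / temperature)."""
    delta = candidate_fit - current_fit
    if delta <= 0:
        return True
    return random.random() < math.exp(-delta / temperature)

random.seed(42)
T, cooling = 1.0, 0.95                       # illustrative schedule
current = 0.40                               # e.g., current classification error
for step in range(10):
    candidate = current + random.uniform(-0.05, 0.05)  # toy neighbour move
    if sa_accept(current, candidate, T):
        current = candidate
    T *= cooling                             # geometric cooling
print(f"final fitness: {current:.4f}")
```

In a GA–SA hybrid such as P20 [9], this acceptance test would decide which chromosomes survive into the next generation instead of a purely greedy selection.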