A Review of Quantum-Inspired Metaheuristic Algorithms for Automatic Clustering
Abstract
:1. Introduction
- Data mining: Data mining is the practice of analysing data from different perspectives to extract valuable information from a huge amount of data to develop new products, reduce costs and improve decision-making tasks.
- Marketing: The common uses of cluster analysis include market segmentation to identify the groups of several entities, viz., people, products, services and structure of an organisation, understanding customer behaviour, identifying the opportunities of new products and so on.
- Community detection: The application area of community detection includes the analysis of social networks (e.g., LinkedIn®, Facebook®, Twitter®, etc.), politics (e.g., influence of political parties, astroturfing, etc.), public health (e.g., growth of epidemic spread, detection of cancer, tumours, etc.), smart advertising, criminology (e.g., criminal identification, fraud detection, criminal activity detection, etc.) and so on.
- Insurance: Clustering is used to identify groups of insurance policyholders, assist insurers in taking the necessary measures for mitigating the impact of mortality deviations on the economy and help comprehend the company’s experience with the emergence of death claims and fraud identification.
- Image segmentation: Clustering is widely used in image segmentation for processing satellite images, medical images, real-life images, surveillance images, benchmark datasets, object identification, criminal investigations and security system in airports, to name a few.
2. Quantum Computing Fundamentals
3. Automatic Clustering
4. Cluster Validity Indices
4.1. Davies–Bouldin Index (DB)
4.2. Dunn Index (DI)
4.3. Calinski–Harabasz Index (CH)
4.4. Silhouette Index (SI)
4.5. Xie–Beni Index (XB)
4.6. S_Dbw Index
4.7. I Index
4.8. CS-Measure (CSM)
4.9. PBM Index (PBM)
4.10. Local Cores-Based Cluster Validity (LCCV) Index
5. Classical Approaches to Automatic Clustering
6. Metaheuristic Approaches to Automatic Clustering
6.1. Single-Objective Approaches
6.2. Multi-Objective Approaches
7. Quantum-Inspired Metaheuristic Approaches for Automatic Clustering
7.1. Single-Objective Approaches
7.2. Multi-Objective Approaches
8. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
ABC | Artificial Bee Colony |
ACDE | Automatic Clustering DE |
ACMOEPO | Automatic Clustering Using Multi-Objective Emperor Penguin Optimiser |
AC-QuPSO | Automatic Clustering-Based Qutrit version of Particle Swarm Optimisation |
AFCH | Automatic Fuzzy C-Means with selected Initial Centres by Histogram |
AFCR | Automatic Fuzzy C-Means with randomly selected Initial Centres |
AP | Affinity Propagation |
ASLoC | Automatic Security Log Clustering |
ASO | Atom Search Optimisation |
BA | Bat Algorithm |
BEA | Bacterial Evolutionary Algorithm |
BQANA | Binary Quantum-Based Avian Navigation Optimiser |
BSMO | Binary Starling Murmuration Optimiser |
CDA | Cluster Decomposition Algorithm |
CH | Calinski–Harabasz |
CLA | Communication with Local Agents |
CLIQUE | Clustering In QUEst |
CMA-ES | Covariance Matrix Adaption Evolution Strategies |
CS | Cuckoo Search |
CSM | CS-Measure |
CSOA | Crow Search Optimisation Algorithm |
CVI | Cluster Validity Index |
DBSCAN | Density-Based Spatial Clustering of Applications with Noise |
DB | Davies–Bouldin index |
DCPSO | Dynamic Clustering Particle Swarm Optimisation |
DE | Differential Evolution |
DEMO | DE for Multi-Objective Optimisation |
DI | Dunn Index |
DPSC | Density Peaks Sentence Clustering |
EM | Expectation Maximisation |
FA | Firefly Algorithm |
FAPSO | Firefly Particle Swarm Optimisation |
FCM | Fuzzy C-Means |
FCM-NSGA-II | Fuzzy C-Means-Non-Dominated Sorting Genetic Algorithm II |
FCS | Fuzzy C-Shells |
FRC-NSGA | Fuzzy Relational Clustering with NSGA-II |
GA | Genetic Algorithm |
GCA | Genetically Based Clustering Algorithm |
GCUK | Genetic Clustering with an Unknown Number of Clusters K |
GLS | Guided Local Search |
GMDD | Gaussian Mixture Density Modelling Decomposition |
GQA | Genetic Quantum Algorithm |
GRNN | Generalised Regression Neural Network |
GWO | Grey Wolf Optimiser |
HCMA | Hierarchical Cluster Merging Algorithm |
I | I Index |
ICSOA | Intelligent Crow Search Optimisation Algorithm |
IFRC-NSGA | Improved FRC with NSGA-II |
KP | K-Prototypes |
LCCV | Local Cores-Based Cluster Validity index |
MA | Microcanonical Annealing |
MAFIA | Merging of Adaptive Intervals Approach to Spatial Data Mining |
MDS | Multi-Document Summarisation |
MODE | Multi-Objective DE |
MOCK | Multi-Objective Clustering with an Unknown Number of Clusters K |
MOGA | Multi-Objective Genetic Algorithm |
MPSO | Multi-objective-Based Particle Swarm Optimisation |
MOPSOSA | Multi-Objective Particle Swarm Optimisation and Simulated Annealing |
Mo-QIGA | Multi-Objective Quantum-Inspired Genetic Algorithm |
MOSA | Multi-Objective Simulated Annealing |
MOIWO | Multi-Objective Invasive Weed Optimisation |
MQPSO | Many-Objective Quantum-Inspired Particle Swarm Optimisation Algorithm |
MRFO | Manta Ray Foraging Optimisation |
MS | Mean Shift |
NSDE | Non-Dominated Sorting DE |
NSGA-II | Non-Dominated Sorting Genetic Algorithm |
OPTICS | Ordering Points To Identify the Clustering Structure |
PBM | PBM Index |
PDE | Pareto DE |
PSNR | Peak Signal-to-Noise Ratio |
PSO | Particle Swarm Optimisation |
QANA | Quantum-Based Avian Navigation Optimiser Algorithm |
QEA | Quantum Evolutionary Algorithm |
QEAC | Quantum-Inspired Evolutionary Algorithm for Data Clustering |
QIASMO | Quantum-Inspired Ageist Spider Monkey Optimisation |
QIBAT | Quantum-Inspired BAT |
QIGA | Quantum-Inspired Genetic Algorithm |
QIMRFO | Quantum-Inspired Manta Ray Foraging Optimisation |
QIEPSO | Quantum-Inspired Enhanced Particle Swarm Optimisation |
QIMONSGA-II | Quantum-Inspired Multi-Objective NSGA-II Algorithm for |
Automatic Clustering of Grey-Scale Images | |
QIPSO | Quantum-Inspired Particle Swarm Optimisation |
QSMO | Quantum Spider Monkey Optimisation |
QMEC | Quantum-Inspired Multi-Objective Evolutionary Clustering |
QTM | Quantum Turing Machine |
SCA | Sine Cosine Algorithm |
SI | Silhouette Index |
SMO | Starling Murmuration Optimiser |
SOM | Self-Organising Map |
SOS | Symbiotic Organism Search |
STING | Statistical Information Grid |
STClu | Statistical Test-Based Clustering |
SGA-KP | Single-Objective Genetic Algorithm with K-Means |
TLBO | Teaching Learning-Based Optimisation |
TS | Tabu Search |
UCI | University of California at Irvine |
UCDP | Uppsala Conflict Data Program |
XB | Xie–Beni index |
References
- Jain, A.K.; Dubes, R.C. Algorithms for Clustering Data; Prentice-Hall, Inc.: Hoboken, NJ, USA, 1988. [Google Scholar]
- Jain, A.K.; Murty, M.N.; Flynn, P.J. Data Clustering: A Review. ACM Comput. Surv. 1999, 31, 264–323. [Google Scholar] [CrossRef]
- Roberts, S.J. Parametric and non-parametric unsupervised cluster analysis. Pattern Recognit. 1997, 30, 261–272. [Google Scholar] [CrossRef]
- Gan, G.; Ma, C.; Wu, J. Data Clustering: Theory, Algorithms, and Applications; Society for Industrial and Applied Mathematics: Philadelphia, PA, USA, 2007. [Google Scholar]
- Faizan, M.; Zuhairi, M.F.; Ismail, S.; Sultan, S. Applications of Clustering Techniques in Data Mining: A Comparative Study. Int. J. Adv. Comput. Sci. Appl. 2020, 11, 146–153. [Google Scholar] [CrossRef]
- Dziechciarz-Duda, M. Marketing applications of cluster analysis to durables market segmentation. Klasyfikacja i analiza danych–teoria i zastosowania. Taksonomia 2007, 14, 523–532. [Google Scholar]
- Karataş, A.; Şahin, S. Application Areas of Community Detection: A Review. In Proceedings of the International Congress on Big Data, Deep Learning and Fighting Cyber Terrorism (IBIGDELFT), Ankara, Turkey, 3–4 December 2018; pp. 65–70. [Google Scholar]
- Yin, S.; Gan, G.; Valdez, E.A.; Vadiveloo, J. Applications of Clustering with Mixed Type Data in Life Insurance. Risks 2021, 9, 47. [Google Scholar] [CrossRef]
- Tanwar, B. Clustering Techniques for Digital Image Segmentation. Int. J. Sci. Eng. Res. 2016, 7, 55. [Google Scholar]
- Mittal, H.; Pandey, A.C.; Saraswat, M.; Kumar, S.; Pal, R.; Modwel, G. A comprehensive survey of image segmentation: Clustering methods, performance parameters, and benchmark datasets. Multimed. Tools Appl. 2021, 81, 35001–35026. [Google Scholar] [CrossRef]
- Ramadas, M.; Abraham, A. Metaheuristics for Data Clustering and Image Segmentation. In Proceedings of the Intelligent Systems Reference Library; Springer: Berlin/Heidelberg, Germany, 2019. [Google Scholar]
- Singh, S.; Srivastava, S. Review of Clustering Techniques in Control System: Review of Clustering Techniques in Control System. Procedia Comput. Sci. 2020, 173, 272–280. [Google Scholar] [CrossRef]
- Gandhi, G.; Srivastava, R. Review Paper: A Comparative Study on Partitioning Techniques of Clustering Algorithms. Int. J. Comput. Appl. 2014, 87, 10–13. [Google Scholar] [CrossRef]
- Saxena, A.; Prasad, M.; Gupta, A.; Bharill, N.; Patel, O.P.; Tiwari, A.; Er, M.J.; Ding, W.; Lin, C.T. A Review of Clustering Techniques and Developments. Neurocomputing 2017, 267, 664–681. [Google Scholar] [CrossRef]
- Indhu, R.; Porkodi, R. Comparison of Clustering Algorithm. Int. J. Sci. Res. Comput. Sci. Eng. Inf. Technol. (IJSRCSEIT) 2018, 3, 218–223. [Google Scholar]
- Abu Abbas, O. Comparisons Between Data Clustering Algorithms. Int. Arab. J. Inf. Technol. 2008, 5, 320–325. [Google Scholar]
- Johnson, S. Hierarchical clustering schemes. Psychometrika 1967, 32, 241–254. [Google Scholar] [CrossRef]
- Xu, D.; Tian, Y. A Comprehensive Survey of Clustering Algorithms. Ann. Data Sci. 2015, 2, 165–193. [Google Scholar] [CrossRef]
- Murtagh, F. A Survey of Recent Advances in Hierarchical Clustering Algorithms. Comput. J. 1983, 26, 354–359. [Google Scholar] [CrossRef]
- Gil-García, R.; Pons-Porrata, A. Dynamic hierarchical algorithms for document clustering. Pattern Recognit. Lett. 2010, 31, 469–477. [Google Scholar] [CrossRef]
- Feng, L.; Qiu, M.H.; Wang, Y.X.; Xiang, Q.L.; Yang, Y.F.; Liu, K. A fast divisive clustering algorithm using an improved discrete particle swarm optimizer. Pattern Recognit. Lett. 2010, 31, 1216–1225. [Google Scholar] [CrossRef]
- Wang, W.; Zhang, Y.; Li, y.; Zhang, X. The Global Fuzzy C-Means Clustering Algorithm. In Proceedings of the 6th World Congress on Intelligent Control and Automation, Dalian, China, 21–23 June 2006; Volume 1, pp. 3604–3607. [Google Scholar]
- Dave, R.N.; Bhaswan, K. Adaptive fuzzy c-shells clustering and detection of ellipses. IEEE Trans. Neural Networks 1992, 3, 643–662. [Google Scholar] [CrossRef] [PubMed]
- Yager, R.R.; Filev, D.P. Approximate clustering via the mountain method. IEEE Trans. Syst. Man. Cybern. 1994, 24, 1279–1284. [Google Scholar] [CrossRef]
- MacQueen, J. Some methods for classification and analysis of multivariate observations. Proc. Fifth Berkeley Symp. Math. Stat. Probab. 1967, 1, 281–297. [Google Scholar]
- Park, H.S.; Jun, C.H. A simple and fast algorithm for K-medoids clustering. Expert Syst. Appl. 2009, 36, 3336–3341. [Google Scholar] [CrossRef]
- Kaufman, L.; Rousseeuw, P. Finding Groups in Data: An Introduction to Cluster Analysis; Wiley: New York, NY, USA, 1990; pp. 126–163. [Google Scholar]
- Ng, R.T.; Han, J. CLARANS: A method for clustering objects for spatial data mining. IEEE Trans. Knowl. Data Eng. 2002, 14, 1003–1016. [Google Scholar] [CrossRef]
- Kriegel, H.P.; Kröger, P.; Sander, J.; Zimek, A. Density-based Clustering. Wiley Interdisc. Rew. Data Min. Knowl. Discov. 2011, 1, 231–240. [Google Scholar] [CrossRef]
- Ester, M.; Kriegel, H.P.; Sander, J.; Xu, X. A Density-Based Algorithm for Discovering Clusters in Large Spatial Databases with Noise; AAAI Press: Palo Alto, CA, USA, 1996. [Google Scholar]
- Ankerst, M.; Breunig, M.M.; Kriegel, H.P.; Sander, J. OPTICS: Ordering Points to Identify the Clustering Structure. Acm Sigmod Rec. 1999, 28, 49–60. [Google Scholar] [CrossRef]
- Comaniciu, D.; Meer, P. Mean shift: A robust approach toward feature space analysis. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 603–619. [Google Scholar] [CrossRef]
- Agrawal, R.; Gehrke, J.; Gunopulos, D.; Raghavan, P. Automatic subspace clustering of high dimensional data for data mining applications. In Proceedings of the ACM SIGMOD International Conference on Management of Data (SIGMOD ’98), Seattle, WA, USA, 1–4 June 1998; pp. 94–105. [Google Scholar]
- Wang, W.; Yang, J.; Muntz, R.R. STING: A Statistical Information Grid Approach to Spatial Data Mining. In Proceedings of the VLDB, Athens, Greece, 26–29 August 1997. [Google Scholar]
- Goil, S.; Nagesh, H.; Choudhary, A. MAFIA: Efficient and Scalable Subspace Clustering for Very Large Data Sets; Technical Report NumberCPDC-TR-9906-019; Center for Parallel and Distributed Computing, Northwestern University: Evanston, IL, USA, 1999. [Google Scholar]
- Sheikholeslami, G.; Chatterjee, S.; Zhang, A. Wavecluster: A multi-resolution clustering approach for very large spatial databases. In Proceedings of the VLDB, New York, NY, USA, 24–27 August 1998; pp. 428–439. [Google Scholar]
- Si, Y.; Liu, P.; Li, P.; Brutnell, T.P. Model-based clustering for RNA-seq data. Bioinformatics 2013, 30, 197–205. [Google Scholar] [CrossRef]
- Zhuang, X.; Huang, Y.; Palaniappan, K.; Zhao, Y. Gaussian mixture density modeling, decomposition, and applications. IEEE Trans. Image Process. 1996, 5, 1293–1302. [Google Scholar] [CrossRef]
- Barbará, D.; Li, Y.; Couto, J. COOLCAT: An entropy-based algorithm for categorical clustering. In Proceedings of the 11th International Conference on Information and Knowledge Management (CIKM ’02), McLean, VA, USA, 4–9 November 2002; pp. 582–589. [Google Scholar]
- Bay, S.D.; Pazzani, M.J. Detecting change in categorical data: Mining contrast sets. In Proceedings of the 5th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD ’99), San Diego, CA, USA, 15–18 August 1999; pp. 302–306. [Google Scholar]
- Goldberg, D.; Holland, J. Genetic Algorithms and Machine Learning. Mach. Learn. 1988, 3, 95–99. [Google Scholar] [CrossRef]
- Storn, R.; Price, K.V. Differential Evolution—A Simple and Efficient Heuristic for global Optimization over Continuous Spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
- Eberhart, R.; Kennedy, J. A new optimizer using particle swarm theory. In Proceedings of the MHS’95—6th International Symposium on Micro Machine and Human Science, Nagoya, Japan, 4–6 October 1995; pp. 39–43. [Google Scholar]
- Agbaje, M.; Ezugwu, A.; Els, R. Automatic Data Clustering Using Hybrid Firefly Particle Swarm Optimization Algorithm. IEEE Access 2019, 7, 184963–184984. [Google Scholar] [CrossRef]
- Karaboga, D. An Idea Based on Honey Bee Swarm for Numerical Optimization; Technical Report—TR06; Erciyes University: Kayseri, Turkey, 2005; pp. 1–10. [Google Scholar]
- Rajah, V.; Ezugwu, A.E. Hybrid Symbiotic Organism Search algorithms for Automatic Data Clustering. In Proceedings of the Conference on Information Communications Technology and Society (ICTAS), Virtual, 9–10 March 2020; pp. 1–9. [Google Scholar]
- Das, S.; Chowdhury, A.; Abraham, A. A Bacterial Evolutionary Algorithm for automatic data clustering. In Proceedings of the Congress on Evolutionary Computation, Trondheim, Norway, 18–21 May 2009; pp. 2403–2410. [Google Scholar]
- Kapoor, S.; Zeya, I.; Singhal, C.; Nanda, S. A Grey Wolf Optimizer Based Automatic Clustering Algorithm for Satellite Image Segmentation. Procedia Comput. Sci. 2017, 115, 415–422. [Google Scholar] [CrossRef]
- Jensi, R.; Jiji, G.W. MBA-IF:A New Data Clustering Method Using Modified Bat Algorithm and Levy Flight. In Proceedings of the SOCO 2015, Burgos, Spain, 15–17 June 2015; pp. 15–17. [Google Scholar]
- Goel, S.; Sharma, A.; Bedi, P. Cuckoo Search Clustering Algorithm: A novel strategy of biomimicry. In Proceedings of the World Congress on Information and Communication Technologies, Mumbai, India, 11–14 December 2011; pp. 916–921. [Google Scholar]
- Rao, R.; Savsani, V.; Vakharia, D. Teaching–learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput. Aided Des. 2011, 43, 303–315. [Google Scholar] [CrossRef]
- Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. Starling murmuration optimizer: A novel bio-inspired algorithm for global and engineering optimization. Comput. Methods Appl. Mech. Eng. 2022, 392, 114616. [Google Scholar] [CrossRef]
- Nadimi-Shahraki, M.H.; Asghari Varzaneh, Z.; Zamani, H.; Mirjalili, S. Binary Starling Murmuration Optimizer Algorithm to Select Effective Features from Medical Data. Appl. Sci. 2023, 13, 564. [Google Scholar] [CrossRef]
- Montiel Ross, O.H. A Review of Quantum-Inspired Metaheuristics: Going From Classical Computers to Real Quantum Computers. IEEE Access 2020, 8, 814–838. [Google Scholar] [CrossRef]
- Mani, N.; Srivastava, G.; Mani, A. Solving Combinatorial Optimization problems with Quantum inspired Evolutionary Algorithm Tuned using a Novel Heuristic Method. arXiv 2016, arXiv:1612.08109. [Google Scholar]
- Abs da Cruz, A.; Barbosa, C.; Pacheco, M.; Vellasco, M. Quantum-inspired evolutionary algorithms and its application to numerical optimization problems. Lect. Notes Comput. Sci. 2004, 3316, 212–217. [Google Scholar]
- DBLP-Citation-Network V12. Available online: https://www.aminer.org/citation (accessed on 7 October 2021).
- Amami, M.; Pasi, G.; Stella, F.; Faiz, R. An LDA-Based Approach to Scientific Paper Recommendation. In Natural Language Processing and Information Systems; Springer: Berlin/Heidelberg, Germany, 2016; Volume 9612, pp. 200–210. [Google Scholar]
- Saha, R.; Tariq, M.T.; Hadi, M.; Xiao, Y. Pattern Recognition Using Clustering Analysis to Support Transportation System Management, Operations, and Modeling. J. Adv. Transp. 2019, 2019, 1–12. [Google Scholar] [CrossRef]
- Jardine, N.; van Rijsbergen, C.J. The use of hierarchic clustering in information retrieval. Inf. Storage Retr. 1971, 7, 217–240. [Google Scholar] [CrossRef]
- Oyelade, J.; Isewon, I.; Oladipupo, F.; Aromolaran, O.; Uwoghiren, E.; Ameh, F.; Achas, M.; Adebiyi, E. Clustering Algorithms: Their Application to Gene Expression Data. Bioinform. Biol. Insights 2016, 10, 237–253. [Google Scholar] [CrossRef]
- Seetharaman, S.K.; Thouheed Ahmed, S.; Gunashree; Bhumika, P.; Ishwarya; Anusha, B. A Generalized Study on Data Mining and Clustering Algorithm. In New Trends in Computational Vision and Bio-inspired Computing; Springer: Berlin/Heidelberg, Germany, 2018. [Google Scholar]
- Xu, J.; Liu, H. Web user clustering analysis based on K-Means algorithm. In Proceedings of the International Conference on Information, Networking and Automation (ICINA), Kunming, China, 18–19 October 2010; Volume 2, pp. 6–9. [Google Scholar]
- Montanaro, A. Quantum algorithms: An overview. Npj Quantum Inf. 2016, 2, 15023. [Google Scholar] [CrossRef]
- Jordan, S. The Quantum Algorithm Zoo. Available online: https://quantumalgorithmzoo.org/ (accessed on 7 October 2021).
- Deutsch, D. Quantum theory, the Church–Turing principle and the universal quantum computer. Proc. R. Soc. London. A. Math. Phys. Sci. 1985, 400, 117–197. [Google Scholar]
- Pour-El, M.B.; Richards, I. The wave equation with computable initial data such that its unique solution is not computable. Adv. Math. 1981, 39, 215–239. [Google Scholar] [CrossRef]
- Benioff, P. Quantum Mechanical Models of Turing Machines That Dissipate No Energy. Phys. Rev. Lett. 1982, 48, 1581–1585. [Google Scholar] [CrossRef]
- Feynman, R.P. Simulating physics with computers. Int. J. Theor. Phys. 1982, 21, 467–488. [Google Scholar] [CrossRef]
- Nourbakhsh, A.; Jones, M.; Kristjuhan, K.; Carberry, D.; Karon, J.; Beenfeldt, C.; Shahriari, K.; Andersson, M.; Jadidi, M.; Mansouri, S. Quantum Computing: Fundamentals, Trends and Perspectives for Chemical and Biochemical Engineers. arXiv 2022, arXiv:2201.02823. [Google Scholar]
- Aung, D.M.M.; Aye, K.T.K.; Aung, T.M. On the Study of Quantum Computing. In Proceedings of the Conference on Science and Technology Development (CSTD-2019), Pyin Oo Lwin, Myanmar, 31 October–1 November 2019. [Google Scholar]
- Deutsch, D.; Jozsa, R. Rapid Solution of Problems by Quantum Computation. Proc. R. Soc. Lond. Ser. A 1992, 439, 553–558. [Google Scholar]
- Simon, D.R. On the power of quantum computation. In Proceedings of the 35th Annual Symposium on Foundations of Computer Science, Santa Fe, NM, USA, 20–22 November 1994; pp. 116–123. [Google Scholar]
- Shor, P.W. Algorithms for quantum computation: Discrete logarithms and factoring. In Proceedings of the 35th Annual Symposium on Foundations of Computer Science, Santa Fe, NM, USA, 20–22 November 1994; pp. 124–134. [Google Scholar]
- Grover, L.K. A fast quantum mechanical algorithm for database search. In Proceedings of the 28th Annual ACM Symposium on Theory of Computing, Philadelphia, PA, USA, 22–24 May 1996; pp. 212–219. [Google Scholar]
- Hey, T. Quantum computing: An introduction. Comput. Control Eng. J. 1999, 10, 105–112. [Google Scholar] [CrossRef]
- Blatt, R.; Häiffner, H.; Roos, C.F.; Becher, C.; Schmidt-Kaler, F. Course 5—Quantum Information Processing in Ion Traps I. In Quantum Entanglement and Information Processing; Estève, D., Raimond, J.M., Dalibard, J., Eds.; Elsevier: Amsterdam, The Netherlands, 2004; Volume 79, pp. 223–260. [Google Scholar]
- Vedral, V. Quantum entanglement. Nat. Phys. 2014, 10, 256–258. [Google Scholar] [CrossRef]
- Li, B.; Yu, Z.H.; Fei, S.M. Geometry of Quantum Computation with Qutrits. Sci. Rep. 2013, 3, 2594. [Google Scholar] [CrossRef]
- Gokhale, P.; Baker, J.M.; Duckering, C.; Chong, F.T.; Brown, N.C.; Brown, K.R. Extending the Frontier of Quantum Computers With Qutrits. IEEE Micro 2020, 40, 64–72. [Google Scholar] [CrossRef]
- Chi, Y.; Huang, J.; Zhang, Z.; Mao, J.; Zhou, Z.; Chen, X.; Zhai, C.; Bao, J.; Dai, T.; Yuan, H.; et al. A programmable qudit-based quantum processor. Nat. Commun. 2022, 13, 1166. [Google Scholar] [CrossRef] [PubMed]
- Luo, M.; Wang, X. Universal quantum computation with qudits. Sci. China Physics Mech. Astron. 2014, 57, 1712–1717. [Google Scholar] [CrossRef]
- Wang, Y.; Hu, Z.; Sanders, B.C.; Kais, S. Qudits and high-dimensional quantum computing. Front. Phys. 2020, 8, 589504. [Google Scholar] [CrossRef]
- Brylinski, J.L.; Brylinski, R. Universal quantum gates. In Proceedings of the Mathematics of Quantum Computation; American Mathematical Society: New York, NY, USA, 2002; p. 117. [Google Scholar]
- Nielsen, M.A.; Chuang, I.L. Quantum Computation and Quantum Information: 10th Anniversary Edition; Cambridge University Press: Cambridge, UK, 2010. [Google Scholar]
- Gómez, F.J.O.; Lopez, G.O.; Garzon, E.M. A Faster Half Subtractor Circuit Using Reversible Quantum Gates. Balt. J. Mod. Comput. 2019, 7, 99–111. [Google Scholar]
- Fahdil, M.A.; Al-Azawi, A.F.; Said, S. Operations Algorithms on Quantum Computer. IJCSNS Int. J. Comput. Sci. Netw. Secur. 2010, 10, 85. [Google Scholar]
- Acampora, G.; Vitiello, A. Implementing evolutionary optimization on actual quantum processors. Inf. Sci. 2021, 575, 542–562. [Google Scholar] [CrossRef]
- Yang, S.; Wang, M.; Jiao, L. A novel quantum evolutionary algorithm and its application. In Proceedings of the Congress on Evolutionary Computation (IEEE Cat. No. 04TH8753), Portland, OR, USA, 19–23 June 2004; Volume 1, pp. 820–826. [Google Scholar]
- Narayanan, A.; Moore, M. Quantum-inspired genetic algorithms. In Proceedings of the International Conference on Evolutionary Computation, Nayoya, Japan, 20–22 May 1996; pp. 61–66. [Google Scholar]
- Konar, D.; Sharma, K.; Sarogi, V.; Bhattacharyya, S. A Multi-Objective Quantum-Inspired Genetic Algorithm (Mo-QIGA) for Real-Time Tasks Scheduling in Multiprocessor Environment. Procedia Comput. Sci. 2018, 131, 591–599. [Google Scholar] [CrossRef]
- Balicki, J. Many-Objective Quantum-Inspired Particle Swarm Optimization Algorithm for Placement of Virtual Machines in Smart Computing Cloud. Entropy 2022, 24, 58. [Google Scholar] [CrossRef]
- Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. QANA: Quantum-based avian navigation optimizer algorithm. Eng. Appl. Artif. Intell. 2021, 104, 104314. [Google Scholar] [CrossRef]
- Nadimi-Shahraki, M.H.; Fatahi, A.; Zamani, H.; Mirjalili, S. Binary Approaches of Quantum-Based Avian Navigation Optimizer to Select Effective Features from High-Dimensional Medical Data. Mathematics 2022, 10, 2770. [Google Scholar] [CrossRef]
- Platt, J.C.; Czerwinski, M.; Field, B.A. PhotoTOC: Automatic clustering for browsing personal photographs. In Proceedings of the 4th International Conference on Information, Communications and Signal Processing, 2003 and the 4th Pacific Rim Conference on Multimedia. Proceedings of the 2003 Joint, Singapore, 15–18 December 2003; Volume 1, pp. 6–10. [Google Scholar]
- Lei, T.; Liu, P.; Jia, X.; Zhang, X.; Meng, H.; Nandi, A.K. Automatic Fuzzy Clustering Framework for Image Segmentation. IEEE Trans. Fuzzy Syst. 2020, 28, 2078–2092. [Google Scholar] [CrossRef]
- Tseng, L.Y.; Bien Yang, S. A genetic approach to the automatic clustering problem. Pattern Recognit. 2001, 34, 415–424. [Google Scholar] [CrossRef]
- Azhir, E.; Navimipour, N.J.; Hosseinzadeh, M.; Sharifi, A.; Darwesh, A. An automatic clustering technique for query plan recommendation. Inf. Sci. 2021, 545, 620–632. [Google Scholar] [CrossRef]
- Chen, J.H.; Chang, Y.C.; Hung, W.L. A robust automatic clustering algorithm for probability density functions with application to categorizing color images. Commun. Stat. Simul. Comput. 2018, 47, 2152–2168. [Google Scholar] [CrossRef]
- Geraud, T.; Strub, P.; Darbon, J. Color image segmentation based on automatic morphological clustering. In Proceedings of the International Conference on Image Processing (Cat. No.01CH37205), Thessaloniki, Greece, 7–10 October 2001; Volume 3, pp. 70–73. [Google Scholar]
- Zhu, S.; Xu, L.; Cao, L. A Study of Automatic Clustering Based on Evolutionary Many-Objective Optimization. In Proceedings of the Genetic and Evolutionary Computation Conference Companion, Kyoto, Japan, 15–19 July 2018; pp. 173–174. [Google Scholar]
- Binu, D. Cluster analysis using optimization algorithms with newly designed objective functions. Expert Syst. Appl. 2015, 42, 5848–5859. [Google Scholar] [CrossRef]
- Wang, C.W.; Hwang, J.I. Automatic clustering using particle swarm optimization with various validity indices. In Proceedings of the 5th International Conference on BioMedical Engineering and Informatics, Chongqing, China, 16–18 October 2012; pp. 1557–1561. [Google Scholar]
- Tsai, C.W.; Liao, Y.H.; Chiang, M.C. A quantum-inspired evolutionary clustering algorithm. In Proceedings of the International Conference on Fuzzy Theory and Its Applications (iFUZZY), Taipei, Taiwan, 6–8 December 2013; pp. 305–310. [Google Scholar]
- Li, Y.; Shi, H.; Gong, M.; Shang, R. Quantum-Inspired Evolutionary Clustering Algorithm Based on Manifold Distance. In Proceedings of the 1st ACM/SIGEVO Summit on Genetic and Evolutionary Computation, Shanghai, China, 12–14 June 2009; pp. 871–874. [Google Scholar]
- Theodoridis, S.; Koutroumbas, K. Pattern Recognition, 4th ed.; Academic Press: Cambridge, MA, USA, 2009. [Google Scholar]
- Xie, X.; Beni, G. A validity measure for fuzzy clustering. IEEE Trans. Pattern Anal. Mach. Intell. 1991, 13, 841–847. [Google Scholar] [CrossRef]
- Kim, M.; Ramakrishna, R. New indices for cluster validity assessment. Pattern Recognit. Lett. 2005, 26, 2353–2363. [Google Scholar] [CrossRef]
- Zhou, M.; Şenol, A. VIASCKDE Index: A Novel Internal Cluster Validity Index for Arbitrary-Shaped Clusters Based on the Kernel Density Estimation. Comput. Intell. Neurosci. 2022, 2022, 1687–5265. [Google Scholar]
- Saha, S.; Bandyopadhyay, S. Some connectivity based cluster validity indices. Appl. Soft Comput. 2012, 12, 1555–1565. [Google Scholar] [CrossRef]
- Arbelaitz, O.; Gurrutxaga, I.; Muguerza, J.; Pérez, J.; Perona, I. An extensive comparative study of cluster validity indices. Pattern Recognit. 2013, 46, 243–256. [Google Scholar] [CrossRef]
- José-García, A.; Gómez-Flores, W. A Survey of Cluster Validity Indices for Automatic Data Clustering Using Differential Evolution. Proceedings of Genetic and Evolutionary Computation Conference, Lille, France, 10–14 July 2021; pp. 314–322. [Google Scholar]
- Liu, Y.; Li, Z.; Xiong, H.; Gao, X.; Wu, J.; Wu, S. Understanding and Enhancement of Internal Clustering Validation Measures. IEEE Trans. Cybern. 2013, 43, 982–994. [Google Scholar]
- Hu, L.; Zhong, C. An Internal Validity Index Based on Density-Involved Distance. IEEE Access 2019, 7, 40038–40051. [Google Scholar] [CrossRef]
- Li, Q.; Yue, S.; Wang, Y.; Ding, M.; Li, J. A New Cluster Validity Index Based on the Adjustment of Within-Cluster Distance. IEEE Access 2020, 8, 202872–202885. [Google Scholar] [CrossRef]
- Davies, D.L.; Bouldin, D.W. A Cluster Separation Measure. IEEE Trans. Pattern Anal. Mach. Intell. 1979, PAMI-1, 224–227. [Google Scholar] [CrossRef]
- Dunn, J.C. A Fuzzy Relative of the ISODATA Process and Its Use in Detecting Compact Well-Separated Clusters. J. Cybern. 1973, 3, 32–57. [Google Scholar] [CrossRef]
- Caliński, T.; Harabasz, J. A dendrite method for cluster analysis. Commun. Stat. 1974, 3, 1–27. [Google Scholar]
- Rousseeuw, P.J. Silhouettes: A Graphical Aid to the Interpretation and Validation of Cluster Analysis. J. Comput. Appl. Math. 1987, 20, 53–65. [Google Scholar] [CrossRef]
- Bezdek, J.C. Cluster Validity. In Pattern Recognition with Fuzzy Objective Function Algorithms; Springer: Boston, MA, USA, 1981; pp. 95–154. [Google Scholar]
- Halkidi, M.; Vazirgiannis, M. Clustering Validity Assessment: Finding the optimal partitioning of a data set. In Proceedings of the International Conference on Data Mining, San Jose, CA, USA, 29 November–2 December 2001; pp. 187–194. [Google Scholar]
- Maulik, U.; Bandyopadhyay, S. Performance evaluation of some clustering algorithms and validity indices. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 1650–1654. [Google Scholar] [CrossRef]
- Chou, C.H.; Su, M.C.; Lai, E. A new cluster validity measure and its application to image compression. Pattern Anal. Appl. 2004, 7, 205–220. [Google Scholar] [CrossRef]
- Pakhira, M.K.; Bandyopadhyay, S.; Maulik, U. Validity index for crisp and fuzzy clusters. Pattern Recognit. 2004, 37, 487–501. [Google Scholar] [CrossRef]
- Cheng, D.; Zhu, Q.; Huang, J.; Wu, Q.; Yang, L. A Novel Cluster Validity Index Based on Local Cores. IEEE Trans. Neural Networks Learn. Syst. 2019, 30, 985–999. [Google Scholar] [CrossRef] [PubMed]
- Husain, H.; Khalid, M.; Yusof, R. Automatic clustering of generalized regression neural network by similarity index based fuzzy c-means clustering. In Proceedings of the Region 10 Conference TENCON 2004, Chiang Mai, Thailand, 24 November 2004; Volume 2, pp. 302–305. [Google Scholar]
- Specht, D.F.; Shapiro, P.D. Generalization accuracy of probabilistic neural networks compared with backpropagation networks. In Proceedings of the IJCNN-91-Seattle International Joint Conference on Neural Networks, Seattle, WA, USA, 8–12 July 1991; Volume 1, pp. 887–892. [Google Scholar]
- Specht, D. A General Regression Neural Network. IEEE Trans. Neural Networks 1991, 2, 568–578. [Google Scholar] [CrossRef] [PubMed]
- Nosovskiy, G.V.; Liu, D.; Sourina, O. Automatic clustering and boundary detection algorithm based on adaptive influence function. Pattern Recognit. 2008, 41, 2757–2776. [Google Scholar] [CrossRef]
- Li, L.; Yu, Z.; Feng, Z.; Zhang, X. Automatic classification of uncertain data by soft classifier. In Proceedings of the International Conference on Machine Learning and Cybernetics, Guilin, China, 10–13 July 2011; Volume 2, pp. 679–684. [Google Scholar]
- Zhang, Y.; Xia, Y.; Liu, Y.; Wang, W. Clustering Sentences with Density Peaks for Multi-document Summarization. In Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Denver, CO, USA, 31 May–5 June 2015; pp. 1262–1267. [Google Scholar]
- Rodriguez, A.; Laio, A. Clustering by fast search and find of density peaks. Science 2014, 344, 1492–1496. [Google Scholar] [CrossRef] [PubMed]
- DUC 2004: Documents, Tasks, and Measures (Some Comparisons to DUC 2003). Available online: https://duc.nist.gov/duc2004/ (accessed on 7 October 2021).
- Conroy, J.M.; Schlesinger, J.D.; Goldstein, J.; O’Leary, D.P. Left-Brain/Right-Brain Multi-Document Summarization. In Proceedings of the Document Understanding Conference (DUC 2004), Boston, MA, USA, 6–7 May 2004. [Google Scholar]
- Radev, D.R.; Jing, H.; Styś, M.; Tam, D. Centroid-based summarization of multiple documents. Inf. Process. Manag. 2004, 40, 919–938. [Google Scholar] [CrossRef]
- Wan, X.; Yang, J. Multi-document summarization using cluster-based link analysis. In Proceedings of the 31st Annual international ACM SIGIR Conference on Research and Development in Information Retrieval, Singapore, 20–24 July 2008; pp. 299–306. [Google Scholar]
- Wang, D.; Li, T.; Zhu, S.; Ding, C. Multi-document summarization via sentence-level semantic analysis and symmetric matrix factorization. In Proceedings of the 31st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Singapore, 20–24 July 2008; pp. 307–314. [Google Scholar]
- Cai, X.; Li, W. Ranking Through Clustering: An Integrated Approach to Multi-Document Summarization. IEEE Trans. Audio Speech Lang. Process. 2013, 21, 1424–1433. [Google Scholar]
- Wang, D.; Zhu, S.; Li, T.; Chi, Y.; Gong, Y. Integrating Document Clustering and Multidocument Summarization. TKDD 2011, 5, 14. [Google Scholar] [CrossRef]
- Erkan, G.; Radev, D.R. Lexrank: Graph-based lexical centrality as salience in text summarization. J. Artif. Intell. Res. 2004, 22, 457–479. [Google Scholar] [CrossRef]
- Lin, H.; Bilmes, J. A Class of Submodular Functions for Document Summarization. In Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies, Portland, OR, USA, 19–24 June 2011; Volume 1, pp. 510–520. [Google Scholar]
- Wang, D.; Li, T. Weighted consensus multi-document summarization. Inf. Process. Manag. 2012, 48, 513–523. [Google Scholar] [CrossRef]
- Wang, G.; Song, Q. Automatic Clustering via Outward Statistical Testing on Density Metrics. IEEE Trans. Knowl. Data Eng. 2016, 28, 1971–1985. [Google Scholar] [CrossRef]
- Fränti, P.; Sieranoja, S. K-means properties on six clustering benchmark datasets. Appl. Intell. 2018, 48, 4743–4759. [Google Scholar] [CrossRef]
- Chen, Z.; Chang, D.; Zhao, Y. An Automatic Clustering Algorithm Based on Region Segmentation. IEEE Access 2018, 6, 74247–74259. [Google Scholar] [CrossRef]
- Cassisi, C.; Ferro, A.; Giugno, R.; Pigola, G.; Pulvirenti, A. Enhancing density-based clustering: Parameter reduction and outlier detection. Inf. Syst. 2013, 38, 317–330. [Google Scholar] [CrossRef]
- Cheng, Q.; Lu, X.; Liu, Z.; Huang, J.; Cheng, G. Spatial clustering with Density-Ordered tree. Phys. A Stat. Mech. Its Appl. 2016, 460, 188–200. [Google Scholar] [CrossRef]
- Ram, A.; Sharma, A.; Jalal, A.S.; Agrawal, A.; Singh, R. An enhanced density based spatial clustering of applications with noise. In Proceedings of the International Advance Computing Conference, Patiala, India, 6–7 March 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 1475–1478. [Google Scholar]
- Sundberg, R.; Melander, E. Introducing the UCDP georeferenced event dataset. J. Peace Res. 2013, 50, 523–532. [Google Scholar] [CrossRef]
- Yangyang, H.; Zengli, L. Fuzzy clustering algorithm for automatically determining the number of clusters. In Proceedings of the International Conference on Signal Processing, Communications and Computing (ICSPCC), Dalian, China, 20–22 September 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–4. [Google Scholar]
- Wang, L.; Zheng, K.; Tao, X.; Han, X. Affinity propagation clustering algorithm based on large-scale data-set. Int. J. Comput. Appl. 2018, 40, 1–6. [Google Scholar] [CrossRef]
- Studiawan, H.; Payne, C.; Sohel, F. Automatic Graph-Based Clustering for Security Logs. In Advanced Information Networking and Applications; Springer: Berlin/Heidelberg, Germany, 2020; pp. 914–926. [Google Scholar]
- Hofstede, R.; Hendriks, L.; Sperotto, A.; Pras, A. SSH Compromise Detection Using NetFlow/IPFIX. SIGCOMM Comput. Commun. Rev. 2014, 44, 20–26. [Google Scholar] [CrossRef]
- Sconzo, M. SecRepo.com: Security Data Samples Repository. 2014. Available online: http://www.secrepo.com (accessed on 7 October 2021).
- Chuvakin, A. Scan 34 2005 from The Honeynet Project. 2005. Available online: https://seclists.org/focus-ids/2005/Apr/21 (accessed on 7 October 2021).
- National CyberWatch Center. Snort Fast Alert Logs from The U.S. National Cyber-Watch (MACCDC); National CyberWatch Center: Largo, MD, USA, 2012. [Google Scholar]
- Chuvakin, A. Free Honeynet Log Data for Research. Available online: http://honeynet.org/node/456/ (accessed on 7 October 2021).
- Sahoo, A.; Parida, P. Automatic clustering based approach for brain tumor extraction. J. Physics Conf. Ser. 2021, 1921, 012007. [Google Scholar] [CrossRef]
- Wang, Z.; Yu, Z.; Chen, C.; You, J.; Gu, T.; Wong, H.; Zhang, J. Clustering by Local Gravitation. IEEE Trans Cybern 2017, 48, 1383–1396. [Google Scholar] [CrossRef]
- Ruba, T.; Beham, D.M.P.; Tamilselvi, R.; Rajendran, T. Accurate Classification and Detection of Brain Cancer Cells in MRI and CT Images using Nano Contrast Agents. Biomed. Pharmacol. J. 2020, 13, 1227–1237. [Google Scholar] [CrossRef]
- Chahar, V.; Katoch, S.; Chauhan, S. A Review on Genetic Algorithm: Past, Present, and Future. Multimed. Tools Appl. 2021, 80, 8091–8126. [Google Scholar]
- Beheshti, Z.; Shamsuddin, S.M. A review of population-based meta-heuristic algorithm. Int. J. Adv. Soft Comput. Its Appl. 2013, 5, 1–35. [Google Scholar]
- Talbi, E.G.; Basseur, M.; Nebro, A.; Alba, E. Multi-objective optimization using metaheuristics: Non-standard algorithms. Int. Trans. Oper. Res. 2012, 19, 283–305. [Google Scholar] [CrossRef]
- Suresh, K.; Kundu, D.; Ghosh, S.; Das, S.; Abraham, A. Data Clustering Using Multi-objective Differential Evolution Algorithms. Fundam. Inform. 2009, 97, 381–403. [Google Scholar] [CrossRef]
- Bandyopadhyay, S.; Saha, S.; Maulik, U.; Deb, K. A Simulated Annealing-Based Multiobjective Optimization Algorithm: AMOSA. IEEE Trans. Evol. Comput. 2008, 12, 269–283. [Google Scholar] [CrossRef]
- Deb, K.; Pratap, A.; Agarwal, S.; Meyarivan, T. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 2002, 6, 182–197. [Google Scholar] [CrossRef]
- José-García, A.; Gómez-Flores, W. Automatic clustering using nature-inspired metaheuristics: A survey. Appl. Soft Comput. 2016, 41, 192–213. [Google Scholar] [CrossRef]
- Ezugwu, A.; Shukla, A.; Agbaje, M.; José-García, A.; Olaide, O.; Agushaka, O. Automatic clustering algorithms: A systematic review and bibliometric analysis of relevant literature. Neural Comput. Appl. 2021, 33, 6247–6306. [Google Scholar] [CrossRef]
- Talbi, E.G. Single-Solution Based Metaheuristics. In Metaheuristics: From Design to Implementation; Wiley: Hoboken, NJ, USA, 2009; Volume 74, pp. 87–189. [Google Scholar]
- Van Laarhoven, P.J.M.; Aarts, E.H.L. Simulated annealing. In Simulated Annealing: Theory and Applications; Springer: Berlin/Heidelberg, Germany, 1987; pp. 7–15. [Google Scholar]
- Glover, F. Tabu search—Part I. ORSA J. Comput. 1989, 1, 190–206. [Google Scholar] [CrossRef]
- Linhares, A.; Torreão, J.R.A. Microcanonical optimization applied to the traveling salesman problem. Int. J. Mod. Phys. C 1998, 9, 133–146. [Google Scholar] [CrossRef]
- Voudouris, C.; Tsang, E.; Alsheddy, A. Guided Local Search. In Handbook of Metaheuristics; Springer: Boston, MA, USA, 2010; pp. 321–361. [Google Scholar]
- Dubes, R.; Jain, A.K. Clustering techniques: The user’s dilemma. Pattern Recognit. 1976, 8, 247–260. [Google Scholar] [CrossRef]
- Defays, D. An efficient algorithm for a complete link method. Comput. J. 1977, 20, 364–366. [Google Scholar] [CrossRef]
- Garofolo, J.S.; Lamel, L.F.; Fisher, W.M.; Fiscus, J.G.; Pallett, D.S. DARPA TIMIT acoustic-phonetic continous speech corpus CD-ROM. NIST speech disc 1-1.1. NASA STI/Recon Tech. Rep. N 1993, 93, 27403. [Google Scholar]
- Van der Merwe, D.; Engelbrecht, A. Data clustering using particle swarm optimization. In Proceedings of the Congress on Evolutionary Computation, CEC ’03, Canberra, ACT, Australia, 8–12 December 2003; Volume 1, pp. 215–220. [Google Scholar]
- Garai, G.; Chaudhuri, B. A novel genetic algorithm for automatic clustering. Pattern Recognit. Lett. 2004, 25, 173–187. [Google Scholar] [CrossRef]
- Duda, R.O.; Hart, P.E. Pattern Classification and Scene Analysis; Wiley: Hoboken, NJ, USA, 1974. [Google Scholar]
- Fisher, R. The use of multiple measurements in taxonomic problems. Ann. Eugen. 1936, 7, 179–188. [Google Scholar] [CrossRef]
- Guha, S.; Rastogi, R.; Shim, K. CURE: An efficient clustering algorithm for large databases. ACM Sigmod Rec. 1998, 27, 73–84. [Google Scholar] [CrossRef]
- Karypis, G.; Han, E.H.; Kumar, V. Chameleon: Hierarchical clustering using dynamic modeling. Computer 1999, 32, 68–75. [Google Scholar] [CrossRef]
- Das, S.; Abraham, A.; Konar, A. Automatic Clustering Using an Improved Differential Evolution Algorithm. IEEE Trans. Syst. Man, Cybern. Part A Syst. Humans 2008, 38, 218–237. [Google Scholar] [CrossRef]
- Bandyopadhyay, S.; Maulik, U. Genetic clustering for automatic evolution of clusters and application to image classification. Pattern Recognit. 2002, 35, 1197–1208. [Google Scholar] [CrossRef]
- Omran, M.; Engelbrecht, A.; Salman, A. Dynamic Clustering using Particle Swarm Optimization with Application in Unsupervised Image Classification. In Proceedings of the 5th World Enformatika Conference (ICCI 2005), Prague, Czech Republic, 2005; pp. 199–204. [Google Scholar]
- Karaboga, D.; Ozturk, C. A novel clustering approach: Artificial Bee Colony (ABC) algorithm. Appl. Soft Comput. 2011, 11, 652–657. [Google Scholar] [CrossRef]
- De Falco, I.; Della Cioppa, A.; Tarantino, E. Facing classification problems with Particle Swarm Optimization. Appl. Soft Comput. 2007, 7, 652–658. [Google Scholar] [CrossRef]
- Chen, C.Y.; Ye, F. Particle swarm optimization algorithm and its application to clustering analysis. In Proceedings of the 17th Conference on Electrical Power Distribution, Tehran, Iran, 2–3 May 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 789–794. [Google Scholar]
- Pacheco, T.M.; Gonçalves, L.B.; Ströele, V.; Soares, S.S.R. An ant colony optimization for automatic data clustering problem. In Proceedings of the Congress on Evolutionary Computation (CEC), Rio de Janeiro, Brazil, 8–13 July 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–8. [Google Scholar]
- Abd Elaziz, M.; Nabil, N.; Ewees, A.A.; Lu, S. Automatic data clustering based on hybrid atom search optimization and sine-cosine algorithm. In Proceedings of the Congress on evolutionary computation (CEC), Wellington, New Zealand, 10–13 June 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 2315–2322. [Google Scholar]
- Zhao, W.; Wang, L.; Zhang, Z. Atom search optimization and its application to solve a hydrogeologic parameter estimation problem. Knowl. Based Syst. 2019, 163, 283–304. [Google Scholar] [CrossRef]
- Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl. Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
- Alrosan, A.; Alomoush, W.; Alswaitti, M.; Alissa, K.; Sahran, S.; Makhadmeh, S.N.; Alieyan, k. Automatic Data Clustering Based Mean Best Artificial Bee Colony Algorithm. Comput. Mater. Contin. 2021, 68, 1575–1593. [Google Scholar] [CrossRef]
- Ozturk, C.; Hancer, E.; Karaboga, D. Dynamic clustering with improved binary artificial bee colony algorithm. Appl. Soft Comput. 2015, 28, 69–80. [Google Scholar] [CrossRef]
- Martin, D.; Fowlkes, C.; Tal, D.; Malik, J. A database of human segmented natural images and its application to evaluating segmentation algorithms and measuring ecological statistics. In Proceedings of the 8th IEEE International Conference on Computer Vision, Vancouver, BC, Canada, 7–14 July 2001; IEEE: Piscataway, NJ, USA, 2001; Volume 2, pp. 416–423. [Google Scholar]
- Suresh, K.; Kundu, D.; Ghosh, S.; Das, S.; Abraham, A. Automatic clustering with multi-objective Differential Evolution algorithms. In Proceedings of the Congress on Evolutionary Computation, Trondheim, Norway, 18–21 May 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 2590–2597. [Google Scholar]
- Handl, J.; Knowles, J. An Evolutionary Approach to Multiobjective Clustering. IEEE Trans. Evol. Comput. 2007, 11, 56–76. [Google Scholar] [CrossRef]
- Xue, F.; Sanderson, A.C.; Graves, R.J. Pareto-based multi-objective differential evolution. In Proceedings of the The Congress on Evolutionary Computation, Canberra, ACT, Australia, 8–12 December 2003; pp. 862–869. [Google Scholar]
- Tusar, T.; Filipic, B. DEMO: Differential Evolution for Multiobjective Optimization; Institut Jozef Stefan: Ljubljana, Slovenia, 2005; pp. 520–533. [Google Scholar]
- Abbass, H.; Sarker, R. The Pareto Differential Evolution Algorithm. Int. J. Artif. Intell. Tools 2002, 11, 531–552. [Google Scholar] [CrossRef]
- Bandyopadhyay, S.; Maulik, U.; Mukhopadhyay, A. Multiobjective Genetic Clustering for Pixel Classification in Remote Sensing Imagery. IEEE Trans. Geosci. Remote Sens. 2007, 45, 1506–1511. [Google Scholar] [CrossRef]
- Matake, N.; Hiroyasu, T.; Miki, M.; Senda, T. Multiobjective clustering with automatic k-determination for large-scale data. In Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation, London, UK, 7–11 July 2007; pp. 861–868. [Google Scholar]
- Blake, C.; Keough, E.; Merz, C.J. UCI Repository of Machine Learning Database. Available online: http://www.ics.uci.edu/~mlearn/MLrepository.html (accessed on 7 October 2021).
- Sporulation Dataset. Available online: http://cmgm.stanford.edu/pbrown/sporulation (accessed on 7 October 2021).
- Kundu, D.; Suresh, K.; Ghosh, S.; Das, S.; Abraham, A.; Badr, Y. Automatic Clustering Using a Synergy of Genetic Algorithm and Multi-objective Differential Evolution. In Proceedings of the Hybrid Artificial Intelligence Systems; Corchado, E., Wu, X., Oja, E., Herrero, Á., Baruque, B., Eds.; Springer: Berlin/Heidelberg, Germany, 2009; pp. 177–186. [Google Scholar]
- Storn, R.; Price, P. Differential Evolution—A simple and efficient adaptive scheme for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 347. [Google Scholar] [CrossRef]
- Price, K.; Storn, R.M.; Lampinen, J.A. Differential Evolution: A Practical Approach to Global Optimization; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
- Bezdek, J.C. Cluster validity with fuzzy sets. J. Cybern. 1973, 3, 58–72. [Google Scholar] [CrossRef]
- Saha, S.; Bandyopadhyay, S. A generalized automatic clustering algorithm in a multiobjective framework. Appl. Soft Comput. 2013, 13, 89–108. [Google Scholar] [CrossRef]
- Bandyopadhyay, S.; Saha, S. A Point Symmetry-Based Clustering Technique for Automatic Evolution of Clusters. IEEE Trans. Knowl. Data Eng. 2008, 20, 1441–1457. [Google Scholar] [CrossRef]
- Pal, S.K.; Mitra, S. Fuzzy versions of Kohonen’s net and MLP-based classification: Performance evaluation for certain nonconvex decision regions. Inf. Sci. 1994, 76, 297–337. [Google Scholar] [CrossRef]
- Friedman, M. The Use of Ranks to Avoid the Assumption of Normality Implicit in the Analysis of Variance. Am. Stat. Assoc. 1937, 32, 675–701. [Google Scholar] [CrossRef]
- Nemenyi, P.B. Distribution-free Multiple Comparisons. Ph.D. Thesis, Princeton University, Princeton, NJ, USA, 1963. [Google Scholar]
- Abubaker, A.; Baharum, A.; Alrefaei, M. Automatic Clustering Using Multi-objective Particle Swarm and Simulated Annealing. PLoS ONE 2015, 10, e0130995. [Google Scholar] [CrossRef]
- Shieh, H.L.; Kuo, C.C.; Chiang, C.M. Modified particle swarm optimization algorithm with simulated annealing behavior and its numerical verification. Appl. Math. Comput. 2011, 218, 4365–4383. [Google Scholar] [CrossRef]
- Ulungu, B.; Teghem, J.; Fortemps, P. Heuristic for multi-objective combinatorial optimization problems by simulated annealing. In Proceedings of the MCDM: Theory and Applications; SciTech: Encinitas, CA, USA, 1995; pp. 229–238. [Google Scholar]
- Lichman, M. UCI Machine Learning Repository; University of California: Irvine, CA, USA, 2013. [Google Scholar]
- Paul, A.K.; Shill, P.C. New automatic fuzzy relational clustering algorithms using multi-objective NSGA-II. Inf. Sci. 2018, 448–449, 112–133. [Google Scholar] [CrossRef]
- Skabar, A.; Abdalgader, K. Clustering Sentence-Level Text Using a Novel Fuzzy Relational Clustering Algorithm. IEEE Trans. Knowl. Data Eng. 2013, 25, 62–75. [Google Scholar] [CrossRef]
- Wikaisuksakul, S. A multi-objective genetic algorithm with fuzzy c-means for automatic data clustering. Appl. Soft Comput. 2014, 24, 679–691. [Google Scholar] [CrossRef]
- Alon, U.; Barkai, N.; Notterman, D.A.; Gish, K.; Ybarra, S.; Mack, D.; Levine, A.J. Broad patterns of gene expression revealed by clustering analysis of tumor and normal colon tissues probed by oligonucleotide arrays. Proc. Natl. Acad. Sci. USA 1999, 96, 6745–6750. [Google Scholar] [CrossRef] [PubMed]
- Singh, D.; Febbo, P.G.; Ross, K.; Jackson, D.G.; Manola, J.; Ladd, C.; Tamayo, P.; Renshaw, A.A.; D’Amico, A.V.; Richie, J.P. Gene expression correlates of clinical prostate cancer behavior. Cancer Cell 2002, 1, 203–209. [Google Scholar] [CrossRef] [PubMed]
- O’Neill, M.C.; Song, L. Neural network analysis of lymphoma microarray data: Prognosis and diagnosis near-perfect. BMC Bioinform. 2003, 4, 13. [Google Scholar]
- Golub, T.R.; Slonim, D.K.; Tamayo, P.; Huard, C.; Gaasenbeek, M.; Mesirov, J.P.; Coller, H.; Loh, M.L.; Downing, J.R.; Caligiuri, M.A.; et al. Molecular Classification of Cancer: Class Discovery and Class Prediction by Gene Expression Monitoring. Science 1999, 286, 531–537. [Google Scholar] [CrossRef]
- Real Life Data Set. Available online: https://archive.ics.uci.edu/ml/machine-learning-databases (accessed on 23 September 2020).
- Dutta, D.; Sil, J.; Dutta, P. Automatic Clustering by Multi-Objective Genetic Algorithm with Numeric and Categorical Features. Expert Syst. Appl. 2019, 137, 357–379. [Google Scholar] [CrossRef]
- Chen, E.; Wang, F. Dynamic Clustering Using Multi-objective Evolutionary Algorithm. In Proceedings of the Computational Intelligence and Security; Hao, Y., Liu, J., Wang, Y., Cheung, Y.m., Yin, H., Jiao, L., Ma, J., Jiao, Y.C., Eds.; Springer: Berlin/Heidelberg, Germany, 2005; pp. 73–80. [Google Scholar]
- Huang, Z. Clustering Large Data Sets with Mixed Numeric and Categorical Values. In Proceedings of the 1st Pacific-Asia Conference on Knowledge Discovery and Data Mining, (PAKDD), Singapore, 23–24 February 1997; pp. 21–34. [Google Scholar]
- Bezdek, J.C.; Ehrlich, R.; Full, W. FCM: The fuzzy c-means clustering algorithm. Comput. Geosci. 1984, 10, 191–203. [Google Scholar] [CrossRef]
- Cheng, Y. Mean shift, mode seeking, and clustering. IEEE Trans. Pattern Anal. Mach. Intell. 1995, 17, 790–799. [Google Scholar] [CrossRef]
- King, B. Step-Wise Clustering Procedures. J. Am. Stat. Assoc. 1967, 62, 86–101. [Google Scholar] [CrossRef]
- Kohonen, T. The self-organizing map. Proc. IEEE 1990, 78, 1464–1480. [Google Scholar] [CrossRef]
- Rahman, M.A.; Islam, M.Z. A hybrid clustering technique combining a novel genetic algorithm with K-Means. Knowl. Based Syst. 2014, 71, 345–365. [Google Scholar] [CrossRef]
- Asuncion, A.; Newman, D. UCI Machine Learning Repository. 2007. Available online: https://archive.ics.uci.edu/ml/index.php (accessed on 23 September 2020).
- Fisher, R. Statistical Methods and Scientific Induction. J. R. Stat. Society. Ser. B (Methodological) 1955, 17, 69–78. [Google Scholar] [CrossRef]
- Qu, H.; Yin, L. An Automatic Clustering Algorithm Using NSGA-II with Gene Rearrangement. In Proceedings of the 10th International Conference on Intelligent Systems (IS), Varna, Bulgaria, 28–30 August 2020; pp. 503–509. [Google Scholar]
- Qu, H.; Yin, L.; Tang, X. An automatic clustering method using multi-objective genetic algorithm with gene rearrangement and cluster merging. Appl. Soft Comput. 2021, 99, 106929. [Google Scholar] [CrossRef]
- Lin, H.J.; Yang, F.W.; Kao, Y.T. An efficient GA-based clustering technique. J. Appl. Sci. Eng. 2005, 8, 113–122. [Google Scholar]
- Artificial Data Sets. Available online: https://research.manchester.ac.uk/en/publications/an-evolutionary-approach-to-multiobjective-clustering (accessed on 23 September 2020).
- Moore, M.; Narayanan, A. Quantum-Inspired Computing; University of Exeter: Exeter, UK, 1995. [Google Scholar]
- Han, K.H.; Kim, J.H. Genetic quantum algorithm and its application to combinatorial optimization problem. In Proceedings of the Congress on Evolutionary Computation, CEC00 (Cat. No. 00TH8512), La Jolla, CA, USA, 16–19 July 2000; IEEE: Piscataway, NJ, USA, 2000; Volume 2, pp. 1354–1360. [Google Scholar]
- Han, K.H.; Kim, J.H. Quantum-inspired evolutionary algorithm for a class of combinatorial optimization. IEEE Trans. Evol. Comput. 2002, 6, 580–593. [Google Scholar] [CrossRef]
- Wang, Y.; Feng, X.Y.; Huang, Y.; Pu, D.B.; Zhou, W.; Liang, Y.C.; Zhou, C.G. A novel quantum swarm evolutionary algorithm and its applications. Neurocomputing 2007, 70, 633–640. [Google Scholar] [CrossRef]
- Zouache, D.; Nouioua, F.; Moussaoui, A. Quantum Inspired Firefly Algorithm with Particle Swarm Optimization for Discrete Optimization Problems. Soft Comput. 2015, 20, 2781–2799. [Google Scholar] [CrossRef]
- Moore, P.; Venayagamoorthy, G.K. Evolving combinational logic circuits using a hybrid quantum evolution and particle swarm inspired algorithm. In Proceedings of the NASA/DoD Conference on Evolvable Hardware (EH’05), Washington, DC, USA, 29 June–1 July 2005; pp. 97–102. [Google Scholar]
- Ramdane, C.; Meshoul, S.; Batouche, M.; Kholladi, M.K. A quantum evolutionary algorithm for data clustering. IJDMMM 2010, 2, 369–387. [Google Scholar] [CrossRef]
- Maulik, U.; Bandyopadhyay, S. Genetic algorithm-based clustering technique. Pattern Recognit. 2000, 33, 1455–1465. [Google Scholar] [CrossRef]
- Zhou, W.; Zhou, C.; Huang, Y.; Wang, Y. Analysis of gene expression data: Application of quantum-inspired evolutionary algorithm to minimum sum-of-squares clustering. In Proceedings of the International Workshop on Rough Sets, Fuzzy Sets, Data Mining, and Granular-Soft Computing; Springer: Berlin/Heidelberg, Germany, 2005; pp. 383–391. [Google Scholar]
- Dey, S.; Bhattacharyya, S.; Maulik, U. Quantum Inspired Automatic Clustering for Multi-level Image Thresholding. In Proceedings of the International Conference on Computational Intelligence and Communication Networks, Bhopal, India, 14–16 November 2014; pp. 247–251. [Google Scholar]
- Dey, S.; Bhattacharyya, S.; Snasel, V.; Dey, A.; Sarkar, S. PSO and DE based novel quantum inspired automatic clustering techniques. In Proceedings of the 3rd International Conference on Research in Computational Intelligence and Communication Networks (ICRCICN), Kolkata, India, 3–5 November 2017; pp. 285–290. [Google Scholar]
- Dey, A.; Dey, S.; Bhattacharyya, S.; Snasel, V.; Hassanien, A.E. Simulated Annealing Based Quantum Inspired Automatic Clustering Technique; Springer: Berlin/Heidelberg, Germany, 2018; pp. 73–81. [Google Scholar]
- Dey, A.; Bhattacharyya, S.; Dey, S.; Snasel, V.; Hassanien, A.E. 7. Quantum inspired simulated annealing technique for automatic clustering. In Intelligent Multimedia Data Analysis; Bhattacharyya, S., Pan, I., Das, A., Gupta, S., Eds.; De Gruyter: Berlin, Germany, 2019; pp. 145–166. [Google Scholar]
- Dey, S.; Bhattacharyya, S.; Maulik, U. Quantum-inspired automatic clustering technique using ant colony optimization algorithm. In Quantum-Inspired Intelligent Systems for Multimedia Data Analysis; IGI Global: Hershey, PA, USA, 2018; pp. 27–54. [Google Scholar]
- Flury, B. A First Course in Multivariate Statistics; Springer: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
- Bhattacharyya, S.; Snasel, V.; Dey, A.; Dey, S.; Konar, D. Quantum Spider Monkey Optimization (QSMO) Algorithm for Automatic Gray-Scale Image Clustering. In Proceedings of the International Conference on Advances in Computing, Communications and Informatics (ICACCI), Bangalore, India, 19–22 September 2018; pp. 1869–1874. [Google Scholar]
- Dey, A.; Dey, S.; Bhattacharyya, S.; Platos, J.; Snasel, V. Novel quantum inspired approaches for automatic clustering of gray level images using Particle Swarm Optimization, Spider Monkey Optimization and Ageist Spider Monkey Optimization algorithms. Appl. Soft Comput. 2020, 88, 106040. [Google Scholar] [CrossRef]
- Dey, A.; Bhattacharyya, S.; Dey, S.; Platos, J.; Snasel, V. Quantum-Inspired Bat Optimization Algorithm for Automatic Clustering of Grayscale Images. In Recent Trends in Signal and Image Processing; Springer: Singapore, 2019; pp. 89–101. [Google Scholar]
- Dey, A.; Bhattacharyya, S.; Dey, S.; Platos, J. 5. Quantum Inspired Automatic Clustering Algorithms: A Comparative Study of Genetic Algorithm and Bat Algorithm. In Quantum Machine Learning; Bhattacharyya, S., Pan, I., Mani, A., De, S., Behrman, E., Chakraborti, S., Eds.; De Gruyter: Berlin, Germany, 2020; pp. 89–114. [Google Scholar]
- Dey, A.; Dey, S.; Bhattacharyya, S.; Platos, J.; Snasel, V. Quantum Inspired Meta-Heuristic Approaches for Automatic Clustering of Colour Images. Int. J. Intell. Syst. 2021, 36, 4852–4901. [Google Scholar] [CrossRef]
- Askarzadeh, A. A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Comput. Struct. 2016, 169, 1–12. [Google Scholar] [CrossRef]
- Shekhawat, S.; Saxena, A. Development and applications of an intelligent crow search algorithm based on opposition based learning. ISA Trans 2020, 99, 210–230. [Google Scholar] [CrossRef] [PubMed]
- Dutta, T.; Bhattacharyya, S.; Mukhopadhyay, S. Automatic Clustering of Hyperspectral Images Using Qutrit Exponential Decomposition Particle Swarm Optimization. In Proceedings of the International India Geoscience and Remote Sensing Symposium (InGARSS), Ahmedabad, India, 6–10 December 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 289–292. [Google Scholar]
- Xie, F.; Li, F.; Lei, C.; Yang, J.; Zhang, Y. Unsupervised band selection based on artificial bee colony algorithm for hyperspectral image classification. Appl. Soft Comput. 2019, 75, 428–440. [Google Scholar] [CrossRef]
- Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; pp. 1942–1948. [Google Scholar]
- Hore, A.; Ziou, D. Image quality metrics: PSNR vs. SSIM. In Proceedings of the 20th International Conference on Pattern Recognition, Istanbul, Turkey, 23–26 August 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 2366–2369. [Google Scholar]
- Fletcher, S.; Islam, M.Z. Comparing sets of patterns with the Jaccard index. Australas. J. Inf. Syst. 2018, 22, 1–17. [Google Scholar] [CrossRef]
- Dice, L.R. Measures of the amount of ecologic association between species. Ecology 1945, 26, 297–302. [Google Scholar] [CrossRef]
- Hyperspectral Remote Sensing Scenes—Grupo de Inteligencia Computacional (GIC). Available online: http://www.ehu.eus/ccwintco/index.php?title=Hyperspectral_Remote_Sensing_Scenes (accessed on 7 October 2019).
- Dey, A.; Bhattacharyya, S.; Dey, S.; Platos, J.; Snasel, V. Quantum Inspired Manta Ray Foraging. Optimization Algorithm for Automatic Clustering of Colour Images. In Quantum Machine Intelligence; CRC Press: Boca Raton, FL, USA, 2022; pp. 95–116. [Google Scholar]
- Zhao, W.; Zhang, z.; Wang, L. Manta ray foraging optimization: An effective bio-inspired optimizer for engineering applications. Eng. Appl. Artif. Intell. 2020, 87, 103300. [Google Scholar] [CrossRef]
- Dey, A.; Bhattacharyya, S.; Dey, S.; Platos, J.; Snasel, V. Automatic clustering of colour images using quantum inspired meta-heuristic algorithms. Appl. Intell. 2022, 1–23. [Google Scholar] [CrossRef]
- Xu, Y.; Fan, P.; Yuan, L. A Simple and Efficient Artificial Bee Colony Algorithm. Math. Probl. Eng. 2013, 2013, 9. [Google Scholar] [CrossRef]
- Biedrzycki, R. On equivalence of algorithm’s implementations: The CMA-ES algorithm and its five implementations. In Proceedings of the Genetic and Evolutionary Computation Conference Companion, Prague, Czech Republic, 13–17 July 2019; pp. 247–248. [Google Scholar]
- Berkeley Images. Available online: www2.eecs.berkeley.edu/Research/Projects/CS/vision/bsds/BSDS300/html/dataset/images.html (accessed on 1 May 2017).
- Real Life Images. Available online: www.hlevkin.com/06testimages.htm (accessed on 1 February 2018).
- Li, Y.; Feng, S.; Zhang, X.; Jiao, L. SAR image segmentation based on quantum-inspired multiobjective evolutionary clustering algorithm. Inf. Process. Lett. 2014, 114, 287–293. [Google Scholar] [CrossRef]
- Li, G.; Wang, W.; Zhang, W.; You, W.; Wu, F.; Tu, H. Handling multimodal multi-objective problems through self-organizing quantum-inspired particle swarm optimization. Inf. Sci. 2021, 577, 510–540. [Google Scholar] [CrossRef]
- Dey, S.; Bhattacharyya, S.; Maulik, U. Chapter 6—Quantum-inspired multi-objective simulated annealing for bilevel image thresholding. In Quantum Inspired Computational Intelligence; Bhattacharyya, S., Maulik, U., Dutta, P., Eds.; Morgan Kaufmann: Boston, MA, USA, 2017; pp. 207–232. [Google Scholar]
- Yan, L.; Chen, H.; Ji, W.; Lu, Y.; Li, J. Optimal VSM Model and Multi-Object Quantum-Inspired Genetic Algorithm for Web Information Retrieval. In Proceedings of the International Symposium on Computer Network and Multimedia Technology, Wuhan, China, 18–20 December 2009; pp. 1–4. [Google Scholar]
- Kumar, D.; Chahar, V.; Kumari, R. Automatic Clustering using Quantum based Multi-objective Emperor Penguin Optimizer and its Applications to Image Segmentation. Mod. Phys. Lett. A 2019, 34, 1950193. [Google Scholar] [CrossRef]
- Liu, R.; Wang, X.; Yangyang, L.; Zhang, X. Multi-objective Invasive Weed Optimization algorithm for clustering. In Proceedings of the Congress on Evolutionary Computation, Brisbane, QLD, Australia, 10–15 June 2012; pp. 1–8. [Google Scholar]
- Dey, A.; Bhattacharyya, S.; Dey, S.; Platos, J.; Snasel, V. Quantum-Inspired Multi-Objective NSGA-II Algorithm for Automatic Clustering of Gray Scale Images (QIMONSGA-II). In Quantum Machine Intelligence; CRC Press: Boca Raton, FL, USA, 2022; pp. 207–230. [Google Scholar]
- Srinivas, N.; Deb, K. Muiltiobjective Optimization Using Non dominated Sorting in Genetic Algorithms. Evol. Comput. 1994, 2, 221–248. [Google Scholar] [CrossRef]
- Mukhopadhyay, A.; Bandyopadhyay, S.; Maulik, U. Clustering using Multi-objective Genetic Algorithm and its Application to Image Segmentation. In Proceedings of the 2006 IEEE International Conference on Systems, Man and Cybernetics, Taipei, Taiwan, 8–11 October 2006; IEEE: Piscataway, NJ, USA, 2006; Volume 3, pp. 2678–2683. [Google Scholar]
Ref. | Aim of the Work | Mechanism Used | Data Specifications | Merits and Demerits |
---|---|---|---|---|
[126] | This paper presents an efficient automatic clustering technique referred to as the Similarity Index Fuzzy C-Means Clustering to generate a more optimal GRNN (Husain et al., 2004). | Used techniques include the conventional fuzzy C-means clustering algorithm and a similarity indexing technique. | Two benchmark problems, viz., the gas furnace data of Box and Jenkins and the MackeyGlass model for producing WBC, served as a simulation for the proposed approach. | Merits: 1. It is suitable for online dynamic GRNN-based modelling. Demerits: 1. Only two dynamic time series are considered for simulating the work. |
[129] | This paper presents an automatic clustering and boundary detection algorithm referred to as ADACLUS (Nosovskiy et al., 2008). | It is based on a local-adaptive influence function which is not predefined as in DBSCAN and DENCLUE. | It has been applied to various two-dimensional datasets with arbitrary shapes and densities. Two shape features (circular/non-circular and concave/non-concave) of the clusters in the datasets were considered. | Merits: 1. It is suitable for large-scale real-time applications. 2. It can efficiently identify the clusters of datasets with arbitrary shapes and non-uniform densities on the run. 3. It can also easily detect the boundary of the clusters. 4. It is more robust to noise resistance than other competitive algorithms. Demerits: 1. It is not designed for model-based clustering and hence cannot distinguish overlapping clusters. |
[130] | This paper presents an automatic soft classifier to classify uncertain data in synthetic datasets (Li et al., 2011). | The proposed classifier includes fuzzy C-means with a fuzzy distance function and an evaluation function. | It has been applied to two synthetic datasets (Synthetic I, Synthetic II) and the sensor database (sensor). | Merits: 1. The proposed classifier performed effectively on different types of data containing uncertain data. 2. The required running time was found to be less than others. Demerits: 1. No significant difference was achieved in the rate of error. |
[131] | This paper presents a clustering method referred to as MDS (Zhang et al., 2015). | It uses the DPSC technique to automatically produce a summary from a given set of sentences by selecting the appropriate sentences from those documents. | The DUC2004 dataset has been used to conduct the experiments. | Merits: 1. It verifies that DPSC can effectively handle MDS. Demerits: 1. Sentence similarity matrix is required to improve the query-based multi-document summarisation method preparation. |
Ref. | Aim of the Work | Mechanism Used | Data Specifications | Merits and Demerits |
---|---|---|---|---|
[143] | This study introduces the automatic clustering algorithm, STClu, which is based on external statistical testing on density metrics (Wang et al., 2016). | It introduces a local density evaluation metric K-density which offers more robustness than detecting the clustering centre from RLClu. | It has been applied to five groups of benchmark clustering datasets. | Merits: 1. STClu is efficient in identifying the clustering centres in most cases. Demerits: 1. No significant improvement in terms of the time complexity is found in STClu when compared with RLClu. |
[145] | This paper presents an automatic clustering algorithm referred to as CRS (Chen et al., 2018). | It uses a region segmentation mechanism, which is unaffected by the parameter settings, the shape of the cluster and the density of the data. | For experimental purposes, six groups of synthetic datasets and seven real-world datasets were used. | Merits: 1. It can efficiently and automatically identify the optimal number of clusters and the clusters of the dataset. Demerits: 1. Before the execution, it requires an approximate value of the number of nearest neighbour K. |
[150] | This study introduces a fuzzy clustering algorithm referred to as AP-FAFCM for automatic clustering (Yangyang et al., 2019). | In this algorithm, the number of clusters is initially estimated using the AP clustering algorithm. After that, the FCM algorithm receives the obtained number of clusters, and then the FA is used to optimise the clustering centre. | It has been applied to three randomly selected images. | Merits: 1. It not only resolves the issue of automatic segmentation but also significantly raises the level of segmentation quality while maintaining the effect of segmentation. Demerits: 1. Very few datasets have been chosen. 2. The effect of segmentation has not been considered for all the datasets. |
Ref. | Aim of the Work | Mechanism Used | Data Specifications | Merits and Demerits |
---|---|---|---|---|
[152] | This paper presents a graph-theoretic approach for ASLoC (Studiawan et al., 2020). | This method incorporates three steps of operation to maximise the number of percolations and intensity threshold for clique percolation, starting with a graph-theoretic approach, including intensity threshold and ultimately using a simulated annealing process. | Five publicly available security log datasets were used for experimental purposes. | Merits: 1. It automatically determines the used parameters without the intervention of user input. 2. It provides more significant results than comparable algorithms in all scenarios. Demerits: 1. The deployment of event log clustering is not yet achieved for anomaly detection. 2. A multi-objective framework still needs to be addressed. |
[158] | This study describes an effective way to extract brain tumours by employing the CLA clustering technique (Sahoo et al., 2021). | All image slices are subjected to skull stripping using a morphological operation and a histogram-based methodology. | Three different types of tumours, viz., meningiomas, gliomas and pituitary tumours, have been taken into account from a publicly available brain tumour dataset. | Merits: 1. It shows accuracy and outperforms other comparable algorithms in finding meningioma tumours near the skull regions compared to other types of tumours. Demerits: 1. Among three types of tumour, viz., meningiomas, gliomas and pituitary tumours, it only performs better for the meningioma tumours. |
Ref. | Aim of the Work | Mechanism Used | Data Specifications | Merits and Demerits |
---|---|---|---|---|
[97] | This study introduces the CLUSTERING algorithm to automatically identify the exact number of clusters and to also simultaneously classify the objects into these clusters (Tseng et al., 2001). | This work proposes a genetic clustering algorithm in which, initially, the single-linkage algorithm is employed to decrease the size of the large dataset. After that, a heuristic strategy is used to select the appropriate clustering. | Spectral feature vectors derived from the TIMIT database were used for this study. | Merits: 1. Almost all types of data can be effectively clustered by CLUSTERING. Demerits: 1. A good clustering result is only achieved for some specified settings of parameters. |
[177] | This study presents two new PSO-based approaches for clustering data vectors (Merwe et al., 2003). | A standard gbest PSO and a hybrid approach were used in which the swarm individuals are seeded by the result of the K-means algorithm. | Two artificial classification problems and four publicly available datasets, viz., Iris Plants Database, Wine, Breast Cancer and Automotives were used for experimental purposes. | Merits: 1. It is successful with respect to faster convergence to reduce quantisation errors and provides higher inter-cluster distances and lower intra-cluster distances. Demerits: 1. For early convergence, it gets stuck in local optima. |
[178] | This study demonstrates a GCA to automatically recognise the correct number of clusters using a two-stage split-and-merge strategy (Garai et al., 2004). | GCA uses two algorithms, viz., the CDA and the HCMA. | One real-life and nine artificial datasets were used for this study. | Merits: 1. It provides simple codes for implementation. 2. It shows quite encouraging results compared with other performing algorithms. Demerits: 1. Flexibility is decreased due to the fixed values of crossover and mutation operators. |
[183] | This study presents an improved differential evolution algorithm for automatically clustering real-life datasets (Das et al., 2008). | An improved version of the DE algorithm was implemented in this study. | The experimental datasets include the Iris Plants Database, Glass, Wisconsin Breast Cancer Dataset, Wine and Vowel Dataset. This algorithm was also used to segment five 256 × 256 grey-scale images. | Merits: 1. It is easy to implement. 2. It proves to be superior to other competitive algorithms. Demerits: 1. It may not outperform DCPSO or GCUK for every dataset. |
[48] | This paper presents an automatic clustering technique referred to as GWO for satellite image segmentation (Kapoor et al., 2018). | The proposed work is based on the Grey Wolf Optimisation algorithm. | The dataset comprises two satellite images of New Delhi. | Merits: 1. It has good convergence speed. 2. It can avoid local optima. Demerits: 1. It is unable to explore the entire search space. |
Ref. | Aim of the Work | Mechanism Used | Data Specifications | Merits and Demerits |
---|---|---|---|---|
[189] | This paper provides an automatic clustering algorithm referred to as Anthill, which is influenced by the collaborative intelligent behaviour of ants (Pacheco et al., 2018). | It is based on the Ant Colony Optimisation (ACO) algorithm. In Anthill, the solution is represented by a set of solutions produced by the entire colony, formed from the partial digraphs and obtained by acquiring the highly connected components. | The experimental datasets include the Wine Dataset, Iris Dataset, Breast Cancer Wisconsin (Original) Dataset, Pima Indians Diabetes Dataset and Haberman’s Survival Dataset from the UCI machine learning database. | Merits: 1. It uses an iterated racing parameter calibrator for automatically configuring the algorithm. Demerits: 1. It has low convergence speed. |
[190] | This study presents an automatic clustering algorithm known as ASOSCA to identify the exact number of centroids along with their positions on the run (Elaziz et al., 2019). | It uses the hybridisation of the ASO and the SCA. | Sixteen clustering datasets were used for all the experiments. | Merits: 1. It can achieve global optima. Demerits: 1. It is not suitable for all types of datasets. |
[44] | This study introduces a hybrid metaheuristic algorithm, FAPSO algorithm, for the automatic clustering of real-life datasets (Agbaje et al., 2019). | FAPSO incorporates the basic features of the FA algorithm and PSO algorithms. | Twelve benchmark datasets have been taken from the UCI machine learning data repository of the University of California for this study. | Merits: 1. It can reach the global optima. 2. FAPSO performs significantly better than other state-of-the-art clustering algorithms. Demerits: 1. Convergence speed is low. |
[193] | This paper proposes an automatic clustering algorithm referred to as AC-MeanABC, which incorporates an improved exploration process of the ABC algorithm (Alrosan et al., 2021). | The unsupervised data clustering method, AC-MeanABC, uses the MeanABC capability of balancing between exploration and exploitation and its capacity to explore the positive and negative directions in search space to determine the optimal result. | Eleven benchmark real-life datasets and natural images from the Berkeley1 segmentation dataset were considered for the experiments. | Merits: 1. It can explore and exploit the search space in both positive and negative directions. Demerits: 1. It has a lower convergence speed. |
Ref. | Aim of the Work | Mechanism Used | Data Specifications | Merits and Demerits |
---|---|---|---|---|
[196] | This study presents a comparison between four multi-objective variants of DE for the automatic clustering of artificial and real-life datasets (Suresh et al., 2009). | The comparable algorithms include the MODE, the PDE and DE for Multi-Objective Optimisation (DEMO) and the NSDE. | In this study, six artificial and four real-life datasets were considered. | Merits: 1. It can handle multi-objective optimisation problems. Demerits: 1. Time and space complexity are higher. |
[205] | This paper presents the hybrid multi-objective optimisation algorithm, GADE, for solving the automatic fuzzy clustering problem (Kundu et al., 2009). | It is a hybridisation of GA and DE algorithms. | Six artificial and four real-life datasets have been used. | Merits: 1. It is based on the multi-objective framework. Demerits: 1. Time and space complexity are higher. |
[209] | This work presents an automatic clustering algorithm referred to as GenClustMOO for the automatic clustering of artificial and real-life datasets (Saha et al., 2013). | GenClustMOO uses a simulated annealing-based multi-objective framework, AMOSA, to identify the optimal number of clusters and the appropriate partitioning from datasets with various cluster structures. | All the experiments were conducted on nineteen artificial and seven real-life datasets. | Merits: 1. It is based on a multi-objective framework and generates a set of Pareto optimal fronts. Demerits: 1. Convergence rate is lower. |
[214] | This study introduces an automatic clustering algorithm referred to as MOPSOSA for the automatic identification of the exact number of clusters in a dataset (Abubaker et al., 2015). | It combines the features of MPSO and the MOSA. | During the experiments, with fourteen artificial and five real-life datasets being considered. | Merits: 1. It can solve the multi-objective optimisation problem. 2. MOPSOSA outperforms all its competitors. Demerits: 1. Suffers from a low convergence rate. |
[218] | This paper presents two multi-objective automatic clustering algorithms, viz., FRC-NSGA and IFRC-NSGA (Paul et al., 2018). | The proposed FRC-NSGA algorithm incorporates the features of the well-known FRC and NSGA-II algorithms. The proposed algorithm IFRC-NSGA is designed to improve the performance of FRC-NSGA. | Several Gene Expression data and Non-Gene Expression data were considered to conduct the experiments. | Merits: 1. Multi-objective framework is used to handle real-life problems. Demerits: 1. Space requirement is higher. |
Ref. | Aim of the Work | Mechanism Used | Data Specifications | Merits and Demerits |
---|---|---|---|---|
[226] | This paper introduces a multi-objective-based automatic clustering algorithm referred to as MOGA-KP (Dutta et al., 2019). | This work combines the features of MOGA and K-Prototypes (KPs) to automatically identify the exact number of clusters from some real-life benchmark datasets with multiple numeric or categorical features. | Twenty-five benchmark datasets from the UCI machine learning repository were used for experimental purposes. | Merits: 1. It can provide a set of Pareto optimal solutions. 2. It can handle different features, viz., continuous and categorical and missing feature values. 3. It can also deal with the missing features. Demerits: 1. It does not give priority to the features in clustering. |
[237] | This study demonstrates a multi-objective-based automatic clustering algorithm referred to as NSGAII-GR for the automatic clustering of different types of real-life and artificial datasets (Qu et al., 2021). | It is based on the principle of the well-known Non-Dominated Sorting Genetic Algorithm-II (NSGAII) with a gene rearrangement technique. | The experiments have been conducted on five two-dimensional artificial datasets, five real-world datasets and twenty ten-dimensional datasets with various clusters. | Merits: 1. The remarkable advantage offered by the algorithm is that it prevents the time complexity from increasing while conducting the gene rearrangement process and inter-cluster merging process. Demerits: 1. It cannot properly handle uneven overlapping datasets. |
Ref. | Aim of the Work | Mechanism Used | Data Specifications | Merits and Demerits |
---|---|---|---|---|
[249] | This study introduces an automatic clustering algorithm referred to as QIAGA for multi-level image thresholding which is capable of automatically identifying the optimal number of clusters from an image dataset (Dey et al., 2014). | This work incorporates the quantum computing mechanism with the well-known GA. | The test images include four real-life grey-scale images. | Merits: 1. It has a good convergence rate. 2. Experimental results statistically prove the efficiency and effectiveness of the proposed algorithm. Demerits: 1. Time complexity is higher. |
[252] | This study presents a quantum-inspired automatic clustering algorithm for grey-scale images (Dey et al., 2018). | It incorporates the quantum computing principles with the single solution-based Simulated Annealing algorithm. | For experimental purposes, four real-life grey-scale images and four Berkeley images of different dimensions were used. | Merits: 1. It has a higher convergence rate. 2. The supremacy of the proposed algorithm over its classical equivalent has been established with respect to some metrics. Demerits: 1. It suffers from premature convergence. |
[253] | This study introduces an automatic clustering algorithm referred to as the Quantum-Inspired Automatic Clustering Technique Using the Ant Colony Optimisation algorithm for the automatic identification of the optimal number of clusters in a grey-scale image (Dey et al., 2018). | The Ant Colony Optimisation algorithm was chosen as the underlying algorithm to be incorporated in a quantum computing framework. The Xie–Beni cluster validity measure was used as the objective function for this study. | All the experiments were conducted on four real-life grey-scale images. | Merits: 1. The proposed technique is found to be superior to its classical counterpart with regard to several factors such as accuracy, stability, computational speed and standard errors. Demerits: 1. It does not always attain global optimum. |
[258] | This study introduces two quantum-inspired metaheuristic algorithms, viz., QIBA and QIGA, for the automatic clustering of grey-scale images (Dey et al., 2019). | Quantum computing principles are incorporated in this work with the classical Bat optimisation algorithm and GA. -index was used to validate the clustering process. | All the experiments were performed on four Berkeley images and two real-life grey-scale images. | Merits: 1. QIBA can efficiently balance exploration and exploitation in the search space. 2. Computational results indicate that QIBA outperforms others. Demerits: 1. The time complexity is higher. |
[256] | This study presents three automatic clustering algorithms for grey-scale images (Dey et al., 2020). | This work includes the QIPSO, Quantum-Inspired Spider Monkey Optimisation algorithm (QISMO) and QIASMO algorithm (QIASMO). | All the experiments were conducted on five Berkeley images, five real-life images and four mathematical benchmark functions. | Merits: 1. QIASMO has better convergence speed than the others.
2. The quantum-inspired algorithms outperform state-of-the-art algorithms in all cases. 3. QIASMO has been regarded as being the most effective algorithm out of these three. Demerits:1. QIPSO may get stuck in the local optima. |
Ref. | Aim of the Work | Mechanism Used | Data Specifications | Merits and Demerits |
---|---|---|---|---|
[259] | This study presents two quantum-inspired metaheuristic algorithms referred to as the Quantum-Inspired Crow Search Optimisation Algorithm (QICSOA) and Quantum-Inspired Intelligent Crow Search Optimisation Algorithm (QIICSOA) for automatically clustering colour images (Dey et al., 2021). | For this study, the underlying metaheuristic algorithms include the CSOA and ICSOA. This work uses four cluster validity indices, viz., -index, I-index, Silhouette (SIL) index and -Measure (CSM). | Fifteen Berkeley colour images and five publicly available real-life colour images with varied dimensions were used for experimental purposes. | Merits: 1. All of them have good convergence speed. 2. QIICSOA is found to be the most promising algorithm in terms of performance. Demerits: 1. Time complexity is higher. |
[262] | This study introduces an automatic clustering algorithm for hyper-spectral images, which is referred to as AC-QuPSO (Dutta et al., 2021). | A new concept called qutrit was introduced to reduce space and time complexity. | All the experiments were performed on the Salinas Dataset. | Merits: 1. It has good convergence speed. 2. The supremacy of AC-QuPSO over its conventional alternatives was realised by performing the unpaired t-test. Demerits: 1. It has difficulties in the placement of the controller. |
[269] | This study presents the QIMRFO algorithm for clustering of colour images on the run (Dey et al., 2022). | Quantum computing framework was combined with the classical version of the MRFO algorithm. | Four Berkeley colour images and four publicly available real-life colour images were used for experimental purposes. | Merits: 1. It has a good convergence rate. 2. It is capable of efficiently exploring and exploiting the search space. 3. Experimental results indicate that the QIMRFO quantitatively and qualitatively outperforms other comparative algorithms. Demerits: 1. In QIMRFO, the space requirement is high. |
[271] | This paper presents two algorithms, viz., QIPSO algorithm and QIEPSO algorithm, for the automatic clustering of colour images (Dey et al., 2022). | Quasi quantum operations were performed to achieve the goal. | All the experiments were conducted on ten Berkeley images and ten real-life colour images. | Merits: 1. QIEPSO has good convergence speed. 2. The QIEPSO algorithm has enough potential to be a viable candidate for the automatic clustering of colour images. Demerits: 1. QIPSO does not always find an optimal solution. |
Ref. | Aim of the Work | Mechanism Used | Data Specifications | Merits and Demerits |
---|---|---|---|---|
[280] | This paper presents the Automatic Clustering using Multi-Objective Emperor Penguin Optimiser (ACMOEPO) algorithm to automatically determine the optimal number of clusters from real-life datasets (Kumar et al., 2019). | To maintain the balance between inter-cluster and intra-cluster distances, a unique fitness function is suggested, which comprises multiple cluster validity indices. | The test dataset includes nine real-life benchmark datasets. | Merits: 1. It can handle very large datasets. 2. This type of algorithm is helpful in data mining applications. 3. The superiority of ACMOEPO was proved by performing the unpaired t-test among all the participating algorithms. Demerits: 1. Space requirement is higher. |
[282] | This paper introduces the QIMONSGA-II for the automatic clustering of grey-scale images (Dey et al., 2022). | QIMONSGA-II performs quasi-quantum computation and simultaneously optimises two objectives, viz., the -Measure (CSM) and -index. | All the experiments were conducted over six Berkeley grey-scale images of varied dimensions. | Merits: 1. It can identify optimal results in a multi-objective environment. 2. The superiority of QIMONSGA-II was proven by utilising the Minkowski score and Demerits: 1. Space requirement is higher. |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Dey, A.; Bhattacharyya, S.; Dey, S.; Konar, D.; Platos, J.; Snasel, V.; Mrsic, L.; Pal, P. A Review of Quantum-Inspired Metaheuristic Algorithms for Automatic Clustering. Mathematics 2023, 11, 2018. https://doi.org/10.3390/math11092018
Dey A, Bhattacharyya S, Dey S, Konar D, Platos J, Snasel V, Mrsic L, Pal P. A Review of Quantum-Inspired Metaheuristic Algorithms for Automatic Clustering. Mathematics. 2023; 11(9):2018. https://doi.org/10.3390/math11092018
Chicago/Turabian StyleDey, Alokananda, Siddhartha Bhattacharyya, Sandip Dey, Debanjan Konar, Jan Platos, Vaclav Snasel, Leo Mrsic, and Pankaj Pal. 2023. "A Review of Quantum-Inspired Metaheuristic Algorithms for Automatic Clustering" Mathematics 11, no. 9: 2018. https://doi.org/10.3390/math11092018
APA StyleDey, A., Bhattacharyya, S., Dey, S., Konar, D., Platos, J., Snasel, V., Mrsic, L., & Pal, P. (2023). A Review of Quantum-Inspired Metaheuristic Algorithms for Automatic Clustering. Mathematics, 11(9), 2018. https://doi.org/10.3390/math11092018