An Improved Soft Island Model of the Fish School Search Algorithm with Exponential Step Decay Using Cluster-Based Population Initialization
Abstract
1. Introduction
2. Related Work
3. Materials and Methods
3.1. The Tent-Map-Based Fish School Search Algorithm with Exponential Step Decay
3.1.1. Individual Movement Stage
3.1.2. Feeding Stage
3.1.3. Collective-Instinctive Movement Stage
3.1.4. Collective-Volitive Movement Stage
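The four stages named above follow the canonical Fish School Search scheme. In generic notation (which may differ in details from the paper's exact formulas), they can be summarized as:

```latex
% Individual movement: a random candidate step, accepted only if it improves f
x_i' = x_i + \mathrm{step}_{ind}(t)\, u_i, \qquad u_i \sim U(-1,1)^D
% Feeding: weights grow with the normalized fitness improvement, clipped to (0, w_{max}]
w_i(t+1) = w_i(t) + \frac{\Delta f_i}{\max_j |\Delta f_j|}
% Collective-instinctive movement: drift along the successful individual steps
x_i(t+1) = x_i(t) + \frac{\sum_j \Delta x_j\, \Delta f_j}{\sum_j \Delta f_j}
% Collective-volitive movement: contraction toward (expansion from) the barycenter B
B(t) = \frac{\sum_i w_i(t)\, x_i(t)}{\sum_i w_i(t)}, \qquad
x_i(t+1) = x_i(t) \mp \mathrm{step}_{vol}(t)\, r\, \frac{x_i(t) - B(t)}{\lVert x_i(t) - B(t) \rVert}
% Exponential step decay over iterations t = 0..T (decay rate \lambda assumed)
\mathrm{step}(t) = \mathrm{step}(0)\, e^{-\lambda t / T}
```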
Algorithm 1: ETFSS pseudocode.
Input: the objective function for minimization; the total number of agents (population size); the total number of iterations of the algorithm; the maximum weight of an agent; the initial values for the maximum step lengths.
1. for each agent in the population do:
2–4. initialize the agent's position, weight, and objective function value
5. end for
6. for each iteration of the algorithm do:
7–24. perform the individual movement, feeding, collective-instinctive movement, and collective-volitive movement stages (Sections 3.1.1–3.1.4), updating the maximum step lengths by exponential decay
25. end for
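The loop above can be sketched as a compact, runnable implementation of the canonical FSS stages with exponential step decay. The decay rate, the uniform noise used here in place of the paper's tent-map-based sampling, and all default parameter values are illustrative assumptions, not the paper's exact settings:

```python
import numpy as np

def etfss_minimize(f, dim, bounds, n_agents=30, n_iter=200,
                   w_max=50.0, step_ind0=0.1, step_vol0=0.1, seed=0):
    """Sketch of Fish School Search with exponential step decay.

    f: objective to minimize; bounds: (low, high) shared by all dimensions.
    Uniform noise replaces the paper's tent-map sampling in this sketch.
    """
    rng = np.random.default_rng(seed)
    low, high = bounds
    span = high - low
    x = rng.uniform(low, high, size=(n_agents, dim))
    w = np.full(n_agents, w_max / 2.0)              # initial agent weights
    fx = np.apply_along_axis(f, 1, x)

    for t in range(n_iter):
        decay = np.exp(-5.0 * t / n_iter)           # decay rate is an assumption
        step_ind = step_ind0 * span * decay
        step_vol = step_vol0 * span * decay

        # 1) individual movement: random step, kept only if it improves f
        cand = np.clip(x + step_ind * rng.uniform(-1, 1, x.shape), low, high)
        fc = np.apply_along_axis(f, 1, cand)
        improved = fc < fx
        dx = np.where(improved[:, None], cand - x, 0.0)
        df = np.where(improved, fx - fc, 0.0)       # positive improvements
        x = np.where(improved[:, None], cand, x)
        fx = np.where(improved, fc, fx)

        # 2) feeding: weights grow with the normalized improvement
        total_w_before = w.sum()
        if df.max() > 0:
            w = np.clip(w + df / df.max(), 1e-6, w_max)

        # 3) collective-instinctive movement: drift along successful steps
        if df.sum() > 0:
            x = np.clip(x + (dx * df[:, None]).sum(0) / df.sum(), low, high)

        # 4) collective-volitive movement: contract toward the barycenter
        #    if the school gained weight, expand otherwise
        bary = (w[:, None] * x).sum(0) / w.sum()
        direction = x - bary
        norm = np.linalg.norm(direction, axis=1, keepdims=True) + 1e-12
        sign = -1.0 if w.sum() > total_w_before else 1.0
        x = np.clip(x + sign * step_vol * rng.uniform(0, 1, (n_agents, 1))
                    * direction / norm, low, high)
        fx = np.apply_along_axis(f, 1, x)

    best = fx.argmin()
    return x[best], fx[best]
```

For instance, `etfss_minimize(lambda v: float((v ** 2).sum()), 5, (-100.0, 100.0))` drives the sphere function toward its minimum at the origin.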
3.2. The Modified Soft Island Model
Algorithm 2: SIM-ETFSS pseudocode.
Input: the objective function for minimization; the total number of islands; the initial number of agents belonging to one island (population size); the total number of iterations of the algorithm; the maximum weight of an agent; the initial values for the maximum step lengths; the probability hyperparameter.
1. for each island do:
2. initialize the island's population
3. end for
4. for each iteration of the algorithm do:
5–16. perform one ETFSS iteration (Algorithm 1) on each island and exchange agents between the islands according to the probability hyperparameter
17. end for
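The soft island model replaces fixed, scheduled migration with probabilistic information exchange between subpopulations. As a rough illustration, the reassignment step might look as follows; this is a simplified reading of the soft island model, not necessarily the paper's exact operator:

```python
import numpy as np

def soft_migration(island_ids, n_islands, p, rng):
    """Sketch of a soft-island reassignment step.

    With probability p each agent leaves its island and joins a randomly
    chosen one, so island sizes vary over time instead of staying fixed.
    (Simplified interpretation; the paper's migration rule may differ.)
    """
    move = rng.uniform(size=island_ids.shape) < p
    new_ids = rng.integers(0, n_islands, size=island_ids.shape)
    return np.where(move, new_ids, island_ids)
```

For example, with `p = 0.2` roughly one fifth of the agents change islands per call, while `p = 0` leaves the assignment untouched.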
3.3. Cluster-Based Population Initialization
Algorithm 3: Cluster-based initialization of island populations.
Input: the total number of islands; the initial number of agents belonging to one island (population size); the dimensionality of the problem to be solved; the search range of the dimensions.
1. for each island do:
2. form the island's cluster of points in the search space
3. end for
4. for each island do:
5–6. place the island's agents at the generated cluster points
7. end for
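The intent of cluster-based initialization is to start each island in its own region of the search space rather than scattering all agents uniformly. A possible sketch follows; the random center placement and the `spread` fraction are illustrative assumptions, not the paper's exact rule:

```python
import numpy as np

def cluster_init(n_islands, n_agents, dim, low, high, spread=0.1, seed=0):
    """Sketch of cluster-based initialization of island populations.

    One random center per island; each island's agents are sampled in a
    neighborhood of that center, so the islands start in distinct regions
    of the search space.
    """
    rng = np.random.default_rng(seed)
    centers = rng.uniform(low, high, size=(n_islands, dim))
    radius = spread * (high - low)                  # cluster half-width
    pops = centers[:, None, :] + rng.uniform(-radius, radius,
                                             size=(n_islands, n_agents, dim))
    return np.clip(pops, low, high)                 # (n_islands, n_agents, dim)
```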
3.4. Testing for Defects
3.4.1. The Center-Bias Operator
3.4.2. The Unevenness Defect
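A common probe for a hidden center-bias operator is to shift a benchmark's optimum away from the center of the search domain and compare results: an unbiased algorithm should perform similarly on the original and shifted versions. A small helper for building such shifted probes is sketched below; this shift-based protocol is a generic diagnostic from the benchmarking literature, not necessarily the paper's exact test:

```python
import numpy as np

def shifted(f, shift):
    """Return a copy of benchmark f whose optimum is moved by `shift`.

    Comparing an optimizer's results on f(x) and f(x - shift) helps expose
    center-bias operators that implicitly favor the domain center.
    """
    shift = np.asarray(shift, dtype=float)
    return lambda x: f(np.asarray(x, dtype=float) - shift)
```

For example, shifting the sphere function by (30, 30) moves its global minimum from the origin to that point while leaving the landscape otherwise unchanged.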
3.5. Benchmark Functions
4. Results
4.1. Algorithm Performance
4.2. Defect Detection
4.2.1. Testing for the Presence of a Center-Bias Operator
4.2.2. Testing for the Presence of an Unevenness Defect
4.3. The Solution of the Real Data Problem Using the SIM-ETFSS Algorithm
Algorithm 4: ELM model estimation metric.
Input: the input weights; the hidden layer biases.
1. for each fold in 10-fold cross-validation do:
2–6. fit the ELM output weights on the training folds and evaluate the trained model on the held-out fold
7. end for
8. Return the mean of the F1-score values obtained by cross-validation.
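The metric can be sketched as follows: the optimizer supplies the ELM's input weights and hidden biases, only the output weights are fitted by least squares (the defining property of an extreme learning machine), and the fitness is the mean F1-score over 10-fold cross-validation. The tanh activation, the 0.5 decision threshold, and the manual fold construction are illustrative assumptions:

```python
import numpy as np

def elm_f1_cv(X, y, W, b, n_folds=10, seed=0):
    """Sketch of the ELM fitness metric: mean F1-score over k-fold CV.

    W (input weights) and b (hidden biases) are the quantities tuned by
    the optimizer; binary labels y are expected in {0, 1}.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, n_folds)
    scores = []
    for k in range(n_folds):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        H_tr = np.tanh(X[train] @ W + b)            # hidden-layer outputs
        beta = np.linalg.pinv(H_tr) @ y[train]      # least-squares output weights
        pred = (np.tanh(X[test] @ W + b) @ beta > 0.5).astype(int)
        tp = np.sum((pred == 1) & (y[test] == 1))
        fp = np.sum((pred == 1) & (y[test] == 0))
        fn = np.sum((pred == 0) & (y[test] == 1))
        f1 = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0
        scores.append(f1)
    return float(np.mean(scores))
```

SIM-ETFSS can then maximize this value (equivalently, minimize its negation) over the flattened vector of input weights and biases.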
5. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Designation | Function Name | Type | Search Range for Each Dimension
---|---|---|---
 | Sphere | U, S | [−100, 100]
 | Schwefel 2.22 | U, N | [−100, 100]
 | Schwefel 1.2 | U, N | [−100, 100]
 | Schwefel 2.21 | U, S | [−100, 100]
 | Rosenbrock | U, N | [−30, 30]
 | Step | U, S | [−100, 100]
 | Quartic with noise | U, S | [−1.28, 1.28]
 | Schwefel 2.26 | M, S | [−500, 500]
 | Rastrigin | M, S | [−5.12, 5.12]
 | Ackley | M, N | [−32, 32]
 | Griewank | M, N | [−600, 600]
 | Drop-Wave | M, N | [−5.12, 5.12]
 | Alpine 1 | M, S | [−10, 10]
 | HappyCat | M, N | [−20, 20]
 | HGBat | M, N | [−15, 15]
 | Discus | U, S | [−100, 100]
 | Bent Cigar | U, S | [−100, 100]
 | Xin-She Yang | M, N | [−6.28, 6.28]
 | Salomon | M, N | [−20, 20]
 | Zakharov | U, N | [−10, 10]

Type abbreviations: U, unimodal; M, multimodal; S, separable; N, non-separable.
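For concreteness, three of the listed benchmarks in their standard formulations (the exact variants used in the paper may additionally include shifts or scaling):

```python
import numpy as np

def sphere(x):
    """Sphere: unimodal, separable; search range [-100, 100]."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2))

def rastrigin(x):
    """Rastrigin: multimodal, separable; search range [-5.12, 5.12]."""
    x = np.asarray(x, dtype=float)
    return float(10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

def ackley(x):
    """Ackley: multimodal; search range [-32, 32]."""
    x = np.asarray(x, dtype=float)
    n = x.size
    return float(-20 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
                 - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20 + np.e)
```

All three attain their global minimum of 0 at the origin, which makes them convenient for the shifted-domain defect tests of Section 3.4.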
Unit | Parameter | Value
---|---|---
CPU | Type | Intel® Core™ i5-7360U
 | Clock rate | 2.3 GHz (2 cores)
 | Architecture | 64-bit
 | Cache | 4 MB shared L3
 | Manufacturer | Intel Corporation, Santa Clara, CA, USA
RAM | Type | 8 GB LPDDR3
 | Clock rate | 2133 MHz
 | Manufacturer | Samsung Electronics Co., Ltd., Seoul, Republic of Korea
Hyperparameter of the SIM-ETFSS Algorithm | Variable Values (For Single Island) | Variable Values (For Multiple Islands) | Total Number of Variations
---|---|---|---
Test function | | |
Initialization method | | |
Total: | | |
Function | ||||
---|---|---|---|---|
Function | N = 1 (R) | N = 5 (C) | N = 5 (R)
---|---|---|---

Function | N = 1 (R) | N = 5 (C) | N = 5 (R)
---|---|---|---

Here N is the number of islands; (R) and (C) denote random and cluster-based initialization, respectively.
Hyperparameter | Value
---|---
 | 3
 | 30
 | 20
 | 0.2
 | 2
 | 0.5
 | 0.25
 | 5000
Initialization method | Cluster-based
Demidova, L.A.; Zhuravlev, V.E. An Improved Soft Island Model of the Fish School Search Algorithm with Exponential Step Decay Using Cluster-Based Population Initialization. Stats 2025, 8, 10. https://doi.org/10.3390/stats8010010