Editorial

Evolutionary Computation: Theories, Techniques, and Applications

by
Vincent A. Cicirello
Computer Science, Stockton University, 101 Vera King Farris Dr, Galloway, NJ 08205, USA
Appl. Sci. 2024, 14(6), 2542; https://doi.org/10.3390/app14062542
Submission received: 12 March 2024 / Accepted: 14 March 2024 / Published: 18 March 2024
(This article belongs to the Special Issue Evolutionary Computation: Theories, Techniques, and Applications)

1. Introduction

Evolutionary computation is now nearly 50 years old, originating with the seminal work of John Holland at the University of Michigan in 1975, which introduced the genetic algorithm [1]. Evolutionary computation [2] encompasses a variety of problem-solving methodologies that take inspiration from natural evolutionary and genetic processes. The most well-known form of evolutionary computation is the genetic algorithm [3,4], which evolves a population of solutions to the problem at hand, each represented as a bit-string—the genotype—with a fitness function measuring the quality of the bit-string within the context of the problem (i.e., mapping a genotype to a phenotype). Evolutionary operators, such as mutation, crossover, and selection, control the simulated evolution over several generations.
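To make the basic loop concrete, the following is a minimal, illustrative sketch of a bit-string genetic algorithm in Python, with tournament selection, single-point crossover, and bit-flip mutation. The OneMax fitness function and all parameter values are placeholder choices for illustration only, not taken from any of the works cited here.

```python
import random

def one_max(bits):
    """Toy fitness function: the count of 1-bits (OneMax)."""
    return sum(bits)

def tournament(pop, fitness, k=3):
    """Return a copy of the best of k randomly sampled individuals."""
    return max(random.sample(pop, k), key=fitness)[:]

def single_point_crossover(a, b):
    """Exchange suffixes of two parents at a random cut point."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def bit_flip_mutation(bits, rate):
    """Flip each bit independently with probability rate."""
    return [1 - x if random.random() < rate else x for x in bits]

def genetic_algorithm(n_bits=50, pop_size=100, generations=200,
                      crossover_rate=0.9, mutation_rate=0.02, fitness=one_max):
    """Generational GA over bit-strings; returns the best individual found."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        next_pop = []
        while len(next_pop) < pop_size:
            p1, p2 = tournament(pop, fitness), tournament(pop, fitness)
            if random.random() < crossover_rate:
                p1, p2 = single_point_crossover(p1, p2)
            next_pop.append(bit_flip_mutation(p1, mutation_rate))
            if len(next_pop) < pop_size:
                next_pop.append(bit_flip_mutation(p2, mutation_rate))
        pop = next_pop
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = genetic_algorithm()
    print(one_max(best))
```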
There are now many forms of evolutionary computation (a few of which are illustrated in Figure 1) that have developed over the years, including genetic programming [5], evolution strategies [6], differential evolution [7,8], evolutionary programming [9], permutation-based evolutionary algorithms [10], memetic algorithms [11], estimation of distribution algorithms [12], particle swarm optimization [13], interactive evolutionary algorithms [14], ant colony optimization [15,16], and artificial immune systems [17], among others [18,19]. Among the characteristics of evolutionary algorithms that make them such powerful problem solvers is that they lend themselves very well to parallel implementation [20,21,22], enabling the exploitation of today’s multicore and manycore computer architectures. Rich theoretical foundations also exist related to convergence properties [23,24,25] and parameter optimization and control [26], as well as the powerful analytical tools of fitness landscape analysis [27,28,29], such as fitness–distance correlation [30] and search landscape calculus [31], among others. These theoretical foundations inform the engineering of evolutionary solutions to specific problems. There are also many open-source libraries and toolkits available for evolutionary computation in a variety of programming languages [32,33,34,35,36,37,38,39,40,41], making the application of evolutionary algorithms to new problems and domains particularly easy.
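As one concrete example of the fitness landscape analysis tools mentioned above, fitness–distance correlation [30] is simply the Pearson correlation between the fitnesses of sampled solutions and their distances to the nearest known optimum. The following is a minimal sketch assuming a maximization problem over bit-strings, Hamming distance, and a single known optimum; all of these are illustrative choices.

```python
import math
import random

def fitness_distance_correlation(samples, fitness, distance_to_optimum):
    """Pearson correlation between fitness and distance to the nearest optimum.
    For maximization, values near -1 suggest the landscape guides search toward
    the optimum, while values near 0 or positive suggest a misleading landscape."""
    f = [fitness(s) for s in samples]
    d = [distance_to_optimum(s) for s in samples]
    n = len(samples)
    mf, md = sum(f) / n, sum(d) / n
    cov = sum((fi - mf) * (di - md) for fi, di in zip(f, d)) / n
    sf = math.sqrt(sum((fi - mf) ** 2 for fi in f) / n)
    sd = math.sqrt(sum((di - md) ** 2 for di in d) / n)
    return cov / (sf * sd)

# Illustrative use: OneMax, with the all-ones string as the known optimum.
optimum = [1] * 20
samples = [[random.randint(0, 1) for _ in range(20)] for _ in range(500)]
print(fitness_distance_correlation(
    samples,
    fitness=lambda s: sum(s),
    distance_to_optimum=lambda s: sum(a != b for a, b in zip(s, optimum))))
# Prints approximately -1: fitness increases exactly as distance decreases.
```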
Evolutionary computation has been effective in solving problems with a variety of characteristics, and within many application domains, such as multiobjective optimization [42,43,44,45], data science [46], machine learning [47,48,49], classification [50], feature selection [51], neural architecture search [52], neuroevolution [53], bioinformatics [54], scheduling [55], algorithm selection [56], computer vision [57], hardware validation [58], software engineering [59,60], and multi-task optimization [61,62], among many others.
This Special Issue brings together recent advances in the theory and application of evolutionary computation. It includes 13 articles, whose authors represent institutions from 11 different countries, demonstrating the global reach of the topic of evolutionary computation. The published articles span the breadth of evolutionary computation techniques and cover a variety of applications. The remainder of this Editorial briefly describes the articles included within this Special Issue, and I encourage you to read and explore each.

2. Overview of the Published Articles

This overview of the articles is organized in the order in which the contributions to the Special Issue were published.
Cicirello (contribution 1) presents a new mutation operator for evolutionary algorithms in which solutions are represented by permutations. The new mutation operator, cycle mutation, is inspired by cycle crossover. Cycle mutation is designed specifically for assignment and mapping problems (e.g., quadratic assignment, largest common subgraph, etc.) rather than ordering problems such as the traveling salesperson problem. The article includes a fitness landscape analysis exploring the strengths and weaknesses of cycle mutation in terms of permutation features.
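The precise definition of cycle mutation appears in contribution 1. Purely for intuition, the sketch below implements one plausible cycle-style permutation mutation that picks a few random positions and cyclically rotates the elements among them, in contrast to swap or insertion mutations that better suit ordering problems; the fixed cycle length and everything else here are illustrative assumptions, not the operator as published.

```python
import random

def cycle_style_mutation(perm, k=3):
    """Illustrative cycle-style mutation (not the published operator): pick k
    distinct positions and rotate their elements one step around an induced
    cycle, changing which elements occupy which positions without shifting
    the rest of the permutation."""
    perm = perm[:]                       # mutate a copy
    positions = random.sample(range(len(perm)), k)
    values = [perm[i] for i in positions]
    rotated = values[-1:] + values[:-1]  # rotate the chosen elements by one
    for pos, val in zip(positions, rotated):
        perm[pos] = val
    return perm

print(cycle_style_mutation(list(range(10)), k=4))
```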
Osuna-Enciso and Guevara-Martínez (contribution 2) propose a variation of differential evolution, which they call stigmergic differential evolution, for solving continuous optimization problems. Their approach integrates the concept of stigmergy with differential evolution. Stigmergy, a concept central to swarm intelligence, refers to the indirect communication among members of a swarm that occurs when swarm members modify the environment and detect the modifications made by others (e.g., the pheromone trail-following behavior of ants).
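For readers unfamiliar with differential evolution itself, the sketch below shows the textbook DE/rand/1/bin step that such variants build upon: differential mutation of three randomly chosen donors followed by binomial crossover with the target vector. It does not include the stigmergy mechanism of contribution 2, and the parameter values are illustrative.

```python
import random

def de_rand_1_bin(population, i, F=0.5, CR=0.9, bounds=(-5.0, 5.0)):
    """Construct a DE/rand/1/bin trial vector for target individual i."""
    dim = len(population[i])
    # Three distinct donors, all different from the target i.
    r1, r2, r3 = random.sample([j for j in range(len(population)) if j != i], 3)
    a, b, c = population[r1], population[r2], population[r3]
    j_rand = random.randrange(dim)       # guarantee at least one mutated component
    trial = []
    for j in range(dim):
        if random.random() < CR or j == j_rand:
            v = a[j] + F * (b[j] - c[j])           # differential mutation
        else:
            v = population[i][j]                   # inherit from the target
        trial.append(min(max(v, bounds[0]), bounds[1]))  # clip to the search bounds
    return trial
```

In a complete DE loop, the trial vector replaces population[i] only if its fitness is at least as good, which is the greedy selection step.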
Córdoba, Gata, and Reina (contribution 3) consider a problem related to energy access in remote, rural areas. Namely, they utilize a (μ + λ)-evolutionary algorithm to optimize the design of mini hydropower plants, using cubic Hermite splines to model the terrain in 3D, rather than the more common 2D simplifications.
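Cubic Hermite splines interpolate between two sample points using their values and tangents. The snippet below evaluates a single spline segment from the standard Hermite basis functions; it only illustrates the building block, not the 3D terrain model developed in contribution 3.

```python
def hermite_segment(p0, p1, m0, m1, t):
    """Evaluate a cubic Hermite segment at t in [0, 1], given endpoint values
    p0 and p1 and endpoint tangents m0 and m1."""
    h00 = 2 * t**3 - 3 * t**2 + 1
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return h00 * p0 + h10 * m0 + h01 * p1 + h11 * m1

# Interpolated elevation halfway between two terrain samples with known slopes
# (all numbers are made up for illustration).
print(hermite_segment(p0=100.0, p1=120.0, m0=5.0, m1=-2.0, t=0.5))
```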
Parra et al. (contribution 4) consider the binary classification problem of predicting obesity. In their experiments, they explore utilizing evolutionary computation for feature selection in binary classifier systems. They consider ten different machine learning classifiers combined with four feature-selection strategies, two of which use the classic bit-string-encoded genetic algorithm.
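In GA-based feature selection of this wrapper style, each bit of the chromosome typically indicates whether the corresponding feature is used, and the fitness of a mask is the validation performance of a classifier trained on only the selected features. The sketch below illustrates that idea; scikit-learn, logistic regression, cross-validated accuracy, and the NumPy feature matrix are all illustrative assumptions rather than the exact setup of contribution 4.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def mask_fitness(mask, X, y):
    """Fitness of a bit-string feature mask: cross-validated accuracy of a
    classifier trained on the selected columns of the NumPy feature matrix X.
    Returns 0 for the empty mask so selecting nothing is never rewarded."""
    selected = [i for i, bit in enumerate(mask) if bit == 1]
    if not selected:
        return 0.0
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, selected], y, cv=5).mean()
```

Such a mask fitness can be plugged directly into a bit-string genetic algorithm like the one sketched in the Introduction.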
Fan and Liang (contribution 5) consider target coverage in directional sensor networks. For this problem, they develop a hybrid of particle swarm optimization and a genetic algorithm. Their experiments demonstrate that the hybrid outperforms both particle swarm optimization and the genetic algorithm alone at maximizing covered targets while minimizing active sensors.
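For reference, the canonical particle swarm update [13], which such hybrids start from, moves each particle according to a velocity pulled toward both its personal best and the swarm's global best. The inertia and acceleration coefficients below are common illustrative values; the discrete hybrid of contribution 5 adapts this scheme with genetic operators.

```python
import random

def pso_update(position, velocity, personal_best, global_best,
               w=0.72, c1=1.49, c2=1.49):
    """One canonical PSO velocity and position update for a single particle."""
    new_position, new_velocity = [], []
    for x, v, pb, gb in zip(position, velocity, personal_best, global_best):
        r1, r2 = random.random(), random.random()
        v_new = w * v + c1 * r1 * (pb - x) + c2 * r2 * (gb - x)
        new_velocity.append(v_new)
        new_position.append(x + v_new)
    return new_position, new_velocity
```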
Wang et al. (contribution 6) develop a hybrid of particle swarm optimization and differential evolution for real-valued function optimization. Their hybrid combines a self-adaptive form of differential evolution with particle swarm optimization, and they evaluate their approach on a variety of function optimization benchmarks.
Chen et al. (contribution 7) explore the constrained optimization problem of optimizing the linkage system for vehicle wipers, with the aim of improving the steadiness of the wipers. They utilize differential evolution to minimize the maximal magnitude of the angular acceleration of the links in the system subject to a set of constraints, and they were able to reduce that maximal magnitude by 10%.
Tong, Sung, and Wong (contribution 8) analyze the performance of a parameter-free evolutionary algorithm known as pure random orthogonal search, and they propose improvements to the algorithm involving local search. Their experiments cover a range of benchmark function optimization problems with a variety of features (e.g., unimodal vs. multi-modal, convex vs. non-convex, separable vs. non-separable).
Anđelić et al. (contribution 9) approach the problem of searching for candidates for dark matter particles, so-called weakly interacting massive particles, using symbolic regression via genetic programming. Their approach estimates the interaction locations with high accuracy.
Wu et al. (contribution 10) develop a recommender system that utilizes an interactive evolutionary algorithm for making personalized recommendations. In an interactive evolutionary algorithm, human users are directly involved in evaluating the fitness of members of the population. Wu et al. use a surrogate model in their approach to reduce the number of evaluations required of users.
Dubey and Louis (contribution 11) develop an approach to deploying a UAV-based ad hoc network to cover an area of interest. UAV motion is controlled by a set of potential fields that are optimized by a (μ + λ)-evolutionary algorithm using polynomial mutation and simulated binary crossover.
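Polynomial mutation and simulated binary crossover are the standard real-coded operators introduced by Deb and colleagues and commonly paired with (μ + λ) evolutionary algorithms and NSGA-II. The sketch below shows one common form of polynomial mutation for a single bounded variable; the distribution index and bounds are illustrative, and this is the generic operator rather than the specific configuration used in contribution 11.

```python
import random

def polynomial_mutation(x, lower, upper, eta=20.0):
    """One common form of polynomial mutation for a bounded real variable.
    Larger eta keeps the offspring closer to the parent; eta near 20 is typical."""
    u = random.random()
    if u < 0.5:
        delta = (2.0 * u) ** (1.0 / (eta + 1.0)) - 1.0
    else:
        delta = 1.0 - (2.0 * (1.0 - u)) ** (1.0 / (eta + 1.0))
    y = x + delta * (upper - lower)
    return min(max(y, lower), upper)     # clip back into the feasible range

print(polynomial_mutation(0.3, lower=0.0, upper=1.0))
```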
Lazari and Chassiakos (contribution 12) take on the problem of deploying electric vehicle charging stations. They define it as a multi-objective optimization problem with two cost functions: station deployment costs and user travel costs between areas of demand and station locations. Their evolutionary algorithm's chromosome representation combines the x and y coordinates of candidate charging station locations with the classic bit-string of genetic algorithms to model whether or not each candidate station is deployed.
Reffad and Alti (contribution 13) use NSGA-II to optimize enterprise resource planning performance, with average service quality and average energy consumption as the objectives. They propose an adaptive and dynamic solution within IoT, fog, and cloud environments.
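NSGA-II [42] ranks candidate solutions by non-dominated sorting and breaks ties with crowding distance. Its core building block is the Pareto dominance test between objective vectors, sketched below for minimization; the (quality loss, energy) example pairs are hypothetical.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized):
    a is no worse than b in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Hypothetical (quality loss, energy) pairs; lower is better for both.
print(dominates((0.2, 30.0), (0.3, 35.0)))  # True: better in both objectives
print(dominates((0.2, 40.0), (0.3, 35.0)))  # False: a trade-off, neither dominates
```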

3. Conclusions

This collection of articles spans a variety of forms of evolutionary computation, including genetic algorithms, genetic programming, differential evolution, particle swarm optimization, and evolutionary algorithms more generally, as well as hybrids of multiple forms of evolutionary computation. The evolutionary algorithms represent solutions in several ways, including the common bit-string representation, vectors of reals, and permutations, as well as custom representations. The authors of the articles tackle a very diverse collection of problems of different types and from many application domains. For example, some of the problems considered are discrete optimization problems, while others optimize continuous functions. Although many of the articles focus on optimizing a single objective function, others involve multi-objective optimization. Some of the articles primarily utilize common benchmarking optimization functions and problems, while several others explore a variety of real-world applications, such as optimizing mini hydropower plants, UAV deployment, the deployment of electric vehicle charging stations, target coverage in wireless sensor networks, enterprise resource planning, recommender systems, dark matter detection, and optimizing vehicle wiper linkage systems, among others. The diversity of evolutionary techniques, evolutionary operators, problem features, and applications that are covered within this collection of articles demonstrates the wide reach and applicability of evolutionary computation.

Conflicts of Interest

The author declares no conflicts of interest.

List of Contributions

  • Cicirello, V.A. Cycle Mutation: Evolving Permutations via Cycle Induction. Appl. Sci. 2022, 12, 5506. https://doi.org/10.3390/app12115506.
  • Osuna-Enciso, V.; Guevara-Martínez, E. A Stigmergy-Based Differential Evolution. Appl. Sci. 2022, 12, 6093. https://doi.org/10.3390/app12126093.
  • Córdoba, A.T.; Gata, P.M.; Reina, D.G. Optimizing the Layout of Run-of-River Powerplants Using Cubic Hermite Splines and Genetic Algorithms. Appl. Sci. 2022, 12, 8133. https://doi.org/10.3390/app12168133.
  • Parra, D.; Gutiérrez-Gallego, A.; Garnica, O.; Velasco, J.M.; Zekri-Nechar, K.; Zamorano-León, J.J.; Heras, N.d.l.; Hidalgo, J.I. Predicting the Risk of Overweight and Obesity in Madrid—A Binary Classification Approach with Evolutionary Feature Selection. Appl. Sci. 2022, 12, 8251. https://doi.org/10.3390/app12168251.
  • Fan, Y.A.; Liang, C.K. Hybrid Discrete Particle Swarm Optimization Algorithm with Genetic Operators for Target Coverage Problem in Directional Wireless Sensor Networks. Appl. Sci. 2022, 12, 8503. https://doi.org/10.3390/app12178503.
  • Wang, S.L.; Adnan, S.H.; Ibrahim, H.; Ng, T.F.; Rajendran, P. A Hybrid of Fully Informed Particle Swarm and Self-Adaptive Differential Evolution for Global Optimization. Appl. Sci. 2022, 12, 11367. https://doi.org/10.3390/app122211367.
  • Chen, T.J.; Hong, Y.J.; Lin, C.H.; Wang, J.Y. Optimization on Linkage System for Vehicle Wipers by the Method of Differential Evolution. Appl. Sci. 2023, 13, 332. https://doi.org/10.3390/app13010332.
  • Tong, B.K.B.; Sung, C.W.; Wong, W.S. Random Orthogonal Search with Triangular and Quadratic Distributions (TROS and QROS): Parameterless Algorithms for Global Optimization. Appl. Sci. 2023, 13, 1391. https://doi.org/10.3390/app13031391.
  • Anđelić, N.; Baressi Šegota, S.; Glučina, M.; Car, Z. Estimation of Interaction Locations in Super Cryogenic Dark Matter Search Detectors Using Genetic Programming-Symbolic Regression Method. Appl. Sci. 2023, 13, 2059. https://doi.org/10.3390/app13042059.
  • Wu, W.; Sun, X.; Man, G.; Li, S.; Bao, L. Interactive Multifactorial Evolutionary Optimization Algorithm with Multidimensional Preference Surrogate Models for Personalized Recommendation. Appl. Sci. 2023, 13, 2243. https://doi.org/10.3390/app13042243.
  • Dubey, R.; Louis, S.J. Genetic Algorithms Optimized Adaptive Wireless Network Deployment. Appl. Sci. 2023, 13, 4858. https://doi.org/10.3390/app13084858.
  • Lazari, V.; Chassiakos, A. Multi-Objective Optimization of Electric Vehicle Charging Station Deployment Using Genetic Algorithms. Appl. Sci. 2023, 13, 4867. https://doi.org/10.3390/app13084867.
  • Reffad, H.; Alti, A. Semantic-Based Multi-Objective Optimization for QoS and Energy Efficiency in IoT, Fog, and Cloud ERP Using Dynamic Cooperative NSGA-II. Appl. Sci. 2023, 13, 5218. https://doi.org/10.3390/app13085218.

References

  1. Holland, J.H. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence; MIT Press: Cambridge, MA, USA, 1992.
  2. Eiben, A.; Smith, J. Introduction to Evolutionary Computing, 2nd ed.; Springer: Heidelberg, Germany, 2015.
  3. Mitchell, M. An Introduction to Genetic Algorithms; MIT Press: Cambridge, MA, USA, 1998.
  4. Katoch, S.; Chauhan, S.S.; Kumar, V. A review on genetic algorithm: Past, present, and future. Multimed. Tools Appl. 2021, 80, 8091–8126.
  5. Langdon, W.B.; Poli, R. Foundations of Genetic Programming; Springer: Heidelberg, Germany, 2010.
  6. Beyer, H.G.; Schwefel, H.P. Evolution strategies—A comprehensive introduction. Nat. Comput. Int. J. 2002, 1, 3–52.
  7. Das, S.; Suganthan, P.N. Differential Evolution: A Survey of the State-of-the-Art. IEEE Trans. Evol. Comput. 2011, 15, 4–31.
  8. Bilal; Pant, M.; Zaheer, H.; Garcia-Hernandez, L.; Abraham, A. Differential Evolution: A review of more than two decades of research. Eng. Appl. Artif. Intell. 2020, 90, 103479.
  9. Yao, X.; Liu, Y.; Lin, G. Evolutionary programming made faster. IEEE Trans. Evol. Comput. 1999, 3, 82–102.
  10. Cicirello, V.A. A Survey and Analysis of Evolutionary Operators for Permutations. In Proceedings of the 15th International Joint Conference on Computational Intelligence, Rome, Italy, 13–15 November 2023; pp. 288–299.
  11. Osaba, E.; Del Ser, J.; Cotta, C.; Moscato, P. Memetic Computing: Accelerating optimization heuristics with problem-dependent local search methods. Swarm Evol. Comput. 2022, 70, 101047.
  12. Larrañaga, P.; Bielza, C. Estimation of Distribution Algorithms in Machine Learning: A Survey. IEEE Trans. Evol. Comput. 2023, early access.
  13. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948.
  14. Uusitalo, S.; Kantosalo, A.; Salovaara, A.; Takala, T.; Guckelsberger, C. Creative collaboration with interactive evolutionary algorithms: A reflective exploratory design study. Genet. Program. Evolvable Mach. 2023, 25, 4.
  15. Dorigo, M.; Gambardella, L. Ant colony system: A cooperative learning approach to the traveling salesman problem. IEEE Trans. Evol. Comput. 1997, 1, 53–66.
  16. Dorigo, M.; Maniezzo, V.; Colorni, A. Ant system: Optimization by a colony of cooperating agents. IEEE Trans. Syst. Man Cybern. Part B (Cybernetics) 1996, 26, 29–41.
  17. Dasgupta, D. Advances in artificial immune systems. IEEE Comput. Intell. Mag. 2006, 1, 40–49.
  18. Siarry, P. (Ed.) Metaheuristics; Springer Nature: Cham, Switzerland, 2016.
  19. Hoos, H.H.; Stützle, T. Stochastic Local Search: Foundations and Applications; Morgan Kaufmann: San Francisco, CA, USA, 2005.
  20. Harada, T.; Alba, E. Parallel Genetic Algorithms: A Useful Survey. ACM Comput. Surv. 2020, 53, 86.
  21. Cicirello, V.A. Impact of Random Number Generation on Parallel Genetic Algorithms. In Proceedings of the 31st International Florida Artificial Intelligence Research Society Conference, Melbourne, FL, USA, 21–23 May 2018; AAAI Press: Menlo Park, CA, USA, 2018; pp. 2–7.
  22. Luque, G.; Alba, E. Parallel Genetic Algorithms: Theory and Real World Applications; Springer: Heidelberg, Germany, 2011.
  23. Rudolph, G. Convergence analysis of canonical genetic algorithms. IEEE Trans. Neural Netw. 1994, 5, 96–101.
  24. Rudolph, G. Convergence of evolutionary algorithms in general search spaces. In Proceedings of the IEEE International Conference on Evolutionary Computation, Nagoya, Japan, 20–22 May 1996; pp. 50–54.
  25. He, J.; Yao, X. Drift analysis and average time complexity of evolutionary algorithms. Artif. Intell. 2001, 127, 57–85.
  26. Karafotias, G.; Hoogendoorn, M.; Eiben, A.E. Parameter Control in Evolutionary Algorithms: Trends and Challenges. IEEE Trans. Evol. Comput. 2015, 19, 167–187.
  27. Cicirello, V.A. On Fitness Landscape Analysis of Permutation Problems: From Distance Metrics to Mutation Operator Selection. Mob. Netw. Appl. 2023, 28, 507–517.
  28. Pimenta, C.G.; de Sá, A.G.C.; Ochoa, G.; Pappa, G.L. Fitness Landscape Analysis of Automated Machine Learning Search Spaces. In Proceedings of the Evolutionary Computation in Combinatorial Optimization: 20th European Conference, EvoCOP 2020, Held as Part of EvoStar 2020, Seville, Spain, 15–17 April 2020; Springer: Cham, Switzerland, 2020; pp. 114–130.
  29. Huang, Y.; Li, W.; Tian, F.; Meng, X. A fitness landscape ruggedness multiobjective differential evolution algorithm with a reinforcement learning strategy. Appl. Soft Comput. 2020, 96, 106693.
  30. Jones, T.; Forrest, S. Fitness Distance Correlation as a Measure of Problem Difficulty for Genetic Algorithms. In Proceedings of the 6th International Conference on Genetic Algorithms, Pittsburgh, PA, USA, 15–19 July 1995; pp. 184–192.
  31. Cicirello, V.A. The Permutation in a Haystack Problem and the Calculus of Search Landscapes. IEEE Trans. Evol. Comput. 2016, 20, 434–446.
  32. Scott, E.O.; Luke, S. ECJ at 20: Toward a General Metaheuristics Toolkit. In Proceedings of the Genetic and Evolutionary Computation Conference Companion, Prague, Czech Republic, 13–17 July 2019; ACM Press: New York, NY, USA, 2019; pp. 1391–1398.
  33. Cicirello, V.A. Chips-n-Salsa: A Java Library of Customizable, Hybridizable, Iterative, Parallel, Stochastic, and Self-Adaptive Local Search Algorithms. J. Open Source Softw. 2020, 5, 2448.
  34. Jenetics. Jenetics—Genetic Algorithm, Genetic Programming, Evolutionary Algorithm, and Multi-Objective Optimization. 2024. Available online: https://jenetics.io/ (accessed on 27 January 2024).
  35. Bell, I.H. CEGO: C++11 Evolutionary Global Optimization. J. Open Source Softw. 2019, 4, 1147.
  36. Gijsbers, P.; Vanschoren, J. GAMA: Genetic Automated Machine learning Assistant. J. Open Source Softw. 2019, 4, 1132.
  37. Detorakis, G.; Burton, A. GAIM: A C++ library for Genetic Algorithms and Island Models. J. Open Source Softw. 2019, 4, 1839.
  38. de Dios, J.A.M.; Mezura-Montes, E. Metaheuristics: A Julia Package for Single- and Multi-Objective Optimization. J. Open Source Softw. 2022, 7, 4723.
  39. Izzo, D.; Biscani, F. dcgp: Differentiable Cartesian Genetic Programming made easy. J. Open Source Softw. 2020, 5, 2290.
  40. Simson, J. LGP: A robust Linear Genetic Programming implementation on the JVM using Kotlin. J. Open Source Softw. 2019, 4, 1337.
  41. Tarkowski, T. Quilë: C++ genetic algorithms scientific library. J. Open Source Softw. 2023, 8, 4902.
  42. Deb, K.; Pratap, A.; Agarwal, S.; Meyarivan, T. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 2002, 6, 182–197.
  43. Liang, J.; Ban, X.; Yu, K.; Qu, B.; Qiao, K.; Yue, C.; Chen, K.; Tan, K.C. A Survey on Evolutionary Constrained Multiobjective Optimization. IEEE Trans. Evol. Comput. 2023, 27, 201–221.
  44. Tian, Y.; Si, L.; Zhang, X.; Cheng, R.; He, C.; Tan, K.C.; Jin, Y. Evolutionary Large-Scale Multi-Objective Optimization: A Survey. ACM Comput. Surv. 2021, 54, 174.
  45. Li, M.; Yao, X. Quality Evaluation of Solution Sets in Multiobjective Optimisation: A Survey. ACM Comput. Surv. 2019, 52, 26.
  46. Sohail, A. Genetic Algorithms in the Fields of Artificial Intelligence and Data Sciences. Ann. Data Sci. 2023, 10, 1007–1018.
  47. Li, N.; Ma, L.; Yu, G.; Xue, B.; Zhang, M.; Jin, Y. Survey on Evolutionary Deep Learning: Principles, Algorithms, Applications, and Open Issues. ACM Comput. Surv. 2023, 56, 41.
  48. Telikani, A.; Tahmassebi, A.; Banzhaf, W.; Gandomi, A.H. Evolutionary Machine Learning: A Survey. ACM Comput. Surv. 2021, 54, 161.
  49. Li, N.; Ma, L.; Xing, T.; Yu, G.; Wang, C.; Wen, Y.; Cheng, S.; Gao, S. Automatic design of machine learning via evolutionary computation: A survey. Appl. Soft Comput. 2023, 143, 110412.
  50. Espejo, P.G.; Ventura, S.; Herrera, F. A Survey on the Application of Genetic Programming to Classification. IEEE Trans. Syst. Man, Cybern. Part C (Appl. Rev.) 2010, 40, 121–144.
  51. Xue, B.; Zhang, M.; Browne, W.N.; Yao, X. A Survey on Evolutionary Computation Approaches to Feature Selection. IEEE Trans. Evol. Comput. 2016, 20, 606–626.
  52. Zhou, X.; Qin, A.K.; Sun, Y.; Tan, K.C. A Survey of Advances in Evolutionary Neural Architecture Search. In Proceedings of the 2021 IEEE Congress on Evolutionary Computation (CEC), Virtually, 28 June–1 July 2021; pp. 950–957.
  53. Papavasileiou, E.; Cornelis, J.; Jansen, B. A Systematic Literature Review of the Successors of “NeuroEvolution of Augmenting Topologies”. Evol. Comput. 2021, 29, 1–73.
  54. Fogel, G.B.; Corne, D.W. (Eds.) Evolutionary Computation in Bioinformatics; Morgan Kaufmann: San Francisco, CA, USA, 2003.
  55. Zhang, F.; Mei, Y.; Nguyen, S.; Zhang, M. Survey on Genetic Programming and Machine Learning Techniques for Heuristic Design in Job Shop Scheduling. IEEE Trans. Evol. Comput. 2023, 28, 147–167.
  56. Kerschke, P.; Hoos, H.H.; Neumann, F.; Trautmann, H. Automated Algorithm Selection: Survey and Perspectives. Evol. Comput. 2019, 27, 3–45.
  57. Bi, Y.; Xue, B.; Mesejo, P.; Cagnoni, S.; Zhang, M. A Survey on Evolutionary Computation for Computer Vision and Image Analysis: Past, Present, and Future Trends. IEEE Trans. Evol. Comput. 2023, 27, 5–25.
  58. Jayasena, A.; Mishra, P. Directed Test Generation for Hardware Validation: A Survey. ACM Comput. Surv. 2024, 56, 132.
  59. Sobania, D.; Schweim, D.; Rothlauf, F. A Comprehensive Survey on Program Synthesis with Evolutionary Algorithms. IEEE Trans. Evol. Comput. 2023, 27, 82–97.
  60. Arcuri, A.; Galeotti, J.P.; Marculescu, B.; Zhang, M. EvoMaster: A Search-Based System Test Generation Tool. J. Open Source Softw. 2021, 6, 2153.
  61. Tan, Z.; Luo, L.; Zhong, J. Knowledge transfer in evolutionary multi-task optimization: A survey. Appl. Soft Comput. 2023, 138, 110182.
  62. Zhao, H.; Ning, X.; Liu, X.; Wang, C.; Liu, J. What makes evolutionary multi-task optimization better: A comprehensive survey. Appl. Soft Comput. 2023, 145, 110545.
Figure 1. A few of the many forms of evolutionary computation.
