Monarch Butterfly Optimization Based Convolutional Neural Network Design
Abstract
1. Introduction
1.1. Convolutional Neural Networks and Hyperparameter Optimization
1.2. Research Question, Objectives, and Scope
1.3. Structure of the Paper
2. Swarm Intelligence Algorithms and Related Work
3. Proposed Method
3.1. Original Monarch Butterfly Optimization Algorithm
- The population of individuals is divided between two different locations (Land 1 and Land 2);
- Offspring are created in both locations by utilizing the migration operator;
- If a new individual has better fitness than its parent monarch butterfly, it replaces the old solution;
- The solutions with the best fitness values pass unchanged to the next iteration.
3.1.1. Migration Operator
3.1.2. Butterfly Adjusting Operator
Algorithm 1. Basic MBO pseudo-code.
Randomly initialize the population of solutions (monarch butterflies)
Initialize the parameters: migration ratio (p), migration period (peri), butterfly adjusting rate (BAR), and maximum step size (S_max)
Evaluate the fitness of each solution
Set t, the iteration counter, to one, and define the maximum number of iterations (MaxIter)
while t <= MaxIter do
    Sort the solutions according to their fitness value
    Divide the whole population into two sub-populations (SP1 and SP2)
    for all i = 1 to NP1 (all individuals in Sub-population 1) do
        for all k = 1 to D (all elements of the individual) do
            Generate rand (a random number), and calculate the value of r by using Equation (5)
            if r <= p then
                Choose an individual from SP1, and create the k-th element of the new solution by utilizing Equation (4)
            else
                Choose an individual from SP2, and create the k-th element of the new solution by utilizing Equation (4)
            end if
        end for
    end for
    for all j = 1 to NP2 (all individuals in Sub-population 2) do
        for all k = 1 to D (all elements of the individual) do
            Generate rand (a random number), and calculate the value of r by using Equation (5)
            if r <= p then
                Create the k-th element of the new solution by utilizing Equation (6)
            else
                Choose an individual from SP2, and create the k-th element of the new solution by utilizing Equation (7)
                if rand > BAR then
                    Apply Equation (8)
                end if
            end if
        end for
    end for
    Merge the two sub-populations into one population
    Evaluate the fitness of the new solutions
    Increment the iteration counter t by one
end while
Return the best solution
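To make the control flow of Algorithm 1 concrete, the following minimal NumPy sketch implements the two operators. It assumes Equations (4)–(8) take their standard forms from the original MBO paper [35]; the function names and the tangent-based Lévy-style draw are illustrative choices, not the authors' exact implementation.

```python
import numpy as np

def migration_operator(sp1, sp2, p=5/12, peri=1.2):
    """Sub-population 1 offspring: each element is copied from a randomly
    chosen parent in SP1 (if r <= p) or in SP2 (otherwise)."""
    n1, dim = sp1.shape
    offspring = np.empty_like(sp1)
    for i in range(n1):
        for k in range(dim):
            r = np.random.rand() * peri                              # Equation (5)
            if r <= p:
                offspring[i, k] = sp1[np.random.randint(n1), k]      # Equation (4), parent from SP1
            else:
                offspring[i, k] = sp2[np.random.randint(len(sp2)), k]  # Equation (4), parent from SP2
    return offspring

def adjusting_operator(sp2, best, t, p=5/12, peri=1.2, bar=5/12, s_max=1.0):
    """Sub-population 2 offspring via the butterfly adjusting operator,
    with a heavy-tailed perturbation standing in for Equation (8)."""
    n2, dim = sp2.shape
    offspring = np.empty_like(sp2)
    alpha = s_max / t**2                                   # step size shrinks with iteration t >= 1
    for j in range(n2):
        levy = np.tan(np.pi * (np.random.rand(dim) - 0.5))  # illustrative Levy-style draw
        for k in range(dim):
            r = np.random.rand() * peri                    # Equation (5)
            if r <= p:
                offspring[j, k] = best[k]                  # Equation (6), inherit from the best solution
            else:
                offspring[j, k] = sp2[np.random.randint(n2), k]      # Equation (7)
                if np.random.rand() > bar:
                    offspring[j, k] += alpha * (levy[k] - 0.5)       # Equation (8)
    return offspring
```

The default parameter values match the simulation settings listed in the parameter table of Section 4.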
3.2. Hybridized Monarch Butterfly Optimization Algorithm
Algorithm 2. MBO-ABCFE pseudocode.
Randomly initialize the population of solutions (monarch butterflies); initialize the parameters: migration ratio (p), migration period (peri), butterfly adjusting rate (BAR), maximum step size (S_max), exhaustiveness parameter (limit), discarding mechanism trigger (dmt), and modification rate (MR); evaluate the fitness; set t, the iteration counter, to one, set the trial counter of every solution to zero, and define the maximum number of iterations (MaxIter)
while t <= MaxIter do
    Sort the solutions according to their fitness value
    Divide the whole population into two sub-populations (SP1 and SP2)
    for all i = 1 to NP1 (all individuals in Sub-population 1) do
        for all k = 1 to D (all elements of the individual) do
            Generate a random number phi between zero and one for the k-th element
            if phi < MR then
                if FAP < 0.5 then
                    Generate a new component by using Equation (12)
                else
                    Generate rand (a random number), and calculate the value of r by using Equation (5)
                    if r <= p then
                        Choose a solution from SP1, and create the k-th element of the new solution by Equation (4)
                    else
                        Choose a solution from SP2, and create the k-th element of the new solution by Equation (4)
                    end if
                end if
            end if
        end for
        Evaluate the fitness, and make a selection between the new and the old solution based on the fitness value; if the old solution has better fitness, increment the trial parameter by one
    end for
    for all j = 1 to NP2 (all individuals in Sub-population 2) do
        for all k = 1 to D (all elements of the individual) do
            Generate rand (a random number), and calculate the value of r by using Equation (5)
            if r <= p then
                Create the k-th element of the new solution by utilizing Equation (6)
            else
                Choose an individual from SP2, and create the k-th element of the new solution by Equation (7)
                if rand > BAR then
                    Apply Equation (8)
                end if
            end if
        end for
        If the old solution has better fitness, increment the trial parameter by one
    end for
    Merge the two sub-populations into one population
    for all solutions in the merged population do
        if trial >= limit and t >= dmt then
            Discard the solution, replace it with a randomly created solution by utilizing Equation (11), and reset its trial counter to zero
        end if
    end for
    Evaluate the fitness of the new solutions
    Adjust the value of the adaptive control parameter by using Equation (10)
    Increment the iteration counter t by one
end while
Return the best solution
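The mechanisms that distinguish MBO-ABCFE from the basic MBO are sketched below: an ABC-style exhaustiveness counter with a discard-and-replace step (Equation (11)), and a firefly-style move standing in for Equation (12). The exact forms of Equations (10)–(12) are assumptions based on the standard artificial bee colony [48] and firefly [49] algorithms; the parameter names (limit, dmt, FAP) and default values follow the pseudocode and the parameter table of Section 4.

```python
import numpy as np

def firefly_move(x_i, x_j, beta0=0.2, gamma=1.0, alpha=0.5):
    """Assumed form of Equation (12): move solution i toward the brighter
    (fitter) solution j, firefly-algorithm style."""
    r2 = np.sum((x_i - x_j) ** 2)
    beta = beta0 * np.exp(-gamma * r2)           # attractiveness decays with distance
    return x_i + beta * (x_j - x_i) + alpha * (np.random.rand(len(x_i)) - 0.5)

def discard_exhausted(pop, trials, lb, ub, t, limit=4, dmt=33):
    """ABC-style discarding: once iteration t reaches the trigger dmt, replace
    every solution whose trial counter reached the exhaustiveness limit with a
    random solution (assumed form of Equation (11)) and reset its counter."""
    if t < dmt:
        return pop, trials
    for i in range(len(pop)):
        if trials[i] >= limit:
            pop[i] = lb + np.random.rand(pop.shape[1]) * (ub - lb)   # Equation (11)
            trials[i] = 0
    return pop, trials
```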
4. Simulation Setup
4.1. Parameter Settings and Dataset for Unconstrained Simulations
4.2. Parameter Settings and Simulation Setup for CNNs’ Neuroevolution Simulations
4.2.1. Configuration of the Convolutional Layer
4.2.2. Configuration of the Fully-Connected Layer
4.2.3. Configuration of the General Hyperparameters
4.2.4. Benchmark Dataset
5. Experimental Results and Discussion
5.1. Experiments on Unconstrained Benchmark Functions
5.2. Convolutional Neural Network Design Experiment
6. Conclusions
- An automated “neuroevolution” framework based on the hybridized MBO-ABCFE algorithm was developed; it designs and generates CNN architectures with high performance (accuracy) for image classification tasks;
- The CNN hyperparameter optimization problem was formulated with more hyperparameters than in most previous works from this domain;
- The original MBO approach was significantly enhanced by hybridizing it with other state-of-the-art swarm algorithms; and
- The original MBO was applied for the first time to the CNN design optimization challenge.
Author Contributions
Funding
Conflicts of Interest
References
- Lecun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86, 2278–2324.
- Farabet, C.; Couprie, C.; Najman, L.; LeCun, Y. Learning Hierarchical Features for Scene Labeling. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 1915–1929.
- Stoean, C.; Stoean, R.; Becerra-García, R.A.; García-Bermúdez, R.; Atencia, M.; García-Lagos, F.; Velázquez-Pérez, L.; Joya, G. Unsupervised Learning as a Complement to Convolutional Neural Network Classification in the Analysis of Saccadic Eye Movement in Spino-Cerebellar Ataxia Type 2. In Advances in Computational Intelligence; Rojas, I., Joya, G., Catala, A., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 26–37.
- Karpathy, A.; Li, F.-F. Deep visual-semantic alignments for generating image descriptions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 3128–3137.
- Taigman, Y.; Yang, M.; Ranzato, M.; Wolf, L. DeepFace: Closing the Gap to Human-Level Performance in Face Verification. In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 1701–1708.
- Samide, A.; Stoean, C.; Stoean, R. Surface study of inhibitor films formed by polyvinyl alcohol and silver nanoparticles on stainless steel in hydrochloric acid solution using convolutional neural networks. Appl. Surf. Sci. 2019, 475, 1–5.
- Toshev, A.; Szegedy, C. DeepPose: Human Pose Estimation via Deep Neural Networks. In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; IEEE Computer Society: Washington, DC, USA, 2014; pp. 1653–1660.
- Stoean, R.; Stoean, C.; Samide, A.; Joya, G. Convolutional Neural Network Learning Versus Traditional Segmentation for the Approximation of the Degree of Defective Surface in Titanium for Implantable Medical Devices. In Advances in Computational Intelligence; Rojas, I., Joya, G., Catala, A., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 871–882.
- Hubel, D.H.; Wiesel, T.N. Receptive fields of single neurones in the cat’s striate cortex. J. Physiol. 1959, 148, 574–591.
- Fukushima, K. Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position. Biol. Cybern. 1980, 36, 193–202.
- LeCun, Y.; Boser, B.E.; Denker, J.S.; Henderson, D.; Howard, R.E.; Hubbard, W.E.; Jackel, L.D. Handwritten digit recognition with a back-propagation network. In Advances in Neural Information Processing Systems; Morgan Kaufmann Publishers Inc.: San Francisco, CA, USA, 1990; pp. 396–404.
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet Classification with Deep Convolutional Neural Networks. In Advances in Neural Information Processing Systems 25; Pereira, F., Burges, C.J.C., Bottou, L., Weinberger, K.Q., Eds.; Curran Associates, Inc.: Red Hook, NY, USA, 2012; pp. 1097–1105.
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; pp. 1–9.
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778.
- Huang, G.; Liu, Z.; Van Der Maaten, L.; Weinberger, K.Q. Densely Connected Convolutional Networks. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 2261–2269.
- Hu, J.; Shen, L.; Sun, G. Squeeze-and-Excitation Networks. In Proceedings of the 2018 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Salt Lake City, UT, USA, 18–22 June 2018; pp. 7132–7141.
- Duchi, J.C.; Hazan, E.; Singer, Y. Adaptive Subgradient Methods for Online Learning and Stochastic Optimization. J. Mach. Learn. Res. 2011, 12, 2121–2159.
- Zeiler, M.D. ADADELTA: An Adaptive Learning Rate Method. arXiv 2012, arXiv:1212.5701.
- Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. arXiv 2014, arXiv:1412.6980.
- Ng, A.Y. Feature Selection, L1 vs. L2 Regularization, and Rotational Invariance. In Proceedings of the Twenty-First International Conference on Machine Learning; ACM: New York, NY, USA, 2004; p. 788.
- Srivastava, N.; Hinton, G.; Krizhevsky, A.; Sutskever, I.; Salakhutdinov, R. Dropout: A Simple Way to Prevent Neural Networks from Overfitting. J. Mach. Learn. Res. 2014, 15, 1929–1958.
- Wan, L.; Zeiler, M.; Zhang, S.; Le Cun, Y.; Fergus, R. Regularization of neural networks using DropConnect. In Proceedings of the International Conference on Machine Learning, Atlanta, GA, USA, 16–21 June 2013; pp. 1058–1066.
- Ioffe, S.; Szegedy, C. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. In Proceedings of the 32nd International Conference on Machine Learning; Bach, F., Blei, D., Eds.; PMLR: Lille, France, 2015; Volume 37, pp. 448–456.
- Nair, V.; Hinton, G.E. Rectified Linear Units Improve Restricted Boltzmann Machines. In Proceedings of the 27th International Conference on Machine Learning, Haifa, Israel, 21–24 June 2010; pp. 807–814.
- Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2017; p. 800.
- Wang, Y.; Zhang, H.; Zhang, G. cPSO-CNN: An efficient PSO-based algorithm for fine-tuning hyper-parameters of convolutional neural networks. Swarm Evol. Comput. 2019, 49, 114–123.
- Darwish, A.; Ezzat, D.; Hassanien, A.E. An optimized model based on convolutional neural networks and orthogonal learning particle swarm optimization algorithm for plant diseases diagnosis. Swarm Evol. Comput. 2020, 52, 100616.
- Yamasaki, T.; Honma, T.; Aizawa, K. Efficient Optimization of Convolutional Neural Networks Using Particle Swarm Optimization. In Proceedings of the 2017 IEEE Third International Conference on Multimedia Big Data (BigMM), Laguna Hills, CA, USA, 19–21 April 2017; pp. 70–73.
- Qolomany, B.; Maabreh, M.; Al-Fuqaha, A.; Gupta, A.; Benhaddou, D. Parameters optimization of deep learning models using Particle swarm optimization. In Proceedings of the 2017 13th International Wireless Communications and Mobile Computing Conference (IWCMC), Valencia, Spain, 26–30 June 2017; pp. 1285–1290.
- Bochinski, E.; Senst, T.; Sikora, T. Hyper-parameter optimization for convolutional neural network committees based on evolutionary algorithms. In Proceedings of the 2017 IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017; pp. 3924–3928.
- Baldominos, A.; Saez, Y.; Isasi, P. Evolutionary convolutional neural networks: An application to handwriting recognition. Neurocomputing 2018, 283, 38–52.
- Strumberger, I.; Tuba, E.; Bacanin, N.; Jovanovic, R.; Tuba, M. Convolutional Neural Network Architecture Design by the Tree Growth Algorithm Framework. In Proceedings of the 2019 International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary, 14–19 July 2019; pp. 1–8.
- Strumberger, I.; Tuba, E.; Bacanin, N.; Zivkovic, M.; Beko, M.; Tuba, M. Designing Convolutional Neural Network Architecture by the Firefly Algorithm. In Proceedings of the 2019 International Young Engineers Forum (YEF-ECE), Caparica, Portugal, 10 May 2019; pp. 59–65.
- Bacanin, N.; Bezdan, T.; Tuba, E.; Strumberger, I.; Tuba, M. Optimizing Convolutional Neural Network Hyperparameters by Enhanced Swarm Intelligence Metaheuristics. Algorithms 2020, 13, 67.
- Wang, G.G.; Deb, S.; Cui, Z. Monarch Butterfly Optimization. Neural Comput. Appl. 2015, 1–20.
- Strumberger, I.; Tuba, E.; Bacanin, N.; Beko, M.; Tuba, M. Monarch butterfly optimization algorithm for localization in wireless sensor networks. In Proceedings of the 2018 28th International Conference Radioelektronika (RADIOELEKTRONIKA), Prague, Czech Republic, 19–20 April 2018; pp. 1–6.
- Wang, G.G.; Hao, G.S.; Cheng, S.; Qin, Q. A Discrete Monarch Butterfly Optimization for Chinese TSP Problem. In Advances in Swarm Intelligence; Tan, Y., Shi, Y., Niu, B., Eds.; Springer International Publishing: Cham, Switzerland, 2016; pp. 165–173.
- Strumberger, I.; Tuba, E.; Bacanin, N.; Beko, M.; Tuba, M. Modified and Hybridized Monarch Butterfly Algorithms for Multi-Objective Optimization. In International Conference on Hybrid Intelligent Systems; Springer: Berlin, Germany, 2018; pp. 449–458.
- Strumberger, I.; Tuba, M.; Bacanin, N.; Tuba, E. Cloudlet Scheduling by Hybridized Monarch Butterfly Optimization Algorithm. J. Sens. Actuator Netw. 2019, 8, 44.
- Strumberger, I.; Sarac, M.; Markovic, D.; Bacanin, N. Hybridized Monarch Butterfly Algorithm for Global Optimization Problems. Int. J. Comput. 2018, 3, 63–68.
- Wang, G.G.; Deb, S.; Zhao, X.; Cui, Z. A new monarch butterfly optimization with an improved crossover operator. Oper. Res. 2018, 18, 731–755.
- Suganuma, M.; Shirakawa, S.; Nagao, T. A Genetic Programming Approach to Designing Convolutional Neural Network Architectures. In GECCO ’17, Proceedings of the Genetic and Evolutionary Computation Conference, Berlin, Germany, 15–19 July 2017; ACM: New York, NY, USA, 2017; pp. 497–504.
- De Rosa, G.H.; Papa, J.P.; Yang, X.S. Handling dropout probability estimation in convolution neural networks using meta-heuristics. Soft Comput. 2018, 22, 6147–6156.
- Ting, T.O.; Yang, X.S.; Cheng, S.; Huang, K. Hybrid Metaheuristic Algorithms: Past, Present, and Future. Recent Adv. Swarm Intell. Evol. Comput. Stud. Comput. Intell. 2015, 585, 71–83.
- Bacanin, N.; Tuba, M. Artificial Bee Colony (ABC) Algorithm for Constrained Optimization Improved with Genetic Operators. Stud. Inform. Control 2012, 21, 137–146.
- Dorigo, M.; Birattari, M. Ant Colony Optimization; Springer: Berlin/Heidelberg, Germany, 2010.
- Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948.
- Karaboga, D.; Basturk, B. On the performance of artificial bee colony (ABC) algorithm. Appl. Soft Comput. 2008, 8, 687–697.
- Yang, X.S. Firefly Algorithms for Multimodal Optimization. In Stochastic Algorithms: Foundations and Applications; Watanabe, O., Zeugmann, T., Eds.; Springer: Berlin/Heidelberg, Germany, 2009; pp. 169–178.
- Strumberger, I.; Bacanin, N.; Tuba, M. Enhanced Firefly Algorithm for Constrained Numerical Optimization. In Proceedings of the IEEE International Congress on Evolutionary Computation (CEC 2017), San Sebastián, Spain, 5–8 June 2017; pp. 2120–2127.
- Gandomi, A.H.; Yang, X.S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35.
- Bacanin, N. Implementation and performance of an object-oriented software system for cuckoo search algorithm. Int. J. Math. Comput. Simul. 2010, 6, 185–193.
- Yang, X.S.; Hossein Gandomi, A. Bat algorithm: A novel approach for global engineering optimization. Eng. Comput. 2012, 29, 464–483.
- Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
- Wang, G.G.; Deb, S.; Coelho, L.d.S. Elephant Herding Optimization. In Proceedings of the 2015 3rd International Symposium on Computational and Business Intelligence (ISCBI), Bali, Indonesia, 7–9 December 2015; pp. 1–5.
- Strumberger, I.; Bacanin, N.; Tuba, M. Hybridized Elephant Herding Optimization Algorithm for Constrained Optimization. In Hybrid Intelligent Systems; Abraham, A., Muhuri, P.K., Muda, A.K., Gandhi, N., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 158–166.
- Strumberger, I.; Tuba, E.; Zivkovic, M.; Bacanin, N.; Beko, M.; Tuba, M. Dynamic Search Tree Growth Algorithm for Global Optimization. In Technological Innovation for Industry and Service Systems; Camarinha-Matos, L.M., Almeida, R., Oliveira, J., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 143–153.
- Strumberger, I.; Bacanin, N.; Tomic, S.; Beko, M.; Tuba, M. Static drone placement by elephant herding optimization algorithm. In Proceedings of the 2017 25th Telecommunication Forum (TELFOR), Belgrade, Serbia, 21–22 November 2017; pp. 1–4.
- Cheraghalipour, A.; Hajiaghaei-Keshteli, M.; Paydar, M.M. Tree Growth Algorithm (TGA): A novel approach for solving optimization problems. Eng. Appl. Artif. Intell. 2018, 72, 393–414.
- Mucherino, A.; Seref, O. Monkey search: A novel metaheuristic search for global optimization. In Data Mining, Systems Analysis and Optimization in Biomedicine; Seref, O., Kundakcioglu, E., Pardalos, P., Eds.; American Institute of Physics Conference Series; American Institute of Physics: Melville, NY, USA, 2007; Volume 953, pp. 162–173.
- Strumberger, I.; Tuba, E.; Bacanin, N.; Beko, M.; Tuba, M. Hybridized moth search algorithm for constrained optimization problems. In Proceedings of the 2018 International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, 4 May 2018; pp. 1–5.
- Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
- Yang, X.S. Flower pollination algorithm for global optimization. In Proceedings of the International Conference on Unconventional Computing and Natural Computation, Orléans, France, 3–7 September 2012; pp. 240–249.
- Strumberger, I.; Bacanin, N.; Tuba, M. Hybridized elephant herding optimization algorithm for constrained optimization. In Proceedings of the International Conference on Health Information Science, Moscow, Russia, 7–9 October 2017; pp. 158–166.
- Strumberger, I.; Sarac, M.; Markovic, D.; Bacanin, N. Moth Search Algorithm for Drone Placement Problem. Int. J. Comput. 2018, 3, 75–80.
- Strumberger, I.; Tuba, E.; Bacanin, N.; Tuba, M. Modified Moth Search Algorithm for Portfolio Optimization. In Smart Trends in Computing and Communications; Zhang, Y.D., Mandal, J.K., So-In, C., Thakur, N.V., Eds.; Springer: Singapore, 2020; pp. 445–453.
- Tuba, E.; Strumberger, I.; Bacanin, N.; Zivkovic, D.; Tuba, M. Brain Storm Optimization Algorithm for Thermal Image Fusion using DCT Coefficients. In Proceedings of the 2019 IEEE Congress on Evolutionary Computation (CEC), Wellington, New Zealand, 10–13 June 2019; pp. 234–241.
- Tuba, E.; Strumberger, I.; Zivkovic, D.; Bacanin, N.; Tuba, M. Mobile Robot Path Planning by Improved Brain Storm Optimization Algorithm. In Proceedings of the 2018 IEEE Congress on Evolutionary Computation (CEC), Rio de Janeiro, Brazil, 8–13 July 2018; pp. 1–8.
- Tuba, E.; Strumberger, I.; Bacanin, N.; Tuba, M. Optimal Path Planning in Environments with Static Obstacles by Harmony Search Algorithm. In Advances in Harmony Search, Soft Computing and Applications; Kim, J.H., Geem, Z.W., Jung, D., Yoo, D.G., Yadav, A., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 186–193.
- Bacanin, N.; Tuba, M. Firefly Algorithm for Cardinality Constrained Mean-Variance Portfolio Optimization Problem with Entropy Diversity Constraint. Sci. World J. 2014, 2014, 16.
- Tuba, M.; Bacanin, N. Artificial bee colony algorithm hybridized with firefly metaheuristic for cardinality constrained mean-variance portfolio problem. Appl. Math. Inf. Sci. 2014, 8, 2831–2844.
- Strumberger, I.; Minovic, M.; Tuba, M.; Bacanin, N. Performance of Elephant Herding Optimization and Tree Growth Algorithm Adapted for Node Localization in Wireless Sensor Networks. Sensors 2019, 19, 2515.
- Strumberger, I.; Tuba, E.; Bacanin, N.; Beko, M.; Tuba, M. Wireless Sensor Network Localization Problem by Hybridized Moth Search Algorithm. In Proceedings of the 2018 14th International Wireless Communications Mobile Computing Conference (IWCMC), Limassol, Cyprus, 25–29 June 2018; pp. 316–321.
- Tuba, M.; Bacanin, N. Hybridized bat algorithm for multi-objective radio frequency identification (RFID) network planning. In Proceedings of the 2015 IEEE Congress on Evolutionary Computation (CEC), Sendai, Japan, 25–28 May 2015; pp. 499–506.
- Bacanin, N.; Tuba, M.; Strumberger, I. RFID network planning by ABC algorithm hybridized with heuristic for initial number and locations of readers. In Proceedings of the 2015 17th UKSim-AMSS International Conference on Modelling and Simulation (UKSim), Cambridge, UK, 25–27 March 2015; pp. 39–44.
- Bacanin, N.; Tuba, M.; Jovanovic, R. Hierarchical multiobjective RFID network planning using firefly algorithm. In Proceedings of the 2015 International Conference on Information and Communication Technology Research (ICTRC), Abu Dhabi, UAE, 17–19 May 2015; pp. 282–285.
- Strumberger, I.; Bacanin, N.; Tuba, M.; Tuba, E. Resource Scheduling in Cloud Computing Based on a Hybridized Whale Optimization Algorithm. Appl. Sci. 2019, 9, 4893.
- Strumberger, I.; Tuba, E.; Bacanin, N.; Tuba, M. Hybrid Elephant Herding Optimization Approach for Cloud Computing Load Scheduling. In Swarm, Evolutionary, and Memetic Computing and Fuzzy and Neural Computing; Zamuda, A., Das, S., Suganthan, P.N., Panigrahi, B.K., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 201–212.
- Strumberger, I.; Tuba, E.; Bacanin, N.; Tuba, M. Dynamic Tree Growth Algorithm for Load Scheduling in Cloud Environments. In Proceedings of the 2019 IEEE Congress on Evolutionary Computation (CEC), Wellington, New Zealand, 10–13 June 2019; pp. 65–72.
- Magud, O.; Tuba, E.; Bacanin, N. Medical ultrasound image speckle noise reduction by adaptive median filter. WSEAS Trans. Biol. Biomed. 2017, 14, 38–46.
- Hrosik, R.C.; Tuba, E.; Dolicanin, E.; Jovanovic, R.; Tuba, M. Brain Image Segmentation Based on Firefly Algorithm Combined with K-means Clustering. Stud. Inform. Control 2019, 28, 167–176.
- Tuba, M.; Bacanin, N.; Alihodzic, A. Multilevel image thresholding by fireworks algorithm. In Proceedings of the 2015 25th International Conference Radioelektronika (RADIOELEKTRONIKA), Pardubice, Czech Republic, 21–22 April 2015; pp. 326–330.
- Tuba, M.; Alihodzic, A.; Bacanin, N. Cuckoo Search and Bat Algorithm Applied to Training Feed-Forward Neural Networks. In Recent Advances in Swarm Intelligence and Evolutionary Computation; Springer International Publishing: Cham, Switzerland, 2015; pp. 139–162.
- Tuba, E.; Strumberger, I.; Bacanin, N.; Tuba, M. Bare Bones Fireworks Algorithm for Capacitated p-Median Problem. In Advances in Swarm Intelligence; Tan, Y., Shi, Y., Tang, Q., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 283–291.
- Sulaiman, N.; Mohamad-Saleh, J.; Abro, A.G. A hybrid algorithm of ABC variant and enhanced EGS local search technique for enhanced optimization performance. Eng. Appl. Artif. Intell. 2018, 74, 10–22.
- Ghosh, S.; Kaur, M.; Bhullar, S.; Karar, V. Hybrid ABC-BAT for Solving Short-Term Hydrothermal Scheduling Problems. Energies 2019, 12, 551.
- Bacanin, N.; Tuba, E.; Bezdan, T.; Strumberger, I.; Tuba, M. Artificial Flora Optimization Algorithm for Task Scheduling in Cloud Computing Environment. In Proceedings of the International Conference on Intelligent Data Engineering and Automated Learning, Manchester, UK, 14–16 November 2019; pp. 437–445.
- Tuba, E.; Strumberger, I.; Bacanin, N.; Zivkovic, D.; Tuba, M. Acute Lymphoblastic Leukemia Cell Detection in Microscopic Digital Images Based on Shape and Texture Features. In Advances in Swarm Intelligence; Tan, Y., Shi, Y., Niu, B., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 142–151.
- Goldberg, D.E. Genetic Algorithms in Search, Optimization and Machine Learning, 1st ed.; Addison-Wesley Longman Publishing Co., Inc.: Boston, MA, USA, 1989.
- Fogel, D. Evolutionary Computation: Toward a New Philosophy of Machine Intelligence; IEEE Series on Computational Intelligence; Wiley: Hoboken, NJ, USA, 2006.
- Beyer, H.G.; Schwefel, H.P. Evolution strategies—A comprehensive introduction. Nat. Comput. 2002, 1, 3–52.
- Gao, Z.; Li, Y.; Yang, Y.; Wang, X.; Dong, N.; Chiang, H.D. A GPSO-optimized convolutional neural networks for EEG-based emotion recognition. Neurocomputing 2020, 380, 225–235.
- Martín, A.; Vargas, V.M.; Gutiérrez, P.A.; Camacho, D.; Hervás-Martínez, C. Optimising Convolutional Neural Networks using a Hybrid Statistically-driven Coral Reef Optimisation algorithm. Appl. Soft Comput. 2020, 90, 106144.
- Anaraki, A.K.; Ayati, M.; Kazemi, F. Magnetic resonance imaging-based brain tumor grades classification and grading via convolutional neural networks and genetic algorithms. Biocybern. Biomed. Eng. 2019, 39, 63–74.
- Fernando, C.; Banarse, D.; Reynolds, M.; Besse, F.; Pfau, D.; Jaderberg, M.; Lanctot, M.; Wierstra, D. Convolution by Evolution: Differentiable Pattern Producing Networks. In Proceedings of the Genetic and Evolutionary Computation Conference 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 109–116.
- Davison, J. DEvol: Automated Deep Neural Network Design via Genetic Programming. Available online: https://github.com/joeddav/devol (accessed on 1 March 2020).
- Karaboga, D.; Akay, B. A modified Artificial Bee Colony (ABC) Algorithm for constrained optimization problems. Appl. Soft Comput. 2011, 11, 3021–3031.
- Ghanem, W.A.; Jantan, A. Hybridizing artificial bee colony with monarch butterfly optimization for numerical optimization problems. Neural Comput. Appl. 2018, 30, 163–181.
- Tuba, M.; Bacanin, N. Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems. Neurocomputing 2014, 143, 197–207.
- LeCun, Y.; Cortes, C. MNIST Handwritten Digit Database. 2010. Available online: http://yann.lecun.com/exdb/mnist/ (accessed on 1 March 2020).
- Jarrett, K.; Kavukcuoglu, K.; Ranzato, M.; LeCun, Y. What is the best multi-stage architecture for object recognition? In Proceedings of the 2009 IEEE 12th International Conference on Computer Vision, Kyoto, Japan, 29 September–2 October 2009; pp. 2146–2153.
- Xu, Y.F.; Lu, W.; Rabinowitz, J.D. Avoiding Misannotation of In-Source Fragmentation Products as Cellular Metabolites in Liquid Chromatography–Mass Spectrometry-Based Metabolomics. Anal. Chem. 2015, 87, 2273–2281.
- Verbancsics, P.; Harguess, J. Generative NeuroEvolution for Deep Learning. arXiv 2013, arXiv:1312.5355.
- Desell, T. Large Scale Evolution of Convolutional Neural Networks Using Volunteer Computing. arXiv 2017, arXiv:1703.05422.
- Baldominos, A.; Saez, Y.; Isasi, P. Model selection in committees of evolved convolutional neural networks using genetic algorithms. In Proceedings of the International Conference on Intelligent Data Engineering and Automated Learning, Madrid, Spain, 21–23 November 2018; pp. 364–373.
Parameter | Notation | Value
---|---|---
Population of the solutions | NP | 50
Sub-population 1 | NP1 | 21
Sub-population 2 | NP2 | 29
Ratio of migration | p | 5/12
Period of migration | peri | 1.2
Max step size | S_max | 1.0
Butterfly adjusting rate | BAR | 5/12
Exhaustiveness | limit | 4
Discarding mechanism trigger | dmt | 33
Rate of modification | MR | 0.8
Initial value for randomization parameter | α | 0.5
Light absorption coefficient | γ | 1.0
Attractiveness at r = 0 | β0 | 0.2
ID | Function Name | Function Definition
---|---|---
F1 | Ackley | $f(x) = -20\exp\left(-0.2\sqrt{\tfrac{1}{D}\sum_{i=1}^{D}x_i^2}\right) - \exp\left(\tfrac{1}{D}\sum_{i=1}^{D}\cos(2\pi x_i)\right) + 20 + e$
F2 | Alpine | $f(x) = \sum_{i=1}^{D}\lvert x_i \sin x_i + 0.1x_i\rvert$
F3 | Brown | $f(x) = \sum_{i=1}^{D-1}\left[(x_i^2)^{(x_{i+1}^2+1)} + (x_{i+1}^2)^{(x_i^2+1)}\right]$
F4 | Dixon and Price | $f(x) = (x_1-1)^2 + \sum_{i=2}^{D} i\,(2x_i^2 - x_{i-1})^2$
F5 | Fletcher–Powell | $f(x) = \sum_{i=1}^{D}(A_i - B_i)^2$,
 | | where $A_i = \sum_{j=1}^{D}(a_{ij}\sin\alpha_j + b_{ij}\cos\alpha_j)$ and $B_i = \sum_{j=1}^{D}(a_{ij}\sin x_j + b_{ij}\cos x_j)$
F6 | Griewank | $f(x) = \tfrac{1}{4000}\sum_{i=1}^{D}x_i^2 - \prod_{i=1}^{D}\cos\left(\tfrac{x_i}{\sqrt{i}}\right) + 1$
F7 | Holzman 2 function | $f(x) = \sum_{i=1}^{D} i\,x_i^4$
F8 | Lévy 3 function | $f(x) = \sin^2(3\pi x_1) + \sum_{i=1}^{D-1}(x_i-1)^2\left[1+\sin^2(3\pi x_{i+1})\right] + (x_D-1)\left[1+\sin^2(2\pi x_D)\right]$
F9 | Pathological function | $f(x) = \sum_{i=1}^{D-1}\left(0.5 + \frac{\sin^2\sqrt{100x_i^2 + x_{i+1}^2} - 0.5}{1 + 0.001\,(x_i^2 - 2x_ix_{i+1} + x_{i+1}^2)^2}\right)$
F10 | Generalized Penalized Function 1 | $f(x) = \tfrac{\pi}{D}\left\{10\sin^2(\pi y_1) + \sum_{i=1}^{D-1}(y_i-1)^2\left[1+10\sin^2(\pi y_{i+1})\right] + (y_D-1)^2\right\} + \sum_{i=1}^{D}u(x_i,10,100,4)$,
 | | where $y_i = 1 + \tfrac{x_i+1}{4}$ and $u(x_i,a,k,m) = k(x_i-a)^m$ if $x_i > a$; $0$ if $-a \le x_i \le a$; $k(-x_i-a)^m$ if $x_i < -a$
F11 | Generalized Penalized Function 2 | $f(x) = 0.1\left\{\sin^2(3\pi x_1) + \sum_{i=1}^{D-1}(x_i-1)^2\left[1+\sin^2(3\pi x_{i+1})\right] + (x_D-1)^2\left[1+\sin^2(2\pi x_D)\right]\right\} + \sum_{i=1}^{D}u(x_i,5,100,4)$,
 | | where $u$ is defined as for F10
F12 | Perm | $f(x) = \sum_{k=1}^{D}\left[\sum_{i=1}^{D}(i^k+\beta)\left(\left(\tfrac{x_i}{i}\right)^k - 1\right)\right]^2$
F13 | Powell | $f(x) = \sum_{i=1}^{D/4}\left[(x_{4i-3}+10x_{4i-2})^2 + 5(x_{4i-1}-x_{4i})^2 + (x_{4i-2}-2x_{4i-1})^4 + 10(x_{4i-3}-x_{4i})^4\right]$
F14 | Quartic with noise | $f(x) = \sum_{i=1}^{D} i\,x_i^4 + \mathrm{rand}[0,1)$
F15 | Rastrigin | $f(x) = \sum_{i=1}^{D}\left[x_i^2 - 10\cos(2\pi x_i) + 10\right]$
F16 | Rosenbrock | $f(x) = \sum_{i=1}^{D-1}\left[100\,(x_{i+1}-x_i^2)^2 + (x_i-1)^2\right]$
F17 | Schwefel 2.26 | $f(x) = 418.9829\,D - \sum_{i=1}^{D} x_i \sin\left(\sqrt{\lvert x_i\rvert}\right)$
F18 | Schwefel 1.2 | $f(x) = \sum_{i=1}^{D}\left(\sum_{j=1}^{i}x_j\right)^2$
F19 | Schwefel 2.22 | $f(x) = \sum_{i=1}^{D}\lvert x_i\rvert + \prod_{i=1}^{D}\lvert x_i\rvert$
F20 | Schwefel 2.21 | $f(x) = \max_{1\le i\le D}\lvert x_i\rvert$
F21 | Sphere | $f(x) = \sum_{i=1}^{D}x_i^2$
F22 | Step | $f(x) = \sum_{i=1}^{D}\left(\lfloor x_i + 0.5\rfloor\right)^2$
F23 | Sum function | $f(x) = \sum_{i=1}^{D} i\,x_i^2$
F24 | Zakharov | $f(x) = \sum_{i=1}^{D}x_i^2 + \left(\tfrac{1}{2}\sum_{i=1}^{D} i\,x_i\right)^2 + \left(\tfrac{1}{2}\sum_{i=1}^{D} i\,x_i\right)^4$
F25 | Wavy 1 | $f(x) = 1 - \tfrac{1}{D}\sum_{i=1}^{D}\cos(10\,x_i)\,e^{-x_i^2/2}$
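As a sanity check on the definitions above, here are two of the benchmarks written as Python fitness callbacks (standard forms matching the table; any of the optimizers in this study can minimize them directly):

```python
import numpy as np

def ackley(x):            # F1, global minimum f(0) = 0
    d = len(x)
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / d))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / d) + 20.0 + np.e)

def rastrigin(x):         # F15, global minimum f(0) = 0
    return np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10)

print(ackley(np.zeros(30)), rastrigin(np.zeros(30)))  # both evaluate to ~0.0
```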
ID | Global Minimum | MBO Best | MBO Mean | MBO StdDev | MBO Worst | GCMBO Best | GCMBO Mean | GCMBO StdDev | GCMBO Worst | MBO-ABCFE Best | MBO-ABCFE Mean | MBO-ABCFE StdDev | MBO-ABCFE Worst
---|---|---|---|---|---|---|---|---|---|---|---|---|---
F1 | 0 | 0.01 | 11.43 | 6.43 | 18.83 | 4.24 | 4.62 | 14.57 | 0.00 | 2.21 | 2.96 | 11.47 | |
F2 | 0 | 7.51 | 10.37 | 39.49 | 0.03 | 0.20 | 1.44 | 0.00 | 0.01 | 0.15 | 1.06 | ||
F3 | 0 | 48.58 | 102.82 | 494.07 | 0.66 | 3.06 | 21.46 | 0.00 | 0.32 | 2.28 | 8.39 | ||
F4 | 0 | 155.44 | 3.2 | 0.18 | 4.4 | 0.07 | |||||||
F5 | 0 | 1.2 | 3.0 | 1.1 | 6.6 | 2.5 | 1.2 | 5.3 | 2.6 | 1.19 | 2.81 | 4.22 | 9.85 |
F6 | 0 | 1.01 | 93.72 | 94.71 | 342.68 | 1.00 | 20.74 | 21.69 | 83.11 | 1.00 | 28.52 | 30.74 | 97.51 |
F7 | 0 | 2.3 | 6.2 | 5.9 | 1.9 | 1.2 | 1.9 | 3.6 | 1.8 | 7.21 | 382.21 | 425.32 | 954.22 |
F8 | 0 | 2.5 | 20.58 | 33.27 | 113.39 | 2.2 | 2.11 | 5.00 | 20.12 | 0.00 | 1.18 | 3.85 | 9.53 |
F9 | 0 | 0.04 | 1.62 | 0.94 | 3.51 | 2.2 | 0.79 | 0.77 | 2.98 | 0.00 | 0.47 | 0.53 | 1.52 |
F10 | 0 | 2.5 | 3.2 | 6.5 | 2.8 | 9.8 | 3.1 | 1.2 | 7.5 | 5.28 | 5.97 | 1.84 | 5.74 |
F11 | 0 | 1.6 | 7.9 | 1.3 | 4.6 | 2.2 | 1.1 | 5.6 | 3.9 | 0.00 | 3.1 | 2.8 | 9.7 |
F12 | 0 | 2.3 | 5.9 | 1.1 | 6.0 | 5.7 | 1.5 | 2.0 | 6.0 | 5.87 | 4.58 | 8.54 | 9.57 |
F13 | 0 | 0.04 | 2.1 | 1.9 | 6.4 | 2.2 | 435.79 | 562.68 | 2.8 | 0.00 | 211.20 | 429.32 | 895.25 |
F14 | 0 | 1.2 | 36.86 | 35.96 | 134.22 | 2.2 | 0.09 | 0.31 | 1.90 | 0.00 | 0.03 | 0.15 | 1.07 |
F15 | 0 | 4.7 | 41.18 | 36.19 | 119.22 | 2.2 | 7.71 | 8.49 | 28.26 | 0.00 | 3.52 | 5.69 | 18.28 |
F16 | 0 | 1.9 | 969.30 | 1.7 | 7.8 | 2.2 | 69.97 | 116.50 | 414.22 | 0.00 | 45.37 | 87.31 | 311.20 |
F17 | 0 | 1.99 | 3.0 | 1.9 | 5.6 | 2.5 | 1.0 | 1.0 | 3.7 | 4.68 | 2.31 | 5.41 | 3.25 |
F18 | 0 | 1.4 | 2.5 | 1.5 | 5.5 | 0.05 | 1.1 | 8.5 | 3.3 | 0.02 | 8.32 | 8.42 | 9.35 |
F19 | 0 | 8.87 | 20.08 | 22.86 | 75.65 | 2.20 | 2.46 | 4.61 | 20.45 | 0.00 | 1.27 | 3.81 | 18.65 |
F20 | 0 | 0.76 | 30.53 | 20.87 | 77.54 | 0.13 | 26.63 | 18.90 | 61.74 | 0.08 | 15.21 | 16.78 | 58.32 |
F21 | 0 | 9.72 | 20.49 | 39.06 | 147.90 | 2.20 | 0.12 | 0.43 | 2.55 | 0.00 | 0.09 | 0.18 | 1.72 |
F22 | 0 | 1.00 | 22.74 | 35.69 | 125.00 | 1.00 | 1.72 | 2.94 | 20.00 | 1.00 | 0.89 | 1.31 | 17.21 |
F23 | 0 | 4.30 | 1.81 | 1.39 | 4.32 | 2.20 | 120.90 | 165.80 | 816.90 | 0.00 | 79.25 | 112.52 | 297.52 |
F24 | 0 | 6.36 | 328.80 | 223.90 | 831.40 | 2.40 | 137.20 | 128.70 | 452.90 | 1.38 | 52.68 | 85.36 | 584.32 |
F25 | 0 | 0.04 | 320.00 | 281.20 | 1.07 | 3.43 | 108.60 | 142.30 | 578.80 | 1.68 | 54.33 | 118.32 | 225.36 |
ID | Global Minimum | MBO Best | MBO Mean | MBO StdDev | MBO Worst | GCMBO Best | GCMBO Mean | GCMBO StdDev | GCMBO Worst | MBO-ABCFE Best | MBO-ABCFE Mean | MBO-ABCFE StdDev | MBO-ABCFE Worst
---|---|---|---|---|---|---|---|---|---|---|---|---|---
F1 | 0 | 0.01 | 14.68 | 5.63 | 19.10 | 6.94 | 8.19 | 5.25 | 16.33 | 8.52 | 3.21 | 4.92 | 12.28 |
F2 | 0 | 4.18 | 37.36 | 35.50 | 103.00 | 1.76 | 4.53 | 22.85 | 0.00 | 0.18 | 2.17 | 18.71 | |
F3 | 0 | 0.13 | 6.2 | 2.4 | 1.35 | 7.30 | 23.87 | 53.71 | 294.60 | 1.54 | 19.38 | 24.71 | 117.32 |
F4 | 0 | 1.58 | 1.0 | 8.5 | 2.65 | 1.32 | 1.2 | 1.3 | 5.22 | 0.36 | 8.39 | 1.21 | 6.17 |
F5 | 0 | 1.45 | 2.7 | 7.2 | 4.71 | 4.14 | 8.4 | 2.2 | 1.28 | 0.83 | 7.12 | 3.56 | 2.83 |
F6 | 0 | 1.00 | 341.06 | 287.07 | 827.20 | 1.00 | 65.66 | 71.85 | 262.60 | 1.00 | 73.16 | 132.58 | 285.36 |
F7 | 0 | 3.86 | 5.1 | 3.9 | 1.24 | 2.20 | 4.7 | 5.1 | 2.07 | 0.00 | 0.39 | 0.97 | 5.69 |
F8 | 0 | 2.15 | 103.80 | 118.35 | 392.30 | 3.28 | 12.24 | 18.15 | 60.17 | 2.65 | 0.69 | 8.63 | 58.24 |
F9 | 0 | 0.08 | 5.73 | 3.08 | 10.69 | 0.02 | 1.97 | 1.67 | 5.79 | 0.06 | 2.05 | 2.98 | 7.51 |
F10 | 0 | 3.00 | 2.9 | 3.3 | 1.04 | 3.97 | 1.7 | 4.0 | 2.10 | 8.67 | 8.37 | 4.72 | 5.31 |
F11 | 0 | 3.69 | 5.3 | 6.3 | 1.89 | 2.01 | 1.1 | 3.0 | 1.46 | 9.75 | 4.38 | 6.33 | 1.57 |
F12 | 0 | 1.70 | 1.4 | 3.4 | 1.34 | 7.52 | 1.1 | 2.3 | 1.34 | 8.15 | 5.74 | 8.91 | 1.35 |
F13 | 0 | 3.04 | 7.3 | 7.9 | 2.56 | 2.20 | 1.7 | 2.1 | 8.23 | 0.00 | 0.09 | 1.12 | 1.97 |
F14 | 0 | 3.81 | 272.95 | 231.23 | 641.30 | 2.20 | 11.60 | 19.85 | 97.64 | 0.00 | 8.11 | 14.92 | 32.15 |
F15 | 0 | 3.66 | 162.21 | 113.05 | 317.92 | 2.20 | 32.20 | 25.02 | 99.73 | 0.00 | 71.98 | 115.32 | 458.21 |
F16 | 0 | 0.02 | 7.3 | 8.0 | 2.3 | 2.20 | 298.88 | 449.49 | 1.4 | 0.00 | 115.21 | 389.27 | 536.25 |
F17 | 0 | 0.21 | 8.4 | 3.8 | 1.4 | 5.09 | 3.9 | 2.5 | 8.1 | 0.05 | 0.81 | 5.32 | 1.58 |
F18 | 0 | 18.87 | 1.2 | 6.9 | 3.0 | 0.35 | 4.9 | 3.5 | 1.2 | 0.01 | 1.2 | 1.13 | 3.91 |
F19 | 0 | 5.77 | 88.24 | 67.55 | 177.00 | 2.20 | 13.94 | 20.79 | 71.94 | 0.00 | 8.21 | 15.32 | 38.27 |
F20 | 0 | 0.48 | 40.07 | 27.64 | 91.91 | 0.36 | 36.74 | 25.71 | 83.00 | 0.07 | 21.92 | 19.24 | 48.25 |
F21 | 0 | 2.71 | 123.60 | 115.50 | 307.50 | 2.20 | 7.96 | 16.57 | 63.77 | 0.00 | 4.85 | 7.36 | 41.28 |
F22 | 0 | 1.00 | 117.90 | 120.80 | 326.00 | 1.00 | 13.02 | 19.16 | 80.00 | 1.00 | 8.52 | 17.23 | 56.34 |
F23 | 0 | 1.18 | 1.27 | 7.61 | 2.24 | 3.38 | 2.26 | 1.84 | 7.56 | 1.15 | 23.54 | 45.89 | 98.54 |
F24 | 0 | 0.03 | 2.45 | 1.45 | 1.06 | 2.31 | 600.70 | 314.90 | 1.14 | 0.01 | 81.37 | 96.24 | 248.54 |
F25 | 0 | 15.61 | 1.53 | 981.90 | 3.32 | 8.33 | 507.40 | 368.30 | 1.48 | 1.25 | 123.11 | 251.25 | 652.54 |
ID | Global Minimum | HAM Mean | HAM StdDev | ABC Mean | ABC StdDev | ACO Mean | ACO StdDev | PSO Mean | PSO StdDev | MBO-ABCFE Mean | MBO-ABCFE StdDev
---|---|---|---|---|---|---|---|---|---|---|---
F1 | 0 | 2.46 | 5.08 | 8.44 | 1.42 | 1.16 | 1.50 | 1.36 | 1.61 | 0.00 | 0.72 |
F6 | 0 | 9.87 | 2.15 | 1.43 | 1.33 | 4.49 | 1.34 | 4.31 | 8.21 | 1.00 | 7.35 |
F10 | 0 | 5.64 | 1.19 | 1.66 | 1.72 | 1.57 | 8.26 | 1.53 | 7.23 | 3.12 | 4.32 |
F11 | 0 | 4.94 | 7.50 | 4.13 | 1.54 | 1.35 | 1.60 | 4.01 | 2.73 | 0.00 | 2.83 |
F14 | 0 | 4.69 | 5.79 | 6.85 | 1.10 | 1.22 | 1.12 | 7.58 | 3.39 | 0.00 | 0.03 |
F15 | 0 | 3.99 | 4.46 | 3.09 | 6.72 | 1.02 | 1.59 | 1.29 | 1.65 | 0.00 | 3.52 |
F16 | 0 | 7.54 | 1.79 | 7.54 | 1.79 | 8.56 | 1.91 | 2.22 | 6.13 | 0.00 | 35.21 |
F18 | 0 | 3.93 | 4.21 | 8.92 | 1.62 | 2.78 | 7.95 | 3.17 | 8.54 | 0.02 | 2.19 |
F19 | 0 | 7.66 | 1.38 | 9.01 | 3.21 | 1.49 | 4.98 | 2.52 | 4.76 | 0.00 | 1.27 |
F20 | 0 | 2.67 | 5.58 | 4.12 | 6.07 | 1.91 | 3.97 | 3.20 | 5.30 | 0.01 | 9.21 |
F21 | 0 | 8.07 | 2.26 | 7.13 | 3.66 | 1.33 | 3.32 | 1.46 | 2.45 | 0.00 | 0.09 |
F22 | 0 | 5.07 | 5.16 | 8.03 | 1.41 | 5.33 | 1.96 | 5.39 | 9.09 | 0.79 | 0.98 |
Category | Hyperparameter | Notation |
---|---|---|
Convolutional layer | Number of convolutional layers | |
Number of filters | ||
Filter size | ||
Activation function | ||
Pooling layer size | ||
Fully-connected layer | Number of fully-connected layers | |
Connectivity pattern | ||
Number of units | ||
Activation function | ||
Weight regularization | ||
Dropout | d | |
General hyperparameters | Batch size | |
Learning rule | ||
Learning rate |
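The table above fixes the search space that each solution vector encodes. Below is a minimal sketch of how such a vector could be decoded into a trainable CNN with the Keras API; the encoding order, value ranges, MNIST input shape, and the helper name decode_and_build are illustrative assumptions rather than the paper's exact encoding.

```python
import tensorflow as tf

def decode_and_build(sol):
    """sol = [n_conv, n_filters, filter_size, pool_size, n_dense, n_units, dropout, lr]
    (an assumed ordering of the hyperparameters listed in the table above)."""
    n_conv, n_filt, fsize, psize = int(sol[0]), int(sol[1]), int(sol[2]), int(sol[3])
    n_dense, n_units, drop, lr = int(sol[4]), int(sol[5]), float(sol[6]), float(sol[7])
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.InputLayer(input_shape=(28, 28, 1)))   # MNIST-sized input
    for _ in range(n_conv):                                          # convolutional block
        model.add(tf.keras.layers.Conv2D(n_filt, fsize, padding="same", activation="relu"))
        model.add(tf.keras.layers.MaxPooling2D(psize))
    model.add(tf.keras.layers.Flatten())
    for _ in range(n_dense):                                         # fully-connected block
        model.add(tf.keras.layers.Dense(n_units, activation="relu"))
        model.add(tf.keras.layers.Dropout(drop))                     # dropout rate d
    model.add(tf.keras.layers.Dense(10, activation="softmax"))       # 10 MNIST classes
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Example: decode one candidate solution into a trainable CNN
model = decode_and_build([2, 32, 3, 2, 1, 128, 0.25, 1e-3])
```

The fitness of each candidate is then the validation performance of the trained network, which is what the metaheuristic maximizes.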
Structure No. | Mean (%) | StdDev (%) | Median (%) | Minimum (%) | Maximum (%)
---|---|---|---|---|---
1 | 0.5015 | 0.0609 | 0.505 | 0.41 | 0.63 |
2 | 0.5200 | 0.0219 | 0.520 | 0.47 | 0.55 |
3 | 0.4995 | 0.0387 | 0.515 | 0.42 | 0.56 |
4 | 0.4845 | 0.0562 | 0.485 | 0.37 | 0.57 |
5 | 0.4330 | 0.0320 | 0.440 | 0.39 | 0.48 |
6 | 0.5110 | 0.0319 | 0.515 | 0.45 | 0.57 |
7 | 0.4035 | 0.0282 | 0.410 | 0.36 | 0.45 |
8 | 0.5340 | 0.0348 | 0.545 | 0.46 | 0.59 |
9 | 0.4520 | 0.0314 | 0.450 | 0.39 | 0.5 |
10 | 0.5470 | 0.0435 | 0.555 | 0.46 | 0.6 |
11 | 0.4535 | 0.0348 | 0.450 | 0.41 | 0.52 |
12 | 0.5080 | 0.0595 | 0.505 | 0.42 | 0.61 |
13 | 0.5085 | 0.0395 | 0.490 | 0.46 | 0.59 |
14 | 0.5685 | 0.0385 | 0.575 | 0.49 | 0.62 |
15 | 0.4650 | 0.0380 | 0.475 | 0.40 | 0.52 |
16 | 0.4965 | 0.0320 | 0.500 | 0.44 | 0.54 |
17 | 0.5570 | 0.0762 | 0.540 | 0.44 | 0.71 |
18 | 0.5765 | 0.0385 | 0.595 | 0.50 | 0.62 |
19 | 0.5510 | 0.0465 | 0.535 | 0.49 | 0.65 |
20 | 0.5185 | 0.0283 | 0.520 | 0.47 | 0.58 |
Structure No. | Mean (%) | StdDev (%) | Median (%) | Minimum (%) | Maximum (%)
---|---|---|---|---|---
1 | 0.5250 | 0.0326 | 0.525 | 0.47 | 0.59 |
2 | 0.4775 | 0.0497 | 0.475 | 0.40 | 0.58 |
3 | 0.5215 | 0.0217 | 0.520 | 0.47 | 0.56 |
4 | 0.4490 | 0.0460 | 0.450 | 0.36 | 0.53 |
5 | 0.5030 | 0.0332 | 0.500 | 0.44 | 0.57 |
6 | 0.5155 | 0.0277 | 0.520 | 0.45 | 0.57 |
7 | 0.4925 | 0.0288 | 0.485 | 0.45 | 0.55 |
8 | 0.4745 | 0.0401 | 0.475 | 0.41 | 0.54 |
9 | 0.4355 | 0.0353 | 0.430 | 0.38 | 0.48 |
10 | 0.4385 | 0.0307 | 0.450 | 0.39 | 0.49 |
11 | 0.4520 | 0.0506 | 0.445 | 0.35 | 0.54 |
12 | 0.3650 | 0.0132 | 0.370 | 0.34 | 0.39 |
13 | 0.5055 | 0.0229 | 0.500 | 0.45 | 0.55 |
14 | 0.4680 | 0.0294 | 0.460 | 0.42 | 0.51 |
15 | 0.4600 | 0.0446 | 0.460 | 0.40 | 0.58 |
16 | 0.4960 | 0.0559 | 0.510 | 0.42 | 0.59 |
17 | 0.4800 | 0.0288 | 0.485 | 0.43 | 0.54 |
18 | 0.4855 | 0.0319 | 0.480 | 0.44 | 0.56 |
19 | 0.4120 | 0.0220 | 0.415 | 0.37 | 0.45 |
20 | 0.4400 | 0.0383 | 0.435 | 0.38 | 0.49 |
Method | Error Rate (%) |
---|---|
CNN LeNet-5 [1] | 0.95 |
CNN (2 conv, 1 dense, ReLU) with DropConnect [22] | 0.57 |
CNN (2 conv, 1 dense, ReLU) with dropout [22] | 0.52 |
CNN (3 conv maxout, 1 dense) with dropout [101] | 0.45 |
CNN with multi-loss regularization [102] | 0.42 |
Verbancsics et al. [103] | 7.9 |
EXACT [104] | 1.68 |
DEvol [105] | 0.60 |
Baldominos et al. [31] | 0.37 |
MBO-CNN | 0.36 |
MBO-ABCFE-CNN | 0.34 |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).