Optimizing a Multi-Layer Perceptron Based on an Improved Gray Wolf Algorithm to Identify Plant Diseases
Abstract
1. Introduction
1.1. Motivations behind the Present Work
- The performance of the GWO should be improved so that it converges as closely as possible to the global optimal solution.
- Local stagnation should be relieved.
- The potential defects of the MLP include overfitting, difficulty in determining the optimal structure, long training times, and a tendency to fall into locally optimal solutions. The improved GWO-MLP alleviates these problems and increases the optimization capability of the MLP, improving the classification rate and relieving local stagnation.
1.2. Contribution of This Study
- The non-linear variation of the parameter a in EGWO contributes to the balance between the exploration and exploitation capabilities.
- The introduction of the chaotic disturbance mechanism promotes search diversity.
- The candidate migration mechanism preserves the accuracy of the global optimal solution, strengthening the global convergence ability.
- The attacking mechanism guarantees a trade-off between the exploration and exploitation capabilities.
- The EGWO-MLP model is built to identify crop diseases.
2. Literature Review
2.1. Metaheuristic Optimization Algorithms (MOAs)
- (1) Evolutionary computation algorithms
- (2) Swarm intelligence optimization algorithms
- (3) Physics-based optimization algorithms
- (4) Human-based optimization algorithms
2.2. Improved GWO and Its Application
- Mechanism Innovation
- Practical Application
- Feature Selection
- Optimization of the Artificial Neural Network (ANN)
- Algorithm Optimization of the MLP
3. Method
3.1. Multi-Layer Perceptron
3.2. Gray Wolf Optimizer
3.2.1. Principle of Motion
- To track and approach prey.
- To harass, chase, and surround prey until the prey stops moving.
- To attack the prey.
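These three phases map onto the canonical GWO update equations, in which the pack's three best wolves (alpha, beta, and delta) each propose a move and every other wolf averages the proposals: A = 2a·r1 − a, C = 2·r2, D = |C·X_leader − X|, with a decreased from 2 to 0 over the iterations. Below is a minimal NumPy sketch of this standard update (the function and variable names are ours, not the paper's):

```python
import numpy as np

def gwo_step(wolves, fitness, a, lb, ub, rng):
    """One iteration of the canonical GWO position update.

    wolves:  (N, D) array of candidate solutions
    fitness: (N,) array, lower is better
    a:       control parameter, decreased from 2 to 0 over the run
    """
    # Rank the pack: alpha, beta, delta are the three best wolves.
    order = np.argsort(fitness)
    alpha, beta, delta = wolves[order[0]], wolves[order[1]], wolves[order[2]]

    new_wolves = np.empty_like(wolves)
    for i, x in enumerate(wolves):
        guided = []
        for leader in (alpha, beta, delta):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            A = 2 * a * r1 - a          # |A| > 1 favors exploration, |A| < 1 exploitation
            C = 2 * r2
            D = np.abs(C * leader - x)  # distance between this wolf and the leader
            guided.append(leader - A * D)
        # The new position averages the three leader-guided moves.
        new_wolves[i] = np.clip(np.mean(guided, axis=0), lb, ub)
    return new_wolves
```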
3.2.2. Insufficiency of the Algorithm
3.3. The Proposed Enhanced Gray Wolf Optimizer Algorithm (EGWO)
3.3.1. Chaotic Disturbance
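The exact disturbance formula is defined in this subsection of the original article; as a generic illustration only, a chaotic map such as the logistic map is a common source of non-repeating perturbations in the GWO literature. The sketch below is written under that assumption, and the perturbation rule and parameter names are illustrative, not the paper's:

```python
import numpy as np

def logistic_chaos(n, x0=0.7, mu=4.0):
    """Logistic map x_{k+1} = mu * x_k * (1 - x_k); fully chaotic on (0, 1) for mu = 4."""
    seq, x = np.empty(n), x0
    for k in range(n):
        x = mu * x * (1.0 - x)
        seq[k] = x
    return seq

def chaotic_disturb(x, lb, ub, eps=0.1):
    """Nudge a solution toward a chaotic point in [lb, ub]; eps sets the strength."""
    c = lb + logistic_chaos(x.size) * (ub - lb)  # chaotic point inside the search range
    return np.clip(x + eps * (c - x), lb, ub)
```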
3.3.2. Candidate Migration Mechanism
3.3.3. Attacking Mechanism
3.4. Computational Complexity of EGWO Algorithm
4. Combining EGWO with the Multi-Layer Perceptron (EGWO-MLP)
4.1. EGWO-MLP Optimization Model
4.2. Encoding Mechanism
4.3. Evaluation Criteria
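In GWO-trained MLPs, each wolf is conventionally encoded as a single flat vector containing every connection weight and bias, so for one hidden layer the search dimension is (I + 1)·H + (H + 1)·O with I inputs, H hidden nodes, and O outputs; the fitness of a wolf is then typically the mean squared error (MSE) of the decoded network on the training set. A minimal sketch under these standard assumptions (all names are illustrative):

```python
import numpy as np

def unpack(vec, n_in, n_hid, n_out):
    """Decode a flat GWO position vector into MLP weights and biases."""
    i = 0
    W1 = vec[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = vec[i:i + n_hid]; i += n_hid
    W2 = vec[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = vec[i:i + n_out]
    return W1, b1, W2, b2

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse_fitness(vec, X, y, n_hid):
    """Fitness of one wolf: mean squared error of the decoded single-hidden-layer MLP."""
    n_in, n_out = X.shape[1], y.shape[1]
    W1, b1, W2, b2 = unpack(vec, n_in, n_hid, n_out)
    out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return np.mean((out - y) ** 2)
```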
4.4. Selection of Activation Function
- Its derivative reduces decay and dilution errors and mitigates signal, oscillation, and asymmetrical-input problems. The function can therefore be used for category classification and is well suited to prediction [78].
- A segmented linear recursive approximation method can compute the Sigmoid function and its derivatives in artificial neurons. This helps a neuron estimate the Sigmoid function and its derivatives more accurately during learning, so that it can better process the input data and output correct results [79].
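For concreteness, the Sigmoid function, its derivative (which takes the convenient closed form σ'(z) = σ(z)(1 − σ(z))), and a simple piecewise-linear approximation in the spirit of [79] can be sketched as follows (the breakpoint grid is our illustrative choice):

```python
import numpy as np

def sigmoid(z):
    """Logistic activation: sigma(z) = 1 / (1 + exp(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    """Derivative via the identity sigma'(z) = sigma(z) * (1 - sigma(z))."""
    s = sigmoid(z)
    return s * (1.0 - s)

def sigmoid_pwl(z, knots=np.linspace(-6.0, 6.0, 13)):
    """Piecewise-linear approximation: interpolate between exact sigmoid
    values at a fixed grid of breakpoints (values outside the grid clamp
    to the end values, which are already within 0.0025 of 0 and 1)."""
    return np.interp(z, knots, sigmoid(knots))
```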
5. Experimental Preparation
5.1. Experimental Setting
- Experimental environment. The experiment code is executed in Matlab R2015b under the Windows 10 operating system. All simulations were run on a computer with an Intel(R) Core(TM) i5-9300 CPU @ 2.40 GHz and 8 GB of memory. Thirty independent runs are performed for each task to assess predictive performance. The population size is set to 30, and the maximum number of iterations is 500 for the IEEE CEC 2014 benchmark functions and 100 for the UCI datasets used to verify the EGWO and EGWO-MLP.
- Data processing. To eliminate dimensional effects between indicators, the data are standardized so that the indicators are comparable. After standardization, all indicators are on the same order of magnitude and can be processed together. The experiments in this paper scale the data to the range [0, 1] using Min-Max normalization, computed by Equation (47).
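Equation (47) is the standard Min-Max rule x' = (x − x_min)/(x_max − x_min). A small sketch applying it per indicator (column); the guard for constant columns is our addition:

```python
import numpy as np

def min_max_normalize(X):
    """Scale each column (indicator) of X into [0, 1]:
    x' = (x - x_min) / (x_max - x_min), as in Equation (47)."""
    x_min = X.min(axis=0)
    x_max = X.max(axis=0)
    span = np.where(x_max > x_min, x_max - x_min, 1.0)  # avoid division by zero
    return (X - x_min) / span
```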
5.2. Comparison Algorithm Selection
5.2.1. GWO Variant
- Improved gray wolf optimization (IGWO) introduces an adaptive weight mechanism, which can dynamically adjust an individual’s weight according to its fitness value so that individuals with a higher fitness have more significant influence [80]. Through this mechanism, the IGWO algorithm can more effectively explore the search space and speed up the convergence. It has many applications for solving complex optimization problems, parameter optimization, feature selection, etc.
- Greedy non-hierarchical gray wolf optimizer (G-NHGWO) introduces a greedy strategy to increase the locality of the search [81]. In addition, the method also adopts a non-hierarchical optimization strategy, which avoids the use of fixed weight factors. G-NHGWO can search for the best solution more efficiently in practical problems and provide more accurate and reliable optimization results.
- Weighted distance gray wolf optimizer (WdGWO) introduces the concept of a weighted distance, which measures how close an individual wolf is to the current global best solution [52]. The WdGWO algorithm exploits the notion of social hierarchy among gray wolves to guide the search for promising regions of the solution space.
5.2.2. Traditional Algorithms
- Particle swarm optimization (PSO) has a strong ability to explore the solution space of non-convex optimization problems. It is relatively simple, and its calculation process is decoupled from the problem model. As a population-based meta-heuristic algorithm, PSO is amenable to distributed computing, which can effectively improve computing power. Its velocity-update mechanism, inertia weight, and other factors can be tuned well for parameter optimization in ANNs [38].
- Differential evolution (DE) is a heuristic random search algorithm based on population differences. The differential evolution algorithm has the advantages of simple principles, few controlled parameters, and strong robustness [82].
- The Bat Algorithm (BA) is an optimization algorithm that simulates bat swarm behavior. It offers several advantages, including parallelism, an adaptive search strategy, diversity maintenance, a relatively simple implementation, and a powerful global search capability. Its adaptivity lets it adjust the search strategy to the characteristics of the problem, improving robustness and global search ability, while randomness and exploration operations maintain population diversity, avoid local optima, and provide more comprehensive coverage of the search space [83].
- The Tree-seed Algorithm (TSA) has a simpler structure, a higher search accuracy, and a stronger robustness than some traditional intelligent optimization algorithms [84].
- The Sine-Cosine optimization algorithm (SCA) is a random optimization algorithm that is highly flexible, simple in principle, and easy to implement. It can be easily applied to optimization problems in different fields [85].
- The Butterfly Optimization Algorithm (BOA) solves global optimization problems by mimicking the food searching and mating behavior of butterflies [86]. The design framework of the algorithm is mainly based on the foraging strategy of butterflies looking for nectar or mating partners, in which butterflies use their sense of smell to determine the target location. The BOA algorithm draws on this foraging behavior and combines the characteristics of optimization algorithms to provide an effective solution for complex global optimization problems.
5.2.3. Recent Algorithms
- The Spider Wasp Optimizer (SWO) is inspired by the hunting, nesting, and mating behaviors of female spider wasps in nature [87]. Through a variety of unique update strategies, it shows promising results on optimization problems with different exploration and exploitation requirements.
- The Zebra Optimization Algorithm (ZOA) is a heuristic optimization algorithm that simulates the behavior of zebra herds [88]. It regards each candidate solution as an individual zebra and searches the solution space by imitating the foraging and migration strategies of the population.
- The Reptile Search Algorithm (RSA) is inspired by the hunting behavior of crocodiles [89]. The implementation of the algorithm includes two key steps: encirclement and hunting. This makes the RSA algorithm adaptable to different optimization problems and have better exploration and exploitation capabilities.
- The Brown-bear Optimization Algorithm (BOA) is inspired by the pedal scent-marking and sniffing behaviors that brown bears use to communicate; it provides an effective solution by simulating their strategies for finding food and marking territory [90].
- The Osprey Optimization Algorithm (OOA) mimics the behavior of ospreys in nature and is mainly inspired by their fishing strategy at sea [91]. An osprey detects the location of its prey, hunts it down, and carries it to a suitable location to consume it. The OOA can efficiently solve various optimization problems and balances exploration and exploitation during the search.
- The Cheetah Optimizer (CO) is proposed by simulating cheetahs’ hunting behavior and related strategies. The cheetah optimizer can effectively solve various optimization problems and adapt to complex environments [92].
5.3. Standard Test Set
5.3.1. IEEE CEC 2014 Benchmark Functions
5.3.2. University of California, Irvine Dataset (UCI Dataset)
6. Analysis and Discussion of Experimental Results
6.1. Analysis and Discussion of Results on IEEE CEC 2014 Benchmark Functions
6.2. Analysis and Discussion of Results on UCI Dataset
6.3. Advantages and Disadvantages
- The EGWO algorithm can quickly find the global optimal solution when solving single-mode simple function problems while ensuring the accuracy of the optimal solution.
- The EGWO-MLP model has clear advantages in solving multi-classification problems, such as fast convergence and strong stability, ensuring a high classification rate.
- In combinatorial function problems, local stagnation occurs when searching for the global optimal solution.
- The performance of the EGWO-MLP model in solving single classification problems is not very significant.
7. EGWO-MLP Identification Model
7.1. Soybean (Large) Dataset
7.2. Identification Model (EGWO-MLP) Parameter Setting
7.3. Experimental Analysis
8. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Ratnadass, A.; Fernandes, P.; Avelino, J.; Habib, R. Plant species diversity for sustainable management of crop pests and diseases in agroecosystems: A review. Agron. Sustain. Dev. 2012, 32, 142–149.
- Donatelli, M.; Magarey, R.D.; Bregaglio, S.; Willocquet, L.; Whish, J.P.; Savary, S. Modelling the impacts of pests and diseases on agricultural systems. Agric. Syst. 2017, 155, 213–224.
- Wrather, J.; Anderson, T.; Arsyad, D.; Tan, Y.; Ploper, L.D.; Porta-Puglia, A.; Ram, H.; Yorinori, J. Soybean disease loss estimates for the top ten soybean-producing countries in 1998. Can. J. Plant Pathol. 2001, 23, 115–121.
- Qin, W.; Xue, X.; Zhang, S.; Gu, W.; Wang, B. Droplet deposition and efficiency of fungicides sprayed with small UAV against wheat powdery mildew. Int. J. Agric. Biol. Eng. 2018, 11, 27–32.
- Ficke, A.; Cowger, C.; Bergstrom, G.; Brodal, G. Understanding yield loss and pathogen biology to improve disease management: Septoria nodorum blotch—A case study in wheat. Plant Dis. 2018, 102, 696–707.
- Kulkarni, O. Crop disease detection using deep learning. In Proceedings of the 2018 Fourth International Conference on Computing Communication Control and Automation (ICCUBEA 2018), Pune, India, 16–18 August 2018.
- Park, H.; JeeSook, E.; Kim, S.-H. Crops disease diagnosing using image-based deep learning mechanism. In Proceedings of the 2018 International Conference on Computing and Network Communications (CoCoNet 2018), Astana, Kazakhstan, 15–17 August 2018.
- Xiong, Y.; Liang, L.; Wang, L.; She, J.; Wu, M. Identification of cash crop diseases using automatic image segmentation algorithm and deep learning with expanded dataset. Comput. Electron. Agric. 2020, 177, 105712.
- Devi, N.; Sarma, K.K.; Laskar, S. Design of an intelligent bean cultivation approach using computer vision, IoT and spatio-temporal deep learning structures. Ecol. Inform. 2023, 75, 102044.
- Barbedo, J.G.A. Plant disease identification from individual lesions and spots using deep learning. Biosyst. Eng. 2018, 180, 96–107.
- Chen, J.; Chen, J.; Zhang, D.; Sun, Y.; Nanehkaran, Y.A. Using deep transfer learning for image-based plant disease identification. Comput. Electron. Agric. 2020, 173, 105393.
- Goncalves, J.P.; Pinto, F.A.; Queiroz, D.M.; Villar, F.M.; Barbedo, J.G.; Del Ponte, E.M. Deep learning architectures for semantic segmentation and automatic estimation of severity of foliar symptoms caused by diseases or pests. Biosyst. Eng. 2021, 210, 129–142.
- Keyvanpour, M.R.; Shirzad, M.B. Machine learning techniques for agricultural image recognition. In Application of Machine Learning in Agriculture; Elsevier: Amsterdam, The Netherlands, 2022; pp. 283–305.
- Camero, A.; Toutouh, J.; Alba, E. Random error sampling-based recurrent neural network architecture optimization. Eng. Appl. Artif. Intell. 2020, 96, 103946.
- Maurya, L.S.; Hussain, M.S.; Singh, S. Machine learning classification models for student placement prediction based on skills. Int. J. Artif. Intell. Soft Comput. 2022, 7, 194–207.
- Wang, G.; Sim, K.C. Sequential classification criteria for NNs in automatic speech recognition. In Proceedings of the Twelfth Annual Conference of the International Speech Communication Association, Florence, Italy, 27–31 August 2011.
- Baser, P.; Saini, J.R.; Kotecha, K. TomConv: An improved CNN model for diagnosis of diseases in tomato plant leaves. Procedia Comput. Sci. 2023, 218, 1825–1833.
- Hush, D.R.; Horne, B.G. Progress in supervised neural networks. IEEE Signal Process. Mag. 1993, 10, 8–39.
- Monti, F.; Boscaini, D.; Masci, J.; Rodola, E.; Svoboda, J.; Bronstein, M.M. Geometric deep learning on graphs and manifolds using mixture model CNNs. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 22–25 July 2017; pp. 5115–5124.
- Srivastava, A.; Singh, A.; Tiwari, A.K. An efficient hybrid approach for the prediction of epilepsy using CNN with LSTM. Int. J. Artif. Intell. Soft Comput. 2022, 7, 179–193.
- Mirjalili, S. How effective is the grey wolf optimizer in training Multi-Layer Perceptrons. Appl. Intell. 2015, 43, 150–161.
- Zhang, C.; Pan, X.; Li, H.; Gardiner, A.; Sargent, I.; Hare, J.; Atkinson, P.M. A hybrid MLP-CNN classifier for very fine resolution remotely sensed image classification. ISPRS J. Photogramm. Remote Sens. 2018, 140, 133–144.
- Das, H.; Jena, A.K.; Nayak, J.; Naik, B.; Behera, H. A novel PSO based Back Propagation learning-MLP (PSO-BP-MLP) for Classification. In Computational Intelligence in Data Mining-Volume 2, Proceedings of the International Conference on CIDM, 20–21 December 2014; Springer: Berlin/Heidelberg, Germany, 2015; pp. 461–471.
- Singh, K.J.; De, T. MLP-GA based algorithm to detect application layer DDoS attack. J. Inf. Secur. Appl. 2017, 36, 145–153.
- Sheikhan, M.; Mohammadi, N. Neural-based electricity load forecasting using hybrid of GA and ACO for feature selection. Neural Comput. Appl. 2012, 21, 1961–1970.
- Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Let a biogeography-based optimizer train your Multi-Layer Perceptron. Inf. Sci. 2014, 269, 188–209.
- Emary, E.; Zawbaa, H.M.; Grosan, C.; Hassenian, A.E. Feature subset selection approach by Gray-Wolf Optimization. In Proceedings of the Afro-European Conference for Industrial Advancement, Addis Ababa, Ethiopia, 17–19 November 2015; pp. 1–13.
- Meng, X.; Jiang, J.; Wang, H. AGWO: Advanced GWO in multi-layer perception optimization. Expert Syst. Appl. 2021, 173, 114676.
- Mittal, N.; Singh, U.; Sohi, B.S. Modified grey wolf optimizer for global engineering optimization. Appl. Comput. Intell. Soft Comput. 2016, 2016, 7950348.
- Zhu, A.; Xu, C.; Li, Z.; Wu, J.; Liu, Z. Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC. J. Syst. Eng. Electron. 2015, 26, 317–328.
- Kamboj, V.K. A novel hybrid PSO–GWO approach for unit commitment problem. Neural Comput. Appl. 2016, 27, 1643–1655.
- Gómez, D.; Rojas, A. An empirical overview of the No Free Lunch Theorem and its effect on real-world machine learning classification. Neural Comput. 2016, 28, 216–228.
- Shadkam, E.; Bijari, M. A novel improved cuckoo optimisation algorithm for engineering optimisation. Int. J. Artif. Intell. Soft Comput. 2020, 7, 164–177.
- Mostafa Bozorgi, S.; Yazdani, S. IWOA: An improved whale optimization algorithm for optimization problems. J. Comput. Des. Eng. 2019, 6, 243–259.
- Nadimi-Shahraki, M.H.; Taghian, S.; Mirjalili, S.; Faris, H. MTDE: An effective multi-trial vector-based differential evolution algorithm and its applications for engineering design problems. Appl. Soft Comput. 2020, 97, 106761.
- Nadimi-Shahraki, M.H.; Taghian, S.; Zamani, H.; Mirjalili, S.; Elaziz, M.A. MMKE: Multi-trial vector-based monkey king evolution algorithm and its applications for engineering optimization problems. PLoS ONE 2023, 18, e0280006.
- Mullen, R.J.; Monekosso, D.; Barman, S.; Remagnino, P. A review of ant algorithms. Expert Syst. Appl. 2009, 36, 9608–9617.
- Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN'95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948.
- Pourpanah, F.; Wang, R.; Lim, C.P.; Wang, X.Z.; Yazdani, D. A review of artificial fish swarm algorithms: Recent advances and applications. Artif. Intell. Rev. 2023, 56, 1867–1903.
- Nadimi-Shahraki, M.H.; Moeini, E.; Taghian, S.; Mirjalili, S. DMFO-CD: A discrete moth-flame optimization algorithm for community detection. Algorithms 2021, 14, 314.
- Rutenbar, R.A. Simulated annealing algorithms: An overview. IEEE Circuits Devices Mag. 1989, 5, 19–26.
- Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248.
- Li, L.L.; Chang, Y.B.; Tseng, M.L.; Liu, J.Q.; Lim, M.K. Wind power prediction using a novel model on wavelet decomposition-support vector machines-improved atomic search algorithm. J. Clean. Prod. 2020, 270, 121817.
- Erol, O.K.; Eksin, I. A new optimization method: Big bang–big crunch. Adv. Eng. Softw. 2006, 37, 106–111.
- Shaheen, A.M.; Ginidi, A.R.; El-Sehiemy, R.A.; Ghoneim, S.S. A forensic-based investigation algorithm for parameter extraction of solar cell models. IEEE Access 2020, 9, 1–20.
- Askari, Q.; Younas, I.; Saeed, M. Political Optimizer: A novel socio-inspired meta-heuristic for global optimization. Knowl.-Based Syst. 2020, 195, 105709.
- Askari, Q.; Saeed, M.; Younas, I. Heap-based optimizer inspired by corporate rank hierarchy for global optimization. Expert Syst. Appl. 2020, 161, 113702.
- Jiang, J.; Zhao, Z.; Liu, Y.; Li, W.; Wang, H. DSGWO: An improved grey wolf optimizer with diversity enhanced strategy based on group-stage competition and balance mechanisms. Knowl.-Based Syst. 2022, 250, 109100.
- Duan, Y.; Yu, X. A collaboration-based hybrid GWO-SCA optimizer for engineering optimization problems. Expert Syst. Appl. 2023, 213, 119017.
- Singh, N.; Singh, S. A novel hybrid GWO-SCA approach for optimization problems. Eng. Sci. Technol. 2017, 20, 1586–1601.
- Nadimi-Shahraki, M.H.; Taghian, S.; Mirjalili, S.; Zamani, H.; Bahreininejad, A. GGWO: Gaze cues learning-based grey wolf optimizer and its applications for solving engineering problems. J. Comput. Sci. 2022, 61, 101636.
- Malik, M.R.S.; Mohideen, E.R.; Ali, L. Weighted distance grey wolf optimizer for global optimization problems. In Proceedings of the 2015 IEEE International Conference on Computational Intelligence and Computing Research (ICCIC 2015), Madurai, India, 10–12 December 2015.
- Long, W.; Jiao, J.; Liang, X.; Tang, M. An exploration-enhanced grey wolf optimizer to solve high-dimensional numerical optimization. Eng. Appl. Artif. Intell. 2018, 68, 63–80.
- Kannan, K.; Yamini, B.; Fernandez, F.M.H.; Priyadarsini, P.U. A novel method for spectrum sensing in cognitive radio networks using fractional GWO-CS optimization. Ad Hoc Netw. 2023, 103135.
- Wang, F.; Zhao, S.; Wang, L.; Zhou, Y.; Huang, T.; Shu, X. Study on FOG scale factor error calibration in start-up stage based on GWO-GRU. Measurement 2023, 206, 112214.
- Lim, S.-J. Hybrid image embedding technique using steganographic signcryption and IWT-GWO methods. Microprocess. Microsyst. 2022, 95, 104688.
- Ocran, D.; Ikiensikimama, S.S.; Broni-Bediako, E. A compositional function hybridization of PSO and GWO for solving well placement optimization problem. Pet. Res. 2022, 7, 401–408.
- Yu, X.; Jiang, N.; Wang, X.; Li, M. A hybrid algorithm based on grey wolf optimizer and differential evolution for UAV path planning. Expert Syst. Appl. 2023, 215, 119327.
- Pan, H.; Chen, S.; Xiong, H. A high-dimensional feature selection method based on modified Gray Wolf optimization. Appl. Soft Comput. 2023, 135, 110031.
- Almomani, O. A feature selection model for network intrusion detection system based on PSO, GWO, FFA and GA algorithms. Symmetry 2020, 12, 1046.
- Dhal, P.; Azad, C. A multi-objective feature selection method using Newton's law based PSO with GWO. Appl. Soft Comput. 2021, 107, 107394.
- Tu, Q.; Chen, X.; Liu, X. Multi-strategy ensemble grey wolf optimizer and its application to feature selection. Appl. Soft Comput. 2019, 76, 16–30.
- Al-Wajih, A.R.; Abdulkadir, S.J.; Aziz, N.; Al-Tashi, Q.; Talpur, N. Hybrid binary grey wolf with Harris hawks optimizer for feature selection. IEEE Access 2021, 9, 31662–31677.
- Abdollahzadeh, B.; Gharehchopogh, F.S. A multi-objective optimization algorithm for feature selection problems. Eng. Comput. 2022, 38, 1845–1863.
- Nikoo, M.; Malekabadi, R.A.; Hafeez, G. Estimating the mechanical properties of heat-treated woods using optimization algorithms-based ANN. Measurement 2023, 207, 112354.
- Astarita, V.; Haghshenas, S.S.; Guido, G.; Vitale, A. Developing new hybrid grey wolf optimization-based artificial neural network for predicting road crash severity. Transp. Eng. 2023, 12, 100164.
- Tian, Y.; Yu, J.; Zhao, A. Predictive model of energy consumption for office building by using improved GWO-BP. Energy Rep. 2020, 6, 620–627.
- Amirsadri, S.; Mousavirad, S.J.; Ebrahimpour-Komleh, H. A Lévy flight-based grey wolf optimizer combined with back-propagation algorithm for neural network training. Neural Comput. Appl. 2018, 30, 3707–3720.
- Mosavi, A.; Samadianfard, S.; Darbandi, S.; Nabipour, N.; Qasem, S.N.; Salwana, E.; Band, S.S. Predicting soil electrical conductivity using multilayer perceptron integrated with grey wolf optimizer. J. Geochem. Explor. 2021, 220, 106639.
- Al-Badarneh, I.; Habib, M.; Aljarah, I.; Faris, H. Neuro-evolutionary models for imbalanced classification problems. J. King Saud Univ.-Comput. Inf. Sci. 2022, 34, 2787–2797.
- Pasti, R.; de Castro, L.N. Bio-inspired and gradient-based algorithms to train MLPs: The influence of diversity. Inf. Sci. 2009, 179, 1441–1453.
- Al-Majidi, S.D.; Abbod, M.F.; Al-Raweshidy, H.S. A particle swarm optimisation-trained feedforward neural network for predicting the maximum power point of a photovoltaic array. Eng. Appl. Artif. Intell. 2020, 92, 103688.
- Mirjalili, S.; Hashim, S.Z.M.; Sardroudi, H.M. Training feedforward neural networks using hybrid particle swarm optimization and gravitational search algorithm. Appl. Math. Comput. 2012, 218, 11125–11137.
- Heidari, A.A.; Faris, H.; Aljarah, I.; Mirjalili, S. An efficient hybrid multilayer perceptron neural network with grasshopper optimization. Soft Comput. 2019, 23, 7941–7958.
- Azzini, A.; Tettamanzi, A.G. Evolutionary ANNs: A state of the art survey. Intell. Artif. 2011, 25, 19–35.
- Alecsa, C.D.; Pinţa, T.; Boros, I. New optimization algorithms for neural network training using operator splitting techniques. Neural Netw. 2020, 126, 178–190.
- Ridge, B.; Gams, A.; Morimoto, J.; Ude, A. Training of deep neural networks for the generation of dynamic movement primitives. Neural Netw. 2020, 127, 121–131.
- Zhang, R.; Wang, Q.; Yang, Q.; Wei, W. Temporal link prediction via adjusted sigmoid function and 2-simplex structure. Sci. Rep. 2022, 12, 16585.
- Basterretxea, K.; Tarela, J.M.; del Campo, I. Approximation of sigmoid function and the derivative for hardware implementation of artificial neurons. IEE Proc.-Circuits Devices Syst. 2004, 151, 18–24.
- Nadimi-Shahraki, M.H.; Taghian, S.; Mirjalili, S. An improved grey wolf optimizer for solving engineering problems. Expert Syst. Appl. 2021, 166, 113917.
- Akbari, E.; Rahimnejad, A.; Gadsden, S.A. A greedy non-hierarchical grey wolf optimizer for real-world optimization. Electron. Lett. 2021, 57, 499–501.
- Storn, R.; Price, K. Differential Evolution—A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces. J. Glob. Optim. 1997, 11, 341–359.
- Yang, X.-S.; He, X. Bat algorithm: Literature review and applications. Int. J. Bio-Inspired Comput. 2013, 5, 141–149.
- Kiran, M.S. TSA: Tree-seed algorithm for continuous optimization. Expert Syst. Appl. 2015, 42, 6686–6698.
- Mirjalili, S. SCA: A Sine Cosine algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133.
- Arora, S.; Singh, S. Butterfly optimization algorithm: A novel approach for global optimization. Soft Comput. 2019, 23, 715–734.
- Abdel-Basset, M.; Mohamed, R.; Jameel, M.; Abouhawwash, M. Spider wasp optimizer: A novel meta-heuristic optimization algorithm. Artif. Intell. Rev. 2023, 1–64.
- Trojovská, E.; Dehghani, M.; Trojovský, P. Zebra optimization algorithm: A new bio-inspired optimization algorithm for solving optimization algorithm. IEEE Access 2022, 10, 49445–49473.
- Abualigah, L.; Abd Elaziz, M.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 2022, 191, 116158.
- Prakash, T.; Singh, P.P.; Singh, V.P.; Singh, S.N. A novel Brown-bear optimization algorithm for solving economic dispatch problem. In Advanced Control & Optimization Paradigms for Energy System Operation and Management; River Publishers: New York, NY, USA, 2023; pp. 137–164.
- Dehghani, M.; Trojovský, P. Osprey optimization algorithm: A new bio-inspired metaheuristic algorithm for solving engineering optimization problems. Front. Mech. Eng. 2023, 8, 1126450.
- Akbari, M.A.; Zare, M.; Azizipanah-Abarghooee, R.; Mirjalili, S.; Deriche, M. The cheetah optimizer: A nature-inspired metaheuristic algorithm for large-scale optimization problems. Sci. Rep. 2022, 12, 10953.
- Liang, J.J.; Qu, B.Y.; Suganthan, P.N. Problem Definitions and Evaluation Criteria for the CEC 2014 Special Session and Competition on Single Objective Real-Parameter Numerical Optimization; Technical Report; Computational Intelligence Laboratory, Zhengzhou University: Zhengzhou, China; Nanyang Technological University: Singapore, 2013; Volume 635.
| Algorithms | Parameters | Values |
|---|---|---|
| GWO variants | | |
| WdGWO | Null | Null |
| IGWO | Null | Null |
| GNHGWO | Null | Null |
| Traditional and popular algorithms | | |
| PSO | Coefficient of the cognitive component | 2 |
| | Coefficient of the social component | 2 |
| DE | Scale factor (F) | 0.6 |
| | Crossover rate (Cr) | 0.8 |
| BA | Loudness (A) | 0.5 |
| | Pulse rate (r) | 0.5 |
| | Frequency minimum | 0 |
| | Frequency maximum | 2 |
| TSA | ST | 0.1 |
| | Number of seeds (ns) | [0.1 × N, 0.25 × N] |
| SCA | a | 2 (linearly decreased from a to 0) |
| BOA | p | 0.8 |
| | Power exponent | 0.1 |
| | Sensory modality | 0.01 |
| JAYA | Null | Null |
| Recent algorithms | | |
| SWO | TR | 0.3 |
| | Cr | 0.02 |
| | Minimum population size | 20 |
| ZOA | Null | Null |
| COA | Null | Null |
| BOA | Null | Null |
| OOA | Null | Null |
| CO | Search agents in a group | 2 |
Unimodal functions:
- Rotated High Conditioned Elliptic Function
- Rotated Bent Cigar Function
- Rotated Discus Function

(Expressions as defined in the CEC 2014 technical report.)
Multimodal functions:
- Shifted and Rotated Rosenbrock's Function
- Shifted and Rotated Ackley's Function
- Shifted and Rotated Weierstrass Function
- Shifted and Rotated Griewank's Function
- Shifted Rastrigin's Function
- Shifted and Rotated Rastrigin's Function
- Shifted Schwefel's Function
- Shifted and Rotated Schwefel's Function
- Shifted and Rotated Katsuura Function
- Shifted and Rotated HappyCat Function
- Shifted and Rotated HGBat Function
- Shifted and Rotated Expanded Griewank's plus Rosenbrock's Function
- Shifted and Rotated Expanded Scaffer's F6 Function

(Expressions as defined in the CEC 2014 technical report.)
Hybrid functions: each is a weighted combination of basic functions with mixing ratios p as specified in the CEC 2014 technical report; the expressions and ratio values are not reproduced here.
Composition functions: defined as in the CEC 2014 technical report; the expressions are not reproduced here.
Ranks by mean result of EGWO and the GWO variants on the IEEE CEC 2014 benchmark functions F1–F30 (the mean values themselves are not reproduced here):

| Function | EGWO | GWO | GNHGWO | IGWO | WdGWO |
|---|---|---|---|---|---|
| F1 | 2 | 1 | 5 | 3 | 4 |
| F2 | 1 | 2 | 5 | 3 | 4 |
| F3 | 1 | 2 | 5 | 3 | 4 |
| F4 | 1 | 2 | 5 | 3 | 4 |
| F5 | 5 | 1 | 2 | 3 | 4 |
| F6 | 2 | 1 | 5 | 3 | 4 |
| F7 | 1 | 2 | 5 | 3 | 4 |
| F8 | 2 | 1 | 5 | 3 | 4 |
| F9 | 2 | 1 | 5 | 3 | 4 |
| F10 | 2 | 1 | 5 | 3 | 4 |
| F11 | 1 | 2 | 5 | 3 | 4 |
| F12 | 1 | 2 | 5 | 3 | 4 |
| F13 | 1 | 2 | 5 | 3 | 4 |
| F14 | 1 | 2 | 5 | 3 | 4 |
| F15 | 1 | 2 | 5 | 3 | 4 |
| F16 | 1 | 2 | 5 | 3 | 4 |
| F17 | 2 | 1 | 5 | 3 | 4 |
| F18 | 5 | 1 | 2 | 3 | 4 |
| F19 | 5 | 2 | 1 | 3 | 4 |
| F20 | 1 | 2 | 5 | 3 | 4 |
| F21 | 1 | 2 | 5 | 3 | 4 |
| F22 | 1 | 2 | 5 | 3 | 4 |
| F23 | 2 | 1 | 5 | 3 | 4 |
| F24 | 2 | 1 | 5 | 3 | 4 |
| F25 | 2 | 1 | 5 | 3 | 4 |
| F26 | 2 | 1 | 5 | 3 | 4 |
| F27 | 2 | 1 | 5 | 3 | 4 |
| F28 | 2 | 5 | 1 | 3 | 4 |
| F29 | 2 | 5 | 1 | 3 | 4 |
| F30 | 2 | 5 | 1 | 3 | 4 |
| Average ranking | 1.87 | 1.87 | 4.27 | 3 | 4 |
| Total ranking | 1 | 1 | 4 | 2 | 3 |
Friedman ANOVA test and Wilcoxon rank-sum test results for EGWO versus the GWO variants:

| EGWO vs. | SS | df | MS | Chi-sq | p (Friedman) | p (Wilcoxon) | | |
|---|---|---|---|---|---|---|---|---|
| GWO | 4374 | 29 | 150.828 | 56.44 | 0.0017 | 0.797098 | No | No |
| GNHGWO | 4404 | 29 | 151.862 | 56.83 | 0.0015 | | Yes | Yes |
| IGWO | 4409 | 29 | 152.034 | 56.89 | 0.0015 | | Yes | Yes |
| WdGWO | 4472 | 29 | 154.207 | 57.7 | 0.0012 | 0.00532 | Yes | Yes |
Ranks by mean result of EGWO and the traditional algorithms on the IEEE CEC 2014 benchmark functions F1–F30 (the mean values themselves are not reproduced here):

| Function | EGWO | GA | BOA | SCA | TSA | JAYA |
|---|---|---|---|---|---|---|
| F1 | 1 | 5 | 4 | 3 | 6 | 2 |
| F2 | 5 | 1 | 4 | 3 | 2 | 6 |
| F3 | 1 | 3 | 5 | 4 | 6 | 2 |
| F4 | 5 | 1 | 4 | 3 | 6 | 2 |
| F5 | 5 | 4 | 1 | 6 | 3 | 2 |
| F6 | 1 | 5 | 3 | 4 | 6 | 2 |
| F7 | 5 | 1 | 4 | 3 | 6 | 2 |
| F8 | 1 | 5 | 3 | 4 | 2 | 6 |
| F9 | 1 | 5 | 3 | 4 | 2 | 6 |
| F10 | 1 | 5 | 6 | 4 | 3 | 2 |
| F11 | 1 | 5 | 3 | 4 | 6 | 2 |
| F12 | 5 | 1 | 4 | 6 | 3 | 2 |
| F13 | 5 | 1 | 4 | 3 | 6 | 2 |
| F14 | 5 | 1 | 4 | 6 | 3 | 2 |
| F15 | 1 | 5 | 4 | 3 | 2 | 6 |
| F16 | 1 | 5 | 3 | 4 | 6 | 2 |
| F17 | 1 | 5 | 4 | 6 | 3 | 2 |
| F18 | 5 | 1 | 4 | 6 | 3 | 2 |
| F19 | 5 | 1 | 4 | 6 | 3 | 2 |
| F20 | 1 | 5 | 4 | 3 | 6 | 2 |
| F21 | 1 | 5 | 4 | 6 | 3 | 2 |
| F22 | 1 | 5 | 4 | 6 | 3 | 2 |
| F23 | 3 | 5 | 1 | 4 | 6 | 2 |
| F24 | 3 | 1 | 5 | 4 | 2 | 6 |
| F25 | 3 | 1 | 5 | 4 | 2 | 6 |
| F26 | 3 | 1 | 5 | 4 | 6 | 2 |
| F27 | 1 | 5 | 4 | 3 | 6 | 2 |
| F28 | 1 | 5 | 6 | 3 | 4 | 2 |
| F29 | 3 | 5 | 1 | 4 | 6 | 2 |
| F30 | 3 | 5 | 1 | 4 | 6 | 2 |
| Average ranking | 2.60 | 3.43 | 3.70 | 4.23 | 4.23 | 2.80 |
| Total ranking | 1 | 3 | 4 | 5 | 5 | 2 |
Friedman ANOVA test and Wilcoxon rank-sum test results for EGWO versus the traditional algorithms:

| EGWO vs. | SS | df | MS | Chi-sq | p (Friedman) | p (Wilcoxon) | | |
|---|---|---|---|---|---|---|---|---|
| GA | 4416 | 29 | 152.276 | 56.98 | 0.0014 | | Yes | Yes |
| BOA | 4197 | 29 | 144.724 | 54.15 | 0.0031 | 0.002279 | Yes | Yes |
| SCA | 4463 | 29 | 153.897 | 57.59 | 0.0012 | | Yes | Yes |
| TSA | 4373 | 29 | 150.793 | 56.43 | 0.0017 | 0.336552 | No | No |
| JAYA | 4432 | 29 | 152.828 | 57.19 | 0.0014 | | Yes | Yes |
Ranks by mean result of EGWO and the recent algorithms on the IEEE CEC 2014 benchmark functions F1–F30 (the mean values themselves are not reproduced here):

| Function | EGWO | ZOA | RSA | SWO | BOA | CO | OOA |
|---|---|---|---|---|---|---|---|
| F1 | 1 | 2 | 5 | 4 | 3 | 6 | 7 |
| F2 | 1 | 2 | 5 | 3 | 4 | 7 | 6 |
| F3 | 2 | 1 | 5 | 3 | 7 | 4 | 6 |
| F4 | 1 | 2 | 5 | 3 | 4 | 6 | 7 |
| F5 | 2 | 5 | 1 | 3 | 7 | 4 | 6 |
| F6 | 1 | 2 | 5 | 3 | 7 | 4 | 6 |
| F7 | 1 | 2 | 5 | 3 | 4 | 6 | 7 |
| F8 | 2 | 1 | 5 | 7 | 3 | 4 | 6 |
| F9 | 2 | 1 | 5 | 7 | 3 | 4 | 6 |
| F10 | 2 | 1 | 5 | 7 | 3 | 6 | 4 |
| F11 | 2 | 1 | 5 | 3 | 7 | 4 | 6 |
| F12 | 2 | 7 | 1 | 3 | 5 | 4 | 6 |
| F13 | 1 | 2 | 5 | 3 | 4 | 6 | 7 |
| F14 | 1 | 2 | 5 | 3 | 4 | 6 | 7 |
| F15 | 1 | 2 | 5 | 3 | 7 | 4 | 6 |
| F16 | 2 | 1 | 3 | 5 | 7 | 4 | 6 |
| F17 | 1 | 2 | 5 | 4 | 6 | 3 | 7 |
| F18 | 1 | 2 | 5 | 6 | 4 | 3 | 7 |
| F19 | 1 | 2 | 5 | 6 | 4 | 3 | 7 |
| F20 | 2 | 1 | 5 | 3 | 7 | 4 | 6 |
| F21 | 1 | 2 | 5 | 4 | 7 | 3 | 6 |
| F22 | 1 | 2 | 5 | 6 | 4 | 3 | 7 |
| F23 | 2 | 3 | 5 | 7 | 4 | 1 | 6 |
| F24 | 2 | 3 | 5 | 7 | 4 | 1 | 6 |
| F25 | 2 | 3 | 5 | 7 | 4 | 1 | 6 |
| F26 | 5 | 2 | 3 | 7 | 4 | 1 | 6 |
| F27 | 7 | 5 | 1 | 6 | 3 | 2 | 4 |
| F28 | 7 | 5 | 1 | 3 | 2 | 6 | 4 |
| F29 | 7 | 3 | 5 | 1 | 2 | 4 | 6 |
| F30 | 5 | 7 | 1 | 2 | 3 | 6 | 4 |
| Average ranking | 2.27 | 2.53 | 4.20 | 4.40 | 4.57 | 4 | 6.03 |
| Total ranking | 1 | 2 | 4 | 5 | 6 | 3 | 7 |
Friedman ANOVA test and Wilcoxon rank-sum test results for EGWO versus the recent algorithms:

| EGWO vs. | SS | df | MS | Chi-sq | p (Friedman) | p (Wilcoxon) | | |
|---|---|---|---|---|---|---|---|---|
| ZOA | 4454 | 29 | 153.586 | 57.47 | 0.0013 | 0.047156 | Yes | Yes |
| RSA | 4397 | 29 | 151.621 | 56.74 | 0.0015 | 0.000413 | Yes | Yes |
| SWO | 4416 | 29 | 152.276 | 56.98 | 0.0014 | | Yes | Yes |
| BOA | 4351 | 29 | 150.034 | 56.14 | 0.0018 | 0.022778 | Yes | Yes |
| CO | 4435 | 29 | 152.931 | 57.23 | 0.0013 | | Yes | Yes |
| OOA | 4208 | 29 | 145.103 | 54.3 | 0.003 | 0.015788 | Yes | Yes |
Tic-Tac-Toe dataset:

| | EGWO-MLP | GWO-MLP | DE-MLP | TSA-MLP | PSO-MLP | BA-MLP | GA-MLP | SCA-MLP |
|---|---|---|---|---|---|---|---|---|
| Rate | 97.643% | 93.790% | 94.091% | 97.310% | 87.830% | 94.704% | 64.725% | 93.531% |
| MSE | 0.005 | 0.001 | 0.013 | 0.013 | 0.017 | 0.017 | 0.028 | 0.015 |
| Std. | 2.551 | 8.245 | 8.792 | 5.404 | 16.561 | 9.703 | 33.350 | 11.274 |

Heart dataset:

| | EGWO-MLP | GWO-MLP | DE-MLP | TSA-MLP | PSO-MLP | BA-MLP | GA-MLP | SCA-MLP |
|---|---|---|---|---|---|---|---|---|
| Rate | 85.292% | 89.042% | 79.167% | 71.417% | 59.833% | 57.000% | 44.750% | 70.042% |
| MSE | 0.103 | 0.076 | 0.157 | 0.180 | 0.272 | 0.286 | 0.323 | 0.206 |
| Std. | 33.188 | 3.074 | 3.586 | 3.796 | 9.042 | 9.311 | 8.423 | 3.543 |

XOR dataset:

| | EGWO-MLP | GWO-MLP | DE-MLP | TSA-MLP | PSO-MLP | BA-MLP | GA-MLP | SCA-MLP |
|---|---|---|---|---|---|---|---|---|
| Rate | 95.417% | 93.750% | 52.500% | 35.417% | 31.667% | 88.333% | 31.667% | 47.500% |
| MSE | 0.005 | 0.010 | 0.045 | 0.065 | 0.191 | 0.011 | 0.191 | 0.090 |
| Std. | 8.980 | 12.607 | 18.971 | 17.084 | 16.973 | 24.330 | 16.973 | 15.186 |

Balloon dataset:

| | EGWO-MLP | GWO-MLP | DE-MLP | TSA-MLP | PSO-MLP | BA-MLP | GA-MLP | SCA-MLP |
|---|---|---|---|---|---|---|---|---|
| Rate | 100% | 100% | 100% | 44.333% | 43.333% | 59.000% | 41.167% | 97.333% |
| MSE | | | | 0.184 | 0.197 | 0.119 | 0.210 | 0.001 |
| Std. | 0 | 0 | 0 | 12.087 | 13.792 | 16.578 | 14.779 | 7.397 |
| No. | Attribute | Values |
|---|---|---|
| 1 | date | April, May, June, July, August, September, October, unknown |
| 2 | plant-stand | normal, lt-normal, unknown |
| 3 | precip | lt-norm, norm, gt-norm, unknown |
| 4 | temp | lt-norm, norm, gt-norm, unknown |
| 5 | hail | yes, no, unknown |
| 6 | crop-hist | diff-lst-year, same-lst-yr, same-lst-two-yrs, same-lst-sev-yrs, unknown |
| 7 | area-damaged | scattered, low-areas, upper-areas, whole-field, unknown |
| 8 | severity | minor, pot-severe, severe, unknown |
| 9 | seed-tmt | none, fungicide, other, unknown |
| 10 | germination | 90–100%, 80–89%, lt-80%, unknown |
| 11 | plant-growth | norm, abnorm, unknown |
| 12 | leaves | norm, abnorm |
| 13 | leafspots-halo | absent, yellow-halos, no-yellow-halos, unknown |
| 14 | leafspots-marg | w-s-marg, no-w-s-marg, dna, unknown |
| 15 | leafspot-size | lt-1/8, gt-1/8, dna, unknown |
| 16 | leaf-shread | absent, present, unknown |
| 17 | leaf-malf | absent, present, unknown |
| 18 | leaf-mild | absent, upper-surf, lower-surf, unknown |
| 19 | stem | norm, abnorm, unknown |
| 20 | lodging | yes, no, unknown |
| 21 | stem-cankers | absent, below-soil, above-soil, above-sec-nde, unknown |
| 22 | canker-lesion | dna, brown, dk-brown-blk, tan, unknown |
| 23 | fruiting-bodies | absent, present, unknown |
| 24 | external decay | absent, firm-and-dry, watery, unknown |
| 25 | mycelium | absent, present, unknown |
| 26 | int-discolor | none, brown, black, unknown |
| 27 | sclerotia | absent, present, unknown |
| 28 | fruit-pods | norm, diseased, few-present, dna, unknown |
| 29 | fruit spots | absent, colored, brown-w/blk-specks, distort, dna, unknown |
| 30 | seed | norm, abnorm, unknown |
| 31 | mold-growth | absent, present, unknown |
| 32 | seed-discolor | absent, present, unknown |
| 33 | seed-size | norm, lt-norm, unknown |
| 34 | shriveling | absent, present, unknown |
| 35 | roots | norm, rotted, galls-cysts, unknown |
| | EGWO-MLP | PSOGWO-MLP | DE-MLP | TSA-MLP | PSO-MLP | BA-MLP | GA-MLP | SCA-MLP |
|---|---|---|---|---|---|---|---|---|
| Rate | 98.763% | 77.204% | 68.548% | 91.505% | 51.935% | 22.957% | 39.677% | 64.677% |
| Std. | 3.108 | 15.818 | 21.933 | 10.020 | 27.993 | 30.217 | 42.994 | 23.717 |
| MSE | 12.627 | 80.225 | 100.142 | 36.019 | 118.036 | 104.711 | 40.241 | 125.489 |
| Std. | 2.397 | 38.901 | 13.384 | 3.173 | 38.263 | 41.688 | 20.044 | 22.566 |