Enhanced Brain Storm Optimization Algorithm Based on Modified Nelder–Mead and Elite Learning Mechanism
Abstract
1. Introduction
- (1)
- The basic Nelder–Mead method is modified and used as a local searcher to explore more promising search directions for the population, reducing the probability of falling into a local optimum;
- (2)
- An elite learning mechanism based on Euclidean distance is proposed to update the population with a certain probability. The evolution information hidden in the elite individuals is integrated into the population, which improves the convergence of the algorithm;
- (3)
- A reinitialization strategy is employed when the population stagnates. The resulting directed, large-scale jump helps the population escape the local optimum with high probability and explore a new evolutionary direction;
- (4)
- CEC2014 contest benchmark problems and two engineering prediction problems are used to evaluate the performance of the proposed algorithm.
2. Preliminaries
2.1. Brain Storm Optimization Algorithm
- (1)
- Clustering: At each generation, the population is divided into M clusters using the K-means algorithm, and the best idea in each cluster is selected as its center;
- (2)
- Parent idea construction: According to a predefined probability, one or two clusters are used to construct the parent individual Xselect;
- (3)
- New idea generation: After the parent idea Xselect is constructed, a new idea is generated according to Equation (2);
- (4)
- Selection: The greedy selection strategy is used to select the better one from the parent idea Xi and the newly generated idea Xnew according to their fitness. The better one is kept as a parent idea in the next generation.
Algorithm 1 The BSO algorithm
1: Randomly generate N individuals
2: Evaluate the N individuals
3: while the termination criteria are not met do
4:  Divide the N individuals into M clusters by K-means
5:  Select the cluster center for each cluster
6:  if rand < p1 then
7:   Randomly generate an individual to replace a selected cluster center
8:  end if
9:  for each individual Xi do
10:   if rand < pone then
11:    Select one cluster c according to Equation (1)
12:    if rand < pone_center then
13:     Xselect ← the center of cluster c
14:    else
15:     Xselect ← a randomly selected individual from cluster c
16:    end if
17:   else
18:    Randomly select two clusters c1 and c2
19:    if rand < ptwo_center then
20:     Generate Xselect from the two cluster centers according to the linear combination rule
21:    else
22:     Randomly select an individual from c1 and c2, respectively
23:     Generate Xselect from the two individuals according to the linear combination rule
24:    end if
25:   end if
26:   Generate new idea Xnew according to Equation (2)
27:   if f(Xnew) < f(Xi) then
28:    Xi ← Xnew
29:    f(Xi) ← f(Xnew)
30:   end if
31:  end for
32: end while
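The procedure in Algorithm 1 can be sketched in code. This is a hedged illustration rather than the paper's implementation: random grouping stands in for K-means to keep the sketch short, the logsig-decayed step size and the Gaussian perturbation follow the commonly cited BSO formulation as stand-ins for Equations (1) and (2), and the sphere objective and all parameter values are assumptions.

```python
import numpy as np

def sphere(x):
    # Illustrative objective; any minimization problem could be used.
    return float(np.sum(x ** 2))

def bso(f, dim=10, n=50, m=5, max_iter=200,
        p1=0.2, p_one=0.8, p_one_center=0.4, p_two_center=0.5, seed=0):
    """Minimal BSO sketch following Algorithm 1 (standard formulation)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-100, 100, (n, dim))
    fit = np.array([f(x) for x in pop])
    for it in range(max_iter):
        # Clustering step: random grouping used here as a cheap surrogate for K-means.
        labels = rng.integers(0, m, n)
        centers = {c: np.where(labels == c)[0][np.argmin(fit[labels == c])]
                   for c in range(m) if np.any(labels == c)}
        clusters = list(centers)
        # Occasionally replace a selected cluster center with a random idea.
        if rng.random() < p1:
            j = centers[rng.choice(clusters)]
            pop[j] = rng.uniform(-100, 100, dim)
            fit[j] = f(pop[j])
        for i in range(n):
            # Parent construction from one or two clusters.
            if rng.random() < p_one:
                c = rng.choice(clusters)
                idx = np.where(labels == c)[0]
                x_sel = pop[centers[c]] if rng.random() < p_one_center else pop[rng.choice(idx)]
            else:
                c1, c2 = rng.choice(clusters, 2, replace=True)
                r = rng.random()
                if rng.random() < p_two_center:
                    x_sel = r * pop[centers[c1]] + (1 - r) * pop[centers[c2]]
                else:
                    i1 = rng.choice(np.where(labels == c1)[0])
                    i2 = rng.choice(np.where(labels == c2)[0])
                    x_sel = r * pop[i1] + (1 - r) * pop[i2]
            # New idea: Gaussian perturbation with a logsig-decayed step size.
            xi = 1.0 / (1.0 + np.exp(-(0.5 * max_iter - it) / 20.0)) * rng.random()
            x_new = x_sel + xi * rng.normal(size=dim)
            # Greedy selection between the parent idea and the new idea.
            f_new = f(x_new)
            if f_new < fit[i]:
                pop[i], fit[i] = x_new, f_new
    best = int(np.argmin(fit))
    return pop[best], float(fit[best])

best_x, best_f = bso(sphere)
```

The per-generation structure (cluster, construct parent, perturb, greedily select) mirrors lines 4–31 of the pseudocode; only the clustering surrogate and the step-size schedule are swapped-in simplifications.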
2.2. Nelder–Mead Method
3. Proposed Method
3.1. Motivations
3.2. BSONME Optimization Algorithm
3.2.1. Modified Nelder–Mead Updating Strategy
Algorithm 2 Modified Nelder–Mead updating strategy
1: Input the individuals in the ith neighborhood
2: while the termination criteria are not met do
3:  Sort the individuals in ascending order of fitness
4:  Generate a reflection individual Xnewr according to Equation (10)
5:  if f(Xnewr) < f(Xbest) then
6:   Generate an expansion individual Xnewe according to Equation (12)
7:   if f(Xnewe) < f(Xnewr) then
8:    Xw ← Xnewe; f(Xw) ← f(Xnewe)
9:   else
10:    Xw ← Xnewr; f(Xw) ← f(Xnewr)
11:   end if
12:  else
13:   if f(Xnewr) < f(Xw) then
14:    Xw ← Xnewr; f(Xw) ← f(Xnewr)
15:   else
16:    if rand < 0.5 then
17:     Generate an outside contraction individual Xoc according to Equation (13)
18:     if f(Xoc) < f(Xw) then
19:      Xw ← Xoc; f(Xw) ← f(Xoc)
20:     end if
21:    else
22:     Generate an inside contraction individual Xic according to Equation (14)
23:     if f(Xic) < f(Xw) then
24:      Xw ← Xic; f(Xw) ← f(Xic)
25:     end if
26:    end if
27:   end if
28:  end if
29: end while
30: Sort the individuals in ascending order of fitness
31: Return Xbest and f(Xbest)
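Algorithm 2 can be sketched as follows. Since Equations (10)–(14) are not reproduced here, the classical Nelder–Mead coefficients (reflection 1, expansion 2, contraction 0.5) are assumed in their place; the random outside/inside contraction choice and the absence of a shrink step follow the pseudocode.

```python
import numpy as np

def modified_nelder_mead(points, f, iters=100, seed=0):
    """Sketch of Algorithm 2: Nelder-Mead-style update of a small neighborhood.
    Classical coefficients are assumed stand-ins for Equations (10)-(14)."""
    rng = np.random.default_rng(seed)
    pts = np.array(points, dtype=float)
    fit = np.array([f(p) for p in pts])
    for _ in range(iters):
        order = np.argsort(fit)            # ascending: best first, worst last
        pts, fit = pts[order], fit[order]
        centroid = pts[:-1].mean(axis=0)   # centroid of all but the worst
        x_r = centroid + (centroid - pts[-1])   # reflection
        f_r = f(x_r)
        if f_r < fit[0]:                   # beats the best: try expansion
            x_e = centroid + 2.0 * (centroid - pts[-1])
            f_e = f(x_e)
            pts[-1], fit[-1] = (x_e, f_e) if f_e < f_r else (x_r, f_r)
        elif f_r < fit[-1]:                # beats the worst: accept reflection
            pts[-1], fit[-1] = x_r, f_r
        else:                              # contract, outside or inside at random
            if rng.random() < 0.5:
                x_c = centroid + 0.5 * (centroid - pts[-1])  # outside contraction
            else:
                x_c = centroid - 0.5 * (centroid - pts[-1])  # inside contraction
            f_c = f(x_c)
            if f_c < fit[-1]:              # keep the contraction only if it improves the worst
                pts[-1], fit[-1] = x_c, f_c
    order = np.argsort(fit)
    return pts[order][0], float(fit[order][0])

# Usage: refine a small neighborhood of 5 points on a quadratic objective.
start = np.random.default_rng(1).normal(size=(5, 4)) + 3.0
best, fbest = modified_nelder_mead(start, lambda x: float(np.sum(x ** 2)))
```

Because only the worst individual is ever replaced, and only by a better candidate, the best individual in the neighborhood never deteriorates, matching the return of Xbest on line 31.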
3.2.2. Elite Learning Mechanism
3.2.3. Reinitialization Strategy
Algorithm 3 The BSONME algorithm
1: Randomly generate N individuals
2: Evaluate the N individuals
3: Randomly select an individual as gbest
4: while the termination criteria are not met do
5:  Divide the population into an elite subpopulation and a non-elite subpopulation
6:  Update gbest according to Equation (15)
7:  if rand < p1 then
8:   Randomly select an individual to update a selected dimension
9:  end if
10:  for each individual Xi do
11:   if rand < p2 then
12:    if rand < p3 then
13:     Xselect ← a randomly selected individual from the elite subpopulation
14:    else
15:     Randomly select two distinct individuals Xa and Xb from the elite subpopulation
16:     Xselect = r1 × Xa + (1 − r1) × Xb
17:    end if
18:   else
19:    if rand < p3 then
20:     Xselect ← a randomly selected individual from the non-elite subpopulation
21:    else
22:     Randomly select two distinct individuals Xa and Xb from the non-elite subpopulation
23:     Xselect = r1 × Xa + (1 − r1) × Xb
24:    end if
25:   end if
26:   Generate new idea Xnew according to Equation (17)
27:   Evaluate Xnew
28:   if f(Xnew) < f(Xi) then
29:    XNMA ← perform the modified Nelder–Mead strategy on Xi according to Algorithm 2
30:    if f(XNMA) < f(Xnew) then
31:     Xi ← XNMA; f(Xi) ← f(XNMA)
32:    else
33:     Xi ← Xnew; f(Xi) ← f(Xnew)
34:    end if
35:    counti = 0
36:   else
37:    if counti > TH then
38:     Update Xi with the reinitialization strategy according to Equation (19)
39:     counti = 0
40:    else
41:     counti = counti + 1
42:    end if
43:   end if
44:  end for
45: end while
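The two ingredients that distinguish Algorithm 3 from plain BSO, the elite learning mechanism (lines 10–25) and the stagnation-triggered reinitialization (lines 37–42), can be sketched as below. Equations (15), (17), and (19) are not reproduced in this excerpt, so the elite fraction, the uniform coefficient r1, and uniform resampling over the search range are labeled assumptions.

```python
import numpy as np

def select_parent(pop, fit, elite_frac=0.2, p2=0.2, p3=0.8, rng=None):
    """Elite-learning parent construction (Algorithm 3, lines 10-25).
    The 20% elite fraction and the uniform r1 are illustrative assumptions."""
    rng = rng or np.random.default_rng()
    order = np.argsort(fit)                      # ascending fitness: elites first
    n_elite = max(2, int(elite_frac * len(pop)))
    elite, non_elite = order[:n_elite], order[n_elite:]
    group = elite if rng.random() < p2 else non_elite
    if rng.random() < p3:
        # Single-individual parent.
        return pop[rng.choice(group)].copy()
    # Linear combination of two distinct individuals from the chosen group.
    i, j = rng.choice(group, size=2, replace=False)
    r1 = rng.random()
    return r1 * pop[i] + (1 - r1) * pop[j]

def maybe_reinitialize(x, count, threshold, bounds, rng):
    """Stagnation handling (Algorithm 3, lines 37-42): uniform resampling over
    the search range is an assumed stand-in for Equation (19)."""
    if count > threshold:
        return rng.uniform(bounds[0], bounds[1], size=x.shape), 0
    return x, count + 1
```

In the full algorithm each individual carries its own stagnation counter, which is reset to zero whenever the greedy selection on line 28 accepts an improvement.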
4. Comparative Studies of Experiments
- Particle swarm optimization (PSO) [3];
- Brain storm optimization algorithm (BSO) [15];
- Modified brain storm optimization (MBSO) [27];
- Brain storm optimization algorithm in objective space (BSO-OS) [31];
- Random grouping brain storm optimization algorithm (RGBSO) [29];
- Improved random grouping BSO (IRGBSO) [38];
- BSO with learning strategy (BSOLS) [41];
- Active learning brain storm optimization (ALBSO) [42];
- BSO with role-playing strategy (RPBSO) [30];
- Brain storm optimization algorithm with adaptive learning strategy (BSO-AL) [34];
- Our approach (BSONME).
4.1. Parameter Settings
4.2. Experiment I: Mathematical Benchmark Problems
4.3. Results and Discussion
4.3.1. Discussion on Elite Learning Mechanism and Reinitialization Strategy
4.3.2. Analysis of Population Diversity
4.3.3. The Overall Effectiveness of BSONME
4.4. Experiment II: Least Squares Support Vector Machine for Prediction Problems
4.4.1. Least Squares Support Vector Machine
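Least squares support vector machine (LSSVM) regression replaces the SVM's inequality constraints with equality constraints, so training reduces to solving a single linear system (Suykens and Vandewalle). The following is a minimal sketch of that training step, not the authors' implementation; the RBF kernel and all hyperparameter values are illustrative assumptions.

```python
import numpy as np

def rbf(a, b, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row-wise sample sets a and b."""
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the LSSVM KKT system:
        [ 0   1^T          ] [b    ]   [0]
        [ 1   K + I/gamma  ] [alpha] = [y]
    where K is the kernel matrix and gamma the regularization parameter."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, dual coefficients alpha

def lssvm_predict(Xtr, b, alpha, Xte, sigma=1.0):
    return rbf(Xte, Xtr, sigma) @ alpha + b

# Smoke test: fit a noiseless sine curve.
X = np.linspace(0, np.pi, 30)[:, None]
y = np.sin(X).ravel()
b, alpha = lssvm_fit(X, y, gamma=100.0, sigma=0.5)
pred = lssvm_predict(X, b, alpha, X, sigma=0.5)
```

In the BSONME-LSSVM model of the next subsection, it is the hyperparameters of this model (regularization and kernel width) that the optimizer tunes.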
4.4.2. BSONME-LSSVM Model
4.4.3. Prediction of Pipeline Instantaneous Water Flow
4.4.4. Forecast of Fund Trend
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
References
- Schranz, M.; Caro, G.; Schmickl, T.; Elmenreich, W.; Arvin, F.; Şekercioğlu, A.; Sende, M. Swarm Intelligence and Cyber-Physical Systems: Concepts, Challenges and Future Trends. Swarm Evol. Comput. 2020, 60, 100762. [Google Scholar] [CrossRef]
- Dorigo, M.; Di Caro, G. Ant colony optimization: A new meta-heuristic. In Proceedings of the IEEE Congress on Evolutionary Computation, Washington, DC, USA, 6–9 July 1999; Volume 2, pp. 1470–1477. [Google Scholar]
- Shi, Y.H.; Eberhart, R. A modified particle swarm optimizer. In Proceedings of the IEEE International Conference on Evolutionary Computation, Anchorage, AK, USA, 4–9 May 1998; pp. 611–616. [Google Scholar]
- Basturk, B.; Karaboga, D. An artificial bee colony (ABC) algorithm for numeric function optimization. In Proceedings of the IEEE Swarm Intelligence Symposium, Indianapolis, IN, USA, 12–14 May 2006; pp. 12–14. [Google Scholar]
- Simon, D. Biogeography-based optimization. IEEE Trans. Evol. Comput. 2008, 12, 702–713. [Google Scholar] [CrossRef] [Green Version]
- Gandomi, A.H.; Yang, X.S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35. [Google Scholar] [CrossRef]
- Yang, X.S. A new metaheuristic bat-inspired algorithm. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010); Springer: Berlin/Heidelberg, Germany, 2010; pp. 65–74. [Google Scholar]
- Nadimi-Shahraki, M.H.; Fatahi, A.; Zamani, H.; Mirjalili, S.; Abualigah, L. An Improved Moth-Flame Optimization Algorithm with Adaptation Mechanism to Solve Numerical and Mechanical Engineering Problems. Entropy 2021, 23, 1637. [Google Scholar] [CrossRef] [PubMed]
- Abualigah, L.; Diabat, A.; Mirjalili, S.; Abd Elaziz, M.; Gandomi, A.H. The arithmetic optimization algorithm. Comput. Methods Appl. Mech. Eng. 2021, 376, 113609. [Google Scholar] [CrossRef]
- Nadimi-Shahraki, M.H.; Fatahi, A.; Zamani, H.; Mirjalili, S.; Abualigah, L.; Abd Elaziz, M. Migration-based moth-flame optimization algorithm. Processes 2021, 9, 2276. [Google Scholar] [CrossRef]
- Abualigah, L.; Yousri, D.; Abd Elaziz, M.; Ewees, A.A.; Al-Qaness, M.A.; Gandomi, A.H. Aquila optimizer: A novel meta-heuristic optimization algorithm. Comput. Ind. Eng. 2021, 157, 107250. [Google Scholar] [CrossRef]
- Połap, D.; Woźniak, M. Red fox optimization algorithm. Expert Syst. Appl. 2021, 166, 114107. [Google Scholar] [CrossRef]
- Jia, H.; Peng, X.; Lang, C. Remora optimization algorithm. Expert Syst. Appl. 2021, 185, 115665. [Google Scholar] [CrossRef]
- Li, M.W.; Wang, Y.T.; Geng, J.; Hong, W.C. Chaos cloud quantum bat hybrid optimization algorithm. Nonlinear Dyn. 2021, 103, 1167–1193. [Google Scholar] [CrossRef]
- Shi, Y. Brain storm optimization algorithm. In Advances in Swarm Intelligence—Second International Conference, ICSI 2011, Chongqing, China, 12–15 June 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 303–309. [Google Scholar]
- Papa, J.J.; Rosa, G.H.; de Souza, A.N.; Afonso, L.C.S. Feature selection through binary brain storm optimization. Comput. Electr. Eng. 2018, 72, 468–481. [Google Scholar] [CrossRef]
- Xiong, G.; Shi, D.; Zhang, J.; Zhang, Y. A binary coded brain storm optimization for fault section diagnosis of power systems. Electr. Power Syst. Res. 2018, 163, 441–451. [Google Scholar] [CrossRef]
- Pourpanah, F.; Shi, Y.; Lim, C.P.; Hao, Q.; Tan, C.J. Feature selection based on brain storm optimization for data classification. Appl. Soft Comput. J. 2019, 80, 761–775. [Google Scholar] [CrossRef]
- Wang, J.; Hou, R.; Wang, C.; Shen, L. Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting. Appl. Soft Comput. 2016, 49, 164–178. [Google Scholar] [CrossRef]
- Yang, J.; Shen, Y.; Shi, Y. Visual fixation prediction with incomplete attention map based on brain storm optimization. Appl. Soft Comput. J. 2020, 96, 106653. [Google Scholar] [CrossRef]
- Duan, H.; Li, C. Quantum-behaved brain storm optimization approach to solving Loney’s solenoid problem. IEEE Trans. Magn. 2014, 51, 1–7. [Google Scholar] [CrossRef]
- Aldhafeeri, A.; Rahmat-Samii, Y. Brain storm optimization for electromagnetic applications: Continuous and discrete. IEEE Trans. Antennas Propag. 2019, 67, 2710–2722. [Google Scholar] [CrossRef]
- Jiang, Y.; Chen, X.; Zheng, F.C.; Niyato, T.D.; You, X. Brain Storm Optimization-Based Edge Caching in Fog Radio Access Networks. IEEE Trans. Veh. Technol. 2021, 70, 1807–1820. [Google Scholar] [CrossRef]
- Dai, Z.; Fang, W.; Tang, K.; Li, Q. An optima-identified framework with brain storm optimization for multimodal optimization problems. Swarm Evol. Comput. 2021, 62, 100827. [Google Scholar] [CrossRef]
- Yang, Y.; Shi, Y.; Xia, S. Advanced discussion mechanism-based brain storm optimization algorithm. Soft Comput. 2015, 19, 2997–3007. [Google Scholar] [CrossRef]
- Bezdan, T.; Stoean, C.; Naamany, A.A.; Bacanin, N.; Rashid, T.A.; Zivkovic, M.; Venkatachalam, K. Hybrid Fruit-Fly Optimization Algorithm with K-Means for Text Document Clustering. Mathematics 2021, 9, 1929. [Google Scholar] [CrossRef]
- Zhan, Z.; Zhang, J.; Shi, Y.; Liu, H. A modified brain storm optimization. In Proceedings of the 2012 IEEE Congress on Evolutionary Computation, Brisbane, Australia, 10–15 June 2012; pp. 1–8. [Google Scholar]
- Zhu, H.; Shi, Y. Brain storm optimization algorithms with k-medians clustering algorithms. In Proceedings of the 2015 Seventh International Conference on Advanced Computational Intelligence (ICACI), Wuyi, China, 27–29 March 2015; pp. 107–110. [Google Scholar]
- Cao, Z.; Shi, Y.; Rong, X.; Liu, B.; Du, Z.; Yang, B. Random grouping brain storm optimization algorithm with a new dynamically changing step size. In Proceedings of the International Conference in Swarm Intelligence, Beijing, China, 25–28 June 2015; Springer: Cham, Switzerland, 2015; pp. 357–364. [Google Scholar]
- Chen, J.; Deng, C.; Peng, H.; Tan, Y.; Wang, F. Enhanced brain storm optimization with roleplaying strategy. In Proceedings of the 2019 IEEE Congress on Evolutionary Computation (CEC), Wellington, New Zealand, 10–13 June 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1132–1139. [Google Scholar]
- Shi, Y. Brain storm optimization algorithm in objective space. In Proceedings of the 2015 IEEE Congress on Evolutionary Computation (CEC), Sendai, Japan, 25–28 May 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 1227–1234. [Google Scholar]
- Zhou, D.; Shi, Y.; Cheng, S. Brain storm optimization algorithm with modified step-size and individual generation. In Proceedings of the International Conference in Swarm Intelligence, Shenzhen, China, 17–20 June 2012; Springer: Berlin/Heidelberg, Germany, 2012; pp. 243–252. [Google Scholar]
- Cheng, S.; Shi, Y.; Qin, Q.; Ting, T.O.; Bai, R. Maintaining population diversity in brain storm optimization algorithm. In Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, China, 6–11 July 2014; pp. 3230–3237. [Google Scholar]
- Shen, Y.; Yang, J.; Cheng, S.; Shi, Y. BSO-AL: Brain storm optimization algorithm with adaptive learning strategy. In Proceedings of the 2020 IEEE Congress on Evolutionary Computation (CEC), Glasgow, UK, 19–24 July 2020; p. 17. [Google Scholar]
- El-Abd, M. Global-best brain storm optimization algorithm. Swarm Evol. Comput. 2017, 37, 27–44. [Google Scholar] [CrossRef]
- Ma, L.; Cheng, S.; Shi, Y. Enhancing learning efficiency of brain storm optimization via orthogonal learning design. IEEE Trans. Syst. Man Cybern. Syst. 2020, 51, 6723–6742. [Google Scholar] [CrossRef]
- Nelder, J.A.; Mead, R. A simplex method for function minimization. Comput. J. 1965, 7, 308–313. [Google Scholar] [CrossRef]
- El-Abd, M. Brain storm optimization algorithm with re-initialized ideas and adaptive step size. In Proceedings of the 2016 IEEE Congress on Evolutionary Computation (CEC), Vancouver, BC, Canada, 24–29 July 2016; pp. 2682–2686. [Google Scholar]
- Liang, J.J.; Qu, B.Y.; Suganthan, P.N. Problem Definitions and Evaluation Criteria for the CEC 2014 Special Session and Competition on Single Objective Real-Parameter Numerical Optimization; Technical Report; Computational Intelligence Laboratory, Zhengzhou University: Zhengzhou, China; Nanyang Technological University: Singapore, 2013; Volume 635, p. 490. [Google Scholar]
- Tizhoosh, H.R. Opposition-based learning: A new scheme for machine intelligence. In Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (CIMCA-IAWTIC’06), Vienna, Austria, 28–30 November 2005; Volume 1, pp. 695–701. [Google Scholar]
- Wang, H.; Liu, L.; Yi, W.; Niu, B.; Baek, J. An improved brain storm optimization with learning strategy. In Proceedings of the International Conference on Swarm Intelligence, Fukuoka, Japan, 27 July–1 August 2017; Springer: Cham, Switzerland, 2017; pp. 511–518. [Google Scholar]
- Cao, Z.; Wang, L. An active learning brain storm optimization algorithm with a dynamically changing cluster cycle for global optimization. Clust. Comput. 2019, 22, 1413–1429. [Google Scholar] [CrossRef]
- Suganthan, P.N.; Hansen, N.; Liang, J.J.; Deb, K.; Chen, Y.P.; Auger, A.; Tiwari, S. Problem Definitions and Evaluation Criteria for the CEC2005 Special Session on Real-Parameter Optimization. 2005. Available online: https://github.com/P-N-Suganthan (accessed on 13 September 2019).
- Wang, Y.; Cai, Z.; Zhang, Q. Differential evolution with composite trial vector generation strategies and control parameters. IEEE Trans. Evol. Comput. 2011, 15, 55–66. [Google Scholar] [CrossRef]
- Alcalá-Fdez, J.; Sánchez, L.; García, S.; del Jesus, M.J.; Ventura, S.; Garrell, J.M.; Otero, J.; Romero, C.; Bacardit, J.; Rivas, V.M.; et al. A software tool to assess evolutionary algorithms to data mining problems. Soft Comput. 2009, 13, 307–318. [Google Scholar] [CrossRef]
- Morrison, R.W. Designing Evolutionary Algorithms for Dynamic Environments; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2004. [Google Scholar]
- Nadimi-Shahraki, M.H.; Taghian, S.; Mirjalili, S.; Faris, H. MTDE: An effective multi-trial vector-based differential evolution algorithm and its applications for engineering design problems. Appl. Soft Comput. 2020, 97, 106761. [Google Scholar] [CrossRef]
- Suykens, J.A.K.; Vandewalle, J. Least squares support vector machine classifiers. Neural Process. Lett. 1999, 9, 293–300. [Google Scholar] [CrossRef]
- Tian, Z. Short-term wind speed prediction based on LMD and improved FA optimized combined kernel function LSSVM. Eng. Appl. Artif. Intell. 2020, 91, 103573. [Google Scholar] [CrossRef]
- Song, Y.; Niu, W.; Wang, Y.; Xie, X.; Yang, S. A Novel Method for Energy Consumption Prediction of Underwater Gliders Using Optimal LSSVM with PSO Algorithm. In Proceedings of the Global Oceans 2020: Singapore—US Gulf Coast, Biloxi, MS, USA, 5–30 October 2020; pp. 1–5. [Google Scholar]
Population Size | 100 |
---|---|
Solution Error | F(x) − F(x*) |
F(x) | Best fitness value obtained |
F(x*) | True global optimum value |
Run times | 30 |
Dimension (D) | 30 |
Termination Criterion | D × 10,000 function evaluations |
F1–F3 | Unimodal Problems |
F4–F16 | Simple Multimodal Problems |
F17–F22 | Hybrid Problems |
F23–F30 | Composite Problems |
Search Range | [−100, 100]^D |
CPU | Intel Core i7-5500 2.40 GHz |
Application Software | Matlab R2016a |
Termination Criterion | Maximum number of function evaluations (MaxFES) |
MaxFES | D × 10,000 |
Dimension | D = 30 |
Independent run times | 30 |
Population Size | 100 |
Wilcoxon signed-rank test [44] | 5% significance level; compares BSONME with each of the other algorithms |
Multiple-problem Wilcoxon's test [45] | Shows the significant differences among the compared algorithms |
Friedman's test [45] | Determines the ranking of all compared algorithms |
† | BSONME performs better than the corresponding algorithm. |
≈ | BSONME performs similarly to the corresponding algorithm. |
− | BSONME performs worse than the corresponding algorithm. |
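The R+ and R− columns reported later for the multiple-problem Wilcoxon test can be computed by ranking the absolute per-problem differences and summing the ranks on each side. The sketch below assumes no tied absolute differences; the error values in the usage line are made-up placeholders, not results from the paper.

```python
import numpy as np

def signed_rank_sums(errors_a, errors_b):
    """R+ / R- sums for a multiple-problem Wilcoxon signed-rank comparison.
    errors_a, errors_b: per-problem mean errors of algorithms A and B (lower is better).
    Returns (R+, R-): rank sums where A beats B and where B beats A."""
    d = np.asarray(errors_b, float) - np.asarray(errors_a, float)
    nz = d != 0                                   # zero differences are discarded
    ranks = np.argsort(np.argsort(np.abs(d[nz]))) + 1.0  # ranks 1..n (no ties assumed)
    r_plus = float(ranks[d[nz] > 0].sum())        # A better on these problems
    r_minus = float(ranks[d[nz] < 0].sum())       # A worse on these problems
    return r_plus, r_minus

# Hypothetical per-problem errors for two algorithms over four problems.
rp, rm = signed_rank_sums([1.0, 0.5, 2.0, 0.1], [3.0, 0.7, 1.5, 0.4])
```

The asymptotic p-value in the table is then obtained from the smaller of the two rank sums, e.g., via a statistics package's Wilcoxon signed-rank routine.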
Algorithm | N | Parameter |
---|---|---|
PSO | 100 | c1 = 2, c2 = 2, wmax = 0.9, wmin = 0.4 |
BSO | 100 | m = 5, preplace = 0.2, pone = 0.8, ponecenter = 0.4, ptwocenter = 0.5 |
MBSO | 100 | m = 5, preplace = 0.2, pone = 0.8, pr = 0.005, ponecenter = 0.4, ptwocenter = 0.5 |
BSO-OS | 100 | perce = 0.1, preplace = 0.2, pone = 0.8 |
RGBSO | 100 | m = 5, pone = 0.8, ponecenter = 0.4, ptwocenter = 0.5 |
IRGBSO | 100 | m = 5, pone = 0.8, ponecenter = 0.4, ptwocenter = 0.5, threshold = 10, F = 0.5 |
BSOLS | 100 | m = 5, pone = 0.8, ponecenter = 0.4, ptwocenter = 0.5, pe = 0.1, pl = 0.1, q1 = 0.13 |
ALBSO | 100 | m = 5, pone = 0.8, dc = 0.5, ponecenter = 0.4, ptwocenter = 0.5 |
RPBSO | 100 | m = 3, pone = 0.5, pr = 0.005 |
BSO-AL | 100 | m = 5, pone = 0.8, ponecenter = 0.4, ptwocenter = 0.5 |
BSONME | 100 | p1 = 0.2, p2 = 0.2, p3 = 0.8, n = 4, TH = 50 |
F | PSO | BSO | MBSO | BSO-OS | RGBSO | BSONME |
---|---|---|---|---|---|---|
F1 | 7.66 × 106 † | 1.75 × 106 † | 5.95 × 104 † | 1.69 × 106 † | 2.30 × 106 † | 2.76 × 104 |
F2 | 1.59 × 102 † | 8.28 × 103 † | 14.5 † | 9.95 × 103 † | 8.74 × 103 † | 7.23 × 10−6 |
F3 | 4.17 × 102 † | 1.71 × 104 † | 5.23 × 10−1 † | 5.71 × 103 † | 5.47 × 104 † | 8.80 × 10−2 |
† | 3 | 3 | 3 | 3 | 3 | \ |
− | 0 | 0 | 0 | 0 | 0 | \ |
≈ | 0 | 0 | 0 | 0 | 0 | \ |
F | IRGBSO | BSOLS | ALBSO | RPBSO | BSO-AL | BSONME |
---|---|---|---|---|---|---|
F1 | 1.30 × 108 † | 9.82 × 106 † | 5.50 × 108 † | 1.32 × 106 † | 7.10 × 107 † | 2.76 × 104 |
F2 | 3.49 × 109 † | 1.62 × 104 † | 3.31 × 1010 † | 5.73 × 103 † | 1.04 × 109 † | 7.23 × 10−6 |
F3 | 3.17 × 104 † | 2.95 × 104 † | 9.83 × 104 † | 1.96 × 103 † | 8.34 × 104 † | 8.80 × 10−2 |
† | 3 | 3 | 3 | 3 | 3 | \ |
− | 0 | 0 | 0 | 0 | 0 | \ |
≈ | 0 | 0 | 0 | 0 | 0 | \ |
F | PSO | BSO | MBSO | BSO-OS | RGBSO | BSONME |
---|---|---|---|---|---|---|
F4 | 1.61 × 102 † | 85.8 † | 4.72 † | 65.3 † | 83.9 † | 8.06 × 10−1 |
F5 | 20.9 † | 20.0 − | 20.9 † | 20.0 − | 20.0 − | 20.1 |
F6 | 11.0 − | 29.3 † | 3.24 − | 20.7 ≈ | 31.6 † | 20.8 |
F7 | 1.09 × 10−2 − | 5.26 × 10−4 − | 7.72 × 10−3 − | 2.49 × 10−3 − | 2.14 × 10−3 − | 4.14 × 10−2 |
F8 | 18.7 † | 1.39 × 102 † | 31.4 † | 12.9 † | 1.36 × 102 † | 1.63 |
F9 | 65.3 − | 1.63 × 102 † | 36.6 − | 1.19 × 102 † | 1.79 × 102 † | 76.2 |
F10 | 5.75 × 102 † | 4.27 × 103 † | 1.57 × 103 † | 1.63 × 102 † | 4.27 × 103 † | 80.1 |
F11 | 2.89 × 103 ≈ | 4.15 × 103 † | 3.02 × 103 ≈ | 3.20 × 103 † | 4.33 × 103 † | 2.82 × 103 |
F12 | 1.78 † | 2.01 × 10−2 − | 2.41 † | 1.66 × 10−2 − | 4.30 × 10−2 − | 2.62 × 10−1 |
F13 | 4.40 × 10−1 ≈ | 3.33 × 10−1 − | 2.49 × 10−1 − | 3.14 × 10−1 − | 3.54 × 10−1 − | 4.77 × 10−1 |
F14 | 2.99 × 10−1 ≈ | 2.18 × 10−1 − | 2.85 × 10−1 ≈ | 2.06 × 10−1 − | 2.10 × 10−1 − | 2.83 × 10−1 |
F15 | 7.87 − | 7.68 − | 2.98 − | 5.88 − | 15.3 − | 22.7 |
F16 | 11.1 ≈ | 12.6 † | 11.6 † | 11.2 † | 12.8 † | 11.0 |
† | 5 | 7 | 6 | 6 | 7 | \ |
− | 4 | 6 | 5 | 6 | 6 | \ |
≈ | 4 | 0 | 2 | 1 | 0 | \ |
F | IRGBSO | BSOLS | ALBSO | RPBSO | BSO-AL | BSONME |
---|---|---|---|---|---|---|
F4 | 5.17 × 102 † | 61.1 † | 4.45 × 103 † | 12.4 † | 2.12 × 102 † | 8.06 × 10−1 |
F5 | 20.3 † | 20.7 † | 21.1 † | 21.0 † | 20.7 † | 20.1 |
F6 | 25.0 † | 34.4 † | 37.8 † | 2.77 × 10−1 − | 37.9 † | 20.8 |
F7 | 37.1 † | 1.82 × 10−1 † | 3.19 × 102 † | 2.47 × 10−4 − | 5.83 † | 4.14 × 10−2 |
F8 | 1.80 × 102 † | 1.55 × 102 † | 2.71 × 102 † | 31.3 † | 1.58 × 102 † | 1.63 |
F9 | 2.05 × 102 † | 1.97 × 102 † | 3.06 × 102 † | 54.7 − | 2.03 × 102 † | 76.2 |
F10 | 3.73 × 103 † | 3.58 × 103 † | 6.51 × 103 † | 1.34 × 103 † | 3.89 × 103 † | 80.1 |
F11 | 4.12 × 103 † | 3.91 × 103 † | 6.95 × 103 † | 6.25 × 103 † | 4.30 × 103 † | 2.82 × 103 |
F12 | 5.84 × 10−1 † | 9.20 × 10−1 † | 2.98 † | 2.45 † | 9.19 × 10−1 † | 2.62 × 10−1 |
F13 | 8.33 × 10−1 † | 4.79 × 10−1 ≈ | 5.17 † | 3.46 × 10−1 − | 3.84 × 10−1 − | 4.77 × 10−1 |
F14 | 12.9 † | 3.32 × 10−1 ≈ | 1.28 × 102 † | 2.82 × 10−1 ≈ | 1.75 † | 2.83 × 10−1 |
F15 | 70.0 † | 22.2 ≈ | 1.17 × 104 † | 15.3 − | 2.81 × 104 † | 22.7 |
F16 | 11.8 † | 12.3 † | 13.1 † | 12.2 † | 12.9 † | 11.0 |
† | 13 | 10 | 13 | 7 | 12 | \ |
− | 0 | 0 | 0 | 5 | 1 | \ |
≈ | 0 | 3 | 0 | 1 | 0 | \ |
F | PSO | BSO | MBSO | BSO-OS | RGBSO | BSONME |
---|---|---|---|---|---|---|
F17 | 6.43 × 105 † | 1.21 × 105 † | 8.73 × 103 † | 8.87 × 104 † | 2.05 × 105 † | 4.19 × 103 |
F18 | 5.16 × 103 † | 1.99 × 103 ≈ | 3.79 × 103 ≈ | 1.88 × 103 ≈ | 1.60 × 103 ≈ | 1.51 × 103 |
F19 | 10.1 † | 18.7 † | 5.94 − | 13.8 † | 24.8 † | 9.40 |
F20 | 6.34 × 102 − | 1.04 × 104 † | 1.51 × 102 ≈ | 1.60 × 104 † | 2.42 × 104 † | 2.24 × 103 |
F21 | 1.47 × 105 † | 6.03 × 104 † | 1.01 × 104 † | 7.91 × 104 † | 1.01 × 105 † | 1.62 × 103 |
F22 | 2.57 × 102 − | 8.41 × 102 † | 1.61 × 102 − | 7.69 × 102 ≈ | 9.25 × 102 † | 6.67 × 102 |
† | 4 | 5 | 2 | 4 | 5 | \ |
− | 2 | 0 | 2 | 0 | 0 | \ |
≈ | 0 | 1 | 2 | 2 | 1 | \ |
F | IRGBSO | BSOLS | ALBSO | RPBSO | BSO-AL | BSONME |
---|---|---|---|---|---|---|
F17 | 2.59 × 106 † | 7.29 × 105 † | 3.45 × 107 † | 4.03 × 104 † | 7.18 × 106 † | 4.19 × 103 |
F18 | 1.52 × 106 † | 4.01 × 103 † | 4.18 × 108 † | 3.08 × 103 ≈ | 2.66 × 107 † | 1.51 × 103 |
F19 | 58.2 † | 21.1 † | 2.49 × 102 † | 5.13 − | 20.6 † | 9.40 |
F20 | 2.06 × 104 † | 6.82 × 103 † | 1.10 × 105 † | 2.92 × 103 † | 1.31 × 105 † | 2.24 × 103 |
F21 | 4.63 × 105 † | 2.73 × 105 † | 9.88 × 106 † | 1.25 × 104 † | 4.14 × 106 † | 1.62 × 103 |
F22 | 5.50 × 102 − | 8.86 × 102 † | 1.45 × 103 † | 64.0 − | 1.20 × 103 † | 6.67 × 102 |
† | 5 | 6 | 6 | 3 | 6 | \ |
− | 1 | 0 | 0 | 2 | 0 | \ |
≈ | 0 | 0 | 0 | 1 | 0 | \ |
F | PSO | BSO | MBSO | BSO-OS | RGBSO | BSONME |
---|---|---|---|---|---|---|
F23 | 3.16 × 102 † | 3.15 × 102 † | 3.15 × 102 † | 3.14 × 102 † | 2.00 × 102 ≈ | 2.00 × 102 |
F24 | 2.30 × 102 † | 2.51 × 102 † | 2.31 × 102 † | 2.31 × 102 † | 2.00 × 102 ≈ | 2.00 × 102 |
F25 | 2.08 × 102 † | 2.21 × 102 † | 2.04 × 102 † | 2.18 × 102 † | 2.00 × 102 ≈ | 2.00 × 102 |
F26 | 1.21 × 102 ≈ | 1.15 × 102 ≈ | 1.00 × 102 − | 1.73 × 102 † | 1.28 × 102 ≈ | 1.07 × 102 |
F27 | 5.84 × 102 † | 8.44 × 102 † | 4.22 × 102 † | 7.32 × 102 † | 2.00 × 102 ≈ | 2.00 × 102 |
F28 | 1.11 × 103 † | 4.35 × 103 † | 8.86 × 102 † | 3.30 × 103 † | 2.00 × 102 ≈ | 2.00 × 102 |
F29 | 7.92 × 105 † | 3.97 × 105 † | 1.34 × 103 † | 1.48 × 103 † | 2.00 × 102 ≈ | 2.00 × 102 |
F30 | 4.13 × 103 † | 8.62 × 103 † | 1.77 × 103 † | 3.13 × 103 † | 2.00 × 102 ≈ | 2.00 × 102 |
† | 7 | 7 | 7 | 8 | 0 | \ |
− | 0 | 0 | 1 | 0 | 0 | \ |
≈ | 1 | 1 | 0 | 0 | 8 | \ |
F | IRGBSO | BSOLS | ALBSO | RPBSO | BSO-AL | BSONME |
---|---|---|---|---|---|---|
F23 | 2.00 × 102 ≈ | 3.20 × 102 † | 5.03 × 102 † | 2.00 × 102 ≈ | 2.00 × 102 ≈ | 2.00 × 102 |
F24 | 2.00 × 102 ≈ | 2.58 × 102 † | 2.37 × 102 † | 2.00 × 102 † | 2.00 × 102 ≈ | 2.00 × 102 |
F25 | 2.00 × 102 ≈ | 2.21 × 102 † | 2.17 × 102 † | 2.00 × 102 ≈ | 2.00 × 102 ≈ | 2.00 × 102 |
F26 | 1.04 × 102 − | 1.67 × 102 † | 1.12 × 102 † | 1.10 × 102 ≈ | 1.76 × 102 † | 1.07 × 102 |
F27 | 2.00 × 102 ≈ | 1.02 × 103 † | 1.11 × 103 † | 2.86 × 102 † | 2.41 × 102 ≈ | 2.00 × 102 |
F28 | 2.00 × 102 ≈ | 1.58 × 103 † | 4.78 × 103 † | 2.00 × 102 ≈ | 2.00 × 102 ≈ | 2.00 × 102 |
F29 | 2.00 × 102 ≈ | 2.88 × 102 † | 3.23 × 105 † | 1.64 × 103 † | 2.19 × 105 † | 2.00 × 102 |
F30 | 2.00 × 102 ≈ | 1.21 × 104 † | 1.38 × 106 † | 9.72 × 102 † | 3.21 × 105 † | 2.00 × 102 |
† | 0 | 8 | 8 | 4 | 3 | \ |
− | 1 | 0 | 0 | 0 | 0 | \ |
≈ | 7 | 0 | 0 | 4 | 5 | \ |
VS | R+ | R– | Asymptotic p-Value |
---|---|---|---|
PSO | 385.0 | 80.0 | 0.00165 |
BSO | 440.0 | 25.0 | 0.000019 |
MBSO | 350.0 | 115.0 | 0.015222 |
BSO-OS | 433.0 | 32.0 | 0.000034 |
RGBSO | 366.5 | 68.5 | 0.001226 |
IRGBSO | 396.5 | 38.5 | 0.000104 |
BSOLS | 461.0 | 4.0 | 0.000002 |
ALBSO | 465.0 | 0.0 | 0.000002 |
RPBSO | 367.0 | 98.0 | 0.005491 |
BSO-AL | 455.0 | 10.0 | 0.000005 |
Algorithm | Ranking |
---|---|
BSONME | 2.9333 |
PSO | 5.65 |
BSO | 6.1833 |
MBSO | 3.9167 |
BSO-OS | 4.8167 |
RGBSO | 5.4333 |
IRGBSO | 6.6667 |
BSOLS | 7.7 |
ALBSO | 10.4667 |
RPBSO | 4.15 |
BSO-AL | 8.0833 |
 | PSO | BSO | MBSO | BSO-OS | RGBSO | IRGBSO | BSOLS | ALBSO | RPBSO | BSO-AL | BSONME |
---|---|---|---|---|---|---|---|---|---|---|---|
W/T/L | 1/5/24 | 2/2/26 | 7/4/19 | 3/3/24 | 1/9/20 | 1/7/22 | 0/3/27 | 0/0/30 | 7/5/18 | 0/4/26 | 11/7/12 |
OE | 20% | 13% | 37% | 20% | 33% | 27% | 10% | 0% | 40% | 13% | 60% |
Algorithm | Training Error (Mean) | Training Error (SD) | Testing Error (Mean) | Testing Error (SD) |
---|---|---|---|---|
PSO | 6.20 | 5.89 × 10−3 | 5.68 | 9.08 × 10−4 |
BSO | 6.20 | 1.27 × 10−2 | 5.68 | 3.85 × 10−3 |
MBSO | 6.20 | 3.95 × 10−4 | 5.68 | 4.39 × 10−6 |
BSO-OS | 6.21 | 1.59 × 10−2 | 5.68 | 6.98 × 10−3 |
RGBSO | 6.19 | 2.61 × 10−2 | 5.68 | 1.20 × 10−2 |
IRGBSO | 6.20 | 1.62 × 10−2 | 5.68 | 2.54 × 10−3 |
RPBSO | 6.20 | 5.80 × 10−4 | 5.68 | 6.84 × 10−6 |
BSONME | 6.20 | 2.50 × 10−3 | 5.68 | 9.90 × 10−5 |
Algorithm | Training Error (Mean) | Training Error (SD) | Testing Error (Mean) | Testing Error (SD) |
---|---|---|---|---|
PSO | 4.06 × 10−2 | 2.12 × 10−5 | 3.18 × 10−2 | 5.05 × 10−6 |
BSO | 4.06 × 10−2 | 5.30 × 10−5 | 3.19 × 10−2 | 3.34 × 10−5 |
MBSO | 4.06 × 10−2 | 2.29 × 10−7 | 3.18 × 10−2 | 7.51 × 10−8 |
BSO-OS | 4.06 × 10−2 | 7.31 × 10−5 | 3.19 × 10−2 | 4.41 × 10−5 |
RGBSO | 4.06 × 10−2 | 7.70 × 10−5 | 3.19 × 10−2 | 6.28 × 10−5 |
IRGBSO | 4.06 × 10−2 | 7.15 × 10−5 | 3.19 × 10−2 | 3.62 × 10−5 |
RPBSO | 4.06 × 10−2 | 1.24 × 10−6 | 3.18 × 10−2 | 5.91 × 10−7 |
BSONME | 4.06 × 10−2 | 1.67 × 10−7 | 3.18 × 10−2 | 6.42 × 10−8 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Li, W.; Luo, H.; Wang, L.; Jiang, Q.; Xu, Q. Enhanced Brain Storm Optimization Algorithm Based on Modified Nelder–Mead and Elite Learning Mechanism. Mathematics 2022, 10, 1303. https://doi.org/10.3390/math10081303