Locating the Parameters of RBF Networks Using a Hybrid Particle Swarm Optimization Method
Abstract
1. Introduction
- The vector $\vec{x}$ stands for the input pattern of Equation (1). The number of elements (features) in this vector is denoted as $d$.
- The vectors $\vec{c}_i,\ i = 1, \ldots, k$ are denoted as the center vectors.
- The vector $\vec{w}$ is considered as the output weight vector of the RBF network.
- The value $y\left(\vec{x}\right)$ represents the predicted value of the network for the pattern $\vec{x}$.
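The inline formulas of Equation (1) did not survive extraction; assuming the usual Gaussian form $y(\vec{x}) = \sum_{j=1}^{k} w_j \exp\left(-\|\vec{x}-\vec{c}_j\|^2/\sigma_j^2\right)$, the forward pass can be sketched as follows (function name and array layout are illustrative, not from the paper):

```python
import numpy as np

def rbf_output(x, centers, sigmas, weights):
    """Predicted value y(x) of an RBF network with Gaussian units.

    x: input pattern of dimension d; centers: (k, d) center vectors;
    sigmas: (k,) Gaussian widths; weights: (k,) output weights.
    """
    # Gaussian activation of every processing unit
    phi = np.exp(-np.sum((centers - x) ** 2, axis=1) / (sigmas ** 2))
    # weighted sum of the k unit outputs
    return float(np.dot(weights, phi))

# Tiny example: a single unit centered exactly at the input gives phi = 1,
# so the prediction equals the output weight.
y = rbf_output(np.array([0.0, 0.0]),
               centers=np.array([[0.0, 0.0]]),
               sigmas=np.array([1.0]),
               weights=np.array([2.0]))
# y == 2.0
```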
2. Method Description
- In the first phase, the k centers, as well as the associated variances, are calculated through the K-means algorithm [50]. A typical formulation of the K-means algorithm is outlined in Algorithm 1.
- In the second phase, the weight vector is estimated by solving a linear system of equations:
- (a) Set $\Phi_{ij} = \phi_j\left(\vec{x}_i\right)$, the output of the $j$-th Gaussian unit for the training pattern $\vec{x}_i$;
- (b) Set $\vec{w} = \left(w_1, \ldots, w_k\right)$, the vector of output weights;
- (c) Set $\vec{t} = \left(t_1, \ldots, t_M\right)$, the vector of target values for the $M$ training patterns;
- (d) The system to be solved is identified as: $\Phi^{T} \Phi \vec{w} = \Phi^{T} \vec{t}$.
Algorithm 1: The K-means algorithm.
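The two-phase training scheme above can be sketched as follows. This is a minimal illustration, not the paper's implementation: a plain K-means for phase one, and a least-squares solve of the system $\Phi^T\Phi\vec{w} = \Phi^T\vec{t}$ for phase two (here solved directly as $\Phi\vec{w} \approx \vec{t}$, which is numerically equivalent). The per-center variance estimate (mean intra-cluster distance) is an assumption.

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain K-means: returns k centers and a per-center width (sigma)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign every pattern to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    # width of each unit: mean distance of its patterns to the center
    sigmas = np.array([np.linalg.norm(X[labels == j] - centers[j], axis=1).mean()
                       if np.any(labels == j) else 1.0 for j in range(k)])
    return centers, np.maximum(sigmas, 1e-6)

def train_rbf(X, t, k):
    """Phase 1: K-means for centers/sigmas; phase 2: least squares for weights."""
    centers, sigmas = kmeans(X, k)
    # design matrix: Gaussian unit j evaluated at pattern i
    Phi = np.exp(-np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) ** 2
                 / sigmas[None, :] ** 2)
    # least-squares solution of Phi w ~= t (the normal equations of step (d))
    w, *_ = np.linalg.lstsq(Phi, t, rcond=None)
    return centers, sigmas, w
```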
2.1. Preliminaries
- The comparison of two intervals $[a_1, b_1]$ and $[a_2, b_2]$ is performed through a dedicated comparison function.
- The function of Equation (3) is modified to an interval-valued one, calculated with the procedure given in Algorithm 2.
- Every center has $d$ variables, which means $k \times d$ variables.
- For every center, a separate variance value is used for the Gaussian processing unit, which means $k$ additional variables.
- The output weight vector also has $k$ variables, giving $k \times (d + 2)$ variables in total.
Algorithm 2: Fitness calculation for the modified PSO algorithm. The fitness of a given particle $g$ is calculated using a set of random samples drawn from its intervals.
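The body of Algorithm 2 was lost in extraction. A common interval-fitness scheme consistent with the surrounding text (draw $N_s$ random parameter vectors inside the particle's intervals, evaluate the training error for each, and report the interval of observed errors) can be sketched as below; the function name `interval_fitness` and the $[\min, \max]$ aggregation are assumptions, not confirmed details of the paper.

```python
import random

def interval_fitness(particle, error_fn, n_samples=50, seed=0):
    """Fitness of a particle whose components are intervals [lo, hi].

    Draws n_samples random parameter vectors inside the intervals,
    evaluates the training error of each, and returns the interval
    [min, max] of the observed errors.
    """
    rng = random.Random(seed)
    errs = []
    for _ in range(n_samples):
        # one random parameter vector inside the particle's box
        point = [lo + rng.random() * (hi - lo) for lo, hi in particle]
        errs.append(error_fn(point))
    return min(errs), max(errs)

# Toy error function: squared distance from the origin
lo, hi = interval_fitness([(-1.0, 1.0), (-1.0, 1.0)],
                          lambda p: p[0] ** 2 + p[1] ** 2)
# lo <= hi always holds for this scheme
```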
2.2. The Proposed PSO Algorithm
- Set $N_p$ as the number of particles.
- Set the normalization factor $B$.
- Set $k$, the number of weights of the RBF network.
- Set $iter_{\max}$, the maximum number of generations allowed.
- Set $N_s$, the number of random samples that will be used in the fitness calculation algorithm.
- Set $f_{best} = \infty$, the fitness of the best located particle $p_{best}$.
- Construct the domain range $S$, as obtained from the previous two algorithms.
- Initialize the particles. Each particle is considered as a set of intervals randomly initialized in $S$. The layout of each particle is graphically presented in Figure 2.
- For $i = 1, \ldots, N_p$, do:
- (a) Calculate the fitness $f_i$ of particle $i$ using the procedure outlined in Algorithm 2.
- (b) If $f_i < f_{best}$, then set $p_{best} = x_i$ and $f_{best} = f_i$.
- (c) Set $b_i = x_i$ as the best located position for particle $i$, with $f_{b_i} = f_i$ the associated fitness value.
- (d) For $j = 1, \ldots, n$, do:
- i. Set $w_j$ as the width of interval $j$.
- ii. Set $v_{ij} = r \cdot w_j$, with $r$ a random number in $[0, 1]$. The velocity is initialized to a small sub-interval of the range of values for the corresponding parameter in order to avoid, as much as possible, excessive values for the velocity, which would cause the particles to move out of their value range very quickly, thus making the optimization process difficult.
- (e) EndFor.
- EndFor.
- Set iter = 0.
- Calculate the inertia value as $\omega = \omega_{\max} - \frac{iter}{iter_{\max}} \left(\omega_{\max} - \omega_{\min}\right)$, where common values for these parameters are $\omega_{\max} = 0.9$ and $\omega_{\min} = 0.4$. Many inertia calculations have appeared in the relevant literature, such as constant inertia [58], linearly decreasing inertia [59], exponential inertia [60], random inertia calculation [61], dynamic inertia [62], and fuzzy inertia calculation [63]. The present method of calculating the inertia was chosen because it decreases linearly with time: for large values of the inertia, it allows a wider search of the search space, while for low values, it allows a more focused search.
- For $i = 1, \ldots, N_p$, do:
- (a) Calculate the new velocity $v_i = \omega v_i + c_1 r_1 \left(b_i - x_i\right) + c_2 r_2 \left(p_{best} - x_i\right)$, where $r_1, r_2$ are random numbers in $[0, 1]$, and the constant values $c_1$ and $c_2$ stand for the cognitive and the social parameters, correspondingly. Usually, the values for $c_1$ and $c_2$ are in $[1, 2]$.
- (b) Normalize the velocity as $v_i = v_i / B$, where the normalization factor $B$ is a positive number with $B \geq 1$.
- (c) Update the position $x_i = x_i + v_i$.
- (d) Calculate the fitness $f_i$ of particle $i$.
- (e) If $f_i < f_{b_i}$, then set $b_i = x_i$ and $f_{b_i} = f_i$.
- (f) If $f_i < f_{best}$, then set $p_{best} = x_i$ and $f_{best} = f_i$.
- EndFor.
- Set iter = iter+1.
- If $iter < iter_{\max}$, goto Step 13.
- Else, return $S^{*}$, the domain range for the best particle $p_{best}$.
Algorithm 3: The algorithm used to locate the initial values for the parameter intervals.
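The PSO loop of Section 2.2 can be sketched as follows. This is a simplified, real-valued reading of the steps (the paper's particles are interval-valued); the linearly decreasing inertia, the cognitive/social velocity update, and the division of the velocity by the factor $B$ follow the steps above, but the function name `pso` and the interpretation of the normalization step are assumptions.

```python
import random

def pso(error_fn, bounds, n_particles=20, iters=100, c1=1.0, c2=1.0,
        B=100.0, w_max=0.9, w_min=0.4, seed=0):
    """Sketch of the PSO loop: linearly decreasing inertia,
    cognitive/social velocity update, velocity normalization by B."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[lo + rng.random() * (hi - lo) for lo, hi in bounds]
           for _ in range(n_particles)]
    # velocities start as a small fraction of each interval's width
    vel = [[rng.random() * (hi - lo) / 20.0 for lo, hi in bounds]
           for _ in range(n_particles)]
    best = [p[:] for p in pos]             # personal best positions b_i
    best_f = [error_fn(p) for p in pos]    # personal best fitness f_{b_i}
    g = min(range(n_particles), key=lambda i: best_f[i])
    gbest, gbest_f = best[g][:], best_f[g]
    for it in range(iters):
        # inertia decreases linearly from w_max to w_min
        w = w_max - (it / iters) * (w_max - w_min)
        for i in range(n_particles):
            for j in range(dim):
                r1, r2 = rng.random(), rng.random()
                v = (w * vel[i][j] + c1 * r1 * (best[i][j] - pos[i][j])
                     + c2 * r2 * (gbest[j] - pos[i][j]))
                vel[i][j] = v / B          # normalization step
                pos[i][j] += vel[i][j]
            f = error_fn(pos[i])
            if f < best_f[i]:              # update personal best
                best[i], best_f[i] = pos[i][:], f
                if f < gbest_f:            # update global best
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f
```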
2.3. Optimization of Parameters through Genetic Algorithm
- Initialization step:
- (a) Set $N_c$ as the number of chromosomes. Every chromosome is coded as in the case of PSO, using the scheme of Figure 2.
- (b) Set $iter_{\max}$ as the maximum number of generations allowed.
- (c) Set $k$ as the number of weights of the RBF network.
- (d) Obtain the domain range $S$ from the procedure of Section 2.2.
- (e) Initialize the chromosomes randomly in $S$.
- (f) Define the selection rate $p_s$.
- (g) Define the mutation rate $p_m$.
- (h) Set iter = 0.
- Evaluation step: For every chromosome $g$, calculate the associated fitness value $f_g$.
- Genetic operations step: Perform the genetic operations of selection, crossover, and mutation.
- (a) Selection procedure: First, the population of chromosomes is sorted according to the associated fitness values. The best $\left(1 - p_s\right) \times N_c$ chromosomes are copied unchanged to the next generation, while the rest are replaced by offspring constructed through the crossover procedure. During the selection step, a series of mating pairs is chosen using the well-known procedure of tournament selection for each parent.
- (b) Crossover procedure: For each pair of chosen parents, two new offspring are constructed.
- (c) Mutation procedure: For every element of each chromosome, pick a random number $r \in [0, 1]$. If $r \leq p_m$, then alter the corresponding element randomly.
- Termination check step:
- (a) Set iter = iter + 1.
- (b) If the termination criteria hold, then terminate; else, goto the evaluation step.
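The GA refinement phase can be sketched as below. The elitism fraction $(1-p_s)$, tournament selection, and per-element mutation follow the steps above; since the paper's crossover steps were lost in extraction, a simple uniform mix of parent elements is substituted here, and the function name `genetic_min` is illustrative.

```python
import random

def genetic_min(error_fn, bounds, n_chrom=20, iters=50,
                p_select=0.9, p_mut=0.05, seed=0):
    """Sketch of the refinement GA: elitism, tournament selection,
    uniform crossover (a stand-in), and per-element random mutation."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[lo + rng.random() * (hi - lo) for lo, hi in bounds]
           for _ in range(n_chrom)]

    def tournament(fits, size=4):
        # winner of a small random tournament (lowest error)
        cand = rng.sample(range(n_chrom), size)
        return min(cand, key=lambda i: fits[i])

    for _ in range(iters):
        fits = [error_fn(c) for c in pop]
        order = sorted(range(n_chrom), key=lambda i: fits[i])
        elite = max(int((1.0 - p_select) * n_chrom), 1)
        nxt = [pop[i][:] for i in order[:elite]]       # copied unchanged
        while len(nxt) < n_chrom:
            a, b = pop[tournament(fits)], pop[tournament(fits)]
            child = [ai if rng.random() < 0.5 else bi  # uniform mix
                     for ai, bi in zip(a, b)]
            for j in range(dim):                       # mutation
                if rng.random() < p_mut:
                    lo, hi = bounds[j]
                    child[j] = lo + rng.random() * (hi - lo)
            nxt.append(child)
        pop = nxt
    fits = [error_fn(c) for c in pop]
    i = min(range(n_chrom), key=lambda j: fits[j])
    return pop[i], fits[i]
```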
3. Experiments
- The UCI dataset repository, https://archive.ics.uci.edu/ml/index.php (accessed on 5 January 2023).
- The Keel repository, https://sci2s.ugr.es/keel/datasets.php (accessed on 5 January 2023) [66].
3.1. Experimental Datasets
- Appendicitis dataset, a medical dataset suggested in [67].
- Australian dataset [68], an economic dataset.
- Balance dataset [69], used for the prediction of psychological states.
- Bands dataset, a dataset related to printing problems [72].
- Dermatology dataset [73], which is a medical dataset.
- Hayes-roth dataset [74].
- Heart dataset [75], a medical dataset.
- HouseVotes dataset [76].
- Liverdisorder dataset [79], a medical dataset about liver disorders.
- Lymography dataset [80].
- Mammographic dataset [81], which is a dataset about breast cancer.
- Parkinsons dataset, a medical dataset about Parkinson’s Disease (PD) [82].
- Pima dataset, a medical dataset [83].
- Popfailures dataset [84], a dataset about climate.
- Spiral dataset: The spiral artificial dataset contains 1000 two-dimensional examples that belong to two classes (500 examples each). The data of the two classes were created using two interleaved parametric spiral formulas.
- Regions2 dataset, described in [85].
- Saheart dataset [86], which is related to heart diseases.
- Segment dataset [87], which is related to image processing.
- Wdbc dataset [88], which is related to breast tumors.
- Eeg dataset. As a real-world example, an EEG dataset described in [91] was used here. The classification datasets derived from it are denoted as Z_F_S, ZONF_S, and ZO_NF_S.
- Zoo dataset [92], used for the classification of animals.
- Abalone dataset [93].
- Airfoil dataset, a dataset from NASA related to aerodynamic and acoustic tests [94].
- Baseball dataset, a dataset used to predict the points scored by baseball players.
- BK dataset [95], used to estimate the points scored per minute in a basketball game.
- BL dataset; this dataset is related to an experiment on the effects of machine adjustments on the time to count bolts.
- Concrete dataset, related to civil engineering [96].
- Dee dataset, used to predict the daily average price of electric energy in Spain.
- Diabetes dataset, a medical dataset.
- FA dataset, related to fat measurements.
- Housing dataset, described in [97].
- MB dataset, a statistics dataset [95].
- MORTGAGE dataset, which contains economic data.
- NT dataset, derived from [98].
- PY dataset (the Pyrimidines problem) [99].
- Quake dataset, which contains data from earthquakes [100].
- Treasure dataset, which contains economic data.
- Wankara dataset, which is about weather measurements.
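The two-spiral dataset listed above can be generated along the following lines. The exact formulas in the paper did not survive extraction, so the common parametrization $x = t\cos t$, $y = t\sin t$ with the second class mirrored is assumed here; only the class sizes (500 examples each) are taken from the text.

```python
import math
import random

def make_spiral(n_per_class=500):
    """Two interleaved spirals: class 0 follows (t*cos t, t*sin t),
    class 1 is the mirrored spiral (-t*cos t, -t*sin t)."""
    data = []
    for i in range(n_per_class):
        t = 0.5 + 4.0 * math.pi * i / n_per_class  # angle parameter
        data.append(((t * math.cos(t), t * math.sin(t)), 0))
        data.append(((-t * math.cos(t), -t * math.sin(t)), 1))
    return data

points = make_spiral()
# 1000 two-dimensional examples, 500 per class
```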
3.2. Experimental Results
- The column NN-GENETIC denotes the application of a genetic algorithm to an artificial neural network with 10 hidden nodes. The parameters of the genetic algorithm are the same as in the second phase of the proposed method.
- The column RBF-KMEANS denotes the classic training method for RBF networks by estimating centers and variances through K-means and the output weights by solving a linear system of equations.
- The column IRBF-100 denotes the application of the current method with the associated parameter set to 100.
- The column IRBF-1000 denotes the application of the current method with the associated parameter set to 1000.
- In both tables, an extra line denoted AVERAGE is added, showing the mean error for each method. The number in parentheses in this line shows how many times the corresponding method achieved the best result.
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Mjahed, M. The use of clustering techniques for the classification of high energy physics data. Nucl. Instrum. Methods Phys. Res. Sect. A 2006, 559, 199–202. [Google Scholar] [CrossRef]
- Andrews, M.; Paulini, M.; Gleyzer, S.; Poczos, B. End-to-End Event Classification of High-Energy Physics Data. J. Phys. 2018, 1085, 42022. [Google Scholar] [CrossRef]
- He, P.; Xu, C.J.; Liang, Y.Z.; Fang, K.T. Improving the classification accuracy in chemistry via boosting technique. Chemom. Intell. Lab. Syst. 2004, 70, 39–46. [Google Scholar] [CrossRef]
- Aguiar, J.A.; Gong, M.L.; Tasdizen, T. Crystallographic prediction from diffraction and chemistry data for higher throughput classification using machine learning. Comput. Mater. Sci. 2020, 173, 109409. [Google Scholar] [CrossRef]
- Kaastra, I.; Boyd, M. Designing a neural network for forecasting financial and economic time series. Neurocomputing 1996, 10, 215–236. [Google Scholar] [CrossRef]
- Hafezi, R.; Shahrabi, J.; Hadavandi, E. A bat-neural network multi-agent system (BNNMAS) for stock price prediction: Case study of DAX stock price. Appl. Soft Comput. 2015, 29, 196–210. [Google Scholar] [CrossRef]
- Yadav, S.S.; Jadhav, S.M. Deep convolutional neural network based medical image classification for disease diagnosis. J. Big Data 2019, 6, 113. [Google Scholar] [CrossRef] [Green Version]
- Qing, L.; Linhong, W.; Xuehai, D. A Novel Neural Network-Based Method for Medical Text Classification. Future Internet 2019, 11, 255. [Google Scholar] [CrossRef] [Green Version]
- Park, J.; Sandberg, I.W. Universal Approximation Using Radial-Basis-Function Networks. Neural Comput. 1991, 3, 246–257. [Google Scholar] [CrossRef]
- Ghosh, J.; Nag, A. An Overview of Radial Basis Function Networks. In Radial Basis Function Networks 2. Studies in Fuzziness and Soft Computing; Howlett, R.J., Jain, L.C., Eds.; Physica: Heidelberg, Germany, 2001; Volume 67. [Google Scholar]
- Nam, M.-D.; Thanh, T.-C. Numerical solution of differential equations using multiquadric radial basis function networks. Neural Netw. 2001, 14, 185–199. [Google Scholar]
- Mai-Duy, N. Solving high order ordinary differential equations with radial basis function networks. Int. J. Numer. Meth. Eng. 2005, 62, 824–852. [Google Scholar] [CrossRef]
- Laoudias, C.; Kemppi, P.; Panayiotou, C.G. Localization Using Radial Basis Function Networks and Signal Strength Fingerprints. In Proceedings of the WLAN, GLOBECOM 2009—2009 IEEE Global Telecommunications Conference, Honolulu, HI, USA, 30 November–4 December 2009; pp. 1–6. [Google Scholar]
- Azarbad, M.; Hakimi, S.; Ebrahimzadeh, A. Automatic recognition of digital communication signal. Int. J. Energy 2012, 3, 21–33. [Google Scholar]
- Teng, P. Machine-learning quantum mechanics: Solving quantum mechanics problems using radial basis function networks. Phys. Rev. E 2018, 98, 33305. [Google Scholar] [CrossRef] [Green Version]
- Jovanović, R.; Sretenovic, A. Ensemble of radial basis neural networks with K-means clustering for heating energy consumption prediction. Fme Trans. 2017, 45, 51–57. [Google Scholar] [CrossRef] [Green Version]
- Yu, D.L.; Gomm, J.B.; Williams, D. Sensor fault diagnosis in a chemical process via RBF neural networks. Control. Eng. Pract. 1999, 7, 49–55. [Google Scholar] [CrossRef]
- Shankar, V.; Wright, G.B.; Fogelson, A.L.; Kirby, R.M. A radial basis function (RBF) finite difference method for the simulation of reaction–diffusion equations on stationary platelets within the augmented forcing method. Int. J. Numer. Meth. Fluids 2014, 75, 1–22. [Google Scholar] [CrossRef] [Green Version]
- Shen, W.; Guo, X.; Wu, C.; Wu, D. Forecasting stock indices using radial basis function neural networks optimized by artificial fish swarm algorithm. Knowl.-Based Syst. 2011, 24, 378–385. [Google Scholar] [CrossRef]
- Momoh, J.A.; Reddy, S.S. Combined Economic and Emission Dispatch using Radial Basis Function. In Proceedings of the 2014 IEEE PES General Meeting Conference & Exposition, National Harbor, MD, USA, 27–31 July 2014; pp. 1–5. [Google Scholar]
- Sohrabi, P.; Shokri, B.J.; Dehghani, H. Predicting coal price using time series methods and combination of radial basis function (RBF) neural network with time series. Miner. Econ. 2021, 1–10. [Google Scholar] [CrossRef]
- Ravale, U.; Marathe, N.; Padiya, P. Feature Selection Based Hybrid Anomaly Intrusion Detection System Using K Means and RBF Kernel Function. Procedia Comput. Sci. 2015, 45, 428–435. [Google Scholar] [CrossRef] [Green Version]
- Lopez-Martin, M.; Sanchez-Esguevillas, A.; Arribas, J.I.; Carro, B. Network Intrusion Detection Based on Extended RBF Neural Network With Offline Reinforcement Learning. IEEE Access 2021, 9, 153153–153170. [Google Scholar] [CrossRef]
- Yu, H.T.; Xie, T.; Paszczynski, S.; Wilamowski, B.M. Advantages of Radial Basis Function Networks for Dynamic System Design. IEEE Trans. Ind. Electron. 2011, 58, 5438–5450. [Google Scholar] [CrossRef]
- Yokota, R.; Barba, L.A.; Knepley, M.G. PetRBF—A parallel O(N) algorithm for radial basis function interpolation with Gaussians. Comput. Methods Appl. Mech. Eng. 2010, 199, 1793–1804. [Google Scholar] [CrossRef]
- Lu, C.; Ma, N.; Wang, Z. Fault detection for hydraulic pump based on chaotic parallel RBF network. EURASIP J. Adv. Signal Process. 2011, 2011, 49. [Google Scholar] [CrossRef] [Green Version]
- Kuncheva, L.I. Initializing of an RBF network by a genetic algorithm. Neurocomputing 1997, 14, 273–288. [Google Scholar] [CrossRef]
- Ros, F.; Pintore, M.; Deman, A.; Chrétien, J.R. Automatical initialization of RBF neural networks. Chemom. Intell. Lab. Syst. 2007, 87, 26–32. [Google Scholar] [CrossRef]
- Wang, D.; Zeng, X.J.; Keane, J.A. A clustering algorithm for radial basis function neural network initialization. Neurocomputing 2012, 77, 144–155. [Google Scholar] [CrossRef]
- Ricci, E.; Perfetti, R. Improved pruning strategy for radial basis function networks with dynamic decay adjustment. Neurocomputing 2006, 69, 1728–1732. [Google Scholar] [CrossRef]
- Huang, G.-B.; Saratchandran, P.; Sundararajan, N. A generalized growing and pruning RBF (GGAP-RBF) neural network for function approximation. IEEE Trans. Neural Netw. 2005, 16, 57–67. [Google Scholar] [CrossRef] [PubMed]
- Bortman, M.; Aladjem, M. A Growing and Pruning Method for Radial Basis Function Networks. IEEE Trans. Neural. Netw. 2009, 20, 1039–1045. [Google Scholar] [CrossRef] [PubMed]
- Karayiannis, N.B.; Randolph-Gips, M.M. On the construction and training of reformulated radial basis function neural networks. IEEE Trans. Neural Netw. 2003, 14, 835–846. [Google Scholar] [CrossRef] [Green Version]
- Peng, J.X.; Li, K.; Huang, D.S. A Hybrid Forward Algorithm for RBF Neural Network Construction. IEEE Trans. Neural Netw. 2006, 17, 1439–1451. [Google Scholar] [CrossRef]
- Du, D.; Li, K.; Fei, M. A fast multi-output RBF neural network construction method. Neurocomputing 2010, 73, 2196–2202. [Google Scholar] [CrossRef]
- Marini, F.; Walczak, B. Particle swarm optimization (PSO). A tutorial. Chemom. Intell. Lab. Syst. 2015, 149, 153–165. [Google Scholar] [CrossRef]
- Liu, B.; Wang, L.; Jin, Y.H. An Effective PSO-Based Memetic Algorithm for Flow Shop Scheduling. IEEE Trans. Syst. Man Cybern. Part B 2007, 37, 18–27. [Google Scholar] [CrossRef] [PubMed]
- Yang, J.; He, L.; Fu, S. An improved PSO-based charging strategy of electric vehicles in electrical distribution grid. Appl. Energy 2014, 128, 82–92. [Google Scholar] [CrossRef]
- Mistry, K.; Zhang, L.; Neoh, S.C.; Lim, C.P.; Fielding, B. A Micro-GA Embedded PSO Feature Selection Approach to Intelligent Facial Emotion Recognition. IEEE Trans. Cybern. 2017, 47, 1496–1509. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Han, S.; Shan, X.; Fu, J.; Xu, W.; Mi, H. Industrial robot trajectory planning based on improved pso algorithm. J. Phys. Conf. Ser. 2021, 1820, 12185. [Google Scholar] [CrossRef]
- Floudas, C.A.; Gounaris, C.E. A review of recent advances in global optimization. J. Glob. Optim. 2009, 45, 3–38. [Google Scholar] [CrossRef]
- Goldberg, D. Genetic Algorithms. In Search, Optimization and Machine Learning; Addison-Wesley Publishing Company: Reading, MA, USA, 1989. [Google Scholar]
- Michalewicz, Z. Genetic Algorithms + Data Structures = Evolution Programs; Springer: Berlin, Germany, 1996. [Google Scholar]
- Grady, S.A.; Hussaini, M.Y.; Abdullah, M.M. Placement of wind turbines using genetic algorithms. Renew. Energy 2005, 30, 259–270. [Google Scholar] [CrossRef]
- Agarwal, V.; Bhanot, S. Radial basis function neural network-based face recognition using firefly algorithm. Neural. Comput. Appl. 2018, 30, 2643–2660. [Google Scholar] [CrossRef]
- Jiang, S.; Lu, C.; Zhang, S.; Lu, X.; Tsai, S.-B.; Wang, C.-K.; Gao, Y.; Shi, Y.; Lee, C.-H. Prediction of Ecological Pressure on Resource-Based Cities Based on an RBF Neural Network Optimized by an Improved ABC Algorithm. IEEE Access 2019, 7, 47423–47436. [Google Scholar] [CrossRef]
- Wang, H.; Wang, W.; Zhou, X.; Sun, H.; Zhao, J.; Yu, X.; Cui, Z. Firefly algorithm with neighborhood attraction. Inf. Sci. 2017, 382–383, 374–387. [Google Scholar]
- Khan, I.U.; Aslam, N.; Alshehri, R.; Alzahrani, S.; Alghamdi, M.; Almalki, A.; Balabeed, M. Cervical Cancer Diagnosis Model Using Extreme Gradient Boosting and Bioinspired Firefly Optimization. Sci. Program. 2021, 2021, 5540024. [Google Scholar] [CrossRef]
- Zivkovic, M.; Bacanin, N.; Antonijevic, M.; Nikolic, B.; Kvascev, G.; Marjanovic, M.; Savanovic, N. Hybrid CNN and XGBoost Model Tuned by Modified Arithmetic Optimization Algorithm for COVID-19 Early Diagnostics from X-ray Images. Electronics 2022, 11, 3798. [Google Scholar] [CrossRef]
- MacQueen, J. Some methods for classification and analysis of multivariate observations. In Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, Berkeley, CA, USA, 21 June–18 July 1965; Volume 1, pp. 281–297. [Google Scholar]
- Hansen, E.; Walster, G.W. Global Optimization Using Interval Analysis; Marcel Dekker Inc.: New York, NY, USA, 2004. [Google Scholar]
- Markót, M.; Fernández, J.; Casado, L.G.; Csendes, T. New interval methods for constrained global optimization. Math. Program. 2006, 106, 287–318. [Google Scholar] [CrossRef]
- Žilinskas, A.; Žilinskas, J. Interval Arithmetic Based Optimization in Nonlinear Regression. Informatica 2010, 21, 149–158. [Google Scholar] [CrossRef]
- Schnepper, C.A.; Stadtherr, M.A. Robust process simulation using interval methods. Comput. Chem. Eng. 1996, 20, 187–199. [Google Scholar] [CrossRef] [Green Version]
- Carreras, C.; Walker, I.D. Interval methods for fault-tree analysis in robotics. IEEE Trans. Reliab. 2001, 50, 3–11. [Google Scholar] [CrossRef]
- Serguieva, A.; Hunte, J. Fuzzy interval methods in investment risk appraisal. Fuzzy Sets Syst. 2004, 142, 443–466. [Google Scholar] [CrossRef]
- Poli, R.; Kennedy, J.; Blackwell, T. Particle swarm optimization: An overview. Swarm Intell. 2007, 1, 33–57. [Google Scholar] [CrossRef]
- Trelea, I.C. The particle swarm optimization algorithm: Convergence analysis and parameter selection. Inf. Process. Lett. 2003, 85, 317–325. [Google Scholar] [CrossRef]
- Shi, Y.; Eberhart, R.C. Empirical study of particle swarm optimization. In Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406), Washington, DC, USA, 6–9 July 1999; Volume 3, pp. 1945–1950. [Google Scholar]
- Borowska, B. Exponential Inertia Weight in Particle Swarm Optimization. In Information Systems Architecture and Technology: Proceedings of 37th International Conference on Information Systems Architecture and Technology—ISAT 2016—Part IV; Wilimowska, Z., Borzemski, L., Grzech, A., Świątek, J., Eds.; Springer: Cham, Switzerland, 2017; Volume 524, p. 524. [Google Scholar]
- Zhang, L.; Yu, H.; Hu, S. A New Approach to Improve Particle Swarm Optimization. In Genetic and Evolutionary Computation—GECCO 2003; Springer: Berlin/Heidelberg, Germany, 2003; Volume 2723. [Google Scholar]
- Borowska, B. Dynamic Inertia Weight in Particle Swarm Optimization. In Advances in Intelligent Systems and Computing II. CSIT 2017; Shakhovska, N., Stepashko, V., Eds.; Springer: Cham, Switzerland, 2018; Volume 689. [Google Scholar]
- Shi, Y.; Eberhart, R.C. Fuzzy adaptive particle swarm optimization. In Proceedings of the 2001 Congress on Evolutionary Computation (IEEE Cat. No.01TH8546), Seoul, Republic of Korea, 27–30 May 2001; Volume 1, pp. 101–106. [Google Scholar]
- Kaelo, P.; Ali, M.M. Integrated crossover rules in real coded genetic algorithms. Eur. J. Oper. Res. 2007, 176, 60–76. [Google Scholar] [CrossRef]
- Tsoulos, I.G. Modifications of real code genetic algorithm for global optimization. Appl. Math. Comput. 2008, 203, 598–607. [Google Scholar] [CrossRef]
- Alcalá-Fdez, J.; Fernandez, A.; Luengo, J.; Derrac, J.; García, S.; Sánchez, L.; Herrera, F. KEEL Data-Mining Software Tool: Data Set Repository, Integration of Algorithms and Experimental Analysis Framework. J. Mult. Valued Log. Soft Comput. 2011, 17, 255–287. [Google Scholar]
- Weiss, S.M.; Kulikowski, C.A. Computer Systems That Learn: Classification and Prediction Methods from Statistics, Neural Nets, Machine Learning and Expert Systems; Morgan Kaufmann Publishers Inc.: San Mateo, CA, USA, 1991. [Google Scholar]
- Quinlan, J.R. Simplifying Decision Trees. Int. J. Man-Mach. Stud. 1987, 27, 221–234. [Google Scholar] [CrossRef] [Green Version]
- Shultz, T.; Mareschal, D.; Schmidt, W. Modeling Cognitive Development on Balance Scale Phenomena. Mach. Learn. 1994, 16, 59–88. [Google Scholar] [CrossRef] [Green Version]
- Zhou, Z.H.; Jiang, Y. NeC4.5: Neural ensemble based C4.5. IEEE Trans. Knowl. Data Eng. 2004, 16, 770–773. [Google Scholar] [CrossRef]
- Setiono, R.; Leow, W.K. FERNN: An Algorithm for Fast Extraction of Rules from Neural Networks. Appl. Intell. 2000, 12, 15–25. [Google Scholar] [CrossRef]
- Evans, B.; Fisher, D. Overcoming process delays with decision tree induction. IEEE Expert. 1994, 9, 60–66. [Google Scholar] [CrossRef]
- Demiroz, G.; Govenir, H.A.; Ilter, N. Learning Differential Diagnosis of Eryhemato-Squamous Diseases using Voting Feature Intervals. Artif. Intell. Med. 1998, 13, 147–165. [Google Scholar]
- Hayes-Roth, B.; Hayes-Roth, B.F. Concept learning and the recognition and classification of exemplars. J. Verbal Learning Verbal Behav. 1977, 16, 321–338. [Google Scholar] [CrossRef]
- Kononenko, I.; Šimec, E.; Robnik-Šikonja, M. Overcoming the Myopia of Inductive Learning Algorithms with RELIEFF. Appl. Intell. 1997, 7, 39–55. [Google Scholar] [CrossRef]
- French, R.M.; Chater, N. Using noise to compute error surfaces in connectionist networks: A novel means of reducing catastrophic forgetting. Neural Comput. 2002, 14, 1755–1769. [Google Scholar] [CrossRef] [PubMed]
- Dy, J.G.; Brodley, C.E. Feature Selection for Unsupervised Learning. J. Mach. Learn. Res. 2004, 5, 845–889. [Google Scholar]
- Perantonis, S.J.; Virvilis, V. Input Feature Extraction for Multilayered Perceptrons Using Supervised Principal Component Analysis. Neural Process. Lett. 1999, 10, 243–252. [Google Scholar] [CrossRef]
- Garcke, J.; Griebel, M. Classification with sparse grids using simplicial basis functions. Intell. Data Anal. 2002, 6, 483–502. [Google Scholar] [CrossRef]
- Cestnik, G.; Konenenko, I.; Bratko, I. Assistant-86: A Knowledge-Elicitation Tool for Sophisticated Users. In Progress in Machine Learning; Bratko, I., Lavrac, N., Eds.; Sigma Press: Wilmslow, UK, 1987; pp. 31–45. [Google Scholar]
- Elter, M.; Schulz-Wendtland, R.; Wittenberg, T. The prediction of breast cancer biopsy outcomes using two CAD approaches that both emphasize an intelligible decision process. Med. Phys. 2007, 34, 4164–4172. [Google Scholar] [CrossRef]
- Little, M.A.; McSharry, P.E.; Hunter, E.J.; Spielman, J.; Ramig, L.O. Suitability of dysphonia measurements for telemonitoring of Parkinson’s disease. IEEE Trans. Biomed. Eng. 2009, 56, 1015. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Smith, J.W.; Everhart, J.E.; Dickson, W.C.; Knowler, W.C.; Johannes, R.S. Using the ADAP learning algorithm to forecast the onset of diabetes mellitus. In Proceedings of the Symposium on Computer Applications and Medical Care IEEE Computer Society Press in Medical Care, Orlando, FL, USA, 7–11 November 1988; pp. 261–265. [Google Scholar]
- Lucas, D.D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y. Failure analysis of parameter-induced simulation crashes in climate models. Geosci. Model Dev. 2013, 6, 1157–1171. [Google Scholar] [CrossRef] [Green Version]
- Giannakeas, N.; Tsipouras, M.G.; Tzallas, A.T.; Kyriakidi, K.; Tsianou, Z.E.; Manousou, P.; Hall, A.; Karvounis, E.C.; Tsianos, V.; Tsianos, E. A clustering based method for collagen proportional area extraction in liver biopsy images. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, New Orleans, LA, USA, 4–7 November 1988; pp. 3097–3100. [Google Scholar]
- Hastie, T.; Tibshirani, R. Non-parametric logistic and proportional odds regression. JRSS-C 1987, 36, 260–276. [Google Scholar] [CrossRef]
- Dash, M.; Liu, H.; Scheuermann, P.; Tan, K.L. Fast hierarchical clustering and its validation. Data Knowl. Eng. 2003, 44, 109–138. [Google Scholar] [CrossRef]
- Wolberg, W.H.; Mangasarian, O.L. Multisurface method of pattern separation for medical diagnosis applied to breast cytology. Proc. Natl. Acad. Sci. USA 1990, 87, 9193–9196. [Google Scholar] [CrossRef] [Green Version]
- Raymer, M.; Doom, T.E.; Kuhn, L.A.; Punch, W.F. Knowledge discovery in medical and biological datasets using a hybrid Bayes classifier/evolutionary algorithm. IEEE Trans. Syst. Man Cybern. Part B Cybern. 2003, 33, 802–813. [Google Scholar] [CrossRef]
- Zhong, P.; Fukushima, M. Regularized nonsmooth Newton method for multi-class support vector machines. Optim. Methods Softw. 2007, 22, 225–236. [Google Scholar] [CrossRef]
- Andrzejak, R.G.; Lehnertz, K.; Mormann, F.; Rieke, C.; David, P.; Elger, C.E. Indications of nonlinear deterministic and finite-dimensional structures in time series of brain electrical activity: Dependence on recording region and brain state. Phys. Rev. E 2001, 64, 1–8. [Google Scholar] [CrossRef] [Green Version]
- Koivisto, M.; Sood, K. Exact Bayesian Structure Discovery in Bayesian Networks. J. Mach. Learn. Res. 2004, 5, 549–573. [Google Scholar]
- Nash, W.J.; Sellers, T.L.; Talbot, S.R.; Cawthor, A.J.; Ford, W.B. The Population Biology of Abalone (Haliotis Species) in Tasmania. I. Blacklip Abalone (H. rubra) from the North Coast and Islands of Bass Strait, Sea Fisheries Division; Technical Report No. 48; Department of Primary Industry and Fisheries, Tasmania: Hobart, Australia, 1994. [Google Scholar]
- Brooks, T.F.; Pope, D.S.; Marcolini, A.M. Airfoil Self-Noise and Prediction; Technical Report, NASA RP-1218; National Aeronautics and Space Administration: Washington, DC, USA, 1989. [Google Scholar]
- Simonoff, J.S. Smoothing Methods in Statistics; Springer: Berlin/Heidelberg, Germany, 1996. [Google Scholar]
- Yeh, I.-C. Modeling of strength of high performance concrete using artificial neural networks. Cem. Concr. Res. 1998, 28, 1797–1808. [Google Scholar]
- Harrison, D.; Rubinfeld, D.L. Hedonic prices and the demand for clean air. J. Environ. Econ. Manag. 1978, 5, 81–102. [Google Scholar] [CrossRef]
- Mackowiak, P.A.; Wasserman, S.S.; Levine, M.M. A critical appraisal of 98.6 degrees f, the upper limit of the normal body temperature, and other legacies of Carl Reinhold August Wunderlich. J. Amer. Med. Assoc. 1992, 268, 1578–1580. [Google Scholar] [CrossRef]
- King, R.D.; Muggleton, S.; Lewis, R.; Sternberg, M.J.E. Drug design by machine learning: The use of inductive logic programming to model the structure-activity relationships of trimethoprim analogues binding to dihydrofolate reductase. Proc. Nat. Acad. Sci. USA 1992, 89, 11322–11326. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Sikora, M.; Wrobel, L. Application of rule induction algorithms for analysis of data collected by seismic hazard monitoring systems in coal mines. Arch. Min. Sci. 2010, 55, 91–114. [Google Scholar]
- Sanderson, C.; Curtin, R. Armadillo: A template-based C++ library for linear algebra. J. Open Source Softw. 2016, 1, 26. [Google Scholar] [CrossRef]
- Dagum, L.; Menon, R. OpenMP: An industry standard API for shared-memory programming. IEEE Comput. Sci. Eng. 1998, 5, 46–55. [Google Scholar] [CrossRef] [Green Version]
- Riedmiller, M.; Braun, H. A Direct Adaptive Method for Faster Backpropagation Learning: The RPROP algorithm. In Proceedings of the IEEE International Conference on Neural Networks, San Francisco, CA, USA, 28 March–1 April 1993; pp. 586–591. [Google Scholar]
- Bishop, C. Neural Networks for Pattern Recognition; Oxford University Press: Oxford, UK, 1995. [Google Scholar]
- Cybenko, G. Approximation by superpositions of a sigmoidal function. Math. Control. Signals Syst. 1989, 2, 303–314. [Google Scholar] [CrossRef]
- Klima, G. Fast Compressed Neural Networks. Available online: http://fcnn.sourceforge.net/ (accessed on 5 January 2023).
- Das, S.; Suganthan, P.N. Differential Evolution: A Survey of the State-of-the-Art. IEEE Trans. Evol. Comput. 2011, 15, 4–31. [Google Scholar] [CrossRef]
Parameter | Value
---|---
$N_p$, $N_c$ (particles/chromosomes) | 200
$iter_{\max}$ (maximum generations) | 100
$N_s$ (fitness samples) | 50
$c_1$ | 1.0
$c_2$ | 1.0
F | 5.0
B | 100.0
k | 10
$p_s$ (selection rate) | 0.90
$p_m$ (mutation rate) | 0.05
Dataset | NN-RPROP | NN-GENETIC | RBF-KMEANS | IRBF-100 | IRBF-1000 |
---|---|---|---|---|---|
Appendicitis | 16.30% | 18.10% | 12.23% | 16.47% | 14.03% |
Australian | 36.12% | 32.21% | 34.89% | 23.61% | 22.39% |
Balance | 8.81% | 8.97% | 33.42% | 12.65% | 13.15% |
Bands | 36.32% | 35.75% | 37.22% | 37.38% | 36.29% |
Cleveland | 61.41% | 51.60% | 67.10% | 49.77% | 49.64% |
Dermatology | 15.12% | 30.58% | 62.34% | 38.24% | 35.64% |
Hayes Roth | 37.46% | 56.18% | 64.36% | 33.62% | 34.13% |
Heart | 30.51% | 28.34% | 31.20% | 15.91% | 15.60% |
HouseVotes | 6.04% | 6.62% | 6.13% | 4.77% | 3.90% |
Ionosphere | 13.65% | 15.14% | 16.22% | 8.64% | 7.52% |
Liverdisorder | 40.26% | 31.11% | 30.84% | 27.36% | 25.63% |
Lymography | 24.67% | 23.26% | 25.31% | 19.12% | 20.02% |
Mammographic | 18.46% | 19.88% | 21.38% | 17.17% | 17.30% |
Parkinsons | 22.28% | 18.05% | 17.41% | 15.51% | 13.59% |
Pima | 34.27% | 32.19% | 25.78% | 23.61% | 23.23% |
Popfailures | 4.81% | 5.94% | 7.04% | 5.21% | 5.10% |
Regions2 | 27.53% | 29.39% | 38.29% | 26.08% | 25.77% |
Saheart | 34.90% | 34.86% | 32.19% | 27.94% | 28.91% |
Segment | 52.14% | 57.72% | 59.68% | 47.19% | 40.28% |
Spiral | 46.59% | 44.50% | 44.87% | 19.43% | 19.56% |
Wdbc | 21.57% | 8.56% | 7.27% | 5.33% | 5.44% |
Wine | 30.73% | 19.20% | 31.41% | 9.20% | 6.84% |
Z_F_S | 29.28% | 10.73% | 13.16% | 4.19% | 4.18% |
ZO_NF_S | 6.43% | 8.41% | 9.02% | 4.31% | 4.35% |
ZONF_S | 27.27% | 2.60% | 4.03% | 2.23% | 2.08% |
ZOO | 15.47% | 16.67% | 21.93% | 10.13% | 11.13% |
AVERAGE | 26.86%(3) | 24.87%(1) | 29.03%(1) | 19.43%(8) | 18.68%(13) |
DATASET | NN-RPROP | NN-GENETIC | RBF-KMEANS | IRBF-100 | IRBF-1000 |
---|---|---|---|---|---|
ABALONE | 4.55 | 7.17 | 7.37 | 5.57 | 5.32 |
AIRFOIL | 0.002 | 0.003 | 0.27 | 0.004 | 0.003 |
BASEBALL | 92.05 | 103.60 | 93.02 | 78.89 | 85.58 |
BK | 1.60 | 0.03 | 0.02 | 0.04 | 0.03 |
BL | 4.38 | 5.74 | 0.013 | 0.0003 | 0.0003 |
CONCRETE | 0.009 | 0.009 | 0.011 | 0.007 | 0.007 |
DEE | 0.608 | 1.013 | 0.17 | 0.16 | 0.16 |
DIABETES | 1.11 | 19.86 | 0.49 | 0.78 | 0.89 |
HOUSING | 74.38 | 43.26 | 57.68 | 20.27 | 21.54 |
FA | 0.14 | 1.95 | 0.015 | 0.032 | 0.029 |
MB | 0.55 | 3.39 | 2.16 | 0.12 | 0.09 |
MORTGAGE | 9.19 | 2.41 | 1.45 | 0.39 | 0.78 |
NT | 0.04 | 0.006 | 8.14 | 0.007 | 0.007 |
PY | 0.039 | 1.41 | 0.012 | 0.024 | 0.014 |
QUAKE | 0.041 | 0.040 | 0.07 | 0.04 | 0.03 |
TREASURY | 10.88 | 2.93 | 2.02 | 0.33 | 0.51 |
WANKARA | 0.0003 | 0.012 | 0.001 | 0.002 | 0.002 |
AVERAGE | 11.71(1) | 11.34(1) | 10.17(5) | 6.27(7) | 6.76(3) |
DATASET | |||
---|---|---|---|
Appendicitis | 14.43% | 14.03% | 14.47% |
Australian | 23.45% | 22.39% | 23.21% |
Balance | 13.35% | 13.15% | 11.79% |
Bands | 36.48% | 36.29% | 36.76% |
Cleveland | 49.26% | 49.64% | 49.02% |
Dermatology | 36.54% | 35.64% | 34.37% |
Hayes Roth | 39.28% | 34.13% | 36.46% |
Heart | 15.14% | 15.60% | 14.89% |
HouseVotes | 4.93% | 3.90% | 6.41% |
Ionosphere | 7.56% | 7.52% | 9.05% |
Liverdisorder | 28.37% | 25.63% | 28.97% |
Lymography | 20.12% | 20.02% | 21.05% |
Mammographic | 18.04% | 17.30% | 18.21% |
Parkinsons | 18.51% | 13.59% | 13.49% |
Pima | 23.69% | 23.23% | 23.52% |
Popfailures | 5.76% | 5.10% | 4.50% |
Regions2 | 25.79% | 25.77% | 25.32% |
Saheart | 28.89% | 28.91% | 26.99% |
Segment | 36.53% | 40.28% | 43.28% |
Spiral | 16.78% | 19.56% | 22.18% |
Wdbc | 4.64% | 5.44% | 5.10% |
Wine | 8.31% | 6.84% | 8.27% |
Z_F_S | 4.32% | 4.18% | 4.03% |
ZO_NF_S | 3.70% | 4.35% | 3.72% |
ZONF_S | 2.04% | 2.08% | 1.98% |
ZOO | 11.87% | 11.13% | 9.97% |
AVERAGE | 18.65% | 18.68% | 19.12% |
DATASET | |||
---|---|---|---|
ABALONE | 5.56 | 5.32 | 5.41 |
AIRFOIL | 0.004 | 0.003 | 0.004 |
BASEBALL | 88.40 | 85.58 | 84.43 |
BK | 0.03 | 0.03 | 0.02 |
BL | 0.0005 | 0.0003 | 0.0002 |
CONCRETE | 0.009 | 0.007 | 0.007 |
DEE | 0.18 | 0.16 | 0.16 |
DIABETES | 0.67 | 0.89 | 0.77 |
HOUSING | 20.03 | 21.54 | 20.84 |
FA | 0.03 | 0.029 | 0.036 |
MB | 0.19 | 0.09 | 0.26 |
MORTGAGE | 0.89 | 0.78 | 0.03 |
NT | 0.006 | 0.007 | 0.007 |
PY | 0.027 | 0.014 | 0.018 |
QUAKE | 0.04 | 0.03 | 0.04 |
TREASURY | 0.77 | 0.51 | 0.17 |
WANKARA | 0.002 | 0.002 | 0.002 |
AVERAGE | 6.87 | 6.76 | 6.60 |
DATASET | RBF-KMEANS PRECISION | RBF-KMEANS RECALL | RBF-KMEANS F-SCORE | IRBF-100 PRECISION | IRBF-100 RECALL | IRBF-100 F-SCORE
---|---|---|---|---|---|---|
APPENDICITIS | 0.80 | 0.77 | 0.76 | 0.79 | 0.74 | 0.78 |
AUSTRALIAN | 0.67 | 0.61 | 0.58 | 0.79 | 0.76 | 0.76 |
BALANCE | 0.74 | 0.76 | 0.64 | 0.75 | 0.78 | 0.76 |
BANDS | 0.52 | 0.51 | 0.48 | 0.58 | 0.57 | 0.56 |
HEART | 0.68 | 0.69 | 0.67 | 0.86 | 0.85 | 0.85 |
IONOSPHERE | 0.84 | 0.81 | 0.81 | 0.92 | 0.89 | 0.90 |
LIVERDISORDER | 0.65 | 0.64 | 0.64 | 0.72 | 0.71 | 0.71 |
MAMMOGRAPHIC | 0.81 | 0.81 | 0.81 | 0.83 | 0.83 | 0.82 |
PARKINSONS | 0.76 | 0.68 | 0.69 | 0.85 | 0.80 | 0.81 |
PIMA | 0.72 | 0.67 | 0.68 | 0.75 | 0.70 | 0.71 |
SAHEART | 0.65 | 0.61 | 0.61 | 0.70 | 0.66 | 0.67 |
SEGMENT | 0.43 | 0.39 | 0.39 | 0.58 | 0.53 | 0.53 |
SPIRAL | 0.56 | 0.56 | 0.55 | 0.70 | 0.70 | 0.70 |
WDBC | 0.93 | 0.91 | 0.92 | 0.96 | 0.94 | 0.95 |
WINE | 0.74 | 0.65 | 0.66 | 0.93 | 0.93 | 0.92 |
Z_F_S | 0.85 | 0.84 | 0.83 | 0.96 | 0.97 | 0.96 |
ZO_NF_S | 0.90 | 0.90 | 0.90 | 0.95 | 0.95 | 0.95 |
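The F-SCORE columns above are consistent with the standard F1 measure, the harmonic mean of precision and recall. A quick spot-check against the WDBC row (the helper name is illustrative):

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall: F1 = 2PR / (P + R)."""
    return 2.0 * precision * recall / (precision + recall)

# WDBC row: RBF-KMEANS reports P = 0.93, R = 0.91, F-SCORE = 0.92;
# IRBF-100 reports P = 0.96, R = 0.94, F-SCORE = 0.95.
f1_kmeans = round(f1_score(0.93, 0.91), 2)  # -> 0.92
f1_irbf = round(f1_score(0.96, 0.94), 2)    # -> 0.95
```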
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Tsoulos, I.G.; Charilogis, V. Locating the Parameters of RBF Networks Using a Hybrid Particle Swarm Optimization Method. Algorithms 2023, 16, 71. https://doi.org/10.3390/a16020071