Article

A Hyper-Parameter Optimizer Algorithm Based on Conditional Opposition Local-Based Learning Forbidden Redundant Indexes Adaptive Artificial Bee Colony Applied to Regularized Extreme Learning Machine

by Philip Vasquez-Iglesias 1,†, Amelia E. Pizarro 2,†, David Zabala-Blanco 1,*, Juan Fuentes-Concha 2, Roberto Ahumada-Garcia 2, David Laroze 3 and Paulo Gonzalez 4

1 Facultad de Ciencias de la Ingeniería, Universidad Católica del Maule, Avenida San Miguel 3605, Talca 3460000, Chile
2 Doctorado en Ingeniería, Facultad de Ciencias de la Ingeniería, Universidad Católica del Maule, Avenida San Miguel 3605, Talca 3460000, Chile
3 Instituto de Alta Investigación, Universidad de Tarapacá, Casilla 7 D, Arica 1000000, Chile
4 Facultad de Economía y Negocios, Universidad de Talca, Av. Lircay, Talca 3460000, Chile
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Electronics 2024, 13(23), 4652; https://doi.org/10.3390/electronics13234652
Submission received: 27 September 2024 / Revised: 10 November 2024 / Accepted: 19 November 2024 / Published: 25 November 2024
(This article belongs to the Section Computer Science & Engineering)

Abstract

Finding the best configuration of a neural network’s hyper-parameters can be infeasible with an exhaustive search, especially when the search space contains a combinatorially large number of candidate solutions across several hyper-parameters. The problem is aggravated when the network’s parameters, such as the hidden-neuron weights and biases, must also be optimized. Extreme learning machines (ELMs) belong to the family of random-weight neural networks: their parameters are randomly initialized and, unlike gradient-descent-based algorithms, the solution can be found analytically. This property is especially useful for metaheuristic analysis because the reduced training times allow a faster optimization process; however, the problem of finding the best hyper-parameter configuration remains. In this paper, we propose a modification of the artificial bee colony (ABC) metaheuristic to act as a parameterizer for a regularized ELM, incorporating three methods: an adaptive mechanism that balances ABC’s exploration (global search) and exploitation (local search); an adaptation of the opposition-based learning technique, called opposition local-based learning (OLBL), that strengthens exploitation; and a record of accesses to the search space, called forbidden redundant indexes (FRI), that avoids redundant calculations and tracks the percentage of the search space explored. We define ten parameterizations applying different combinations of the proposed methods, limit them to exploring approximately 10% of the search space, and obtain results above 98% of the maximum performance found by exhaustive search on binary and multiclass datasets.
The results demonstrate that these parameterizations are a promising way to optimize the hyper-parameters of the R-ELM on datasets with different characteristics whenever computational efficiency is required, and that, with minor modifications, their use can be extended to similar problems, such as the parameterization of support vector machines, digital image filters, and other neural networks.
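The analytic solution mentioned in the abstract is what makes each fitness evaluation cheap enough for a metaheuristic. A minimal sketch of a regularized ELM is shown below; the sigmoid activation, variable names, and the ridge form of the solution, beta = (HᵀH + I/C)⁻¹HᵀT, are standard R-ELM choices used for illustration and are not taken verbatim from the paper.

```python
import numpy as np

def train_relm(X, T, n_hidden, C, rng):
    """Regularized ELM: random hidden layer, analytic output weights.

    X: (n_samples, n_features) inputs; T: (n_samples, n_classes) one-hot targets.
    """
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random biases (never trained)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # hidden-layer output matrix
    # Analytic ridge solution: beta = (H^T H + I / C)^(-1) H^T T
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ T)
    return W, b, beta

def predict_relm(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy usage: a linearly separable binary problem with one-hot labels.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 4))
T = np.eye(2)[(X[:, 0] > 0).astype(int)]
W, b, beta = train_relm(X, T, n_hidden=50, C=1.0, rng=rng)
acc = np.mean(predict_relm(X, W, b, beta).argmax(axis=1) == T.argmax(axis=1))
```

Because only `beta` is computed, a single closed-form solve replaces iterative training, so the optimizer can evaluate many (number of hidden neurons, C) pairs quickly.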
Keywords: artificial bee colony (ABC); metaheuristics; regularized extreme learning machine (R-ELM); hyper-parameter optimization (HPO); heuristic optimization; opposition-based learning (OBL); tabu search (TS); classification applications; artificial neural network (ANN)
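The FRI mechanism described in the abstract can be illustrated as a visited-index record over a discrete hyper-parameter grid, in the spirit of a tabu list: already-evaluated index pairs are "forbidden", which skips redundant fitness evaluations and lets the optimizer report how much of the space has been explored. The grid values and the 10% budget below are illustrative assumptions, not the paper's actual settings.

```python
import random

# Hypothetical discrete search space: hidden-layer sizes x regularization exponents.
grid = {
    "n_hidden": [50, 100, 200, 400],
    "log2_C": list(range(-5, 6)),
}
space_size = len(grid["n_hidden"]) * len(grid["log2_C"])

visited = set()                        # FRI record: forbidden (already-tried) index pairs
rng = random.Random(1)
budget = int(0.10 * space_size) + 1    # explore roughly 10% of the search space

while len(visited) < budget:
    idx = (rng.randrange(len(grid["n_hidden"])),
           rng.randrange(len(grid["log2_C"])))
    if idx in visited:
        continue                       # redundant index: skip the costly R-ELM evaluation
    visited.add(idx)
    # ... evaluate the R-ELM fitness at
    #     (grid["n_hidden"][idx[0]], 2.0 ** grid["log2_C"][idx[1]]) here ...

explored_pct = 100.0 * len(visited) / space_size
```

In the paper's algorithm, candidate indexes come from ABC's bee phases rather than uniform sampling; the record itself works the same way either way.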

Share and Cite

MDPI and ACS Style

Vasquez-Iglesias, P.; Pizarro, A.E.; Zabala-Blanco, D.; Fuentes-Concha, J.; Ahumada-Garcia, R.; Laroze, D.; Gonzalez, P. A Hyper-Parameter Optimizer Algorithm Based on Conditional Opposition Local-Based Learning Forbidden Redundant Indexes Adaptive Artificial Bee Colony Applied to Regularized Extreme Learning Machine. Electronics 2024, 13, 4652. https://doi.org/10.3390/electronics13234652

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
