Estimation of the optimal number of neurons in extreme learning machine using simulated annealing and the golden section
Author
Gelvez-Almeida, Elkin
Huérfano-Maldonado, Y
Salazar-Jurado, Edwin
Martínez-Jeraldo, N
Lozada-Yavina, Rafael
Baldera-Moreno, Yvan
Tobar Valenzuela, Luis
Date
2023
Abstract
The extreme learning machine is a neural network algorithm widely accepted in the scientific community due to the simplicity of the model and its good results in classification and regression problems; digital image processing, medical diagnosis, and signal recognition are some applications in the field of physics addressed with these neural networks. The algorithm must be executed with an adequate number of neurons in the hidden layer to obtain good results, and identifying that number is an open problem in the extreme learning machine field. A sequential search has a high computational cost, since the complexity of the calculations grows with the number of neurons. In this work, we use golden section search and simulated annealing as heuristic methods to determine an appropriate number of neurons in the hidden layer of an extreme learning machine. For the experiments, three real databases were used for the classification problem and one synthetic database for the regression problem. The results show that, on the highest-dimensional database, the search for the appropriate number of neurons is accelerated by up to 4.5× with simulated annealing and up to 95.7× with golden section search compared to a sequential method.
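The golden-section idea described in the abstract can be sketched as follows: treat the number of hidden neurons as a scalar to optimize and shrink a bracket around the minimum of the validation error. This is a minimal illustration, not the authors' code; `validation_error` is a hypothetical stand-in for training an ELM with `n` hidden neurons and measuring its validation error.

```python
def validation_error(n):
    # Hypothetical unimodal cost curve standing in for "train an ELM with n
    # hidden neurons and return its validation error": error first falls with
    # capacity, then rises again as the model overfits.
    return (n - 240) ** 2 / 1e4 + 0.05

def golden_section_neurons(f, lo, hi, tol=1):
    """Shrink the integer bracket [lo, hi] by the golden ratio until its
    width is at most `tol` neurons, then return the better endpoint."""
    phi = (5 ** 0.5 - 1) / 2  # inverse golden ratio, ~0.618
    a, b = lo, hi
    c = round(b - phi * (b - a))  # interior probe nearer a
    d = round(a + phi * (b - a))  # interior probe nearer b
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:
            # Minimum lies in [a, d]; reuse c as the new right probe.
            b, d, fd = d, c, fc
            c = round(b - phi * (b - a))
            fc = f(c)
        else:
            # Minimum lies in [c, b]; reuse d as the new left probe.
            a, c, fc = c, d, fd
            d = round(a + phi * (b - a))
            fd = f(d)
    return a if f(a) < f(b) else b

best = golden_section_neurons(validation_error, 1, 1000)
```

Because each iteration discards a fixed fraction of the bracket, the number of ELM trainings grows only logarithmically in the search range, which is where the speed-up over a sequential sweep comes from.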
Source
Journal of Physics: Conference Series, 2515, 012003
DOI
doi.org/10.1088/1742-6596/2515/1/012003