Multi-Objective Hyperparameter Optimization for Spiking Neural Network Neuroevolution
Abstract
Neuroevolution has had significant success in recent years, but relatively little work has applied neuroevolution approaches to spiking neural networks (SNNs). SNNs are a type of neural network that includes a temporal processing component; they are not easily trained by other methods because their activation functions are not differentiable, and they can be deployed on energy-efficient neuromorphic hardware. In this work, we investigate two evolutionary approaches for training SNNs. We explore the impact of the evolutionary algorithms' hyperparameters, including tournament size, population size, and representation type, on their performance. We present a multi-objective Bayesian hyperparameter optimization approach that tunes these hyperparameters to produce the most accurate and smallest SNNs. We show that the hyperparameters can significantly affect the performance of these algorithms. We also perform a sensitivity analysis and demonstrate that every hyperparameter value has the potential to perform well, provided the other hyperparameter values are set appropriately.
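The two objectives above (maximize accuracy, minimize network size) are typically compared by Pareto dominance rather than a single score. A minimal sketch of that comparison, with illustrative data not taken from the paper:

```python
# Sketch: Pareto-optimal selection of SNN configurations under two
# objectives -- higher accuracy is better, smaller network size is better.
# The example results below are hypothetical.

def dominates(a, b):
    """True if config a Pareto-dominates config b: a is at least as good
    on both objectives and strictly better on at least one."""
    acc_a, size_a = a
    acc_b, size_b = b
    return (acc_a >= acc_b and size_a <= size_b
            and (acc_a > acc_b or size_a < size_b))

def pareto_front(configs):
    """Return the configs that no other config dominates."""
    return [c for c in configs
            if not any(dominates(o, c) for o in configs if o != c)]

# (accuracy, number of neurons) for four hypothetical evolved SNNs
results = [(0.90, 500), (0.85, 200), (0.92, 800), (0.80, 600)]
front = pareto_front(results)
# (0.80, 600) is dominated by (0.90, 500); the other three trade off
# accuracy against size and all remain on the front.
```

A multi-objective Bayesian optimizer proposes hyperparameter settings, evaluates both objectives for each, and keeps refining an approximation of this Pareto front.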