Effect of activation function symmetry on training of SFFANNs with the backpropagation algorithm
Abstract
On 17 learning tasks (12 function approximation and 5 real-life regression problems), we compare the efficiency and efficacy of asymmetric and anti-symmetric activation functions in the training and use of sigmoidal feedforward artificial neural networks. The results allow us to conclude that, for networks trained using the batch-update variant of the backpropagation algorithm, anti-symmetric activation functions may give better performance in a few cases, but in the majority of cases the performance of networks using anti-symmetric and asymmetric activation functions is equivalent. Thus, a clear preference for anti-symmetric activation functions cannot be established.
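An activation function f is anti-symmetric (odd) if f(-x) = -f(x) for all x; tanh is the standard anti-symmetric sigmoid, while the logistic function 1/(1 + e^(-x)) is asymmetric. The sketch below illustrates the kind of comparison the abstract describes, assuming these two representative activations, a single-hidden-layer network, and a toy sine-approximation task; none of these specifics are given in the abstract, so treat the choices as illustrative only.

```python
# Minimal sketch: batch-update backpropagation for a 1-H-1 sigmoidal
# feedforward network, comparing an asymmetric activation (logistic)
# against an anti-symmetric one (tanh). The task, network size, and
# hyperparameters are assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def logistic(x):    # asymmetric: logistic(-x) != -logistic(x)
    return 1.0 / (1.0 + np.exp(-x))

def d_logistic(y):  # derivative written in terms of the output y
    return y * (1.0 - y)

def d_tanh(y):
    return 1.0 - y ** 2

def train(act, d_act, X, t, hidden=8, lr=0.05, epochs=2000):
    """Batch backpropagation: gradients summed over the whole training set."""
    n, d = X.shape
    W1 = rng.normal(0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = act(X @ W1 + b1)            # hidden-layer activations
        y = h @ W2 + b2                 # linear output unit
        e = y - t                       # error over the full batch
        gW2 = h.T @ e / n;  gb2 = e.mean(0)
        dh = (e @ W2.T) * d_act(h)      # backpropagated hidden deltas
        gW1 = X.T @ dh / n; gb1 = dh.mean(0)
        W2 -= lr * gW2; b2 -= lr * gb2  # one update per epoch (batch mode)
        W1 -= lr * gW1; b1 -= lr * gb1
    return np.mean(e ** 2)              # final batch MSE

# Toy function-approximation task: t = sin(pi * x) on [-1, 1].
X = rng.uniform(-1, 1, (200, 1))
t = np.sin(np.pi * X)

print("tanh anti-symmetric:    ", np.allclose(np.tanh(-2.0), -np.tanh(2.0)))
print("logistic anti-symmetric:", np.isclose(logistic(-2.0), -logistic(2.0)))
print("final MSE, logistic (asymmetric):    ", train(logistic, d_logistic, X, t))
print("final MSE, tanh (anti-symmetric):    ", train(np.tanh, d_tanh, X, t))
```

Running several such trials over different random seeds and tasks, and comparing the resulting error distributions, is one way to reach the kind of conclusion the abstract reports: an occasional edge for the anti-symmetric activation, equivalence in most cases.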