Exponentiated backpropagation algorithm for multilayer feedforward neural networks
Abstract
The gradient descent backpropagation learning algorithm is based on minimizing the mean square error. An alternative to gradient descent is the exponentiated gradient descent algorithm, which minimizes the relative entropy. Exponentiated gradient descent applied to backpropagation is proposed for a multilayer feedforward neural network. The learning rules for changing the weights of the output-layer as well as the hidden-layer neurons in the network are developed. Simulations were performed to explore the convergence and learning of the backpropagation algorithm with exponentiated gradient descent. The accuracy obtained with exponentiated gradient descent backpropagation was comparable to that of gradient descent backpropagation, while convergence was faster. The results show that exponentiated gradient descent can be extended to a multilayer feedforward neural network and used in pattern classification applications.
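The abstract contrasts additive gradient descent with multiplicative, exponentiated updates. As a minimal sketch of the latter, the following shows a generic EG± update (in the Kivinen–Warmuth style) for a single linear neuron trained on squared error, not the paper's multilayer rule; the learning rate `eta`, total weight mass `U`, and toy data are illustrative assumptions.

```python
import numpy as np

def eg_pm_update(w_plus, w_minus, grad, eta, U):
    """One EG+/- step. Each signed weight is kept as w = w_plus - w_minus with
    positive components; components are scaled by exp(-/+ eta*U*grad) and then
    renormalized so the total mass sum(w_plus) + sum(w_minus) stays equal to U."""
    r_plus = w_plus * np.exp(-eta * U * grad)
    r_minus = w_minus * np.exp(eta * U * grad)
    Z = (r_plus.sum() + r_minus.sum()) / U
    return r_plus / Z, r_minus / Z

# Toy regression problem that a single linear neuron can represent exactly.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])  # note ||w_true||_1 < U below
X = rng.normal(size=(200, 5))
y = X @ w_true

# Start from the uniform (maximum-entropy) split of the weight mass.
U = 10.0
w_plus = np.full(5, U / 10.0)
w_minus = np.full(5, U / 10.0)
for _ in range(2000):
    w = w_plus - w_minus
    grad = 2.0 * X.T @ (X @ w - y) / len(X)  # gradient of mean squared error
    w_plus, w_minus = eg_pm_update(w_plus, w_minus, grad, eta=0.01, U=U)

w_est = w_plus - w_minus
print(np.round(w_est, 2))  # should land close to w_true
```

The multiplicative form keeps every component of `w_plus` and `w_minus` strictly positive, which is why signed weights need the two-part representation; the paper's contribution is deriving analogous update rules for the hidden and output layers of a multilayer network.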
Related Papers
- Analisis Penurunan Gradien dengan Kombinasi Fungsi Aktivasi pada Algoritma JST untuk Pencarian Akurasi Terbaik [Gradient Descent Analysis with Combined Activation Functions in an ANN Algorithm for Best Accuracy] (2020), 7 citations
- Training Artificial Neural Networks: Backpropagation via Nonlinear Optimization (2001), 6 citations
- Gradients without Backpropagation (2022), 20 citations
- A New Backpropagation Algorithm without Gradient Descent (2018), 9 citations