Fast Terminal Attractor Based Backpropagation Algorithm For Feedforward Neural Networks
2007, Vol. 13, pp. 521–525
Top 21% of 2007 papers by citations
Abstract
This paper proposes a new, efficient backpropagation learning algorithm for feedforward neural networks based on a fast terminal attractor, which improves convergence speed. Simulation results, including a sensor network example, demonstrate the algorithm's effectiveness in accelerating learning.
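The paper's exact update rule is not reproduced here, so the following is only a hedged sketch of the general terminal-attractor idea as commonly formulated: the gradient step is scaled by a fractional power of the error, e.g. a gain of the form `alpha + beta * E**(q/p - 1)` with `0 < q/p < 1`, so the effective learning rate grows as the error `E` approaches zero, driving convergence in finite time in the continuous-time analysis. All constants, the toy task, and the clipping below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def train(X, y, hidden=4, epochs=500, alpha=0.5, beta=0.5, qp=0.5, seed=0):
    """Train a 1-hidden-layer tanh network with a terminal-attractor-style
    error-dependent gain on the gradient step (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    history = []
    for _ in range(epochs):
        h = np.tanh(X @ W1)            # hidden activations
        out = h @ W2                   # linear output layer
        err = out - y
        E = 0.5 * np.mean(err ** 2)    # mean squared error
        history.append(E)
        # Terminal-attractor gain: E**(qp - 1) grows as E -> 0,
        # speeding up the final phase of convergence.
        gain = alpha + beta * E ** (qp - 1.0)
        # Clip the effective step size to keep the discrete update stable.
        lr = min(0.02 * gain, 0.3)
        # Standard backprop gradients for the two weight matrices.
        gW2 = h.T @ err / len(X)
        gW1 = X.T @ ((err @ W2.T) * (1.0 - h ** 2)) / len(X)
        W2 -= lr * gW2
        W1 -= lr * gW1
    return W1, W2, history
```

In discrete time the diverging gain must be bounded (the `min` clip above), since the continuous-time finite-time-convergence argument does not directly carry over to finite step sizes.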
Related Papers
- A new recurrent neural-network architecture for visual pattern recognition (1997), 82 citations
- Equivalence results between feedforward and recurrent neural networks for sequences (2015)
- A new recurrent neural network architecture for pattern recognition (1996), 8 citations
- Comparative Analysis on variants of Neural Networks: An Experimental Study (2019), 3 citations
- A genetic algorithm based neural-tuned neural network (2004), 5 citations