Orthogonal least squares learning algorithm for radial basis function networks
Abstract
The radial basis function (RBF) network offers a viable alternative to the two-layer neural network in many signal processing applications. A common learning algorithm for RBF networks first chooses some data points at random as RBF centers and then uses singular-value decomposition to solve for the network weights. This procedure has several drawbacks; in particular, an arbitrary selection of centers is clearly unsatisfactory. The authors propose an alternative learning procedure based on the orthogonal least-squares method. The procedure chooses RBF centers one by one in a rational way until an adequate network has been constructed. At each step, the algorithm selects the center that maximizes the increment to the explained variance (energy) of the desired output, and it does not suffer from numerical ill-conditioning. The orthogonal least-squares learning strategy provides a simple and efficient means of fitting RBF networks, as illustrated with examples from two different signal processing applications.
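The forward-selection procedure described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: candidate centers are taken from the training data, each candidate regressor is orthogonalized against the already-selected basis (classical Gram-Schmidt), and the candidate with the largest error-reduction ratio (its contribution to the energy of the desired output) is selected at each step. The Gaussian basis function, its width, and all function names here are illustrative assumptions.

```python
import numpy as np

def rbf_column(X, center, width):
    """Gaussian RBF response of every sample in X to one candidate center.
    (The Gaussian choice and the fixed width are assumptions for this sketch.)"""
    return np.exp(-np.sum((X - center) ** 2, axis=1) / (2.0 * width ** 2))

def ols_select_centers(X, d, width, n_centers):
    """Greedy OLS forward selection of RBF centers (sketch).

    Candidates are the training points themselves. At each step the
    candidate whose orthogonalized regressor explains the largest
    fraction of the desired output's energy is added to the network.
    """
    N = X.shape[0]
    P = np.column_stack([rbf_column(X, X[i], width) for i in range(N)])
    d = d.astype(float)
    dd = d @ d                      # total energy of the desired output
    selected, basis = [], []        # chosen indices and orthogonal columns
    for _ in range(n_centers):
        best_err, best_i, best_w = -1.0, None, None
        for i in range(N):
            if i in selected:
                continue
            w = P[:, i].copy()
            for q in basis:         # orthogonalize against selected basis
                w -= (q @ P[:, i]) / (q @ q) * q
            wn = w @ w
            if wn < 1e-12:          # numerically dependent column: skip
                continue
            g = (w @ d) / wn
            err = (g * g * wn) / dd # error-reduction ratio of candidate i
            if err > best_err:
                best_err, best_i, best_w = err, i, w
        selected.append(best_i)
        basis.append(best_w)
    # Output-layer weights for the selected centers by least squares.
    Phi = P[:, selected]
    theta, *_ = np.linalg.lstsq(Phi, d, rcond=None)
    return selected, theta

# Usage: approximate a smooth 1-D target with a handful of centers.
X = np.linspace(-1.0, 1.0, 40).reshape(-1, 1)
d = np.sin(3.0 * X).ravel()
centers, theta = ols_select_centers(X, d, width=0.3, n_centers=8)
Phi = np.column_stack([rbf_column(X, X[i], 0.3) for i in centers])
mse = np.mean((Phi @ theta - d) ** 2)
```

Because each new regressor is orthogonal to those already chosen, adding a center never degrades the fit, and the error-reduction ratios give a natural stopping criterion once an adequate network has been constructed.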
Related Papers
- Radar target classification based on radial basis function and modified radial basis function networks (2002)
- Approximation of Function by Adaptively Growing Radial Basis Function Neural Networks (2003)
- Reproductive and competitive radial basis function networks adaptable to dynamical environments (2000)