Short-cut algorithms for the learning subspace method
Abstract
The Learning Subspace method is a pattern recognition method in which each class is represented by its own subspace. During recognition, the orthogonal projections of the vector to be recognized are computed onto each of the subspaces, and the vector is assigned to the class corresponding to the largest projection. In this paper, methods to reduce the number of projections that must be calculated during recognition are introduced and compared. The methods are based on a similarity measure between subspaces. Using this measure, one can determine an upper limit on the length of the projection onto one subspace from the projection onto another subspace. The methods were tested with speech data. The results show that about 30 percent of the calculations can be eliminated.
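The recognition step and the skip idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the similarity measure here is assumed to be the cosine of the smallest principal angle between two subspaces (the largest singular value of `U_j.T @ U_i` for orthonormal bases), which yields the valid bound `||P_j x|| <= s_ij * ||P_i x|| + sqrt(||x||^2 - ||P_i x||^2)`; the paper's own measure and bound may differ.

```python
import numpy as np

def proj_len(x, U):
    # Length of the orthogonal projection of x onto the subspace
    # spanned by the orthonormal columns of U: ||U^T x||.
    return np.linalg.norm(U.T @ x)

def classify_with_skips(x, bases):
    """Assign x to the class with the largest projection, skipping
    subspaces whose upper bound cannot beat the current best.

    Illustrative bound (an assumption, not necessarily the paper's):
    with s_ij = cosine of the smallest principal angle between
    subspaces i and j,
        ||P_j x|| <= s_ij * ||P_i x|| + sqrt(||x||^2 - ||P_i x||^2).
    Returns (best_class, number_of_skipped_projections).
    """
    n = len(bases)
    # Pairwise subspace similarities, computed once offline.
    s = np.ones((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                s[i, j] = np.linalg.svd(bases[j].T @ bases[i],
                                        compute_uv=False)[0]

    xn = np.linalg.norm(x)
    best_class = 0
    best_len = proj_len(x, bases[0])
    skipped = 0
    for j in range(1, n):
        # Upper bound on ||P_j x|| from the projection already known.
        rest = np.sqrt(max(xn ** 2 - best_len ** 2, 0.0))
        bound = s[best_class, j] * best_len + rest
        if bound <= best_len:
            skipped += 1  # projection onto subspace j cannot win
            continue
        pl = proj_len(x, bases[j])
        if pl > best_len:
            best_class, best_len = j, pl
    return best_class, skipped
```

For two orthogonal subspaces, a vector lying entirely in the first one makes the bound for the second collapse to zero, so that projection is never computed: this is exactly the kind of elimination the paper quantifies (about 30 percent of calculations on speech data).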