On Similarity Preserving Feature Selection
Top 10% of 2011 papers
Abstract
In the literature of feature selection, different criteria have been proposed to evaluate the goodness of features. In our investigation, we notice that a number of existing selection criteria implicitly select features that preserve sample similarity, and can be unified under a common framework. We further point out that any feature selection criterion covered by this framework cannot handle redundant features, a common drawback of these criteria. Motivated by these observations, we propose a new "Similarity Preserving Feature Selection" framework in an explicit and rigorous way. We show, through theoretical analysis, that the proposed framework not only encompasses many widely used feature selection criteria, but also naturally overcomes their common weakness in handling feature redundancy. In developing this new framework, we begin with a conventional combinatorial optimization formulation for similarity preserving feature selection, then extend it with a sparse multiple-output regression formulation to improve its efficiency and effectiveness. Three algorithms are devised to efficiently solve the proposed formulations, each of which has its own advantages in terms of computational complexity and selection performance. As exhibited by our extensive experimental study, the proposed framework achieves superior feature selection performance and exhibits attractive properties.
Related Papers
- Efficient Feature Selection via Analysis of Relevance and Redundancy (2004)
  - On some aspects of minimum redundancy maximum relevance feature selection (2019), 69 citations
  - A stable feature selection method based on relevancy and redundancy (2021), 4 citations
  - Hopfield Networks in Relevance and Redundancy Feature Selection Applied to Classification of Biomedical High-Resolution Micro-CT Images (2008), 3 citations
- Feature Selection Based on Minimal Redundancy Principle for Text Classification (2007)