Evolutionary Computing Optimization for Parameter Determination and Feature Selection of Support Vector Machines
Top 25% of 2009 papers (by citations)
Abstract
The support vector machine (SVM) is a popular pattern-classification method with many diverse applications. The kernel parameter settings used in the SVM training procedure, along with the feature selection, significantly influence classification accuracy. This study determines the parameter values and discovers a subset of features simultaneously, increasing SVM classification accuracy. It focuses on two evolutionary computing approaches for optimizing the SVM parameters: particle swarm optimization (PSO) and the genetic algorithm (GA). Each evolutionary method is combined with the SVM to choose an appropriate feature subset and SVM parameters, and experimental results demonstrate that the resulting classification accuracy surpasses that of the traditional grid-search approach. The paper also compares the PSO-based and GA-based SVM classifiers and finds that they produce similar results.
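To illustrate the PSO half of the approach, the sketch below runs a minimal particle swarm over the two RBF-kernel parameters (log2 C, log2 gamma). This is not the paper's implementation: the search bounds follow common SVM practice, and the objective is a synthetic surrogate standing in for cross-validated SVM accuracy (its peak location is an arbitrary assumption); in a real run the objective would train an SVM and return validation accuracy, and the position vector would be extended with binary feature-mask components.

```python
import math
import random

def pso(objective, bounds, n_particles=20, n_iters=50,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer that MAXIMIZES `objective`."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Random initial positions inside the bounds, zero initial velocities.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # per-particle best position
    pbest_val = [objective(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # swarm-wide best
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + cognitive pull (own best) + social pull (swarm best).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp each coordinate back into the search box.
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Surrogate for cross-validated SVM accuracy (assumption: a smooth bump
# peaking at log2(C) = 3, log2(gamma) = -2); a real objective would train
# and score an SVM here.
def surrogate_accuracy(p):
    log_c, log_g = p
    return math.exp(-((log_c - 3) ** 2 + (log_g + 2) ** 2) / 8)

# Bounds log2(C) in [-5, 15], log2(gamma) in [-15, 3] mirror the ranges a
# grid search over SVM hyperparameters typically covers.
best, best_val = pso(surrogate_accuracy, bounds=[(-5, 15), (-15, 3)])
```

The same loop extends to joint feature selection by appending one coordinate per feature and thresholding it to a 0/1 mask before evaluating the objective, which is how PSO-based wrappers typically encode the feature subset.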
Related Papers
- Grid search in hyperparameter optimization of machine learning models for prediction of HIV/AIDS test results (2021), 393 citations
- Grid Search Based Hyperparameter Optimization for Machine Learning Based Non-Intrusive Load Monitoring (2023), 6 citations
- Deep Neural Network with Hyperparameter Tuning for Detection of Heart Disease (2021), 23 citations
- Hyperparameter Tuning on Classification Algorithm with Grid Search (2022), 16 citations
- Novel Suboptimal approaches for Hyperparameter Tuning of Deep Neural Network [under the shelf of Optical Communication] (2019), 3 citations