Hyperparameter Tuning on Classification Algorithm with Grid Search
Top 10% of 2022 papers
Abstract
Machine learning algorithms continue to be developed with various optimization methods to produce the best-performing model. In supervised learning (classification), most algorithms have hyperparameters. Hyperparameter tuning is a technique for improving the performance of predictive models. One popular hyperparameter optimization method is Grid Search. Grid Search with Cross Validation makes it convenient to test every combination of model parameters without validating each one manually. In this study, we apply Grid Search for hyperparameter optimization. The purpose of this study is to determine the best hyperparameter optimization across 7 machine learning classification algorithms. Experimental results are validated using the mean cross-validation score. The results show that the XGBoost model achieves the best score, while the Decision Tree has the lowest.
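As a minimal sketch of the approach the abstract describes, the snippet below runs Grid Search with cross-validation over a small hyperparameter grid using scikit-learn's `GridSearchCV`. The Iris dataset, the Decision Tree classifier, and the specific grid values are illustrative assumptions, not details taken from the paper; the mean cross-validation score reported by `best_score_` corresponds to the validation metric mentioned in the abstract.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

# Toy dataset standing in for the paper's data (assumption).
X, y = load_iris(return_X_y=True)

# Illustrative hyperparameter grid; every combination is evaluated.
param_grid = {
    "max_depth": [2, 3, 4, None],
    "criterion": ["gini", "entropy"],
}

# 5-fold cross-validation scores each grid point; the best mean
# cross-validation score selects the winning combination.
search = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_grid,
    cv=5,
)
search.fit(X, y)

print("Best parameters:", search.best_params_)
print("Mean CV score of best model:", search.best_score_)
```

The same pattern extends to the other classifiers compared in the study (e.g. XGBoost): only the estimator and its parameter grid change, while the Grid Search and cross-validation machinery stay identical.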
Related Papers
- Grid Search Based Hyperparameter Optimization for Machine Learning Based Non-Intrusive Load Monitoring (2023), 6 citations
- Deep Neural Network with Hyperparameter Tuning for Detection of Heart Disease (2021), 23 citations
- Novel Suboptimal approaches for Hyperparameter Tuning of Deep Neural Network [under the shelf of Optical Communication] (2019), 3 citations
- Hyperparameter Tuning of Dense Neural Network for ECG Signal Classification (2022)