Auto-WEKA
Top 1% of 2013 papers by citations
Abstract
Many different machine learning algorithms exist; taking into account each algorithm's hyperparameters, there is a staggeringly large number of possible alternatives overall. We consider the problem of simultaneously selecting a learning algorithm and setting its hyperparameters, going beyond previous work that attacks these issues separately. We show that this problem can be addressed by a fully automated approach, leveraging recent innovations in Bayesian optimization. Specifically, we consider a wide range of feature selection techniques (combining 3 search and 8 evaluator methods) and all classification approaches implemented in WEKA's standard distribution, spanning 2 ensemble methods, 10 meta-methods, 27 base classifiers, and hyperparameter settings for each classifier. On each of 21 popular datasets from the UCI repository, the KDD Cup 09, variants of the MNIST dataset and CIFAR-10, we show classification performance often much better than using standard selection and hyperparameter optimization methods. We hope that our approach will help non-expert users to more effectively identify machine learning algorithms and hyperparameter settings appropriate to their applications, and hence to achieve improved performance.
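The problem the abstract describes is often called CASH (Combined Algorithm Selection and Hyperparameter optimization): a single search over both which learning algorithm to use and how to set its hyperparameters. Auto-WEKA solves it with Bayesian optimization over WEKA's classifiers; the sketch below is only a simplified illustration of the *shape* of that joint search space, using plain random search, a made-up three-algorithm space, and a synthetic loss in place of real cross-validation error. All names and values here are hypothetical.

```python
import random

# Hypothetical joint search space: each algorithm brings its own
# hyperparameters, so a sampled configuration is a pair
# (algorithm, hyperparameter settings). Auto-WEKA searches a space
# like this (but far larger) with Bayesian optimization, not random search.
SEARCH_SPACE = {
    "knn":  {"k": list(range(1, 26))},
    "tree": {"max_depth": list(range(1, 16)),
             "min_leaf": list(range(1, 11))},
    "svm":  {"log2_C": list(range(-5, 6))},
}

def synthetic_loss(algo, params):
    # Stand-in for cross-validated error; not a real learner.
    if algo == "knn":
        return abs(params["k"] - 7) / 25
    if algo == "tree":
        return abs(params["max_depth"] - 5) / 15 + params["min_leaf"] / 50
    return abs(params["log2_C"] - 2) / 10

def cash_random_search(n_iters=200, seed=0):
    """Jointly sample an algorithm AND its hyperparameters each iteration."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_iters):
        algo = rng.choice(sorted(SEARCH_SPACE))            # select an algorithm...
        params = {name: rng.choice(values)                 # ...and its hyperparameters
                  for name, values in SEARCH_SPACE[algo].items()}
        loss = synthetic_loss(algo, params)
        if best is None or loss < best[0]:
            best = (loss, algo, params)
    return best

best_loss, best_algo, best_params = cash_random_search()
print(best_algo, best_params, round(best_loss, 3))
```

The key design point the abstract argues for is visible even in this toy: because the algorithm choice and its hyperparameters are sampled together, a single optimizer can trade off between, say, a well-tuned weak learner and a poorly tuned strong one, rather than selecting the algorithm first and tuning second.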
Related Papers
- → Initializing Bayesian Hyperparameter Optimization via Meta-Learning (2015), 394 citations
- → Deep Neural Network with Hyperparameter Tuning for Detection of Heart Disease (2021), 23 citations
- → Evaluation of Hyperparameter-Optimization Approaches in an Industrial Federated Learning System (2022), 14 citations
- → Bayesian Hyperparameter Optimization for Ensemble Learning (2016), 35 citations