Case Study III: Tuning of Deep Neural Networks
Abstract
A surrogate model-based Hyperparameter Tuning (HPT) approach for Deep Learning (DL) is presented. This chapter demonstrates how the architecture-level parameters (hyperparameters) of Deep Neural Networks (DNNs) can be optimized. The implementation of the tuning procedure is 100% accessible from R, the software environment for statistical computing. The chapter exemplifies how the software packages can be combined in a very efficient and effective manner. The hyperparameters of a standard DNN are tuned, and the performances of the six Machine Learning (ML) methods discussed in this book are compared to the results from the DNN. This study provides valuable insights into the tunability of several methods, which is of great importance for the practitioner.
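The surrogate model-based HPT loop mentioned above can be illustrated with a minimal, self-contained sketch. This is not the chapter's actual implementation (which uses the R packages named there); it is a hypothetical stand-in that uses a synthetic validation loss and an inverse-distance-weighted surrogate to show the core idea: evaluate an initial design, fit a cheap surrogate to the observations, and repeatedly evaluate the candidate the surrogate predicts to be best.

```python
import random

def objective(log_lr):
    """Synthetic 'validation loss' over log10(learning rate); minimum at lr = 1e-2."""
    return (log_lr + 2.0) ** 2

def surrogate_predict(x, observed):
    """Inverse-distance-weighted surrogate: cheap prediction from observed (x, y) pairs."""
    num = den = 0.0
    for xi, yi in observed:
        d = abs(x - xi)
        if d < 1e-12:          # exact match: return the observed value
            return yi
        w = 1.0 / d ** 2
        num += w * yi
        den += w
    return num / den

def surrogate_hpt(n_init=3, n_iter=15, seed=0):
    random.seed(seed)
    # Initial design: log-uniform learning rates in [1e-5, 1e0].
    observed = [(x, objective(x))
                for x in (random.uniform(-5.0, 0.0) for _ in range(n_init))]
    # Candidate grid of 51 log-spaced learning rates.
    candidates = [-5.0 + 0.1 * i for i in range(51)]
    for _ in range(n_iter):
        # Evaluate the candidate with the lowest surrogate prediction.
        x_next = min(candidates, key=lambda x: surrogate_predict(x, observed))
        candidates.remove(x_next)
        observed.append((x_next, objective(x_next)))
    best_x, best_y = min(observed, key=lambda p: p[1])
    return 10.0 ** best_x, best_y

best_lr, best_loss = surrogate_hpt()
print(f"best learning rate ~ {best_lr:.4g}, loss = {best_loss:.4f}")
```

Because the surrogate is cheap to evaluate, candidate learning rates can be screened without training a DNN for each one; only the selected candidates incur the expensive objective evaluation, which is the economic argument behind surrogate-based HPT.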