Benefits of using Deep Learning-specific hyperparameter optimization tools vs. sklearn?

There are quite a few libraries for hyperparameter optimization that are specific to Keras or other deep learning frameworks, such as Hyperas or Talos.

My question is: what is the main benefit of using these libraries compared to, for example, sklearn.model_selection.GridSearchCV() or sklearn.model_selection.RandomizedSearchCV()?

Topic hyperparameter-tuning keras hyperparameter deep-learning python

Category Data Science


The biggest difference is that the scikit-learn versions are designed around the scikit-learn Estimator API: the search expects an object with get_params/set_params, fit, and predict/score methods. Deep learning models generally do not conform to that API, so passing a raw Keras model to GridSearchCV will not work.
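To make the API contract concrete, here is a minimal sketch (not from the answer above) of a toy estimator that satisfies the scikit-learn Estimator API, which is exactly what GridSearchCV needs to clone the model and sweep its hyperparameters. The ThresholdClassifier class is hypothetical, invented purely for illustration:

```python
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.model_selection import GridSearchCV

# Minimal sketch of the scikit-learn Estimator API: GridSearchCV only needs
# get_params/set_params (inherited from BaseEstimator), fit, and
# predict/score (accuracy via ClassifierMixin). A raw Keras model exposes
# none of these in this form.
class ThresholdClassifier(BaseEstimator, ClassifierMixin):
    def __init__(self, threshold=0.5):
        self.threshold = threshold  # hyperparameter visible to GridSearchCV

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        return self

    def predict(self, X):
        # classify by comparing the first feature to the threshold
        return (X[:, 0] > self.threshold).astype(int)

X = np.array([[0.1], [0.2], [0.6], [0.9]])
y = np.array([0, 0, 1, 1])

search = GridSearchCV(ThresholdClassifier(),
                      param_grid={"threshold": [0.3, 0.5, 0.7]},
                      cv=2)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

Wrappers that adapt Keras models to this interface do exist (e.g. the SciKeras package), but they are an extra adapter layer rather than native support.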

It is generally better to use the options designed for each ecosystem: when doing hyperparameter optimization on scikit-learn models, use that package's own tools; when doing hyperparameter optimization in deep learning, use deep-learning-specific tools.
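For the scikit-learn side of that advice, a minimal sketch of the native path the question mentions, RandomizedSearchCV on a plain scikit-learn estimator (the dataset and parameter range here are arbitrary choices for illustration):

```python
from scipy.stats import uniform
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RandomizedSearchCV

X, y = load_iris(return_X_y=True)

# Sample the regularization strength C from a continuous distribution;
# a plain list also works for discrete choices.
search = RandomizedSearchCV(
    LogisticRegression(max_iter=1000),
    param_distributions={"C": uniform(0.01, 10)},
    n_iter=5, cv=3, random_state=0)
search.fit(X, y)
print(search.best_params_)
```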


In my case I noticed one distinct advantage of Hyperas over GridSearchCV: GridSearchCV's fit takes only a single input array, but I needed to pass two input arrays because I am working with a Siamese network. Hyperas could do this out of the box, so it is more flexible than GridSearchCV in that respect. Check this example
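One common workaround for the single-array limitation described above (my own suggestion, not part of the answer) is to concatenate the two inputs column-wise and split them back apart inside the estimator. The PairDistanceClassifier below is a hypothetical stand-in for a two-branch model:

```python
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.model_selection import GridSearchCV

# Workaround sketch: pack both inputs of a two-branch (Siamese-style)
# model into one array so GridSearchCV's single-X interface still applies.
class PairDistanceClassifier(BaseEstimator, ClassifierMixin):
    def __init__(self, margin=1.0):
        self.margin = margin  # tunable decision threshold on pair distance

    def _split(self, X):
        half = X.shape[1] // 2
        return X[:, :half], X[:, half:]  # recover the two original inputs

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        return self

    def predict(self, X):
        a, b = self._split(X)
        dist = np.linalg.norm(a - b, axis=1)
        return (dist < self.margin).astype(int)  # 1 = "same" pair

rng = np.random.default_rng(0)
a = rng.normal(size=(8, 3))
b = a + rng.normal(scale=0.1, size=(8, 3))  # first 8 rows: similar pairs
b[4:] = rng.normal(size=(4, 3)) + 5.0       # last 4 rows: dissimilar pairs
X = np.hstack([a, b])                       # two inputs packed into one X
y = np.array([1, 1, 1, 1, 0, 0, 0, 0])

search = GridSearchCV(PairDistanceClassifier(),
                      param_grid={"margin": [0.5, 1.0, 2.0]}, cv=2)
search.fit(X, y)
print(search.best_score_)
```

This keeps you inside GridSearchCV, at the cost of the packing/unpacking boilerplate that Hyperas avoids.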
