Parallel hyperparameter optimization techniques?

Most hyperparameter optimization techniques want to evaluate points one by one. I have an expensive optimization problem, but I can run hundreds of evaluations in parallel. The dimensionality of the problem is around 20-30, and my variables are mostly continuous.

Is there any technique with an open-source, documented implementation available for this kind of problem?


The Python hyperopt library can evaluate multiple trials in parallel; it's open source and there's an accompanying paper.
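
For illustration, here is a minimal sketch of running hyperopt trials in parallel through its SparkTrials backend (which requires pyspark; MongoTrials is its other distributed backend). The toy objective and search space are placeholders, not anything from the question:

```python
# Minimal sketch: parallel hyperopt via SparkTrials (assumes pyspark
# is installed). The objective and space below are toy placeholders.
from hyperopt import fmin, tpe, hp, SparkTrials

def objective(params):
    # Replace with your expensive evaluation; hyperopt minimizes this value.
    x, y = params["x"], params["y"]
    return (x - 1.0) ** 2 + (y + 2.0) ** 2

space = {
    "x": hp.uniform("x", -5.0, 5.0),
    "y": hp.uniform("y", -5.0, 5.0),
}

# parallelism controls how many trials run concurrently on the cluster.
trials = SparkTrials(parallelism=8)
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=200, trials=trials)
print(best)
```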

Also, I'm fairly sure AWS SageMaker has a distributed Bayesian algorithm, though it doesn't meet your open-source criterion.


Bayesian optimisation is sequential in the sense that you need to know the value of the function at n points to decide, through an acquisition criterion, the next point to evaluate.

Maybe you could customize it to your problem so that the acquisition returns not one point but a batch of them, which you distribute at the next step, as in the sketch below.
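
One documented implementation of this batch idea is scikit-optimize's ask/tell interface, where `ask(n_points=...)` proposes a whole batch at once using a constant-liar heuristic. A minimal sketch, with a toy objective and an assumed batch size of 8 standing in for the real problem:

```python
# Batch Bayesian optimisation with scikit-optimize's ask/tell API.
# ask(n_points=...) proposes a batch; we evaluate it in parallel and
# tell() the results back to update the surrogate model.
from multiprocessing import Pool
from skopt import Optimizer
from skopt.space import Real

def expensive_objective(x):
    # Placeholder for the real expensive evaluation.
    return sum((xi - 0.5) ** 2 for xi in x)

# ~20-30 continuous dimensions, as in the question; 25 here.
dims = [Real(-1.0, 1.0) for _ in range(25)]
opt = Optimizer(dims, base_estimator="GP", acq_func="EI")

if __name__ == "__main__":
    with Pool(processes=8) as pool:
        for _ in range(10):                      # 10 batches
            batch = opt.ask(n_points=8)          # propose 8 points at once
            values = pool.map(expensive_objective, batch)
            opt.tell(batch, values)              # update the surrogate
    print(min(opt.yi))                           # best objective seen
```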

You can also use a hybrid method. First run a classic grid search, distributed, and evaluate the function at many, many points. Feed all this knowledge (points and objective values at those points) to a classic Bayesian optimiser, which then picks points one by one and fine-tunes the optimisation from there. Not as optimal as the former approach, but less implementation work.
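
A sketch of that hybrid scheme, again assuming scikit-optimize as the "classic" Bayesian optimiser and a placeholder objective: evaluate a large sample in parallel (random here; a grid works the same way), seed the optimiser with it via `tell()`, then continue one point at a time:

```python
# Hybrid scheme: a big parallel sample first, then a sequential
# Bayesian optimiser seeded with all of those results.
import random
from multiprocessing import Pool
from skopt import Optimizer
from skopt.space import Real

def expensive_objective(x):
    # Placeholder for the real expensive evaluation.
    return sum((xi - 0.5) ** 2 for xi in x)

dims = [Real(-1.0, 1.0) for _ in range(25)]
opt = Optimizer(dims, base_estimator="GP")

if __name__ == "__main__":
    # Phase 1: hundreds of evaluations in parallel.
    initial = [[random.uniform(-1.0, 1.0) for _ in range(25)]
               for _ in range(200)]
    with Pool(processes=16) as pool:
        initial_values = pool.map(expensive_objective, initial)
    opt.tell(initial, initial_values)   # seed the surrogate with everything

    # Phase 2: classic one-point-at-a-time Bayesian optimisation.
    for _ in range(30):
        x = opt.ask()
        opt.tell(x, expensive_objective(x))
    print(min(opt.yi))
```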
