GridSearchCV vs. BayesSearchCV

Abdalsamad Keramatfar
2 min read · Dec 29, 2021
Hyperparameter Optimization

In applied machine learning, hyperparameter optimization is an essential step that can boost results, but as you know, it is also very time-consuming.

GridSearchCV and RandomizedSearchCV are two solutions that work with scikit-learn models. I personally do not use the latter because, with bad luck, it can simply miss the best solution in the hyperparameter space. Recently, on a project, I found my hyperparameter space very large, and I also suspected that a better combination might exist outside the space I had considered. Then I found this tutorial. After reading it, two questions related to my situation came up.

Can Bayesian hyperparameter tuning improve on GridSearchCV's results?

How does Bayesian hyperparameter tuning compare with GridSearchCV in terms of time consumed?

Note that both questions originate from my problem: unsatisfactory results and a large hyperparameter space.

So, I decided to extend the tutorial by adding time and performance comparisons between the two rivals. Note that this is just a single case study and the outcome is not guaranteed to repeat in other settings.

I used GridSearchCV and BayesSearchCV with the same dataset and the same model. I defined a similar search space for both, ran each rival 10 times, and report the mean accuracy and elapsed time (see the sketch below; the table after it shows the results).
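To make the protocol concrete, here is a minimal sketch, not the exact notebook code. The dataset (scikit-learn's digits), the SVC model, and the specific search spaces are assumptions I chose for illustration; only the run-several-times-and-average procedure mirrors the experiment described above.

```python
# Minimal sketch of the comparison protocol.
# Assumptions: digits dataset, SVC model, and a small illustrative
# search space -- the article's actual dataset and space may differ.
import time

import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC
from skopt import BayesSearchCV  # from scikit-optimize

X, y = load_digits(return_X_y=True)

# 4 x 4 = 16 grid points for GridSearchCV.
param_grid = {
    "C": [0.1, 1, 10, 100],
    "gamma": [1e-4, 1e-3, 1e-2, 1e-1],
}

def run_search(search, n_runs=10):
    """Fit the search n_runs times; return mean best CV score and mean time.

    GridSearchCV is deterministic, so repeating it mainly averages the
    timing; BayesSearchCV is stochastic, so its score varies across runs.
    """
    scores, times = [], []
    for _ in range(n_runs):
        start = time.time()
        search.fit(X, y)
        times.append(time.time() - start)
        scores.append(search.best_score_)
    return np.mean(scores), np.mean(times)

grid = GridSearchCV(SVC(), param_grid, cv=3)
bayes = BayesSearchCV(
    SVC(),
    # Continuous log-uniform ranges covering the same region as the grid.
    {"C": (0.1, 100.0, "log-uniform"), "gamma": (1e-4, 1e-1, "log-uniform")},
    n_iter=16,  # same evaluation budget as the 4x4 grid
    cv=3,
)

for name, search in [("GridSearchCV", grid), ("BayesSearchCV", bayes)]:
    acc, sec = run_search(search)
    print(f"{name}: mean accuracy={acc:.4f}, mean time={sec:.1f}s")
```

Giving both methods the same evaluation budget (16 fits per run here) keeps the time comparison fair; BayesSearchCV spends part of that budget modeling the search space, which is where its time savings or losses come from.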

Performance and time-consumed comparisons between BayesSearchCV and GridSearchCV

That is it! While Bayesian optimization performs better in terms of time consumed, its accuracy is slightly lower than grid search's. The code is available as a Colab notebook.
