End-to-End ML Project - Beginner Friendly


Result of grid search

I got the best combination of hyperparameters as-

{'max_features': 8, 'n_estimators': 30}
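
For reference, this dictionary is what the best_params_ attribute of the fitted GridSearchCV object returns (I'm assuming the object is named grid_search, as in the code further below)-

print(grid_search.best_params_)     # {'max_features': 8, 'n_estimators': 30}
print(grid_search.best_estimator_)  # the model refitted with these hyperparameters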

You may get a slightly different result due to the stochastic nature of cross-validation, though this is unlikely.

Since these are the maximum values that were evaluated, you should run the grid search again with higher values, as the score may continue to improve.
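
As a rough sketch, a wider search could look like this. I'm assuming the same setup as before: a RandomForestRegressor and training data named housing_prepared / housing_labels (your variable names may differ), and the grid values here are only illustrative-

from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

# Extend the grid beyond the previous maxima (n_estimators=30, max_features=8)
# to check whether the score keeps improving at larger values
param_grid = [
    {'n_estimators': [30, 50, 100, 200], 'max_features': [8, 10, 12]},
]

forest_reg = RandomForestRegressor(random_state=42)
grid_search = GridSearchCV(forest_reg, param_grid, cv=5,
                           scoring='neg_mean_squared_error',
                           return_train_score=True)
grid_search.fit(housing_prepared, housing_labels)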

You can also check the evaluation scores for all combinations by running the following code-

import numpy as np

cvres = grid_search.cv_results_
for mean_score, params in zip(cvres["mean_test_score"], cvres["params"]):
    # negate the score and take the square root to recover the RMSE
    # (this assumes the search was run with scoring='neg_mean_squared_error')
    print(np.sqrt(-mean_score), params)

In our case, this prints the RMSE for each combination of hyperparameters.

Also, when you have to explore a large number of combinations, GridSearchCV can take a long time to run. In such cases, you can use randomized search for hyperparameter tuning. It works much like grid search, except that it doesn't try every combination: it samples combinations at random and evaluates the model on those. Hence, we can try a large number of combinations in less time.
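
A minimal sketch of the same search done with RandomizedSearchCV, again assuming the housing_prepared / housing_labels training data from earlier (the distributions below are only an example)-

from scipy.stats import randint
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

# Instead of trying every combination, sample n_iter combinations
# at random from the given distributions
param_distribs = {
    'n_estimators': randint(low=1, high=200),
    'max_features': randint(low=1, high=8),
}

forest_reg = RandomForestRegressor(random_state=42)
rnd_search = RandomizedSearchCV(forest_reg, param_distributions=param_distribs,
                                n_iter=10, cv=5,
                                scoring='neg_mean_squared_error',
                                random_state=42)
rnd_search.fit(housing_prepared, housing_labels)

Here n_iter=10 means only 10 random combinations are evaluated, no matter how large the search space is.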

You can refer to the RandomizedSearchCV documentation and the scikit-learn guide on grid and randomized search for further details.

