End-to-End ML Project - California Housing

End-to-End ML Project - Fine-tune your model with Grid Search

We will further fine-tune our models using hyperparameter tuning with GridSearchCV. It loops through the predefined hyperparameter combinations and fits your estimator (model) on your training set for each one. After this you can select the best set of parameters from the listed hyperparameters to use with your model.
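
As a minimal, self-contained sketch of the same pattern, the snippet below runs a grid search on a small synthetic dataset; the dataset and grid values here are only illustrative and are not part of the housing project.

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import GridSearchCV

    # Synthetic stand-in data, just to show the API end to end
    X, y = make_regression(n_samples=200, n_features=8, noise=10.0, random_state=42)

    # 2 x 2 = 4 parameter combinations, each evaluated with 3-fold cross-validation
    toy_grid = {'n_estimators': [10, 30], 'max_features': [2, 4]}
    search = GridSearchCV(RandomForestRegressor(random_state=42), toy_grid,
                          cv=3, scoring='neg_mean_squared_error')
    search.fit(X, y)

    print(search.best_params_)           # best combination found
    print(np.sqrt(-search.best_score_))  # its RMSE (scores are negative MSE)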

INSTRUCTIONS
  • First we will import GridSearchCV from Scikit-learn

    from sklearn.model_selection import <<your code goes here>>
    
  • Then we will define a set of values to try for n_estimators and max_features in your model. The first grid tries 3 values of n_estimators and 4 values of max_features, for a total of 12 combinations of parameters. The second grid sets the bootstrap hyperparameter to False and tries 2 × 3 = 6 more combinations, as shown below (a quick way to verify the total count follows the snippet)

    param_grid = [
        {'n_estimators': [3, 10, 30], 'max_features': [2, 4, 6, 8]},
        {'bootstrap': [False], 'n_estimators': [3, 10], 'max_features': [2, 3, 4]},
      ]
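
  • Optionally, you can double-check the number of combinations with ParameterGrid, which expands the same kind of specification that GridSearchCV loops over (this is just a sanity check, not a required step)

    from sklearn.model_selection import ParameterGrid

    # 3*4 combinations from the first dict plus 2*3 from the second = 18 in total
    print(len(list(ParameterGrid(param_grid))))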
    
  • Now we will run the grid search with these combinations of hyperparameters on our Random Forest model (a note on reading the best score follows the snippet)

    from sklearn.ensemble import RandomForestRegressor  # if not already imported earlier in the project

    forest_reg = RandomForestRegressor(random_state=42)
    grid_search = GridSearchCV(forest_reg, param_grid, cv=5,
                               scoring='neg_mean_squared_error',
                               return_train_score=True)
    grid_search.fit(housing_prepared, housing_labels)
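
  • With cv=5, the search trains each of the 18 combinations on 5 folds, i.e. 90 rounds of training, so this step can take a while. Because the scorer returns negative MSE, the best score has to be negated and square-rooted to read it as an RMSE; one way to look at it (assuming NumPy is imported as np, as elsewhere in this project) is

    # Best mean cross-validation score, converted from negative MSE to RMSE
    print(np.sqrt(-grid_search.best_score_))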
    
  • Now let's see the best combination of parameters

    grid_search.best_params_
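
  • If the best values turn out to sit at the edge of the ranges you specified (for example the largest n_estimators you listed), it is worth searching again around larger values. A hypothetical wider grid might look like this (the names and values below are only illustrative)

    # Illustrative follow-up search with a wider grid (hypothetical values)
    wider_param_grid = [
        {'n_estimators': [30, 100, 300], 'max_features': [4, 6, 8]},
      ]
    grid_search_wider = GridSearchCV(forest_reg, wider_param_grid, cv=5,
                                     scoring='neg_mean_squared_error')
    grid_search_wider.fit(housing_prepared, housing_labels)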
    
  • And the best estimator itself (a usage sketch follows the snippet)

    grid_search.best_estimator_
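
  • Because GridSearchCV refits the best estimator on the whole training set by default (refit=True), the fitted grid_search, or its best_estimator_, can be used for predictions right away; the slice below is only an illustration

    # grid_search.predict delegates to the refitted best_estimator_
    some_data = housing_prepared[:5]    # illustrative slice of the prepared data
    print(grid_search.predict(some_data))
    print(grid_search.best_estimator_.predict(some_data))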
    
  • Finally, let's compute the results and print the RMSE score for each combination of parameters (an optional tabular view follows the snippet)

    import numpy as np  # if not already imported earlier in the project

    cvres = grid_search.cv_results_
    for mean_score, params in zip(cvres["mean_test_score"], cvres["params"]):
        print(np.sqrt(-mean_score), params)
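
  • Optionally, the same results are easier to scan as a table; assuming pandas is available in your environment, one way is

    import pandas as pd

    # One row per tried combination, sorted with the lowest RMSE first
    results_df = pd.DataFrame(grid_search.cv_results_)
    results_df['rmse'] = np.sqrt(-results_df['mean_test_score'])
    print(results_df[['params', 'rmse']].sort_values('rmse').head())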
    