1 Ensemble Learning Part 1
2 Ensemble Learning Part 2
3 Ensemble Learning Part 3
4 The model which consists of a group of predictors is...
5 A Random Forest is an ensemble of Decision Trees...
6 The steps involved in deciding the output of a Random...
7 A hard voting classifier takes into consideration...
8 If each classifier is a weak learner, the ensemble can...
9 Ensemble methods work best when the predictors are...
10 To get diverse classifiers we cannot train them using different...
11 Training the classifiers in an ensemble using very different algorithms...
12 When we consider only the majority of the outputs from...
13 Soft voting takes into consideration...
14 In soft voting, the predicted class is the class with...
15 Soft voting achieves higher performance than hard voting because...
16 The parameter which decides the voting method in a VotingClassifier...
17 The parameter which holds the list of classifiers which are...
18 One way to get a diverse set of classifiers is...
19 When sampling is performed with replacement, the method is...
20 When sampling is performed without replacement, it is called...
21 Both bagging and pasting allow training instances to be sampled...
22 In bagging/pasting, training-set sampling and training of the predictors can all...
23 To use the bagging method, the value of the bootstrap...
24 To use the pasting method, the value of the bootstrap...
25 Overall, bagging often results in better models...
26 How many training instances with replacement does the BaggingClassifier train...
27 With bagging, it is not possible that some instances are...
28 Features can also be sampled in the BaggingClassifier...
29 The hyperparameters which control the feature sampling are...
30 Sampling both training instances and features is called the...
31 Keeping all training instances (i.e., bootstrap=False and max_samples=1.0) but sampling...
32 Random Forest is an ensemble of Decision Trees generally trained...
33 We can make the trees of a Random Forest even...
34 If we look at a single Decision Tree, important features...
35 Feature importances are available via the feature_importances_ attribute of the...
36 The general idea of most boosting methods is to train...
37 One of the drawbacks of the AdaBoost classifier is that...
38 A Decision Stump is a Decision Tree with...
39 In Gradient Boosting, instead of tweaking the instance weights at...
40 The learning_rate hyperparameter of GradientBoostingRegressor scales the contribution of each...
41 The ensemble method in which we train a model to...
42 XGBoost
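The sketches below illustrate the topics covered by the questions above. They are minimal examples built on scikit-learn with illustrative datasets and hyperparameters, not the course's reference solutions. First, the VotingClassifier questions (hard vs. soft voting, and the voting and estimators parameters):

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# 'estimators' holds the list of (name, classifier) pairs to ensemble;
# 'voting' selects the voting method: "hard" (majority of predicted classes)
# or "soft" (class with the highest averaged predicted probability).
voting_clf = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(random_state=42)),
        ("rf", RandomForestClassifier(random_state=42)),
        ("svc", SVC(probability=True, random_state=42)),  # probabilities needed for soft voting
    ],
    voting="soft",
)
voting_clf.fit(X_train, y_train)
print(voting_clf.score(X_test, y_test))
```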
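For the bagging/pasting questions, the sketch below (again an assumption-laden toy example) shows how the bootstrap flag switches between bagging (sampling with replacement) and pasting (sampling without replacement), and where max_features and bootstrap_features come in for Random Subspaces and Random Patches:

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# bootstrap=True  -> bagging (instances sampled with replacement)
# bootstrap=False -> pasting (instances sampled without replacement)
bag_clf = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=500,
    max_samples=100,      # training instances drawn per predictor
    bootstrap=True,
    n_jobs=-1,            # predictors can be trained in parallel
    random_state=42,
)
bag_clf.fit(X_train, y_train)
print(bag_clf.score(X_test, y_test))

# Feature sampling is controlled by max_features and bootstrap_features.
# Sampling features only (bootstrap=False, max_samples=1.0) is the Random
# Subspaces method; sampling both instances and features is Random Patches.
```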
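The Random Forest and feature-importance questions map onto the following sketch (iris is just a convenient built-in dataset assumed here); note that feature_importances_ is an attribute exposed after fitting:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

iris = load_iris()

# A Random Forest is an ensemble of Decision Trees, generally trained
# via the bagging method, with extra randomness in the split search.
rnd_clf = RandomForestClassifier(n_estimators=500, n_jobs=-1, random_state=42)
rnd_clf.fit(iris["data"], iris["target"])

# Important features tend to appear closer to the root of the individual
# trees; the fitted model aggregates this into feature_importances_.
for name, score in zip(iris["feature_names"], rnd_clf.feature_importances_):
    print(name, round(score, 3))
```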
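For the AdaBoost questions: a Decision Stump is a Decision Tree with max_depth=1 (a single decision node plus two leaves), and one commonly cited drawback of AdaBoost is that its sequential reweighting cannot be parallelized. The hyperparameters below are illustrative assumptions:

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Each new stump pays more attention to the training instances that the
# previous predictors misclassified (instance-weight boosting).
ada_clf = AdaBoostClassifier(
    DecisionTreeClassifier(max_depth=1),  # a Decision Stump
    n_estimators=200,
    learning_rate=0.5,
    random_state=42,
)
ada_clf.fit(X_train, y_train)
print(ada_clf.score(X_test, y_test))
```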
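The Gradient Boosting questions are summarized by the sketch below: instead of tweaking instance weights, each new tree is fit to the residual errors of the current ensemble, and learning_rate scales the contribution of each tree. The quadratic toy data is assumed for illustration:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.RandomState(42)
X = rng.rand(100, 1) - 0.5
y = 3 * X[:, 0] ** 2 + 0.05 * rng.randn(100)

# learning_rate scales the contribution of each tree; a lower value
# typically needs more trees (n_estimators) but generalizes better.
gbrt = GradientBoostingRegressor(
    max_depth=2, n_estimators=200, learning_rate=0.1, random_state=42)
gbrt.fit(X, y)
print(gbrt.predict([[0.2]]))
```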
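The ensemble method in which a model is trained to aggregate the predictions of the base predictors is stacking; scikit-learn's StackingClassifier, used below with assumed base estimators and a Logistic Regression blender, is one way to try it:

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# The final_estimator (blender) is trained on out-of-fold predictions
# of the base estimators, produced via cross-validation.
stack_clf = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(random_state=42)),
        ("svc", SVC(random_state=42)),
    ],
    final_estimator=LogisticRegression(),
    cv=5,
)
stack_clf.fit(X_train, y_train)
print(stack_clf.score(X_test, y_test))
```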
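Finally, for the XGBoost item, a minimal sketch assuming the third-party xgboost package is installed; its XGBRegressor follows the familiar scikit-learn fit/predict API:

```python
import numpy as np
import xgboost

rng = np.random.RandomState(42)
X = rng.rand(100, 1) - 0.5
y = 3 * X[:, 0] ** 2 + 0.05 * rng.randn(100)

# XGBoost is an optimized gradient-boosting library; this estimator
# mirrors the GradientBoostingRegressor example above.
xgb_reg = xgboost.XGBRegressor(n_estimators=200, learning_rate=0.1, random_state=42)
xgb_reg.fit(X, y)
print(xgb_reg.predict(X[:3]))
```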