Training Models


Batch Gradient Descent performs its calculations over the full training set X at every Gradient Descent step, so it is terribly slow on very large training sets. However, Gradient Descent scales well with the number of features. True or False?
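To see why each step touches the whole training set, here is a minimal sketch of Batch Gradient Descent for linear regression in NumPy; the data, learning rate, and iteration count are illustrative choices, not part of the question.

```python
import numpy as np

# Illustrative synthetic data: y = 4 + 3x + noise (values are assumptions)
rng = np.random.default_rng(42)
m = 100                                       # number of training instances
X = 2 * rng.random((m, 1))                    # single feature
y = 4 + 3 * X + rng.standard_normal((m, 1))   # targets with Gaussian noise

X_b = np.c_[np.ones((m, 1)), X]               # add bias term x0 = 1
eta = 0.1                                     # learning rate (assumed value)
theta = rng.standard_normal((2, 1))           # random initialization

for _ in range(1000):
    # Each step uses ALL m instances -- this is what makes it "batch"
    gradients = (2 / m) * X_b.T @ (X_b @ theta - y)
    theta -= eta * gradients

print(theta.ravel())  # should approach the true parameters [4, 3]
```

Note that the cost per step grows with the number of instances m (the full matrix product over X_b), which is why batch GD is slow on huge training sets, while adding more feature columns only widens the matrices modestly.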




No hints are available for this assessment.

The answer is not available for this assessment.
