Project - Building Cat vs Non-Cat Image Classifier using NumPy and ANN


Cat vs Non-cat Classifier - Defining some utility functions - Back Propagation

In backward propagation, we compute the gradients of the cost with respect to the weights w and the bias b (the formulas are written out after the two lists below).

We compute the gradient of the weights, dw, by:

  • calculating the dot product of X and (A-Y).T

  • dividing the result by m, the total number of samples in the dataset.

We compute the gradient of the bias, db, by:

  • calculating np.sum(A-Y)

  • dividing the result by m, the total number of samples in the dataset.
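
Written out, these two steps correspond to the standard logistic-regression gradients of the cost J (a sketch, assuming X has shape (num_features, m) and A, Y have shape (1, m), consistent with the code below):

    \frac{\partial J}{\partial w} = \frac{1}{m} X (A - Y)^T,
    \qquad
    \frac{\partial J}{\partial b} = \frac{1}{m} \sum_{i=1}^{m} \left( a^{(i)} - y^{(i)} \right)

where a^(i) is the prediction and y^(i) the label for the i-th training example.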

Note:

  • np.dot(a,b) computes the dot product of two 1-D vectors a and b; for 2-D arrays, as used here, it performs matrix multiplication (a short example of both calls follows this note).

  • np.sum(x) calculates the sum of all the elements in the input array.
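
A minimal illustration of both calls (the array values below are arbitrary and only for demonstration):

    import numpy as np

    a = np.array([1.0, 2.0, 3.0])
    b = np.array([4.0, 5.0, 6.0])

    # dot product of two 1-D vectors: 1*4 + 2*5 + 3*6 = 32.0
    print(np.dot(a, b))

    # for 2-D arrays, np.dot performs matrix multiplication: (2, 3) . (3, 1) -> (2, 1)
    M = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])
    col = np.array([[0.1], [0.2], [0.3]])
    print(np.dot(M, col))                # [[1.4], [3.2]]

    # sum of all the elements in an array: 1 + 2 + 3 + 4 = 10.0
    print(np.sum(np.array([[1.0, 2.0], [3.0, 4.0]])))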

INSTRUCTIONS
  • The back_prop function below calculates the gradients of w and b. Copy and paste the following function; a quick sanity check on random data follows it.

    def back_prop(X, A, Y):

        # m is the number of training examples (X has shape (num_features, m))
        m = X.shape[1]

        # gradient of the cost with respect to the weights: (1/m) * X . (A-Y).T
        dw = (1/m) * np.dot(X, (A-Y).T)

        # gradient of the cost with respect to the bias: (1/m) * sum(A-Y)
        db = (1/m) * np.sum(A-Y)

        grads = {'dw': dw, 'db': db}

        return grads
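
As a quick sanity check, the function can be called on small randomly generated arrays (a sketch; the shapes below are made up for illustration and do not come from the actual dataset):

    import numpy as np

    # 4 features per example, 3 examples
    X = np.random.randn(4, 3)        # input data, shape (num_features, m)
    A = np.random.rand(1, 3)        # activations from forward propagation, in (0, 1)
    Y = np.array([[1, 0, 1]])       # 0/1 labels (cat vs non-cat)

    grads = back_prop(X, A, Y)
    print(grads['dw'].shape)        # (4, 1) -- same shape as the weight vector w
    print(grads['db'])              # a single float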
    

