
In backward propagation, we calculate the derivatives of the weights and the bias.

We compute the gradient of the weights by:

- calculating the dot product of `X` and `(A-Y).T`
- dividing the result by `m`, the total number of samples in the dataset

We compute the gradient of the bias by:

- calculating `np.sum(A-Y)`
- dividing the result by `m`, the total number of samples in the dataset

**Note**: `np.dot(a, b)` calculates the dot product of the two vectors `a` and `b`; `np.sum(x)` calculates the sum of the elements in the input array.
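A minimal illustration of the two NumPy helpers mentioned in the note (the arrays here are arbitrary example values):

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])

# dot product: 1*4 + 2*5 + 3*6
print(np.dot(a, b))  # 32

# sum of elements: 1 + 2 + 3
print(np.sum(a))     # 6
```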

The `back_prop` function below calculates the gradients of `w` and `b`. Copy-paste the following function:

```python
def back_prop(X, A, Y):
    # calculate gradients for w, b
    m = X.shape[1]
    dw = (1/m) * np.dot(X, (A-Y).T)
    db = (1/m) * np.sum(A-Y)
    grads = {'dw': dw, 'db': db}
    return grads
```

