
In backward propagation, we calculate the derivatives of the weights and the bias.

We compute the gradient of the weights by:

- calculating the dot product of `X` and `(A-Y).T`
- dividing the result by `m`, the total number of samples in the dataset.

We compute the gradient of the bias by:

- calculating `np.sum(A-Y)`
- dividing the result by `m`, the total number of samples in the dataset.
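Written as equations, the two steps above become the following (here `J` denotes the usual logistic-regression cost, which this page does not name explicitly, and `a^{(i)}`, `y^{(i)}` are the activation and label of the i-th sample):

```latex
\frac{\partial J}{\partial w} = \frac{1}{m}\, X\,(A - Y)^{T}
\qquad
\frac{\partial J}{\partial b} = \frac{1}{m} \sum_{i=1}^{m} \left( a^{(i)} - y^{(i)} \right)
```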

**Note**: `np.dot(a, b)` calculates the dot product of the two vectors `a` and `b`. `np.sum(x)` calculates the sum of the elements in the input array.
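For instance, a quick sanity check with made-up values:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

print(np.dot(a, b))  # inner product: 1*4 + 2*5 + 3*6 = 32.0
print(np.sum(a))     # sum of elements: 1 + 2 + 3 = 6.0
```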

The `back_prop` function below calculates the gradients of `w` and `b`. Copy-paste the following function.

```python
def back_prop(X, A, Y):
    # calculate gradients for w, b
    m = X.shape[1]                    # total number of samples
    dw = (1/m) * np.dot(X, (A-Y).T)  # gradient of the weights
    db = (1/m) * np.sum(A-Y)         # gradient of the bias
    grads = {'dw': dw, 'db': db}
    return grads
```
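As a quick sanity check, here is a hypothetical toy call (the shapes follow from `m = X.shape[1]`: `X` is features × samples, and `A` and `Y` are row vectors; the values are made up):

```python
import numpy as np

# Toy data: 2 features, 3 samples (m = X.shape[1] = 3).
X = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])    # shape (2, 3)
A = np.array([[0.8, 0.4, 0.6]])    # predicted activations, shape (1, 3)
Y = np.array([[1.0, 0.0, 1.0]])    # true labels, shape (1, 3)

grads = back_prop(X, A, Y)
print(grads['dw'])   # weight gradient, shape (2, 1)
print(grads['db'])   # bias gradient, a scalar
```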
