Project - Building Cat vs Non-Cat Image Classifier using NumPy and ANN


Cat vs Non-cat Classifier - Defining some utility functions - Propagate

Forward and backward propagation are the key steps that give us the cost and the gradients needed to train our algorithm.

In forward propagation, we calculate the dot product of the weights and the input features X, add the bias to the result, and apply the sigmoid function to get the activations. The cost is then calculated from these activations.
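
For reference, here is a minimal sketch of what forward_prop could look like, assuming a sigmoid helper and the binary cross-entropy cost (the exact implementation expected by this course may differ):

    import numpy as np

    def sigmoid(z):
        # Element-wise sigmoid activation
        return 1 / (1 + np.exp(-z))

    def forward_prop(w, b, X, Y):
        m = X.shape[1]                       # number of training examples
        A = sigmoid(np.dot(w.T, X) + b)      # activations, shape (1, m)
        # Binary cross-entropy cost averaged over the m examples
        cost = -(1 / m) * np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A))
        return A, np.squeeze(cost)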

In backward propagation, we calculate the gradients of the cost with respect to the weights and the bias.
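
Similarly, a minimal sketch of back_prop, assuming it returns the gradients in a dictionary (the key names dw and db are an assumption):

    def back_prop(X, A, Y):
        m = X.shape[1]                   # number of training examples
        dZ = A - Y                       # derivative of the cost w.r.t. z for sigmoid + cross-entropy
        dw = (1 / m) * np.dot(X, dZ.T)   # gradient w.r.t. the weights, same shape as w
        db = (1 / m) * np.sum(dZ)        # gradient w.r.t. the bias (a scalar)
        return {"dw": dw, "db": db}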

Let us now write a propagate function that calls both the forward_prop and back_prop functions.

INSTRUCTIONS
  • Call the forward_prop and back_prop functions at the appropriate places in the propagate function below.

    def propagate(w, b, X, Y):
    
        #Forward propagation
        A, cost = << your code comes here >>(w, b, X, Y)
    
        #Backward propagation
        grads = << your code comes here >>(X, A, Y)
    
        return grads, cost
    
  • For example, we could get the gradients and cost returned by the function propagate as follows:

    import numpy as np

    w, b, X, Y = np.array([[1], [2]]), 2, np.array([[1,2], [3,4]]), np.array([[1, 0]])
    grads, cost = propagate(w, b, X, Y)
    
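With the sketches above, grads would be a dictionary holding dw (of shape (2, 1) here, matching w) and db (a scalar), while cost would be a single number. A quick sanity check could look like this (the dw and db keys are an assumption):

    print(grads["dw"].shape)   # (2, 1), same shape as w
    print(grads["db"])         # scalar gradient for the bias
    print(cost)                # scalar cross-entropy cost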

