We shall add dense layers on top of the pre-trained layers so that the model learns our cat vs non-cat dataset, and then use it on our test data.
Let us see how we can do that.
Create the input layer as follows.
inp = Input(shape=(64, 64, 3), name='image_input')
Since our dataset has images of shape 64 x 64 x 3, we shall set the input shape to match. We shall also name the layer image_input.
Initialize a sequential model:
#initiate a model
vgg_model = Sequential()
Now, add the pre-trained vgg_base to the sequential model vgg_model we have initialized.
#Add the VGG base model
vgg_model.add(vgg_base)
We shall now add the dense layers, which we will train further:
vgg_model.add(GlobalAveragePooling2D())
vgg_model.add(Dense(1024, activation='relu'))
vgg_model.add(Dropout(0.6))
vgg_model.add(Dense(512, activation='relu'))
vgg_model.add(Dropout(0.5))
vgg_model.add(Dense(1024, activation='relu'))
vgg_model.add(Dropout(0.4))
vgg_model.add(Dense(1024, activation='relu'))
vgg_model.add(Dropout(0.3))
vgg_model.add(Dense(1, activation='sigmoid'))
We have first added the GlobalAveragePooling2D layer, followed by a dense layer with 1024 neurons and the relu activation function.
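To see what GlobalAveragePooling2D does, here is a minimal NumPy sketch with toy numbers (not the actual VGG features): it averages each channel over the spatial dimensions, turning an (H, W, C) feature map into a single C-length vector per image.

```python
import numpy as np

# Toy feature map: batch of 1, a 2 x 2 spatial grid, 3 channels,
# standing in for the VGG base's output.
features = np.array([[[[1.,  2.,  3.],
                       [5.,  6.,  7.]],
                      [[9., 10., 11.],
                       [13., 14., 15.]]]])   # shape (1, 2, 2, 3)

# Global average pooling: average over the spatial axes (height, width),
# leaving one value per channel.
pooled = features.mean(axis=(1, 2))
print(pooled)   # shape (1, 3) -> [[7. 8. 9.]]
```

Each channel's value is the mean of its four spatial positions, so the pooled output feeds the following dense layers a fixed-size vector regardless of the spatial size of the feature map.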
Observe that the first Dropout rate is 0.6, which means 60% of the neurons are randomly ignored during each pass in the training phase, to make sure the network does not overfit. Note that dropout is not applied at test time.
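This train/test behaviour can be sketched in plain NumPy. The sketch below uses the common inverted-dropout formulation: during training a fraction rate of the activations is zeroed and the survivors are scaled by 1 / (1 - rate) so the expected activation stays the same, while at test time the layer simply passes its input through.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, rate, training):
    """Inverted-dropout sketch: zero a fraction `rate` of activations
    during training (scaling the survivors by 1/(1-rate)); at test
    time, return the input unchanged."""
    if not training:
        return x
    mask = rng.random(x.shape) >= rate          # keep ~ (1-rate) of units
    return x * mask / (1.0 - rate)

x = np.ones(10)
train_out = dropout(x, rate=0.6, training=True)   # mix of 0.0 and 2.5
test_out = dropout(x, rate=0.6, training=False)   # identical to x
```

In training mode each unit is either dropped (0.0) or scaled up (here 1 / 0.4 = 2.5); in test mode the output equals the input, which matches the note above that dropout is inactive at test time.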
Similarly, the other dense and dropout layers are added. Finally, a dense layer with a single neuron is added, which is the output layer. There we have used the sigmoid activation function, which squashes the output to a value between 0 and 1, suitable for our binary cat vs non-cat classification.
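As a quick sanity check on the head we just built, we can count its trainable parameters with plain arithmetic. This assumes the vgg_base is VGG16 with include_top=False, whose final convolutional block outputs 512 channels, so GlobalAveragePooling2D hands the first dense layer a 512-vector; GlobalAveragePooling2D and Dropout themselves add no parameters.

```python
# Parameter count of the classification head, assuming the VGG base
# ends in 512 channels (true for VGG16 with include_top=False).
def dense_params(n_in, n_out):
    # A fully connected layer has n_in * n_out weights plus n_out biases.
    return n_in * n_out + n_out

# GAP output size, then the width of each Dense layer in order.
sizes = [512, 1024, 512, 1024, 1024, 1]
total = sum(dense_params(a, b) for a, b in zip(sizes, sizes[1:]))
print(total)   # 2626049
```

Roughly 2.6 million trainable parameters in the head alone, which is why the aggressive dropout rates above are helpful on a small cat vs non-cat dataset.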