Let us first see how many trainable parameters there are in model_B, which we trained previously. Then we shall create a new model, model_B_on_A, which reuses the pre-trained layers of model_A but has a customized final dense layer with only 1 neuron. Finally, we shall compare the performance of the two models, model_B and model_B_on_A.
View the summary of model_B by calling summary() on it.
model_B.<<your code comes here >>
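For reference, the completed line from the instruction above would be:
model_B.summary()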
We see that there are 275,801 trainable parameters in model_B.
Now, before creating model_B_on_A (a model based on the pre-trained layers of model_A), we shall clone model_A and set its trained weights on the clone, so that training model_B_on_A will not affect model_A.
We can copy the model_A architecture using keras.models.clone_model. Note that clone_model copies the architecture only, not the trained weights, which is why we set the weights explicitly in the next step.
Create model_A_clone, a copy of model_A.
model_A_clone = keras.models.clone_model(model_A)
Get the weights of model_A using get_weights(), and set them as the parameters of model_A_clone using set_weights().
model_A_clone.<< your code comes here >>(model_A.get_weights())
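Filled in, this step would read:
model_A_clone.set_weights(model_A.get_weights())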
Now, create a new model, model_B_on_A, based on all the layers of model_A_clone except the output layer. Using the clone's layers ensures that training model_B_on_A does not modify model_A.
<< your code comes here >> = keras.models.Sequential(model_A_clone.layers[:-1])
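With the model name filled in, the completed line would be:
model_B_on_A = keras.models.Sequential(model_A_clone.layers[:-1])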
Add the final dense layer with 1 neuron to model_B_on_A. Set the activation to "sigmoid", as this is a binary classification problem.
model_B_on_A.add(keras.layers.Dense(1, activation=<< your code comes here >>))
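With the activation filled in as described, this becomes:
model_B_on_A.add(keras.layers.Dense(1, activation="sigmoid"))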
Set all the layers of model_B_on_A, except the last layer, to be non-trainable.
for layer in model_B_on_A.layers[:-1]:
    layer.trainable = False
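Note that in Keras, a change to a layer's trainable attribute only takes effect once the model is compiled, which we do below. If you want to verify the freezing, an optional check (not part of the original exercise) is to print each layer's trainable flag:
# Optional sanity check: confirm only the last layer is still trainable
for layer in model_B_on_A.layers:
    print(layer.name, layer.trainable)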
Now check the number of trainable parameters of model_B_on_A.
model_B_on_A.summary()
We observe that there are only 51 trainable parameters in model_B_on_A (the new output layer's 50 weights plus 1 bias, since all the other layers are frozen), while model_B has as many as 275,801 trainable parameters.
Compile the model model_B_on_A using model.compile:
Set loss="binary_crossentropy".
Set optimizer=keras.optimizers.SGD(lr=1e-3).
model_B_on_A.compile(loss=<< your code comes here >>,
                     optimizer=<< your code comes here >>,
                     metrics=["accuracy"])
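Filled in, the compile call would look like this (note that newer Keras versions use learning_rate in place of the older lr argument):
model_B_on_A.compile(loss="binary_crossentropy",
                     optimizer=keras.optimizers.SGD(lr=1e-3),
                     metrics=["accuracy"])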
Now train model_B_on_A using model.fit.
history = model_B_on_A.fit(X_train_B, y_train_B, epochs=5,
                           validation_data=(X_valid_B, y_valid_B))
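To actually compare the performance of the two models, as promised at the start, you could evaluate both on the same held-out data. This is a minimal sketch, assuming test arrays X_test_B and y_test_B exist (they are not defined in this exercise):
# Hypothetical comparison step; X_test_B and y_test_B are assumed to exist
model_B.evaluate(X_test_B, y_test_B)
model_B_on_A.evaluate(X_test_B, y_test_B)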