Does adding many FC layers during re-training increase the model size? Are there ways to reduce the size of the model?

I am re-training a pretrained VGG16 model.

In the last layers, I'm using two FC layers of 2048 units each, with dropout = 0.5.

When I saved the model, its size turned out to be 2 GB, which is huge. I trained it on 9 classes (900 images).
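For scale, here is a rough parameter count for the setup described above. The input shape (224x224x3), the VGG16 conv-base parameter count, and the 7x7x512 flatten size are assumptions for illustration, not given in the question:

```python
# Back-of-envelope parameter count (assumed shapes, see note above).
conv_base = 14_714_688            # VGG16 convolutional layers (standard count)
fc1 = 25088 * 2048 + 2048         # flatten (7*7*512=25088) -> FC(2048), weights + biases
fc2 = 2048 * 2048 + 2048          # FC(2048) -> FC(2048)
out = 2048 * 9 + 9                # FC(2048) -> 9 classes
total = conv_base + fc1 + fc2 + out

print(total)                      # ~70M parameters
print(total * 4 / 2**20)          # ~268 MB at float32
```

Under these assumptions the weights alone are well under 2 GB, so a 2 GB file likely also contains optimizer state (Adam, for instance, keeps two extra float32 tensors per weight) or other training bookkeeping; saving weights only is often the first easy win.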

Is there any way I could reduce the size of the model without affecting its performance?

The reason is that I have to deploy this model on AWS, where the instances have 8 GB of RAM. I am worried that such a large model would hurt performance there.

Topic cnn keras deep-learning neural-network

Category Data Science


You can change the type of your weights: you will lose some precision, but you reduce the size of the stored weights. You can also apply regularisation during training (e.g. an L1 penalty) to learn sparse weight matrices, which can then be pruned and stored compactly.
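A minimal numpy sketch of both ideas, using a hypothetical 2048x2048 weight matrix the size of one of the question's FC layers (the shape and the pruning threshold are illustrative assumptions):

```python
import numpy as np

# Hypothetical FC-layer weight matrix (2048x2048 is an assumption).
rng = np.random.default_rng(0)
w32 = rng.standard_normal((2048, 2048)).astype(np.float32)

# Idea 1: cast to half precision -- half the bytes, some precision lost.
w16 = w32.astype(np.float16)
print(w32.nbytes)   # 16777216 bytes (~16 MB)
print(w16.nbytes)   # 8388608 bytes (~8 MB)

# Idea 2: sparsity. After training with an L1 penalty, many weights end
# up near zero and can be pruned; here we simulate that by zeroing
# small-magnitude entries (threshold chosen for illustration).
threshold = 2.0
sparse = np.where(np.abs(w32) > threshold, w32, 0.0)
print(np.mean(sparse == 0))   # fraction of weights that could be pruned
```

A pruned matrix like this can be stored in a sparse format (e.g. `scipy.sparse`), keeping only the non-zero entries and their indices on disk.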

