Using VGG16 or Inception with weights equal to None

When using pre-trained models like VGG16 or Inception, one of the main benefits is saving training time.

Is there a reason to use these pre-trained models without loading the weights, i.e. with random weights?

Topic inceptionresnetv2 vgg16 transfer-learning inception deep-learning

Category Data Science


The advantage of using a pre-trained model without loading the weights (which means you are only using the model architecture, not a pre-trained version of it) is that you can easily reuse a proven architecture for your own problem. This can save you quite some time, since you don't have to build the architecture yourself in TensorFlow/Keras/PyTorch and can go straight to applying the model to your data.
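As a minimal sketch (assuming TensorFlow/Keras is installed), passing `weights=None` to a Keras application gives you the architecture with randomly initialised weights, which also frees you to change the input shape and number of output classes:

```python
# Sketch: VGG16 architecture only, no pre-trained ImageNet weights.
from tensorflow.keras.applications import VGG16

# weights=None -> random initialisation. Because no pre-trained weights
# are loaded, the input shape and class count can differ from ImageNet's.
model = VGG16(weights=None, input_shape=(64, 64, 3), classes=10)

# For transfer learning you would instead pass weights="imagenet",
# which downloads and loads the pre-trained filters.
```

Note that with `weights="imagenet"` and `include_top=True`, the input shape is fixed to the shape the network was pre-trained on; with `weights=None` that restriction does not apply.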
