Fine-tuning a CNN with imbalanced data gives good results - not sure why

I'm fine-tuning CaffeNet for a 3-class classification problem. I have 100 instances of class A, 90 instances of class B, and 30 instances of class C. I thought my net would be biased toward classes A and B, but I'm actually getting quite good results. I know that CaffeNet doesn't handle imbalanced data for me. Maybe it has to do with the fact that my entire training set fits into one batch? Or maybe it's because I'm not really training from scratch but mostly using CaffeNet's already-learned weights? Thanks
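For context, a common remedy when imbalance does hurt is to weight the loss inversely to class frequency. Below is a minimal sketch computing such weights for the 100/90/30 split above; Caffe does not do this automatically (in Caffe you would apply the weights via something like an InfogainLoss layer, which is an assumption about setup, not part of the question):

```python
# Hypothetical illustration: inverse-frequency class weights for the
# 100/90/30 split described in the question. These are normalized so
# that the weighted total count equals the unweighted total.
counts = {"A": 100, "B": 90, "C": 30}
total = sum(counts.values())          # 220
num_classes = len(counts)             # 3

# Each class weight is total / (num_classes * class_count), so the
# rarest class (C) gets the largest weight.
weights = {c: total / (num_classes * n) for c, n in counts.items()}
print(weights)  # C's weight is roughly 3x A's
```

With these weights, misclassifying a class-C example costs about three times as much as misclassifying a class-A example, counteracting the gradient pull of the majority classes.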

Tags: cnn, caffe, class-imbalance, deep-learning

Category: Data Science


Are you getting good results on your training set or your test set? If it's only the former, you are overfitting. If it's the latter, it is unlikely to be because your whole training set fits in one batch: the gradients still have a strong pull toward the dominating class(es). The pretrained weights, however, will help a lot if the classes are relatively easy to separate. It will also help if your dataset is similar to the one the network was pretrained on, because the intermediate layers that perform feature extraction have already learned filters that are valuable for your problem instance.
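One way to check whether the minority class is actually being learned, rather than masked by overall accuracy, is to look at per-class recall. Here is a minimal sketch with hypothetical labels (the label values and the toy predictions are illustrative, not from the question):

```python
# Sketch: per-class recall reveals whether the rare class C is really
# predicted correctly, or whether overall accuracy is inflated by the
# majority classes A and B.
from collections import defaultdict

def per_class_recall(y_true, y_pred):
    """Return {class: fraction of that class's examples predicted correctly}."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        if t == p:
            correct[t] += 1
    return {c: correct[c] / total[c] for c in total}

# Toy example: overall accuracy is 10/12, yet class C's recall is only 1/3.
y_true = ["A"] * 5 + ["B"] * 4 + ["C"] * 3
y_pred = ["A"] * 5 + ["B"] * 4 + ["A", "A", "C"]
print(per_class_recall(y_true, y_pred))
```

If per-class recall is high for all three classes on a held-out test set, the imbalance genuinely isn't hurting you; if class C's recall is poor while overall accuracy looks fine, the net is leaning on the majority classes.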
