Tuning a convolutional neural net, sample size

I keep reading that convolutional neural nets (CNNs) perform best with lots and lots of data (100k+ examples). Is there any rule of thumb, or lower limit, for the data size during the grid-search phase?

For example, if I run a CNN on 100 data points, vary just one parameter (say, add an extra layer or increase a filter size), and get better results, can I reasonably expect those parameters to also give better results during the actual training phase?

Tags: hyperparameter-tuning, cnn, convolution, machine-learning

Category: Data Science


If you use pre-trained weights, you need significantly less data, because the initial layers have already learned from a large dataset and you only need to fine-tune the later ones.
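A minimal sketch of that idea, assuming PyTorch (the stand-in model here is hypothetical; in practice you would load real pre-trained weights, e.g. from torchvision):

```python
# Hedged sketch: freeze the early conv layers of a (stand-in) model so that
# only the later, task-specific layers are updated during fine-tuning.
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),   # early layers: generic features
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),  # later layers: task-specific
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 10),                           # classification head
)

# Freeze the first conv layer; its (pre-trained) weights stay fixed.
for p in model[0].parameters():
    p.requires_grad = False

# Only the unfrozen parameters are handed to the optimizer.
trainable = [p for p in model.parameters() if p.requires_grad]
print(len(trainable), "trainable tensors")  # → 4 trainable tensors
```

With fewer trainable parameters, the amount of data needed to fine-tune without overfitting drops accordingly.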

What you said is not entirely true: you can train on CIFAR-10 and get 90%+ accuracy, and that dataset is nowhere near 100k+ (it has 60,000 images). It depends on the complexity of the data and how similar the features are. If the classes are easily separable, less data suffices; if the distinctions are harder, the model needs many more examples to figure out which features separate them.

I would say you could, IF your sample is representative of the population.
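One way to make a small grid-search set representative is to stratify by class, so the subsample mirrors the label distribution of the full dataset. A sketch in plain Python (the function name and sizes are illustrative):

```python
# Hypothetical sketch: draw a class-stratified subsample so that a small
# grid-search set preserves the label proportions of the full dataset.
import random
from collections import defaultdict

def stratified_subsample(samples, labels, n, seed=0):
    """Pick ~n points while keeping each class's share of the data."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for x, y in zip(samples, labels):
        by_class[y].append(x)
    subset = []
    for y, xs in by_class.items():
        # Each class contributes in proportion to its frequency.
        k = max(1, round(n * len(xs) / len(samples)))
        subset.extend((x, y) for x in rng.sample(xs, min(k, len(xs))))
    rng.shuffle(subset)
    return subset

# Example: a 90/10 class imbalance survives the subsampling.
data = list(range(1000))
labs = [0] * 900 + [1] * 100
sub = stratified_subsample(data, labs, 100)
counts = defaultdict(int)
for _, y in sub:
    counts[y] += 1
print(counts[0], counts[1])  # → 90 10
```

A plain random sample of 100 points could easily miss or badly under-represent the minority class, which would make any hyperparameter comparison on it misleading.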
