How can I choose the number of epochs and the batch size?

I have the COCO 2014 dataset and need to train on it; the training set is around 82,700 images and the test set around 40,500. However, I got the same sentence with different values every time from model.predict(), since I used only one epoch. Now, how can I decide the right number of epochs? I am trying 20 epochs at the moment, but without setting a batch size. Is that right?



Epoch

Training for only one epoch usually leaves the model underfitting, as in the curve shown in the figure below.

[Figure: effect of the number of epochs on prediction — the fitted curve moves from underfitting to optimal to overfitting]

Increasing the number of epochs increases the number of times the weights are updated, and the fitted curve moves from underfitting to optimal and eventually to overfitting. The right number of epochs also depends on how diverse your data is; read the linked article for a better understanding.
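In practice, rather than guessing a fixed number of epochs, you can set a generous upper bound and let the validation loss decide when to stop. Below is a minimal sketch assuming a Keras model (model.predict() in your question suggests Keras); model, x_train, y_train, x_val and y_val are placeholders, not taken from your code:

    from tensorflow.keras.callbacks import EarlyStopping

    # Stop once validation loss has not improved for 3 epochs and roll back
    # to the best weights, so the exact epoch count matters less.
    early_stop = EarlyStopping(monitor="val_loss", patience=3,
                               restore_best_weights=True)

    history = model.fit(
        x_train, y_train,
        validation_data=(x_val, y_val),
        epochs=50,               # an upper bound; early stopping decides when to quit
        batch_size=32,           # set it explicitly instead of relying on the default
        callbacks=[early_stop],
    )

With this setup, 20 versus 50 epochs stops being a critical choice: training ends when the validation loss stops improving.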

Batch size

A good batch size can speed up training considerably and lead to better final performance.

Finding the right batch size is usually a matter of trial and error. 32 is a good batch size to start with; keep doubling it (64, 128, ...) and compare the results, for example as sketched below.
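As a rough illustration of that trial-and-error loop, the snippet below trains the same model briefly at a few batch sizes and keeps the one with the lowest validation loss; build_model and the data arrays are hypothetical placeholders, following the same assumptions as the sketch above:

    # Try a few batch sizes and keep the one with the lowest validation loss.
    results = {}
    for batch_size in (32, 64, 128, 256):
        model = build_model()        # rebuild so every run starts from scratch
        history = model.fit(
            x_train, y_train,
            validation_data=(x_val, y_val),
            epochs=10,
            batch_size=batch_size,
            verbose=0,
        )
        results[batch_size] = min(history.history["val_loss"])

    best = min(results, key=results.get)
    print("Best batch size by validation loss:", best)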

There are also a few batch-size finder scripts in Python, such as rossmann_bs_finder.py.

This article can help you understand batch size better: "How to get 4x speedup and better generalization using the right batch size".
