More Training Data, Less Memory

I have a training dataset of more than 5K common images, but Google Colab gives me only about 12 GB of RAM.

I need to train on all of the images, but because of the limited memory I can't.

What are the possible ways to train on all the images with limited memory?

I have an idea, but I don't know whether it is an optimal solution:

Split the dataset into 5 sets (each containing 1000 images) and train on the first set. Then, starting from the saved model file, train on the second set; load the updated model file again, train on the third set, and so on, roughly like the sketch below.
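This is only a sketch of what I have in mind; the `load_chunk` helper, the `.npy` file names, and the placeholder model are assumptions just to illustrate the loop, and my real data loading and architecture would replace them:

```python
import numpy as np
from tensorflow import keras

MODEL_PATH = "model.h5"  # hypothetical checkpoint file

# Hypothetical helper: assumes each 1000-image chunk was saved
# to disk beforehand as a pair of .npy files.
def load_chunk(i):
    x = np.load(f"images_chunk{i}.npy")  # shape (1000, H, W, C)
    y = np.load(f"labels_chunk{i}.npy")  # shape (1000,)
    return x, y

# Placeholder model; my real architecture would go here.
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(64, 64, 3)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

for i in range(5):
    x, y = load_chunk(i)                      # only one chunk in RAM at a time
    model.fit(x, y, epochs=1, batch_size=32)
    model.save(MODEL_PATH)                    # checkpoint after each chunk
    del x, y                                  # free the chunk before the next one
```

The checkpointing and `del` are only there to keep at most one chunk in memory at a time.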

If I follow these steps, does that mean I will have trained on all the images in the dataset?

Thanks for your help

Topic ai keras deep-learning python machine-learning

Category Data Science
