Neural network model in Google Colab significantly slower with increased data points
I've built a neural network in Google Colab. The original dataset contained 12,000 rows (a 350 KB CSV) with 4 features and 2 labels, and each epoch ran in a few milliseconds. I increased the amount of data to see whether the model would yield better results: all I did was update the file, which now contains 326,000 rows (12 MB). But training is now incredibly slow, taking several seconds per epoch, which makes it completely infeasible to do my analysis.
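To put the change in scale into numbers (using my batch size of 100, mentioned below), here is a quick back-of-the-envelope check. With a fixed batch size, the number of gradient updates per epoch grows linearly with the row count:

```python
# Steps (gradient updates) per epoch = rows // batch_size
batch_size = 100  # my original setting

steps_small = 12_000 // batch_size    # 120 steps per epoch
steps_large = 326_000 // batch_size   # 3,260 steps per epoch

print(steps_large / steps_small)      # ~27x more work per epoch
```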
I tried increasing the batch_size significantly (from 100 to 1,000), but to no avail. My network has three hidden layers with 100 neurons each; a sketch of the setup follows below.
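For context, here is a minimal sketch of the kind of model I'm describing. I'm assuming Keras here; the random stand-in data, optimizer, loss, and activation choices are illustrative, not my exact code:

```python
import numpy as np
import tensorflow as tf

# Illustrative stand-in for the real CSV: 326,000 rows, 4 features, 2 labels
X = np.random.rand(326_000, 4).astype("float32")
y = np.random.rand(326_000, 2).astype("float32")

# Three hidden layers with 100 neurons each, as described above
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(2),
])
model.compile(optimizer="adam", loss="mse")

# Raising batch_size from 100 to 1,000 cuts the steps per epoch tenfold,
# but each epoch still has to pass over all 326,000 rows
model.fit(X, y, epochs=10, batch_size=1000)
```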
Is this a general characteristic of machine learning (i.e. does a larger CSV file require significantly more processing power)? Does my neural network need to be adapted to deal with the larger dataset? Or is Google Colab generally slow at handling larger CSV files?
Topic colab python machine-learning
Category Data Science