Help needed in interpreting the loss, val_loss vs epoch plots for an autoencoder training?

I am training a variational autoencoder and I am getting a loss-plot as follows:

Right after epoch 224, val-loss overtakes train-loss and keeps growing, but at an extremely slow pace, as you can see. I trained for 300 epochs.

Any opinions on the training? I don't think it is overfitting the data, but I want to be sure, hence I'm seeking opinions from the data science community.

Thanks.

Topic vae training bayesian-networks autoencoder neural-network

Category Data Science


You tagged your question with vae, so I will assume this is a variational autoencoder. In that case, your loss has two components: the reconstruction error and the KL divergence of the bottleneck distribution, probably balanced by a weight $\beta$. You should plot both components separately to understand how the training dynamics evolve. The reconstruction loss is what will tell you whether there is overfitting; the KL divergence will tell you whether there is posterior collapse.
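Plotting the components separately just means logging each term on its own instead of only their weighted sum. A minimal NumPy sketch of the per-batch components, assuming an MSE reconstruction term (Gaussian decoder) and a diagonal-Gaussian posterior $\mathcal{N}(\mu, e^{\text{logvar}})$ against a standard-normal prior (the function and variable names here are illustrative, not from any specific framework):

```python
import numpy as np

def vae_loss_components(x, x_recon, mu, logvar, beta=1.0):
    """Return (reconstruction_error, kl_divergence, total) for one batch.

    x, x_recon : arrays of shape (batch, features)
    mu, logvar : arrays of shape (batch, latent_dim), parameters of the
                 approximate posterior q(z|x).
    """
    # Per-sample squared reconstruction error, averaged over the batch.
    recon = np.mean(np.sum((x - x_recon) ** 2, axis=1))
    # Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian posterior.
    kl = np.mean(-0.5 * np.sum(1.0 + logvar - mu**2 - np.exp(logvar), axis=1))
    return recon, kl, recon + beta * kl

# Sanity check: if the posterior equals the prior (mu=0, logvar=0),
# the KL term is exactly zero.
mu = np.zeros((4, 2))
logvar = np.zeros((4, 2))
x = np.ones((4, 3))
x_recon = np.ones((4, 3))
recon, kl, total = vae_loss_components(x, x_recon, mu, logvar)
```

Logging `recon` and `kl` each epoch (for both training and validation sets) lets you see whether the gap after epoch 224 comes from the reconstruction term (overfitting) or from the KL term.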
