Vanishing problem with CycleGAN Wasserstein loss function

I have adapted a Keras CycleGAN example (the horses-to-zebras one) to the classic FER2013 facial-expression dataset.

I got some results from this CycleGAN while trying to generate additional DISGUST faces from NORMAL ones (disgust is the class with the fewest samples in the dataset), and I'm now trying to change the discriminator loss to a Wasserstein loss as stated here:

But after some epochs I see that the resulting face is the same no matter which input face is fed in. I conclude that this is the famous vanishing problem.
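
Since every input seems to collapse to the same output, one quick way to confirm this is to measure how much the generated batch actually varies. Here is a minimal sketch, assuming a trained generator gen_G and a batch of input faces x_batch (both names are mine, not from the Colab):

import tensorflow as tf

def output_spread(generator, x_batch):
    # Mean per-pixel standard deviation across the generated batch;
    # a value near zero means every input maps to (almost) the same face.
    fakes = generator(x_batch, training=False)
    return tf.reduce_mean(tf.math.reduce_std(fakes, axis=0))

print(output_spread(gen_G, x_batch).numpy())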

This is the Colab notebook; you can see the images produced in the last epochs. Any help to prevent this from happening would be appreciated.

This is the loss-function code (from the Colab):

import tensorflow as tf
from tensorflow.keras import backend as K

# Implementation of the Wasserstein loss
def wasserstein_loss_fn(y_true, y_pred):
    return K.mean(y_true * y_pred)

# Loss function for the generators
def generator_loss_fn(fake):
    fake_loss = wasserstein_loss_fn(tf.ones_like(fake), fake)
    return fake_loss


# Loss function for the discriminators
def discriminator_loss_fn(real, fake):

    # Add random noise to the labels - important trick!
    real_labels = tf.ones_like(real)
    real_labels += 0.05 * tf.random.uniform(tf.shape(real_labels))
    fake_labels = tf.ones_like(fake)
    fake_labels += 0.05 * tf.random.uniform(tf.shape(fake_labels))

    real_loss = wasserstein_loss_fn(real_labels, real)
    fake_loss = wasserstein_loss_fn(fake_labels, fake)
    return (real_loss - fake_loss) * 0.5
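
For reference, the textbook WGAN formulation differs from my code above in two ways: real and fake samples get opposite-signed labels (so the critic loss works out to mean(D(fake)) - mean(D(real))), and no label noise is used, since the critic outputs unbounded scores rather than probabilities. Below is a sketch of that formulation under my reading of the WGAN paper, together with the weight clipping it requires (the 0.01 clip value is the paper's default, not something from the Colab):

# Textbook WGAN losses (a sketch for comparison, not the tutorial's code)
def wgan_critic_loss(real, fake):
    # The critic is trained to score real images high and fake images low
    return K.mean(fake) - K.mean(real)

def wgan_generator_loss(fake):
    # The generator is trained to make the critic score fakes high
    return -K.mean(fake)

# Enforce the Lipschitz constraint by clipping critic weights after each update
def clip_critic_weights(critic, clip_value=0.01):
    for w in critic.trainable_weights:
        w.assign(tf.clip_by_value(w, -clip_value, clip_value))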

Topic: cyclegan wasserstein keras

Category: Data Science
