ImageDataGenerator for multi-task output in Keras using flow_from_directory

I am creating a multi-task CNN model with two different classification properties (one with 10 classes, the second with 5 classes), and my directory structure looks like this:

    -Train
       - image1.jpg
          ...
       - imageN.jpg

    -Test
       - image1.jpg
          ...
       - imageN.jpg

    -Vald
       - image1.jpg
          ...
       - imageN.jpg

The labels are in a CSV file with columns propA and propB, so a single image has two classes: one from property A and one from property B.

The model uses VGG16:

from keras.applications import VGG16
from keras.layers import Input, Flatten, Dense, Dropout
from keras.models import Model

baseModel = VGG16(weights='imagenet', include_top=False, input_tensor=Input(shape=(img_size, img_size, 3)))
flatLayer = baseModel.output
sharedLayer = Flatten(name='flatten')(flatLayer)
sharedLayer = Dense(1024, name='Shared')(sharedLayer)
sharedLayer = Dropout(0.5)(sharedLayer)
task1 = Dense(512, activation='relu')(sharedLayer)
task1 = Dense(10, activation='softmax', name='PFR')(task1)
task2 = Dense(512, activation='relu')(sharedLayer)
task2 = Dense(5, activation='softmax', name='FT')(task2)
model3 = Model(inputs=baseModel.input, outputs=[task1, task2])

The number of images is large, so I cannot load them all into memory and need flow_from_directory-like functionality. However, my train directory has no class subdirectories: with 15 classes in total across the two properties, I am not sure which property I should create class subdirectories for (and flow_from_directory does not work without class subdirectories).

The labels are available in two arrays, propALab and propBLab.

So far I have not found anything helpful. Can someone help?

Topic: cnn keras tensorflow multitask-learning

Category: Data Science


Have you tried creating a custom data generator for your use case? The general structure for one is shown below. You will need to modify the __data_generation method so that it returns y as a list with two elements, the first being the labels for task 1 and the second the labels for task 2; a sketch adapted to your two properties follows the generic class.

import numpy as np
import keras

class DataGenerator(keras.utils.Sequence):
    'Generates data for Keras'
    def __init__(self, list_IDs, labels, batch_size=32, dim=(32,32,32), n_channels=1,
                 n_classes=10, shuffle=True):
        'Initialization'
        self.dim = dim
        self.batch_size = batch_size
        self.labels = labels
        self.list_IDs = list_IDs
        self.n_channels = n_channels
        self.n_classes = n_classes
        self.shuffle = shuffle
        self.on_epoch_end()

    def __len__(self):
        'Denotes the number of batches per epoch'
        return int(np.floor(len(self.list_IDs) / self.batch_size))

    def __getitem__(self, index):
        'Generate one batch of data'
        # Generate indexes of the batch
        indexes = self.indexes[index*self.batch_size:(index+1)*self.batch_size]

        # Find list of IDs
        list_IDs_temp = [self.list_IDs[k] for k in indexes]

        # Generate data
        X, y = self.__data_generation(list_IDs_temp)

        return X, y

    def on_epoch_end(self):
        'Updates indexes after each epoch'
        self.indexes = np.arange(len(self.list_IDs))
        if self.shuffle == True:
            np.random.shuffle(self.indexes)

    def __data_generation(self, list_IDs_temp):
        'Generates data containing batch_size samples' # X : (n_samples, *dim, n_channels)
        # Initialization
        X = np.empty((self.batch_size, *self.dim, self.n_channels))
        y = np.empty((self.batch_size), dtype=int)

        # Generate data
        for i, ID in enumerate(list_IDs_temp):
            # Store sample
            X[i,] = np.load('data/' + ID + '.npy')

            # Store class
            y[i] = self.labels[ID]

        return X, keras.utils.to_categorical(y, num_classes=self.n_classes)
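
For your setup, a minimal sketch of such a multi-output generator could look like the class below. It assumes the IDs are the image file names inside a single directory (e.g. Train), that the images are read with Keras' load_img, and that propALab and propBLab are lookups (e.g. dictionaries) mapping a file name to an integer class index for property A and property B respectively. The class name, the img_dir argument, and the simple 1/255 rescaling are my assumptions, not part of your code, so adjust them to your setup.

import numpy as np
import keras
from keras.preprocessing.image import load_img, img_to_array

class MultiTaskDataGenerator(keras.utils.Sequence):
    'Yields (X, [yA, yB]) batches for the two classification heads'
    def __init__(self, list_IDs, propALab, propBLab, img_dir,
                 batch_size=32, img_size=224, n_classes_a=10, n_classes_b=5,
                 shuffle=True):
        self.list_IDs = list_IDs      # image file names, e.g. 'image1.jpg'
        self.propALab = propALab      # file name -> class index for property A
        self.propBLab = propBLab      # file name -> class index for property B
        self.img_dir = img_dir        # e.g. 'Train/'
        self.batch_size = batch_size
        self.img_size = img_size
        self.n_classes_a = n_classes_a
        self.n_classes_b = n_classes_b
        self.shuffle = shuffle
        self.on_epoch_end()

    def __len__(self):
        return int(np.floor(len(self.list_IDs) / self.batch_size))

    def on_epoch_end(self):
        self.indexes = np.arange(len(self.list_IDs))
        if self.shuffle:
            np.random.shuffle(self.indexes)

    def __getitem__(self, index):
        indexes = self.indexes[index * self.batch_size:(index + 1) * self.batch_size]
        list_IDs_temp = [self.list_IDs[k] for k in indexes]
        return self.__data_generation(list_IDs_temp)

    def __data_generation(self, list_IDs_temp):
        X = np.empty((self.batch_size, self.img_size, self.img_size, 3))
        yA = np.empty((self.batch_size), dtype=int)
        yB = np.empty((self.batch_size), dtype=int)

        for i, ID in enumerate(list_IDs_temp):
            # Load and resize the JPEG from disk
            img = load_img(self.img_dir + ID, target_size=(self.img_size, self.img_size))
            # Simple rescaling; you may prefer keras.applications.vgg16.preprocess_input
            X[i,] = img_to_array(img) / 255.0
            yA[i] = self.propALab[ID]
            yB[i] = self.propBLab[ID]

        # Two one-hot label arrays, one per output head,
        # in the same order as the model outputs [task1, task2]
        return X, [keras.utils.to_categorical(yA, num_classes=self.n_classes_a),
                   keras.utils.to_categorical(yB, num_classes=self.n_classes_b)]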

You can then use your custom generator with the Model object you have created, as shown below:

training_generator = DataGenerator(partition['train'], labels, **params)
validation_generator = DataGenerator(partition['validation'], labels, **params)
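
If propALab and propBLab are not already lookups keyed by file name, you could build them from your label CSV with pandas and then instantiate the sketched generator for the Train directory. The file name 'train_labels.csv' and the column names filename, propA, and propB below are assumptions; adjust them to your file:

import pandas as pd

labels_df = pd.read_csv('train_labels.csv')                        # assumed file name
# If propA/propB hold class names rather than integer indices, map them to integers first
propALab = dict(zip(labels_df['filename'], labels_df['propA']))    # file name -> class index (property A)
propBLab = dict(zip(labels_df['filename'], labels_df['propB']))    # file name -> class index (property B)

train_gen = MultiTaskDataGenerator(list(labels_df['filename']), propALab, propBLab,
                                   img_dir='Train/', img_size=img_size)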

# Design model
model = Sequential()
[...] # Architecture
model.compile()

# Train model on dataset
model.fit_generator(generator=training_generator,
                    validation_data=validation_generator,
                    use_multiprocessing=True,
                    workers=6)

The code has been taken from https://stanford.edu/~shervine/blog/keras-how-to-generate-data-on-the-fly
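
For your two-headed model3 specifically, compiling with one loss per named output and training with the sketched generators might look roughly like this. The optimizer, loss weights, and epoch count are placeholders, and val_gen would be built from the validation CSV and the Vald directory in the same way as train_gen:

model3.compile(optimizer='adam',
               loss={'PFR': 'categorical_crossentropy', 'FT': 'categorical_crossentropy'},
               loss_weights={'PFR': 1.0, 'FT': 1.0},
               metrics=['accuracy'])

model3.fit_generator(generator=train_gen,
                     validation_data=val_gen,
                     epochs=10,
                     use_multiprocessing=True,
                     workers=6)

Note that the list of label arrays returned by the generator is matched to the outputs by position, so it must follow the same order as outputs=[task1, task2]. In newer versions of Keras/TensorFlow, fit_generator is deprecated and model.fit accepts a Sequence directly.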
