Create a dataset from 3D batches of feature maps for a classification model

I am trying to create a dataset of batches of several volumetric features and labels. The data are NIfTI volumes with features extracted from brain scans. I want to build pairs of features X and labels y for a downstream classification model, so the feature array should look like:

X = np.array([feature1[batch1, batch2, …],
              feature2[batch1, batch2, …],
              feature3[batch1, batch2, …],
              …])

Y = np.array([[1], [0], …])
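To make the target layout concrete, here is a minimal sketch of the shapes I am aiming for. The arrays `feature1`–`feature3` and the patch count are placeholders, not my real data:

```python
import numpy as np

# Hypothetical example: 3 features, 4 patches each, patch size 36^3.
# feature1/feature2/feature3 stand in for the extracted volumetric features.
rng = np.random.default_rng(0)
feature1 = rng.random((4, 36, 36, 36))
feature2 = rng.random((4, 36, 36, 36))
feature3 = rng.random((4, 36, 36, 36))

# Stack along a new "feature" axis -> (num_features, num_patches, 36, 36, 36)
X = np.stack([feature1, feature2, feature3])

# One binary label per patch.
y = np.array([[1], [0], [1], [0]])

print(X.shape)  # (3, 4, 36, 36, 36)
print(y.shape)  # (4, 1)
```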

I am stuck on an error when concatenating everything into one dataset:

ValueError: all the input arrays must have same number of dimensions, but the array at index 0 has 1 dimension(s) and the array at index 1 has 4 dimension(s)
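The error itself just says that `np.concatenate` requires all inputs to have the same number of dimensions. A tiny reproduction (with made-up shapes, not my real arrays) and one common fix, giving both operands the same `ndim` before concatenating:

```python
import numpy as np

a = np.empty(5)              # 1-D accumulator, ndim == 1
b = np.empty((2, 3, 3, 3))   # 4-D patch batch, ndim == 4

try:
    np.concatenate([a, b])
except ValueError as e:
    print(e)  # "all the input arrays must have same number of dimensions, ..."

# Fix: start from an empty array with the target shape and give each
# patch a leading batch axis so both sides are 4-D.
acc = np.empty((0, 3, 3, 3))              # 4-D accumulator with zero entries
patch = np.empty((3, 3, 3))               # one 3-D patch
acc = np.concatenate([acc, patch[None]])  # patch[None] has shape (1, 3, 3, 3)
print(acc.shape)  # (1, 3, 3, 3)
```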

Part of the code that patches the 3D array:

all_patch = np.empty((1, num_of_patches_per_sub, 36, 36, 36))

patchs = 32

for x in range(0, feature.shape[0]-patchs//2, patchs):
    for y in range(0, feature.shape[1]-patchs//2, patchs):
        for z in range(0, feature.shape[2]-patchs//2, patchs):

            patch_feature = feature[x: min(x+patchs, feature.shape[0]),
                                    y: min(y+patchs, feature.shape[1]),
                                    z: min(z+patchs, feature.shape[2])]
            patch_label = label_data[x: min(x+patchs, feature.shape[0]),
                                     y: min(y+patchs, feature.shape[1]),
                                     z: min(z+patchs, feature.shape[2])]

            all_patch = np.concatenate([all_patch, patch_feature])
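One way to avoid the dimension mismatch entirely is to collect the patches in a Python list and stack them once at the end, zero-padding edge patches so every element has the same shape. This is a sketch under assumed shapes, not the original code; the `extract_patches` helper and the `vol` array are hypothetical:

```python
import numpy as np

def extract_patches(volume, patch=32):
    """Collect fixed-size cubic patches from a 3-D volume.

    Edge patches are zero-padded to (patch, patch, patch) so every
    patch has the same shape and the list can be stacked afterwards.
    """
    patches = []
    for x in range(0, volume.shape[0], patch):
        for y in range(0, volume.shape[1], patch):
            for z in range(0, volume.shape[2], patch):
                p = volume[x:x + patch, y:y + patch, z:z + patch]
                # Pad short edge patches up to the full patch size.
                pad = [(0, patch - s) for s in p.shape]
                patches.append(np.pad(p, pad))
    # Stack instead of repeatedly concatenating: one allocation,
    # and every element is guaranteed the same shape.
    return np.stack(patches)  # (num_patches, patch, patch, patch)

# Hypothetical volume; a real NIfTI volume would come from nibabel.
vol = np.ones((70, 70, 70))
all_patch = extract_patches(vol)
print(all_patch.shape)  # (27, 32, 32, 32)
```

Stacking a list is also faster than calling `np.concatenate` inside the loop, which reallocates the accumulator on every iteration.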

Topic: numpy, naive-bayes-classifier, scikit-learn, python

Category: Data Science
