How to solve a MemoryError when converting an image dataset to a NumPy array
I've created and normalized my colored image dataset of 3716 samples of size 493×491 as x_train. Its type is list, and I'm trying to convert it into a NumPy array as follows:
from matplotlib import image
import numpy as np
import cv2

def prepro_resize(input_img):
    oimg = image.imread(input_img)
    return cv2.resize(oimg, (IMG_HEIGHT, IMG_WIDTH), interpolation=cv2.INTER_AREA)

x_train_ = [prepro_resize(x_train[i]).astype('float32') / 255.0 for i in range(len(x_train))]
x_train_ = np.array(x_train_)  #L1
#print(x_train_.shape)
But I get the following error when the line marked #L1 runs: MemoryError: Unable to allocate 10.1 GiB for an array with shape (3716, 493, 491, 3) and data type float32
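The size in the error message is consistent with the data: 3716 × 493 × 491 × 3 float32 values × 4 bytes each is roughly 10.8 GB, i.e. about 10.1 GiB, and the np.array(...) call additionally copies the list of per-image arrays, so peak usage is even higher. One possible workaround is to preallocate a single disk-backed array with np.memmap and fill it one image at a time, so the full dataset never has to fit in RAM twice. The sketch below illustrates the idea with small synthetic uint8 images standing in for prepro_resize output; the dimensions and the random-image stand-in are illustrative only, not part of the original code.

```python
import numpy as np
import os
import tempfile

# Illustrative dimensions (the real dataset is 3716 x 493 x 491 x 3).
N, H, W, C = 10, 49, 49, 3

# Preallocate one disk-backed float32 array instead of building a Python
# list and then copying it with np.array (that copy doubles peak memory).
path = os.path.join(tempfile.mkdtemp(), "x_train.dat")
x_train_ = np.memmap(path, dtype="float32", mode="w+", shape=(N, H, W, C))

for i in range(N):
    # Stand-in for prepro_resize(x_train[i]): a real run would load and
    # resize the i-th image here instead of generating random pixels.
    img = np.random.randint(0, 256, size=(H, W, C), dtype=np.uint8)
    x_train_[i] = img.astype("float32") / 255.0  # normalize in place

x_train_.flush()  # make sure all chunks are written to disk
print(x_train_.shape)
```

A memmap can be indexed and sliced like an ordinary ndarray, so most training pipelines (e.g. feeding batches to a model) can consume it unchanged; alternatively, keeping the data as uint8 and normalizing per batch would cut the in-memory footprint by a factor of four.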
Topic opencv numpy tensorflow deep-learning python
Category Data Science