I am running a neural network whose input should be of size (128, 128, 3). My dataset consists of images of size (256, 256, 3),
so I am resizing every image img before feeding it to the network:
img.resize(128, 128, 3)
It works well for some batches or some epochs, but then the program suddenly fails on the resize with the following error:
ValueError: resize only works on single-segment arrays
I thought there might be an issue with the shapes of the images in my dataset, but every image has the same shape, (256, 256, 3), so I have no clue what causes this error.
If there is an issue with the resize function, how can it work for some images and raise an error for others? Am I doing something wrong?
Comment: Can you print img.shape for the image right before the failing img.resize call, then grab the error? Check whether it is really (256, 256, 3) for the image causing the error as well.

Comment: Is this np.resize? If yes, the docs say it is not suitable for images.

Answer: np.resize (and ndarray.resize) doesn't do what you think it does here: it just chops the flat data buffer down to the new size; it does not interpolate pixels. If you want to shrink the image by half, it's easy: just take every second pixel:

img = img[::2, ::2, :]
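As a sketch of both the failure mode and the fix: ndarray.resize raises this particular ValueError when the array is not a single contiguous block of memory, which can happen if img is a view produced by slicing or transposing somewhere in the batching pipeline (an assumption about where the view comes from in your code). Slicing, as suggested above, works regardless of contiguity:

```python
import numpy as np

# A contiguous (256, 256, 3) image: ndarray.resize succeeds, but note that
# it only truncates/reorders the flat buffer -- it does NOT interpolate pixels.
img = np.zeros((256, 256, 3), dtype=np.uint8)
img.resize(128, 128, 3, refcheck=False)
print(img.shape)  # (128, 128, 3), but the pixel content is scrambled

# A non-contiguous view, e.g. from slicing a larger array during batching:
big = np.zeros((256, 512, 3), dtype=np.uint8)
view = big[:, ::2, :]            # every second column -> not single-segment
try:
    view.resize(128, 128, 3, refcheck=False)
except ValueError as e:
    print(e)                     # resize only works on single-segment arrays

# Halving an image by taking every second pixel works on views too:
src = np.arange(256 * 256 * 3, dtype=np.uint8).reshape(256, 256, 3)
small = src[::2, ::2, :]
print(small.shape)               # (128, 128, 3)
```

For proper interpolated resizing (arbitrary target sizes, anti-aliasing), an image library such as PIL's Image.resize or skimage.transform.resize is the usual choice rather than anything named resize in NumPy.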