I wrote a program that reads images using Python's OpenCV and tried to load 3 GB of images, but the program aborted. My PC has 32 GB of memory, yet it runs out when I run this program. What is the cause?
No error message is issued; the PC just becomes extremely sluggish. I checked with Ubuntu's System Monitor and confirmed that both memory and swap were exhausted.
I load the images into a single array to pass to a TensorFlow deep learning program. The images are 200 x 200 color images.
I use the 64-bit version of Python.
import os
import numpy as np
import cv2
IMG_SIZE = 200
def read_images(path):
    # path is expected to end with a "/" (e.g. "data/"), since it is
    # concatenated directly with the directory names below.
    dirnames = sorted(os.listdir(path))
    files = [sorted(os.listdir(path + dirnames[i]))
             for i in range(len(dirnames))]
    i = 0
    images = []
    for fs in files:
        tmp_images = []
        for f in fs:
            # Read, resize, then flatten each image to a float32 vector.
            img = cv2.imread(path + dirnames[i] + "/" + f)
            img = cv2.resize(img, (IMG_SIZE, IMG_SIZE))
            img = img.flatten().astype(np.float32) / 255.0
            tmp_images.append(img)
        i = i + 1
        images.append(tmp_images)
    return np.asarray(images)
An np.ndarray holds more information than the files on disk: imread decompresses each image to put its pixel data into the array. On top of that, you're converting to float32 instead of the native uint8, which means you're using four times the memory for each image. You could just convert each image at the point where you actually use it. You probably don't need to load 3 GB of images into a single array, so what exactly are you trying to do? It's also not clear how large the images are before being resized, so it's hard to give precise advice here.
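To put rough numbers on the float32 point, and to show one way to defer the conversion, here is a minimal sketch; the to_float_batch helper and the batching idea are illustrative, not part of the original code:

import numpy as np

IMG_SIZE = 200

# One resized 200 x 200 color image:
#   uint8:   200 * 200 * 3     = 120,000 bytes
#   float32: 200 * 200 * 3 * 4 = 480,000 bytes
# At float32, roughly 26,000 images already exceed 12 GB, before the
# extra copy that np.asarray makes from the nested Python lists.

def to_float_batch(batch_uint8):
    # Convert a small uint8 batch to float32 right before feeding it
    # to the network, instead of converting the whole dataset up front.
    return batch_uint8.astype(np.float32) / 255.0

# Inside read_images, keeping the pixels as uint8 would mean:
#     img = cv2.resize(img, (IMG_SIZE, IMG_SIZE))
#     tmp_images.append(img.flatten())  # stays uint8, 1/4 the memory

Whether that saving is enough depends on how many files those 3 GB correspond to, which ties back to the question above about what exactly you're trying to do.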