
I have two machines running identical Python scripts that use OpenCV to convert image file formats. The script runs fine on my low-end notebook, which has 4 GB of memory. On my desktop, however, which has 32 GB, I get the following error:

OpenCV Error: Unspecified error (The numpy array of typenum=2, ndims=3 can not be created) in NumpyAllocator::allocate, file D:\Build\OpenCV\opencv-3.3.1\modules\python\src2\cv2.cpp, line 175
OpenCV Error: Insufficient memory (Failed to allocate 243000000 bytes) in cv::OutOfMemoryError, file D:\Build\OpenCV\opencv-3.3.1\modules\core\src\alloc.cpp, line 55

(1) The code that causes this error is as follows. No other code in the script uses OpenCV.

import cv2

# png and jpg are filenames
img = cv2.imread(png)
cv2.imwrite(jpg, img, [cv2.IMWRITE_JPEG_QUALITY, 85])

(2) Both machines are running Windows 10 on a 64-bit AMD CPU.

(3) On both machines, Python is running in 32-bit mode, according to sys.maxsize (a quick check is sketched just after this list).

(4) Both machines were running Python 3.6.2. I tried updating the desktop to 3.6.3, but it made no difference.

(5) Both machines have OpenCV version 3.3.1.

(6) The desktop on which I get the memory error is using a slightly newer version of NumPy (1.13.3) compared to 1.13.1 on the notebook where all is well.

(7) The script will convert smaller images without error, but chokes on a 9000 x 9000 pixel PNG. I realize this isn't small, but still, even this large image works just fine on the notebook.
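
For reference, point (3) above boils down to a check along these lines (a minimal sketch; these exact lines aren't from the original script):

import sys, struct

print(sys.maxsize == 2**31 - 1)     # True on a 32-bit Python build
print(struct.calcsize("P") * 8)     # pointer width in bits: 32 or 64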

Since the NumPy version was the only difference I could identify, I searched for any reports that 1.13.3 was known to break things, but I couldn't find anything suggesting such a problem.

Thanks in advance to anyone who can help explain the problem and how to fix it.

  • Can you post the entire script that runs? It could be a memory leak of some kind in auxiliary code, leading to insufficient memory for opencv at some later stage. Commented Nov 20, 2017 at 17:04
  • That would be a few hundred lines of code for a generative art program, so I'm not sure that would be practical. Though if that were the problem, I would have thought it would fail on both machines. (This code actually runs 500 times in succession with different generated images on the laptop without issue, yet fails the very first time on the desktop.) Am I missing some way that a coding error could hit just one machine, and the one with lots more memory at that? Commented Nov 20, 2017 at 17:11
  • You've already identified one discrepancy (different Numpy version), so why not eliminate it? Commented Nov 20, 2017 at 17:21
  • @ely Well, I'll be a monkey's uncle. Apparently that is the case. I just stripped everything except the image conversion code out and ran the OpenCV code on a pre-existing image, and it worked. I guess I need to learn what I can about memory leaks, but I'd still be grateful to someone who can explain why the same code works on one machine but not another. That just baffles me. Commented Nov 20, 2017 at 17:21
  • A few hundred lines isn't so bad -- from the sound of it, it seems more likely to be a memory leak issue or some other type of bug. Are you iterating through the files from disk in slightly different ways for each machine? If the scripts are literally exactly the same then your program must parse command line arguments or look for directories with special relative names, and various aspects of this file processing could have a bug. Commented Nov 20, 2017 at 17:22

1 Answer


It turns out that 64-bit builds of all the packages I needed were available, so I got things working by switching to a 64-bit Python setup.

In case it helps someone else, here's what I learned along the way:

Since I was working in a 32-bit Python environment, the boatloads of RAM on my desktop weren't really relevant. A 32-bit process can't address more than 4 GB, and in practice it gets less than that: all sorts of things compete for that address space, and, if I understand correctly, a 32-bit Python process on Windows is normally limited to 2 GB anyway.
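
To make that concrete, the process's virtual address space can be inspected directly. This is a sketch I put together afterwards (it assumes Windows and calls GlobalMemoryStatusEx through ctypes); it shows the ceiling regardless of how much physical RAM is installed:

import ctypes

class MEMORYSTATUSEX(ctypes.Structure):
    # Field layout from the Win32 MEMORYSTATUSEX documentation
    _fields_ = [
        ("dwLength", ctypes.c_ulong),
        ("dwMemoryLoad", ctypes.c_ulong),
        ("ullTotalPhys", ctypes.c_ulonglong),
        ("ullAvailPhys", ctypes.c_ulonglong),
        ("ullTotalPageFile", ctypes.c_ulonglong),
        ("ullAvailPageFile", ctypes.c_ulonglong),
        ("ullTotalVirtual", ctypes.c_ulonglong),
        ("ullAvailVirtual", ctypes.c_ulonglong),
        ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
    ]

status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))

# Total and available virtual address space for this process, in MB.
# For a plain 32-bit process the total tops out around 2048 MB.
print(status.ullTotalVirtual // 2**20, "MB total address space")
print(status.ullAvailVirtual // 2**20, "MB still available")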

With NumPy, things get even worse because NumPy arrays require a contiguous block of memory. That makes things tight, apparently tight enough that the 243 MB I needed for the image wasn't available. It wouldn't necessarily take a memory leak for this to happen: if things were already tight, the normal (and likely memory-intensive) drawing operations I did with pycairo could have left too little contiguous space for the subsequent image conversion. (The Surface object had not been released because it would be used in later iterations.)
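
For what it's worth, the arithmetic lines up with the error message: cv2.imread decodes the PNG into an 8-bit, 3-channel BGR array, so the failed allocation is exactly one contiguous image buffer:

# 9000 x 9000 pixels, 3 channels, 1 byte per channel
print(9000 * 9000 * 3)   # 243000000, the figure in the OpenCV error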

The surprising part, at least to me, was that the amount of contiguous memory available for an operation within this 2 GB ceiling can vary wildly from machine to machine, and even from day to day, depending on all sorts of things that aren't obvious. It appears that my notebook just happens to enjoy some fortuitous circumstances that leave 243 MB of contiguous memory free, while my desktop doesn't.
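
A rough way to compare the two machines would have been to probe for the largest single contiguous NumPy allocation the process can make. This is a sketch I pieced together after the fact (the helper name and step size are mine, not from the original script):

import numpy as np

def largest_contiguous_mb(limit_mb=2048, step_mb=16):
    # Try progressively larger single allocations until one fails;
    # the last success approximates the largest contiguous block available.
    best = 0
    for mb in range(step_mb, limit_mb + 1, step_mb):
        try:
            block = np.empty(mb * 1024 * 1024, dtype=np.uint8)
            del block
            best = mb
        except MemoryError:
            break
    return best

print(largest_contiguous_mb(), "MB")

Running this on both machines would have shown directly whether the desktop's 32-bit process simply had a more fragmented address space.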

Thanks to those who offered advice.
