
I am currently coding a neural network for a large dataset, for example the MNIST dataset (about 700 × 50000). But when I test it, my code raises a MemoryError. My computer has 12 GB of RAM, but I think Python or NumPy can't use all of it.

Can I push Python or NumPy to use all of the remaining available memory on my PC?

OS : Windows 7 64-bit

Python : Python(x, y) 2.7.60

Thanks

  • Probably a duplicate of: Limit python vm memory. You can also use the win32 Python bindings to set the maximum memory usage of a process: Memory limits. Commented Apr 6, 2014 at 13:07

1 Answer


I believe that the Python(x, y) distribution of Python is still only a 32-bit build (64-bit support is still on its roadmap), so you are limited to a 32-bit address space even though you are running a 64-bit OS. You will need to install a 64-bit build of Python and 64-bit NumPy binaries to get access to more memory.
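A quick way to confirm whether the interpreter itself is a 32-bit or 64-bit build, using only the standard library (a minimal sketch):

```python
import struct
import sys

# Pointer size in bits: 32 for a 32-bit build, 64 for a 64-bit build
bits = struct.calcsize("P") * 8
print("Python build: %d-bit" % bits)

# sys.maxsize is 2**31 - 1 on 32-bit builds and 2**63 - 1 on 64-bit builds
print("sys.maxsize:", sys.maxsize)
```

If this prints 32-bit, no amount of installed RAM will get the process past the 32-bit address-space ceiling.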


6 Comments

Oh OK, what is the maximum memory that I can use now?
Maybe 2 GB for the entire process. In practice, the maximum size of an array that you can allocate will be substantially less because of address-space fragmentation. Consider installing the standard 64-bit build of Python and 64-bit NumPy from Christoph Gohlke.
If I use the Anaconda Python distribution for 64-bit Windows, can I fully utilize my 64-bit system? Or does NumPy still have the same limitation?
OK, thanks a lot. I already tried it and it works. By the way, is there any way to show what variables I have and how much memory they take?
Not reliably, no, for reasons that the StackOverflow comment box is much too small to explain. Sorry.
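There is no reliable general answer, but for NumPy arrays specifically, `ndarray.nbytes` reports the size of the underlying data buffer, so the largest objects can at least be accounted for. A rough sketch, assuming an MNIST-sized float64 array:

```python
import numpy as np

# An MNIST-sized float64 array: 50000 samples x 700 features
a = np.zeros((50000, 700))
print(a.nbytes)  # 50000 * 700 * 8 = 280,000,000 bytes (~280 MB)

# Rough inventory of NumPy arrays in the current namespace
for name, val in sorted(globals().items()):
    if isinstance(val, np.ndarray):
        print(name, val.shape, "%.1f MB" % (val.nbytes / 1e6))
```

Note that `nbytes` covers only the array's data buffer, not Python object overhead or views sharing the same buffer, which is part of why a complete per-variable accounting is unreliable.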
