My program runs a simulation that needs very large objects to store its data; the objects grow to more than 2-3 GB. Even though I should have enough memory in my MBP, Python (Python 2.7.3 on Mac OS X, installed from MacPorts) cannot seem to use it all, and the system freezes completely.
To save the state of the simulation I use pickle, but it also fails on very large objects: it seems as if pickle duplicates the objects in memory before dumping them...
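This is roughly what my save routine looks like, simplified (`save_state` and `sim_state` are placeholder names; the real state is the multi-GB structure):

```python
import cPickle as pickle  # C implementation, same API as pickle in 2.7

def save_state(sim_state, path):
    # Stream straight to the file; pickle.dumps() would build the whole
    # byte string in memory first, which is exactly what I want to avoid.
    with open(path, 'wb') as f:
        pickle.dump(sim_state, f, protocol=pickle.HIGHEST_PROTOCOL)

save_state({'step': 0, 'data': range(10)}, 'sim_state.pkl')  # tiny placeholder
```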
QUESTION: is there a standard library that can handle huge Python data structures (dict, set, list) without keeping them in memory all the time? Alternatively, is there a way to force Python to run in virtual memory? (I'm not very familiar with numpy; would it help in this situation?)
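To illustrate, this is the kind of dict-like, disk-backed interface I am hoping for (a hypothetical sketch using the stdlib shelve module, which I have not tested with data anywhere near this size, and each value still gets pickled whole):

```python
import shelve

# Keys are strings; values are pickled to a file on disk rather than
# being kept resident in RAM between accesses.
db = shelve.open('simulation_state')
db['step_42'] = {'positions': range(1000)}  # placeholder for the real data
restored = db['step_42']                    # loaded back from disk on access
db.close()
```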
Thanks in advance!