I have a large dataset that I run experiments on. Loading it from file into memory with a Python program takes about 30 minutes. I then run variations of an algorithm on the dataset, and every time I change the algorithm I have to load the dataset into memory again, which costs another 30 minutes.
Is there any way to load the dataset into memory once and keep it there, so that each time I run a variation of the algorithm it just uses that preloaded dataset?
I know the question is a bit abstract; suggestions to improve its framing are welcome. Thanks.
EDITS:
It's a text file containing graph data, around 6 GB. If I only load a portion of the dataset, it doesn't make for a very good graph. I don't do any computation while loading the dataset.
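To make the question more concrete, here is a rough sketch of the kind of workflow I mean; the file name, edge format, and function names below are placeholders for illustration, not my actual code:

    # Illustrative only -- names and format are placeholders, not my real code.

    def load_graph(path):
        """Parse the ~6 GB text file into an in-memory graph structure.
        This step is what takes ~30 minutes."""
        graph = {}
        with open(path) as f:
            for line in f:
                src, dst = line.split()              # assume one edge per line (simplified)
                graph.setdefault(src, []).append(dst)
        return graph

    def run_algorithm_variant(graph):
        """One variation of the algorithm; this is the only part that
        changes between experiments."""
        ...

    if __name__ == "__main__":
        graph = load_graph("graph_data.txt")         # 30-minute load, repeated on every run
        run_algorithm_variant(graph)                 # only this part actually changes

What I'd like is to pay the `load_graph` cost once and then call different versions of `run_algorithm_variant` against the same in-memory data.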