I'm using C# and I get a System.OutOfMemoryException after I read in 50,000 records. What is the best practice for handling such large datasets? Will paging help?
5 Answers
Obviously, you can't read all the data into memory before creating the MDB file, otherwise you wouldn't be getting an out-of-memory exception. :-)
You have two options (see the sketch after this list):

- partitioning: read the data in smaller chunks using filtering
- virtualizing: split the data into pages and load only the current page
In any case, you have to create the MDB file first and then transfer the data into it in chunks.
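Here is a minimal sketch of the chunked-transfer idea, assuming the MDB file and its table already exist and that `FetchPage` is a hypothetical method standing in for your real data source (a paged SQL query, a file reader, etc.):

```csharp
using System.Collections.Generic;
using System.Data.OleDb;

class Record
{
    public int Id;
    public string Name;
}

class ChunkedTransfer
{
    const int PageSize = 1000; // tune so one page stays well under the memory limit

    static void Main()
    {
        // Assumed path and table name; the Jet provider is the classic choice for .mdb files.
        string connStr =
            @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\output.mdb";

        using (var conn = new OleDbConnection(connStr))
        {
            conn.Open();
            int offset = 0;
            while (true)
            {
                // Only one page of records is ever in memory at a time.
                List<Record> page = FetchPage(offset, PageSize);
                if (page.Count == 0) break;

                foreach (Record r in page)
                {
                    // OleDb uses positional "?" parameters.
                    using (var cmd = new OleDbCommand(
                        "INSERT INTO Records (Id, Name) VALUES (?, ?)", conn))
                    {
                        cmd.Parameters.AddWithValue("@p1", r.Id);
                        cmd.Parameters.AddWithValue("@p2", r.Name);
                        cmd.ExecuteNonQuery();
                    }
                }
                offset += page.Count;
            }
        }
    }

    // Hypothetical source reader; replace with your actual data source.
    static List<Record> FetchPage(int offset, int size)
    {
        return new List<Record>(); // placeholder
    }
}
```

Wrapping each page in a transaction would also speed up the inserts considerably, but the key point is the same either way: the working set is bounded by the page size, not by the total record count.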
I would suggest using a generator:
"...instead of building an array containing all the values and returning them all at once, a generator yields the values one at a time, which requires less memory and allows the caller to get started processing the first few values immediately. In short, a generator looks like a function but behaves like an iterator."
The Wikipedia article also has a few good examples.
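In C#, the generator pattern is an iterator method using `yield return`. A minimal sketch, assuming the records come from a text file (the path and per-record processing are illustrative):

```csharp
using System.Collections.Generic;
using System.IO;

class Streaming
{
    // Yields records lazily; only the current line is held in memory.
    static IEnumerable<string> ReadRecords(string path)
    {
        using (var reader = new StreamReader(path))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
                yield return line;
        }
    }

    static void Main()
    {
        // The caller starts processing immediately; the 50,000+ records
        // are never all resident in memory at once.
        foreach (string record in ReadRecords(@"C:\data\records.csv"))
        {
            Process(record);
        }
    }

    static void Process(string record) { /* handle one record */ }
}
```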