
I have a large record set, more than 1,000,000 rows, and I want to load this data into a DataTable, but while filling the DataTable it throws a System.OutOfMemoryException. How can I solve this error? I have 4 GB of RAM and a 64-bit operating system.

  • Did you mean that to be 10 million records, or 1 million? (Having two 0s at the end is unusual.) Do you really have to have the whole thing in memory? There are ways to make this more efficient so you can get more into memory at a time, but 10 million records is quite a lot to have in memory in one go... (Even if all 4GB can go towards that, that's only 429 bytes per record, which really isn't a lot.) Commented Mar 25, 2015 at 7:19
  • What is lac? Sorry - just ran into an OutOfVocabulary exception Commented Mar 25, 2015 at 7:23
  • So 1 million? Please bear in mind that "lac" isn't a commonly used term for many readers. I'll edit your question to represent 1 million in the more common way. But even so, do you definitely need all those records in memory at once? Commented Mar 25, 2015 at 7:23
  • @Serv: See en.wikipedia.org/wiki/Lakh - basically it's 100,000. Commented Mar 25, 2015 at 7:23
  • Why do you need more than a million rows in memory at once? Commented Mar 25, 2015 at 7:33

1 Answer


If you really need all that data in memory, your app pool is x64, and you are using .NET 4.5, you can use the gcAllowVeryLargeObjects element. To do this for ASP.NET, you'd need to add your own aspnet.config file for your app pool (see this link for more info) and set this configuration for startup:

<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>

That said, handling such a large DataTable in memory all at once is most likely not the way to go, especially for web applications (which generally serve more than one user).

Having that much memory occupied (per user, probably), while you most likely only work on a small subset of it at any given time, is just a waste; you should be looking at a different way of handling the problem rather than brute-forcing .NET into allowing larger objects in memory.
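For example, instead of buffering everything into one giant DataTable, you can stream the rows with a SqlDataReader and process them one at a time, so only the current row has to live in memory. A minimal sketch, assuming SQL Server; the connection string, table name (Records) and columns (Id, Name) below are placeholders, not details from the question:

using System.Data.SqlClient;

class StreamingExample
{
    static void Main()
    {
        // Placeholder connection string -- adjust for your environment.
        const string connectionString =
            "Server=.;Database=MyDb;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT Id, Name FROM Records", connection))
        {
            connection.Open();

            // ExecuteReader streams rows from the server; only the current
            // row is materialized in memory, never all million at once.
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    int id = reader.GetInt32(0);
                    string name = reader.GetString(1);
                    // Process the row here; it can be collected afterwards.
                }
            }
        }
    }
}

The reader keeps a connection open for the duration of the loop, so keep the per-row work short or batch any writes separately.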


Comments

Thanks Jcl. But I have used this before and it still shows the error. Is there any option other than this?
Well, if those 1 million records are large and you have only 4 GB of RAM (not for the process, but for the whole system), it may be that you simply do not have enough memory on your system for them, and if that's the case, the only fix would be adding more memory. That is rather impractical, though, and you may want to think twice about how you handle that data instead of loading it all into memory at once.
Even if you load 1M records, you will still need to "iterate" through them in some way. The approach of loading everything is usually wrong; your real problem is how to perform your processing with reasonable performance and a smaller memory footprint on a large data set (one batched approach is sketched after these comments). Now, for the 10th time: what are you doing with the data, and why do you need all the records loaded at once?
Is there any other way to specify the size of temporary memory in the web.config file?
Try compiling the application for x64 (instead of Any CPU) and see if that solves the problem.
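If the processing genuinely needs the rows in DataTable form, a middle ground is to page through the table in fixed-size batches, filling and discarding a small DataTable per batch, as suggested in the comments above. A rough sketch, assuming SQL Server 2012+ for OFFSET/FETCH; the table and column names are again placeholders:

using System.Data;
using System.Data.SqlClient;

class PagedExample
{
    const int PageSize = 10000;

    static void ProcessInPages(string connectionString)
    {
        for (int offset = 0; ; offset += PageSize)
        {
            var page = new DataTable();
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT Id, Name FROM Records ORDER BY Id " +
                "OFFSET @offset ROWS FETCH NEXT @pageSize ROWS ONLY",
                connection))
            {
                command.Parameters.AddWithValue("@offset", offset);
                command.Parameters.AddWithValue("@pageSize", PageSize);

                using (var adapter = new SqlDataAdapter(command))
                {
                    adapter.Fill(page); // only one page of rows in memory
                }
            }

            if (page.Rows.Count == 0)
                break; // past the last row

            foreach (DataRow row in page.Rows)
            {
                // Process the current batch here.
            }
            // This page's DataTable becomes collectable before the next pass.
        }
    }
}

Each pass only ever holds PageSize rows, so the memory footprint stays flat no matter how large the table is. Note that OFFSET paging needs a stable ORDER BY (here on Id) to avoid skipping or repeating rows between pages.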
