I have a table with around 100,000 rows (and it is going to get much larger).
My code throws an OutOfMemoryException when I get to around the 80,000th record in my loop (even though my system has over 10 GB free at the time; it looks like the Visual Studio process is limited to around 1.5 GB).
The code is intended to loop over all the records and just check for certain conditions. I took out the code that actually processes each record and the memory still fills up:
using (var db = new PlaceDBContext())
{
    Messages.Output("Total: " + db.Companies.Count());
    int count = 0;
    foreach (var company in db.Companies)
    {
        // I am not actually doing anything here,
        // I took out my code and the memory still fills up
        // CheckMatchConditions(company);
        count++;
        Console.SetCursorPosition(0, Console.CursorTop);
        Console.Write(count.ToString() + " ");
    }
}
I thought it might have to do with keeping the context open, so I refactored the code to take 1,000 records at a time and materialize each batch into a list first, disposing the context between batches. This is what I came up with:
int count = 0;
int total = 0;
using (var db = new PlaceDBContext())
{
    total = db.Companies.Count();
    Messages.Output("Total: " + total);
}
while (count < total)
{
    List<Company> toMatch = new List<Company>();
    using (var db = new PlaceDBContext())
    {
        toMatch = db.Companies
            .Include(x => x.CompaniesHouseRecords)
            .OrderBy(x => x.ID)
            .Skip(count)
            .Take(1000)
            .ToList();
    }
    foreach (var company in toMatch)
    {
        // CheckMatchConditions(company);
        count++;
        Console.SetCursorPosition(0, Console.CursorTop);
        Console.Write(count.ToString() + " ");
    }
}
This runs a lot slower but still fills up memory at about the same rate per record looped.
Since I commented out the method that actually does any work, it must just be these toMatch lists lingering in memory.
I am at a loss here; can someone shed some light on how I should be managing this memory?
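From what I have read, the DbContext change tracker keeps a reference to every entity it materializes, which would explain why memory grows with each record even when I do nothing with it. As an untested guess on my part, would something like AsNoTracking() be the right direction here?

using (var db = new PlaceDBContext())
{
    int count = 0;
    // Untested guess: AsNoTracking() asks EF not to register the
    // materialized entities in the context's change tracker, so each
    // Company should become collectible once the loop moves past it.
    foreach (var company in db.Companies.AsNoTracking())
    {
        // CheckMatchConditions(company);
        count++;
        Console.SetCursorPosition(0, Console.CursorTop);
        Console.Write(count.ToString() + " ");
    }
}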