I have a JSON file of about 20k lines that has to be read, sorted, and saved into a database. I've written code for it, and it works the way it's supposed to, but my issue is that it takes about 10 minutes. Therefore I wonder if someone has ideas for what can be done to improve the performance?
Json:
[
{
"Number": 123456,
"Area": "NE01"
},
{
"Number": 123457,
"Area": "NE01"
},
...
]
and so forth....
C#:
dynamic json = JsonConvert.DeserializeObject(File.ReadAllText(path, Encoding.UTF8));
foreach (var obj in json)
{
    if (obj.Area == "NE01")
    {
        // "Entity" stands in for the mapped type behind db.Entity
        var o = new Entity
        {
            Number = obj.Number
        };
        db.Entity.Add(o); // add the mapped entity, not the dynamic JSON object
        continue;
    }
    if (obj.Area == "NE02")
    {
        var o = new Entity
        {
            Number = obj.Number
        };
        db.Entity.Add(o);
        continue;
    }
    if (obj.Area == "NE03")
    {
        var o = new Entity
        {
            Number = obj.Number
        };
        db.Entity.Add(o);
        continue;
    }
    if (obj.Area == "NE04")
    {
        var o = new Entity
        {
            Number = obj.Number
        };
        db.Entity.Add(o);
        continue;
    }
}
db.SaveChanges();
To make it clearer: Area has four different values, and depending on the value, the number gets a foreign key pointing to that area. Unfortunately, I'm not allowed to change anything in the underlying database. Let me know if I need to provide further information.
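As a starting point before changing the database approach, two things are likely to help here: deserializing into a typed list instead of `dynamic` (avoids per-property dynamic dispatch for 20k objects), and switching off Entity Framework's automatic change detection while adding. A minimal sketch, assuming EF6 (`db.Configuration`) and a hypothetical `Item` DTO and `Entity` mapped type (both names are assumptions, not from the original code):

```csharp
// Hypothetical DTO matching the JSON shape shown above.
class Item
{
    public int Number { get; set; }
    public string Area { get; set; }
}

// Typed deserialization of the whole array in one call.
var items = JsonConvert.DeserializeObject<List<Item>>(
    File.ReadAllText(path, Encoding.UTF8));

// EF6: stop DetectChanges from rescanning the tracker on every Add.
db.Configuration.AutoDetectChangesEnabled = false;
try
{
    var areas = new HashSet<string> { "NE01", "NE02", "NE03", "NE04" };
    foreach (var item in items.Where(i => areas.Contains(i.Area)))
    {
        db.Entity.Add(new Entity { Number = item.Number });
    }
    db.SaveChanges();
}
finally
{
    db.Configuration.AutoDetectChangesEnabled = true;
}
```

With change tracking disabled, each `Add` is O(1) instead of triggering a scan of all tracked entities, which is often the dominant cost when inserting thousands of rows through EF.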
Measure each step first: `File.ReadAllText`, `JsonConvert.DeserializeObject`, the `foreach` loop, and finally `SaveChanges`. Until you know where the time is being spent, you can't start to make it faster. `SqlBulkCopy` may be your best option. There are interesting ways of doing that, but perhaps the most convenient (if you have a `List<T>` of a `T` that looks like the table) is to use `FastMember.ObjectReader`; look for bcp here: github.com/mgravell/fast-member
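That suggestion can be sketched as follows. This assumes a `List<Item>` already deserialized and filtered, the FastMember NuGet package, and a hypothetical destination table `dbo.Numbers` with `Number` and `Area` columns (the table and connection-string names are placeholders):

```csharp
// FastMember's ObjectReader exposes a List<T> as an IDataReader,
// which SqlBulkCopy can stream straight into the table.
using (var bcp = new SqlBulkCopy(connectionString))
using (var reader = ObjectReader.Create(items, "Number", "Area"))
{
    bcp.DestinationTableName = "dbo.Numbers";
    bcp.WriteToServer(reader);
}
```

Bulk copy bypasses per-row INSERT statements entirely, so for a flat 20k-row load it is usually orders of magnitude faster than `SaveChanges`, and it doesn't require any change to the underlying database schema.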