
I have a static helper method for bulk inserting Entity Framework objects. It works and my unit tests pass but it looks wrong and I can't help but think there must be a better way to do it.

public static void BulkInsert<T>(Entities db, IList<T> list)
{
    SqlTransaction transaction = db.Database.CurrentTransaction != null ? db.Database.CurrentTransaction.UnderlyingTransaction as SqlTransaction : null;

    if (db.Database.Connection.State != ConnectionState.Open)
    {
        db.Database.Connection.Open();
    }

    using (var bulkCopy = new SqlBulkCopy((db.Database.Connection) as SqlConnection, SqlBulkCopyOptions.Default, transaction))
    {
        //fill a datatable and write to server, this bit works
    }
}
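For reference, the elided "fill a datatable" step is typically done by building a DataTable from the list via reflection and handing it to WriteToServer. A minimal sketch of that fill step (assuming the entity's simple public properties map one-to-one to the destination columns; navigation properties and complex types would need filtering in real use):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Reflection;

static class BulkHelper
{
    // Build a DataTable whose columns mirror T's public instance properties.
    public static DataTable ToDataTable<T>(IList<T> list)
    {
        var table = new DataTable(typeof(T).Name);
        PropertyInfo[] props = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance);

        foreach (var prop in props)
        {
            // DataTable columns cannot be Nullable<T>; unwrap to the underlying type.
            var colType = Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType;
            table.Columns.Add(prop.Name, colType);
        }

        foreach (var item in list)
        {
            var row = table.NewRow();
            foreach (var prop in props)
                row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
            table.Rows.Add(row);
        }
        return table;
    }
}
```

You would still set bulkCopy.DestinationTableName (and any column mappings) before calling bulkCopy.WriteToServer(table).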

It could be called inside a DbContextTransaction, and/or the DbContext.Database might already have done something, so the connection might already be open. If you don't explicitly open the connection before creating the bulk copy, you get an error:

System.InvalidOperationException: WriteToServer requires an open and available Connection

which is odd; I'd have thought that SqlBulkCopy would open the connection itself if it had to.

So my question is: am I going about this the right way?

1 Answer


I have used your approach successfully myself.
My feeling is that you should close it again if you opened it.

I would additionally let SqlBulkCopy handle the transaction itself. I don't think enlisting SqlBulkCopy into any ambient transaction is a great idea unless your particular scenario absolutely demands it.

public static void BulkInsert<T>(PandaDataContext db, IList<T> list)
{
    var mustOpen = db.Database.Connection.State != ConnectionState.Open;
    try
    {
        if (mustOpen)
            db.Database.Connection.Open();
        using (var bulkCopy = new SqlBulkCopy((db.Database.Connection) as SqlConnection, SqlBulkCopyOptions.Default))
        {
            // fill a DataTable and write to server, as in the question
        }
    }
    finally
    {
        if (mustOpen)
            db.Database.Connection.Close();
    }
}

4 Comments

Hi, thanks, the close is a good suggestion. The only thing in the ambient transaction is removing the rows that the bulk copy is replacing.
I would remove the rows independently of the SqlBulkCopy (in a different transaction) if that will not cause corrupt or invalid data. If you must have both in the same transaction, do a lot of testing with realistic volumes of data, and make sure it's stable and correct.
I need it in a transaction to make sure that nothing reads the data in between its being cleared and replaced. It seems performant enough; there are fewer than a million rows, and blocking for 2 seconds doesn't matter. Why do you have doubts about the SqlBulkCopy in the transaction?
I have found that very large transactions can fail. You will need to test. I have typically found that doing tranches of data works better than trying to add millions of rows in a single transaction. You will just need to try it with your data loads and test.
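The tranche idea from the last comment can be sketched as a simple partitioning wrapper. This is illustrative only: the helper name and the batch size of 10,000 are assumptions, and if each tranche is bulk-copied outside a shared transaction, a mid-run failure leaves earlier tranches already inserted:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class TrancheHelper
{
    // Split a list into fixed-size tranches so each bulk copy stays small.
    public static IEnumerable<List<T>> InTranchesOf<T>(IList<T> list, int size)
    {
        if (size <= 0) throw new ArgumentOutOfRangeException(nameof(size));
        for (int i = 0; i < list.Count; i += size)
            yield return list.Skip(i).Take(size).ToList();
    }
}

// Hypothetical usage against the question's helper:
// foreach (var tranche in TrancheHelper.InTranchesOf(rows, 10_000))
//     BulkInsert(db, tranche);
```

SqlBulkCopy also has a BatchSize property that controls how many rows are sent per round trip, which serves a similar purpose within a single WriteToServer call.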
