26 Jan 2012

SqlBulkCopy for Generic List<T> (useful for Entity Framework & NHibernate)


A common complaint about Entity Framework is slow insert times for larger datasets. Last night I was trying to insert a catalog of 15k products and it was taking a very long time (I gave up after 5 minutes). I recalled a post from a while back by Mikael Eliasson demonstrating SqlBulkCopy from .NET. I had used BCP in SQL Server, but not from .NET. I took Mikael’s example and roughed out a reusable generic version below, which produced 15k inserts in 2.4s, or roughly 6,200 rows per second. I upped it to 4 catalogs, 224,392 rows in 39s, for roughly 5,750 rps (switching between 4 files). These are pretty decent records too: 41 columns, and a few of the fields have a meaty char count. Good enough, I say.

[gist]http://gist.github.com/1681480[/gist]

This works off just a basic list of items whose property names match the table column names. Given that most POCO-based ORMs generally match the schema exactly, it works great with EF code-first objects.
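In rough form, the helper boils down to the sketch below. This is an illustration of the approach rather than the exact gist: the BulkHelper name, the property filtering, and the column mappings are my assumptions.

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using System.Reflection;

public static class BulkHelper
{
    // Bulk inserts a plain list of POCOs into tableName, matching public
    // scalar properties to columns of the same name.
    public static void BulkInsert<T>(string connectionString, string tableName, IList<T> list)
    {
        // Only simple scalar properties map onto table columns; this skips
        // navigation properties on EF code-first entities.
        PropertyInfo[] props = typeof(T)
            .GetProperties(BindingFlags.Public | BindingFlags.Instance)
            .Where(p => p.PropertyType.IsValueType || p.PropertyType == typeof(string))
            .ToArray();

        var table = new DataTable();
        foreach (var prop in props)
        {
            // DataTable columns cannot be declared as Nullable<T>, so unwrap nullable types.
            var columnType = Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType;
            table.Columns.Add(prop.Name, columnType);
        }

        foreach (var item in list)
        {
            var values = props.Select(p => p.GetValue(item, null) ?? DBNull.Value).ToArray();
            table.Rows.Add(values);
        }

        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = tableName;
            bulkCopy.BatchSize = list.Count;
            // Map by name so the DataTable column order does not have to match the table.
            foreach (var prop in props)
                bulkCopy.ColumnMappings.Add(prop.Name, prop.Name);
            bulkCopy.WriteToServer(table);
        }
    }
}

Mapping columns by name, rather than relying on ordinal position, is what lets the same helper work against any table whose column names line up with the POCO's properties.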

To use it, I just build up a list of objects, pick the connection string off the DbContext, and then call BulkInsert to save to the DB. Note that in this case I am just adding items to a List<T>, not the EF DbSet<T>.

[gist]http://gist.github.com/1681888[/gist]
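In rough form, the usage side looks something like the following. Product, CatalogContext, and the parsing helpers here are placeholders for whatever your code-first model and import routine look like, not the actual gist code.

using (var db = new CatalogContext())
{
    // Plain List<T>, not the EF DbSet<T>; change tracking is bypassed entirely.
    var products = new List<Product>();
    foreach (var line in ParseCatalogFile("catalog1.csv"))    // hypothetical feed parser
    {
        products.Add(MapToProduct(line));                      // hypothetical row-to-entity mapping
    }

    // Pick the connection string off the DbContext and hand the list to the helper.
    var connectionString = db.Database.Connection.ConnectionString;
    BulkHelper.BulkInsert(connectionString, "Products", products);
}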

7 thoughts on “SqlBulkCopy for Generic List<T> (useful for Entity Framework & NHibernate)”

  1. Just a reminder of the default behavior of SqlBulkCopy:
    1- It ignores some constraints (like FKs)
    2- It uses default column values instead of null

    An example of the second one:
    create table Products ( /*…*/ registerDate datetime default getdate() )
    When you bulk insert, even if you pass null, the value in registerDate will be the default.

    To override these you have to pass some options:

    using (var bulkCopy = new SqlBulkCopy(ConnectionString,
       SqlBulkCopyOptions.CheckConstraints | SqlBulkCopyOptions.KeepNulls))
