A common complaint about Entity Framework is slow insert times for larger datasets. Last night I was trying to insert a catalog of 15k products and it was taking a very long time (I gave up after 5 minutes). I recalled a post from a while back by Mikael Eliasson demonstrating SqlBulkCopy from .NET. I had used BCP in SQL Server, but never from .NET. I took Mikael's example and roughed out a reusable generic version below, which produced 15k inserts in 2.4s, or roughly 6,200 rows per second. I upped it to 4 catalogs, 224,392 rows in 39s, for roughly 5,750 rows per second (cycling through the 4 files). These are pretty decent records too: 41 columns, and a few of the fields have a meaty character count. Good enough, I say.
This works off just a basic list of items whose property names match the table column names. Given that most POCO-based ORMs generally match the schema exactly, it works great with EF code-first objects.
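Here is a sketch of what that reusable generic version can look like. It uses reflection to build a DataTable from the list's public properties and hands it to SqlBulkCopy; the class and method names (`BulkHelper`, `BulkInsert`) and the exact filtering of mappable properties are my own choices, not necessarily identical to Mikael's original code.

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;

public static class BulkHelper
{
    // Bulk-inserts a list of POCOs whose property names match the
    // destination table's column names.
    public static void BulkInsert<T>(string connectionString, string tableName, IList<T> list)
    {
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.BatchSize = list.Count;
            bulkCopy.DestinationTableName = tableName;

            // SqlBulkCopy can't map navigation properties or other
            // complex types, so keep only value types and strings.
            var props = typeof(T).GetProperties()
                .Where(p => p.PropertyType.IsValueType || p.PropertyType == typeof(string))
                .ToArray();

            var table = new DataTable();
            foreach (var prop in props)
            {
                // DataTable columns can't be nullable types directly,
                // so unwrap Nullable<T> to its underlying type.
                var type = Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType;
                table.Columns.Add(prop.Name, type);
            }

            foreach (var item in list)
            {
                var values = props
                    .Select(p => p.GetValue(item, null) ?? DBNull.Value)
                    .ToArray();
                table.Rows.Add(values);
            }

            bulkCopy.WriteToServer(table);
        }
    }
}
```

Setting BatchSize to the full list length sends everything in one batch; for very large lists you may want a smaller fixed batch size so the transaction log doesn't balloon.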
To use it, I just build up a list of objects, pull the connection string off the DbContext, and call BulkInsert to save to the DB. Note that in this case I am just adding items to a List&lt;T&gt;, not the EF DbSet&lt;T&gt;.
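Usage then looks roughly like the following. `CatalogContext`, `Product`, and the "Products" table name are hypothetical stand-ins for your own EF code-first context and entity; the connection string comes off the context's Database property.

```csharp
using System.Collections.Generic;

// Build up a plain List<T> of POCOs -- deliberately NOT db.Products.Add(...),
// which is what makes EF inserts slow for large sets.
var products = new List<Product>();
for (int i = 0; i < 15000; i++)
{
    products.Add(new Product { Sku = "SKU-" + i, Name = "Product " + i });
}

using (var db = new CatalogContext())
{
    // Grab the connection string from the DbContext and bulk insert.
    BulkHelper.BulkInsert(
        db.Database.Connection.ConnectionString,
        "Products",
        products);
}
```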