Thursday, March 22, 2012

Best practices on MaxInsertCommitSize

Hi,

I wonder if anyone knows what the best value would be for the MaxInsertCommitSize property of the SQL Server destination if I want to load 6 million records into a target table. Is the best setting 0 (try loading everything in one batch), or should I choose a different value, for example 1,000,000 per batch?

Thanks,

Marc

This setting is up to you. How big of a batch do you want to insert? If you encounter an error in the load, do you want to roll back ALL records, or just the batch?

There is no "best practice" because this is user dependent. Everyone's situation is different.
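The per-batch rollback behavior described above can be sketched outside of SSIS. This is a minimal illustration using SQLite as a stand-in for SQL Server; the table, batch size, and the deliberately duplicated key are all hypothetical, but the principle is the same: with a commit per batch, a failure only rolls back the batch in flight, and everything committed before it survives.

```python
import sqlite3

# Stand-in for a batched load: commit every BATCH rows so that a failure
# only loses the current batch, not the whole load.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY)")

rows = list(range(10))
rows.insert(7, 3)  # duplicate key, planted so the third batch fails
BATCH = 3

for i in range(0, len(rows), BATCH):
    batch = [(r,) for r in rows[i : i + BATCH]]
    try:
        conn.executemany("INSERT INTO target (id) VALUES (?)", batch)
        conn.commit()  # this batch is now permanent
    except sqlite3.IntegrityError:
        conn.rollback()  # only the failing batch is undone
        break

# The two batches committed before the failure are still there.
count = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(count)  # 6 rows survived; the failing batch of 3 was rolled back
```

With MaxInsertCommitSize = 0 the equivalent would be a single commit at the very end, so the same failure would leave the target empty.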

OK, but does it affect performance? For our situation, the following applies:

- We don't care how big the batch will be as long as it is optimized for maximum performance
- If we encounter an error in the load it doesn't matter if ALL records roll back or just the batch.

I don't think there will be much of a performance difference. The one thing to watch out for with a big batch is the potential for filling up the transaction log.

Try different settings and report your results back to us.

Actually, there may be a performance difference between using batches and not. When SQL Server commits a batch, it has to update any affected indexes, and 6 million records can be a good deal of work; I've seen it take over an hour to commit a batch that size when indexes are involved. By using smaller batches, SQL Server can get started on this work while SSIS is still sending it rows. But that advantage depends on how long your SSIS process takes: if it can generate those 6 million rows in 30 seconds, then giving SQL Server a head start isn't going to do much good.
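The "try different settings" advice can be turned into a small harness. This is a rough sketch, again using SQLite rather than SQL Server, and the row count, batch sizes, and secondary index are all invented for illustration: loading the same rows once with a single end-of-load commit (analogous to MaxInsertCommitSize = 0) and once with periodic commits lets you compare timings on a table that has an index to maintain.

```python
import sqlite3
import time

def load(rows, batch_size):
    """Load rows committing every batch_size rows; return (row count, seconds)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER, val TEXT)")
    conn.execute("CREATE INDEX ix_val ON t (val)")  # index that must be maintained
    start = time.perf_counter()
    for i in range(0, len(rows), batch_size):
        conn.executemany("INSERT INTO t VALUES (?, ?)", rows[i : i + batch_size])
        conn.commit()
    elapsed = time.perf_counter() - start
    n = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    conn.close()
    return n, elapsed

rows = [(i, f"v{i}") for i in range(100_000)]
n_all, t_all = load(rows, len(rows))       # one big commit, like commit size 0
n_batched, t_batched = load(rows, 10_000)  # ten smaller commits
print(f"single commit: {t_all:.3f}s, batched: {t_batched:.3f}s")
```

Both strategies load the same 100,000 rows; which one wins on time depends on the engine, the indexes, and how fast the source produces rows, which is exactly why measuring your own workload beats any universal setting.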
