SQL Server Performance

Large Inserts/Updates to wide tables

Discussion in 'Non-Transact SQL Developer Performance Tuning' started by Sparky, Oct 28, 2004.

  1. Sparky New Member

    I am going to use a .Net or VB 6 client app to select data from one db, process the data according to business rules and insert into another db. The target is one table that is very wide (120+ columns) with about 2 million rows.

    I think I have several options for inserting rows. Which is better?

    1) Disconnected recordset. Connect to the table with a disconnected recordset, insert the rows at the client, then call UpdateBatch to complete the insertion. (I don't know if ADO really treats this like a SQL batch or not.)
    2) Insert the rows one at a time using a stored procedure.
    3) Write the rows to a text file and use BCP to import the new records.
    4) Some other way.
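
    For reference, option 2 would be something like the following sketch: a stored procedure that takes one parameter per column and inserts a single row, called once per record from the client. The table and column names here are made up; the real table has 120+ columns.

    ```sql
    -- Hypothetical single-row insert proc (option 2).
    -- Table and column names are illustrative, not from this thread.
    CREATE PROCEDURE dbo.InsertWideRow
        @Col1 INT,
        @Col2 VARCHAR(50)
        -- ... remaining 120+ parameters elided
    AS
    BEGIN
        SET NOCOUNT ON;
        INSERT INTO dbo.WideTable (Col1, Col2 /* ... */)
        VALUES (@Col1, @Col2 /* ... */);
    END
    ```

    The per-row round trips are the main cost of this approach, which is why the bulk options below tend to win for millions of rows.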

    Now, the updates. I will also have updates to do on existing records. I suppose I have similar methods at my disposal for the updates, so I won't list them again.

    Any ideas on which is more efficient?

    I perform this rebuild monthly. Sometimes it is from scratch (mostly inserts) and sometimes it is merely 10% inserts and 20% updates. Perhaps the methods should be different in each case.

  2. Med New Member

    For pure bulk insertions, I don't know of anything better than the BULK INSERT T-SQL command.
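
    As a rough sketch, it might look like this -- the file path, table name, delimiters, and options are assumptions, not details from this thread:

    ```sql
    -- Minimal BULK INSERT sketch; names and paths are hypothetical.
    BULK INSERT dbo.WideTable
    FROM 'C:\staging\widetable.txt'
    WITH (
        FIELDTERMINATOR = '\t',   -- tab-delimited export from the client app
        ROWTERMINATOR   = '\n',
        TABLOCK,                  -- table lock can enable minimal logging
        BATCHSIZE       = 50000   -- commit in batches rather than one huge transaction
    );
    ```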


  3. Sparky New Member

    Thanks Med. Sounds good for the inserts.

    What about the updates? Anyone have thoughts on bulk updates?
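
    One widely used pattern, sketched here under assumed table and key names, is to bulk-load the changed rows into a staging table (using BULK INSERT or BCP as above) and then apply them in one set-based statement rather than row by row:

    ```sql
    -- Apply staged changes with a single set-based UPDATE.
    -- Table, column, and key names are illustrative only.
    UPDATE t
    SET    t.Col1 = s.Col1,
           t.Col2 = s.Col2
           -- ... remaining changed columns
    FROM   dbo.WideTable AS t
    JOIN   dbo.WideTable_Staging AS s
           ON s.KeyCol = t.KeyCol;
    ```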

  4. satya Moderator
