Large Inserts/Updates to wide tables | SQL Server Performance Forums

SQL Server Performance Forum – Threads Archive

Large Inserts/Updates to wide tables

I am going to use a .Net or VB 6 client app to select data from one database, process the data according to business rules, and insert it into another database. The target is one table that is very wide (120+ columns) with about 2 million rows. I think I have several options for inserting rows. Which is better?
1) Disconnected recordset. Connect to the table with a disconnected recordset, insert the rows at the client, and then call UpdateBatch to complete the insertion. (I don't know whether ADO really treats this like a SQL batch or not.)
2) Insert the rows one at a time using a sproc (a rough sketch of such a procedure follows this post).
3) Write the new records to a text file and use BCP to import them.
4) Some other way.
Now, the updates. I will also have updates to apply to existing records. I suppose I have similar methods at my disposal for the updates, so I won't list them again. Any ideas on which is more efficient? I perform this rebuild monthly. Sometimes it is from scratch (mostly inserts) and sometimes it is merely 10% inserts and 20% updates. Perhaps the methods should be different in each case.
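For context, a minimal sketch of the row-at-a-time stored procedure in option 2; the procedure, table, and column names here are placeholders, with a handful of columns standing in for the 120+ real ones:

-- Sketch only: placeholder names; the real table has 120+ columns.
CREATE PROCEDURE dbo.usp_InsertWideRow
    @KeyCol INT,
    @Col1   VARCHAR(50),
    @Col2   VARCHAR(50)
    -- ...one parameter per remaining column
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.WideTable (KeyCol, Col1, Col2 /* ... */)
    VALUES (@KeyCol, @Col1, @Col2 /* ... */);
END

The client would call this once per row, which is simple but costs a round trip per insert; that is why the bulk options tend to win for a 2-million-row load.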
For pure bulk insertions, I don't know of anything better than the BULK INSERT T-SQL command.
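For what it's worth, a minimal sketch of such a load, assuming the client app writes a comma-delimited text file; the database, table, file path, and option values below are placeholders, not recommendations from this thread:

-- Minimal BULK INSERT sketch; all names are placeholders.
BULK INSERT TargetDb.dbo.WideTable
FROM 'C:\data\new_rows.txt'
WITH (
    FIELDTERMINATOR = ',',      -- field delimiter written by the client app
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 10000,    -- commit in batches to limit transaction log growth
    TABLOCK                     -- table lock; allows minimally logged load under the right recovery model
);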

Thanks Med. Sounds good for the inserts. What about the updates? Anyone have thoughts on bulk updates?
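For the updates, one common set-based pattern (sketched here with placeholder table and column names) is to bulk load the changed rows into a staging table and then apply a single UPDATE joined to the target:

-- Sketch only: load the changed rows into a staging table first (e.g. with
-- BULK INSERT as above), then update the wide table in one set-based pass.
UPDATE t
SET    t.Col1 = s.Col1,
       t.Col2 = s.Col2         -- ...repeat for each column that can change
FROM   dbo.WideTable         AS t
JOIN   dbo.WideTable_Staging AS s
       ON s.KeyCol = t.KeyCol;

For the monthly 20% case that is roughly 400,000 rows handled in one statement rather than 400,000 single-row calls.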
Thread http://www.sql-server-performance.com/forum/topic.asp?ARCHIVE=true&TOPIC_ID=1066#4742 for reference. HTH Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.