
SQL Server Performance Forum – Threads Archive

Best approach to transfer CSV data to SQL Tables?

I’m fairly new to DTS and would like to know the best approach to moving CSV file records to SQL 2000 tables, but only if certain conditions are met. For example, I wish to check each record in the CSV file to see if it already exists in the SQL table. If it does exist, check to see if any column value has changed. If YES, either replace the entire record or replace the column that has changed. If NO, don’t transfer the record.

The only approach I’ve come up with is this:
– Define the source file as comma delimited
– Define a SQL connection property
– Use the Transform Data Task to move ALL CSV records to a staging table
– Use SQL scripts to load the primary tables, performing the necessary edits prior to inserting or updating the records (a sketch of this step follows below)

Is there a better approach?

Thanks, Dave
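A minimal sketch of that last step, assuming a hypothetical staging table CsvStaging and target table Customers keyed on CustomerID. SQL 2000 has no MERGE statement, so the conditional update and insert are two separate statements; all table and column names here are made up for illustration:

    -- Update rows that already exist but where an incoming value differs
    UPDATE c
    SET    c.CustomerName = s.CustomerName,
           c.Balance      = s.Balance
    FROM   dbo.Customers AS c
           JOIN dbo.CsvStaging AS s
             ON s.CustomerID = c.CustomerID
    WHERE  c.CustomerName <> s.CustomerName
       OR  c.Balance <> s.Balance

    -- Insert rows that are not in the target table yet
    INSERT INTO dbo.Customers (CustomerID, CustomerName, Balance)
    SELECT s.CustomerID, s.CustomerName, s.Balance
    FROM   dbo.CsvStaging AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dbo.Customers AS c
                       WHERE  c.CustomerID = s.CustomerID)

One caveat: if any compared column allows NULLs, the <> tests need ISNULL wrappers (or explicit IS NULL checks), since NULL <> NULL does not evaluate to true and changed rows would be silently skipped.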

I think this is probably the best approach. This way, if you need to add, remove, or change any restrictions, you can do it in the SQL script. Chris
I had someone suggest not using DTS, and instead using BCP to load the data into a staging table and then SQL to perform the necessary conditional logic prior to populating the permanent tables. How do you like that approach? The only question I have is how to specify a field terminator, since my CSV files have fields separated by a comma followed by a single space. I have no control over the format of the CSV files. Thanks, Dave
I read that (in the ‘other place’), and you can. I’ve never had a problem with either, but I like the integration of DTS; I think it can come down to personal preference. For both, the recommendation was to bring the data into a temp/staging area and then process it from there.

-t is how you specify the field terminator in BCP, and if you put it between quotes you should be able to use a comma plus a space (see the example below). Also, if you choose to use bcp or BULK INSERT, remember that your database settings and maintenance plan must allow for this type of operation. Check "Logged and Minimally Logged Bulk Copy Operations" in Books Online to see how this fits into your situation.

I guess the question no one asked is ‘how much data’? If it’s huge, mammoth even, then I’d go with the bulk operations. Although the DTS data pump also gives you the option of doing a fast load. Chris
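For example, with hypothetical server, database, and file names (the quoted terminator below is a comma followed by one space):

    bcp MyDb.dbo.CsvStaging in C:\data\feed.csv -c -t", " -S MyServer -T

Here -c loads the file as character data, -t sets the field terminator, -S names the server, and -T uses a trusted connection. The rough BULK INSERT equivalent would be:

    BULK INSERT dbo.CsvStaging
    FROM 'C:\data\feed.csv'
    WITH (FIELDTERMINATOR = ', ', ROWTERMINATOR = '\n')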
Also have a look around http://www.sqldts.com for all kinds of DTS goodies.
In terms of raw data loading, BCP performs better than DTS, and you can control the file layout using the FORMATFILE option in BCP (a sample format file is sketched at the end of this thread). As Chris mentioned, BOL is the best starting point for BCP and the other topics. Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
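As a rough illustration of the FORMATFILE route, here is what a SQL 2000 (version 8.0) format file might look like for a three-field file whose fields are separated by a comma and a space; the column names, lengths, and field count are hypothetical:

    8.0
    3
    1  SQLCHAR  0  12  ", "    1  CustomerID    ""
    2  SQLCHAR  0  50  ", "    2  CustomerName  ""
    3  SQLCHAR  0  30  "\r\n"  3  Balance       ""

The fields on each line are: host field order, data type, prefix length, maximum data length, terminator, target column order, target column name, and collation (left empty here). The file would be passed to bcp with -f, e.g. bcp MyDb.dbo.CsvStaging in C:\data\feed.csv -f feed.fmt -S MyServer -T.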