SQL Server Performance Forum – Threads Archive
DTS feeding table problem
Hi, I have a DTS package that takes a fixed-format text file and uses a DataPump task to feed a table in my database. The problem is that sometimes I get duplicate records (the same information has been entered twice in the text file). Before inserting a record into my table, I'd like to check whether it already exists and, if so, either skip that specific record OR delete the existing row and insert the new one IF its lot number (a field of the record) is higher. Is there a way to do this directly in the DataPump task? For now, I've added a SQL task that does the job after the DataPump has finished (it finds the duplicate records and deletes the one with the lower lot number). It works, but it's not very clean. Thanks a lot for your insight. Dominic
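A post-load cleanup like the one Dominic describes could look roughly like this. This is only a sketch: the table `dbo.Shipments` and the columns `item_id` and `lot_number` are assumed names standing in for the real key and lot-number fields, and the T-SQL deliberately avoids newer constructs so it would run on the SQL Server versions DTS shipped with.

```sql
-- Hypothetical destination table loaded by the DataPump:
--   dbo.Shipments(item_id, lot_number, ...)
-- After the load, delete every duplicate row whose lot number is
-- lower than the highest lot number seen for the same item.
DELETE s
FROM dbo.Shipments AS s
WHERE s.lot_number < (SELECT MAX(s2.lot_number)
                      FROM dbo.Shipments AS s2
                      WHERE s2.item_id = s.item_id);
```

This keeps exactly one row per `item_id` (the one with the highest `lot_number`), which matches the "delete the one with the lower lot number" behaviour described above.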
The way you are doing it is probably how I would do it too. One other option – again not ideal – is to put a unique constraint on the column. Then, when importing the file, allow errors: the second entry will always be skipped. But obviously, if you get other errors – say formatting problems or missing columns – you won't know about them. Someone else may have a better idea though… Simon
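Simon's suggestion amounts to something like the following. The table and constraint names are placeholders; the idea is simply that once the constraint exists, a duplicate insert fails with a constraint violation, and raising the DataPump task's maximum error count lets the load continue past those failures (at the cost of also swallowing genuine errors, as he notes).

```sql
-- Assumed table and key column; adjust to your schema.
ALTER TABLE dbo.Shipments
ADD CONSTRAINT UQ_Shipments_item UNIQUE (item_id);
```

Note that this variant always keeps the first row loaded, so it cannot by itself implement the "replace if the new lot number is higher" rule.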
Instead of deleting rows from the destination table, you could insert the rows into a flat staging table and use a SELECT DISTINCT query as the data source for the insert task. This may be more efficient if there is a huge number of duplicates.
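The staging-table approach might be sketched as below. The staging table `dbo.Shipments_stage` is an assumed name with the same layout as the destination; rather than a plain SELECT DISTINCT (which would still keep both rows when only the lot number differs), the query keeps the highest lot number per item, combining this suggestion with the original requirement.

```sql
-- Hypothetical staging table, loaded by the DataPump instead of the
-- real destination: dbo.Shipments_stage(item_id, lot_number, ...)
-- Copy only the row with the highest lot number per item.
INSERT INTO dbo.Shipments (item_id, lot_number)
SELECT s.item_id, s.lot_number
FROM dbo.Shipments_stage AS s
WHERE s.lot_number = (SELECT MAX(s2.lot_number)
                      FROM dbo.Shipments_stage AS s2
                      WHERE s2.item_id = s.item_id);

-- Empty the staging table for the next load.
TRUNCATE TABLE dbo.Shipments_stage;
```

Because the destination table is never touched until the data is already deduplicated, there is no delete pass against it at all, which is where the efficiency gain with many duplicates would come from.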