DTS bulk insert task loads 1194 rows of 504,000

I have a DTS bulk insert task that is behaving oddly. It runs and reports success; however, when I examine the data in Query Analyzer, the rows that did load look fine, but only 1194 of the 504,000 rows were loaded. No error or other messages are produced. The bulk insert task reads from a flat file and uses an existing format file that has been tested with bcp; the task is an upgrade of a bcp script, and the same format file is used in both. In a separate package, I created a regular data transformation task that loads the file correctly, all 504,000 rows. I am at a loss to understand why the bulk insert task loads only 1194 records; if there were a problem with it, I would expect it to error out right away. I have tried raising the MaximumErrors setting from 10 to 1000 with no change in the results. I am interested in opinions and comments on this situation.
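
For reference, a non-XML bcp format file for a comma-delimited, three-column file looks roughly like this (the column names, widths, and layout here are hypothetical, not taken from the task above). Each line after the two header lines gives the field number, host data type, prefix length, maximum width, terminator, target column order, target column name, and (for version 8.0 files) collation; the terminator on the last field is what tells the loader where each row ends:

    8.0
    3
    1   SQLCHAR   0   12   ","      1   CustomerID     ""
    2   SQLCHAR   0   50   ","      2   CustomerName   ""
    3   SQLCHAR   0   10   "\r\n"   3   OrderDate      ""

If the row terminator declared in the format file does not match what the file actually contains (for example, \r\n declared but only \n present), a bulk load can misparse rows or stop short of the full row count.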

Can you open the flat file, identify the last line that was imported, and see if it has any funny characters at the end?
Indeed, I checked the file right after this happened and looked at the lines of data around where it stopped. They are fine. Moreover, this file loads fine with the normal data transformation task and with bcp. I appreciate your suggestion. Thanks.

What is the delimiter? Check that all the columns are populated with data of the appropriate type and width. Also check which record is the last one loaded into the table: is it row 1194 of the flat file? Possibly DTS is not finding the newline character or the delimiter. You can also physically map each column of data from the flat file in the DTS transformations, based on the column width (if no delimiter is used). Finally, you can run the BULK INSERT statement in Query Analyzer and see whether it reports any errors; check SQL Server Books Online for the syntax, or see the sketch below.
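
For example, a minimal BULK INSERT along these lines (the table, path, and file names are placeholders, not from the original task) should surface any parse or conversion errors directly in Query Analyzer instead of hiding them behind a package step:

    -- Hypothetical names; substitute the actual table, data file, and format file.
    BULK INSERT dbo.StagingTable
    FROM 'C:\load\datafile.txt'
    WITH (
        FORMATFILE = 'C:\load\datafile.fmt',  -- same format file the DTS task uses
        MAXERRORS  = 1000,                    -- tolerate bad rows up to this count
        TABLOCK                               -- table lock for faster bulk loading
    );

    SELECT COUNT(*) FROM dbo.StagingTable;    -- confirm all 504,000 rows arrived

Because it uses the same format file and data file as the DTS task, any row-count discrepancy or error it raises points at the file/format combination rather than at the package.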