
SQL Server Performance Forum – Threads Archive

BCP Utility

Hi! I have the following problem: I am transferring big tables (about 100–400 MB) from a central SQL Server to several clients. Since there is a 2 Mbit line between server and client, I use compressed bcp files (native mode) to accomplish this in an acceptable time.
When I insert the data into my tables, I'd like to verify the transfer with a stored procedure. The procedures work if I start them manually, but when I start them automatically after the bcp utility has finished, the procedure returns NULL.
I think the problem is that the server needs some time to rebuild the indexes of these tables.
So I'm looking for a way to monitor the SQL Server (which is quite busy after the bcp in) to tell when it has finished, so that my check SP can validate whether the transfer was successful. Anyone got any ideas? Greetings,
Clemens
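
For reference, a load-then-verify sequence of the kind described might look roughly like this; a minimal sketch, assuming the load is driven from T-SQL via xp_cmdshell (database, table, file path, and the validation procedure usp_CheckTransfer are placeholders):

    DECLARE @rc int
    -- Native-mode bulk load from the transferred file
    -- (-n = native format, -S = server, -T = trusted connection)
    EXEC @rc = master.dbo.xp_cmdshell
        'bcp ClientDb.dbo.BigTable in C:\transfer\BigTable.dat -n -S (local) -T'
    IF @rc = 0   -- xp_cmdshell reports 0 on success
        EXEC ClientDb.dbo.usp_CheckTransfer   -- hypothetical validation procedure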
You could use a query that checks the rowcount of the tables on the source server and compares it to the rowcount of the tables on the destination servers (using linked servers). If the rowcounts match, the process has finished. That may be too simplistic, but it may be what you're after. Tom Pullen
DBA, Oxfam GB
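
A minimal sketch of that comparison, assuming a linked server named CENTRALSRV (database and table names are placeholders):

    DECLARE @src int, @dst int
    -- Rowcount on the source (via the linked server) and on the local copy
    SELECT @src = COUNT(*) FROM CENTRALSRV.CentralDb.dbo.BigTable
    SELECT @dst = COUNT(*) FROM ClientDb.dbo.BigTable
    IF @src = @dst
        PRINT 'Transfer complete'
    ELSE
        PRINT 'Rowcounts differ - load still running or rows missing'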
Use Profiler to monitor the activity during this process. Satya SKJ
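
If the monitoring has to run unattended, the scriptable counterpart to Profiler is a server-side trace. A rough sketch (the output path is a placeholder; SQL Server appends .trc to it):

    DECLARE @traceid int, @on bit
    SET @on = 1
    -- Create a trace writing to a file; NULL max size and stop time use the defaults
    EXEC sp_trace_create @traceid OUTPUT, 0, N'C:\traces\bcp_load', NULL, NULL
    -- Capture SQL:BatchCompleted (event 12) with TextData (column 1) and SPID (column 12)
    EXEC sp_trace_setevent @traceid, 12, 1, @on
    EXEC sp_trace_setevent @traceid, 12, 12, @on
    -- Start the trace
    EXEC sp_trace_setstatus @traceid, 1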

Hi! Thank you for your hints!
-> The rowcount method: I tried this, but I didn't get any results because the SQL Server is quite busy while it inserts the data into the destination table (the rowcount query times out). I think the index rebuild after the insertion of the data is responsible for this. I also tried to monitor the sysprocesses table to get a handle on the SQL Server processes; that didn't work either (timeout in the interesting timeframe).
-> As for the Profiler: this transfer should take place fully automatically, triggered by the user. I will try it to gain some new information, but I'm looking for a way to automate this; is there a way to do that? (My clients are MSDEs with a VB client tool.)
I've read that it might be useful to drop the indexes of my tables before I insert the data, but I'm a little afraid to do that, because if I insert inconsistent data there might be duplicate records, which would prevent the index rebuild, which is not good (this shouldn't happen, but you never know…). Another way might be to limit bcp's batch size to 1000 rows. But since a transaction ends every 1000 rows, the indexes will also be rebuilt each time, and as our PKs are very complex this takes some time, so I have major performance concerns about using this. Thanks again for your ideas; I'm looking forward to continuing this discussion! Greetings,
Clemens
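
As an aside, one era-typical workaround for the rowcount timeout is to read the approximate rowcount from sysindexes instead of issuing a COUNT(*); the metadata read is normally not blocked by the running load (the table name is a placeholder, and the value can lag slightly behind the actual count):

    SELECT rows
    FROM sysindexes
    WHERE id = OBJECT_ID('dbo.BigTable')
      AND indid < 2   -- the heap (0) or clustered index (1) row carries the table rowcount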
If you have a Primary Key or Unique constraint on the table, then no duplicate rows will be allowed.
The best approach is to reindex after the load so that the statistics on this table are updated. In any case, BCP is a minimally logged operation, and a batch size of 1000 is not a problem to execute. HTH Satya SKJ
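
A sketch of that post-load step (the table name is a placeholder):

    -- Rebuild all indexes on the freshly loaded table, then refresh its statistics
    DBCC DBREINDEX ('dbo.BigTable')
    UPDATE STATISTICS dbo.BigTable WITH FULLSCAN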
