Log Shipping problem | SQL Server Performance Forums

SQL Server Performance Forum – Threads Archive


I read the log shipping article on this site at http://www.sql-server-performance.com/sql_server_log_shipping.asp The steps described for the Database Backup Job are:
1 Truncate Log T-SQL Script
2 Backup Database T-SQL Script
3 Copy Backup Operating System Command
4 Restore Database T-SQL Script
I have set up jobs for my databases using the above 4 steps. Most of my jobs work fine. However, I have found that Step 3 is failing for some of my databases. The command I am using for Step 3 follows the example in the article and looks like this:
xcopy \\Sql-1R\MSSQL7\BACKUP\StopShop.BAK R:\MSSQL7\Backup\LogShipping /c /y
It seems to me that this step is failing on some of my database jobs because the .BAK file is too big, but this is just my guess. I can run this xcopy command from a DOS command prompt and it works fine. However, it fails when I run it as part of my backup job. As I said, the same command works fine for other databases' backup files. In the example I have given, my .BAK file is 395 MB in size. Does anyone know if this xcopy step is failing because of a large file?
If so, does anyone know why this xcopy step would be failing with larger files?
If not, does anyone know why the xcopy is failing? Thank you. Jeff Gavern (Moved from General DBA section)
So the failure is at the operating system level, not on the part of SQL Server.
Check the SQL Agent service account privileges on the primary and secondary servers. To run XCOPY from a job, the account must be a local administrator; when you run the command yourself from a command prompt, it runs under your own (sysadmin) privileges. Satya SKJ
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
Satya, I don’t believe that is the problem. As I stated, I am performing the same xcopy on other database .bak files which reside on the same server and in the same folder. I am having a problem with 4 out of 12 xcopy steps. The only difference I can see is that the 4 that are failing are for larger files. There are no messages in the application log. Any help would be appreciated.
Confirm the configuration and memory settings of the primary and secondary servers. Satya SKJ
Satya, Primary server memory is 4 GB
Secondary server memory is 1 GB. What configuration settings are you talking about? How do I see them? Also, if these are not the settings you were asking for, can you be more specific by explaining what you need and how I go about getting the settings? Thanx. Jeff
I performed another test in my continued analysis of this problem. I created a job on my secondary machine with just the xcopy step. This is the exact server and exact xcopy statement that is failing in the multi-step job. This new single-step job worked fine. It took almost 8 minutes for the file to be copied, but it did work. Is there a reason why the multi-step xcopy step is not working?
I found a solution to this problem. Instead of using the xcopy Operating System command, I am using the xp_cmdshell T-SQL command. Below is an example of the command.
EXEC xp_cmdshell 'copy \\Sql-1R\MSSQL7\BACKUP\StopShop.BAK R:\MSSQL7\Backup\LogShipping'
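One advantage of the xp_cmdshell approach is that the command's output comes back as a result set, so the job step can be made to fail when the copy fails. A minimal sketch, assuming the server name, paths, and file name from the example above (the temp-table technique and the success check are my additions, not from the thread):

```sql
-- Capture xp_cmdshell output so the job step can fail when the copy fails.
-- Server, paths, and file name are the ones from the example above; adjust as needed.
CREATE TABLE #copy_output (line VARCHAR(255) NULL)

INSERT INTO #copy_output
EXEC master..xp_cmdshell 'copy \\Sql-1R\MSSQL7\BACKUP\StopShop.BAK R:\MSSQL7\Backup\LogShipping'

-- The copy command reports "1 file(s) copied." on success
IF NOT EXISTS (SELECT * FROM #copy_output WHERE line LIKE '%file(s) copied%')
    RAISERROR ('Backup file copy failed.', 16, 1)

DROP TABLE #copy_output
```

Raising an error with severity 16 makes the job step report failure, so SQL Agent can stop before the restore step runs against a missing or partial file.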

Glad you found the solution to the problem; handling the copy of the bigger databases between the primary and secondary servers differently was going to be my next suggestion. Thanks for sharing the solution. Satya SKJ
Also, I found another potential problem with the log shipping process documented on http://www.sql-server-performance.com/sql_server_log_shipping.asp For the Backup Database step (Step 2) there is an example of code as follows:
Step 2 Script
BACKUP DATABASE database_name TO database_name_backup_device WITH INIT
WAITFOR DELAY '00:00:05'
On larger databases the WAITFOR is not a long enough delay. What happens is that the copy step (Step 3) starts while the backup step is still running, which causes Step 3 to fail. You need to adjust the delay time in Step 2 accordingly.
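As an illustration, a larger database might need the delay extended well past the observed backup time. The five-minute value and the StopShop names below are examples, not recommendations from the article:

```sql
-- Step 2 for a larger database: extend the delay so Step 3 (the copy)
-- does not start while the backup is still writing the .BAK file.
BACKUP DATABASE StopShop TO StopShop_backup_device WITH INIT
WAITFOR DELAY '00:05:00'  -- tune to exceed the backup's observed run time
```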

You might want to table-drive that. You could then set the delay for each database as needed. WAITFOR DELAY @dbwaittime works, if anyone is interested. MeanOldDBA
[email protected] When life gives you a lemon, fire the DBA.
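The table-driven suggestion above might be sketched like this. The table and column names are hypothetical; the variable form of WAITFOR DELAY is the one the poster says works:

```sql
-- Hypothetical lookup table holding a per-database delay for Step 2
CREATE TABLE dbo.LogShipWait (
    dbname     SYSNAME NOT NULL PRIMARY KEY,
    dbwaittime CHAR(8) NOT NULL  -- 'hh:mm:ss'
)

INSERT INTO dbo.LogShipWait (dbname, dbwaittime) VALUES ('StopShop', '00:05:00')

-- In the Step 2 script, after the BACKUP DATABASE statement:
DECLARE @dbwaittime CHAR(8)
SELECT @dbwaittime = dbwaittime FROM dbo.LogShipWait WHERE dbname = 'StopShop'
WAITFOR DELAY @dbwaittime
```

This keeps the job scripts identical across databases; only the row in the lookup table changes when a database grows and needs a longer delay.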
This may be a bit off-topic, not sure, but in regard to using WAITFOR DELAY… my log shipping implementation is done a bit differently (since my two servers do not have two-way communications, just a one-way poke through a firewall). So, mine is built as a DTS package, split into a series of tasks, where I can use the flow control (on completion, on success) in place of manual delays. tanstaafl