Log Shipping is a Disaster Recovery (DR) tool that provides you the ability to have a readily available “warm backup.” Essentially, should your existing system fail, you have a backup SQL Server and database available to take over with relatively up-to-date information. If your database is critical to your business, such as in a call center, log shipping provides an incredible opportunity to make sure that business does not stop for long.
As an example, last year downtown Houston was flooded so badly that most of its buildings were inaccessible. Companies that had implemented log shipping to remote DR sites were able to continue using their line-of-business applications within minutes of losing access to their primary databases. In this situation, a simple clustering solution would have been completely ineffective, because a cluster's nodes and shared storage typically sit together at a single site.
Implementing log shipping is a conceptually simple task.
- Create the database on the remote server(s).
- Set the remote database(s) to read only.
- Create a full database backup on the production server.
- Copy the database backup file to the remote server(s).
- Restore the database on the remote server(s).
- Then, at intervals dictated by your business needs:
  - Perform a transaction log backup of your primary database.
  - Copy the log backup to the remote server.
  - Apply the log to the remote database (the restore will fail if the database is not read only, or if any change has been made to it since the last restore).
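The per-interval cycle above can be sketched in a few lines of VB and Transact-SQL. This is an illustrative sketch only: it assumes open ADODB.Connection objects to each server (connProd, connRemote), that the backup file is written to a share both servers can reach, and hypothetical variable names (sDatabase, sFilename, sRemoteFilename, sUndoFile).

' Back up the transaction log on the production server
connProd.Execute "BACKUP TRANSACTION " & sDatabase & _
    " TO DISK='" & sFilename & "'"
' Ship the file to the remote server
FileCopy sFilename, sRemoteFilename
' Apply it on the remote server. WITH STANDBY keeps the remote
' database in a read-only state so that later logs can still be applied.
connRemote.Execute "RESTORE LOG " & sDatabase & _
    " FROM DISK='" & sRemoteFilename & "'" & _
    " WITH STANDBY='" & sUndoFile & "'"

The initial full restore on the remote server uses the same WITH STANDBY clause with RESTORE DATABASE, which is what leaves the database read only and ready to accept log restores.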
You may find that you need to perform a full backup and restore fairly often. Typically, I schedule the full backup and restore to occur about once per week, during non-peak hours. Your schedule may vary based upon available disk space and the number of full database backups that you wish to retain. Additionally, if you are performing a transaction log backup every 15 minutes, the number of files can become quite large, which may make a manual restore unwieldy.
Writing a VB Log Shipping Application
In this section, I have provided a sample log shipping application which uses an ADO connection to generate the backup files and perform the restores. The application is for demonstration purposes only. So, while you are welcome to use the code, I strongly recommend complete testing for suitability in your environment, and I cannot accept any responsibility for damage that may occur from its use. SQL Server 2000 Enterprise Edition has a Log Shipping Wizard built in. If you are using the Enterprise version, I strongly recommend using the provided wizard. It is much more flexible than the sample application that I have provided, and much more thoroughly tested.
In truth, most of this functionality could have been written using DTS, or VBScript and Windows Scripting Host. However, the environment for which I originally created this tool was using an older version of SQL Server that did not offer DTS, and server policies would not allow for WSH or similar technologies to be installed. Of course, the advantage of using VB is that it does not even need to be run from a SQL Server box. It could easily be scheduled using AT commands from any Windows NT/2000 host.
One of the most difficult decisions you will make when creating your log shipping application is whether you wish to perform all transaction log backups to a single file, or to individual files. I prefer individual files as it provides some immediate visual feedback on the success of your transaction log backups, and can make scripting a manual restore a little more intuitive. However, there are advantages to either choice, so you will need to consider your business needs before you make a decision. The sample code uses the individual file method, but does have some code to account for multiple backups per file.
In the sample code, I first check the registry for information about the production server and the backup servers. I also retrieve the format for the file names that the application will create; this should be a valid date format string. Only the default format has been tested, so changes to it may cause problems, and you will need to test any format you substitute.
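The registry lookup might look something like the following. This is a hypothetical sketch: the section and key names ("Servers", "ProdServer", "FileFormat", etc.) are illustrative, not necessarily those used by the sample application, and it assumes the settings were saved earlier with SaveSetting under the application's name.

Dim sProdServer As String, sBackupServer As String, sFileFormat As String
sProdServer = GetSetting(App.ProductName, "Servers", "ProdServer")
sBackupServer = GetSetting(App.ProductName, "Servers", "BackupServer")
' The file-name format is a date/time pattern fed to VB's Format()
sFileFormat = GetSetting(App.ProductName, "Options", "FileFormat", _
                         "yyyymmddhhnnss")
' Build a backup file name such as CallCenter_20020101130000.trn
Dim sFilename As String
sFilename = sDatabase & "_" & Format(Now, sFileFormat) & ".trn"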
An additional option that you may wish to invoke is to perform a DBCC CHECKDB on the production database before performing the backup. If you are running a version prior to SQL Server 7.0, you should consider this mandatory, as corruption in the source database will be faithfully recreated in the target database.
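The integrity check is a single statement. This sketch assumes an open ADODB.Connection named conn and a database-name variable sDatabase:

' Optional integrity check before backing up; raises errors if
' corruption is found, which your error handler can log and act on.
conn.Execute "DBCC CHECKDB ('" & sDatabase & "')"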
Connecting to the Servers
Connecting to SQL Server using ADO and OLE DB is very easy. We simply create a string with the appropriate connection parameters, including the Provider (SQLOLEDB.1), Data Source (server name), User ID, and Password. To prevent conflicts with the restore efforts, the initial database (Initial Catalog) is set to the master database.
strConn = "Provider=SQLOLEDB.1;Persist Security Info=True" & _
          ";Initial Catalog=master" & _
          ";Data Source=" & strServer & _
          ";Password=" & strPass & _
          ";User ID=" & strUser
Once you create the string, assign it to the ConnectionString property of the ADODB.Connection object. Next, set the ConnectionTimeout property to 0. This prevents the connection attempt itself from timing out; it does not affect the timeout on the actual SQL executions. Now, just open the connection.
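Putting those steps together looks like this (assuming strConn was built as shown above):

Dim conn As New ADODB.Connection
conn.ConnectionString = strConn
conn.ConnectionTimeout = 0   ' never time out while connecting
' Backups and restores can run for a long time, so the statement
' timeout should be generous as well; 0 means wait indefinitely.
conn.CommandTimeout = 0
conn.Open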
Executing the Backup
Once we have a connection, we need to create the SQL statement to perform the database or transaction log backup. If the backup file does not already exist, I use the WITH INIT option to initialize it. When the file already exists, omitting INIT appends the new backup to the file rather than overwriting the backups that are already stored there. Here is the code to create the SQL statement:
strSQL = "BACKUP DATABASE " & sDatabase & _
         " TO DISK='" & sFilename & "'"
If Not FileExists(sFilename) Then
    strSQL = strSQL & " WITH INIT"
End If

' For the transaction log backup:
strSQL = "BACKUP TRANSACTION " & sDatabase & _
         " TO DISK='" & sFilename & "'"
If Not FileExists(GetSetting(App.ProductName, m_sServer, "NetDmpDev")) Then
    strSQL = strSQL & " WITH INIT"
End If
If you are using a version of SQL Server prior to 7.0, you will want to change the above SQL to use DUMP DATABASE, and DUMP TRANSACTION, respectively.
Executing the code is fairly straightforward: you simply call the connection object's Execute method with the query as a parameter. To aid in logging of the data, I trapped some events during the query execution. Capturing ADO events requires using either a class or a Form object. At the top of the class, I inserted the declaration Private WithEvents connEvent As ADODB.Connection. With that code in place (and the variable Set to the active connection), the events are available in your class and are fired by the ADO connection. I used the WillExecute, InfoMessage, and ExecuteComplete events to capture the start, executing, and ending messages from ADO.
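A minimal sketch of the event trap is shown below. It assumes the code lives in a class module, that connEvent has been Set to the connection before Execute is called, and that Debug.Print stands in for whatever logging routine you use:

Private WithEvents connEvent As ADODB.Connection

Private Sub connEvent_WillExecute(Source As String, _
        CursorType As ADODB.CursorTypeEnum, LockType As ADODB.LockTypeEnum, _
        Options As Long, adStatus As ADODB.EventStatusEnum, _
        ByVal pCommand As ADODB.Command, ByVal pRecordset As ADODB.Recordset, _
        ByVal pConnection As ADODB.Connection)
    ' Log the statement that is about to run
    Debug.Print "Starting: " & Source
End Sub

Private Sub connEvent_InfoMessage(ByVal pError As ADODB.Error, _
        adStatus As ADODB.EventStatusEnum, _
        ByVal pConnection As ADODB.Connection)
    ' BACKUP/RESTORE progress messages arrive here
    Debug.Print "Info: " & pError.Description
End Sub

Private Sub connEvent_ExecuteComplete(ByVal RecordsAffected As Long, _
        ByVal pError As ADODB.Error, adStatus As ADODB.EventStatusEnum, _
        ByVal pCommand As ADODB.Command, ByVal pRecordset As ADODB.Recordset, _
        ByVal pConnection As ADODB.Connection)
    ' Fired when the statement finishes, successfully or not
    Debug.Print "Completed (" & RecordsAffected & " records affected)"
End Sub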