24/7 DB Options/strategies? | SQL Server Performance Forums

SQL Server Performance Forum – Threads Archive

24/7 DB Options/strategies?

Hi, I am looking for advice/options to make our production DB 24/7. Our production DB is a centralised repository of data extracted from multiple enterprise-wide systems. It is refreshed nightly, and the refresh job takes about 30 minutes to 1 hour depending on the volume of changes. During the refresh job, all websites that hit the production DB are unavailable, as the update queries hold locks on most tables, so any queries get timed out. At the moment this update occurs when no one is using our DB. However, we are soon going to be supporting users 24/7, and we can't bring our websites down for an hour while updates take place.

I am looking for options/advice on different strategies to help achieve this. Should I be looking into log shipping or replication for a solution? I am thinking I will need a snapshot of the production DB to point my IIS servers to while the production DB is being refreshed. However, what is the best option for keeping this snapshot up to date? Any pointers to areas or forums to look up greatly appreciated. Cheers
Adrian
You haven't mentioned the version of SQL Server, the operating system, or any other applications that are sharing server resources, along with their service pack information. What do you mean by refreshing data? Could you please explain the process followed here.
If you're using queries, then try to restrict the update to batches (a batching sketch follows this post) or use a BULK INSERT process, and in any case try to identify where it is slowing down by using Profiler. If you want a warm standby server environment, then go for log shipping, which is easy to maintain, and you can use the secondary server for queries in between log restore points. Refer to this website for many tips and articles on log shipping. Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
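To illustrate the batching advice above: on SQL Server 2000 the usual idiom is SET ROWCOUNT in a loop, so each pass touches a limited number of rows and releases its locks before the next pass. This is only a minimal sketch; the dbo.Products and dbo.Products_Staging tables and their columns are hypothetical placeholders, not taken from this thread.

-- Cap every following statement at 10,000 rows (SQL Server 2000 idiom;
-- later versions use UPDATE TOP (n) instead).
SET ROWCOUNT 10000

DECLARE @rows int
SET @rows = 1

WHILE @rows > 0
BEGIN
    -- Rows already updated no longer satisfy the WHERE clause,
    -- so each pass picks up the next 10,000 changed rows.
    UPDATE p
    SET    p.Price = s.Price
    FROM   dbo.Products p
    JOIN   dbo.Products_Staging s ON s.ProductID = p.ProductID
    WHERE  p.Price <> s.Price

    SET @rows = @@ROWCOUNT
END

-- Reset, otherwise subsequent statements stay capped at 10,000 rows.
SET ROWCOUNT 0

Because each batch commits as its own short transaction (under the default autocommit mode), locks are held only briefly per pass, and web queries can slip in between batches instead of timing out.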
Hi, apologies for the missing information; I will now try to explain the complete process. Our production DB is a centralised repository of all our products, with auxiliary information pulled from about 8 source systems. This data is refreshed nightly with the changes that occurred during that day in the source systems. To achieve this we have 2 servers, both running SQL 2000 Standard Edition SP3 on Windows 2003. One sits behind the firewall and is our staging server; the other sits in a DMZ and is our production DB.

On a nightly basis our staging DB runs DTS packages that connect to the different enterprise-wide systems (our source systems) and extract large volumes of data. This data comes from an Oracle DB, flat files, FTP'd text files and a SQL DB. The data is extracted, filtered, validated, formatted and inserted into mirror tables sitting in our staging DB. Next, SQL scripts executed from DTS packages use the data in the mirror tables to update our staging tables, inserting new rows, deleting missing rows and updating existing rows. The staging tables are replicas of our production DB tables. This all takes place on our staging server, as only it can access these systems in the secured area, and also it's quite a heavy, process-intensive job that takes about 4 hours to run.

The next and final part of the nightly job is to refresh the production DB with the changes to the staging data. This is done by running SQL scripts on the staging DB that compare the data in its tables with the equivalent tables on the production DB: inserting rows that exist in staging and not in production, updating rows that exist in both but differ, and deleting rows in production that are not in staging. This final part is the only part that affects production and takes anywhere from 30 minutes to 1 hour to run. The main reason for this is that the remote queries execute locally on staging. The production DB cannot initiate a connection to the staging DB; the staging server initiates the connection to production and runs the queries, which can pull about 1 million rows from production to staging for some tables. This takes time, and during the refresh the main production DB tables are locked, so timeouts occur if you try to access them from our website. So it's during this job that I need to point our IIS servers to a snapshot of the production DB until the refresh is finished. So is log shipping still an option? Any other strategy that you would recommend? Cheers
Adrian
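For readers following along, the staging-to-production comparison described above typically reduces to three set-based statements run from staging over a linked server to production. A rough sketch, in which the linked server PRODSRV, the database ProdDB, and the dbo.Products table with its columns are all hypothetical names, not taken from this thread:

-- 1. Insert rows that exist in staging but not in production.
INSERT INTO PRODSRV.ProdDB.dbo.Products (ProductID, Name, Price)
SELECT s.ProductID, s.Name, s.Price
FROM   dbo.Products s
WHERE  NOT EXISTS (SELECT 1 FROM PRODSRV.ProdDB.dbo.Products p
                   WHERE p.ProductID = s.ProductID)

-- 2. Update rows that exist in both but differ.
UPDATE p
SET    p.Name = s.Name, p.Price = s.Price
FROM   PRODSRV.ProdDB.dbo.Products p
JOIN   dbo.Products s ON s.ProductID = p.ProductID
WHERE  p.Name <> s.Name OR p.Price <> s.Price

-- 3. Delete rows that are no longer present in staging.
DELETE p
FROM   PRODSRV.ProdDB.dbo.Products p
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Products s
                   WHERE s.ProductID = p.ProductID)

Every comparison against the four-part name forces rows to be pulled across the link to the staging side before they can be compared, which is where the 30-60 minutes, and the long-held locks on production, come from.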

If the update process is only happening during the night, then you can use the log shipping process here, enabling the secondary server in STANDBY mode so the application can run queries against it. I'm not sure how, from IIS, you can point to the secondary server for that window and then point back to the primary server. Using the log shipping process you can update the secondary server with transaction log backups from the primary server, managed by scheduled jobs. As you're running SQL Standard Edition it may not be possible to use the log shipping wizard, but you can achieve log shipping using the http://www.sql-server-performance.com/sql_server_log_shipping.asp and http://www.microsoft.com/technet/prodtechnol/sql/2000/maintain/logship1.mspx links (a minimal sketch also follows this post).

On the update process from the staging server to the production server, you can use BCP to export and import rows, which will be faster compared to the current process. Test this in a development environment first if you want to take advantage of it. Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
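As a rough illustration of the custom log shipping approach linked above, the core is two scheduled SQL Agent steps, once the secondary has been seeded with a full backup restored WITH STANDBY. The database name, share, and file paths here are hypothetical placeholders:

-- Step 1, on the primary, every few minutes:
-- back up the transaction log to a share the secondary can reach.
-- (A real setup would use uniquely named backup files rather than INIT.)
BACKUP LOG ProdDB
TO DISK = '\\SECONDARY\LogShip\ProdDB_log.trn'
WITH INIT

-- Step 2, on the secondary:
-- restore WITH STANDBY so the database stays readable between restores.
RESTORE LOG ProdDB
FROM DISK = 'D:\LogShip\ProdDB_log.trn'
WITH STANDBY = 'D:\LogShip\ProdDB_undo.dat'

Note that each RESTORE needs exclusive access to the secondary database, so connections there must be dropped at every restore point; that is the “in between log restore points” caveat mentioned above.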