Transferring data from a read-only source to a destination | SQL Server Performance Forums

SQL Server Performance Forum – Threads Archive


Hi, I have read-only permission on the source (OLTP) database. The source database runs on SQL Server 2000 and is more than 200 GB in size. I need to pull data from the source and load a target database running on SQL Server 2005. These are the objectives I want to achieve:

a) Data should be loaded on an incremental basis.
b) Whatever changes take place in the source (updates/deletes) should be replicated to the already uploaded data.

Here I want to mention that the source database does not have any identification key or timestamp column (like Updated_Date) by which I can filter the recently inserted or updated rows and upload only those. The source does not maintain any history data either, so I have no way to track deleted records. I have no scope to change the schema on the source. In this scenario, can anybody suggest the best approach to achieve the above objectives? I think that if I could retrieve only the changed and new records from a transaction log backup it would solve the problem, but I don't know how to do that. Can log shipping help me?

One more question: say I have a table and I am exporting/importing all the data from/to my target table using SSIS or DTS. Does using a query versus selecting the table directly make a difference to performance?

I'm a bit confused by your subsequent questions as to whether you need to do this for many tables or just one table within the source database. If you are talking about all of the tables, then you need SQL Replication to replicate just the changes from one server to the other. If you are talking about only a few tables, you could do the following:

1. Put TRIGGERs in place on each source table that send the destination server the information it needs. You would need an INSERT, an UPDATE and a DELETE trigger; each would insert an appropriate message into a table on your destination server. The destination server would then run a job every minute, 5 minutes, 10 minutes, or hourly (whatever rate you choose) to process those entries: if the entry is an insert, it inserts the record; if it is an update, it updates; if it is a delete, it deletes. (This is for when you want the destination table available all the time with no downtime. It wouldn't be the plan if you have many tables in place.)

2. There is no way to read the LOG and apply the changes yourself. Log shipping won't help if you want your destination database to be in use. Log shipping is only for a destination server that is just standing by, as though you were manually restoring it and saying "ooooh, here is another log file to process" ... "wait, here is another log file." When you actually need the destination database, you bring it online. That won't help, since I get the impression your destination database is also in live use.

3. I'm not sure what you mean by your question regarding SSIS or DTS. If you don't need the destination table in use, then you can use BCP or any number of other means to move the data and simply replace the destination data completely. Your table size and system performance would determine how frequently you could do that.
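A minimal sketch of the trigger approach in point 1. All object names here are hypothetical (a linked server DestServer, a destination database DestDb, a ChangeLog staging table, and a source table SourceTable with an integer key Id); adapt them to your own schema, and note that creating triggers requires more than read-only rights on the source.

```sql
-- Hypothetical change-log table, created once on the destination server.
CREATE TABLE ChangeLog (
    ChangeId  INT      IDENTITY(1,1) PRIMARY KEY,
    Operation CHAR(1)  NOT NULL,               -- 'I', 'U' or 'D'
    SourcePK  INT      NOT NULL,               -- key of the affected source row
    LoggedAt  DATETIME NOT NULL DEFAULT GETDATE(),
    Processed BIT      NOT NULL DEFAULT 0
);

-- Sample DELETE trigger on the source table, writing across a linked server.
-- The INSERT and UPDATE triggers follow the same pattern, reading from the
-- "inserted" pseudo-table instead of "deleted".
CREATE TRIGGER trg_SourceTable_Delete
ON SourceTable
AFTER DELETE
AS
BEGIN
    INSERT INTO DestServer.DestDb.dbo.ChangeLog (Operation, SourcePK)
    SELECT 'D', d.Id
    FROM deleted d;
END
```

The scheduled job on the destination would then read unprocessed ChangeLog rows in ChangeId order, apply each insert/update/delete to the target table, and set Processed = 1.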
Hope it helps,
Dalton
Blessings aren't so much a matter of receiving them as they are a matter of recognizing what you have received.