I've got small databases used to write log data on multiple servers in different locations around the world. None of the servers can be relied upon 100% due to the variable network conditions at each location, but each is a self-contained system logging only its own data. Between them they generate approximately 50,000 records per day at the moment, and I expect that to increase as more servers are added over the next year.

What I need to do is collect the log records from the originating servers into a central database used for statistical analysis. The current approach is a nightly job that calls a web service hosted on each logging server. The service returns the day's records and deletes them from the log database, both to keep the size down and to prevent them from being collected more than once.

I've been looking at the various database replication methods with a view to setting the central server up as a subscriber and each of the log servers as a publisher, but I'm getting confused and I don't think that's going to work. Wouldn't replication delete the records from the stats server when they get deleted from the logging servers? Everything I've seen seems designed to keep a table exactly duplicated in two places, rather than to transfer the data from one to the other, which is what I need.

Any suggestions would be appreciated.
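For context, here's a minimal sketch of what the nightly job does. The server URLs, the endpoint name, and the record fields are all placeholders rather than my real system, and I'm using SQLite for the central store just to keep the example self-contained:

```python
# Illustrative sketch of the nightly collection job. URLs, the
# endpoint path, and the record schema are placeholders.
import sqlite3
import requests

LOG_SERVERS = [
    "https://logs-eu.example.com",
    "https://logs-us.example.com",
]

def collect_nightly(central_db="stats.db"):
    conn = sqlite3.connect(central_db)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS log_records ("
        "  server TEXT, logged_at TEXT, message TEXT)"
    )
    for server in LOG_SERVERS:
        try:
            # One call: the web service hands back the day's records
            # and deletes them from the local log database.
            resp = requests.get(f"{server}/logs/collect", timeout=60)
            resp.raise_for_status()
        except requests.RequestException:
            continue  # server unreachable tonight; try again next run
        conn.executemany(
            "INSERT INTO log_records (server, logged_at, message)"
            " VALUES (?, ?, ?)",
            [(server, r["logged_at"], r["message"]) for r in resp.json()],
        )
        conn.commit()
    conn.close()

if __name__ == "__main__":
    collect_nightly()
```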