I have an interesting problem related to replication. Our clients have two ways of using our software: a Web interface that modifies our central database directly, or installed desktop software that has its own autonomous database. For clients who use the installed software, one of the system requirements is that they must be able to use the software even without an internet connection. We facilitate that by giving them their own local database, and a custom replication scheme then uses BCP to copy new data from their local database to our central database, and vice versa.

The problem I run into is this: when a desktop client uploads its BCP files, the rows are inserted into a temporary table, which is then joined with the corresponding table in the central database, and the central table is updated with the new data from the temp table. A single update can touch millions of records, which results in an exclusive table lock on the central table. While that lock is in place, our clients who use the Web interface are out of luck: they can neither select nor update rows via the Web.

I'm hoping that someone has run into this type of problem before and might have some creative solutions for avoiding it. I have no problem switching to a multi-database model, so that perhaps the Web clients use a different database than the desktop clients. The catch is that even with a multi-db model I still have to merge replicate data between those two databases, and if that happens while one of my massive BCP update processes has a table locked, I have the same problem I described earlier. I appreciate any help offered.
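For reference, the update step is essentially the pattern below. All table, column, and file names here are hypothetical placeholders, not our real schema, but the shape is the same: bulk-load into a staging table, then join-and-update the central table.

```sql
-- Load the client's BCP file into a staging table (placeholder names).
BULK INSERT Staging_Orders
FROM 'C:\uploads\client123\orders.bcp';

-- Join the staging table to the central table and apply the new data.
-- When millions of rows match, the row/page locks taken here get
-- escalated to an exclusive lock on the whole Orders table, which is
-- what blocks the Web clients' SELECTs and UPDATEs.
UPDATE o
SET    o.Amount       = s.Amount,
       o.ModifiedDate = s.ModifiedDate
FROM   dbo.Orders AS o
JOIN   dbo.Staging_Orders AS s
       ON s.OrderID = o.OrderID;
```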