SQL Server Performance Forum – Threads Archive
Locking Question
We are building an application that will need to "synch" a few tables several times a day. Right now we are going to be using DTS packages that simply select data from the source tables, which will then be inserted into the target/destination tables. The packages will actually be fairly quick, but I need to keep users out of the table for the few seconds that they will run. Here are the tasks:
1) Delete from TheTargetTable
2) Execute transformation
3) Insert into TheTargetTable
I am thinking I should use TABLOCKX during this process since the table will be empty for a few seconds and, if I remember correctly, the exclusive lock will return an error until the job is completed. Not sure if this is the best way to do this, but would appreciate any help. Also, if it is decided to use the TABLOCKX option, how would I inject the code into the DTS workflow?
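One way the three steps above could be wired into the package is an Execute SQL task running both statements in a single transaction, so the exclusive lock is held from the delete through the insert. This is only a sketch; `TheTargetTable`, `TheSourceTable`, and the column list are placeholders for your actual objects. Note that, by default, sessions blocked by the exclusive lock simply wait rather than receive an error; they only fail (error 1222) if they have run SET LOCK_TIMEOUT.

```sql
BEGIN TRANSACTION;

-- Take an exclusive table lock up front; it is held until COMMIT,
-- so other sessions cannot see the briefly-empty table.
DELETE FROM TheTargetTable WITH (TABLOCKX);

-- Reload from the source (stand-in for the DTS transformation step).
INSERT INTO TheTargetTable (col1, col2)
SELECT col1, col2
FROM TheSourceTable;

COMMIT TRANSACTION;
```

Keeping both statements in one batch inside one transaction is what makes the lock cover the whole window; splitting them across separate DTS tasks would release the lock between steps unless the package itself is configured to join them in a transaction.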
If the insert operation is in progress, users will not have access to those rows anyway, so why apply a separate table lock? If you're getting any issues with the current process, try to solve those instead of making it more complicated. Satya SKJ
This posting is provided "AS IS" with no rights for the sake of knowledge sharing.
You can also try this:
1. create table tmp_target with the same structure as the target table.
2. copy rows to the tmp_target table.
3. drop target table
4. sp_rename 'tmp_target', 'target'
Advantage: All rows will be accessible all the time except for a few ms.
Drawback: Risk of error if a query hits the table during the few ms of dropping/renaming.
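The four steps above could look roughly like this (a sketch only; `target` and `TheSourceTable` are placeholders, and note that SELECT INTO copies the structure and data but not indexes, constraints, or permissions, which you would need to recreate on tmp_target before the swap):

```sql
-- Steps 1 and 2: build and load the shadow copy while users
-- keep reading the live table undisturbed.
SELECT *
INTO tmp_target
FROM TheSourceTable;

-- Steps 3 and 4: the swap itself, kept as short as possible.
BEGIN TRANSACTION;
DROP TABLE target;
EXEC sp_rename 'tmp_target', 'target';
COMMIT TRANSACTION;
```

sp_rename will print a caution about renaming objects breaking scripts and stored procedures that reference the old name by id; that is expected here, but any code compiled against the dropped table should be tested after the swap.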