SQL Server Performance Forum – Threads Archive
update a large table from other table

hi guys, please help me with this case: I have a large table of vehicles, with more than 3 million records.
As the business grows, my boss requires an additional column; in this case a field "model" should be added to the table.
Soon after I added the new column, I had to populate it for each record from another table (temp1), with this SQL query:

update vehicle
set vehicle.model = temp1.model
from vehicle, temp1
where vehicle.id = temp1.id
and vehicle.model is null

The problem is: it took a very long time (37 minutes) for the statement to finish. Does anyone have experience with how to do this faster than I did? I need your advice guys… thx a lot… thx in advance…
I suppose you added the column using "alter table add yourcolumn datatype NULL"
If you had used "alter table add yourcolumn datatype NOT NULL default 'dummyvalue'" you would have paid the relocation price at alter time! Page splits may occur during this operation, making it take more time to execute. If you check the table's statistics, you'll see a bunch of relocated rows. Rebuild the table and its indexes if needed. If you use the 'dummyvalue', make it as long as the maximum length of the data you are going to insert with your update query. That way the rows don't need to be relocated at update time.
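To illustrate the suggestion above, a minimal sketch; the column type and the padded dummy value are assumptions, not from the original thread:

```sql
-- Sketch: add the column NOT NULL with a dummy default padded to the
-- expected maximum width, so each row is written at (near) its final
-- size up front and doesn't need to be relocated when the real model
-- values arrive in the later update.
ALTER TABLE vehicle
    ADD model varchar(20) NOT NULL DEFAULT 'XXXXXXXXXXXXXXXXXXXX';
```

Note this makes the ALTER itself expensive (every row is touched), but you pay that cost once, at a time of your choosing.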
If you are updating this table in a single transaction, it is going to put exclusive locks on the table… meaning the table can't be accessed by anyone else… If you are not concerned about the blocking and the table being offline, then you can use the method you tested…
Even if you create the column with a default… it takes a long time… Make sure you have an index on vehicle id and model, and also an index on the temp table on vehicle id… If you want to make this an online operation, so that users can still access the table… use a while loop and update 1/10/1000 rows at a time… Mohammed U.
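A sketch of the batched while-loop approach described above, assuming SQL Server 2005's UPDATE TOP syntax (on SQL Server 2000 you could use SET ROWCOUNT instead); the batch size and index name are illustrative:

```sql
-- Supporting index for the join (name illustrative; skip it if id is
-- already the primary key or otherwise indexed).
CREATE INDEX ix_temp1_id ON temp1 (id);

-- Update in small batches so exclusive locks are held only briefly
-- and other sessions can keep using the table between batches.
WHILE 1 = 1
BEGIN
    UPDATE TOP (1000) v
    SET v.model = t.model
    FROM vehicle AS v
    JOIN temp1 AS t ON t.id = v.id
    WHERE v.model IS NULL;

    IF @@ROWCOUNT = 0 BREAK;   -- no NULL rows left: done
END
```

The `WHERE v.model IS NULL` filter is what lets each iteration pick up where the last one left off.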