SQL Server Performance Forum – Threads Archive

Commit small chunks of Data for Large Data Sets

When inserting a large number of rows into a SQL Server 2000 database, we need to commit the transactions in small chunks of data instead of inserting/updating one large dataset. Committing small chunks of data avoids system resource issues such as high memory and CPU utilization; inserts/updates of large data sets are always a problem for memory. I have to insert 85,000,000 records into a table in a transaction and have limited resources, so I want to use a COMMIT statement in the transaction so that it commits every 25,000 records. If anyone knows a T-SQL script for this process, please help me. If you need more information, please feel free to ask.
Thanks,
BK Bhushan Kalla
I’ve moved this to the relevant forum.
Luis Martin
Moderator
SQL-Server-Performance.com
“Although nature commences with reason and ends in experience it is necessary for us to do the opposite, that is to commence with experience and from this to proceed to investigate the reason.” (Leonardo Da Vinci)
“Never expect recognition from your children; that will come only after your death.”
All postings are provided “AS IS” with no warranties for accuracy.
As usual, it depends on requirement details you haven’t provided. Assuming you have a source table of 85 million rows with an identity column, the fastest solution (AFAIK) that commits up to 25,000 rows per iteration (up to, because there may be gaps in the identity values) is:

declare @minID int
declare @maxID int
declare @batchStart int
declare @batchSize int

set @batchSize = 25000

select @minID = min(id), @maxID = max(id)
from sourceTable

set @batchStart = @minID

while @batchStart <= @maxID
begin
    -- each INSERT runs as its own autocommit transaction,
    -- so every batch of up to 25,000 rows is committed on its own
    insert into DestinationTable(….)
    select …
    from sourceTable s
    where s.id between @batchStart and @batchStart + @batchSize - 1

    set @batchStart = @batchStart + @batchSize
end
You may also add error-handling code.
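For instance, here is a minimal sketch of per-batch error handling for SQL Server 2000, which predates TRY…CATCH, so @@ERROR must be captured right after each statement. The table and column names (sourceTable(id, payload), DestinationTable(id, payload)) are placeholders for illustration, not taken from the thread:

-- sketch only: sourceTable(id, payload) and DestinationTable(id, payload)
-- are hypothetical names; adapt to your actual schema
declare @minID int, @maxID int, @batchStart int, @batchSize int, @err int

set @batchSize = 25000

select @minID = min(id), @maxID = max(id)
from sourceTable

set @batchStart = @minID

while @batchStart <= @maxID
begin
    insert into DestinationTable (id, payload)
    select id, payload
    from sourceTable
    where id between @batchStart and @batchStart + @batchSize - 1

    set @err = @@ERROR    -- capture immediately; the next statement resets @@ERROR
    if @err <> 0
    begin
        raiserror('Batch starting at id %d failed with error %d, stopping.',
                  16, 1, @batchStart, @err)
        break             -- stop the loop; earlier batches stay committed
    end

    set @batchStart = @batchStart + @batchSize
end

Because each completed batch was committed by its own autocommit transaction, a failure only stops further work; rows inserted by earlier iterations remain in the destination table, so you can fix the problem and restart the loop from the failed @batchStart value.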