Reduced Performance until Instance Restart | SQL Server Performance Forums

SQL Server Performance Forum – Threads Archive

Reduced Performance until Instance Restart


Hi,

I am running SQL Server 2000 Personal Edition (SP3a) on my PC (Windows 2000 Professional, Service Pack 3). The PC has 512 MB RAM and a 1.66 GHz P4 CPU, and several GB of free disk space is available.

I have a batch that (a) inserts ~40,000 rows into a 'regular' table via a subquery (i.e., INSERT INTO tab1 (c1, c2) SELECT col1, col2 FROM tab2), followed by (b) a procedure that works on this data. The data file (.mdf) is only ~128 MB; the log file is ~24 MB. Though I am on a LAN, I am the lone user and no other user or program is accessing my PC.

My issue: after starting the SQL Server instance on my PC, the first time I run this batch (from a .NET 1.1 client) it completes in about 5 minutes, which is quite acceptable. Every subsequent invocation (until I restart the instance) takes 8.5 minutes. If I restart the instance, the task again takes 5 minutes the first time and 8.5 minutes on every later run. To the best of my understanding, exactly the same code runs every time; even the parameter inputs and the user data remain unchanged.

My question: what are the specific possibilities on which I should gather more information from my ill-behaving system, in order to get consistent performance?

Thanks and sincerely, Subhasis.
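One way to start narrowing down the first-run vs. later-run difference is to capture a few measurements around each run. A minimal diagnostic sketch for SQL Server 2000 (table name `tab1` comes from the thread; run these in the user database):

```sql
-- Show CPU/elapsed time and logical/physical reads per statement,
-- so the two runs can be compared line by line:
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

-- Check whether the transaction log grows or fills between runs:
DBCC SQLPERF(LOGSPACE);

-- Check fragmentation of the target table after each run:
DBCC SHOWCONTIG ('tab1');
```

If the second run shows many more physical reads or a much fuller log, that points at the buffer cache or log growth rather than the query plan.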
Do you always insert data into an empty table, or do you actually append data with each execution?
Why would you be using a .NET 1.1 client? Anyway, what about the file growth settings for the data and log files of your DB? You could try increasing the file size of both before the insert, so that file growth doesn't take up time during the insertion. You could also do a bulk insert – there is a way to do inserts without them being logged in detail. Check BOL.
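As a sketch of the pre-sizing suggestion (the database name `MyDb` and the logical file names are placeholders; check the real logical names with `sp_helpfile`):

```sql
-- Grow the files once, up front, so autogrow does not fire mid-insert:
ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Data, SIZE = 256MB);
ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Log,  SIZE = 64MB);

-- SELECT INTO is one of the minimally logged operations under the
-- simple or bulk-logged recovery model (tab2_copy is a placeholder):
SELECT col1, col2
INTO tab2_copy
FROM tab2;
```

Minimal logging only applies under the right recovery model; for a repeated truncate-and-reload job, the simple recovery model is usually the pragmatic choice.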
quote:Originally posted by mmarovic Do you always insert data into an empty table, or do you actually append data with each execution?

I always truncate the table.
Then I insert data into the empty table.
It is a ‘normal’ table.
Thanks, Subhasis.
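Since the table is truncated and reloaded each run, the cached plan and statistics from the first run may no longer fit later runs. A sketch worth testing (names from the thread; `MyProc` is a placeholder for the procedure mentioned in the first post):

```sql
-- The batch as described:
TRUNCATE TABLE tab1;               -- fast, minimally logged
INSERT INTO tab1 (c1, c2)
SELECT col1, col2 FROM tab2;

-- If later runs reuse a stale plan, refresh statistics and mark
-- plans touching tab1 for recompilation before the procedure runs:
UPDATE STATISTICS tab1;
EXEC sp_recompile 'tab1';
EXEC MyProc;
```

If the 8.5-minute runs drop back to ~5 minutes with the recompile in place, the difference is a plan/statistics issue rather than I/O.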
quote:Originally posted by Adriaan Why would you be using a .NET 1.1 client? Anyway, what about the file growth settings for the data and log files of your DB? You could try increasing the file size of both before the insert, so that file growth doesn't take up time during the insertion. You could also do a bulk insert – there is a way to do inserts without them being logged in detail. Check BOL.

Well, ultimately it is a .NET application. And the database is rather small – the growth setting remains at its default of 10% for both files. What I am still pondering is this: what factor gives us the better (i.e., acceptable) performance on a fresh instance? Thanks, Subhasis.

It starts as a small database – how big is it after you insert the data?
quote:Originally posted by Adriaan It starts as a small database – how big is it after you insert the data?
Dear Adriaan,
I see that it is 130 MB after I have inserted the data, with no appreciable change in size as I re-execute the task. My program truncates the data and re-inserts it.
Thanks, Subhasis.
130 MB after the insert – but how much is it before? Also, what are the log file sizes, before and after?
quote:Originally posted by Adriaan 130 MB after the insert – but how much is it before?
Also, what are the LOG file sizes, before and after?

Dear Adriaan,

I have already increased the size of the user database to 320 MB (256 MB for data and 64 MB for log), and I also increased tempdb to 320 MB (likewise 256 MB for data and 64 MB for log). Then I restarted the instance.

– Before processing, it shows 235.51 MB free in the user database and 317.59 MB free in tempdb.
– After processing, it shows 222.76 MB free in the user database and 310.95 MB free in tempdb.
– Observing this, I immediately re-executed the task: no appreciable difference in free space!

Thanks, Subhasis.
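For reference, the free-space numbers above can also be read from T-SQL rather than Enterprise Manager, which makes it easy to log them before and after each run (a sketch; run the first call in the user database):

```sql
-- Reserved, data, index, and unused space for the current database:
EXEC sp_spaceused;

USE tempdb;
EXEC sp_spaceused;

-- Log size and percentage used for every database:
DBCC SQLPERF(LOGSPACE);
```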