
SQL Server Performance Degradation

Discussion in 'Performance Tuning for DBAs' started by markblansjaar, Mar 10, 2003.

  1. markblansjaar New Member

    Here is some background that may help answer my questions.

    We have recently completed a project utilising SQL Server 2000 as a back-end. The system consists of a central SQL Server (running on a Windows 2000 Server) with up to 10 Client applications connected simultaneously (small potatoes, I would have thought).

    Each Client application executes a number of queries (via stored procedures) on a periodic basis (30 seconds). The code has been written so that the Client applications open a number of permanent ODBC connections for the purpose of doing their work.

    In addition to the Client applications there are a number of Web Assistant Jobs running every minute (I have 'phased' these to spread the workload over the minute).

    The problem we have is that over time, the average CPU usage of the Server is creeping up. It starts at about 15% and takes about 5 days to get to 60-70% (Processor Queue Length averages 5 at this point). We have told our Customer to stop and restart the SQL Server periodically until we sort the problem out. This, however, is very inconvenient for them (it holds up production, and is therefore usually a last resort).

    Using Performance Monitor, I have determined that it is probably the User Connections that are causing the problem, because the % User Time counter follows the % Processor Time counter and the Processor Queue Length counter.
    I also noted that the @@CONNECTIONS variable in SQL Server seems to be increasing at a phenomenal rate.
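
    One way to put a number on that growth (a rough T-SQL sketch to run from Query Analyzer; the one-minute delay is just an arbitrary sampling interval) is to read @@CONNECTIONS twice and take the difference:

        -- sample @@CONNECTIONS a minute apart to estimate new connection attempts per minute
        DECLARE @before int, @after int;
        SELECT @before = @@CONNECTIONS;
        WAITFOR DELAY '00:01:00';
        SELECT @after = @@CONNECTIONS;
        SELECT @after - @before AS connection_attempts_last_minute;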

    Here are my Questions:

    Is it normal for the @@CONNECTIONS variable to increase rapidly? I expected it to be reasonably stable, since our Client Applications do not continually open and close their ODBC connections.

    Could our problems be due to the fact that the Windows 2000 Server (on which the SQL Server software is running) is set up as a Domain Controller?

    At one point we suggested that our Customer buy an additional CPU in view of the average Processor Queue Length. Would this guarantee the problem was fixed? We later retracted our suggestion since we were worried that even with an additional CPU the problem would still be there, just with a longer 'gestation' period.

    Are there any other things that I could check?

    I would be very grateful if someone could help me, or if I could hear from anyone who has experienced similar problems.

    Thanks and Best Regards,
    Mark Blansjaar
  2. Chappy New Member

    I think the architecture of the system might have a few fundamental flaws, and therefore it is not safe to say adding an extra CPU would fix the problem. It might alleviate it, the result being that it may just take longer for the server to grind to a halt. This is just a hunch though; it would need more investigation.

    A better idea from my point of view would be to give the role of domain controller to another machine. A good general rule is that a SQL Server box should be as dedicated as possible: if other processes frequently require SQL Server to release memory back to the OS, that can be detrimental to performance too. I get the impression this is not really an option for you, though!

    Also, it's generally better not to hold a permanently open connection to the database. I wouldn't have expected the connections to continually increase, but it's just not very scalable. Granted, you're only using 10 connections right now, but the rules of scalability are often good to follow even for small applications.

    I would check your app code very carefully. Could it be that the vast majority of your code is using the pre-opened connections, but perhaps one routine somewhere is opening another connection and then not closing it? Or perhaps the variable holding the connection is sometimes being recreated with a new connection, and the old connection isn't being disposed of?
    Are you sure the app actually closes its connections gracefully when it exits? I'm more familiar with ADO than ODBC, but perhaps orphaned ODBC connections could linger if they're not disposed of explicitly.
    Just stabs in the dark...

    Finally, a useful tool is in Enterprise Manager: go into Management, then Current Activity. Here you will see a list of all open connections to the server, and a very rough summary of their most recent activity. Sort the list by Hostname (or NT login name if possible) to determine which machines are eating the most connections. (See the sp_who and sp_who2 stored procs if you prefer the console approach.)
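
    If you'd rather query it directly, something along these lines (a rough sketch against the SQL Server 2000 sysprocesses table; the spid > 50 filter is just the usual heuristic for skipping system spids) will count open connections per client machine and login:

        -- count open connections per client machine, login and application
        SELECT hostname, loginame, program_name, COUNT(*) AS open_connections
        FROM master.dbo.sysprocesses
        WHERE spid > 50
        GROUP BY hostname, loginame, program_name
        ORDER BY open_connections DESC;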

    Good luck!
  3. Chappy New Member

    Also, run Profiler against the SQL Server, tracing the login and logout audit events. Then fire up your app and start playing around in it, to see if any more logins are generated after the initial ones. You could probably do this initial test against a dev server, and then try it on the live server if dev shows nothing of interest.
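
    If you prefer a script to the Profiler GUI, a server-side trace can capture the same audit events. This is only a sketch: the file path is hypothetical (it must be writable by the SQL Server service account), and you read the trace file back later with ::fn_trace_gettable.

        -- create a trace that captures Audit Login (event 14) and Audit Logout (event 15)
        DECLARE @rc int, @TraceID int, @maxfilesize bigint, @on bit;
        SET @maxfilesize = 5;
        SET @on = 1;
        EXEC @rc = sp_trace_create @TraceID OUTPUT, 0, N'C:\Traces\login_audit', @maxfilesize, NULL;
        IF @rc <> 0 PRINT 'sp_trace_create failed';

        -- columns: 8 = HostName, 10 = ApplicationName, 11 = LoginName, 12 = SPID, 14 = StartTime
        EXEC sp_trace_setevent @TraceID, 14, 8,  @on;
        EXEC sp_trace_setevent @TraceID, 14, 10, @on;
        EXEC sp_trace_setevent @TraceID, 14, 11, @on;
        EXEC sp_trace_setevent @TraceID, 14, 12, @on;
        EXEC sp_trace_setevent @TraceID, 14, 14, @on;
        EXEC sp_trace_setevent @TraceID, 15, 8,  @on;
        EXEC sp_trace_setevent @TraceID, 15, 11, @on;
        EXEC sp_trace_setevent @TraceID, 15, 12, @on;
        EXEC sp_trace_setevent @TraceID, 15, 14, @on;

        -- start the trace (status 1 = start, 0 = stop, 2 = close and delete the definition)
        EXEC sp_trace_setstatus @TraceID, 1;
        SELECT @TraceID AS trace_id;

        -- later: SELECT * FROM ::fn_trace_gettable('C:\Traces\login_audit.trc', DEFAULT)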
  4. trifunk New Member

    I think @@CONNECTIONS returns the number of connections that have been made, or that applications have tried to make, not the current number of connections to the server.
    Since your client applications don't continually make these connections while they are running, could it be that the applications are being started and stopped frequently?

    As a side note, I think it's generally better to only make connections as and when you need them and to release them directly afterwards. Are you taking advantage of ODBC connection pooling? That would take away the need for keeping the connection open if you're worried about the cost of opening and closing connections all the time. I don't know the nature of your application, so it might be that your design suits your needs best, but it's not a really scalable architecture; something to think about. Also, as there might be different points of exiting the application, you might not release all resources at every possible point of exit.
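
    Because @@CONNECTIONS is cumulative since the last restart, a quick way to see the distinction (a rough sketch; again, spid > 50 is just the usual heuristic for excluding system spids on SQL Server 2000) is to put it next to the number of connections actually open right now:

        -- cumulative connection attempts vs. connections open at this moment
        SELECT
            @@CONNECTIONS AS connection_attempts_since_restart,
            (SELECT COUNT(*) FROM master.dbo.sysprocesses WHERE spid > 50) AS current_connections;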

    What do the Web Assistant jobs do? As these are connecting every minute, surely this would also run up the @@CONNECTIONS counter on the server?

    Sorry I'm not really helping that much, just throwing a few ideas in the air. Could you tell us what your applications do against the database (long queries, use of cursors, numerous inserts, etc.)?

    Cheers
    Shaun

    World Domination Through Superior Software
  5. dtipton New Member

    From BOL:

    @@CONNECTIONS
    Returns the number of connections, or attempted connections, since Microsoft® SQL Server™ was last started.

    Bad idea to have SQL Server running on a Domain Controller.

    What value do you have in the performance counter
    SQL Server:General Statistics:User Connections?

    Does this match the Static Pool of ODBC connections allocated for the application?
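
    (For reference, you can read that counter from inside SQL Server as well; a small sketch against the SQL Server 2000 sysperfinfo table:)

        -- SQL Server:General Statistics:User Connections, read via sysperfinfo
        SELECT cntr_value AS user_connections
        FROM master.dbo.sysperfinfo
        WHERE object_name LIKE '%General Statistics%'
          AND counter_name = 'User Connections';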

    Any thought given to using OLE DB instead of ODBC?


  6. bradmcgehee New Member

    All the suggestions you have gotten so far are good, and you should explore them. My gut feeling is that it is the application causing the problem, not SQL Server or the hardware. You may have to run extensive Profiler traces and Performance Monitor logs to track down exactly what the problem is. Also, look for potential memory leaks on the server and watch how much memory SQL Server has available. It should remain more or less steady. If it decreases over time, then a memory leak is a potential factor.
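
    One low-effort way to watch this (a rough sketch; counter names can vary slightly between builds, hence the LIKE patterns) is to log SQL Server's own memory counters from sysperfinfo at regular intervals:

        -- how much memory SQL Server currently holds vs. what it wants
        SELECT RTRIM(counter_name) AS counter_name, cntr_value AS kb
        FROM master.dbo.sysperfinfo
        WHERE object_name LIKE '%Memory Manager%'
          AND (counter_name LIKE 'Total Server Memory%'
               OR counter_name LIKE 'Target Server Memory%');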

    ------------------
    Brad M. McGehee
    Webmaster
    SQL-Server-Performance.Com
  7. markblansjaar New Member

    Thanks to everyone so far !
    Based on your comments/suggestions we have come up with a plan of attack. Updates will follow as we fix/discover problems.

    Regards,
    Mark B.
  8. bytehd New Member

    The app seems to be the culprit.
    I designed a system that worked el fine-o until we started doing ODBC over the 'net.

    You said:
    "Using Performance Monitor, I have determined that it is probably the User Connections that are causing the problem because the %User Time counter follows the %Process Time counter and Processor Queue Length counter.
    I also noted that the @@CONNECTIONS variable in SQL Server seems to be increasing at a phenomenal rate."

    1) % User Time vs. % Privileged Time tells you whether the CPU is going to user-mode (SQL Server) code or to Windows kernel code.
    Truthfully, under the sheets, it's amazing to realize how many MS apps are secretly allowed into the kernel. SQL Server is one of them.

    2) What is a 'phenomenal' rate? 1,000, 10,000, a million connections?
    When I realized I needed to turn ON shared ODBC connections in Visual FoxPro, my app got some performance back.

  9. bytehd New Member

    I also seem to recall that SQL Server keeps logins/connections open after you disconnect, because Windows keeps them open.
    (After all, opening a TCP/IP connection is not an app- or service-level operation.)
  10. markblansjaar New Member

    The problem is fixed! (99.9% sure)
    We modified the way our application handles ODBC connections, as suggested by Chappy. The application now opens and closes a connection every time it needs to do some work on the SQL Server.

    Thanks again!

    Regards,
    Mark B.
