SQL Server Performance

Slow response of stored procedure first time called

Discussion in 'ALL SQL SERVER QUESTIONS' started by chmbrssh, Aug 21, 2012.

  1. chmbrssh New Member

    Hi,
    I have noticed that a stored procedure takes up to 40 seconds the first time it is run, and then less than 1 second on subsequent calls. As explained in an excellent article (http://blog.sqlauthority.com/2010/0...-and-reduce-recompilation-performance-tuning/), this is because on first execution a plan has to be compiled and an entry created in dm_exec_cached_plans. Once that entry exists, the database has an execution plan and the query runs much faster.

    This also explains why, if you stop and restart SQL Server, this entry in dm_exec_cached_plans will disappear and will need to be recreated by running the procedure again.
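    You can see this behaviour for yourself by querying the plan cache DMVs. A minimal sketch, assuming a procedure called MyProc (the name is a placeholder for one of your own procedures):

    ```sql
    -- Check whether a plan for a given stored procedure is currently cached,
    -- and how many times it has been reused (usecounts).
    SELECT cp.usecounts, cp.objtype, cp.size_in_bytes, st.text
    FROM sys.dm_exec_cached_plans AS cp
    CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
    WHERE cp.objtype = 'Proc'
      AND st.text LIKE '%MyProc%';   -- MyProc is a hypothetical procedure name
    ```

    If this returns no rows before the first call and one row afterwards, you are seeing exactly the compile-then-cache behaviour described above.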

    We are about to deploy our solution to the client site, and I want to ensure that this application (a Silverlight app with a SQL Server back end) runs as fast as possible the first time.
    From my understanding I would need to run the stored procedures beforehand so that entries are made in dm_exec_cached_plans.

    My questions around this are:
    1) If the data in the underlying tables is the same, does the stored procedure need to be called with different parameter variations so that new entries can be inserted into dm_exec_cached_plans, or will the same stored procedure reuse the same dm_exec_cached_plans entry regardless of parameter values?
    I ask as the article mentions that:

    "It could be that the data amount in underlying tables changed so much that the previously created plan is not optimized"

    This refers to the underlying physical table, which is different from the derived result set my stored procedure returns, which, depending on the parameters passed, can return very different amounts of data.
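    On the plan-reuse point: in general SQL Server compiles one plan per procedure using the first call's parameter values (often called "parameter sniffing") and reuses it for later calls, regardless of their values. If different parameter values genuinely need very different plans, one option is to recompile per execution. A hedged sketch, where the procedure and table names are hypothetical:

    ```sql
    -- Sketch: OPTION (RECOMPILE) forces a fresh plan on every execution,
    -- trading a small compile cost for a plan fitted to the actual values.
    CREATE PROCEDURE dbo.GetOrders   -- hypothetical procedure
        @CustomerId INT
    AS
    BEGIN
        SELECT OrderId, OrderDate
        FROM dbo.Orders              -- hypothetical table
        WHERE CustomerId = @CustomerId
        OPTION (RECOMPILE);
    END;
    ```

    Whether this is worth the per-call compile cost depends on how skewed the data is; for evenly distributed data the single cached plan is usually fine.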

    2) In the same vein: do I need to log in as each user of the Silverlight application so that an execution plan is stored for each one, or is it sufficient to log in as a single user, which will create an entry in dm_exec_cached_plans, so that every user then gets the page faster when they first log in?
    So you can see what I want to do: on a new deployment or restore, set up a process of running certain stored procedures, or logging into the application as different users, so that the user experience is as good as it can be.
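    That deployment step can be as simple as a warm-up script. A minimal sketch, assuming procedure names and parameter values that stand in for your own most-used calls:

    ```sql
    -- Hypothetical warm-up script, run once after deployment or restore:
    -- execute the most-used procedures with representative parameters so
    -- their plans are compiled and cached before the first real user logs in.
    EXEC dbo.GetDashboardData @UserId = 1;       -- placeholder procedure/values
    EXEC dbo.GetOrders        @CustomerId = 42;  -- placeholder procedure/values
    ```

    Running this from any one connection is enough; the plan cache is shared across connections, so subsequent users benefit from the same cached plans.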

    Any suggestions on what I am sure is an existing and hopefully already-solved issue would be greatly appreciated :)
  2. FrankKalis Moderator

    Welcome to the forums!
    I'm not sure I fully understand you, but I think you really can't and shouldn't preoptimise things. Of course you can run some dry runs of your application in the client environment before you hand it over to the client, but over time plans that were optimal at one point can become less optimal and would have to be re-evaluated and possibly replaced by "better" ones. Nothing you can do about that. This happens quite naturally. However, SQL Server does quite a good job at working out those details by itself, but it can't hurt to understand some of the internals behind the scenes. One of the best whitepapers of this topic is this one: Batch Compilation, Recompilation, and Plan Caching Issues in SQL Server 2005

    After reading this (and possibly some more information like this: http://www.red-gate.com/community/books/sql-server-statistics) you should be in a better position to decide on your strategy to provide "best" performance for your application.
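    When a cached plan does go stale after large data changes, you don't have to clear the whole cache. A hedged sketch of a more targeted approach, with placeholder object names:

    ```sql
    -- Refresh statistics on the table the plan depends on, then mark just
    -- this one procedure so it is recompiled on its next execution.
    UPDATE STATISTICS dbo.Orders;    -- dbo.Orders is a hypothetical table
    EXEC sp_recompile 'dbo.GetOrders';  -- hypothetical procedure name
    ```

    This keeps every other cached plan intact, so only the affected procedure pays the one-off compile cost again.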
  3. chmbrssh New Member

    Hiya,

    Thanks for replying so soon :)
    I have read a few articles on this and have followed some practices for avoiding recompilation (e.g. avoiding temp tables, interleaved DDL and DML, etc.), but I will look at the 2005 whitepaper you mention. It seems there is nothing I can do about the app being slow the first time a user runs it, even though it takes under 1 second after that. It just means I need to explain this to the client, as it will seem strange to them.

    I am using SQL Server 2008 R2, by the way, if that helps.

    cheers
