SQL Server Performance

Timing Out

Discussion in 'Performance Tuning for DBAs' started by surendrakalekar, Nov 23, 2005.

  1. surendrakalekar New Member

    One of our SPs is timing out on the testing server, but it works fine on the main production server. I checked the table structures and indexes related to that SP and they are the same on both servers.

    How can I find out why it is timing out?
    Let me know if you have a checklist to verify.


    Surendra Kalekar

  2. Adriaan New Member

    Could be ...

    ... the number of rows being processed is just X times bigger on the test server than on the production server

    ... there's a big backup operation running on the test server - when it ends, performance should improve again

    ... the server has lower specs for RAM, disk space, CPU, clock speed, bus speed

    ... the log file has to grow and the file growth setting is not sympathetic

    ... the SP has a loop somewhere and you're not getting out of the loop

    and so on, and so on.
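    A few of the causes above can be checked quickly with standard system procedures. A minimal sketch, assuming the SQL Server 2000 era this thread dates from, run in the database that owns the SP:

    ```sql
    -- Quick checks for some of the causes listed above
    EXEC sp_helpfile           -- data/log file sizes and autogrowth settings
    EXEC sp_who2               -- blocking, long-running backups, other busy sessions
    DBCC SQLPERF(LOGSPACE)     -- log space used per database
    ```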
  3. Twan New Member

    Hi ya,

    you could run SQL Profiler looking for the SP:StmtStarting and SP:StmtCompleted events to see where the proc is slower

    Cheers
    Twan
  4. surendrakalekar New Member

    Thanks Adriaan and Twan.
    The problem is still there... As per my checklist and yours, nothing is wrong, but I don't know why that particular SP is timing out. SQL Profiler is also not helpful in this case.


    Surendra Kalekar

  5. Madhivanan Moderator

    What does that SP do?
    Post the code.

    Madhivanan

    Failing to plan is Planning to fail
  6. surendrakalekar New Member

    I cannot post the code of this SP, but I am posting its logic; hopefully this will help you understand it. It has a problem when we are inserting rows into #TempTableItems, but this is required for our paging logic and works on the other server.

    Create proc [dbo].ProcedureName
    List of 13 input parameters
    AS
    SET NOCOUNT ON
    DECLARE List of all required variables.

    Required Variable initialization

    Dynamic SQL creation starts for @STR1
    Dynamic SQL creation starts for @STR2
    Add ORDER BY clause in @orderby required for both queries

    CREATE TABLE #TempTableItems (ID int IDENTITY (1, 1), + All the required fields )
    execute ( 'INSERT INTO #TempTableItems (Field List)' + @str2 + @orderby )
    Select @numresults1 = @@ROWCOUNT
    execute ( 'INSERT INTO #TempTableItems (Field List)' + @str1 + @orderby )
    Select @numresults = @@ROWCOUNT

    ----------------- Paging Logic ------------------
    Select @numresults2 = (@numresults + @numresults1)
    Select @TotalPages = ( @numresults2 + (@PageSize - 1)) / @PageSize

    if (@CurrentPage is null or @CurrentPage = '' or isnumeric(@CurrentPage) = 0)
    begin
    SELECT @CurrentPage = 1
    end
    if (@CurrentPage < 1)
    begin
    SELECT @CurrentPage = 1
    end
    if (@CurrentPage > @TotalPages)
    begin
    SELECT @CurrentPage = @TotalPages
    end

    SELECT @FirstRec = (@CurrentPage - 1) * @PageSize
    SELECT @LastRec = (@CurrentPage * @PageSize + 1)
    ----------------- Paging Logic ------------------

    select @exportallstr = 'SELECT ID as ItemNo, numresults, TotalPages, CurrentPage, + All other field FROM #TempTableItems WHERE ID > '+ str(@FirstRec) +' AND ID <'+ str(@LastRec)

    execute ( @exportallstr )
    SET NOCOUNT OFF -- Turn NOCOUNT back OFF
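    The paging arithmetic above can be illustrated with a small worked example (the values here are hypothetical):

    ```sql
    -- Hypothetical values: 45 matching rows, 10 rows per page, viewing page 3
    DECLARE @numresults2 int, @PageSize int, @CurrentPage int
    DECLARE @TotalPages int, @FirstRec int, @LastRec int

    SELECT @numresults2 = 45, @PageSize = 10, @CurrentPage = 3

    -- Adding (@PageSize - 1) makes integer division round up: (45 + 9) / 10 = 5
    SELECT @TotalPages = (@numresults2 + (@PageSize - 1)) / @PageSize

    -- Page 3 covers IDs 21..30, i.e. ID > 20 AND ID < 31
    SELECT @FirstRec = (@CurrentPage - 1) * @PageSize   -- 20
    SELECT @LastRec  = (@CurrentPage * @PageSize) + 1   -- 31
    ```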




    Surendra Kalekar

  7. Madhivanan Moderator

  8. Adriaan New Member

    Add PRIMARY KEY to your ID column definition - you haven't included any kind of index on the column on which you filter, so searches can be slow.
  9. surendrakalekar New Member

    ------------ Section 1 ---------
    CREATE TABLE #TempTableItems (ID int IDENTITY (1, 1) PRIMARY KEY , + All the required fields )
    execute ( 'INSERT INTO #TempTableItems (Field List)' + @str2 + @orderby )
    Select @numresults1 = @@ROWCOUNT
    execute ( 'INSERT INTO #TempTableItems (Field List)' + @str1 + @orderby )
    Select @numresults = @@ROWCOUNT
    ------------ Section 1 ---------

    ------------ Section 2 ---------
    select @exportallstr = 'SELECT ID as ItemNo, numresults, TotalPages, CurrentPage, + All other field FROM #TempTableItems WHERE ID > '+ str(@FirstRec) +' AND ID <'+ str(@LastRec)
    execute ( @exportallstr )
    ------------ Section 2 ---------

    When I check the execution plan, 'Section 1' takes 97% of the total execution time; 'Section 2' takes only 3%.
    I tried the PRIMARY KEY, but it takes a bit more time.


    Surendra Kalekar

  10. Adriaan New Member

    At the risk of putting my foot in my mouth again ... (I'd hate to think I could develop a liking for the taste ...)

    You're using mainly dynamic T-SQL, so remember that here an execution PLAN is a very wild guess at best. Basically the query optimizer can predict only the bits that it can recognize for what they are, and the dynamic bits are assumed to carry no weight.

    The only non-dynamic part of this procedure seems to be your CREATE TABLE statement. All the rest is assigning strings to variables, and executing strings. Query optimizer has no idea what to expect there, so it guesses something like 1% for each of those, leaving almost 100% for the CREATE TABLE.

    Adding the PRIMARY KEY does imply extra execution time for the data inserts, so if you aren't inserting too many rows into the temp table then you might drop it. Or perhaps use PRIMARY KEY NONCLUSTERED.
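    As a sketch of the suggestion above, with placeholder field names (the real field list is not shown in this thread), the temp table definition would become:

    ```sql
    -- Field1/Field2 are hypothetical; only the PRIMARY KEY NONCLUSTERED clause matters here
    CREATE TABLE #TempTableItems (
        ID int IDENTITY (1, 1) PRIMARY KEY NONCLUSTERED,
        Field1 int,
        Field2 varchar(100)
    )
    ```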

    There's probably more improvement if you start using sp_ExecuteSQL for the dynamic INSERT INTO query statements. (And no, you probably won't see much of a difference in the execution plan: it will still be dynamic T-SQL. But it may well perform better.)

    By the way, if the only 'dynamic' part of a query is the value of a variable, then you should really just use the variable in the WHERE clause, and not concatenate it into a literal statement.

    Anyway, here's how you can feed your local @Low and @High parameters to sp_ExecuteSQL:
    EXEC dbo.sp_ExecuteSQL

    -- Here's the INSERT statement
    N'INSERT INTO #T (col1, col2) SELECT col1, col2
    FROM MyTable
    WHERE col3 BETWEEN @LowValue AND @HighValue',

    -- Here are the input parameters for the INSERT statement
    N'@LowValue INT, @HighValue INT',

    -- Here are the local variables corresponding to the input parameters
    @Low, @High
    Using sp_ExecuteSQL, you're firing the same query, just with different parameters - this is where you have a much better chance of SQL Server finding an existing execution plan in its cache. Also, SQL Server is now better able to find relevant data that is still in cache.
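    On the earlier point about variables: when the only 'dynamic' part of a query is a value, no string building is needed at all. A minimal sketch, reusing the column names that appear later in this thread:

    ```sql
    -- No dynamic SQL needed when only the value varies (hypothetical value)
    DECLARE @companyid int
    SELECT @companyid = 4590924

    SELECT companycontactid, firstname
    FROM dbo.companycontact
    WHERE companyid = @companyid
    ```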
  11. surendrakalekar New Member

    Ok.. Let me try out all the possible options... I will let you know about the same...


    Surendra Kalekar

  12. surendrakalekar New Member

    quote:.....Or perhaps use PRIMARY KEY NONCLUSTERED.

    There's probably more improvement if you start using sp_ExecuteSQL for the dynamic INSERT INTO query statements.
    -- Here's the INSERT statement
    N'INSERT INTO #T (col1, col2) SELECT col1, col2
    FROM MyTable
    WHERE col3 BETWEEN @LowValue AND @HighValue',
    -- Here are the input parameters for the INSERT statement
    N'@LowValue INT, @HighValue INT',
    -- Here are the local variables corresponding to the input parameters
    @Low, @High

    Using sp_ExecuteSQL, you're firing the same query, just with different parameters - this is where you have a much better chance of SQL Server finding an existing execution plan in its cache. Also, SQL Server is now better able to find relevant data that is still in cache.

    The performance of PRIMARY KEY NONCLUSTERED is better than PRIMARY KEY, and I am using that. But I am not able to convert my dynamic INSERT SQL into an sp_ExecuteSQL statement.
    It throws the error "Procedure expects parameter '@handle' of type 'int'."
    Can you help me convert the code below into an sp_ExecuteSQL statement?
    --------------------------
    Declare @str2 varchar(4000)
    Declare @companyid int
    Declare @orderby varchar(1000)
    Select @companyid = 4590924
    Select @str2 = 'select top 2 companycontactid as da, firstname as name from companycontact where companyid = ' + str(@companyid)
    Select @orderby = 'order by datepub desc'
    exec ('Insert into #Temp (da,name)' + @str2 + @orderby )
    --------------------------



    Surendra Kalekar

  13. Luis Martin Moderator

    Are statistics updated on both servers?


    Luis Martin
    Moderator
    SQL-Server-Performance.com

    One of the symptoms of an approaching nervous breakdown is the belief that one's work is terribly important
    Bertrand Russell


    All postings are provided “AS IS” with no warranties for accuracy.



  14. surendrakalekar New Member

    quote:Originally posted by LuisMartin

    Statistics are updated in both server?
    Luis Martin
    Moderator
    SQL-Server-Performance.com
    Yes, they are updated.


    Surendra Kalekar

  15. Adriaan New Member

    You kind of missed the point that my script was using dbo.sp_ExecuteSQL - this is roughly what your script should be, except it's missing the field list for the target table:

    DECLARE @SQL NVARCHAR(4000)

    SET @SQL =
    'INSERT INTO #TempTableItems ( <fieldlist> )
    select top 2 companycontactid as da, firstname as name
    from companycontact
    where companyid = @companyid
    order by datepub desc'

    EXEC dbo.sp_ExecuteSQL @SQL, N'@companyid INT', @companyid
  16. Adriaan New Member

    Oh, and include the owner prefix for the tables in your query statements, and a table name or alias for each column - you may have seen the other thread about ownership issues, and it is a factor in SQL Server's ability to reuse execution plans:

    ..........
    SET @SQL =
    'INSERT INTO #TempTableItems ( <fieldlist> )
    select top 2 t.companycontactid as da, t.firstname as name
    from dbo.companycontact t
    where t.companyid = @companyid
    order by t.datepub desc'
    ..........
  17. surendrakalekar New Member

    Hi Adriaan
    I changed my INSERT INTO statement accordingly and was able to save 5-10% time. :) Thanks for your advice and help.
    Regarding the timeout for this SP on the testing server: two days ago we ran DBCC INDEXDEFRAG and updated statistics on all the tables used by this SP. Now it is not timing out, but it is still working slowly.
    Thanks everybody.


    Surendra Kalekar
  18. Adriaan New Member

    Nice to see improvement, but 5-10% isn't much.

    So does the companyid column in your companycontact have an index? If not, is it the first column in the primary key, or the first column in a unique index?
  19. surendrakalekar New Member

    quote: Originally posted by Adriaan

    Nice to see improvement, but 5-10% isn't much.
    So does the companyid column in your companycontact have an index? If not, is it the first column in the primary key, or the first column in a unique index?

    Yes, it is an improvement, but we need more. :) CompanyID in the companycontact table has a nonclustered index.

    Surendra Kalekar
  20. mmarovic Active Member

    You can try an index on companyID and datePub (in that order) instead of the index on companyID, or as an additional index, depending on the other important queries.
  21. mmarovic Active Member

    create index idx_companyContact on companyContact(companyID, datePub desc)
  22. surendrakalekar New Member

    mmarovic,
    Actually we have both: one on CompanyID, and another on CompanyID, DatePub (but that one is in ascending order), and we need both for the other SPs. But I will try with DatePub in descending order.
    Thanks


    Surendra Kalekar
