SQL Server Performance

Performance increase, then decrease, help!

Discussion in 'Performance Tuning for DBAs' started by Alain Tremblay, Sep 8, 2003.

  1. Alain Tremblay New Member

    Hi,

    My company uses a 3rd-party Access 97 application that runs on a SQL 7 database. I have access to the SQL Server, but not to the application code, so I can't tune the queries, just the SQL Server.

    My problem is as follows: there is a task most users do that can take them from one hour to four, depending on the speed of the app that day. Simply put, the form loads 3 sub-forms, which query 3 tables that are quite large (1,000,000 records, 8,000,000 records and 760,000 records).

    One day, I started Profiler, caught what looked like the query in question and ran it through the Index Tuning Wizard. The wizard suggested a new index, which I created and bam! The screen that usually took 40 seconds to 1.5 minutes to refresh was below 1 second!

    But... (there's always a but...) after one week, the problem returned. So I deleted the index and bingo! Two days of good performance.

    Since then, I've been creating and deleting indexes every other day or so, and every time the performance returns to normal. I usually touch only the 2 smallest of the 3 tables.

    Any idea what can cause this, and what I can do to maintain my performance at peak level?
  2. gaurav_bindlish New Member

    Do you have a scheduled job that updates the indexes for the tables in your database?

    I suspect this is an issue of the table statistics not being updated.

    Also, if the queries made by users vary a lot in their parameters and the queries are stored as stored procedures, use the WITH RECOMPILE option to generate a new execution plan every time the query is executed.
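    For illustration, a minimal sketch (the procedure and table names here are made up):

        -- WITH RECOMPILE builds a fresh plan on every execution, so a cached
        -- plan tuned for one parameter value never penalizes another.
        CREATE PROCEDURE dbo.GetOrders
            @CustomerId int
        WITH RECOMPILE
        AS
            SELECT OrderId, OrderDate
            FROM dbo.Orders
            WHERE CustomerId = @CustomerId
        GO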

    Gaurav
    Moderator
    Man thrives, oddly enough, only in the presence of a challenging environment. - L. Ron Hubbard
  3. Luis Martin Moderator

    Did you run update statistics after any change?

    If not, begin running UPDATE STATISTICS before making any changes.
    If performance is going down, try Profiler again and, after the change, run UPDATE STATISTICS again.

    Luis Martin

  4. Twan New Member

    Alain

    Do you have the auto update stats and auto create stats options set for the database? (A quick way to check is sketched after these questions.)
    How much is the data in the tables being updated/inserted/deleted?
    Is the problem code a procedure or dynamic SQL?
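    For reference, a sketch of checking those options with sp_dboption ('YourDatabase' is a placeholder):

        -- Each call reports whether the option is ON or OFF for the database.
        EXEC sp_dboption 'YourDatabase', 'auto update statistics'
        EXEC sp_dboption 'YourDatabase', 'auto create statistics'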

    When the problem happens you could try running DBCC FREEPROCCACHE. If this resolves the issue, then your problem is definitely caused by different parameters requiring different plans... The only viable longer-term option is then to change the application so that it either recompiles the procedure each time or splits the procedure so that the different parameters all get optimum plans...
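    A sketch of the test (note that FREEPROCCACHE flushes every cached plan on the server, so compiles will spike briefly afterwards):

        -- Flush all cached execution plans; subsequent queries compile fresh plans.
        DBCC FREEPROCCACHE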

    Are you able to post the query/procedure?

    Cheers
    Twan
  5. Alain Tremblay New Member

    Yes, auto update and auto create stats are on for the database. The big table is a really frequently updated table. Well, they all are frequently updated. A lot of inserts/updates and, you guessed it, the query conditions are often on the updated fields.

    I think it's dynamic SQL, but how can I be sure?

    I'm sorry but I don't know what DBCC FREEPROCCACHE is. Can you help me with that or give me a link on how to use it?

    I'll check if I can get the queries (the form must use at least 3 queries, since there are 3 sub-forms) and post them as soon as I can get them.
  6. Luis Martin Moderator

    Auto update and auto create stats don't mean a full statistics update. As far as I know, it samples about 10%, like the default maintenance job.

    I suggest running UPDATE STATISTICS with a 100% sample (a full scan), once a week, for critical tables.
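    For example (the table name is a placeholder):

        -- Scan every row when rebuilding the statistics, not just a sample.
        UPDATE STATISTICS dbo.BigTable WITH FULLSCAN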

    Luis Martin
  7. Alain Tremblay New Member

    Also, I had written a reply to Luis and Gaurav but I screwed up and didn't post it after all! [xx(]

    Ok, here what I can tell you guys:

    Luis: No, I didn't run update stats, as I don't know how to do that. I should tell you that my experience with SQL Server is very limited.

    Gaurav: No, I don't have a scheduled job to update the indexes, but I would like to know how to do that. I started by looking at the Maintenance Plan Wizard, but I'm not comfortable enough to select the parameters for it. Any help would be very much appreciated.

    As for putting parameters in the queries, I should stress that I don't have any access to the application code and queries, so my hands are pretty much tied on that subject.

    Thanks a lot for helping. We've been having this problem for more than a year now and didn't have a clue as to why sometimes it was fast and other times it was slow. Now we're starting to see the light at the end of the tunnel...
  8. Alain Tremblay New Member

    Thanks Luis, I will try that. Can you tell me how to create that job and schedule it?
  9. Twan New Member

    Hi Alain,

    If it is a query that the application submits (as opposed to a query within a procedure), then it is unlikely to be related to my suggestion about DBCC FREEPROCCACHE...

    The queries as captured in Profiler may be useful. Whilst I'm aware that others have suggested frequent update statistics, I must admit that I've never had to run update statistics on my databases... Definitely an option, although I'd prefer to find out a bit more about:
    - what sort of data is in the tables (table and index creation scripts)
    - what is being inserted/updated/deleted (anything peculiar about the data such as incrementing numbers, etc.)
    - what are the queries that you are having problems with

    It would be interesting to see what the difference is in the execution plans... Are you able to get a query plan from Query Analyzer during bad and good performance?
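    If it helps, a sketch of getting the plan as text in Query Analyzer (the SELECT is only a stand-in; paste the captured query there):

        -- Show the estimated execution plan instead of running the statement.
        SET SHOWPLAN_TEXT ON
        GO
        -- Stand-in query: replace with the one captured in Profiler.
        SELECT COUNT(*) FROM sysobjects
        GO
        SET SHOWPLAN_TEXT OFF
        GO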

    Cheers
    Twan
  10. gaurav_bindlish New Member

    See Using SQL Server Tools > User Interface Reference > Database Maintenance Plan Wizard in BOL for help on the Database Maintenance Plan Wizard.

    I would recommend capturing a trace for the SQL Server with the execution plan and locking events enabled, to see the activity in the database.

    Gaurav
    Moderator
    Man thrives, oddly enough, only in the presence of a challenging environment. - L. Ron Hubbard
  11. Luis Martin Moderator

    1) Using Enterprise Manager: Management, SQL Server Agent, Jobs, New Job (right-click).
    2) Name the job.
    3) Tab to Steps.
    4) New Step.
    5) Choose your database.
    6) In Command write:

       UPDATE STATISTICS Table-Name

       one for each table.
    7) Go to the next tab, Schedules.
    8) New Schedule.
    9) Try to use a non-production time.
    10) Tab to Notifications.
    11) OK to save the job.

    12) Good luck (joke).

    Luis Martin
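    If listing every table gets tedious, sp_updatestats (which ships with SQL Server) refreshes statistics on all user tables in the current database, though with the default sample rather than a full scan:

        -- Run in the target database; updates statistics on every user table.
        EXEC sp_updatestats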
  12. Alain Tremblay New Member

    Thanks Luis, I created the job to run each night at 5 AM and I'll see if performance increases. Should I do it on the whole database or just the tables in question?
  13. gaurav_bindlish New Member

    It is advisable to run the job for frequently updated tables on a regular basis.

    Gaurav
    Moderator
    Man thrives, oddly enough, only in the presence of a challenging environment. - L. Ron Hubbard
  14. Luis Martin Moderator

    I use the default (10%) for the whole database and a full scan only for critical tables, like Gaurav said.
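    The two variants side by side (table names are placeholders):

        -- Sampled update for a less critical table.
        UPDATE STATISTICS dbo.SmallTable WITH SAMPLE 10 PERCENT

        -- Full scan for a critical, heavily updated table.
        UPDATE STATISTICS dbo.BigTable WITH FULLSCAN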

    Luis Martin
  15. Alain Tremblay New Member

    Hi, just wanted to give you all a follow-up. I tried Luis's suggestion and created a daily job of stats updates on the most frequently updated tables, but it has not solved my problem. I'll keep looking for a solution, and if I find it I'll post it here. Thanks for all your suggestions!
  16. Twan New Member

    Alain,

    Are you able to post the offending query?

    Cheers
    Twan
  17. tikus New Member

    Maybe the indexes are fragmented? If that is the case, you need to rebuild the indexes, which is effectively what your 'delete and recreate' routine has been doing each time.

    To check index fragmentation (note: on SQL 7, DBCC SHOWCONTIG takes an object id; the 'tableName' WITH ALL_INDEXES form is SQL 2000 syntax):

        -- Without an index id, this checks the table's first index.
        DECLARE @id int
        SET @id = OBJECT_ID('tableName')
        DBCC SHOWCONTIG (@id)

    In the output, a Scan Density close to 100% is good; a low value means the index is fragmented.
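    If the numbers show heavy fragmentation, a sketch of a rebuild with DBCC DBREINDEX (available in SQL 7; 'tableName' is a placeholder, the empty string means all indexes on the table, and 0 keeps the original fill factor):

        -- Rebuild every index on the table with its original fill factor.
        DBCC DBREINDEX ('tableName', '', 0)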
