How to Take Advantage of SQL Server 2000 Performance Tuning Tools

Manage Statistics

Without you doing anything, SQL Server automatically creates and maintains internal statistics on the data in your tables. The Query Optimizer uses these statistics to select the optimal execution plan for your Transact-SQL code. Most of the time, SQL Server does a fine job of maintaining these statistics, and the Query Optimizer has the information it needs to do its job.

But sometimes, the statistics SQL Server creates and maintains are not optimal, and that’s what the Manage Statistics tool is for. This tool, available from the Tools menu of Query Analyzer, allows you to modify how SQL Server automatically creates and maintains statistics. You can add, edit, or delete the various statistics maintained by SQL Server. And because this tool is available from the Query Analyzer, you can experiment with different sets of statistics and see how they affect the query optimizer’s execution plans.

Unless you are a very experienced SQL Server DBA or developer, I would not suggest you use this tool. Selecting the proper statistics is a difficult task, and your time would most likely be better spent elsewhere performance tuning SQL Server.
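If you do want to see what the Manage Statistics tool is doing under the covers, everything it does can also be done directly in Transact-SQL. Below is a minimal sketch using commands available in SQL Server 2000; the table, column, and statistics names are hypothetical examples, not anything from a real database.

```sql
-- Check whether auto-create and auto-update of statistics are
-- enabled for a (hypothetical) table named Orders:
EXEC sp_autostats 'Orders'

-- Manually create a statistics object on a column the optimizer
-- is not already tracking (names here are examples only):
CREATE STATISTICS stat_Orders_ShipDate
ON Orders (ShipDate)
WITH SAMPLE 50 PERCENT

-- Force a refresh of all statistics on the table:
UPDATE STATISTICS Orders WITH FULLSCAN

-- Inspect the distribution data behind a statistics object:
DBCC SHOW_STATISTICS ('Orders', stat_Orders_ShipDate)
```

Running these by hand in Query Analyzer is a good way to learn what the graphical tool is managing for you, even if you never change the defaults.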

Index Tuning Wizard (for Individual Queries)

A little later in this article you will learn about a tool called the Index Tuning Wizard, which can be used to recommend optimal indexes for an entire database. But for now, you need to know that the Index Tuning Wizard can also be run from the Query Analyzer and be used to recommend indexes for specific queries.

For example, say you are evaluating a particular query for performance, but are not sure if the indexes currently on the tables hit by the query are effective for this particular query. You can run the Index Tuning Wizard (available from the Query menu of Query Analyzer); it will evaluate the query and, if appropriate, recommend a new index to optimize it.

While this can be a useful tool, it is also a little dangerous. This is because it is only evaluating indexes for one specific query. It does not take into consideration other queries that might be run against the tables, or the impact adding new indexes might have on INSERTS, UPDATES, or DELETES against these same tables.

In many cases, it is better to use the Index Tuning Wizard to evaluate an entire database at a time rather than a single query. This way, the Index Tuning Wizard can provide more balanced suggestions.

Take Some Time to Master the Query Analyzer

As you can see, the SQL Server 2000 Query Analyzer is a powerful tool with many features, and we have barely touched the surface of how you can use it to help performance tune Transact-SQL code. You will find it well worth whatever time it takes to master this important tool.

SQL Server 2000 Profiler

The SQL Server 2000 Profiler is a powerful tool for helping identify SQL Server performance problems, but it is not a tool for the beginner. Essentially, it allows you to capture the communications between your application and SQL Server. While the actual capturing of data is not hard, what can sometimes be difficult for the beginner is interpreting the captured data.

In this section, we will take a look at what the Profiler can do, and learn a little about how it can be used to help identify and resolve performance problems.

Profiler Features

The SQL Server 2000 Profiler can capture virtually all communication between a SQL Server and any other application. The various communications you can capture are referred to as events, and are grouped in Event Classes. Each Event Class includes one or more specific events. For example, the Event Class “Performance” has eight events, such as Execution Plan and Show Plan Statistics. Profiler offers you 13 different types of Event Classes to choose from.

Every event includes a variety of associated data columns. For example, NTUserName and ApplicationName are just two of the many columns that contain information about the events you capture.

On a production system, thousands of events can occur each second, far more than you could ever hope to analyze. To make the analysis of Profiler data easier, the Profiler can filter the events so that only the ones you are interested in are captured. For example, you can choose to capture only the events between a specific user and SQL Server, from a specific application, or to and from a specific database. You can also select which events you want to capture, and which data columns you want to capture about each event; you don’t have to capture them all. This substantially reduces the amount of data captured. A large part of learning the Profiler is deciding which events and data columns you should and should not capture for analysis.

To make your life easier, the Profiler has the ability to create what are called Trace Templates, which are files that let you save the various traces you create so that you can use them over and over. This can be a great time saver if your trace is a complicated one and you want to run it repeatedly.

Once you create and save a Trace Template, you can run the trace at any time. The results of the trace (the events you capture) can be viewed and discarded, saved as a trace file, or saved in a SQL Server table. Saving your trace results is handy, as you can go back to it at any time to review it.
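If you saved your trace results to a file rather than a table, SQL Server 2000 can still pull them into a table for ad hoc querying using the fn_trace_gettable function. A short sketch; the file path and destination table name are example values.

```sql
-- Load a saved trace file into a new table for later analysis;
-- DEFAULT tells SQL Server to also read any rollover files
-- that were created when the original file filled up:
SELECT *
INTO TraceResults
FROM ::fn_trace_gettable('C:\Traces\MyTrace.trc', DEFAULT)
```

Once the events are in a table, you can slice them with ordinary Transact-SQL instead of scrolling through the Profiler window.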

Once a trace has been run, you can view the results (the captured events) using the Profiler itself. In the Profiler window you can view each event and the data columns you captured, line by line. In many cases, the events will include Transact-SQL code, which you can view directly in the Profiler, or cut and paste into the Query Analyzer for more detailed analysis.

Creating a trace for the first time can be hard if you are not familiar with all of the various events and data columns. The easiest way to begin learning the Profiler is to use its Create Trace Wizard. This tool includes a variety of basic templates that you can customize for your particular needs. For example, the “Find the Worst Performing Queries” template can be used to identify queries that take longer than a predetermined amount of time to run, such as queries that take longer than 1 second. There are a variety of these templates available with the Create Trace Wizard, and you should explore them all.
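If you have saved a trace to a SQL Server table, the same “worst performing queries” idea can be expressed as an ordinary query. A sketch, assuming (hypothetically) that the trace was saved to a table named TraceResults:

```sql
-- Show the ten slowest statements captured in the trace.
-- In SQL Server 2000, the Duration column is reported in
-- milliseconds, so 1000 means one second:
SELECT TOP 10
    CONVERT(VARCHAR(4000), TextData) AS Query,
    Duration,
    CPU,
    Reads
FROM TraceResults
WHERE Duration > 1000
ORDER BY Duration DESC
```

This gives you a quick hit list of statements worth pasting into Query Analyzer for a closer look at their execution plans.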

How to Use the Profiler for Performance Tuning

The Profiler is a powerful tool for identifying performance-related problems, and can be used at any time during the development process. One of the areas where I find it extremely useful is troubleshooting the performance of existing applications. As you know, I highly suggest that performance tuning be done early in the design phase of the application to prevent performance problems in the first place, but this is not always possible.

For example, you may have inherited an in-house application, or perhaps your company has purchased an outside application that uses SQL Server as the back-end. In these cases, you may be called in to try and resolve application-related performance problems.

One of the first problems when troubleshooting applications you are not familiar with is not knowing how the application works. While somewhat tedious, you can use the Profiler to “sneak a peek” at how your application communicates with SQL Server. You can configure a trace in Profiler to capture all of the communication between the application and SQL Server. Then, one step at a time, you can perform some task in the application, and then review the communication between the application and SQL Server to find out how the two communicate.

Interpreting the communications usually requires a solid understanding of Transact-SQL, but if you know what you are doing, you can figure out how an application works with SQL Server. While you won’t need to analyze all the communications between the application and SQL Server, you will want to focus on those areas of the application’s functionality that appear to be causing the performance problems, such as a specific report, or some update process.

Generally, once you have created a trace of the offending operation, you can review the Transact-SQL in the trace and identify the problem. For example, I analyzed one particular in-house application and discovered that the VB code used to access the SQL Server data was creating a cursor, then moving only one row at a time from SQL Server to the application. The problem was that several million rows were being sent this way, which really slowed performance. Once I identified the cause of the problem, I was able to get the VB code rewritten.
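The fix in that case was to replace the row-by-row fetch with a single set-based statement. A simplified sketch of the two patterns, with invented table and column names, shows what you would see in the trace:

```sql
-- Row-by-row: one round trip per row is what made the
-- original application crawl when millions of rows were involved:
DECLARE OrderCursor CURSOR FOR
    SELECT OrderID FROM Orders
OPEN OrderCursor
FETCH NEXT FROM OrderCursor
WHILE @@FETCH_STATUS = 0
    FETCH NEXT FROM OrderCursor
CLOSE OrderCursor
DEALLOCATE OrderCursor

-- Set-based: the same rows returned in one statement
-- and one result set:
SELECT OrderID FROM Orders
```

In a Profiler trace, the cursor version shows up as a long stream of repeated fetch events, which is often the first clue that an application is processing data one row at a time.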

Another feature of the Profiler is that you can create traces of your application’s activity with SQL Server, then use this trace as input to the Index Wizard. The Index wizard then analyzes the activity and can recommend that clustered and non-clustered indexes be added or dropped in order to boost your database’s performance. You will learn more about the Index Wizard in the next section of this article.

As you can see, the Profiler is a very powerful tool, a tool that every SQL Server DBA and developer needs to learn and master.

Index Wizard

The SQL Server 2000 Index Wizard will be a tool you will soon grow to love. Although not perfect, this tool has the ability to evaluate actual production queries running against your SQL Server, and based on the nature of these queries, recommend specific clustered and non-clustered indexes that should be added to your database’s tables. The Index Wizard can be run from the SQL Server 2000 Profiler’s Tools menu.

This tool can be used during testing of your database during the early stages of your application’s development, and it can be used after the database has gone into production. In fact, you should consider running the Index Wizard on your database periodically after your application has gone into production. This is because database usage patterns can vary over time, and the optimal indexes for your database may change along with how the database is actually used.

Although the Index Wizard can be a great tool, you don’t want to depend on it exclusively for recommending indexes on your databases. While the Index Wizard is very smart, it still can’t make better choices than a DBA experienced in index tuning.

The best feature of the Index Wizard is that it works with real data from your production SQL Server databases, not fake or simulated data. This means that its indexing recommendations are tailored to how your database is actually used. Here’s how it works.

Before you can use the Index Wizard, you must first create what is called a workload. A workload is a Profiler trace or a Transact-SQL script. In most cases, you will want to use a Profiler trace because it reflects actual database activity.

If you want the Index Wizard to produce useful results, it is important that the workload be created during a time period that is representative of typical day-to-day database usage. This way, the Index Wizard will be able to offer suggestions based on the real world use of your database.

Once the workload has been created, it can be fed to the Index Wizard. The Index Wizard takes a sample of the queries it finds in the workload and analyzes them using SQL Server 2000’s Query Optimizer.

Once the Index Wizard is done analyzing the workload (if the workload is large, this can take hours), it will recommend what it considers the best mix of clustered and non-clustered indexes for the tables in your database, based on the workload it analyzed. In addition, if the database already has indexes that the Index Wizard finds are not being used, it may recommend that they be removed.

When the Index Wizard makes its recommendations, you have the option to apply them immediately (not recommended on a production database), to schedule their creation for a later time, or to save them as a script. I recommend saving them as a script, which gives you time to carefully review the recommendations. Once you are happy with them, you can run the script from the SQL Server 2000 Query Analyzer at an appropriate time. If you disagree with any of the recommendations, you can easily change the script before you run it.
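The script the wizard saves is ordinary Transact-SQL, so you can review and edit it like any other script. Here is a hypothetical example of what such a script might contain; all table and index names are invented for illustration.

```sql
-- A clustered index recommendation:
CREATE CLUSTERED INDEX IX_Orders_OrderDate
ON Orders (OrderDate)

-- A non-clustered index recommendation for a common search column:
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
ON Orders (CustomerID)

-- A drop recommendation, for an existing index the workload
-- was found never to use:
DROP INDEX Orders.IX_Orders_ShipVia
```

Reviewing the script line by line also lets you weigh each recommendation against queries you know exist but that may not have appeared in the workload.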
