Interview with Robin Dewson, Author of Beginning SQL Server 2000 Programming

What alternatives do developers have to using SQL Server-based cursors? In other words, how can developers perform the same function as a cursor without using a cursor?

Perhaps one of the performance gains least utilized by developers starting out in SQL Server is the temporary table. For example, using one or more temporary tables to break a problem down into several areas allows each block of data to be processed in its own individual way; then, at the end of the process, the information within the temporary tables can be merged and applied to the underlying data.
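
To illustrate the idea, here is a minimal sketch of the temporary-table approach; the table and column names (Orders, PreferredCustomers, OrderValue, and so on) are made up purely for illustration.

    -- Stage the subset of rows to be worked on into a temporary table.
    SELECT OrderID, CustomerID, OrderValue
    INTO #WorkingOrders
    FROM Orders
    WHERE OrderDate >= '20010101'

    -- Process the staged rows as a set rather than row by row.
    UPDATE #WorkingOrders
    SET OrderValue = OrderValue * 1.1
    WHERE CustomerID IN (SELECT CustomerID FROM PreferredCustomers)

    -- Merge the results back into the underlying data in one operation.
    UPDATE o
    SET o.OrderValue = w.OrderValue
    FROM Orders o
    INNER JOIN #WorkingOrders w ON o.OrderID = w.OrderID

    DROP TABLE #WorkingOrders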

The main question to keep asking yourself is: is there an alternative way of doing things? Even if I have to break this down into several chunks of work, can I do it without using cursors, and so end up with faster performance?

Another area you can look at is the use of CASE statements within your query. By using a CASE statement, you can check the value within a column and base decisions and operations on what you find. Although you will still be working on the whole set of data, rather than the subset held in a cursor, you can use CASE to leave values, or records, as they are if they do not meet the right criteria. Take care here, though, to make sure that working through all of the data does not create a large performance impact. Again, look at using a subset of the data by building a temporary table first, and then merging the results back in afterwards.
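
As a rough example of the technique, assuming a hypothetical Products table, a CASE statement can apply different operations per row while leaving non-matching rows untouched:

    -- Rows that meet neither condition keep their existing price.
    UPDATE Products
    SET Price = CASE
                    WHEN Category = 'Clearance' THEN Price * 0.75
                    WHEN Category = 'Seasonal'  THEN Price * 0.90
                    ELSE Price   -- leave the value as it is
                END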

However, don’t get caught out by applying these recommendations in every case. Cursors can be faster if you are dealing with small amounts of data. What I have found to be rule number one, though, is to get as little data into your cursor as is needed.

If you have no choice but to use a SQL Server-based cursor, what tips do you have in order to optimize them?

Perhaps the best performance gain comes when you can have a cursor populated asynchronously rather than needing the whole population operation to complete before further processing can continue. Then, by checking specific global variables, you can tell when there is no further processing to take place. However, even here, care has to be taken: asynchronous population should only be used for large recordsets rather than those that deal with only a small number of rows.
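
As a sketch of what checking those global variables looks like, the example below reads @@CURSOR_ROWS after opening a keyset cursor; asynchronous population itself depends on the server-wide 'cursor threshold' option (set through sp_configure), and the table name here is hypothetical.

    DECLARE LargeOrders CURSOR KEYSET FOR
        SELECT OrderID FROM Orders

    OPEN LargeOrders

    -- A negative value (other than -1) means the keyset is still being
    -- populated asynchronously; a positive value is the final row count.
    SELECT @@CURSOR_ROWS AS CursorRowsSoFar

    CLOSE LargeOrders
    DEALLOCATE LargeOrders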

Use the smallest set of data possible.

Break out of the cursor loop as soon as you can. If you find that a problem has occurred, or processing has ended before the full cursor has been processed, then exit.

If you are using the same cursor more than once in a batch of work, and this could mean within more than one stored procedure, then define it as a global cursor by using the GLOBAL keyword, and do not close or deallocate the cursor until the whole process is finished. A fair amount of time will be saved, as the cursor and the data it contains will already be defined and ready for you to use.
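
The sketch below pulls the last two tips together: a cursor declared with the GLOBAL keyword, a loop that breaks out as soon as a problem occurs, and the close and deallocate left until the very end. The cursor, table, and stored procedure names are made up for illustration.

    DECLARE @CustomerID int

    DECLARE CustomerCursor CURSOR GLOBAL FOR
        SELECT CustomerID FROM Customers

    OPEN CustomerCursor
    FETCH NEXT FROM CustomerCursor INTO @CustomerID

    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC ProcessCustomer @CustomerID   -- hypothetical stored procedure

        IF @@ERROR <> 0
            BREAK                          -- exit the loop as soon as a problem occurs

        FETCH NEXT FROM CustomerCursor INTO @CustomerID
    END

    -- Only close and deallocate once the whole batch of work, possibly
    -- spanning several stored procedures, has finished with the cursor.
    CLOSE CustomerCursor
    DEALLOCATE CustomerCursor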

What are your top recommendations about the best way indexes can be applied to SQL Server tables in order to enhance performance?

First of all, try to place a clustered index on each table. However, do take care choosing which fields you use within a clustered index. Avoid, if at all possible, any columns where data updates will take place. In fact, on high-volume tables, a column that will receive updates to its contents should not be used in a clustered index at all.
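
For instance, a clustered index would typically go on a column that is set at insertion and never updated, as in this sketch with a hypothetical Orders table:

    CREATE CLUSTERED INDEX IX_Orders_OrderID
        ON Orders (OrderID)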

However, what is better still is where you have a table with only a handful of columns in it and you place all, yes all, of those columns within an index. This is called a covering index, and it saves the time normally spent finding a record within an index and then moving to the actual table to get the physical data. However, in my experience, never make a covering index a clustered index, as the gain will surely be lost through the re-pagination of the table.
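
A covering index on a narrow, hypothetical OrderStatus table might look like the following; the query beneath it can be answered entirely from the index, without touching the table’s data pages.

    CREATE NONCLUSTERED INDEX IX_OrderStatus_Covering
        ON OrderStatus (OrderID, StatusCode, StatusDate)

    -- This query is satisfied entirely from the covering index.
    SELECT OrderID, StatusCode, StatusDate
    FROM OrderStatus
    WHERE OrderID = 1021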

Always keep checking the performance of any indexes, clustered or non-clustered, that include columns which can have data modifications placed on them. Try to keep indexes to columns whose values are set at insertion and not modified afterwards. Of course, this can never hold true 100% of the time, and you will find there are occasions when you simply have to index a column that is modified. If you do, then ensure that the statistics for that table are updated at least once a week.
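
Refreshing the statistics can be as simple as the statement below, run against a hypothetical Orders table, perhaps as a weekly scheduled job.

    UPDATE STATISTICS Orders WITH FULLSCAN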

Ensure that when building an index you get the order of the columns correct. In other words, place the column most frequently used for data searches first. It is no use having an index on column1 and column2 when you only use column2 within the WHERE clause; in that case you would swap the order of the two columns within the index.
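
As a quick sketch with made-up object names: if queries filter on column2, the index should lead with column2.

    CREATE NONCLUSTERED INDEX IX_SearchTable_Col2_Col1
        ON SearchTable (column2, column1)

    -- This query can seek on the index because column2 is the leading column.
    SELECT column1, column2
    FROM SearchTable
    WHERE column2 = 'ABC'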

But as I have said before, constantly monitor indexes, constantly check that your statistics are up to date and revisit your design to ensure that yesterday’s indexes are today’s indexes.

Do you use any Microsoft-supplied tools to help you performance tune your SQL Server-based applications? If so, what are they, and how do you use them?

In the main I use the Index Tuning Wizard, which can be started like any other wizard within the SQL Server MMC console. By building up a good workload file taken from a day’s or a week’s worth of data, it is possible to see how well the indexes are performing.

I also use Task Manager when I feel a server is not performing as fast as I might expect. From this I can check memory resources and processor utilization. Of course, it may be that your SQL Server-based application is performing badly because of an incorrectly set-up network card, or the network itself, so checking the network properties is also an area that I look at and use tools for.

Another area to look at within Task Manager is disk throughput. If you have your data and your transaction logs on the same disk drive, it can lead to performance degradation. Watching disk I/O may highlight this sort of problem and show that the transaction log needs to be moved to another drive.

Finally, although not specifically a tool, the use of the DBCC command to check for problems and to rectify them can avoid issues further down the line.
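
The sort of checks meant here might look like the following; the database and table names are hypothetical.

    -- Check the allocation and structural integrity of the database.
    DBCC CHECKDB ('Sales')

    -- Report fragmentation for a table, so a rebuild can be planned if needed.
    DBCC SHOWCONTIG ('Orders')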

Continues…
