Interview with Robin Dewson, Author of Beginning SQL Server 2000 Programming

What can a developer do during the logical and physical design of a database in order to help ensure that their database and SQL Server-based application will perform well?

A developer must investigate the volumes of data, what types of information will be stored, and how that data will be accessed. If you are dealing with an upgrade to an existing system, analyzing the present data (where the existing volumes occur, how that data is accessed, and where the current response bottlenecks are) can help you search for problem areas in the design. A new system requires a thorough investigation of what data will be captured, and looking at the volumes of data held in other formats will also aid the design. Knowing your data is just as important as knowing the constituents of your data.

Also, constantly revisit your design. As your system is built, check relationships, volumes of data, and indexes to ensure that the physical design is still at its optimum. Always be ready to check your system by using tools like the SQL Server Profiler.
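For instance, a couple of the system stored procedures make that periodic check quick; the table name Orders here is just a placeholder:

    -- Report the current row count and space used by a key table
    EXEC sp_spaceused 'Orders'

    -- List the indexes currently defined on that table
    EXEC sp_helpindex 'Orders'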

What do you consider are the best reasons to use stored procedures in your application instead of passing Transact-SQL code directly to SQL Server?

First and foremost, a stored procedure is a compiled set of code, whereas T-SQL passed in from languages such as VB, Visual FoxPro, and so on has to be compiled before it can run each time it is sent. Although T-SQL within VB and the like can be prepared before running, this is still slower than using a stored procedure.
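As a minimal sketch of the difference, using a made-up Orders table and procedure name:

    -- Ad hoc T-SQL: the client sends this text across each time, and SQL
    -- Server has to parse and compile it before it can run
    SELECT OrderID, OrderDate FROM Orders WHERE CustomerID = 'ALFKI'
    GO

    -- The same logic wrapped in a stored procedure; the execution plan is
    -- compiled on first use and reused on later calls
    CREATE PROCEDURE usp_OrdersForCustomer
        @CustomerID nchar(5)
    AS
    SELECT OrderID, OrderDate
    FROM Orders
    WHERE CustomerID = @CustomerID
    GO

    EXEC usp_OrdersForCustomer @CustomerID = 'ALFKI'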

Then, of course, there is the security aspect: by building a stored procedure, you can place a great deal of security around it. When dealing with sensitive data, you can use an encrypted stored procedure to hide sensitive columns, calculations, and so on.
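A short sketch of both ideas, assuming a hypothetical EmployeePay table and a PayrollClerks role:

    -- WITH ENCRYPTION hides the procedure text, so the columns and
    -- calculations it uses cannot be read from syscomments
    CREATE PROCEDURE usp_EmployeeTotalPay
        @EmployeeID int
    WITH ENCRYPTION
    AS
    SELECT EmployeeID, BasePay + Bonus AS TotalPay
    FROM EmployeePay
    WHERE EmployeeID = @EmployeeID
    GO

    -- Callers are granted EXECUTE on the procedure only; they never need
    -- SELECT permission on the underlying table
    GRANT EXECUTE ON usp_EmployeeTotalPay TO PayrollClerks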

Finally, I feel that transactional processing becomes a great deal easier with stored procedures, and nested transactions in particular become more insulated and secure. Handling a transaction in code that also involves the front end slows the transaction down, and therefore a lock will be held for longer than necessary.
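A rough sketch of keeping the whole transaction inside the procedure, using an invented Stock table and the @@ERROR checks typical of SQL Server 2000:

    CREATE PROCEDURE usp_TransferStock
        @FromWarehouse int,
        @ToWarehouse int,
        @Quantity int
    AS
    -- The transaction starts and finishes inside the procedure, so no lock
    -- is left waiting on a round trip back to the front end
    BEGIN TRANSACTION

    UPDATE Stock
    SET Quantity = Quantity - @Quantity
    WHERE WarehouseID = @FromWarehouse

    IF @@ERROR <> 0
    BEGIN
        ROLLBACK TRANSACTION
        RETURN 1
    END

    UPDATE Stock
    SET Quantity = Quantity + @Quantity
    WHERE WarehouseID = @ToWarehouse

    IF @@ERROR <> 0
    BEGIN
        ROLLBACK TRANSACTION
        RETURN 1
    END

    COMMIT TRANSACTION
    RETURN 0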

What are some techniques for writing fast performing stored procedures?

Fast-performing stored procedures depend on the same care as several other areas of T-SQL. Revisiting stored procedures every six months or so, to ensure that they are still running at their optimum, is essential. As for specific techniques: keep the transaction area as short as possible, as lock contention will certainly impact performance. And if you add an index but are unable, or do not wish, to restart SQL Server, recompiling the stored procedures that access the table which received the new index will ensure they are using the correct index.
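For example, sp_recompile can be pointed at the table that received the new index (Orders and the index name are just example names):

    -- Add the new index
    CREATE INDEX IX_Orders_CustomerID ON Orders (CustomerID)

    -- Flag every stored procedure and trigger that references the table so
    -- that a fresh plan, which can consider the new index, is built on the
    -- next execution
    EXEC sp_recompile 'Orders'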

If you have a T-SQL command that joins several tables and takes a long time to return a value, first of all check the indexes. What tends to help, though, is to break down the code and work out which join is causing the performance problem, then analyze that specific join and see why it is a problem.
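One way to do that breakdown, sketched against the Northwind sample tables, is to switch on the I/O statistics and re-run the query with the joins stripped back one at a time:

    -- Show logical and physical reads for each statement
    SET STATISTICS IO ON

    -- Full query that is running slowly
    SELECT o.OrderID, c.CompanyName, p.ProductName
    FROM Orders o
    JOIN Customers c ON c.CustomerID = o.CustomerID
    JOIN [Order Details] od ON od.OrderID = o.OrderID
    JOIN Products p ON p.ProductID = od.ProductID

    -- Pared back to a single join; compare the reads to find the join
    -- that is doing the damage
    SELECT o.OrderID
    FROM Orders o
    JOIN [Order Details] od ON od.OrderID = o.OrderID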

Always check out a stored procedure’s performance as you build it up by using the SHOWPLAN commands.
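For example, SHOWPLAN_TEXT returns the estimated plan without actually executing the statement (the procedure name here is the hypothetical one from the earlier sketch):

    SET SHOWPLAN_TEXT ON
    GO

    -- The statement is not executed; SQL Server returns the plan it would use
    EXEC usp_OrdersForCustomer @CustomerID = 'ALFKI'
    GO

    SET SHOWPLAN_TEXT OFF
    GO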

Also, try to use EXISTS rather than a JOIN where you only need to check that matching rows exist. An EXISTS test stops probing the other table as soon as one record is found, rather than joining all the records.
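A small illustration with hypothetical Customers and Orders tables:

    -- Join version: every matching Orders row is pulled in just to decide
    -- whether the customer qualifies, so DISTINCT is needed as well
    SELECT DISTINCT c.CustomerID, c.CompanyName
    FROM Customers c
    JOIN Orders o ON o.CustomerID = c.CustomerID

    -- EXISTS version: the probe into Orders can stop at the first match
    SELECT c.CustomerID, c.CompanyName
    FROM Customers c
    WHERE EXISTS (SELECT 1 FROM Orders o WHERE o.CustomerID = c.CustomerID)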

Also, look at using subqueries when you are only trying to find a handful of values in the subquery statement and there is no key on the column you are looking up on.
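Something along these lines, assuming there is no index on the Country column of a Customers table:

    -- Only a handful of customers will match, and Country is not indexed,
    -- so a simple IN subquery keeps the statement clear
    SELECT OrderID, OrderDate
    FROM Orders
    WHERE CustomerID IN
        (SELECT CustomerID FROM Customers WHERE Country = 'Spain')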

When should SQL Server-based cursors be used, and not be used?

SQL Server cursors are perfect when you want to work on one record at a time, rather than taking all the data from a table in a single bulk operation. However, they should be used with care, as they can affect performance, especially when the volume of data increases. From a beginner’s viewpoint, I really do feel that cursors should be avoided every time, because if they are badly written, or deal with too much data, they really will impact a system’s performance. There will be times when it is not possible to avoid cursors, and I doubt many systems exist without them. If you do find you need one, try to reduce the number of records to process by filtering into a temporary table first, and then building the cursor from that, as sketched below. The fewer records there are to process, the faster the cursor will finish. Always try to think “out of the envelope”.
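A sketch of that temporary-table approach, with made-up table names and a date filter standing in for whatever narrows your working set:

    -- Narrow the working set into a temporary table first
    SELECT OrderID, CustomerID
    INTO #RecentOrders
    FROM Orders
    WHERE OrderDate >= '19990101'

    -- Build the cursor over the much smaller temporary table
    DECLARE @OrderID int, @CustomerID nchar(5)

    DECLARE order_cursor CURSOR FOR
        SELECT OrderID, CustomerID FROM #RecentOrders

    OPEN order_cursor
    FETCH NEXT FROM order_cursor INTO @OrderID, @CustomerID

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Per-row processing using @OrderID and @CustomerID goes here
        FETCH NEXT FROM order_cursor INTO @OrderID, @CustomerID
    END

    CLOSE order_cursor
    DEALLOCATE order_cursor

    DROP TABLE #RecentOrders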

Continues…
