<![CDATA[ There is not much the report writer can do to speed things up, compared to the DBAs and the people buying the hardware. Nevertheless, I wanted to share some ideas that may or may not work in your situation.
Use indexes in your database
When writing stored procedures, I prefer to keep my query as a plain script file for as long as possible before turning it into a stored procedure.
Then, before I turn the script into a sproc, I check the estimated execution plan by pressing CTRL+L in SQL Server Management Studio. If the plan shows any table scans, I know I may have a non-optimal query.
On a related note, keep in mind that the application is optimized for the things it does most often. If you are querying something that the application does all day long, you will find the indexes designed heavily in your favor, but if you are writing anything else you are on your own.
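As a sketch of this workflow: the query below filters on a column the application might rarely search by, so (in this hypothetical) no index covers it and CTRL+L shows a table scan. The column and index names here are invented for illustration; only the cv3client table name comes from the example later in this article.

```sql
-- Hypothetical report query: AdmitDate and LastName are assumed columns.
SELECT ClientID, LastName, AdmitDate
FROM cv3client
WHERE AdmitDate >= '20240101';

-- If the estimated plan (CTRL+L) shows a table scan on cv3client,
-- an index on the filtered column may help. Coordinate with your DBA
-- before creating indexes on a vendor database.
CREATE NONCLUSTERED INDEX IX_cv3client_AdmitDate
    ON cv3client (AdmitDate)
    INCLUDE (ClientID, LastName);
```

Re-checking the plan after creating the index should show an index seek in place of the scan.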
Generally use table variables rather than temporary tables
The most common problem with temporary tables is that they can cause your code to recompile at execution time every time your stored procedure is run, and you will not know about it.
Whenever you have code that creates tables interspersed with T-SQL code, you run the risk of a recompile.
There are exceptions, however. Temporary tables have the ability to have any number of indexes (just like a permanent table), and have statistics maintained on them. Also the cost of a recompile may be small compared to the cost of your query.
You can cause an expensive recompile by creating temporary tables with SELECT ... INTO #MyTable in the middle of your other data manipulation code. If you must use temporary tables, try to create them all at once.
Better still, make sure all your temporary tables are defined with create table statements, so the optimizer knows about them.
Even better still, use table variables instead of temporary tables whenever possible. In addition to the reasons above, table variables never trigger a statistics-based recompile, and SQL Server has more opportunities to optimize operations on them.
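The three approaches above can be sketched side by side. The column definitions here are assumptions for illustration; only cv3client comes from this article.

```sql
-- Risky: SELECT ... INTO mid-procedure can trigger a recompile.
SELECT TOP 99 * INTO #Clients FROM cv3client;

-- Better: define the temporary table up front with CREATE TABLE,
-- before any other data manipulation code runs.
CREATE TABLE #Clients2 (ClientID int PRIMARY KEY, LastName varchar(50));

-- Best (for modest row counts): a table variable, which never
-- causes a recompile.
DECLARE @Clients TABLE (ClientID int PRIMARY KEY, LastName varchar(50));

INSERT INTO @Clients (ClientID, LastName)
SELECT ClientID, LastName FROM cv3client;
```

For very large intermediate result sets, a temporary table with indexes and statistics may still win despite the recompile risk, so measure before committing to either.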
For more detailed information on what can cause a recompile, see:
Use set nocount on
Surprisingly, this little directive actually saves significant time.
Contrary to what some people think, it does not actually affect functionality with respect to @@rowcount.
set nocount on
select top 99 * into #t2 from cv3client
print @@rowcount -- works even with nocount on
This still works, no matter what state nocount is in. With nocount off, however, it is as if your stored procedure had a bunch of print statements sprinkled throughout, which takes enough time to make it worthwhile to insert this one command.
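In practice this means set nocount on should be the first statement in every report sproc. A minimal skeleton, with a hypothetical procedure name and assumed columns:

```sql
CREATE PROCEDURE dbo.rpt_ClientList   -- hypothetical report procedure
AS
BEGIN
    SET NOCOUNT ON;  -- suppress the "(n row(s) affected)" messages

    SELECT ClientID, LastName         -- assumed columns on cv3client
    FROM cv3client;

    PRINT @@ROWCOUNT;                 -- still populated with NOCOUNT ON
END;
```

The savings come from eliminating the DONE_IN_PROC messages the server would otherwise send to the client after every statement.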
Use the nolock hint, but cautiously
The real reason I am writing this article is because of the generous use of the (nolock) hint that we make around here.
At first it concerned me, but I have become more comfortable with it over time. I wanted to share my reasons, so that either this can become part of the common prevailing wisdom and be used by everyone, or someone can show me a reason why it should not.
In our database, configuration information is never updated at the same time as reports are run (only at downtime), and data is never deleted (instead, the older row is inactivated).
The reason I think the (nolock) hint should be safe in our database is because you will never be able to read a row of data in a harmful transient state; the worst you may be able to do is read a recently inactivated record.
But reading that recently inactivated record is no more harmful than running the report a few seconds too soon, which would give the same result.
I should add that the vendor of the application we use also uses (nolock) in their reports.
Other times when it is safe to use (nolock) include:
* when you are joining tables that do not change while your report is running, say a table of zip codes and cities.
* when it is just you developing code in your test database.
* when you can include a disclaimer on your report.
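For concreteness, here is what the hint looks like in a join, along with the session-level equivalent. The ZipCode lookup table and the column names are assumptions for illustration; only cv3client comes from this article.

```sql
-- Hint form: applies dirty-read behavior to each hinted table reference.
SELECT c.ClientID, z.City
FROM cv3client AS c WITH (NOLOCK)
JOIN dbo.ZipCode AS z WITH (NOLOCK)   -- hypothetical lookup table
    ON z.Zip = c.Zip;

-- Session form: READ UNCOMMITTED for every table, no per-table hints.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

SELECT c.ClientID, z.City
FROM cv3client AS c
JOIN dbo.ZipCode AS z
    ON z.Zip = c.Zip;
```

The session form is easier to maintain in a long report sproc; the hint form lets you scope the dirty reads to only the tables you have reasoned about, as described above.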
Also note that another way to achieve the benefits of (nolock) is to place the read-only tables on a filegroup marked read-only, so no locks are taken on them. In our case we cannot do this.
Here is an article that discusses (nolock) in greater detail:
SQL Server supports a large number of other query hints that are applicable in various situations. Here is a reference on the query hints supported by SQL Server 2008:
The default isolation level for a select statement is READ COMMITTED. This prevents dirty reads but the data can be changed before the end of the transaction, possibly resulting in nonrepeatable reads or phantom data. See this link:
This article demonstrates a 14% performance improvement with (nolock) even when tables are only being read, not updated.
Use cursors and table variables with care
Cursors are almost always a last resort, but writing reports often requires very detailed manipulation of data that can only be performed with a cursor.
For example, I am working on a report that contains 23 columns of data, all from slightly different places in the database. It is not realistic to get this all in one select statement.
Of course, you should never run a cursor directly against the database tables. Instead, grab the data you need into a table variable and manipulate it from there.
When you do use a cursor, use the LOCAL and FAST_FORWARD options.
This specifies a FORWARD_ONLY, READ_ONLY cursor with performance optimizations enabled. This is the fastest type of cursor.
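Putting the last two points together, a sketch of the pattern: load a table variable first, then open a LOCAL FAST_FORWARD cursor over the variable rather than the base tables. Column names are assumed for illustration.

```sql
-- Stage the data once, so the cursor never touches the base tables.
DECLARE @Work TABLE (ClientID int PRIMARY KEY, LastName varchar(50));

INSERT INTO @Work (ClientID, LastName)
SELECT ClientID, LastName FROM cv3client;

DECLARE @ClientID int, @LastName varchar(50);

DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT ClientID, LastName FROM @Work;

OPEN cur;
FETCH NEXT FROM cur INTO @ClientID, @LastName;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- per-row manipulation goes here
    FETCH NEXT FROM cur INTO @ClientID, @LastName;
END;
CLOSE cur;
DEALLOCATE cur;
```

LOCAL scopes the cursor to the sproc so it is cleaned up automatically, and FAST_FORWARD gives the forward-only, read-only plan described above.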