Temporary Tables vs. Table Variables and Their Effect on SQL Server Performance

There are three major theoretical differences between temporary tables:

create table #T (…)

And table variables:

declare @T table (…)

The first difference is that changes to table variables are not covered by transaction logging. Hence, they fall outside the transaction mechanism, as this example clearly shows:

create table #T (s varchar(128))
declare @T table (s varchar(128))
insert into #T select 'old value #'
insert into @T select 'old value @'
begin transaction
     update #T set s='new value #'
     update @T set s='new value @'
rollback transaction
select * from #T
select * from @T

s
-----------
old value #

s
-----------
new value @

After declaring our temporary table #T and our table variable @T, we insert the same "old value" string into each. Then we begin a transaction that updates their contents; at this point both contain the same "new value" string. But when we roll back the transaction, as you can see, the temporary table #T reverts to "old value #", while the table variable @T keeps "new value @". This happens because, even though the table variable was updated within the transaction, its changes are not part of the transaction itself.
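
One practical consequence is that a table variable can carry information out of a transaction that is rolled back. A minimal sketch, with an illustrative logging table:

declare @log table (msg varchar(128))
begin transaction
     insert into @log select 'step 1 done'
     -- ... an error is detected, so the work is undone ...
rollback transaction
select msg from @log    -- 'step 1 done' survives the rollback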

The second major difference is that a procedure that uses a temporary table cannot be fully pre-compiled, while the execution plan of a procedure that uses only table variables can be compiled in advance. Pre-compiling a procedure gives a major advantage in execution speed. The advantage can be dramatic for long procedures, where recompilation is too costly.

Finally, table variables exist only in the same scope as ordinary variables. Unlike temporary tables, they are not visible inside nested stored procedures or in exec(string) statements, and they cannot be used as the target of an insert/exec statement, as the sketch below illustrates.
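
A minimal sketch of the visibility difference (the names are illustrative; the commented-out lines are the ones that fail):

create table #T (n int)
declare @T table (n int)

exec('insert into #T values (1)')     -- works: #T is visible inside exec()
-- exec('insert into @T values (1)')  -- fails: @T does not exist in that scope

insert into #T exec('select 2')       -- works: insert/exec into a temporary table
-- insert into @T exec('select 2')    -- rejected on SQL Server 2000
                                      -- (later versions relaxed this restriction)

drop table #T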

But let’s compare both in terms of performance.

At first, we prepare a test table with 1 million records:

create table NUM (n int primary key, s varchar(128))
GO
set nocount on
declare @n int
set @n=1000000
while @n>0 begin
     insert into NUM
          select @n,'Value: '+convert(varchar,@n)
     set @n=@n-1
     end
GO

Now we prepare our test procedure T1:

create procedure T1
     @total int
as
     create table #T (n int, s varchar(128))
     insert into #T select n,s from NUM
          where n%100>0 and n<=@total
     declare @res varchar(128)
     select @res=max(s) from NUM
          where n<=@total and
               not exists(select * from #T
               where #T.n=NUM.n)
GO

Called with a parameter, which we will vary from 10 through 100, 1,000, 10,000 and 100,000 up to 1,000,000, the procedure copies that many records into a temporary table, skipping those where n is divisible by 100, and then finds max(s) among the skipped records. Naturally, the more records we pass in, the longer the execution takes.

To measure the execution time precisely, I use the code:

declare @t1 datetime, @n int

set @t1=getdate()
set @n=100 -- (**)
while @n>0 begin
     exec T1 1000 -- (*)
     set @n=@n-1 end
select datediff(ms,@t1,getdate())
GO

(*) The parameter to our procedure is varied from 10 to 1,000,000.

(**) If an execution time is too short, I repeat the same loop 10 or 100 times.

I run the code several times to get a result of a “warm” execution.
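
If you want to measure a "cold" execution instead, the caches can be cleared between runs. A hedged sketch (both DBCC commands require elevated permissions and should never be run on a production server):

checkpoint               -- flush dirty pages so the next command can discard them
dbcc dropcleanbuffers    -- empty the data cache
dbcc freeproccache       -- empty the plan cache, forcing recompilation
GO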

The results can be found in Table 1 below.

Now let’s try to improve our stored procedure by adding a primary key to the temporary table:

create procedure T2
     @total int
as
     create table #T (n int primary key, s varchar(128))
     insert into #T select n,s from NUM
          where n%100>0 and n<=@total
     declare @res varchar(128)
     select @res=max(s) from NUM
          where n<=@total and
               not exists(select * from #T
               where #T.n=NUM.n)
GO

Then, let's create a third version. It also relies on a clustered index, but this time the index is created AFTER the data has been inserted into the temporary table, which is usually the better approach:

create procedure T3
     @total int
as
     create table #T (n int, s varchar(128))
     insert into #T select n,s from NUM
          where n%100>0 and n<=@total
     create clustered index Tind on #T (n)
     declare @res varchar(128)
     select @res=max(s) from NUM
          where n<=@total and
               not exists(select * from #T
               where #T.n=NUM.n)
GO

Surprise! Execution is slower across the board; even the 10-record case now takes an extra 13 milliseconds or so. The problem is that the create index statement forces SQL Server to recompile the stored procedure at run time, which slows down the execution significantly.
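
On SQL Server 2005 you can see a hint of this in the plan cache: the plan_generation_num column of sys.dm_exec_query_stats grows each time a cached statement is recompiled. A rough sketch, to be run after exercising T3 a few times (the filter text is only illustrative):

select st.text, qs.plan_generation_num, qs.execution_count
from sys.dm_exec_query_stats qs
     cross apply sys.dm_exec_sql_text(qs.sql_handle) st
where st.text like '%create clustered index Tind%'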

Now let’s try the same using table variables:

create procedure V1
     @total int
as
     declare @V table (n int, s varchar(128))
     insert into @V select n,s from NUM
          where n%100>0 and n<=@total
     declare @res varchar(128)
     select @res=max(s) from NUM
          where n<=@total and
               not exists(select * from @V V
               where V.n=NUM.n)
GO

To our surprise, this version is not significantly faster than the one with the temporary table. This is the result of a special optimization SQL Server applies to create table #T statements at the very beginning of a stored procedure. Across the whole range of values, V1 performs the same as or better than T1.

Now let’s try the same with a primary key:

create procedure V2
     @total int
as
     declare @V table (n int primary key, s varchar(128))
     insert into @V select n,s from NUM
          where n%100>0 and n<=@total
     declare @res varchar(128)
     select @res=max(s) from NUM
          where n<=@total and
               not exists(select * from @V V
               where V.n=NUM.n)
GO

The result is much better, but T2 outruns this version.

Records     T1       T2      T3      V1      V2
10          0.7      1       13.5    0.6     0.8
100         1.2      1.7     14.2    1.2     1.3
1000        7.1      5.5     27      7       5.3
10000       72       57      82      71      48
100000      883      480     580     840     510
1000000     45056    6090    15220   20240   12010

Table 1: Using SQL Server 2000, time in ms.

But the real shock is when you try the same on SQL Server 2005:

Records     T1       T2      T3      V1        V2
10          0.5      0.5     5.3     0.2       0.2
100         2        1.2     6.4     61.8      2.5
1000        9.3      8.5     13.5    168       140
10000       67.4     79.2    71.3    17133     13910
100000      700      794     659     too long  too long
1000000     10556    8673    6440    too long  too long

Table 2: Using SQL Server 2005, time in ms.

In some cases, SQL Server 2005 was much faster than SQL Server 2000. But in many others, especially with huge amounts of data, the procedures that used table variables took far longer. In four cases I simply gave up waiting.



Conclusion

  1. There is no universal rule for when and where to use temporary tables or table variables. Try them both and experiment.
  2. In your tests, verify both ends of the spectrum: small numbers of records and huge data sets.
  3. Be careful when migrating to SQL Server 2005 if your stored procedures use complicated logic. The same code can run 10 to 100 times slower on SQL Server 2005!

For other SQL articles, please visit http://www.lakesidesql.com/.





3 Responses to “Temporary Tables vs. Table Variables and Their Effect on SQL Server Performance”

  1. Hi Dmitry Tsuranoff,

     I am trying to decide between table variables and temp tables for my requirement. I tried the above-mentioned T1 and V1 on SQL Server 2008. Surprisingly, V1 for 100,000 records takes more than 5 minutes. Please let me know if I missed anything; below is the code that I executed:

    create procedure V1
    @total int
    as
    declare @V table (n int, s varchar(128))
    declare @t1 datetime
    set @t1=getdate()
    insert into @V select n,s from NUM
    where n%100>0 and n<=@total
    declare @res varchar(128)
    select @res=max(s) from NUM
    where n<=@total and
    not exists(select * from @V V
    where V.n=NUM.n)

    select datediff(ms,@t1,getdate())
    GO

  2. I have to disagree with you.

     Temporary tables survive the end of the query, but table variables don't.

     Sometimes you want to store some data in query #1 and access it in query #2. In those cases you can't use table variables.

    About the performance:
     In the first case, I inserted 10,000 rows into a simple table. Then, in the second case, I inserted the same 10,000 rows into that table, but also inserted their ids into a temp table (with the "OUTPUT INSERTED.id INTO #tmpTable" clause in the insert query).

    Sadly, the second solution was six times slower.

  3. -- First clean the cache
     DBCC FREEPROCCACHE
     GO
     IF EXISTS (SELECT * FROM sys.sysobjects WHERE TYPE = 'P' AND NAME = 'up_SampleSP')
     BEGIN
     DROP PROCEDURE up_SampleSP
     END
     GO
     -- Create the stored procedure
     CREATE PROCEDURE up_SampleSP
     AS
     SELECT * into #mytemp
     FROM dept

     select * from #mytemp
     GO
     -- Check the query plan for the SQL batch
     -- [ Result: you will find that there is no ObjectName with the name up_SampleSP ]
     SELECT cp.objtype AS PlanType,
     OBJECT_NAME(st.objectid,st.dbid) AS ObjectName,
     cp.refcounts AS ReferenceCounts,
     cp.usecounts AS UseCounts,
     st.TEXT AS SQLBatch,
     qp.query_plan AS QueryPlan
     FROM sys.dm_exec_cached_plans AS cp
     CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
     CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st;
     GO
     -- Execute the stored procedure
     EXEC up_SampleSP
     GO
     -- Check the query plan for the SQL batch again
     -- [ Result: you will find that there is one entry with ObjectName up_SampleSP ]
     SELECT cp.objtype AS PlanType,
     OBJECT_NAME(st.objectid,st.dbid) AS ObjectName,
     cp.refcounts AS ReferenceCounts,
     cp.usecounts AS UseCounts,
     st.TEXT AS SQLBatch,
     qp.query_plan AS QueryPlan
     FROM sys.dm_exec_cached_plans AS cp
     CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
     CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st;
     GO
