# tables | SQL Server Performance Forums

SQL Server Performance Forum – Threads Archive

# tables

Hi, I have a couple of stored procs for reports which use tables that have millions of records. Would use of a #table be justified, or is a derived table more suitable? I understand a #table increases I/O activity, but using a derived table would mean running a query against millions of records.
Your question is not clear; it depends on what you are trying to do. If you have a derived table that is built from many tables and you want to use it in many places in the SP, then make use of a #table.

Madhivanan
Failing to plan is Planning to fail
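As a rough sketch of what the advice above amounts to (the table and column names here are placeholders, not from the thread): the same derived table repeated wherever it is needed, versus materialized once into a #temp table and reused.

```sql
-- Option A: derived table, repeated in every query that needs it
SELECT o.CustomerID, o.MonthlyTotal
FROM (
    SELECT CustomerID, SUM(Amount) AS MonthlyTotal
    FROM dbo.Orders                       -- hypothetical large table
    WHERE OrderDate >= '20070101'
    GROUP BY CustomerID
) AS o
WHERE o.MonthlyTotal > 1000;

-- Option B: materialize once into a #temp table, then reuse it
SELECT CustomerID, SUM(Amount) AS MonthlyTotal
INTO #MonthlyTotals
FROM dbo.Orders
WHERE OrderDate >= '20070101'
GROUP BY CustomerID;

SELECT CustomerID, MonthlyTotal
FROM #MonthlyTotals
WHERE MonthlyTotal > 1000;

SELECT COUNT(*)                           -- second reuse without re-reading Orders
FROM #MonthlyTotals;

DROP TABLE #MonthlyTotals;
```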
> quote: Originally posted by nirajshah7
> Hi, I have a couple of stored procs for reports which use tables that have millions of records. Would use of a #table be justified, or is a derived table more suitable? I understand a #table increases I/O activity, but using a derived table would mean running a query against millions of records.
SQL Server creates temp tables internally when you use derived tables…
You can test performance using both a derived table and a temp table…
Run the query/procedure with SET STATISTICS IO ON to see the I/O activity.
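A minimal way to compare the two (placeholder object names again): wrap each variant with SET STATISTICS IO ON and compare the logical/physical reads reported in the Messages pane of Management Studio.

```sql
SET STATISTICS IO ON;

-- Variant 1: derived table
SELECT d.CustomerID, SUM(d.Amount) AS Total
FROM (
    SELECT CustomerID, Amount
    FROM dbo.Orders                       -- hypothetical large table
    WHERE OrderDate >= '20070101'
) AS d
GROUP BY d.CustomerID;

-- Variant 2: #temp table
SELECT CustomerID, Amount
INTO #Filtered
FROM dbo.Orders
WHERE OrderDate >= '20070101';

SELECT CustomerID, SUM(Amount) AS Total
FROM #Filtered
GROUP BY CustomerID;

DROP TABLE #Filtered;

SET STATISTICS IO OFF;
```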
MohammedU.
Moderator
SQL-Server-Performance.com
All postings are provided “AS IS” with no warranties for accuracy.

Would you mind posting your code so we don’t have to shoot in the dark? :)

--
Frank Kalis
Microsoft SQL Server MVP
Contributing Editor, Writer & Forum Moderator
http://www.sql-server-performance.com
Webmaster: http://www.insidesql.de
> quote: I understand a #table increases I/O activity, but using a derived table would mean running a query against millions of records.
True. Also refer to the execution plans, and use PERFMON to capture server activity in this case.

Satya SKJ
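If it helps, one way to capture the actual plan from a script is shown below (the query is a placeholder); in PERFMON, disk and buffer counters such as PhysicalDisk: Avg. Disk sec/Read and SQLServer:Buffer Manager: Page life expectancy are common starting points.

```sql
-- Returns the actual execution plan as XML alongside the results
SET STATISTICS XML ON;

SELECT CustomerID, SUM(Amount) AS Total
FROM dbo.Orders                           -- placeholder table
WHERE OrderDate >= '20070101'
GROUP BY CustomerID;

SET STATISTICS XML OFF;
```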
Microsoft SQL Server MVP
Writer, Contributing Editor & Moderator
http://www.SQL-Server-Performance.Com
This posting is provided AS IS with no rights for the sake of knowledge sharing. Knowledge is of two kinds. We know a subject ourselves or we know where we can find information on it.
Is your report listing millions of rows? If not, then you must be filtering. ;)

So the main question must be whether the tables have the proper indexes for your filter criteria.

Also, what kind of processing does your SP do?
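For instance, if the report filters on a date range, a supporting index along these lines (hypothetical table and column names) is usually the first thing to check:

```sql
-- Covering index for a date-range filter; names are placeholders
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate
ON dbo.Orders (OrderDate)
INCLUDE (CustomerID, Amount);
```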
Hello,
I am facing a similar kind of issue. I have one base table with about 10 million rows. We fetch data from this base table into a #temp table, depending on some condition (a date). Then we summarize that data into some variables (procedure variables) and update some other base table from those variables.
This job runs on SQL Server every half hour, which is putting my tempdb under pressure. I am also facing I/O issues.
I have tried to optimize the query by reading the query plan and adding some more indexes on the base table as well as the temp table, but performance and tempdb are still a problem.
So can you guys suggest a better way to optimize my script? Should I use a table variable? Thanks in advance.
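Roughly, the pattern described above looks like this (all object names are placeholders standing in for the real tables):

```sql
DECLARE @WindowStart datetime, @WindowEnd datetime,
        @RowCount int, @TotalAmount money;
SELECT @WindowStart = DATEADD(minute, -30, GETDATE()),
       @WindowEnd   = GETDATE();

-- 1. Pull the rows for the current half-hour window into a #temp table
SELECT OrderID, CustomerID, Amount
INTO #Batch
FROM dbo.BaseTable                        -- the ~10 million row table
WHERE OrderDate >= @WindowStart
  AND OrderDate <  @WindowEnd;

-- 2. Summarize the batch into procedure variables
SELECT @RowCount    = COUNT(*),
       @TotalAmount = SUM(Amount)
FROM #Batch;

-- 3. Write the summary back to another base table
UPDATE dbo.SummaryTable
SET RowsProcessed   = @RowCount,
    AmountProcessed = @TotalAmount
WHERE SummaryDate = @WindowStart;

DROP TABLE #Batch;
```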

Without seeing your code, it is hard to suggest anything…
If you can post the code, someone will help you…
MohammedU.
Microsoft SQL Server MVP
Moderator
SQL-Server-Performance.com
All postings are provided “AS IS” with no warranties for accuracy.

…and please start a new thread for your question instead of hijacking another thread. You will improve your chances of getting good answers. :)

--
Frank Kalis
Microsoft SQL Server MVP
Contributing Editor, Writer & Forum Moderator
http://www.sql-server-performance.com
Webmaster: http://www.insidesql.de