
SQL Server Performance Forum – Threads Archive

Temp Tables or a huge outer join?

I have two tables joined on an indexed field; one has over 2 million records, the other over 1 million. The problem I'm having is that users are permitted to query these tables by date range (the date field is also indexed), and it sometimes takes forever to return the data, group it, do a few sums and counts, etc. My question is: should I pull the data for the date range into a temp table, do all of my sums, counts, and grouping there, and THEN output the data? Or is there a good way to optimize the query itself to speed it up? The SQL Server box is a monster of a machine (a big brand-new PowerEdge server) and it runs great, so I think the problem lies in the SQL query I've written. Additionally, this query gets hit quite often. If I DO use temp tables, is it bad to create a temp table for a query that could be run every 60 seconds or so by users? Thanks for any assistance, guys!
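For illustration only, a query of roughly the shape described might look like the sketch below; dbo.OrderHeader, dbo.OrderDetail, OrderID, OrderDate, CustomerID and LineTotal are made-up names standing in for the real schema:

-- Sketch of the kind of date-range aggregation described above (hypothetical names)
DECLARE @StartDate datetime, @EndDate datetime;
SET @StartDate = '20050101';
SET @EndDate   = '20050201';

SELECT  h.CustomerID,
        COUNT(d.OrderID) AS DetailRows,
        SUM(d.LineTotal) AS TotalAmount
FROM    dbo.OrderHeader AS h
LEFT OUTER JOIN dbo.OrderDetail AS d
        ON d.OrderID = h.OrderID
WHERE   h.OrderDate >= @StartDate
    AND h.OrderDate <  @EndDate   -- half-open range avoids time-portion edge cases
GROUP BY h.CustomerID;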
Indexing a date column (assuming the column also stores a time portion) is not always worth it. When you query on a date you generally don't know the time, so the query ends up being a >, < or BETWEEN range predicate, and it's likely to use a table scan anyway. If the time portion is always 00:00:00.000, then you can use an = and the index may be of value. Rather than a temp table, consider using a table variable, which should perform better. Test the query with and without the table variable to see which performs better.
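As a way to run that test, the two versions could be sketched like this (same made-up table and column names as above; run each batch separately and compare execution plans and timings):

-- Common parameters (hypothetical values)
DECLARE @StartDate datetime, @EndDate datetime;
SET @StartDate = '20050101';
SET @EndDate   = '20050201';

-- Version 1: stage the date-range rows in a table variable, then aggregate
DECLARE @Staged TABLE (CustomerID int, LineTotal money);

INSERT INTO @Staged (CustomerID, LineTotal)
SELECT h.CustomerID, d.LineTotal
FROM dbo.OrderHeader AS h
LEFT OUTER JOIN dbo.OrderDetail AS d ON d.OrderID = h.OrderID
WHERE h.OrderDate >= @StartDate AND h.OrderDate < @EndDate;

SELECT CustomerID, COUNT(*) AS RowCnt, SUM(LineTotal) AS TotalAmount
FROM @Staged
GROUP BY CustomerID;

-- Version 2: same idea with a local temp table; #tables are private to the
-- session, so frequent concurrent use costs tempdb activity, not name clashes
CREATE TABLE #Staged (CustomerID int, LineTotal money);

INSERT INTO #Staged (CustomerID, LineTotal)
SELECT h.CustomerID, d.LineTotal
FROM dbo.OrderHeader AS h
LEFT OUTER JOIN dbo.OrderDetail AS d ON d.OrderID = h.OrderID
WHERE h.OrderDate >= @StartDate AND h.OrderDate < @EndDate;

SELECT CustomerID, COUNT(*) AS RowCnt, SUM(LineTotal) AS TotalAmount
FROM #Staged
GROUP BY CustomerID;

DROP TABLE #Staged;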