SQL Server Performance Forum – Threads Archive

Complex Query Tuning

I’m running into a problem with the following query but before I overwhelm you with the actual query, I need to explain some things:
  • The problem I am actually running into with this example in particular is "[Microsoft][ODBC SQL Server Driver][SQL Server]There is insufficient system memory to run this query."
  • The actual environment generating and executing the query is ColdFusion, and before anyone says CF sucks, changing the environment IS NOT AN OPTION (unless, of course, you are willing to port close to a million lines of code to ASP or another environment and debug it for free!)
  • The entire where clause is dynamic so I’m not sure how much benefit would be gained by creating a stored procedure to execute the select
  • The query normally works with narrower WHERE criteria (like fewer MID or ArchiveID values)
  • I’ve eliminated similar "insufficient memory" errors on other queries by creating temporary tables to hold the IN values, but I’m not certain this is the best approach (roughly sketched just after this list)
  • What would be the "best" way to eliminate the immediate problem of "insufficient memory"?
  • Is there a better solution for the CASE clauses? (see the sketch right after the query below)
  • Lastly, assume that all the required indexes exist and are optimized (we’ve done quite a bit of performance tuning on the indexes)
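The temporary-table workaround mentioned above looks roughly like this; a minimal sketch with simplified names (the real list of inserts is generated dynamically by the CF code):

create table #MidList (MID int primary key)
insert into #MidList (MID) values (10611)
insert into #MidList (MID) values (10256)
insert into #MidList (MID) values (10249)
-- ...one insert per MID in the original list
select t.*
from [InBank]..[ArchTrans] t (NOLOCK)
where t.MID in (select MID from #MidList)
drop table #MidList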

Here is the big ugly offending query:
select
ArchivesID, Id, t.Mid, Invoice, CardAbb, CardNumber, CardExpDate, TerminalID, AuthOnly,
case
when CurrencyCode is null or t.CurrencyCode = '' then
'840'
else
CurrencyCode
end as CurrencyCode,
PrimaryAmount, SecondaryAmount, CustInfo, Clerk, BusDate, TranDate, TranTime, TotalAuth, Tolerance,
Auth3Resp, Auth3Code, PrimaryAmount + SecondaryAmount as TotalAmount, MerchantName,
case
when CardAbb = '??'
or (PrimaryAmount + SecondaryAmount = 0)
or (PrimaryAmount + SecondaryAmount > TotalAuth * Tolerance
and PrimaryAmount + SecondaryAmount - TotalAuth >= 1.00)
or (Abs(PrimaryAmount + SecondaryAmount) > 99999.99)
or (PrimaryAmount + SecondaryAmount = 0) then
'Y'
else
'N'
end as Problem
from
[InBank]..[ArchTrans] t (NOLOCK)
left outer join [InBank_0000001495]..[MERCHANT] m (NOLOCK) on (m.MID=t.MID)
where
t.MID in (
10611,10256,10249,10736,85191,10330,10553,10348,50740,50765,50732,50757,85241,10637,10355,
10371,10363,103218,10389,10397,85209,10405,10561,10579,10587,10595,85217,10603,40584,85225,
11791,85233,85258,85266,100511,10645,97394,97352,45716,10314,10280,10306,11817,10660,10520,
85274,10504,10462,10496,10470,10447,10686,10454,10488,40576,10652,10546,11809,10702,10710,
10413,97360,97378,97386,85282,10751,10744,85118,10538,10298,85183
)
and AuthOnly in ('N')
and ArchivesId in (
43552,43482,43481,43494,43493,43549,43512,43511,43470,43520,43469,43468,43505,43504,43484,
43483,43465,43464,43488,43487,43486,43485,43525,43547,43546,43539,43534,43533,43490,43489,
43514,43513,43480,43479,43492,43491,43503,43502,43475,43474,43473,43472,43471,43496,43495,
43498,43497,43527,43532,43531,43529,43528,43526,43499,43541,43540,43516,43515,43501,43500,
43530,43510,43548,43538,43537,43517,43536,43519,43518,43545,43544,43542,43553,43554,43522,
43521,43556,43555,43523,43467,43466,43477,43476,43509,43508,43543,43507,43506,43551,43550,
43478,43535,43524
)
and m.MID = t.MID
order by
MerchantName, Invoice, ID
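On the CASE-clause question above: the CurrencyCode expression can be collapsed with NULLIF/COALESCE, which behave the same way for NULL and empty strings. A minimal, self-contained sketch (the derived table only supplies sample values):

-- NULLIF turns '' into NULL; COALESCE then substitutes the default '840'
select coalesce(nullif(c.CurrencyCode, ''), '840') as CurrencyCode
from (
select cast(null as varchar(3)) as CurrencyCode
union all select ''
union all select '978'
) c
-- returns: 840, 840, 978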
Steve Sommers
www.shift4.com
quote:I’ve eliminated similar "insufficient memory" errors on other queries by creating temporary tables to hold the IN values, but I’m not certain this is the best approach
I agree with this approach, except I would use a table variable instead. Other than that, I suggest replacing "AuthOnly in ('N')" with "AuthOnly = 'N'" and removing "and m.MID = t.MID" from the where clause. Which table do the columns without a table alias in front of them come from?
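A minimal sketch of the table-variable version mmarovic suggests (the MID column type is an assumption based on the values in the query):

-- table variable holding the dynamic MID list; it lives only for the batch
declare @MidList table (MID int primary key)
insert into @MidList (MID) values (10611)
insert into @MidList (MID) values (10256)
-- ...one insert per MID in the dynamic list
select t.*
from [InBank]..[ArchTrans] t (NOLOCK)
where t.MID in (select MID from @MidList)
and t.AuthOnly = 'N'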
quote:Originally posted by mmarovic I agree with this approach, except I would use a table variable instead. Other than that, I suggest replacing "AuthOnly in ('N')" with "AuthOnly = 'N'" and removing "and m.MID = t.MID" from the where clause. Which table do the columns without a table alias in front of them come from?
I’ll have to research table variables; I’ve never used them. I will definitely remove the "and m.MID = t.MID"; it is left over from a prior version that did not use the "left outer join" to join the tables. Good catch, thanks. I’ll see about optimizing the "AuthOnly in ('N')" clause. The items in the list are actually dynamic. In this case only 'N' is selected, but the user can add other values. I’ve never found any reference that "xyz = x" is any faster than "xyz in (x)"; does it make a difference? Steve Sommers
www.shift4.com
quote:I’ve never found any reference that "xyz = x" is any faster than "xyz in (x)"; does it make a difference?
Maybe not. I just wasn’t sure whether the query optimizer is able to figure out that there is only one value in the list, so that it is equivalent to xyz = <only value from the list>. In that case it can use an index on xyz, or any combination of SARGs including one on xyz.
quote:I’ll have to research table variables reference; I’ve never used them.
It looks like they are a better choice for a small number of rows.
The couple of times I’ve run into "insufficient memory" errors were when the result set was huge (as in many wide rows). Do you have a way to run this query returning count(*) instead of the select list? This might help pinpoint whether the problem is with the execution or with the results. My guess is that the error occurs either with the initial result set, or with the sort operation that happens once all the records are found.
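Something along these lines, keeping the joins and WHERE clause of the failing query but dropping the wide select list and the ORDER BY (a sketch of the diagnostic; only the first few IN values are shown):

select count(*)
from [InBank]..[ArchTrans] t (NOLOCK)
left outer join [InBank_0000001495]..[MERCHANT] m (NOLOCK) on (m.MID = t.MID)
where t.MID in (10611, 10256, 10249)        -- full MID list goes here
and AuthOnly = 'N'
and ArchivesId in (43552, 43482, 43481)     -- full ArchivesId list goes here
-- if this succeeds, the memory pressure is coming from the wide rows and/or the sort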
-Hartmut5
There is a good chance that the result set is huge. I’m hoping to find a way to get around the result set size limitation and, if there are multiple ways, which I suspect there are, determine the best one. I think the reason this error is triggered depends on more factors than simply result set size. The reason I think this is that I can remove the WHERE clause and get a result set without issue every time, and that result set is bigger than the one the failed query would have returned. Thanks. Steve Sommers
www.shift4.com
Any success with table variables?
quote:Originally posted by mmarovic Any success with table variables?
Have not tried it yet. Last week was a hectic week. Hopefully tomorrow (Mondays are always hectic). Steve Sommers
www.shift4.com
You may also want to check for a valid ODBC driver between CF & SQL Server.
How about the memory settings on SQL Server?
Hope the latest service pack is applied to SQL Server to address a few issues with CF compatibility. Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
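For the memory settings Satya mentions, the current configuration can be checked with something like this (a sketch; config_value is what is set, run_value is what is in effect):

-- show advanced options so the memory settings are visible
exec sp_configure 'show advanced options', 1
reconfigure
exec sp_configure 'min server memory (MB)'
exec sp_configure 'max server memory (MB)'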
Okay, is this voodoo or is there a reason for it? In the example query that started this thread, I changed the WHERE clause to the following and POOF!, no more out-of-memory error:
where
t.MID in ( select MID from [InBank_0000001495]..[MERCHANT] where MID in (
10611,10256,10249,10736,85191,10330,10553,10348,50740,50765,50732,
50757,85241,10637,10355,10371,10363,103218,10389,10397,85209,10405,
10561,10579,10587,10595,85217,10603,40584,85225,11791,85233,85258,
85266,100511,10645,97394,97352,45716,10314,10280,10306,11817,10660,
10520,85274,10504,10462,10496,10470,10447,10686,10454,10488,40576,
10652,10546,11809,10702,10710,10413,97360,97378,97386,85282,10751,
10744,85118,10538,10298,85183 )
)
and AuthOnly in ('N')
and ArchivesId in ( select ID from [InBank_0000001495]..[ARCHIVES] where ID in (
43552,43482,43481,43494,43493,43549,43512,43511,43470,43520,43469,
43468,43505,43504,43484,43483,43465,43464,43488,43487,43486,43485,
43525,43547,43546,43539,43534,43533,43490,43489,43514,43513,43480,
43479,43492,43491,43503,43502,43475,43474,43473,43472,43471,43496,
43495,43498,43497,43527,43532,43531,43529,43528,43526,43499,43541,
43540,43516,43515,43501,43500,43530,43510,43548,43538,43537,43517,
43536,43519,43518,43545,43544,43542,43553,43554,43522,43521,43556,
43555,43523,43467,43466,43477,43476,43509,43508,43543,43507,43506,
43551,43550,43478,43535,43524 )
)
Basically, I changed the explicit value lists to sub-queries using the same value lists and my problem went away. In my mind, the sub-queries would seem more memory intensive, but I guess my mind is not in sync with the SQL gurus who wrote MSSQL. Anyone have any comments, especially if there are side effects associated with the above that I have not thought through?… Steve Sommers
www.shift4.com
Have you checked the execution plan? My guess is that you are getting the same effect here that you would get with a table variable: t.MID in (select MID from …) forces the query optimizer to build a worktable with the IDs first (the same goes for ArchivesId in (select ID …)). Then join logic is probably applied instead of scanning through the entire table. I don’t get why you use an outer instead of an inner join when you require MID to be from the MERCHANT table anyway. Is it again because the dynamically generated SQL may include this condition but doesn’t have to?
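The inner-join version mmarovic describes would look something like this (only a few columns and IN values are shown to keep the sketch short; column qualification left as in the original):

select ArchivesID, Id, t.MID, Invoice, MerchantName
from [InBank]..[ArchTrans] t (NOLOCK)
inner join [InBank_0000001495]..[MERCHANT] m (NOLOCK) on m.MID = t.MID
where t.MID in (10611, 10256, 10249)   -- full MID list or table variable goes here
and AuthOnly = 'N'
order by MerchantName, Invoice, Id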
quote:Originally posted by mmarovic Have you checked the execution plan? … I don’t get why you use an outer instead of an inner join when you require MID to be from the MERCHANT table anyway …
I’ll definitely compare the two execution plans; maybe this will get me more into the mindset of the SQL gurus. As to outer vs. inner join, in this particular case I could have used an inner join, and the reason has more to do with habit than with the dynamic SQL. In most of my queries an outer join is the preferred method because the data being brought in from the secondary tables (again, in most cases) is optional. This is another area I’ll look at optimizing. Thanks. Steve Sommers
www.shift4.com
This is not a particularly good habit where performance is concerned. This way you force SQL Server to select rows from the ArchTrans table first, which might not be the better execution plan.
Look at the execution plan. See if it was creating a temporary table and doing a hash join in the old query plan. It would be fun to see both execution plans if you can post them. MeanOldDBA
[email protected] When life gives you a lemon, fire the DBA.
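For comparing the two plans MeanOldDBA asks about, something like this in Query Analyzer works (a sketch; SET STATISTICS PROFILE ON would return the actual plan alongside the results instead):

-- return the estimated text plan instead of executing the statements
set showplan_text on
go
-- paste the original IN-list query here, then the sub-query version,
-- and compare the operators (worktables, hash vs. nested loops joins, the sort)
go
set showplan_text off
go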