SQL Server Performance

Select Based on List of Random Values

Discussion in 'Performance-Related Article Discussions' started by shanetasker, May 5, 2006.

  1. shanetasker New Member

  2. Madhivanan Moderator

    Well. Other approach would be

    Declare @t table(names varchar(100))
    Insert into @t
    Select 'Davolio' union all select 'Nancy' union all
    select 'Fuller' union all select 'Peacock' union all
    select 'Buchanan' union all select 'Suyama' union all
    select 'King'

    Declare @names varchar(8000)
    set @names='Fuller,Peacock,King'

    Select names from @t where ','+@names+',' like '%,'+names+',%'


    Madhivanan

    Failing to plan is Planning to fail
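
    The delimiter-wrapping trick above (wrapping both the comma-separated list and each candidate value in commas before comparing) can be illustrated outside SQL as well. A minimal Python sketch of the same idea, using the sample names from the post:

```python
def in_csv_list(value, csv_list):
    """Mimic the T-SQL test: ','+@names+',' LIKE '%,'+names+',%'.

    Wrapping both sides in commas ensures whole-item matches only,
    so 'King' cannot accidentally match inside 'Kingsley'.
    """
    return ("," + value + ",") in ("," + csv_list + ",")

names = ["Davolio", "Nancy", "Fuller", "Peacock", "Buchanan", "Suyama", "King"]
wanted = "Fuller,Peacock,King"

# Keep only the names that appear as whole items in the CSV list.
matches = [n for n in names if in_csv_list(n, wanted)]
print(matches)  # -> ['Fuller', 'Peacock', 'King']
```

    The same caveat applies in T-SQL: stray spaces in the list (e.g. 'Fuller, Peacock') would break the match, since ',Peacock,' does not appear in ',Fuller, Peacock,'.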
  3. FrankKalis Moderator

  4. mmarovic Active Member

    There are many more possible implementations that are not mentioned in the article. The most complete discussion I found is in Erland Sommarskog's famous article "Arrays and Lists in SQL Server". However, the last implementation from my article is not mentioned there, and I found it the most efficient for lists of values drawn from a small enough domain.

    Madhivanan, your implementation is interesting. I guess your intention was to apply the where clause against the orderStatuses table and then join it with the orders table. If so, that is a clever idea. I don't think I'll have time to test it today and compare it with the solutions from the article. However, the testing script is attached; you can download it, modify it to include your solution, and test it yourself.
  5. mmarovic Active Member

    Hi Frank, I know about that article. The last implementation from my article was not there the last time I checked. If Madhivanan's implementation is there, I managed to miss it somehow. I believe it is very comparable with my solution, possibly faster, but without the overhead of maintaining an additional list-members table.
  6. FrankKalis Moderator

    As I said, I just discovered your article today and the first thing that came to my mind was Erland's article. I can't say right now that one article is "better" than the other, but it's always good to have several different people writing about one topic, because nobody's perfect. And different people tackle problems in different ways, so that's usually an excellent way to pick up knowledge from someone else. :)

    And thanks for providing the test script.

    --
    Frank Kalis
    Microsoft SQL Server MVP
    http://www.insidesql.de
    Blogged today yet? http://www.insidesql.de/blogs
    I support PASS Deutschland e.V. http://www.sqlpass.de
  7. mmarovic Active Member

    Erland's article is great, no doubt about that. However, when I tested the solutions from there, I found the UDF-based solution not efficient enough for my needs. I also wanted to avoid dynamic SQL, because we didn't want to grant read rights to the application login. I did that research while preparing a solution for a former employer. I applied the last implementation mentioned in the article.

    About the test script: that was your suggestion from the "Running totals" article discussion, and I found it to be a very good idea. :)
  8. Madhivanan Moderator

    The logic in my script avoids splitting the data or the use of dynamic SQL. :)

    Madhivanan

    Failing to plan is Planning to fail
  9. mmarovic Active Member

    Madhivanan, I couldn't resist, so I already tested your solution. As I expected, the performance was similar to the last implementation from the article, even better: for most test cases it was slightly faster. The difference was small, but bearing in mind that your solution doesn't require the hassle of a list-members table, from now on I will recommend your solution as the best for this problem when a relatively small lookup table is available. Here is the code I applied for testing:


    if exists(select * from sysobjects where type = 'P' and name = 'orders_sel_by_userID_orderStatusList')
    drop procedure orders_sel_by_userID_orderStatusList
    go
    create procedure orders_sel_by_userID_orderStatusList
    @userId int,
    @orderStatusList varchar(100)
    as begin
    set noCount on
    Select count(*)
    from orders o
    join dbo.orderStatuses l on o.orderStatusID = l.OrderStatusID
    where o.userID = @userID
    and ',' + @orderStatusList + ',' like ',' + cast(l.OrderStatusID as varchar(10)) + ','
    end

  10. Madhivanan Moderator

    I think you need to use %

    and ',' + @orderStatusList + ',' like '%,' + cast(l.OrderStatusID as varchar(10)) + ',%'

    Madhivanan

    Failing to plan is Planning to fail
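
    The reason the % wildcards matter: without them, LIKE must match the entire wrapped list against a single wrapped value, so only a one-item list could ever match. With '%,' and ',%' the comparison becomes a substring test. A small Python sketch of the difference, using a hypothetical status list:

```python
status_list = "2,5,9"
status_id = "5"

wrapped_list = "," + status_list + ","  # ',2,5,9,'
wrapped_id = "," + status_id + ","      # ',5,'

# LIKE without %: the pattern must cover the ENTIRE string,
# i.e. ',2,5,9,' = ',5,' -- false for any multi-item list.
like_without_wildcards = (wrapped_list == wrapped_id)

# LIKE with '%,' ... ',%': the pattern may match anywhere inside,
# which is exactly a substring test.
like_with_wildcards = (wrapped_id in wrapped_list)

print(like_without_wildcards, like_with_wildcards)  # -> False True
```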
  11. mmarovic Active Member

    That's right, thanks for the correction.
  12. mmarovic Active Member

    I've just finished testing with the fixed sp version. Now its speed is comparable with that of the dynamic SQL solution. However, I haven't checked whether the optimal execution plan is used. I plan to force it using the code below and test again:


    if exists(select * from sysobjects where type = 'P' and name = 'orders_sel_by_userID_orderStatusList')
    drop procedure orders_sel_by_userID_orderStatusList
    go
    create procedure orders_sel_by_userID_orderStatusList
    @userId int,
    @orderStatusList varchar(100)
    as begin
    set noCount on
    Select count(*)
    from dbo.orderStatuses l
    inner loop join orders o on o.orderStatusID = l.OrderStatusID
    where o.userID = @userID
    and ',' + @orderStatusList + ',' like '%,' + cast(l.OrderStatusID as varchar(10)) + ',%'
    option (force order)
    end

  13. mmarovic Active Member

    Forcing the execution plan didn't affect the testing results significantly. So the conclusion is that the implementation proposed by Madhivanan offers performance similar to dynamic SQL, but with the advantage that no read rights are necessary to run it. It is slower than a permanent list-members table when the lookup table is small. However, if the lookup table contains hundreds of rows, I guess this solution might be better. I was not aware of this implementation when I wrote the article; if I had been, I would certainly have included it there.
  14. Madhivanan Moderator

    Thanks Mirko for the test :)

    Madhivanan

    Failing to plan is Planning to fail
  15. mmarovic Active Member

    You are welcome. :)
