Detecting row 51 – 100 | SQL Server Performance Forums

SQL Server Performance Forum – Threads Archive

Detecting row 51 – 100

Hi, I have found a solution for getting rows 51 – 100 (I use this statement for navigating through data in HTML pages)… Now I wanted to ask you whether this solution could be optimized. Here is my code:

SELECT * FROM (
    SELECT TOP 50 * FROM (
        SELECT TOP 100 * FROM Table WHERE Col2 = 'something' ORDER BY Col1
    ) D1 ORDER BY Col1 DESC
) D2 ORDER BY Col1

Is there any way to make this more performant? But please keep in mind that the WHERE and ORDER BY clauses are created dynamically…
Dimi

Did you analyze the execution plan?
Luis Martin
Moderator
SQL-Server-Performance.com
If you have a unique field you can try something like:

set rowcount x   -- where x is the first row you want to have, in this case 51
select @unique_field = unique_field
from my_table
where ...
order by column

set rowcount y   -- where y is the number of rows you wish to return
select *
from my_table
where ...
and unique_field >= @unique_field
order by column

Make sure you have indexes on both the order by column and the unique column.
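Put together, a minimal runnable sketch of that idea might look like this (my_table, unique_field and the WHERE clause are placeholders, and for simplicity the page is ordered by the unique field itself):

DECLARE @unique_field int

SET ROWCOUNT 51                      -- x: the first row of the wanted page
SELECT @unique_field = unique_field
FROM my_table
WHERE Col2 = 'something'
ORDER BY unique_field                -- @unique_field ends up holding the 51st value

SET ROWCOUNT 50                      -- y: the number of rows to return
SELECT *
FROM my_table
WHERE Col2 = 'something'
  AND unique_field >= @unique_field
ORDER BY unique_field

SET ROWCOUNT 0                       -- reset so later statements are unaffected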
How about using a cursor to scroll through the rows until a counter variable reaches 51, then start collecting rows and stop when the counter reaches your max value? A unique field would help here (a rough sketch follows below). BCP also has first-row and last-row options, but that's not what you want here. Patrick
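A rough sketch of that cursor idea, purely as an illustration (MyTable, Col1 and Col2 are hypothetical names; as later replies point out, cursors tend to be the slowest option here):

DECLARE @counter int, @Col1 varchar(100)
SET @counter = 0

DECLARE page_cursor CURSOR FAST_FORWARD FOR
    SELECT Col1 FROM MyTable WHERE Col2 = 'something' ORDER BY Col1

OPEN page_cursor
FETCH NEXT FROM page_cursor INTO @Col1

WHILE @@FETCH_STATUS = 0 AND @counter < 100
BEGIN
    SET @counter = @counter + 1
    IF @counter > 50
        PRINT @Col1              -- rows 51-100; in practice insert into a table variable
    FETCH NEXT FROM page_cursor INTO @Col1
END

CLOSE page_cursor
DEALLOCATE page_cursor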


You could also try: set rowcount 100, select the rows into a temp table in the right order, making sure the temp table has an identity column, and then select out of the temp table where the identity value is between 51 and 100 (see the sketch below). Cheers
Twan
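A sketch of the temp-table idea described above, with MyTable, Col1 and Col2 as placeholders (note that the identity numbering order is not strictly guaranteed with SELECT ... INTO ... ORDER BY, so treat this as an illustration only):

SET ROWCOUNT 100

-- copy the first 100 matching rows, numbering them with an identity column
SELECT IDENTITY(int, 1, 1) AS rownum, Col1, Col2
INTO #page
FROM MyTable
WHERE Col2 = 'something'
ORDER BY Col1

SET ROWCOUNT 0

-- the wanted page is then just a range of identity values
SELECT Col1, Col2
FROM #page
WHERE rownum BETWEEN 51 AND 100
ORDER BY rownum

DROP TABLE #page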
I suppose assigning a column x times to the same variable seems useless, but in this case it might be a fast way. At the end of the query the variable will hold the last row's value, which is our starting point.
This way you avoid creating a temporary table with all its implications (time, space, recompiles…).
Tested it on 940,000 rows with set rowcount 900,000. Assigning to a variable took 5 seconds; a temp table took 2 minutes and 5 seconds.

Cursors are really slow… Looks like Dimi's idea is the way to go. This is what I tried:

TABLE: tblPerson
INDEX: non-clustered on Person_id (PK)
Number of records in table: 238,000

T-SQL:
SELECT * FROM (
    SELECT TOP 50 * FROM (
        SELECT TOP 100 * FROM tblPerson ORDER BY Person_id
    ) d1 ORDER BY Person_id DESC
) d2
ORDER BY Person_id

Execution plan:
tblPerson.PK_tblPerson Index Scan   Cost: 16%
Bookmark Lookup                     Cost: 24%
Top                                 Cost: 0%
Sort/Top N Sort                     Cost: 30%
Sort                                Cost: 29%
Select                              Cost: 0%
Results from SQL Profiler:
Reads: 435
Duration: 16 ms
Observation:
Did the "Index Scan" actually use the index, or just scan it?
However, the query itself taking only 16 ms was good enough!
So, what do the experts think?
quote: Originally posted by DTDimi
Hi, I have found a solution for getting rows 51 – 100 (I use this statement for navigating through data in HTML pages)… Now I wanted to ask you whether this solution could be optimized. Here is my code:

SELECT * FROM (
    SELECT TOP 50 * FROM (
        SELECT TOP 100 * FROM Table WHERE Col2 = 'something' ORDER BY Col1
    ) D1 ORDER BY Col1 DESC
) D2 ORDER BY Col1

Is there any way to make this more performant? But please keep in mind that the WHERE and ORDER BY clauses are created dynamically…
Dimi

What about something like this???

declare @highval int

set rowcount 50
select @highval = Col1 FROM Table WHERE Col2 = 'something' ORDER BY Col1
-- after 50 rows @highval holds the 50th value of Col1

select * FROM Table WHERE Col2 = 'something' and Col1 > @highval ORDER BY Col1

set rowcount 0

SELECT TOP 50 * FROM Table
WHERE TableRowId NOT IN (
    SELECT TOP 50 TableRowId FROM Table WHERE Col2 = 'something' ORDER BY Col1
)
AND Col2 = 'something'
ORDER BY Col1

Is this faster?
Of course requesting the 2nd page like this is fast. But execute this on a table with 500,000 records or more and select the last page:

SELECT TOP 50 * FROM
(
    SELECT TOP 451000 * FROM TABLENAME ORDER BY INDEXEDID DESC
) as a
ORDER BY INDEXEDID ASC

This took 6 seconds on a dual Xeon with 2 GB of RAM. But the beauty is in the simplicity here, and I wouldn't know of a better (faster) way to do this.

Based on this solution, I found a faster way:

DECLARE @MIN int
DECLARE @MAX int

-- find the id range of the page
SELECT @MIN = min(INDEXEDID), @MAX = max(INDEXEDID) FROM
(
    SELECT TOP 50 INDEXEDID FROM
    (
        SELECT TOP 449000 INDEXEDID FROM TABLENAME ORDER BY INDEXEDID DESC
    ) as a
) as b

-- select using min and max id
SELECT * FROM TABLENAME WHERE INDEXEDID BETWEEN @MIN AND @MAX

Result: sub-second response
Wow, is this really the only way to do it in MS SQL Server?

Why not just SELECT * FROM TABLE LIMIT 50,100 like MySQL?

German
LIMIT is not a recognised keyword in SQL Server, and the approach defined above is the ideal way.
I will not debate the advantages/features of MySQL vs. SQL Server as it's a never-ending topic. Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
This is how you would do it in Oracle:

select order_id, line_item_id
from (select rownum num, order_id, line_item_id
      from (select order_id, line_item_id
            from order_items
            order by order_id, line_item_id))
where num between 51 and 100

(The innermost query has to do the ORDER BY before rownum is assigned, otherwise the row numbers don't follow the sort order.) Note that rownum is a pseudocolumn available for use in every query. It's little things like this that make Oracle so much easier to work with than SQL Server. -Ken

Unfortunately at > 10 times the cost of purchase, hardware, development and support… ;-)

Seriously though, all databases have their pros and cons. Language extensions are usually a differentiating factor, and vendors will always leapfrog each other. SQL 2005 will again leapfrog the other databases in terms of functionality and language extensions, and then before long another vendor will leapfrog them. This forum is not really about discussing the pros and cons of database vendors; there are plenty of chat rooms for that sort of purpose. This forum is about knowledge sharing about a tool that we use in our everyday working life.

Cheers
Twan
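For what it's worth, ROW_NUMBER() is one of those SQL 2005 extensions, and it makes this kind of paging much more direct. A sketch only, assuming SQL Server 2005 or later; MyTable, Col1 and Col2 stand in for the placeholder table and columns used earlier in the thread:

SELECT *
FROM (
    SELECT *, ROW_NUMBER() OVER (ORDER BY Col1) AS rownum
    FROM MyTable
    WHERE Col2 = 'something'
) AS numbered
WHERE rownum BETWEEN 51 AND 100   -- rows 51-100 of the filtered, sorted set
ORDER BY Col1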