Best Way to Browse Results N per Page

Posted on 2011-03-11
Medium Priority
Last Modified: 2012-05-11
Which is the best solution to select and browse results N per page?

Let us suppose I have a database with 100,000 records and I want to browse them 10 per page; should I select for each page all 100,000 records and then show them using startrow and maxrows?

Is it possible to select, for each page, only 10 of them, using something like the MySQL "LIMIT" command?

I use ColdFusion + SQL Server.
Question by:Giambattista
LVL 39

Accepted Solution

BrandonGalderisi earned 1500 total points
ID: 35114229
The problem you will run into is that in order to limit your results to, say, page 2 with 10 records per page, the higher the page number, the less efficient it becomes.

SQL 2000
opt 1. SQL Server sorts the full result set and returns records 1-20; the application then throws out 1-10.
ex. select top 20 * from TableA order by Id
negative: the higher the page, the more records are returned for the app to throw out.

opt 2. SQL Server sorts the full result set, throws out records 1-10, and returns 11-20.
ex. select * from (select top 10 * from (select top 20 * from TableA order by Id) a order by Id desc) b order by Id
negative: the higher the page, the more records SQL Server must process, and a triple sort occurs.

SQL 2005+
opt 1. Use a derived table with the row_number() function to sort the records, assign each row a number, and select only the wanted range from the derived table.
ex. select * from (select *, row_number() over(order by Id) OrdF from TableA) a where OrdF between 11 and 20
Downside: SQL Server must still sort the entire record set and, for each page, skip the earlier rows; the row_number() function is evaluated over all records.

No matter which option you use above, sort the records matching the search ONE TIME. Then store those search results in a temporary table (note: not a #temp table, but a temporary cache table). The table should be ordered by the correct value, with an identity column as the clustered primary key. Subsequent pages can then be retrieved from the static cache.
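To make the cache-table idea concrete, here is a minimal sketch. The table and column names (SearchCache, RowNum, NewsId) and the @page variable are illustrative, not from the thread; a real implementation would also key the cache by search or session.

```sql
-- Run ONCE per search: sort the matching rows a single time and
-- store them with an identity column as the clustered primary key.
CREATE TABLE SearchCache (
    RowNum int IDENTITY(1,1) PRIMARY KEY CLUSTERED,
    NewsId int NOT NULL
);

INSERT INTO SearchCache (NewsId)
SELECT news_id
FROM news
WHERE news_title LIKE '%keyword%'   -- the search criteria
ORDER BY data DESC;                 -- identity values follow this order

-- Every subsequent page is then a cheap range seek on the
-- clustered key (page @page, 10 rows per page):
SELECT n.news_title
FROM SearchCache c
JOIN news n ON n.news_id = c.NewsId
WHERE c.RowNum BETWEEN (@page - 1) * 10 + 1 AND @page * 10
ORDER BY c.RowNum;
```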

Any questions, please ask.
LVL 39

Expert Comment

ID: 35114450
@Brandon, great explanation of the options!  I'm curious what you mean by a temporary cache table?

One other ColdFusion/HTML option I have used is to fetch all the IDs (primary keys) of the table and keep them in a form field or session variable, in the correct sort order.  Then when going to page 30, you just jump to the right position in the list, grab the next 10 IDs, and fetch all the detail you need using those IDs.

I don't think this is a great idea for 100,000 records, but it works well for a thousand.  I agree that using a database table would be best; I'm just curious how you manage the session for that.
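A sketch of that two-step approach on the SQL side (column names are illustrative); the ColdFusion layer would hold the ordered ID list and slice out 10 per page:

```sql
-- Step 1, run once: fetch only the ordered primary keys and keep
-- the list in a session variable or hidden form field.
SELECT news_id
FROM news
ORDER BY data DESC;

-- Step 2, per page: the application slices the 10 IDs for the
-- requested page out of the list and fetches only those rows.
SELECT news_id, news_title
FROM news
WHERE news_id IN (291, 292, 293, 294, 295, 296, 297, 298, 299, 300)
ORDER BY data DESC;  -- IN does not preserve order, so re-sort
```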

Author Comment

ID: 35115905
Thank you for your answers. I was actually trying to use this solution.

Let us suppose that the 100,000 records are NEWS.

Let us suppose also that I want to order them by the datetime column data.

Page 1) select top 10 news_title from news order by data desc
set last_data = datetime of the 10th news

Page 2) select top 10 news_title from news where data < last_data order by data desc
set last_data = datetime of the 20th news

Page 3) select top 10 news_title from news where data < last_data order by data desc
set last_data = datetime of the 30th news


Page N) select top 10 news_title from news where data < last_data order by data desc
set last_data = datetime of the (N x 10)th news

Will it work?

LVL 39

Expert Comment

ID: 35116770
Yes.  There are a couple of caveats with that solution.  First, the date column must be indexed to be efficient for searching, and news_title must be included in the index or the index likely won't be used.  Also, if new results come in while you are paging, they won't appear in your results; that's also an issue with the cached-search method, but it is worth pointing out.  My method is more for paging through search results than for paging through an entire table.  Instead of using a datetime column, assuming records are always inserted in chronological order, you could just sort on the clustered identity column if there is one.
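Assuming the news table from the question, the covering index and the "next page" query could look like this (a sketch; with ORDER BY data DESC the next page is everything older than the last row shown):

```sql
-- Covering index: data drives the seek, and news_title is
-- INCLUDEd so the index alone can satisfy the query (SQL 2005+).
CREATE INDEX IX_news_data
    ON news (data DESC)
    INCLUDE (news_title);

-- Next page: seek to the rows older than the last one displayed.
SELECT TOP 10 news_title, data
FROM news
WHERE data < @last_data
ORDER BY data DESC;
```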
LVL 39

Expert Comment

ID: 35116902

No, I don't really get the example.

It seems you are setting the last_data timestamp based on the news articles already viewed.

However, on a user-by-user basis, some users may never have seen those stories while others have.  Also, if the user wants to go back a page to see the articles on the previous page, you have lost track of which ones those were.

If you want to do something like that with the data timestamps, you could use a separate table that stores the session ID and a counter (1, 2, 3...) for the page number each news article was displayed on.  So the first 10 news items displayed would get a 1, and the next page displayed would get a 2.  That way you can page back; when paging forward, you just grab the next top 10 by date.  But it seems to me you have to do it by session_id so that everyone can see the top 10 news stories of the day.
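One way to sketch that per-session tracking table (all names are illustrative):

```sql
CREATE TABLE SessionPaging (
    session_id varchar(64) NOT NULL,
    page_no    int         NOT NULL,
    news_id    int         NOT NULL,
    PRIMARY KEY (session_id, page_no, news_id)
);

-- After rendering a page, remember which articles it contained so
-- "back" can replay exactly that page for this session.
INSERT INTO SessionPaging (session_id, page_no, news_id)
SELECT TOP 10 @session_id, @page, news_id
FROM news
WHERE data < @last_data
ORDER BY data DESC;

-- Paging back: re-fetch exactly what page (@page - 1) showed.
SELECT n.news_title
FROM SessionPaging p
JOIN news n ON n.news_id = p.news_id
WHERE p.session_id = @session_id
  AND p.page_no = @page - 1;
```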

LVL 23

Expert Comment

by:Saqib Khan
ID: 35117473
10K records is NOT too much... I have done the row_number() paging technique on way more than 10K records and performance was excellent. Here is an article which explains it in detail.

LVL 39

Expert Comment

ID: 35117507

 it's 100k

LVL 39

Expert Comment

ID: 35117510

.. but row_number() could be viable at that size as well... give it a try
LVL 53

Expert Comment

ID: 35118699
"curious what you mean by temporary cache table?"

I'm curious too. I'm not familiar with it, but it sounds intriguing.
LVL 23

Expert Comment

by:Saqib Khan
ID: 35125607
Yes, 100K should be fine as well... give it a shot; it depends on how much data you are returning, etc.

Author Comment

ID: 35152867
I'm sorry I still have not assigned points, but I have to study a little to understand all you wrote ... :-) ... Thanks anyway to all.

Author Closing Comment

ID: 35401597
I'm not sure I completely understand the solution.

