Best Way to Browse Results N per Page

Posted on 2011-03-11
Medium Priority
Last Modified: 2012-05-11
Which is the best solution to select and browse results N per page?

Let us suppose I have a database with 100,000 records and I want to browse them 10 per page; should I select all 100,000 records for each page and then show the right slice using startrow and maxrows?

Is it possible to select, for each page, only 10 of them, using something like the MySQL "LIMIT" clause?

I use ColdFusion + SQL Server.
Question by:Giambattista
LVL 39

Accepted Solution

BrandonGalderisi earned 1500 total points
ID: 35114229
The problem you will run into is that, in order to limit your results to (let's say) page 2 with 10 records per page, the higher the page number, the less efficient the query becomes.

SQL 2000
opt 1. SQL Server sorts the 100 records and returns records 1-20; the application then throws out records 1-10.
ex. select top 20 * from TableA order by Id
negative: the higher the page, the more records are returned for the app to throw out.
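A runnable sketch of option 1, using Python's built-in sqlite3 with LIMIT playing the role of T-SQL's TOP (the table and data are made up): the query returns the first 20 rows and the application discards the first 10.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TableA (Id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO TableA (Id) VALUES (?)",
                 [(i,) for i in range(1, 101)])

page, per_page = 2, 10
# Fetch the first page*per_page rows in sorted order...
rows = conn.execute("SELECT Id FROM TableA ORDER BY Id LIMIT ?",
                    (page * per_page,)).fetchall()
# ...then the application throws away everything before the wanted page.
page_rows = [r[0] for r in rows[(page - 1) * per_page:]]
```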

opt 2. SQL Server sorts the 100 records, throws out 1-10 itself, and returns only 11-20.
ex. select * from (select top 10 * from (select top 20 * from TableA order by Id) a order by Id desc) b order by Id
negative: the higher the page, the more records SQL Server must sort, and a triple sort must occur.
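The same toy setup (Python sqlite3, LIMIT standing in for TOP) can sketch the triple sort of option 2: the inner query takes the first 20 by Id, the middle one keeps the last 10 of those by reversing the sort, and the outer one restores ascending order.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TableA (Id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO TableA (Id) VALUES (?)",
                 [(i,) for i in range(1, 101)])

# Three nested sorts: ascending top-20, descending top-10, ascending again.
rows = conn.execute("""
    SELECT Id FROM (
        SELECT Id FROM (
            SELECT Id FROM TableA ORDER BY Id LIMIT 20
        ) a ORDER BY Id DESC LIMIT 10
    ) b ORDER BY Id
""").fetchall()
page_rows = [r[0] for r in rows]
```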

SQL 2005+
opt 1. Use a derived table with the row_number() function to sort the records, assign row numbers 1-100, and select only the wanted range from the derived table.
ex. select * from (select *, row_number() over(order by Id) OrdF from TableA) a where OrdF between 11 and 20
negative: SQL Server must still sort the entire record set for every page; the row_number() function is evaluated over all records.
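Sketched with Python's sqlite3 (SQLite 3.25+ also supports row_number(); the table and column names are invented to mirror the example above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TableA (Id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO TableA (Id) VALUES (?)",
                 [(i,) for i in range(1, 101)])

page, per_page = 2, 10
# row_number() is evaluated over the whole sorted set; the outer query
# then keeps only the window for the requested page.
rows = conn.execute("""
    SELECT Id FROM (
        SELECT Id, row_number() OVER (ORDER BY Id) AS OrdF FROM TableA
    ) t WHERE OrdF BETWEEN ? AND ?
""", ((page - 1) * per_page + 1, page * per_page)).fetchall()
```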

No matter which option you use above, sort the records matching the search ONE TIME. Then store those search results in a temporary cache table (note: not a #temp table, but a permanent table used as a temporary cache), ordered by the correct value, with an identity column as the clustered primary key. Subsequent pages can then be retrieved from that static cache.
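A minimal sketch of that cache-table idea (Python sqlite3; the table and column names are invented): the search is sorted once into a cache table whose autoincrement key records each row's position, so any later page is a cheap primary-key range lookup.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE news (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO news VALUES (?, ?)",
                 [(i, "title %03d" % i) for i in range(1, 101)])

# Sort the search results ONE TIME into the cache; the autoincrement
# key doubles as the row's position in the sorted result set.
conn.execute("CREATE TABLE search_cache "
             "(pos INTEGER PRIMARY KEY AUTOINCREMENT, news_id INTEGER)")
conn.execute("INSERT INTO search_cache (news_id) "
             "SELECT id FROM news ORDER BY title")

def fetch_page(page, per_page=10):
    # Subsequent pages: a key-range scan, no re-sorting of the full set.
    lo, hi = (page - 1) * per_page + 1, page * per_page
    return [r[0] for r in conn.execute(
        "SELECT n.title FROM search_cache c JOIN news n ON n.id = c.news_id "
        "WHERE c.pos BETWEEN ? AND ? ORDER BY c.pos", (lo, hi))]
```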

Any questions, please ask.
LVL 39

Expert Comment

ID: 35114450
@Brandon, great explanation of the options!  I'm curious what you mean by a temporary cache table?

One other ColdFusion/HTML option I have used is to fetch all the IDs (primary keys) of the table, in the correct sort order, and keep them in a form field or session variable.  Then when going to page 30, you just grab the 30th group of 10 IDs from the list and fetch all the detail you need using those IDs.

I don't think this is a great idea for 100,000 records, but it works well for a thousand.  I agree that using the database would be best; I'm just curious how you manage the session for that.
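A sketch of this ID-list approach (Python sqlite3, with a plain list standing in for the ColdFusion session variable; names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE news (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO news VALUES (?, ?)",
                 [(i, f"t{i}") for i in range(1, 1001)])

# Once per search: fetch ONLY the primary keys, already sorted, and keep
# them in the user's session (a plain list here).
session_ids = [r[0] for r in conn.execute("SELECT id FROM news ORDER BY id")]

def fetch_page(page, per_page=10):
    # Per page: slice the cached ID list, then fetch the full rows for
    # just those IDs.
    ids = session_ids[(page - 1) * per_page : page * per_page]
    marks = ",".join("?" * len(ids))
    return conn.execute(
        f"SELECT id, title FROM news WHERE id IN ({marks}) ORDER BY id",
        ids).fetchall()
```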

Author Comment

ID: 35115905
Thank you for your answers. I was actually trying to use this solution.

Let us suppose that the 100,000 records are NEWS.

Let us suppose also that I want to order them by date (a datetime column called data).

Page 1) select top 10 news_title from news order by data desc
set last_data = datetime of the 10th news

Page 2) select top 10 news_title from news where data < last_data order by data desc
set last_data = datetime of the 20th news

Page 3) select top 10 news_title from news where data < last_data order by data desc
set last_data = datetime of the 30th news

...

Page N) select top 10 news_title from news where data < last_data order by data desc
set last_data = datetime of the (N x 10)th news

Will it work?
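A runnable sketch of this keyset approach (Python sqlite3, ISO date strings for the data column, LIMIT standing in for TOP). Note that with ORDER BY data DESC the next page is *older*, so the filter needs data < last_data; it also assumes the data values are unique.

```python
import sqlite3
from datetime import date, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE news "
             "(id INTEGER PRIMARY KEY, news_title TEXT, data TEXT)")
conn.executemany(
    "INSERT INTO news VALUES (?, ?, ?)",
    [(i, f"story {i}", (date(2011, 1, 1) + timedelta(days=i)).isoformat())
     for i in range(1, 101)])

# Page 1: the 10 newest stories.
page1 = conn.execute(
    "SELECT news_title, data FROM news ORDER BY data DESC LIMIT 10").fetchall()
last_data = page1[-1][1]  # datetime of the 10th story shown

# Page 2: strictly OLDER than the last story already shown.
page2 = conn.execute(
    "SELECT news_title, data FROM news WHERE data < ? "
    "ORDER BY data DESC LIMIT 10", (last_data,)).fetchall()
```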
LVL 39

Expert Comment

ID: 35116770
Yes, with a couple of caveats.  First, the date column must be indexed to make the search efficient, and news_title must be included in that index or the index likely won't be used.  Second, if new results come in while you are paging, they won't appear in your search; that is also true of the cached-search method, but it is worth pointing out.  My method is aimed more at paging through search results than at paging through an entire table.  Finally, instead of using a datetime column, assuming you can always expect records to be inserted in chronological order, you can just sort on the clustered identity column, if there is one.
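To illustrate the indexing caveat (Python sqlite3; SQLite has no INCLUDE clause, so a composite index on (data, news_title) plays the role of SQL Server's covering index with INCLUDE(news_title)):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE news "
             "(id INTEGER PRIMARY KEY, news_title TEXT, data TEXT)")
# data drives the seek and sort; news_title rides along so the paging
# query can be answered from the index alone.
conn.execute("CREATE INDEX ix_news_data ON news (data, news_title)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT news_title FROM news WHERE data < '2011-02-01' "
    "ORDER BY data DESC LIMIT 10").fetchall()
```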
LVL 39

Expert Comment

ID: 35116902

No, I don't really get the example.

It seems you are setting the last_data timestamp based on the news titles already viewed.

However, on a user-by-user basis, some users may never have seen those stories while others have.   Also, if the person wants to go back a page to see the articles on the previous page, you have lost track of which ones those were.

If you want to do something like that with the data timestamps, you could use a separate table that stores the session ID and a counter (1, 2, 3...) for the page number each news article was displayed on.   The first 10 news items displayed get a 1, the next page displayed gets a 2, and so on.   That way you can page back; when paging forward, you just grab the next top 10 by date.    But it seems to me that you have to do it by session_id so that everyone can see the top 10 news stories of the day.
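A sketch of that session page-map idea (Python sqlite3; the page_map table and helper names are invented): paging forward records which stories each session saw on which page, and paging back replays exactly that page.

```python
import sqlite3
from datetime import date, timedelta

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE news (id INTEGER PRIMARY KEY, news_title TEXT, data TEXT);
    CREATE TABLE page_map (session_id TEXT, page INTEGER, news_id INTEGER);
""")
conn.executemany(
    "INSERT INTO news VALUES (?, ?, ?)",
    [(i, f"story {i}", (date(2011, 1, 1) + timedelta(days=i)).isoformat())
     for i in range(1, 101)])

def next_page(session_id, page, per_page=10):
    # Newest stories this session has NOT been shown yet.
    rows = conn.execute("""
        SELECT id, news_title FROM news
        WHERE id NOT IN (SELECT news_id FROM page_map WHERE session_id = ?)
        ORDER BY data DESC LIMIT ?""", (session_id, per_page)).fetchall()
    conn.executemany("INSERT INTO page_map VALUES (?, ?, ?)",
                     [(session_id, page, r[0]) for r in rows])
    return rows

def back_page(session_id, page):
    # Replay the stories this session saw on that page, newest first.
    return conn.execute("""
        SELECT n.id, n.news_title FROM page_map p
        JOIN news n ON n.id = p.news_id
        WHERE p.session_id = ? AND p.page = ? ORDER BY n.data DESC""",
        (session_id, page)).fetchall()
```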

LVL 23

Expert Comment

by:Saqib Khan
ID: 35117473
10K records is NOT too much... I have used the row_number() paging technique on well over 10K records and performance was excellent. Here is an article which explains it in detail.

LVL 39

Expert Comment

ID: 35117507

It's 100k.

LVL 39

Expert Comment

ID: 35117510

... but row_number() could be viable at that size as well... give it a try.
LVL 52

Expert Comment

ID: 35118699
curious what you mean by temporary cache table?  

I'm curious too. I'm not familiar with it, but it sounds intriguing.
LVL 23

Expert Comment

by:Saqib Khan
ID: 35125607
Yes, 100k should be fine as well... give it a shot; it depends on how much data you are returning, etc.

Author Comment

ID: 35152867
I'm sorry I still haven't assigned points, but I have to study a little to understand everything you wrote ... :-) ... Thanks anyway to all.

Author Closing Comment

ID: 35401597
I'm not quite able to completely understand the solution.

Question has a verified solution.
