Pagination problem

Solved | Posted on 2006-05-18 | 426 Views | Last Modified: 2007-12-19
I have a pagination issue due to a very large database. The following code has been working fine for me up until now:

function pageturn($sqlcall, $pagelimit) {

   global $query, $num_rows, $prev_page, $next_page, $num_pages, $page;

   if (!$_GET['page']) {
      $_GET['page'] = 1;
   }
   $prev_page = $_GET['page'] - 1;
   $next_page = $_GET['page'] + 1;
   $query = mysql_query($sqlcall);
   if (!$query) print mysql_error() . " in table \n";

   // Set up specified page
   $page_start = ($pagelimit * $_GET['page']) - $pagelimit;
   $num_rows = mysql_num_rows($query);

   if ($num_rows <= $pagelimit) {
      $num_pages = 1;
   } else if (($num_rows % $pagelimit) == 0) {
      $num_pages = ($num_rows / $pagelimit);
   } else {
      $num_pages = ($num_rows / $pagelimit) + 1;
   }
   $num_pages = (int) $num_pages;

   if (($_GET['page'] > $num_pages) || ($_GET['page'] < 0)) {
      print "You have specified an invalid page number";
   }

   // performs the actual displaying...
   $sqlcall = $sqlcall . " LIMIT $page_start, $pagelimit";
   $query = mysql_query($sqlcall);
}

function prevnext($prev, $num, $next, $pg) {

   global $PHP_SELF;

   if ($prev) {
      print "<a href=\"$PHP_SELF?page=$prev\"><img src=\"pics/button.previous.gif\" width=\"16\" height=\"16\" border=\"0\" align=\"absmiddle\"></a>";
   }

   // Page # direct links
   // If you don't want direct links to each page, you should be able to
   // safely remove this chunk.
   for ($i = 1; $i <= $num; $i++) {
      if ($i != $pg) {
         print " <a href=\"$PHP_SELF?page=$i\">$i</a>";
      } else {
         print " <b>$i</b> ";
      }
   }

   // Next
   if ($pg != $num) {
      print "&nbsp;<a href=\"$PHP_SELF?page=$next\"><img src=\"pics/button.next.gif\" width=\"16\" height=\"16\" border=\"0\" align=\"absmiddle\"></a>";
   }
}

Like I said, the pagination is working fine, but I would like it to work like Google's pagination. How do I go about accomplishing this?

Question by:frankenstrat

13 Comments
 
LVL 5

Expert Comment

by:aminerd
ID: 16710353
I'm not 100% sure what the problem with the current pagination is. Too many page numbers showing at once, cluttering up the UI? The SELECT ... LIMIT taking too long?
 

Author Comment

by:frankenstrat
ID: 16710829
There are far too many page numbers coming up that are cluttering the UI.
 

Author Comment

by:frankenstrat
ID: 16710896
Actually, the SELECT ... LIMIT query is fast as long as I don't add an ORDER BY to the SQL call.
 
LVL 11

Expert Comment

by:star_trek
ID: 16711762
Google's pagination is an estimate; it only shows a rough number of pages. What you can do is the following (a rough sketch follows the list):
1) Say you want to display 20 records per page: run the initial query with LIMIT 50, display 20 records, and keep the rest in a cache.
2) Show only a handful of page links (say 5) if the query returns 50 records.
3) When the user clicks the next page, extend the page links based on what that query returns, and so on.
4) If there are no more records to display, say that the remaining results are similar and have been omitted.
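
One way to read that: probe a few pages ahead in a single query, so you only link to pages you know exist. Below is a minimal sketch of that idea (my illustration, not star_trek's exact code), assuming the question's mysql_* setup and the $sqlcall / $pagelimit values that get passed to pageturn():

// Look-ahead pagination sketch (illustration only).
$page      = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$lookahead = 5;                                // probe up to 5 pages past this one
$offset    = ($page - 1) * $pagelimit;

// Fetch the current page plus up to $lookahead more pages' worth of rows,
// so we only advertise page links we know have data (no full count needed).
$result = mysql_query($sqlcall . " LIMIT $offset, " . ($pagelimit * ($lookahead + 1)));
$found  = mysql_num_rows($result);

// Rows 0 .. $pagelimit-1 of $result are the current page; render them here.
// Links for earlier pages ($page-1 down to 1) are always safe to show as well.

// Link only to the pages the look-ahead proved exist.
$last_known = $page + (int) ceil(max(0, $found - $pagelimit) / $pagelimit);
for ($p = $page; $p <= $last_known; $p++) {
   if ($p == $page) {
      print " <b>$p</b> ";
   } else {
      print " <a href=\"{$_SERVER['PHP_SELF']}?page=$p\">$p</a>";
   }
}

// If the probe came back short, we have reached the end of the results.
if ($found < $pagelimit * ($lookahead + 1) && $page >= $last_known) {
   print " (no more results to display)";
}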
 
LVL 16

Expert Comment

by:dr_dedo
ID: 16711934
>>> pagination issue due to a very large database <<<

How large is it? Are we talking thousands or millions?

If you are talking millions, then it takes about 5 seconds on my PC just to get the record count. I guess that is too much load on the server and too much to ask it to do on every request. With millions of records you can use a rough number, as star mentioned; I'd do SELECT COUNT(id) FROM mytable and store that value for reference (a sketch of that is below).

Also, you can limit the number of pages shown, say just 10 pages rather than all of them, and when a user clicks on the 10th page you show the next 10, and so on.

Notice that Google actually uses a rough number of records. It happens all the time that, by reaching the end of the pagination, it tells you there is nothing more to show even though many more pages were listed!
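
A rough sketch of the "store that value for a reference" part (my illustration; the cache file path and refresh interval are invented, and it assumes the same mysql_* connection as the question):

// Cache the expensive COUNT so it is not recomputed on every page view.
function cached_row_count() {
   $cache_file = '/tmp/mytable_row_count.txt';  // hypothetical cache location
   $max_age    = 3600;                          // refresh roughly once an hour

   if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $max_age) {
      return (int) file_get_contents($cache_file);
   }

   // The full COUNT(id) runs only when the cached value is stale.
   $result = mysql_query("SELECT COUNT(id) AS c FROM mytable");
   $row    = mysql_fetch_assoc($result);
   file_put_contents($cache_file, $row['c']);

   return (int) $row['c'];
}

The cached value can then feed $num_pages instead of calling mysql_num_rows() on the full result set.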
 
LVL 11

Expert Comment

by:star_trek
ID: 16711973
dr_dedo, you just gave the same explanation as mine. Don't you agree?
 
LVL 16

Expert Comment

by:dr_dedo
ID: 16712023
I don't see that I copied and pasted your comment, and even if I did, I guess frankenstrat would notice that, don't you think?
I wanted to know how large the DB is. If it is a matter of thousands, then your comment would apply, and I didn't discuss that; I discussed what happens if the records are in the millions. Have you ever done a SELECT * from a DB with 5,000,000+ records, or just retrieved their count? That's what I was talking about, and correct me if I'm wrong, but I don't see that you talked about that. Besides, it's certainly expected to have some similarity in answers; we are answering the same question!
 
LVL 5

Accepted Solution

by:aminerd (earned 500 total points)
ID: 16712065
// number of page links to display at once
$display_pages = 10;

// center around our current page (or try to)
$start = $pg - floor($display_pages / 2);
if ($start < 1) $start = 1;

// figure out where to stop (inclusive, so subtract 1)
$end = $start + $display_pages - 1;

if ($end > $num) {
   $end = $num;

   // readjust our start point so we still display enough pages
   $start = $end - $display_pages + 1;
   if ($start < 1) $start = 1;
}

// display page links
for ($i = $start; $i <= $end; $i++) {
   if ($i != $pg) {
      print " <a href=\"$PHP_SELF?page=$i\">$i</a>";
   } else {
      print " <b>$i</b> ";
   }
}
 
LVL 5

Expert Comment

by:aminerd
ID: 16712108
"have you ever did a select * from a db with +5,000,000 records"

Without a WHERE clause, most of this query's execution time will be sending the data across the network. That's why the LIMIT's there. The LIMIT will have a negligable effect on performance in this case.

"or just retrieve there count??"

Again, without a WHERE clause (i.e., running SELECT COUNT(*) FROM table;) this should take no time at all on a MyISAM table, since the MyISAM storage engine keeps a count of the number of records the table holds (as opposed to InnoDB, which does not).

The downside to using a LIMIT comes with the WHERE clause. That's because MySQL performs the ENTIRE query BEFORE it limits it's results. In other words, if your query would have done a full table scan to return 5,000,000 records and you LIMIT it to 500, MySQL will still perform the full table scan, but only send 500 of the records back to the client.
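
Not part of the comment above, but one quick way to see whether a given query is in full-table-scan territory before the LIMIT kicks in is EXPLAIN (table and column names below are placeholders, not from the thread):

// Illustration only: check whether the WHERE column is indexed.
$check = mysql_query("EXPLAIN SELECT * FROM mytable WHERE name = 'test' LIMIT 0, 500");
while ($row = mysql_fetch_assoc($check)) {
   // 'key' shows the index MySQL chose (NULL means a full scan); 'rows' is
   // an estimate of how many rows it will examine before the LIMIT applies.
   print $row['key'] . ' / ' . $row['rows'] . "\n";
}

// If 'key' comes back NULL, an index on the filtered column usually helps:
// mysql_query("ALTER TABLE mytable ADD INDEX idx_name (name)");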
 

Author Comment

by:frankenstrat
ID: 16712586
We are talking millions of records in a MyISAM table. There were over 6 million, but after weeding out useless data it is down to half of that. The speed of the query isn't mission-critical at this point but may be up for consideration in the future. That being said, all the queries are remarkably quick; the exception is the one type of query I am using that produces thousands of results.

Here are examples of the queries

if ($_POST['number'] == "Yes") {
   $citysql = "SELECT * FROM cc WHERE id='$Find' ";
} else if ($filter == "cc") {
   $citysql = "SELECT * FROM cc WHERE code='$Find' ORDER BY name";
} else if ($_POST['featurecode'] == "ALL") {
   $citysql = "SELECT * FROM cc WHERE name='$Find' ";
} else if ($_GET['search']) {
   $citysql = "SELECT * FROM cc WHERE name='$Find' ";
} else {
   $citysql = "SELECT * FROM cc WHERE code='{$_POST['code']}' AND name = '$Find' "; // produces thousands of results
}
pageturn($citysql, "20");
 
LVL 5

Expert Comment

by:aminerd
ID: 16714358
It doesn't seem that you consider your query execution times to be a problem. The code I listed above will allow you to show only x ($display_pages) number of pages at a time.
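
For reference, here is roughly how that window drops into the existing prevnext(). This is a sketch of the integration, not code posted in the thread; the page-number loop is the only part that changes:

function prevnext($prev, $num, $next, $pg) {
   global $PHP_SELF;

   $display_pages = 10;   // page links to show at once

   // Previous button, unchanged.
   if ($prev) {
      print "<a href=\"$PHP_SELF?page=$prev\"><img src=\"pics/button.previous.gif\" width=\"16\" height=\"16\" border=\"0\" align=\"absmiddle\"></a>";
   }

   // Window of page links centered on the current page.
   $start = max(1, $pg - floor($display_pages / 2));
   $end   = min($num, $start + $display_pages - 1);
   $start = max(1, $end - $display_pages + 1);

   for ($i = $start; $i <= $end; $i++) {
      if ($i != $pg) {
         print " <a href=\"$PHP_SELF?page=$i\">$i</a>";
      } else {
         print " <b>$i</b> ";
      }
   }

   // Next button, unchanged.
   if ($pg != $num) {
      print "&nbsp;<a href=\"$PHP_SELF?page=$next\"><img src=\"pics/button.next.gif\" width=\"16\" height=\"16\" border=\"0\" align=\"absmiddle\"></a>";
   }
}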
 

Author Comment

by:frankenstrat
ID: 16717083
If there is a method that would improve my query execution times, I am all ears.
 
LVL 5

Expert Comment

by:aminerd
ID: 16718545
As long as you're using indexes, you're probably OK. Rather than running the query twice, however -- once to get the number of rows, and then once to get the data -- I'd suggest you use SQL_CALC_FOUND_ROWS, like:

SELECT SQL_CALC_FOUND_ROWS * FROM cc WHERE name='test' LIMIT 0, 100;

This will return the first 100 records as always. You can then run:

SELECT FOUND_ROWS() as total_records;

Which will return a single column, single row result: the total number of records the previous query would have returned without the LIMIT. This way, you don't have to run the same query twice.
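
A sketch of how that might look folded back into the question's pageturn() (same globals and mysql_* connection assumed; this is an illustration, not code from the thread):

function pageturn($sqlcall, $pagelimit) {
   global $query, $num_rows, $prev_page, $next_page, $num_pages;

   $page       = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
   $prev_page  = $page - 1;
   $next_page  = $page + 1;
   $page_start = ($page - 1) * $pagelimit;

   // One round trip for the page itself; SQL_CALC_FOUND_ROWS makes MySQL
   // remember how many rows matched before the LIMIT was applied.
   $limited = preg_replace('/^SELECT /i', 'SELECT SQL_CALC_FOUND_ROWS ', $sqlcall)
            . " LIMIT $page_start, $pagelimit";
   $query = mysql_query($limited);
   if (!$query) print mysql_error() . " in table \n";

   // Total match count, without re-running the (possibly slow) query.
   $count    = mysql_fetch_assoc(mysql_query("SELECT FOUND_ROWS() AS total_records"));
   $num_rows = (int) $count['total_records'];

   $num_pages = max(1, (int) ceil($num_rows / $pagelimit));
   if ($page > $num_pages) {
      print "You have specified an invalid page number";
   }
}

SQL_CALC_FOUND_ROWS still has to find every matching row, but at least it does so in the same pass that fetches the page, instead of running the query a second time.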
