read files from directory

Hi,
I want to read files from a directory. The directory can contain more than 10,000 files, even 100,000+ (1 lakh). Is there any way to add pagination logic so that only 1,000 files are read at a time, and pressing "next page" reads the next batch?

readdir() reads the whole directory, which I don't want.
Insoftservice asked:
Ray Paseur commented:
"do we have any facility to just pick 1000 first" -- Yes, you can do that with readdir() and a counter.
<?php // RAY_temp_insoftservice.php
error_reporting(E_ALL);
echo "<pre>"; // MAKE IT EASY TO READ

// SKIP SOME DIRECTORY ENTRIES, THEN READ SOME DIRECTORY ENTRIES

// SET DEFAULT VALUES - THESE COULD COME FROM A FORM INPUT
$skip_count = 27;
$read_count = 4;

// FOLLOW THE DIRECTIONS ON PHP.NET
if (!$handle = opendir('.')) die('FAILED TO OPEN');

// SKIP THE RIGHT NUMBER OF FILES (TEST THE COUNTER BEFORE READING,
// OTHERWISE AN EXTRA ENTRY GETS CONSUMED AND LOST)
while ($skip_count && FALSE !== ($file = readdir($handle)))
{
    $skip_count--;
}

// RETRIEVE THE RIGHT NUMBER OF FILES
$files = array();
while ($read_count && FALSE !== ($file = readdir($handle)))
{
    $files[] = $file;
    $read_count--;
}
closedir($handle);

// SHOW THE WORK PRODUCT
print_r($files);


Beverley Portlock commented:
There is no easy way to do it. I would read the filenames with either readdir() or scandir(), place them in an array, store that array in the session, and then use array_slice() to pick out 'blocks' according to where I am in the pagination.

http://www.php.net/readdir
http://www.php.net/scandir (my preference)
http://www.php.net/array_slice

For storing the array in the session, use serialize/unserialize or implode/explode to convert it to a string; I have found this less troublesome than storing arrays directly in a session variable. Since the session is held on the server and read directly via the filesystem, it does not matter much if the session file gets big, because reading via the filesystem is fast.
 

http://www.php.net/implode
http://www.php.net/explode
http://www.php.net/serialize
http://www.php.net/unserialize
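A minimal sketch of that approach might look like this (untested; the directory path and page size are illustrative, and the listing is cached as a string per the implode/explode suggestion):

```php
<?php
session_start();

$dir      = '.';      // point this at your own directory
$pageSize = 1000;
$page     = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;

// Read the directory listing once and cache it in the session as a string
if (!isset($_SESSION['file_list'])) {
    $all = array_diff(scandir($dir), array('.', '..'));
    $_SESSION['file_list'] = implode("\n", $all);
}
$all = explode("\n", $_SESSION['file_list']);

// Pick out the current 'block' for this page
$block = array_slice($all, ($page - 1) * $pageSize, $pageSize);

foreach ($block as $file) {
    echo htmlspecialchars($file), "<br />\n";
}
```

Note that the cached listing goes stale as soon as files are added or removed, so you may want to expire it periodically.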
shdwmage commented:
I wrote a long post on this, but I have to agree with bport.

Set a variable for how many records you want to show. (x)
Set a variable for the first record you want to show. (default 0) (y)
Set a variable for the last record you want to show by adding the number of records to show plus the start record. (z = x+y)

Slice the array using y as the start and x as the length (note that array_slice() takes an offset and a length, not an end point).
Display the list like normal.
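The steps above can be sketched as follows (variable names match the steps; $list stands in for the file names returned by scandir()/readdir()):

```php
<?php
// Illustrative slicing sketch -- $list is a stand-in for real file names
$list = range(1, 200);                         // pretend file list

$x = 50;                                       // records per page
$y = isset($_GET['s']) ? (int)$_GET['s'] : 0;  // first record to show
$z = $y + $x;                                  // one past the last record

// array_slice() takes an offset and a LENGTH, so pass $x rather than $z
$page = array_slice($list, $y, $x);

foreach ($page as $item) {
    echo $item . "<br />";
}
```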
Derokorian commented:
You can use glob() to get the contents of the directory, then use pagination to limit which keys are displayed (say 1-1000, 1001-2000, etc.). The following snippet is untested but should give you a basic idea:

// Directory contents
$files = glob('./somedirectory/*');
$filecount = count($files);

// Pagination set up
$pagenum  = isset($_GET['page']) ? (int)$_GET['page'] : 1;
$pagesize = 100;
$pages    = ceil($filecount / $pagesize);

$first = ($pagenum - 1) * $pagesize;
$last  = $pagenum * $pagesize;

// Stop at the end of the page OR the end of the list, whichever comes first
for( $i = $first; $i < $last && $i < $filecount; $i++ ) {
   echo $files[$i] . '<br />';
}

$prev = $pagenum - 1;
$prev = ($prev < 1) ? 1 : $prev;

$next = $pagenum + 1;
$next = ($next > $pages) ? $pages : $next;

// Note: the superglobal is $_SERVER['PHP_SELF'], and the query string
// belongs inside the href attribute's quotes
echo '<a href="'.$_SERVER['PHP_SELF'].'?page=1">First</a> ';
echo '<a href="'.$_SERVER['PHP_SELF'].'?page='.$prev.'">Previous</a> ';
for( $i = 1; $i <= $pages; $i++ ) {
   echo '<a href="'.$_SERVER['PHP_SELF'].'?page='.$i.'">'.$i.'</a> ';
}
echo '<a href="'.$_SERVER['PHP_SELF'].'?page='.$next.'">Next</a> ';
echo '<a href="'.$_SERVER['PHP_SELF'].'?page='.$pages.'">Last</a> ';


Insoftservice (author) commented:
I think you did not get my point.

I want a method so that PHP reads only X files at a time, instead of reading the whole folder's files.
@Derokorian: do you think your code will read only X files and then stop? I ask because I have not tried or fully scanned the code.
shdwmage commented:
I would probably do something similar to this:

#set the directory
$dir = './tmp';
#set the number of files to show
$file_count = 50;
#get the start number, or else start at 0
#(clean_input() is assumed to be a user-defined sanitising helper)
if (isset($_GET['s'])) {
  $start = (int) clean_input($_GET['s']);
} else {
  $start = 0;
}
#figure out the last record
$end = $start + $file_count;

#input how you want to get your files from your directory here
$list = scandir($dir);

#select the portion of the array you want to output
$section = array_slice($list, $start, $file_count, true);

#format your output and display


I would also add some counter to see the total number of files in a directory, but I didn't add it to the above code.
shdwmage commented:
I'd also like to note the above is untested; I just wrote it on the fly. I also didn't include how to add the pagination list.
Beverley Portlock commented:
"i want method so that php only reads X amount of file at a time beside reading whole folders file "

Perhaps you could clarify this further? I am not clear what you mean.
shdwmage commented:
Apparently looking back I missed the point *-*

I would use PHP to generate the list once upon request and then store that list to a file on the server.  Then from that file generate the normal page list.

I honestly can't think of a different way to do it.
shdwmage commented:
That's assuming I understand the premise of your question.

You are saying that the directory has more than 10,000 files in it. (ouch) I've run `ls` on a directory like that before and regretted it.

But I don't know of any way to avoid getting the whole directory listing. I just know how to parse it so it doesn't all display at once. There are ways to do this from the file system, but I can't think of any in PHP, hence the suggestion to write the listing to a file once and then read that file, so the server doesn't take a resource hit on each page.
Ray Paseur commented:
This function only reads one file name at a time. http://php.net/manual/en/function.readdir.php

So you set up some looping logic. For example, if you wanted to start at file number 1,000 and present up to file number 1,200, you would make 1,000 calls to readdir() discarding the results, then another 200 calls to readdir() presenting the results.

Does that make sense to you?  If not, I might be able to create a small example.
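For reference, SPL offers a lazy variant of the same skip-then-read idea: FilesystemIterator yields one directory entry at a time, and LimitIterator caps how many are visited, so no full array is ever built. This is an untested sketch with illustrative numbers; the skipped entries are still walked one by one internally, just as with the readdir() loop.

```php
<?php
// Skip 1,000 entries, then present the next 200, without building
// an array of the whole directory.
$offset = 1000;   // entries to skip
$count  = 200;    // entries to present

$it = new LimitIterator(new FilesystemIterator('.'), $offset, $count);
foreach ($it as $fileinfo) {
    echo $fileinfo->getFilename(), "\n";
}
```

FilesystemIterator skips the "." and ".." entries by default, which readdir() does not.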
Insoftservice (author) commented:
Hi,
Thanks for all your comments, but do we have any facility to just pick the first 1,000? array_slice() and the other approaches might not be feasible. I had even thought of keeping the list in the session or in a buffer, but that is not possible either, since the number of files increases within a fraction of a minute.
 
Beverley Portlock commented:
"i had thought for keeping this in session or in buffer but even tht not possible as within fraction of minutes the file no increases ."

In that case you are shooting at a moving target. If the number of files can change in between updates then 'paging' them is going to be rather tricky.
Ray Paseur commented:
"Rather tricky" in a word -- or maybe rather wacky. The problem you face is related to the nature of client-server systems. The client makes a request and the server makes a response based on the contents of the underlying data at that exact moment in time. If the server data is changing rapidly, then almost any response you can create will (obviously) be imprecise, and as the client sits there reading the screen it will become more and more outdated. There is no magic bullet for this problem; the best thing to do is make the client aware of the changing nature of the data. In life things change and time marches on. So it is in some applications, as well.
Insoftservice (author) commented:
Thanks for the comments.
Yes, the question was wacky, and since I had no other solution I used Ray's code and the other EE hints: take just 1,000 files and keep the rest unread, to be read after clicking the next or prev link. I hope this approach works.