  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 298

Read file in sections

I'm trying to open a file in my PHP script, but I keep getting the message 'Out of Memory'. I only want to access the file in sections of 25KB, so is it possible to read a file 25KB at a time in PHP?

So something like:-

$fh = fopen('c:\myFile.txt', 'r') or die("Can't open file");
while (!feof($fh))
          {
          $fs = fread($fh, 25000);
          // Do my work on the 25KB (or less) chunk.
          }
fclose($fh);


Thanks in advance
Asked by: tonelm54
2 Solutions
 
Chris Harte (Thaumaturge) Commented:
You will probably be better off using file_get_contents. This reads a file into a string more efficiently than fread and also allows offsets so you can read the file in chunks.


http://uk3.php.net/manual/en/function.file-get-contents.php
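Something along these lines might work (an untested sketch; data.txt and the 25,000-byte buffer size are just placeholders):

$strFile = 'data.txt';
$bufferSize = 25000;
$size = filesize($strFile);
$offset = 0;
while ($offset < $size) {
    // read up to $bufferSize bytes starting at $offset
    $chunk = file_get_contents($strFile, false, null, $offset, $bufferSize);
    // ... do your work on $chunk here ...
    $offset += $bufferSize;
}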
 
tonelm54 (Author) Commented:
Good afternoon,
OK, so I wrote some simple code to read a file in chunks:-
	$strFile = "c:\\myFile.doc";
	
	$bufferSize = 25000;
	
	$rndFileID = rndID();
	
	$newFileSize = filesize($strFile);
	
	if ($newFileSize % $bufferSize > 0)
		{
		$newFileSegments = $newFileSize / $bufferSize;
		}
	else
		{
		$newFileSegments = ($newFileSize / $bufferSize) + 1;
		}
	
	for ($x = 1; $x < $newFileSegments; $x++)
		{
		$data = file_get_contents($strFile, false, null, $x * $bufferSize, $bufferSize);
		file_put_contents('c:\\copiedFile.doc', $data, FILE_APPEND);
		}
	



But the copied file is corrupt when I open it.
 
Ray Paseur Commented:
Uhh, a couple of thoughts. file_get_contents() will read the entire file into memory, so this may not be what you want if you are getting "out of memory" conditions.

The function rndID() on line 5 does not exist, so we do not know what it might be doing. It appears to create a variable that is never used; you might want to leave that out.

Conceptually, it is much easier to count the lines in a file than to count the bytes. So you might want to try something like this:

$fh = fopen('myFile.txt', 'r') or die("Can't open input file");
$out = fopen('output.txt', 'w') or die("Can't open output file");
$buffer = '';
$num = 0;
while (!feof($fh)) {
    $buffer .= fgets($fh);
    $num++;
    // PROCESSING, WHATEVER
    if ($num >= 5000) {        // fwrite() each 5000 lines
        fwrite($out, $buffer);
        $buffer = '';
        $num = 0;
    }
}
fwrite($out, $buffer);         // write whatever is left over
fclose($fh);
fclose($out);
 
Chris Harte (Thaumaturge) Commented:
There is a lot wrong with your code, the main thing being the filenames. You cannot use direct addressing, i.e.

 "c:\\myFile.doc"

You can only use relative addressing. The files you want to use must be under the root path of your web server.

In Linux this would be

/var/www/html/

and would be addressed without using any redirection. This code reads a file called test.txt in 2,500-byte chunks and outputs it to a file called output.txt.
$strFile = "test.txt";

$size = filesize($strFile);

$bufferSize = 2500;

// number of chunks, rounded up so the final partial chunk is included
$no_of_chunks = ceil($size / $bufferSize);

$x = 0;
while ($x < $no_of_chunks)
{
    $offset = $x * $bufferSize;
    // read up to $bufferSize bytes starting at $offset
    $chunk = file_get_contents($strFile, false, null, $offset, $bufferSize);
    $x++;
    // append this chunk to the output file
    file_put_contents('output.txt', $chunk, FILE_APPEND);
//    echo $chunk;
}


 
Ray Paseur Commented:
@Munterman: I think file_get_contents() and its variants can operate on a URL.
http://php.net/manual/en/function.file-get-contents.php

Even though the site says that file_put_contents() may operate on a URL, I have found through experience that a path based on getcwd() seems to work better.
http://php.net/manual/en/function.file-put-contents.php
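For example, something along these lines (a rough, untested sketch; the URL and filename are just placeholders):

$data = file_get_contents('http://www.example.com/somefile.txt');   // read from a URL
$path = getcwd() . DIRECTORY_SEPARATOR . 'copy.txt';                 // output path based on getcwd()
file_put_contents($path, $data);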
