• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 194

Splitting a file into segments, buffer too small

I'm trying to read a file in segments, so I've written the following:
        while ($buffer = fgets($handleR, $this->chunkSize)) {

            echo strlen($buffer);

        }


$this->chunkSize = 1000000; however, when I echo out the size of the buffer it is 151 instead of 1000000.

Is there a better way to read a file via segments?

What I'm trying to do is split a file into segments and process them, as the file is several GB in size.

Thanks in advance for any support.
Asked by: tonelm54
1 Solution
 
Ray Paseur commented:
Please see: http://php.net/manual/en/function.fgets.php where it says,
Reading ends when length - 1 bytes have been read, or a newline (which is included in the return value), or an EOF (whichever comes first). If no length is specified, it will keep reading from the stream until it reaches the end of the line.
So basically it looks like the script is reading the first line of the dataset.  You may get better results with file_get_contents() since it provides both a starting offset and a maximum length.
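For example, a minimal sketch of that approach could look like the following (the file name and the fixed 1,000,000-byte chunk size are placeholders for illustration, not values taken from the question):

// Hypothetical sketch: read a large file in fixed-size segments using
// file_get_contents() with a starting offset and a maximum length.
$path      = 'bigfile.dat';   // placeholder path
$chunkSize = 1000000;         // 1,000,000 bytes per segment
$fileSize  = filesize($path);

for ($offset = 0; $offset < $fileSize; $offset += $chunkSize) {
    $buffer = file_get_contents($path, false, null, $offset, $chunkSize);
    if ($buffer === false) {
        break;                // stop on a read error
    }
    echo 'segment at ' . $offset . ' length = ' . strlen($buffer) . PHP_EOL;
    // ...process $buffer here...
}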
 
Slick812 commented:
You may do well to use the PHP file-read function fread(); you will not need to keep track of the file position, as fread() automatically advances the file pointer on each read. See the manual at
    http://php.net/manual/en/function.fread.php

$count = 1;
while (!feof($handleR)) {
    $buffer = fread($handleR, $this->chunkSize);
    echo $count++ . ' seg length = ' . strlen($buffer) . PHP_EOL;
}
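As a hedged usage sketch (the path below is a placeholder, and the fixed chunk size stands in for $this->chunkSize), the loop would sit between an fopen() in binary-read mode and a matching fclose():

// Hypothetical usage sketch: open the file, read it in segments, close the handle.
$handleR = fopen('bigfile.dat', 'rb');   // 'rb' = binary-safe read mode; placeholder path
if ($handleR === false) {
    die('Unable to open file');
}

$count = 1;
while (!feof($handleR)) {
    $buffer = fread($handleR, 1000000);  // placeholder for $this->chunkSize
    echo $count++ . ' seg length = ' . strlen($buffer) . PHP_EOL;
    // ...process $buffer here...
}

fclose($handleR);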
