Splitting a file into segments, buffer too small

I'm trying to read a file in segments, so I've written the following:

        while ($buffer = fgets($handleR, $this->chunkSize)) {

            echo strlen($buffer);

        }



$this->chunkSize = 1000000, however when I echo out the size of the buffer it is 151 instead of 1000000.

Is there a better way to read a file in segments?

What I'm trying to do is split a file into segments and process them, as the file is several GB in size.

Thanks in advance for any support.
tonelm54 asked:
Slick812 commented:
You may do well to use the PHP file read function fread(). You will not need to keep track of the file position, as fread() automatically advances the file position on each read. See the manual at
    http://php.net/manual/en/function.fread.php

$count = 1;
while (!feof($handleR)) {
    $buffer = fread($handleR, $this->chunkSize);
    echo $count++.' seg length= '.strlen($buffer);
}
Ray Paseur commented:
Please see http://php.net/manual/en/function.fgets.php where it says:
Reading ends when length - 1 bytes have been read, or a newline (which is included in the return value), or an EOF (whichever comes first). If no length is specified, it will keep reading from the stream until it reaches the end of the line.
So basically the script is reading only the first line of the data set; the 151 bytes you see are the length of that first line, not your chunk size. You may get better results with file_get_contents(), since it accepts both a starting offset and a maximum length.
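A minimal sketch of that approach, reading a file in fixed-size segments with file_get_contents(); the $path and $chunkSize variable names here are illustrative, not from the question:

```php
<?php
// Sketch: process a large file in fixed-size segments without holding
// the whole file in memory at once.
$path = 'large.dat';      // hypothetical file name
$chunkSize = 1000000;     // bytes per segment
$fileSize = filesize($path);

for ($offset = 0; $offset < $fileSize; $offset += $chunkSize) {
    // Read at most $chunkSize bytes starting at $offset.
    $buffer = file_get_contents($path, false, null, $offset, $chunkSize);
    // ... process $buffer here ...
    echo 'segment at '.$offset.' length= '.strlen($buffer)."\n";
}
```

Unlike fgets(), this reads by byte count only, so newlines in the data do not cut a segment short; the final segment will simply be whatever is left of the file.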
Question has a verified solution.