Need solution to download large 100+ MB files using PHP.

Currently I have in mind chopping the file into parts and then rejoining them during the stream, but I don't know if it will work. I found some code to chop the file up, but I am having trouble putting it back together (good thing I am not a doctor).

Everything seems fine until the download. The download window is launched, but the file is 0 bytes.

Thanks in Advance.

Code:

<?php
$filename = $file;
$path = '/var/www/virtual/ftp/' . $filename;
error_reporting(E_ALL);
set_time_limit(0);
$bytes = 512000;           // size of the chunks in bytes (500k in this case)

$pieces = file_chop($path, $bytes);
// the last piece may be smaller than $bytes bytes

// debug line left over from testing; keep it commented out, since any
// output sent before header() breaks the download with "headers already sent":
// echo "$path was chopped into " . count($pieces) . " chunks"; exit;

// send the headers once, before the loop and before any output
header('Content-Type: application/force-download');
header('Content-Length: ' . filesize($path));   // $path, not $file: filesize() needs the full path
header('Content-Disposition: attachment; filename=' . basename($path));

foreach ($pieces as $bit) {
    echo $bit;      // each piece is already a string, not a file handle, so no fread() here
    ob_flush();
    flush();
}

function file_chop($file_path, $chunk_size){
    // Note: this reads the ENTIRE file into memory at once, which is risky
    // for 100+ MB files (see file_chop_big() below for a streaming alternative).
    $handle = fopen($file_path, 'rb');          // read the file in binary mode
    $size = filesize($file_path);
    $contents = fread($handle, $size);
    fclose($handle);

    // find the number of full $chunk_size-byte portions
    $num_chunks = floor($size / $chunk_size);

    $chunks = array();

    $start = 0;
    for ($kk = 0; $kk < $num_chunks; $kk++){
        $chunks[] = substr($contents, $start, $chunk_size);  // $chunk_size bytes at a time
        $start += $chunk_size;
    }

    if ($start < $size){
        $chunks[] = substr($contents, $start);   // any leftover (may be smaller than $chunk_size)
    }
    return $chunks;
}

function file_chop_big($file_path, $chunk_size){
  $size = filesize($file_path);

  // find the number of full $chunk_size-byte portions
  $num_chunks = floor($size / $chunk_size);

  $file_handle = fopen($file_path, 'rb');       // read the file in binary mode

  $chunks = array();

  for ($kk = 0; $kk < $num_chunks; $kk++){
    $chunks[$kk] = basename($file_path) . '.chunk' . ($kk + 1);
    $chunk_handle = fopen($chunks[$kk], 'wb');  // 'wb', not 'w': write the chunk in binary mode

    // write the data to the chunk file 1k at a time
    while ((ftell($chunk_handle) + 1024) <= $chunk_size){
      fwrite($chunk_handle, fread($file_handle, 1024));
    }

    if (($leftover = $chunk_size - ftell($chunk_handle)) > 0){
      fwrite($chunk_handle, fread($file_handle, $leftover));
    }
    fclose($chunk_handle);
  }

  // anything left after the full chunks becomes one final, smaller chunk
  if (($leftover = $size - ftell($file_handle)) > 0){
    $chunks[$num_chunks] = basename($file_path) . '.chunk' . ($num_chunks + 1);
    $chunk_handle = fopen($chunks[$num_chunks], 'wb');
    while (!feof($file_handle)){
      fwrite($chunk_handle, fread($file_handle, 1024));
    }
    fclose($chunk_handle);
  }

  fclose($file_handle);
  return $chunks;
}
?>
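For the "rejoining during the stream" step the question mentions, here is a minimal sketch, assuming the .chunk files produced by file_chop_big() sit in the script's working directory and that $path is set as at the top of the script:

<?php
header('Content-Type: application/force-download');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename=' . basename($path));

// stream each chunk file in order; file_chop_big() named the pieces
// <file>.chunk1, <file>.chunk2, ...
for ($kk = 1; file_exists($chunk = basename($path) . '.chunk' . $kk); $kk++) {
    readfile($chunk);   // readfile() streams the chunk to the client without loading it whole
    flush();
}
?>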
bionicblakey asked:

bionicblakey (Author) commented:
The exit; was in there for testing; it is not the reason the download failed, so comment it out. Thanks.
Zack Soderquist commented:
Are you trying to download or upload a file? That is, are you trying to download a file from your site to your computer, or upload a file from your computer to your website?
Zack Soderquist commented:
If you are trying to upload a file to your website, your hosting server is probably limiting the maximum upload size. If the site is hosted on a Unix-flavored platform running Apache, you may be able to override the limits by creating a .htaccess file in the root of your site.

Put the following lines in your .htaccess file, setting the file size to what you want; I set it to 200M (200 MB):

php_value upload_max_filesize 200M
php_value post_max_size 200M
php_value memory_limit 200M
php_value max_execution_time 3600

bionicblakey (Author) commented:
"Need solution to download large 100+ MB files using PHP."

I am trying to DOWNLOAD.

Thanks.
Zack Soderquist commented:
I read "download"; I was just confirming.

I'm confused as to why you are chopping up the file. Why can't you just send the file as a whole?
Zack Soderquist commented:
As I understand what you're doing:

PHP is a server-side scripting language. You would be chopping the file up and putting it back together before it is ever sent to the client, which is a bit like driving to the post office and back just to pick up the mail from the mailbox in front of your house.

Without knowing what you're trying to accomplish, my initial reaction is that you may be approaching this the wrong way.
bionicblakey (Author) commented:
Hi,

I have a lot of users downloading large files, and I am trying to cut down on server load and also on failed downloads.

When the files reach 100+ MB, it seems that a high percentage of downloads fail.

Do you have another idea?
Zack Soderquist commented:
Yes: change protocols.

Provide them an FTP link to download the files instead of using HTTP. If you want, keep passing the smaller files via HTTP, but for larger files I would highly recommend providing the user with an FTP link instead. FTP is a more efficient and faster protocol for downloading large files.
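A minimal sketch of what that could look like, assuming the same files are also exposed by an FTP server under a hypothetical host name (ftp.example.com) and using an arbitrary 50 MB cutoff:

<?php
$path  = '/var/www/virtual/ftp/' . $filename;   // same path as in the question
$limit = 50 * 1024 * 1024;                      // arbitrary 50 MB cutoff

if (filesize($path) > $limit) {
    // large file: hand off to the FTP server
    echo '<a href="ftp://ftp.example.com/' . rawurlencode($filename) . '">Download (FTP)</a>';
} else {
    // small file: leave it to plain HTTP
    echo '<a href="/ftp/' . rawurlencode($filename) . '">Download (HTTP)</a>';
}
?>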
Zack Soderquist commented:
You are probably getting a lot of failed downloads due to HTTP timeouts; FTP should eliminate that.

If you get a lot of failed downloads, your users will probably try several times to download the file, which dramatically increases your traffic. FTP again should resolve this, because they would get the file the first time.
hernst42 commented:
If you have no security checks on whether a user is allowed to download a file, put the files into a directory that is accessible via static (non-PHP) HTTP requests. The webserver can then serve them using kernel functions instead of executing a PHP script. This also allows HTTP resume (if your webserver supports it), which might cut down the load a lot.
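A minimal sketch of that split, where PHP only decides and Apache serves; the /static-downloads/ directory and the user_may_download() check are hypothetical:

<?php
$filename = basename($_GET['file']);     // basename() strips any path components

if (!user_may_download($filename)) {     // hypothetical authorization check
    header('HTTP/1.0 403 Forbidden');
    exit;
}

// redirect to the static copy; Apache streams it directly and honours
// Range requests, so interrupted downloads can resume
header('Location: /static-downloads/' . rawurlencode($filename));
exit;
?>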
slyong commented:
You could use the fread function in PHP to make the download resumable, as described in the comment section of the PHP fread manual page (http://www.php.net/fread).
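A minimal sketch of that idea, assuming the file lives under the same /var/www/virtual/ftp/ directory as in the question; the Range parsing here only handles the simple bytes=N- form:

<?php
$path  = '/var/www/virtual/ftp/' . basename($_GET['file']);
$size  = filesize($path);
$start = 0;

// honour a "Range: bytes=N-" request so interrupted downloads can resume
if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int)$m[1];
    header('HTTP/1.1 206 Partial Content');
    header('Content-Range: bytes ' . $start . '-' . ($size - 1) . '/' . $size);
}

header('Accept-Ranges: bytes');
header('Content-Type: application/octet-stream');
header('Content-Length: ' . ($size - $start));
header('Content-Disposition: attachment; filename=' . basename($path));

$fp = fopen($path, 'rb');
fseek($fp, $start);
while (!feof($fp)) {
    echo fread($fp, 8192);   // stream in small blocks instead of loading the whole file
    flush();
}
fclose($fp);
?>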
ygoutham commented:
I am not commenting on the "chopping" and "putting together", but the 0-byte problem occurs if Apache is not the owner of the file; in that case it will always download as a 0-byte file.

Change the owner to the httpd user. Then use a small file, say 1 MB, try chopping it to pieces, and see if it comes back whole.
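A quick way to test that theory from PHP, a sketch assuming the same $path as in the question:

<?php
$path = '/var/www/virtual/ftp/' . $filename;

// fail loudly instead of silently sending a 0-byte file
if (!is_readable($path)) {
    die("Cannot read $path - check that the web server user owns the file");
}
if (filesize($path) === 0) {
    die("$path exists but is empty");
}
?>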
Zack Soderquist commented:
Has this issue been resolved?
Question has a verified solution.