Solved

Need solution to download large 100+ MB files using PHP.

Posted on 2007-04-10
497 Views
Last Modified: 2012-05-05

Currently I have in mind chopping the file into parts and then rejoining them during the stream, but I don't know if it will work. I found some code to chop the file up, but am having trouble putting it back together (good thing I am not a doctor).

Everything seems fine until the download. The download window is launched, but the file is 0 bytes.

Thanks in Advance.

Code:

<?php
$filename = $file;
$path = '/var/www/virtual/ftp/' . $filename;
error_reporting(E_ALL);
set_time_limit( 0 );
$bytes = 512000;           //size of the chunks in bytes (500k in this case)

$pieces = file_chop($path, $bytes);
echo "$path was chopped into " . count($pieces) . " $bytes-byte chunks stored in $pieces";
//the last piece may be smaller than $bytes bytes
exit;
foreach ($pieces as $bit){
    //insert $bit into your database
    header('Content-Type: application/force-download');
    header("Content-Length: " . filesize($file));
    header("Content-Disposition: attachment; filename=" . basename($file));

    echo fread($bit, 4096);
    ob_flush();
}

function file_chop($file_path, $chunk_size){
    $handle = fopen($file_path, 'rb');         //read the file in binary mode
    $size = filesize($file_path);
    $contents = fread($handle, $size);
    fclose($handle);
   
    //find number of full $chunk_size byte portions
    $num_chunks = floor($size/$chunk_size);

    $chunks = array();

    $start = 0;
    for ($kk=0; $kk < $num_chunks; $kk++){
      $chunks[] = substr($contents, $start, $chunk_size); //get $chunk_size bytes at a time
      $start += $chunk_size;
    }
   
    if ($start < $size){
       $chunks[] = substr($contents, $start);  //get any leftover
    }
    return $chunks;
}  




function file_chop_big($file_path, $chunk_size){
  $size = filesize($file_path);

  //find number of full $chunk_size byte portions
  $num_chunks = floor($size/$chunk_size);

  $file_handle = fopen($file_path, 'rb');         //read the file in binary mode

  $chunks = array();

  for ($kk=0; $kk < $num_chunks; $kk++){
    $chunks[$kk] = basename($file_path).'.chunk'.($kk+1);
    $chunk_handle = fopen($chunks[$kk], 'w');   //open the chunk file for writing

    //write the data to the chunk file 1k at a time
    while((ftell($chunk_handle) + 1024) <= $chunk_size){
      fwrite($chunk_handle, fread($file_handle, 1024));
    }

    if(($leftover = $chunk_size-ftell($chunk_handle)) > 0 ){
      fwrite($chunk_handle, fread($file_handle, $leftover));
    }
    fclose($chunk_handle);
  }

  if (($leftover = $size - ftell($file_handle)) > 0){
    $chunks[$num_chunks] = basename($file_path).'.chunk'.($num_chunks + 1);
    $chunk_handle = fopen($chunks[$num_chunks], 'w');
    while(!feof($file_handle)){
      fwrite($chunk_handle, fread($file_handle, 1024));
    }
    fclose($chunk_handle);
  }

  fclose($file_handle);
  return $chunks;
}
?>
Question by:bionicblakey

13 Comments
Author Comment

by:bionicblakey
ID: 18884840
I had the exit; in there for testing... it is not the reason the download failed. Please comment it out.
Thanks.
LVL 7

Expert Comment

by:Zack Soderquist
ID: 18884974
Are you trying to download or upload a file? i.e., are you trying to download a file to your computer from your site, or are you trying to upload a file from your computer to your website?
LVL 7

Expert Comment

by:Zack Soderquist
ID: 18884991
If you are trying to upload a file to your website, it's probably that your hosting server is limiting the maximum size of uploaded files. If it is hosted on a "unix"-flavored platform running Apache, you may be able to override the limits by creating a .htaccess file in the root of your site.

Put the following lines in your .htaccess file, setting the file size to what you want .. I set it to 200M (200 MB):

php_value upload_max_filesize 200M
php_value post_max_size 200M
php_value memory_limit 200M
php_value max_execution_time 3600

Author Comment

by:bionicblakey
ID: 18885002
"Need solution to download large 100+ MB files using PHP."

I am trying to DOWNLOAD.

Thanks.
LVL 7

Expert Comment

by:Zack Soderquist
ID: 18885012
I read the download part .. I was just confirming.

I'm confused as to why you are chopping up the file .. why can't you just download the file as a whole?
LVL 7

Expert Comment

by:Zack Soderquist
ID: 18885030
As I'm understanding what you're doing ..

PHP is a server side scripting language .. you will be chopping it up and putting it back together before the file is sent to the client .. which is kind of like driving to the post office and back to pick up your mail out of your mailbox in front of your house.

Without knowing what you're trying to accomplish, my initial reaction is that you may be approaching it the wrong way

Author Comment

by:bionicblakey
ID: 18885112
Hi

I have a lot of users downloading large files, and I am trying to cut down on server load and also on failed downloads.

When the files get to 100+ MB, it seems that a high % of downloads fail.

Do you have another idea?
LVL 7

Expert Comment

by:Zack Soderquist
ID: 18885278
Yes, change protocols.

Provide them an FTP link to download the files instead of using HTTP.
If you want, keep passing the smaller files via HTTP, but I would highly recommend that instead of downloading larger files via HTTP, you provide the user with an FTP link to download the file. FTP is a more efficient and faster protocol for downloading large files.
LVL 7

Accepted Solution

by:
Zack Soderquist earned 500 total points
ID: 18885289
You are probably getting a lot of failed downloads due to HTTP timeouts .. FTP should eliminate that ..

If you get a lot of failed downloads, then your users will probably try several times to download the file, which will dramatically increase your traffic. FTP again should resolve this, because they would get the file the first time.
LVL 48

Expert Comment

by:hernst42
ID: 18885938
If you have no security checks on whether a user is allowed to download a file, put the files into a directory that is accessible via a static (non-PHP) HTTP request, which the webserver can handle using kernel functions rather than by executing a PHP script. This also allows HTTP resume (if your webserver supports it), which might also cut down the load a lot.
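
As a hedged sketch of that setup (the alias, directory path, and directives are illustrative, using Apache 2.4 syntax, and mod_headers must be enabled), the webserver can be pointed at a static download directory and told to force a save dialog:

```
# Sketch only: let Apache serve the download directory statically, so the
# transfer bypasses PHP entirely and Range/resume requests work for free.
# Path and alias are illustrative; requires mod_headers.
Alias /downloads /var/www/virtual/ftp/pub
<Directory "/var/www/virtual/ftp/pub">
    # Force a save dialog instead of rendering inline
    Header set Content-Disposition attachment
</Directory>
```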
LVL 24

Expert Comment

by:slyong
ID: 18886529
You could use the fread function in PHP to make the download resumable, as shown in the comments section of the PHP fread manual (http://www.php.net/fread).
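
Expanding on that pointer: the essentials are to stream the file from disk in small chunks (instead of reading 100+ MB into memory) and to honor the HTTP Range header so interrupted downloads can resume. A minimal sketch, with illustrative function names not taken from the thread:

```php
<?php
// Sketch only: a resumable download handler. Function names like
// parse_range() and stream_file() are illustrative.

// Parse an HTTP Range header ("bytes=start-end") against a file size.
// Returns [start, end] byte offsets, or null when absent/unsatisfiable.
function parse_range(?string $header, int $size): ?array {
    if ($header === null || !preg_match('/bytes=(\d*)-(\d*)/', $header, $m)) {
        return null;
    }
    // "bytes=-500" means the last 500 bytes; "bytes=200-" means from 200 on.
    $start = ($m[1] === '') ? max(0, $size - (int)$m[2]) : (int)$m[1];
    $end   = ($m[2] === '' || $m[1] === '') ? $size - 1 : (int)$m[2];
    if ($start > $end || $end >= $size) {
        return null;   // unsatisfiable range
    }
    return [$start, $end];
}

// Stream the requested byte range to the client in 8 KB chunks, so the
// whole file is never held in memory at once.
function stream_file(string $path): void {
    $size  = filesize($path);
    $range = parse_range($_SERVER['HTTP_RANGE'] ?? null, $size);
    [$start, $end] = $range ?? [0, $size - 1];

    if ($range !== null) {
        header('HTTP/1.1 206 Partial Content');
        header("Content-Range: bytes $start-$end/$size");
    }
    header('Content-Type: application/octet-stream');
    header('Accept-Ranges: bytes');
    header('Content-Length: ' . ($end - $start + 1));
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');

    $fp = fopen($path, 'rb');
    fseek($fp, $start);
    $left = $end - $start + 1;
    while ($left > 0 && !feof($fp)) {
        echo fread($fp, (int) min(8192, $left));
        $left -= 8192;
        flush();
    }
    fclose($fp);
}
```

A client resuming with `Range: bytes=52428800-` would then get a 206 response continuing from the 50 MB mark instead of restarting from byte 0.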
LVL 14

Expert Comment

by:ygoutham
ID: 18887949
I am not commenting on the "chopping" and "putting together", but the 0-byte problem occurs if Apache is not the owner of the file, in which case it would always download the file as a 0-byte file.

Change the owner to the httpd user. Use a small file, like 1 MB or so; try chopping it to pieces and see if it comes back whole.
LVL 7

Expert Comment

by:Zack Soderquist
ID: 18969533
Has this issue been resolved?
