Download multiple files with Perl

I'm in a bind with downloading files.  I want the website visitor to check a series of boxes for the files they want to download and send the request to a Perl script that initiates the download.  I have a Perl script that downloads one file -- it works.  Is there a way to download an entire folder containing several files?  I've also tried using Archive::Zip to place all the requested files into one zip file for download, but that may be an unnecessary step.  Another problem with Archive::Zip: once downloaded, the files reveal the folder structure of my server -- probably not a good idea.

Can anyone suggest a secure and reliable module, if one exists, that will initiate the download of several requested files at the same time?

Thanks.


Asked by marcparillo

3 Solutions
 
jhurstCommented:
This is not really a Perl question as much as an HTTP protocol question.

Sadly, the answer is essentially no.

The HTTP protocol returns a response -- note the "a": one response per request, not several.

There are kludges, such as MIME multipart responses, but realistically your zip approach is the best choice, since none of the kludges is reliable and cross-platform.
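
To make that concrete: since a CGI script returns exactly one response body, the usual pattern is to build one archive and send it as a single attachment. A minimal sketch (the filename and file handle here are placeholders, not from the thread):

# Minimal sketch: stream one zip file as the single HTTP response.
# "MyFiles.zip" stands in for whatever archive the script has built.
my $zip_name = 'MyFiles.zip';
my $size     = -s $zip_name;

print "Content-type: application/zip\r\n";
print "Content-Length: $size\r\n";
print "Content-Disposition: attachment; filename=$zip_name\r\n\r\n";

open(my $fh, '<', $zip_name) or die "Cannot open $zip_name: $!\n";
binmode $fh;
binmode STDOUT;
my $buff;
while (read($fh, $buff, 8192)) {
    print $buff;
}
close($fh);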
 
mjcoyneCommented:
Will Archive::Zip still reveal the directory structure if you chdir() into the appropriate directory before calling addFile for the file?
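
For what it's worth, a minimal sketch of that idea (the paths and filenames below are hypothetical). Archive::Zip stores whatever path string you pass to addFile(), so adding files by bare name from inside the directory keeps the server layout out of the archive; if memory serves, addFile() also accepts an optional second argument giving the name to store in the archive, which achieves the same thing without changing directory:

use Archive::Zip qw( :ERROR_CODES );

my $zip = Archive::Zip->new();

# Option 1: chdir first, so only the bare filename ends up in the archive.
chdir '/var/www/vhosts/mydomain.com/httpdocs/FOLDER1'
    or die "Cannot chdir to FOLDER1: $!\n";
$zip->addFile('report.pdf');          # stored as "report.pdf"

# Option 2 (assumed API: addFile($fileName, $newName)): keep the absolute
# path on disk but store a clean member name in the archive.
$zip->addFile('/var/www/vhosts/mydomain.com/httpdocs/FOLDER2/notes.txt',
              'notes.txt');           # stored as "notes.txt"

$zip->writeToFileNamed('MyFiles.zip') == AZ_OK
    or die "Cannot write MyFiles.zip: $!\n";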
 
marcparilloAuthor Commented:
mjcoyne -- your suggestion worked.  I hadn't thought of chdir()'ing into the appropriate directory.
Is it possible to chdir() to a couple of different places in the same script?

The files to download are in separate folders.  Is it possible to chdir() around the server and collect files to download?
When I try this, I get an Internal Server Error.
(I suppose I could also move all the folders I want into the same folder before building the zip file and then download the files, but it would be nice to be able to move around the server.)


# head over to FOLDER1 and add a bunch of files stored in an array
chdir ('/var/www/vhosts/mydomain.com/httpdocs/FOLDER1/');
foreach $i (@file){
  $zip->addFile($i);
}
# now go to a different directory and add one more file from FOLDER2
chdir ('/var/ww/vhosts/mydomain.com/httpdocs/FOLDER2/');
$zip->addFile("$anotherfile.txt");

die "Cannot create $zip_file_name: $!\n" if $zip->writeToFileNamed($zip_file_name) != AZ_OK;
 
jhurstCommented:
You can change directories as many times as you need in the script.

Sounds like you have a solution here.
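
As a sketch of what that looks like in practice (using the $zip and @file from the code above, with hypothetical paths, and with each chdir checked so a bad or forbidden path produces a readable message instead of a bare server error):

# Each chdir is checked; if the path is wrong or not permitted,
# the script dies with the reason instead of failing silently.
chdir '/var/www/vhosts/mydomain.com/httpdocs/FOLDER1'
    or die "Cannot chdir to FOLDER1: $!\n";
$zip->addFile($_) for @file;

chdir '/var/www/vhosts/mydomain.com/httpdocs/FOLDER2'
    or die "Cannot chdir to FOLDER2: $!\n";
$zip->addFile('extra.txt');    # "extra.txt" is a placeholder filename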
 
marcparilloAuthor Commented:
Here's the script that works -- with only one chdir -- in case anyone's interested:

#!/usr/bin/perl

use strict;
use warnings;

use CGI;
use Archive::Zip qw( :ERROR_CODES :CONSTANTS );   # use() already dies if the module can't be found

my $zip = Archive::Zip->new();
my $q   = CGI->new;

# read in variables, including an array of files
# (these names come straight from the form, so in production they should be
# validated against a whitelist before being added to the archive)
my @file          = $q->param('file');
my $file2         = $q->param('file2');
my $zip_file_name = "MyFiles.zip";

# chdir into the appropriate directory so only bare filenames are stored in the zip
chdir('/var/www/vhosts/mydomain.com/httpdocs/FOLDER1/')
    or die "Cannot chdir to FOLDER1: $!\n";

foreach my $i (@file) {
    $zip->addFile($i);
}

# add a different file from a different folder ($file2 holds its name)
$zip->addFile("FOLDER2/$file2.txt");

die "Cannot create $zip_file_name: $!\n"
    if $zip->writeToFileNamed($zip_file_name) != AZ_OK;

# start downloading
my $path     = '/var/www/vhosts/mydomain.com/httpdocs/FOLDER1/';
my $filepath = "$path$zip_file_name";
my $size     = -s $filepath;
my $buff;

print "Content-type: application/forced-download\r\n";
print "Content-Length: $size\r\n";
print "Content-disposition: attachment;filename=$zip_file_name\r\n\r\n";

open(my $fh, '<', $filepath) or die "Cannot open $filepath: $!\n";
binmode $fh;
binmode STDOUT;
while (read $fh, $buff, 1024) {
    print $buff;
}
close($fh);

 
marcparilloAuthor Commented:
jhurst -- do you know why I would get an Internal Server Error when I try to chdir a couple of times in the same script?
 
jhurstCommented:
No, it makes no sense that you get that Internal Server Error.

I wonder whether you are trying to change to a directory that is not permitted, though.

I suspect you may be running on a Microsoft server, and "enough said" if that is the case.
 
marcparilloAuthor Commented:
I get a "Premature End of Script Headers" error when I try to

chdir('/path/to/one/folder/');

and then

chdir('/path/to/a/different/folder/');

 
marcparilloAuthor Commented:
It's a Unix server -- I'll check the permissions again -- but I'm in the httpdocs directory, so all should be okay.

 
jhurstCommented:
Is it Apache?

If so, this is VERY strange.  I would just expect that an attempt to change to a directory that is not permitted would simply fail in Perl.  The one thing I do see that is different between what you do and what I do when I use chdir is that I do not have the trailing /.  This does not mean that you are wrong, or even that I am right; it is just something I note is different.
 
mjcoyneCommented:
If you're getting "premature end of script headers", it's likely a permissions problem.  To see whether it's due to the second chdir call or to the particular directory, swap their order (i.e. try going into FOLDER2 first, then FOLDER1).  If FOLDER2 fails, it's the permissions (or something else about that directory); if FOLDER2 works and FOLDER1 -- which worked before when it came first -- now fails, then it's the multiple chdir calls.

If it seems like it's the latter (multiple chdir calls), and the permissions of the folders (not just the files within) are correct for the webserver user (whatever user your webserver runs as), check the path you're giving to chdir carefully.  If you're using absolute paths, try relative paths (e.g. if FOLDER1 and FOLDER2 are both subfolders of /var/www/vhosts/mydomain.com/httpdocs/, you can go from FOLDER1 to FOLDER2 by saying chdir('../FOLDER2');).

If you're already using relative paths, try absolute paths.

There is no reason why you shouldn't be able to chdir as many times as you like...
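
For example, a sketch of the relative-path variant (this assumes FOLDER1 and FOLDER2 really are sibling directories under httpdocs):

# Start from an absolute path once, then move between sibling folders
# with relative paths; each call is checked so any failure is visible.
chdir '/var/www/vhosts/mydomain.com/httpdocs/FOLDER1'
    or die "Cannot chdir to FOLDER1: $!\n";
# ... add files from FOLDER1 ...

chdir '../FOLDER2'
    or die "Cannot chdir to FOLDER2: $!\n";
# ... add files from FOLDER2 ...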
 
TintinCommented:
Add

use CGI::Carp qw(fatalsToBrowser);

and error checking to chdir, e.g.:

chdir '/var/www/vhosts/mydomain.com/httpdocs/FOLDER2/' or die "Cannot chdir to FOLDER2: $!\n";
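
Put together, the top of the script might look like this (a sketch based on the code posted above; fatalsToBrowser sends the text of any die to the browser instead of a bare 500 error, which is handy while debugging):

#!/usr/bin/perl
use strict;
use warnings;

use CGI;
use CGI::Carp qw(fatalsToBrowser);   # show fatal errors in the browser while debugging
use Archive::Zip qw( :ERROR_CODES :CONSTANTS );

my $q   = CGI->new;
my $zip = Archive::Zip->new();

chdir '/var/www/vhosts/mydomain.com/httpdocs/FOLDER1'
    or die "Cannot chdir to FOLDER1: $!\n";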
 
kblack05Commented:
When a CGI script fails to execute properly, the server generally returns an 'Internal Server Error' or '500 Server Error message'. In these cases, the reason for the failure will often be printed to Perl's STDERR filehandle. This tutorial describes how to redirect this message to an error.log.

To obtain useful error messages, add the following snippet to your script, just beneath the shebang line (the first line of the script; usually #!/usr/local/bin/perl or #!/usr/bin/perl):

BEGIN {
open (STDERR, ">/path/to/somewhere/error.txt");
}

Now run your script from your browser. After it gives you its error message, call your error.txt from the browser (for example, http://www.your_site.com/somewhere/error.txt; you might make a bookmark for it because, if you're like me, you'll be using it a lot!). The reason for your script's failure should be written there.
Note: If you call the error.txt file more than once, you may have to hit your browser's "reload" button.

If nothing's written in the error.txt log, it's probably one of two things:

  • Your shebang line is incorrect (i.e., the script can't find the Perl interpreter). Solution: ask your systems administrator where Perl is! See Perl/Base Paths.
  • The path to your somewhere/ directory is incorrect (i.e., the script couldn't find your error.txt to write its error message to). Solutions: run the envhtml.pl program and be sure you copy the DOCUMENT_ROOT correctly, or add the following snippet (suggested by Brian Foy and Spider) to your script, just beneath the shebang line:

BEGIN {
      ##  flush STDOUT immediately
      $| = 1;
      print "Content-type: text/html\n\n";
}

open STDERR, ">&STDOUT";

Now run your script again. This time the error message should appear right in your browser. The only problem with this second method is that, if your script has a Location command, it will cause a conflict (the line Location: http://www.address.com/ will be printed onto the screen rather than executing). So, once you've corrected your errors, you'll have to comment out the debugging line.
 
TintinCommented:
kblack05.

perldoc CGI::Carp

 
kblack05Commented:
Thanks Tintin, I am aware; however, this method will also redirect console and syslog errors and give a logged record.

Regards
 
marcparilloAuthor Commented:
Thanks everyone.
I can now chdir freely and I learned a few things about error tracing.

I'll split up the points.