Avatar of kaleid
kaleid

asked on

another recursive question.

Is it possible to take all the files of the same name but with different extensions (i.e. test.txt, test.xls, test.doc) and put them in 1 .zip file, and do this for every directory in a tree?
Avatar of Mindo
Mindo

Yes, it's possible, why not? You just list the files you want to add to the zip file and tell the program to put them in a *.zip file; it doesn't matter what its name is. To make it recursive, just go through the directories and repeat this action on each of them.
assuming that you are operating under Unix or Xenix or something decent like that:

@listOfFiles = `find directory -name 'nameToLookFor*' -print`;
will generate a list of all files that start with nameToLookFor in any directory rooted at directory.
Avatar of ozo
use File::Find;
find(\&wanted, 'directory');
sub wanted {
  /^nameToLookFor/ && push @listOfFiles,$File::Find::name;
}
Avatar of kaleid

ASKER

I'm using WinNT. I can locate the files, but I'm not sure exactly how to zip them up automatically.
Avatar of kaleid

ASKER

Adjusted points to 20
Avatar of kaleid

ASKER

Still can't get the script to compress the files... is it possible to automate this process so it goes through every folder and zips up all files with the same name?
I tried push but it doesn't seem to work.
Please help...
Thanks.

 
Avatar of kaleid

ASKER

Adjusted points to 50
ozo: what's the advantage of using File::Find over @files = <"test.*">;?
kaleid: post your code that's not working... is it not reaching all desired files, or is the zip call failing?
There is a CPAN library for compressing files, Compress::Zlib. I am not sure if it will create a multi-file archive though.

Here is an example of how to compress each binary file in a directory to a separate .gz file.

use Compress::Zlib ;

foreach $file (@listOfFiles) {
  open(IN, $file)
    or die "Cannot open $file: $!\n" ;
  binmode IN ;

  $gz = gzopen("$file.gz", "wb")
    or die "Cannot create $file.gz: $gzerrno\n" ;
  while (read(IN, $buffer, 4096)) {
    $gz->gzwrite($buffer)
      or die "Error writing to $file.gz: $gzerrno\n" ;
  }
  $gz->gzclose() ;
  close(IN) ;
}


The advantage of using File::Find over @files = <"test.*">; is that File::Find can recursively visit every sub-directory in a tree.
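A quick way to see the difference, using a throwaway temp tree built on the fly (the file names here are just illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use File::Temp qw(tempdir);
use File::Path qw(make_path);

# Build a tiny two-level tree so the comparison is visible.
my $root = tempdir(CLEANUP => 1);
make_path("$root/sub");
for my $f ("$root/test.txt", "$root/sub/test.xls") {
    open my $fh, '>', $f or die "Cannot create $f: $!";
    close $fh;
}

my @globbed = glob("$root/test.*");   # looks in one directory only
my @found;
find(sub { push @found, $File::Find::name if /^test\./ }, $root);

printf "glob: %d file(s), find: %d file(s)\n",
       scalar @globbed, scalar @found;
```

The glob only sees the file in the top directory; File::Find also reaches the one in the sub-directory.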

Do you want each of the files compressed separately?
Or all the files compressed in one zip?
Or one zip per each directory?
Avatar of kaleid

ASKER

ozo,
I am trying to compress the files into one zip.  That is, all the files with the same name, which will in most cases give me many zip files per directory.  I used the code that you had posted, but I don't think it created the zip files.  Do I have to change anything in what you had written (other than the directory name) to make it work?

#!/usr/bin/perl

use File::Find;

find(\&wanted, 'directory');

sub wanted {
  /^nameToLookFor/ && push @listOfFiles, $File::Find::name;
}

Avatar of kaleid

ASKER

Adjusted points to 70
Ozo's code will build up your list of files. You still need to run extra code across this array to compress your files. The Compress::Zlib example should get you started. Alternatively, get something like the 'tar' utility for Windows, which will make one big file out of a directory tree.
Avatar of kaleid

ASKER

teraplane: will zlib work with winnt?
Avatar of kaleid

ASKER

Adjusted points to 100
Yes, I have tried it out and it generates zip files without any problems
Avatar of kaleid

ASKER

teraplane:  did it work recursively?  I am unfamiliar with zlib.  Do I unzip it to the /bin directory after I have downloaded it?
ozo's code does the recursive search and builds up the list of files. My code then compresses all these files. I suggest you download the zlib package from CPAN, check the README instructions, install it and then try doing some simple examples.

The best way for you to learn is by trial and error, rather than me giving a complete solution to one specific problem. This sample code should point you in the right direction.
Avatar of kaleid

ASKER

Adjusted points to 125
Avatar of kaleid

ASKER

Adjusted points to 150
Avatar of kaleid

ASKER

Adjusted points to 200
Avatar of kaleid

ASKER

is there a way to compress these with the dos version of winzip?
please help.
ASKER CERTIFIED SOLUTION
Avatar of omere
omere

This solution is only available to members.
To access this solution, you must be a member of Experts Exchange.
Avatar of kaleid

ASKER

Thank you, I've given it a try and got this message when trying to run:

"In string, @listfile now must be written as \@listfile at D:\temp\test\perl\d2test2.pl line 11, near 'pkzip -a -ex zipname @listfile'"

running this script...

#!/usr/bin/perl

use File::Find;

find(\&wanted, 'D:\temp\test\perl');  # or whatever, substitute C:\ for the directory

open(OUT, ">listfile.pkz");
print OUT join("\n", @listOfFiles);
close(OUT);

system("pkzip -a -ex zipname @listfile.pkz");

sub wanted {
  /^filename/ && push @listOfFiles,$File::Find::name; # substitute filename
}
Add a \ before @, or use:
system("pkzip -a -ex zipname " . '@listfile.pkz');
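The error comes from Perl trying to interpolate the array @listfile inside double quotes. A minimal sketch of the two fixes, just printing the command strings to show they come out identical:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Double quotes interpolate @arrays; either escape the @ or use single quotes.
my $escaped = "pkzip -a -ex zipname \@listfile.pkz";
my $single  = 'pkzip -a -ex zipname @listfile.pkz';

print "$escaped\n";
print "$single\n";
```

Either form passes the literal text @listfile.pkz through to pkzip, which treats a leading @ as "read the file list from this file".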
Avatar of kaleid

ASKER

Ok,
(thank you by the way, this is the most progress I've made to date...)
it worked, but it zipped everything into one zip file.  What I need it to do is zip together only files with similar filenames, for example:

in directory d:\temp\test\perl\

test1.txt
test1.xls
test1.ppt

test2.txt
test2.xls
test2.ppt

after running the script I would like to end up with two (or however many sets are in a directory) zip files: one "test1.zip" and one "test2.zip"


thanks.
Ah, that's something else then.

This would require some more work: you would need to recurse through all of the subdirectories, and in each subdirectory build a hash keyed by base filename, add the files with similar names to each entry, and then zip each group before moving on.

If you don't know how to do it, lemme know and I'll post some code up.
Avatar of kaleid

ASKER

EXACTLY!!!...
problem is... I don't know how to do it, and I've needed to do it for weeks.
any help you can offer is very much appreciated.
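The grouping step omere describes can be sketched as follows. A hard-coded file list stands in for one directory's contents (in the real script it would come from readdir() or File::Find), and the pkzip call is left commented out so the sketch just prints what it would zip:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Stand-in for one directory's listing.
my @files = qw(test1.txt test1.xls test1.ppt test2.txt test2.xls test2.ppt);

# Group by base name (everything before the last dot).
my %groups;
for my $f (@files) {
    my ($base) = $f =~ /^(.+)\.[^.]+$/ or next;
    push @{ $groups{$base} }, $f;
}

# One zip per group; the commented system() call is what a real run would do.
for my $base (sort keys %groups) {
    print "$base.zip <= @{ $groups{$base} }\n";
    # system("pkzip -a -ex $base.zip @{ $groups{$base} }");
}
```

Run per directory while recursing with File::Find, this produces one zip per set of same-named files, which is the behaviour asked for above.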