shragi (India) asked:
Perl: copy files while avoiding duplicates

Hi,

I am writing the script below to copy files from one location to another, but how can I avoid copying duplicates?
I know what copy does in Perl: if it sees a duplicate file, it replaces it. I don't think that is a good idea, because I generally have at least 1000 files in the source folder and get at most 20 new files each day. When I run my Perl program, it only needs to copy those 20 files instead of all 1000; copying all 1000 files again wastes time and server resources.

So how can I check beforehand whether a file already exists at the destination?
Also, which is better in terms of memory and speed: replacing duplicate files, or checking for duplicates first? From what I know, checking for duplicates is more effective, because of the size of the files that would otherwise be replaced.
My Perl script:
#!/usr/bin/perl

use strict;
use warnings;

my $source = "C:\\test";
my $destination = "C:\\test1";

opendir(my $dh, $source) or die "Cannot open $source: $!\n";
my @files = readdir($dh);
closedir($dh);

foreach my $file (@files) {
  next if ($file !~ /\.txt$/i);
  system("copy \"$source\\$file\" \"$destination\"");
 
}
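One common way to skip files that were copied on a previous run (a sketch, not necessarily the accepted answer, which is member-only) is to test each destination path with Perl's `-e` file-test operator before copying, and to use the core `File::Copy` module instead of shelling out to the Windows `copy` command. The function name `copy_new_files` and the return value are my own additions for illustration; the `C:\test` paths from the question would be passed in as arguments (forward slashes also work on Windows Perl):

```perl
use strict;
use warnings;
use File::Copy qw(copy);

# Copy every .txt file from $source to $destination, skipping any
# file that already exists at the destination.  Returns the number
# of files actually copied.
sub copy_new_files {
    my ($source, $destination) = @_;
    opendir(my $dh, $source) or die "Cannot open $source: $!\n";
    my @files = grep { /\.txt$/i } readdir($dh);
    closedir($dh);

    my $copied = 0;
    foreach my $file (@files) {
        my $dst = "$destination/$file";
        next if -e $dst;    # duplicate: already copied on an earlier run
        copy("$source/$file", $dst) or warn "Could not copy $file: $!\n";
        $copied++;
    }
    return $copied;
}
```

This only checks for existence by name; it will not re-copy a file whose contents changed at the source. If that matters, a modification-time or size comparison (`-M`, `-s`) would be needed as well.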
SOLUTION by wilcoxon (United States)

[Solution text available to Experts Exchange members only.]
Tintin commented:

Why not just use rsync?

shragi (asker) replied:

What is rsync?
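For context: rsync is a standard Unix file-synchronization tool that performs exactly this "skip what's already there" check itself (on Windows it is available through Cygwin or WSL). A minimal invocation, with placeholder paths, might look like this; `--ignore-existing` skips any file already present at the destination:

```shell
# Create source and destination directories with one pre-existing file.
src=$(mktemp -d)
dst=$(mktemp -d)
echo "old" > "$src/a.txt"
rsync -a --ignore-existing "$src/" "$dst/"

# A later run transfers only the new file; a.txt is left untouched.
echo "new" > "$src/b.txt"
rsync -a --ignore-existing "$src/" "$dst/"
ls "$dst"
```

Note that `--ignore-existing` matches by name only; without it, rsync's default behavior is to transfer files whose size or modification time differ, which handles changed files as well.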
ASKER CERTIFIED SOLUTION

[Solution text available to Experts Exchange members only.]