Moving files from a watched folder/directory

mce-man-it asked:
I need a Perl script to move files from a watched folder/directory to a holding directory on another machine. My requirements are:

the watched folder will have its contents processed every hour (I intend to use cron for this; a sample crontab entry is below)

the script processes each file one at a time, but only if the file is complete (files may still be being copied into the directory over a wireless network, so maybe use a date stamp, e.g. only handle files older than 120 seconds, or check for EOF?), and moves each file to a remote holding directory on another computer
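
A minimal crontab entry for the hourly run might look like this (the script path below is just a placeholder for wherever the final script ends up):

# m h dom mon dow  command
0 * * * * /usr/bin/perl /usr/local/bin/watch_move.pl >> /tmp/watch_move.log 2>&1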

The code I've attached is a script used by a program called TWiST as a custom move. The #Transfer pdf command is the part I need to use, but with tweaks, as it triggers database actions on the receiving server. The files can be anything from PDFs to EPS images.

Thanks in advance
#!/usr/bin/perl
 
use File::Basename;
 
# Input of variables from TWiST
 
# Source file
$src = $ARGV[0];
 
# Host -- can we replace $ARGV[1] with the IP address of our destination server?
$host = $ARGV[1];
 
# Destination file
$dest = $ARGV[2];
 
# User name
$user = "xxxxxxx"; #will fill in with user
 
# User password
$passwd = "xxxxxx"; #will fill in with user's password
 
# Calc dirs
$src_dir = dirname($src);
$src_file = basename($src);
 
$dest_dir = dirname($dest);
 
# Set variables -- how do we put a fixed destination directory here, e.g. /RAID/holding/sort?
$dest_dir =~ s/ /%20/g;
 
# Transfer pdf command
$send_pdf = "cd \"$src_dir\"; /usr/bin/curl " . "-F \"overwrite=1\" " . "-F \"filedata=\@$src_file\" " . "-u $user:$passwd " . " -s \"http://$host/webnative/upload?$dest_dir\" ";
# do we even need the leading cd "$src_dir"; part?
 
# Send the pdf to the server
system (" $send_pdf > /tmp/send_pdf ");
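
For reference, with the variables filled in, the command the script ends up running for one file looks something like this (server address, account and paths are just placeholders):

cd "/Volumes/watched"; /usr/bin/curl -F "overwrite=1" -F "filedata=@job001.pdf" -u xxxxxxx:xxxxxx -s "http://192.168.0.10/webnative/upload?/RAID/holding/sort"

(I think the cd is there so that curl can find the file named after the @ relative to the current directory.)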

Commented:
You can try a tool for file replication under Linux: rsync
http://www.samba.org/ftp/rsync/rsync.html
It is free and takes care of everything needed to replicate files on a remote PC.

Look at this simple tutorial:
http://troy.jdmz.net/rsync/index.html
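
A one-liner along these lines (host and paths are placeholders) would mirror the watched folder to the remote machine and delete each source file once it has been transferred:

rsync -av --remove-source-files /watched/folder/ user@remotehost:/RAID/holding/sort/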

Author

Commented:
rsync wouldn't be any use in this instance, as the move needs to use the curl part of the script I posted: the upload feature triggers database actions, as I already stated.
Thanks anyway.
Top Expert 2009

Commented:
So, does the script need to search a directory for any new files, or do you have another script that does that, which will then call this script to do the upload?

>>moves each file to a remote holding directory
So once the files are uploaded, they should be removed?

Author

Commented:
yes, it will only need to search/watch one directory. I need to use parts of the posted script

/usr/bin/curl " . "-F \"overwrite=1\" " . "-F \"filedata=\@$src_file\" " . "-u $user:$passwd " . " -s \"http://$host/webnative/upload?$dest_dir\" ";

as the way to move each file; the /webnative/upload is a trigger for a database refresh. Once the files have uploaded they can be removed.
Top Expert 2009
Commented:

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
 
my $username = "xxxxxxx"; #will fill in with user
my $password = "xxxxxxx"; #will fill in with user's password
my $host     = "xxx.xxx.xxx.xxx"; #will fill in with destination server address
my $SearchDirectory = '/path/to/search';
my $DestDirectory   = '/path/to/dest';
 
my $StartTime = time;
find(\&found, $SearchDirectory);
 
sub found {
    #File::Find chdirs into each directory, so $_ holds the bare file name
    my $mtime = (stat($File::Find::name))[9];
    #Skip anything modified within the last two minutes -- it may still be copying
    return unless $StartTime - $mtime > 120;
    my $cmd = qq{/usr/bin/curl -F "overwrite=1" -F "filedata=\@$_" -u $username:$password -s "http://$host/webnative/upload?$DestDirectory"};
    system("$cmd > /tmp/send_file");
    #Do you want to check to see if it was successful?
    if($?) {
        warn "Error with $File::Find::name\n";
        return;
    }
    unlink($File::Find::name) or warn "Could not remove '$File::Find::name': $!\n";
}


Author

Commented:
great work. Just doing some quick testing, running it from Terminal for the moment to check the move (Mac OS X).
I've hit two snags: it's moving more than just files, including hidden folders like the Network Trash Folder. Can you add anything so it only sees files with a .pdf or .eps extension?
Also, the script seems to loop; can it be ended once no more files are found, then be run again after an hour etc.?
Other than that it works, even the database trigger.

fantastic
Top Expert 2009
Commented:
>>the script seems to loop
It'll check every directory/subdirectory until everything in $SearchDirectory has been checked, but it'll only check each one once.

If you want to skip directories, only process "normal" files, or only process files with a .pdf or .eps extension, see the code below.

sub found {
    my $mtime = (stat($File::Find::name))[9];
    return unless -f _;            #Skip non-normal files (eg: directories/pipes/sockets...)
    return if -d _;                #Skip directories (already excluded by the -f test; shown as an alternative)
    return unless /\.(pdf|eps)$/;  #Skip anything without a .pdf or .eps extension
    return unless $StartTime - $mtime > 120;   #Skip files modified within the last two minutes
    my $cmd = qq{/usr/bin/curl -F "overwrite=1" -F "filedata=\@$_" -u $username:$password -s "http://$host/webnative/upload?$DestDirectory"};
    system("$cmd > /tmp/send_file");
    #Check whether the upload succeeded before removing the source file
    if($?) {
        warn "Error with $File::Find::name\n";
        return;
    }
    unlink($File::Find::name) or warn "Could not remove '$File::Find::name': $!\n";
}
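
One small caveat: the extension test above is case-sensitive, so a file named JOB.PDF would be skipped. If mixed-case extensions are possible, you could make the match case-insensitive instead:

return unless /\.(pdf|eps)$/i;  #Skip anything without a .pdf or .eps extension, any case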


Author

Commented:
you are the man. I'm going to do some more testing Monday, but from what you have given me the solution works like a dream.

thanks again
