FTP script that does not send the same file multiple times

Hi,

I have the following FTP script that sends files every day:
HOST=199.168.0.105
USER=user
PASS=pass
ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd /destination/
mput *.gz
quit
EOF

What I want now is for it not to send the same files multiple times: it should check whether a file has already been sent that day and, if so, not send it again.

I'm thinking of creating a file that records the name of each file sent; before sending another file, the script would check that record and skip the file if it has already been sent. I don't know how to do this, so if you have any good ideas please share.

regards
hi4pplAsked:
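A minimal sketch of that record-keeping idea, assuming the script runs in the directory holding the .gz files; the log path is an arbitrary choice, and putting the date in its name makes the check reset each day:

HOST=199.168.0.105
USER=user
PASS=pass
SENT_LOG=/tmp/sent.$(date +%Y%m%d).log   # hypothetical path; one log per day
touch "$SENT_LOG"

for f in *.gz; do
    # skip anything already recorded in today's log
    grep -qxF "$f" "$SENT_LOG" && continue
    ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd /destination/
put $f
quit
EOF
    echo "$f" >> "$SENT_LOG"   # remember the file so it is not sent again today
done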
 
omarfaridCommented:
you can do the following:

1- transfer the files the first time
2- touch a local reference file:
touch /tmp/myref
3- next time, find the files newer than the reference file:
find . -type f -newer /tmp/myref > /tmp/files
4- run ftp

HOST=199.168.0.105
USER=user
PASS=pass
files=`cat /tmp/files`
ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd /destination/
mput $files
quit
EOF
touch /tmp/myref   # refresh the reference so the next run only picks up newer files
 
omarfaridCommented:
You can always keep a reference file marking the last transfer.

e.g. you could touch a file at the end of each transfer; the next time the script runs, it finds only the files newer than that reference file and transfers those.

Another way would be to do an ls (from the ftp connection) on the remote folder first, store the listing locally, and then exclude those files from the transfer.
 
hi4pplAuthor Commented:
Hi,

Any working example would be highly appreciated.
 
omarfaridCommented:
you could run something like:

HOST=199.168.0.105
USER=user
PASS=pass
# get the list of remote file names (nlist prints bare names; a plain ls
# gives a long listing whose lines would never match the local names)
ftp -inv $HOST > /tmp/myremfiles <<EOF
user $USER $PASS
bin
cd /destination/
nlist
quit
EOF
# list the local files
ls > /tmp/myfiles
# keep only the local files not already on the remote side: -x matches whole
# lines and -F takes them literally, so the ftp session chatter captured in
# /tmp/myremfiles never matches a real file name
grep -v -x -F -f /tmp/myremfiles /tmp/myfiles > /tmp/files
files=`cat /tmp/files`
ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd /destination/
mput $files
quit
EOF
 
hi4pplAuthor Commented:
But where in this one does it check so that it will not send duplicate files? I don't see any condition. Also, how does it know to change directory based on the system date of the machine, since every day there is a new directory such as 20141216?

regards
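On the directory question: if the remote folders are named after the date (20141216 in the example), the cd target can be built from the system date. A sketch, assuming the local system date matches the remote folder naming:

REMOTE_DIR=/destination/$(date +%Y%m%d)   # e.g. /destination/20141216
ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd $REMOTE_DIR
mput $files
quit
EOF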
 
omarfaridCommented:
What the script is supposed to do is:

1- connect via ftp to the remote system, get the list of files already on the remote side, and store it in /tmp/myremfiles

2- list the local files and store them in /tmp/myfiles

3- exclude the remote files from the local list and store the result in /tmp/files

4- push the files listed in /tmp/files to the remote system
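If the remote server does not support nlist, the bare names can usually be recovered from a long listing instead. A sketch, assuming the file name is the last whitespace-separated field of each line and contains no spaces:

# take the last field of every captured line as a candidate name
awk '{print $NF}' /tmp/myremfiles > /tmp/myremnames
grep -v -x -F -f /tmp/myremnames /tmp/myfiles > /tmp/files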
 
hi4pplAuthor Commented:
Hi, thanks for the explanation, but the reason I'm asking not to send duplicate files is that the files I send are processed at the destination and are no longer there when I send the new file, so I can't compare against the remote side like that. I have to do it locally: if I have sent a file once, it should not be sent a second time.

thanks
 
carlrjrCommented:
When I had a similar problem I would move the file(s) into a subfolder named 'archive' immediately after the put.
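A sketch of that approach: after the transfer, the sent files are moved into an archive subfolder (a hypothetical name) so the next run's *.gz glob no longer sees them:

HOST=199.168.0.105
USER=user
PASS=pass
mkdir -p archive
ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd /destination/
mput *.gz
quit
EOF
mv *.gz archive/   # anything just sent is now out of the way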
 
hi4pplAuthor Commented:
Sorry, I forgot to mention that the location I pull the files from is SFTP and the destination I push them to is FTP.
 
omarfaridCommented:
Please elaborate on the requirement, as I am no longer clear on what is required.
 
hi4pplAuthor Commented:
Hi,

What I mean is that the location where I pick up the files is SFTP, and the destination where I send them is FTP.
 
omarfaridCommented:
When I saw your script I assumed that the files were available on your local server. You can always pull those files with sftp and then push them with ftp.
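A sketch of that pull-then-push flow. It assumes key-based authentication on the sftp side (the OpenSSH sftp client does not take a password on the command line), and the host and paths are hypothetical:

# pull any .gz files from the sftp source into the local staging directory
echo "get /source/dir/*.gz /local/staging/" | sftp -b - user@sftphost
# then push from /local/staging/ with one of the ftp scripts above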
 
hi4pplAuthor Commented:
Hi, sorry for the confusion, but this is my requirement:

- I have files stored locally on a system, collected from SFTP == Done
- I have to send these files to another FTP, which I'm doing now with rsync
       - but I need a control in place here: the script should remember which files have been sent and not send those files to the FTP again. The destination picks the files up, processes them, and removes them, so if I send a file twice they will process it twice.

regards
 
omarfaridCommented:
What do you do with a file on the local server after you send it? Can you move it to a different folder?
 
hi4pplAuthor Commented:
I only collect the files locally in order to move them on to the other FTP, and if I move them away they will be rsynced back from the SFTP... so yes, I can have a script that moves them to some temp folder, but all I want is not to send the same file to the destination a second time.
 
omarfaridCommented:
What are the chances that a file is fetched again from sftp or that its timestamp is updated?
 
hi4pplAuthor Commented:
The local copy is like a ghost of the SFTP: if I remove anything, it is brought back from the SFTP. That is why I need to keep some log and send only the files that have not been sent before, with some kind of if statement.
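The if statement being described could look like this (a sketch: sent.log is a hypothetical log of names already pushed, and sendfile stands for the ftp transfer):

touch /tmp/sent.log
if ! grep -qxF "$file" /tmp/sent.log; then
    sendfile "$file"                 # push the file to the ftp destination
    echo "$file" >> /tmp/sent.log    # record it so later runs skip it
fi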
 
hi4pplAuthor Commented:
I'm actually using this, and it does not work the way I want. It should send each file only once, keep a record of the file names sent, and on the next run check what was already sent and send only the new ones:

# send one file via ftp; fills in the stub and assumes HOST, USER and PASS
# are set as in the earlier scripts
sendfile () {
   ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd /destination/
put $1 ${1##*/}
quit
EOF
}

mkdir -p ~/tmp
touch ~/tmp/newsync
if [ -f ~/tmp/lastsync ]; then
   # a tilde is not expanded inside a quoted variable, so use $HOME
   NEWERFLAG="-newer $HOME/tmp/lastsync"
else
   NEWERFLAG=""
fi

# files modified since the last sync but not after this run started
find ~/user/data/ -type f ${NEWERFLAG} ! -newer ~/tmp/newsync | while read file; do
      sendfile ${file}
done
touch -r ~/tmp/newsync ~/tmp/lastsync

I don't know what I'm doing wrong.
 
omarfaridCommented:
Why do you want files that are not newer?