Solved

FTP script to avoid sending the same file multiple times.

Posted on 2014-12-21
160 Views
Last Modified: 2015-01-04
Hi,

I have the following FTP script that sends files every day:
HOST=199.168.0.105
USER=user
PASS=pass
ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd /destination/
put  "*.gz"
quit
EOF



What I want now is that it should not send the same files multiple times: it should check whether a file has already been sent that day and, if so, not send it again.

I'm thinking of keeping a file with the names of the files already sent; before sending another file, the script would check that list and skip the file if its name is there. I don't know how to do this, so if anyone has a good idea please share.

regards
Comment
Question by:hi4ppl
19 Comments
 
LVL 40

Expert Comment

by:omarfarid
ID: 40511462
You can always keep a reference file for the last transfer.

e.g. you could touch a file at the end of each transfer; the next time you run the script, it can find the files newer than that reference file and transfer only those.

Another way would be to do an ls (from the ftp connection) on the remote folder first, store the listing locally, and then exclude those files from the transfer.
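
A minimal sketch of the first idea (the /tmp paths are assumptions, not from the thread):

# first run only: create the reference so find has something to compare against
[ -f /tmp/myref ] || touch -t 197001010000 /tmp/myref
# collect the files modified since the previous run
find . -type f -newer /tmp/myref > /tmp/files
# ... transfer the files listed in /tmp/files, then move the reference forward
touch /tmp/myref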
 
LVL 1

Author Comment

by:hi4ppl
ID: 40511469
Hi,

any working example would be highly appreciate it?
0
 
LVL 40

Expert Comment

by:omarfarid
ID: 40511481
you could run something like:

HOST=199.168.0.105
USER=user
PASS=pass
# nlist returns bare file names, which is what the grep below needs
ftp -inv $HOST > /tmp/myremfiles <<EOF
user $USER $PASS
bin
cd /destination/
nlist
quit
EOF
ls > /tmp/myfiles
# drop every local file whose exact name already appears on the remote side
grep -vxFf /tmp/myremfiles /tmp/myfiles > /tmp/files
# stop here if there is nothing new to send
[ -s /tmp/files ] || exit 0
files=`cat /tmp/files`
ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd /destination/
mput $files
quit
EOF
 
LVL 1

Author Comment

by:hi4ppl
ID: 40511505
But where in this one does it check, so that it will not send duplicate files? I don't see any condition. Also, how does it know to change directory based on the system date of the machine? There is a new directory for every day, e.g. 20141216.

regards
 
LVL 40

Expert Comment

by:omarfarid
ID: 40511507
What the script is supposed to do is:

1- connect via ftp to the remote system, get the list of files already there, and store it in /tmp/myremfiles

2- list the local files and store them in /tmp/myfiles

3- exclude the files already on the remote side from the local list and store the result in /tmp/files

4- push the files listed in /tmp/files to the remote system
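
On the per-day directory question: assuming the remote folder really is named after the current date in YYYYMMDD form (as 20141216 suggests), the target can be built from the system date instead of being hard-coded:

DIR=/destination/`date +%Y%m%d`

and then use "cd $DIR" inside the ftp here-document in place of the fixed "cd /destination/".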
 
LVL 1

Author Comment

by:hi4ppl
ID: 40511514
Hi, thanks for the explanation, but the reason I'm asking not to send duplicate files is that the files I send get processed at the destination and removed, so they will not be there when I send the new files. I can't compare them like that; I have to track it locally, so that once I send a file it is never sent a second time.

thanks
 
LVL 3

Expert Comment

by:carlrjr
ID: 40511605
When I had a similar problem I would move the file(s) into a subfolder named 'archive' immediately after the put.
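
A minimal sketch of that pattern, reusing the HOST/USER/PASS variables from the question (the archive folder name is carlrjr's; everything else is an assumption):

mkdir -p archive
for f in *.gz; do
    ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd /destination/
put $f
quit
EOF
    # move the file aside so the next run does not pick it up again
    mv "$f" archive/
done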
 
LVL 40

Accepted Solution

by:omarfarid (earned 500 total points)
ID: 40511614
you can do the following:

1- transfer the files the first time
2- touch a local reference file:
touch /tmp/myref
3- next time, pick up only the files newer than the reference:
find . -type f -newer /tmp/myref > /tmp/files
4- run ftp:

HOST=199.168.0.105
USER=user
PASS=pass
files=`cat /tmp/files`
ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd /destination/
mput $files
quit
EOF

5- touch /tmp/myref again after a successful transfer, so the next run only picks up files that arrived since
 
LVL 1

Author Comment

by:hi4ppl
ID: 40511679
Sorry, I forgot to mention that where I pull the files from is SFTP and where I push them to is FTP.
 
LVL 40

Expert Comment

by:omarfarid
ID: 40511687
Please elaborate on the requirement, as I am now not clear on what is required.
 
LVL 1

Author Comment

by:hi4ppl
ID: 40512560
Hi,

What I mean is that the location where I pick up the files is SFTP, and the destination where I send the files is FTP.
 
LVL 40

Expert Comment

by:omarfarid
ID: 40512573
When I saw your script I assumed that the files were available on your local server. You can always pull the files with sftp first and then push them with ftp.
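
A minimal sketch of that pull-then-push flow, using sftp batch mode on stdin (the sftp host, the paths, and key-based login for sftp are assumptions; HOST/USER/PASS are the variables from the question):

# pull from the sftp source; -b - reads the batch commands from stdin
sftp -b - user@sftphost <<EOF
cd /source/
get *.gz
quit
EOF

# then push with the usual ftp script
ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd /destination/
mput *.gz
quit
EOF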
 
LVL 1

Author Comment

by:hi4ppl
ID: 40525141
Hi, sorry for the confusion, but this is my requirement:

- I have files stored locally on a system; they are collected from SFTP == Done
- I have to send these files to another FTP, which I am doing now with rsync
       - but I need to put a control in place here so it remembers the files already sent and does not send them to the FTP again; the destination picks the files up, processes them, and removes them, so if I send a file twice they will process it twice.

regards
 
LVL 40

Expert Comment

by:omarfarid
ID: 40525148
What do you do with a file on the local server after you send it? Can you move it to a different folder?
 
LVL 1

Author Comment

by:hi4ppl
ID: 40525169
I only collect the files locally in order to move them on to the other FTP, and if I move them away, rsync will bring them back from the SFTP... so yes, I can have a script that moves them to some temp folder, but all I want is not to send the same file to the destination a second time.
 
LVL 40

Expert Comment

by:omarfarid
ID: 40525215
What are the chances that a file is fetched again from sftp, or that its timestamp is updated?
 
LVL 1

Author Comment

by:hi4ppl
ID: 40525238
The local copy is like a ghost of the SFTP: if I remove anything, it is brought back from the SFTP. That is why I need to keep some log and only send the files that have not been sent before, with some kind of IF statement.
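
That sent-file log is simple enough to sketch; the log path is an assumption, and the ftp block is like the one in the question:

SENTLOG=/var/tmp/sent.log   # names of files already pushed, one per line
touch "$SENTLOG"
for f in *.gz; do
    # skip anything whose name is already in the log
    grep -qxF "$f" "$SENTLOG" && continue
    ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd /destination/
put $f
quit
EOF
    echo "$f" >> "$SENTLOG"
done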
 
LVL 1

Author Comment

by:hi4ppl
ID: 40525317
I'm actually using the script below and it does not work the way I wanted: send each file only once, keep a record of the file names sent, and on the next run check what was already sent and send only the new ones.

function sendfile {
   ftp
}

touch ~/tmp/newsync
if [ -f ~/tmp/lastsync ]; then
   NEWERFLAG="-newer ~/tmp/lastsync"
else
   NEWERFLAG=""
fi

find ~/user/data/ ${NEWERFLAG} ! -newer ~/tmp/newsync | while read file; do
      sendfile ${file}
done
touch -r ~/tmp/newsync ~/tmp/lastsync



I don't know what I'm doing wrong.
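
For reference, a corrected sketch of that script (the sendfile body, host and credentials are assumptions). Two things stand out in the original: the tilde inside the quoted NEWERFLAG value is never expanded when find runs, and sendfile ignores the file name it is given:

# push one file over ftp (stub; fill in the real host and credentials)
sendfile() {
    ftp -inv $HOST <<EOF
user $USER $PASS
bin
cd /destination/
put $1
quit
EOF
}

REF=$HOME/tmp               # use $HOME: tilde does not expand inside variables
touch "$REF/newsync"
if [ -f "$REF/lastsync" ]; then
    NEWERFLAG="-newer $REF/lastsync"
else
    NEWERFLAG=""
fi

# files changed after the last run but not after this run started;
# anything arriving mid-run is caught by the next run (hence the ! -newer bound)
find ~/user/data/ -type f ${NEWERFLAG} ! -newer "$REF/newsync" | while read file; do
    sendfile "$file"
done
touch -r "$REF/newsync" "$REF/lastsync"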
 
LVL 40

Expert Comment

by:omarfarid
ID: 40529257
Why do you want files that are not newer?