cure_22 asked:
Backup over ssh only new or edited files recursive
Hi,
I've been trying to find a script that backs up (mirrors) a huge load of data, sending only changed or new files to a remote server securely over ssh (scp preferred, or rsync) on a weekly basis, but I haven't found one. I can set up the cron job myself, but the recursive "only updated or new files" part is beyond my shell-scripting knowledge, so if you have this kind of script, or know where one is, please let me know. Making a tarball or sending all the data over the network is not an option.
Here's a script to transfer a whole directory from host-a to host-b:
#!/bin/sh
source=/var/tmp ; target=/var/backup
cd "$source" && tar cf - . | ssh -l username host-b "cd $target && tar xf -"
If you dislike tar you can use "dump" (or "ufsdump") or "cpio" instead.
The only problem is that ssh will ask for username's password. To
get around this you will either have to install expect or (the better
choice) set up ssh keys on host-a and host-b.
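Setting up the keys is a one-off task; here is a minimal sketch (the host name, user name, and key file path are placeholders -- adjust them for your systems):

```shell
# On host-a: generate a key pair with no passphrase, suitable for unattended cron jobs
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
# Append the public key to username's authorized_keys on host-b
ssh-copy-id -i ~/.ssh/id_rsa.pub username@host-b
# Verify: this should now run without prompting for a password
ssh -l username host-b true
```

Note that a passphrase-less key should only be readable by the backup user, since anyone who can read it can log in to host-b as username.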
BTW: The script is to be run on host-a
For cpio (transfer only files modified in the last two days):
#!/bin/sh
source=/var/tmp ; target=/var/backup
cd "$source" && find . -mtime -2 -type f -print | cpio -ocB | ssh -l username host-b "cd $target ; cpio -icBduml"
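Since the question asked for a weekly run, the same idea can be adapted by widening the find window; a sketch, assuming the script lives at the hypothetical path /usr/local/bin/weekly-backup.sh:

```shell
#!/bin/sh
# Hypothetical weekly variant of the cpio approach above.
source=/var/tmp ; target=/var/backup
# -mtime -8 catches anything modified in the last 8 days,
# giving a day of overlap so a delayed cron run misses nothing
cd "$source" && find . -mtime -8 -type f -print | cpio -ocB | \
    ssh -l username host-b "cd $target ; cpio -icBduml"
```

The matching crontab entry (run every Sunday at 03:00) would be:
0 3 * * 0 /usr/local/bin/weekly-backup.sh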
ASKER
Thanks guys for a lot of answers -- many ways, but only one can get the points.
The rsync idea is good -- but for my part I always try NOT to install any additional software
that isn't essentially required. Today, ssh is "a must" on most systems, and you can go with the
cpio approach I've mentioned.
I must admit the rsync approach transfers less data if the files are already on the target side,
but that is not the case when running
source=/var/tmp ; target=/var/backup
cd "$source" && find . -mtime -1 -type f -print | cpio -ocB | ssh -l username host-b "cd $target ; cpio -icBduml"
daily.
You would keep your data in a cvsroot directory, then set up your security so that only your 'remote server' can connect to the CVS server.
CVS supports ssh as one of its transfer protocols.
If your remote server connects and issues the commands (easily scriptable) to your data-holding server, it will only transfer updated and new files that have been placed in the cvsroot directory.
Check out http://www.cvshome.org/ for more information.
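As a rough sketch of that workflow (the repository path /var/cvsroot, the module name datamodule, and the host are all placeholders), the remote server would pull updates like this:

```shell
# Tell CVS to tunnel its :ext: connections over ssh
CVS_RSH=ssh
export CVS_RSH
# First time only: check out the module from the data-holding server
cvs -d :ext:username@host-a:/var/cvsroot checkout datamodule
# Thereafter (e.g. from cron): fetch only new and changed files
cd datamodule && cvs update -d
```

The -d flag on update tells CVS to also pick up directories added since the checkout.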