
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 953

Linux Script - Synchronize and keep incremental backup of directories/files

Hi,
I have a site with very active development carried on a day-to-day basis. I need to see what files my developers have worked on, and I need to back those files up for recovery.
The build of the system runs into gigabytes, so I don't want to back up all files. Storage is seriously expensive.

I am looking for a Linux script to
1. first back up the files changed every hour, and
2. then rsync all files from the master directory to a secondary folder on the same or a remote server every hour.

Many Thanks
Asked by: crazywolf2010
2 Solutions
 
Seth Simmons, Sr. Systems Administrator, commented:
find . -type f -mmin -60 -print0 | xargs -0 -I% rsync -tgcopR % /bar/



Generally speaking, you would use something like this:
find locates the files (replace . with a specific path if you are not working from the current folder);
-mmin -60 is the time window, matching anything modified in the last 60 minutes.

The results are then piped through xargs, which runs rsync to copy each match to the destination.

You can use whatever rsync options you want instead of the -tgcop I used in the example.

You can also run multiple rsync processes in parallel by adding -P n (keep n under 8 for performance reasons) right after xargs, though you may want to leave that out if you have very large files.

Replace /bar/ with the actual local destination.

As far as a remote server is concerned, to automate this you would need to set up SSH keys so rsync can run without prompting for a password.

Save it as backup.sh and add a cron entry to run it at the top of every hour:

0 * * * * /path/backup.sh


 
skullnobrains commented:
rsync can handle the above by itself using --compare-dest.

-------

rsync has an option, --link-dest, to do incremental backups by creating hardlinks to files that did not change. It needs a little scripting to be usable from cron. Here is an example script to handle incremental backups. The script assumes the destination directory is local, but you can adapt it; it is just simpler to pull rather than push in this context.

#!/bin/sh
# set these up
src=/local/dir
dst=/other/dir

# if a previous run left a "latest" symlink, hardlink unchanged files against it
test -L "$dst.latest" && lnk="--link-dest=$dst.latest"
dst_=$dst.$(date +%Y%m%d%H%M%S)
# $lnk is deliberately unquoted so it vanishes when unset
if rsync -az $lnk "$src" "$dst_.part"
then
  mv "$dst_.part" "$dst_"
  ln -sfn "$dst_" "$dst.latest"
else
  echo "RSYNC FAILURE, removing synchronised files" >&2
  rm -rv "$dst_.part"
  exit 1
fi

# clean up previously failed/incomplete synchros
rm -frv "$dst"*.part | sed 's/^/RM:	/'



-----------

you can also use a combination of --backup and --suffix in order to store everything in a single directory; use the current date as the suffix, for example.
