Linux Script - Synchronize and keep incremental backups of directories/files

I have a site with very active development carried on day to day. I need to see which files my developers have worked on and back them up for recovery.
The full build of the system runs into gigabytes, so I don't want to back up all the files. Storage is seriously expensive.

I am looking for a Linux script to:
1. First back up the changed files every hour, and
2. then rsync all files from the master directory to a secondary folder on the same or a remote server every hour.

Many Thanks
Seth Simmons, Sr. Systems Administrator, commented:
find . -type f -mmin -60 -print0 | xargs -0 -I% rsync -tgcop % /bar/%


generally speaking, you would use something like this
find locates the files (replace . with a specific path if you are not working from the current folder)
-mmin -60 is the time window; it matches anything modified in the last 60 minutes

the results are then piped through xargs, which runs an rsync to the destination for each file; the -print0 / -0 pair keeps filenames containing spaces intact

you can use whatever rsync options you want instead of -tgcop as i used in the example

you can also run multiple rsync processes by adding -P x (something less than 8 for performance reasons) right after xargs, though you could leave it out if you have very large files
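
for instance, a parallel variant of the same command, as a sketch (the /bar/ destination is still a placeholder):

find . -type f -mmin -60 -print0 | xargs -0 -P 4 -I% rsync -tgcop % /bar/%    # up to 4 rsync processes at a time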

replace /bar/ with the actual local destination

as far as a remote server is concerned, to automate this you would need to set up ssh keys so rsync can connect without prompting for a password
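
a minimal sketch of that one-time key setup, assuming a hypothetical backup user and remote host (adjust names and paths to your environment):

ssh-keygen -t ed25519 -N '' -f ~/.ssh/backup_key                  # passwordless key for unattended cron use
ssh-copy-id -i ~/.ssh/backup_key.pub backup@remote.example.com    # install the public key on the remote server
# then point rsync at the key:
find . -type f -mmin -60 -print0 | xargs -0 -I% rsync -tgcop -e "ssh -i ~/.ssh/backup_key" % backup@remote.example.com:/bar/%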

save it as a script and add a cron entry to run it hourly:

0 * * * * /path/

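putting the pieces together, the hourly script might look like this sketch; SRC and DST are hypothetical paths to adjust, and rsync's -R (--relative) recreates each file's relative path under the destination:

#!/bin/sh
# hourly-backup.sh - copy files changed in the last hour (example paths)
SRC=/var/www/site
DST=/backup/site
cd "$SRC" || exit 1
find . -type f -mmin -60 -print0 | xargs -0 -I% rsync -Rtgcop % "$DST"/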

skullnobrains commented:
rsync can handle the above by itself using --compare-dest
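
as a sketch with hypothetical paths: keep one full reference copy, and let each hourly run write only the files that differ from it into a fresh timestamped directory:

# /backup/full holds a complete reference copy of the site
rsync -a --compare-dest=/backup/full/ /var/www/site/ /backup/changed.`date +%Y%m%d%H`/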


rsync has an option to do incremental backups by creating hardlinks to files that did not change, using --link-dest. it needs a little scripting to be actually usable from cron. here is an example script to handle incremental backups (the src/dst values at the top are placeholders). the script assumes the destination dir is local, but you can adapt it; it is simply easier to pull rather than push in this context

#!/bin/sh
# set these up (example values - adjust to your layout)
src=user@master:/var/www/site/
dst=/backup/site

# hard-link unchanged files against the latest successful backup, if any
test -L $dst.latest && lnk="--link-dest=$dst.latest"
dst_=$dst.`date +%Y%m%d%H%M%S`
if rsync -az $lnk $src $dst_.part
then
  mv $dst_.part $dst_
  ln -sfn $dst_ $dst.latest
else
  echo RSYNC FAILURE, removing synchronised files >&2
  rm -rv $dst_.part
  exit 1
fi

# cleanup previously failed/incomplete synchros
rm -frv $dst*.part | sed 's/^/RM:	/'



you can also use a combination of --backup and --suffix in order to store everything in a single directory; use the current date as the suffix, for example.
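
a sketch of that single-directory variant, again with hypothetical paths: the destination always holds the current copy, and any file rsync replaces is kept next to it with a timestamp suffix:

rsync -a --backup --suffix=.`date +%Y%m%d%H%M%S` /var/www/site/ /backup/site/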