Linux Script - Synchronize and keep incremental backup of directories/files

Hi,
I have a site under very active day-to-day development. I need to see which files my developers have worked on and back those files up for recovery.
The full build of the system runs into gigabytes, so I don't want to back up all files. Storage is seriously expensive.

I am looking for a Linux script to:
1. First, back up the changed files every hour, and
2. Then, rsync all files from the master directory to a secondary folder on the same or a remote server every hour.

Many Thanks
crazywolf2010 asked:
Seth Simmons, Sr. Systems Administrator, commented:
find . -type f -mmin -60 -print0 | xargs -0 -I% rsync -tgcop % /bar/%



Generally speaking, you would use something like the line above.

find locates the files (replace . with a specific path if you don't want to search the current folder); -type f restricts the match to regular files, and -mmin -60 is the time window, matching anything modified in the last 60 minutes. The -print0/-0 pair keeps filenames with spaces intact.

The results are then piped through xargs, which runs an rsync per file to copy it to the destination.

You can use whatever rsync options you want in place of the -tgcop I used in the example.

You can also run multiple rsync processes in parallel by adding -P n (keep n below 8 for performance reasons) right after xargs, though you may want to leave that out if you have very large files.

Replace /bar/ with the actual local destination.

As far as a remote server is concerned, automating the job means setting up SSH keys so rsync can connect over ssh without prompting for a password.
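For the one-time SSH key setup, a sketch (the key path, user, and hostname below are placeholders; ssh-copy-id is run once, interactively):

```shell
# Generate a passphrase-less key for the unattended cron job to use.
# A throwaway path is used here so the snippet runs anywhere;
# in practice you would keep the key under ~/.ssh.
key=$(mktemp -d)/backup_key
ssh-keygen -q -t ed25519 -N "" -f "$key"

# Run once by hand to install the public key on the backup host
# (user@backup.example.com is a placeholder):
#   ssh-copy-id -i "$key.pub" user@backup.example.com
# After that, rsync can push without a password prompt:
#   rsync -az -e "ssh -i $key" /local/dir/ user@backup.example.com:/backups/
```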

Save it as backup.sh, make it executable, and add a cron entry to run it at the top of every hour:

0 * * * * /path/backup.sh

skullnobrains commented:
rsync can handle the above by itself using --compare-dest: files identical to the reference copy are skipped, so only the changed files land in the new directory.

-------

rsync also has an option to do incremental backups by creating hardlinks of files that did not change: --link-dest. It needs a little scripting to be actually usable from cron. Here is an example script to handle incremental backups. The script assumes the destination directory is local, but you can adapt it; it is simply easier to pull rather than push in this context.

# set these up
src=/local/dir
dst=/other/dir

# reuse the previous snapshot for hardlinks if one exists
test -L "$dst.latest" && lnk="--link-dest=$dst.latest"
dst_=$dst.$(date +%Y%m%d%H%M%S)
# $lnk is deliberately unquoted: it expands to nothing on the first run
if rsync -az $lnk "$src" "$dst_.part"
then
  mv "$dst_.part" "$dst_"
  ln -sfn "$dst_" "$dst.latest"
else
  echo "RSYNC FAILURE, removing synchronised files" >&2
  rm -rv "$dst_.part"
  exit 1
fi

# clean up previously failed/incomplete synchros
rm -frv "$dst"*.part | sed 's/^/RM:	/'
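Since storage is a concern, note the script above keeps every snapshot forever. A retention sketch for the same $dst.YYYYmmddHHMMSS layout (uses GNU head/sort; the directory names here are fabricated for the demo):

```shell
# Keep only the 7 newest timestamped snapshots and delete the rest.
dst=$(mktemp -d)/backup          # stand-in for the real $dst
for ts in 20240101000000 20240102000000 20240103000000 \
          20240104000000 20240105000000 20240106000000 \
          20240107000000 20240108000000 20240109000000; do
  mkdir "$dst.$ts"               # fake snapshots for the demo
done

# Timestamps sort lexically, so `sort` orders oldest first;
# GNU `head -n -7` drops the last (newest) 7 from the kill list.
ls -d "$dst".[0-9]* | sort | head -n -7 | xargs -r rm -rf
```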



-----------

You can also use a combination of --backup and --suffix to keep everything in a single directory: each overwritten file is renamed with the suffix (the current date, for example) instead of being lost.