Solved

Daily Incremental Backups using rsync

Posted on 2009-04-10
1,511 Views
Last Modified: 2013-12-01
I run a daily rsync backup of one ReadyNAS NV+ device to two others connected to our network. We're going to start moving one off-site each week for disaster recovery and wanted to also rsync the daily changes to another directory for daily off-site rotation.

Currently, I have a script (attached, dirsync.sh) that gets passed the directory name and then rsyncs the contents from the main NAS to each of the secondary devices, logs the progress to a text file, and emails the file to me when completed successfully. This is called by another script (attached, daily.sh) that cycles through the 7 different shares that get backed up (passed as the first argument).

The ultimate end goal is to get everything changed since the spare was taken off-site (Friday) synced to a specific place. That means for each of the shares at /mount/nas4 passed from daily.sh, the changes since the last Friday will be synced to /mount/nas6/daily (i.e., on Wednesday it will copy everything modified at /mount/nas4/IT/ in the last 5 days to /mount/nas6/daily/IT/). On Friday it should be backing up everything in the last 1 day and deleting everything else.

What I think I need to do is use the backup feature of rsync, but I can't figure out the syntax. Is there an easy way to add it to my existing script? I'm okay running more scripts, if necessary, to accomplish this. But I am in a little over my head getting this running.
#!/bin/sh

# NAS Share Backup Script, v3.0
# /usr/local/scripts/rsyncscripts/dirsync.sh
# Created By:    
# Last Modified: 20090324 1947

# Share name passed from daily.sh
script="$1"

# Get date and time for log name
today=$(date +%Y%m%d-%H%M%S)

# Get date for email notification
dateFormat=$(date +%Y%m%d)

log="/var/log/rsync/$today-$script.log"

# Backup NAS4 to NAS3
date > "$log"
rsync --del --progress --stats -v -r -t "/mount/nas4/$script/" "/mount/nas3/$script/" >> "$log"

# Insert Stage 1 completion date/time into log
date >> "$log"

# Backup NAS4 to NAS5
rsync --del --progress --stats -v -r -t "/mount/nas4/$script/" "/mount/nas5/$script/" >> "$log"

# Insert final completion date/time into log
date >> "$log"

# Build completion time for email
complete=$(date +%Y%m%d-%H%M%S)

# Build email body (printf expands the \n escapes reliably; plain echo may not)
emailbody="A successful backup of $script has been completed.\nThe backup began at $today and ended at $complete.\nA log file is attached."
printf '%b\n' "$emailbody" > "/tmp/emailbody-$script.txt"

# Send email and cleanup
mutt -s "Successful Backup: $script - $dateFormat" -a "$log" -c xxx@xxx.com yyy@yyy.com zzz@zzz.com qqq@qqq.com < "/tmp/emailbody-$script.txt"
rm -f "/tmp/emailbody-$script.txt"

Question by:drbill1
18 Comments
 

Author Comment

by:drbill1
ID: 24116695
Here is the daily.sh script that calls the above dirsync.sh script:
#!/bin/bash

# NAS Share Daily Script, v1.5
# /usr/local/scripts/rsyncscripts/daily.sh
# Created By:    
# Last Modified: 20090326 2124

# Declare share names
SHARES=( IT MCI XYZ 123 galadriel gandalf )

# Get count of the shares for the loop
SHARECOUNT=${#SHARES[@]}

# Run each share through the backup script
for (( i=0; i<SHARECOUNT; i++ )); do
    /usr/local/scripts/rsyncscripts/dirsync.sh "${SHARES[$i]}"
done

 
LVL 16

Expert Comment

by:ai_ja_nai
ID: 24121364
I don't understand what your problem is: what do you mean by "the backup feature of rsync"? What should rsync do that it doesn't do right now?
 

Author Comment

by:drbill1
ID: 24122852
Rsync has a --backup option that should allow me to do incremental backups aside from my regular full-run rsync to the other device. Basically, I want rsync to continue syncing the full ReadyNAS device (which I've already scripted) and to start syncing all changed files from the last week to an external USB drive.
 
LVL 16

Expert Comment

by:ai_ja_nai
ID: 24123540
It already does that; you don't need to add anything else.

By the way, instead of --progress -v -r -t, use -avzP:

z compresses the transfer to save bandwidth;
a stands for -rlptgoD (preserves permissions, ownership, timestamps, and the whole hierarchy);
P combines --partial and --progress, so you get resume capability.
 

Author Comment

by:drbill1
ID: 24124597
No, you're missing the point; I know that rsync already transfers only incremental changes between two places. But I'm moving the destination off-site each week and need to capture the week's changes on a USB hard drive. The NAS unit has 4x1.5TB drives, weighs about 35 pounds, and is a pain to move every day. Since we have only about 30GB of weekly changes, I want to do incremental backups to an external USB hard drive (small, portable, easy to move) of everything since the device was moved.

So, on Monday, I want to rsync all of the changes over the last 3 days, Tuesday the last 4 days, etc. On Friday, I want to reset and sync only the last 1 day (and wipe out everything else). I was almost there working with the --backup flag on rsync, but I haven't been able to work it out completely by myself.
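The rolling window described above (Monday = 3 days back, Friday = reset to 1) can be sketched as a small shell helper. This is only a sketch; the `days_since_friday` name and the reset rule are taken from the post, not from the original scripts:

```shell
#!/bin/sh
# How many days back to look so the window always reaches the previous
# Friday: Monday -> 3, Wednesday -> 5, Friday -> reset to 1.
days_since_friday() {
    # $1 is the day of week as printed by `date +%u` (1=Mon .. 7=Sun)
    d=$(( ($1 + 7 - 5) % 7 ))
    [ "$d" -eq 0 ] && d=1   # Friday: reset and take only the last day
    echo "$d"
}

days=$(days_since_friday "$(date +%u)")
echo "Syncing files modified in the last $days day(s)"
# e.g.: find /mount/nas4/IT -type f -mtime -"$days" > /tmp/filelist.txt
```

The `find` line at the end is the hypothetical hook point; the share path would come from daily.sh's loop variable.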

I've changed my flags on that script per your directions, and I'll run it tonight to see how it works. Thanks for the tip!
 
LVL 16

Expert Comment

by:ai_ja_nai
ID: 24127007
I still don't understand the problem.
If you have to choose a different destination (the USB drive) because the main storage is moved off-site, just do it. What does this have to do with rsync's backup flag?
 

Author Comment

by:drbill1
ID: 24128006
I only want to rsync files changed since the previous Friday, not all files from all time. Focus on that and ignore everything else. From what I've read in the rsync man pages, I can do this with the backup flag, but I don't know how. Using the backup flag or some other method I haven't thought of, how can I sync all of the files that have changed since the previous Friday to a directory while maintaining the folder hierarchy?
 
LVL 16

Expert Comment

by:ai_ja_nai
ID: 24128774
The backup flag is just the --backup-dir + --suffix options :) you don't need it.

So, if it's Saturday, you want to transfer only the changes that occurred on Friday, disregarding those from Thursday or Wednesday? Is that what you want?
(The most difficult part of consulting is understanding what the client wants :))
 

Author Comment

by:drbill1
ID: 24130775
Yes, and on Friday I want it to reset and start again.
 
LVL 16

Accepted Solution

by:
ai_ja_nai earned 500 total points
ID: 24131687
It's not a built-in function.

But you can build a list of the files that need to be updated, "filelist.txt", and then invoke rsync with the parameter --files-from=filelist.txt; those files, and only those, will be synced. To build that list of files, use this:

find ~ -type f -mtime -1 > filelist.txt

This builds a file listing everything changed in the last 24 hours.
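A sketch of building that list per share, using a throwaway directory as a stand-in for /mount/nas4 (the IT subfolder and file name are demo assumptions). Running find from inside the share root keeps the paths relative, which matters later when rsync rebuilds the hierarchy under the destination:

```shell
# Throwaway stand-in for /mount/nas4; substitute the real mount in production.
root=$(mktemp -d)
mkdir -p "$root/IT"
echo data > "$root/IT/nvplus.png"

# List files modified within the last 5 days, relative to the share root
( cd "$root" && find . -type f -mtime -5 ) > /tmp/filelist.txt
cat /tmp/filelist.txt
```

The -5 here would come from the day-of-week arithmetic rather than being hard-coded.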
 

Author Comment

by:drbill1
ID: 24142632
How do I specify where to search in that find command?
 
LVL 16

Expert Comment

by:ai_ja_nai
ID: 24146790
In place of the ~ (which means your own home directory).
 

Author Comment

by:drbill1
ID: 24149668
If I pass the --files-from=filelist.txt flag to rsync, will it preserve the path on the destination? How do I wipe it clean after 7 days to start over?
 
LVL 16

Expert Comment

by:ai_ja_nai
ID: 24149873
It should, if you include -a.
To clean the archive every 7 days, run a cron job with the 'rm -rf /path/to/folder' command.
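That weekly cleanup can go straight into cron. A sketch, where the Friday-evening time and the /mount/nas6/daily path are assumptions taken from the question:

```shell
# Add via `crontab -e` on the box that mounts the destination:
#   m   h   dom mon dow   command
#   0   18   *   *   5    rm -rf /mount/nas6/daily/*
# Runs Fridays at 18:00, wiping the daily folder so the cycle restarts
# before the Friday "last 1 day" sync.
```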
 

Author Comment

by:drbill1
ID: 24150255
I'm going to run it now and see how it goes, then. There are over 2.2 million files on the shares on the NAS unit, so it will take a while to run the find command and see what happens. I'll report back if it works.
 

Author Comment

by:drbill1
ID: 24153108
The find portion did not take long at all, but how do I format the rsync command? I tried the below, but it errored. Do I pass the source directory as being the same place I ran my find command against? The filelist.txt shows the full path to the files from root (i.e. "/mount/nas4/MCI/jjohnson/nvplus.png" is one entry). How do I denote the source for rsync?
rsync --stats -avzP --files-from=/tmp/filelist.txt /mount/nas6/daily/

 
LVL 16

Expert Comment

by:ai_ja_nai
ID: 24153459
You have to include at least a source directory before the destination; with --files-from, the listed paths are read relative to that source.

Anyway, to check whether the file hierarchy is preserved, try it with a small subset first. Otherwise, if it doesn't work, you'll have synced 2M files uselessly.
 
LVL 14

Expert Comment

by:small_student
ID: 24217771
Hi DrBill1

Check this solution

On the first day do
 rsync -avP src dst-usb-drive             This is a 0 (Full) Backup , you do it the first time

On the seond day do
on the usb drive
cp -al back0 back-$(date +%d%m%Y)

This will make a hard link dir to your original , no added size
Then
 rsync -avP --delete src dst-usb-drive/back-$(date +%d%m%Y)

this will only copy new files and thus you have a snap shot of each day, the --delete flag removes files that are deleted in the original src dir

Repeat step 2 every day (ofcourse you will implement it in a script)

Another solution is to use tar incremental backup, it takes only incrementals of each day not syncing
0
