syscrash
asked on
Backup script using tar/rsync or something
I have been attempting to create a backup script; here's what I would like it to do:
Full weekly backup every Sunday.
Incrementals Mon-Fri
Name each backup username-date.tar.gz (or something similar)
Remove anything older than 7 days
Here's our setup...
We have user home directories stored in /Volumes/Data/homedir; all user home directories are separated by class year, i.e. 2013, 2012, etc.
I have been messing around with scripting for the last couple of days; here's what I have so far (be easy on me, I am just learning...):
#!/bin/bash

# Class-year directories to back up
freshmen="/Volumes/Data/homedir/2013/*"
sophomore="/Volumes/Data/homedir/2012/*"

date=$(date +%Y%m%d)
dow=$(/bin/date +%a)
dom=$(/bin/date +%d)

sophfulltarget="/Volumes/Backups/weekly/2012/"

# sophomore is left unquoted so the glob expands to each home directory
for u in ${sophomore}; do
    if [ "${dow}" = "Thu" ]; then
        # czf gzips the archive, so name it .tar.gz rather than .tar
        tar czf "${u}-${date}.tar.gz" "${u}"
        mv "${u}-${date}.tar.gz" "${sophfulltarget}"
    fi
done
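The wish list above (full backup on Sunday, per-user naming, seven-day retention) can be sketched as a single function. This is only a sketch, not a tested solution: the function name and argument layout are made up for illustration, and the paths are the ones from the question.

```shell
#!/bin/bash
# Sketch: full backup of every user home under $1 into $2 on Sundays,
# named user-YYYYMMDD.tar.gz, pruning archives older than 7 days.
# The third argument overrides the day of week, which makes this testable.
weekly_backup() {
    local src="$1" dest="$2"
    local dow="${3:-$(date +%a)}"
    local stamp u user
    stamp=$(date +%Y%m%d)

    mkdir -p "$dest"

    if [ "$dow" = "Sun" ]; then
        # Full backup: one archive per user home directory
        for u in "$src"/*; do
            [ -d "$u" ] || continue
            user=$(basename "$u")
            # -C keeps paths inside the archive relative to the class-year dir
            tar czf "$dest/$user-$stamp.tar.gz" -C "$src" "$user"
        done
    fi

    # Retention: remove anything older than 7 days
    find "$dest" -name '*.tar.gz' -mtime +7 -delete
}
```

Usage would be something like weekly_backup /Volumes/Data/homedir/2012 /Volumes/Backups/weekly/2012 — one call per class year. The Mon-Fri incrementals are the hard part with plain tar; see the manifest idea further down the thread.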
ASKER
Yeah, I have looked at RIBS... it's not really what I want.
My point wasn't so much to say "you should use RIBS" as to provide a script that already has code snippets and structure you can pick and choose from when designing your own. RIBS has built in most of the functionality you want, along with a lot of other stuff you don't, so take the individual functions out of it and paste them into a new script that accomplishes what you're really looking for.
ASKER
Just going to use Carbon Copy Cloner for our Macs.
Hi, I would just use rsnapshot, which is a wrapper around rsync and already has daily and weekly incrementals ready to use.
If you need config examples, let me know.
http://rsnapshot.org/
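For reference, a minimal rsnapshot setup for the layout described in the question might look like this. The paths and retention counts are assumptions mapped from the original requirements, not a tested config; note that rsnapshot.conf requires tabs, not spaces, between fields.

```
# /etc/rsnapshot.conf (fragment, tab-separated)
snapshot_root	/Volumes/Backups/snapshots/

# keep six dailies (Mon-Sat) and four weeklies
retain	daily	6
retain	weekly	4

# back up all class-year home directories
backup	/Volumes/Data/homedir/	homedir/
```

Then drive it from cron, e.g. "rsnapshot daily" Monday through Saturday and "rsnapshot weekly" on Sundays. rsnapshot hardlinks unchanged files between snapshots, so each snapshot looks like a full backup but only changed files consume new space.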
ASKER CERTIFIED SOLUTION
If you are using tar for backups, incremental backups will be a problem unless you keep a file recording the data you have already backed up. I don't have a ready solution for you, but you can create this file using the find command and md5sum, something like: find / -type f -exec md5sum \{\} \; > backedupfiles. You can parse this file any number of times and filter out the files you don't want to back up. When you want to create an incremental backup, run the find command again and back up only the difference. When done, overwrite backedupfiles with the new file. This way you never need to read back your tarred and gzipped backup files.
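The manifest idea above can be sketched as follows. The function name and file names are made up for illustration, and note that on a stock Mac the checksum tool is md5 rather than md5sum:

```shell
#!/bin/bash
# Sketch: keep an md5sum manifest of everything backed up; on the next
# run, diff a fresh scan against it to find the files for the incremental.
incremental_list() {
    local root="$1" manifest="$2" changed="$3"
    local fresh
    fresh=$(mktemp)

    # Checksum every file; sort so comm(1) can compare the two runs.
    find "$root" -type f -exec md5sum {} \; | sort > "$fresh"

    if [ -f "$manifest" ]; then
        # Lines present only in the fresh run are files that are new or
        # whose contents changed -- the candidates for the incremental.
        comm -13 "$manifest" "$fresh" | sed 's/^[a-f0-9]*  //' > "$changed"
    else
        # First run: everything goes into the (full) backup list.
        sed 's/^[a-f0-9]*  //' "$fresh" > "$changed"
    fi

    # Overwrite the manifest with the fresh scan, as suggested above.
    mv "$fresh" "$manifest"
}
```

The resulting list could then be fed to GNU tar with -T to build the incremental archive, e.g. tar czf incremental.tar.gz -T changedfiles.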
ASKER
I decided to use BackupPC; this suggestion came after we had already been using it for a month.
Rather than reinvent the wheel, I would suggest taking a look at something that's already been done and tweaking it to suit your needs. Check this out:
http://www.rustyparts.com/ribs.php
This uses rsync to compare the backups against the live file store and backs up only the changes, while keeping the history backups intact at the same time. It also has some other nifty features, such as emailing you a daily backup report. The only downside is that the backups won't be compressed, but the disk space difference shouldn't be back-breaking, and the flexibility you gain might be more than worth the trade-off.
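The "history intact without duplicating disk" trick that RIBS-style tools rely on is rsync's --link-dest: unchanged files in the new snapshot become hardlinks into the previous one, so every snapshot looks complete but only changed files consume new space. A minimal sketch, with an invented function name and layout (pass absolute paths, since --link-dest resolves relative paths against the destination):

```shell
#!/bin/bash
# Sketch: timestamped snapshot directories under $2, each hardlinked
# against the previous snapshot so unchanged files share storage.
take_snapshot() {
    local src="$1" base="$2"
    local stamp prev
    stamp=$(date +%Y%m%d%H%M%S)
    # Most recent existing snapshot, if any (lexical sort = chronological)
    prev=$(ls -1d "$base"/snap-* 2>/dev/null | tail -n 1)

    if [ -n "$prev" ]; then
        rsync -a --link-dest="$prev" "$src"/ "$base/snap-$stamp"/
    else
        rsync -a "$src"/ "$base/snap-$stamp"/
    fi
}
```

Deleting an old snapshot directory is then safe: files still referenced by newer snapshots survive because the hardlink keeps them alive.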