Bobby
asked on
need to create script to back up directories
Linux. I need to set up a script that I can run via cron in the middle of every night to copy / back up specific folders into another directory. I want all folders, files and permissions and attributes copied. The goal is to have a 24 hour old copy that I can restore in case a user screws up their files. Example:
/usr/home/thisuser/web_master would be copied / backed up to /usr/home/mainuser
/usr/home/thisotheruser/web_master would be copied / backed up to /usr/home/mainuser
/usr/home/andanotheruser/web_master would be copied / backed up to /usr/home/mainuser
Each night, the copy / back up would overwrite the one from 24 hours ago.
How do I set this script up, and what type of file extension should the script be?
ASKER
Thanks. Why tar?
Tar preserves owners and permission masks in the easiest way.
If you also use ACLs, you may actually need star.
Just copying files also won't propagate deletions, which you would need as well.
rsync could be another tool to use (tar is almost always installed; rsync may not be).
Also, you didn't specify separate directories where each user's data could be placed, so using an archive file circumvents that.
(Mingling the data of various users seems counterproductive if files overwrite each other.)
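For illustration, a per-user archive along these lines keeps ownership and permissions and can be restored later. The paths here are hypothetical temporary directories standing in for /usr/home/&lt;user&gt;/web_master and /usr/home/mainuser, so the sketch is runnable anywhere:

```shell
#!/bin/sh
# Illustrative sketch only: archive one user's web_master tree (tar
# stores owner/mode by default), then restore it with -p to keep
# permissions. Directory names are hypothetical stand-ins.
set -e
home=$(mktemp -d)      # stand-in for /usr/home/thisuser
backup=$(mktemp -d)    # stand-in for /usr/home/mainuser
mkdir "$home/web_master"
echo '<h1>page</h1>' > "$home/web_master/index.html"
chmod 640 "$home/web_master/index.html"
tar -cpf "$backup/thisuser.tar" -C "$home" web_master   # create archive
tar -xpf "$backup/thisuser.tar" -C "$backup"            # restore a copy
ls -l "$backup/web_master/index.html"
rm -rf "$home" "$backup"
```

Recreating the archive each night gives the "overwrite the 24-hour-old copy" behaviour for free, since tar -cf truncates the target file.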
ASKER
While I was waiting, I put this in a .sh file and ran it; it worked (except it skipped copying symlinks... can that be changed?). Would this be good enough? And I assume it will just overwrite each folder each day? That's what I want.
rsync -r /usr/www/users/userone/eimg/ /usr/home/mainuser/backups/eimg/
rsync -r /usr/www/users/mainuser/beta.me.dev.mysite.com/ /usr/home/mainuser/backups/beta.me.dev.mysite.com/
No, you need rsync -a for that (archive mode).
or: rsync -rlptgoD
which is the same...
See man rsync for details on the flags.
https://linux.die.net/man/1/rsync
ASKER
Ok, can you please tell me how to write these two rsync commands so that they will copy everything in each folder, including symlinks, and then overwrite the previous copy at the destination. Thanks much.
rsync -r /usr/www/users/userone/eimg/ /usr/home/mainuser/backups/eimg/
rsync -r /usr/www/users/mainuser/beta.me.dev.mysite.com/ /usr/home/mainuser/backups/beta.me.dev.mysite.com/
ASKER
or is it just a simple matter of changing the -r to -a?
ASKER CERTIFIED SOLUTION
ASKER
The cron job executed the script last night, I had just the -a, not -av, and everything looks intact. What's the difference between -a and -av? Verbose? Where does that info go?
ASKER
If I add the v after the -a will I be sent an email with the info? The server does send me an email if the backup fails, I tested that by renaming one of the folders the script backs up.
Based on what you've asked for, noci's suggestion is best.
Either rsync -a or -av (-v just means verbose [read the man page] which produces a list of file changes) for super fast cloning of files between directories.
If you do something like this...
rsync -a source-dir target-dir 2>&1 | your-mail-sending-voodoo-here
You'll get a list of problems, add -v to get an entire list of files changed.
Tip: If you're doing this to attempt reaching a consistent file state in your target-dir(s) to then do backups, use tar + zstd for least resource usage + maximum file compression.
Tip: Also, you can go one step further + do what I do for my backups. I do an rsync where my target-dir is on another machine. Then after all machines (many) finish their rsync to the backup server, I create full backups (1x/week) or incremental backups (6x/week) on the backup machine. This way the tar + zstd has no effect on runtime performance of production sites.
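A minimal sketch of that tar + zstd step, under stated assumptions: GNU tar 1.31+ with built-in --zstd support, and hypothetical temporary paths standing in for the real directories on the backup machine:

```shell
#!/bin/sh
# Sketch: full backup of the rsync'd tree with zstd compression.
# Assumes GNU tar >= 1.31 (--zstd); paths are hypothetical stand-ins.
set -e
synced=$(mktemp -d)    # stand-in for the rsync'd tree on the backup box
store=$(mktemp -d)     # stand-in for where full backups are kept
echo data > "$synced/site.conf"
# -c create, -p preserve permissions, --zstd compress with zstd
tar --zstd -cpf "$store/full-$(date +%F).tar.zst" -C "$synced" .
ls "$store"
rm -rf "$synced" "$store"
```

On older tar versions without --zstd, the equivalent is piping: tar -cpf - -C "$synced" . | zstd -o archive.tar.zst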
ASKER
Thank you both very much.
Let's put the list of usernames in /usr/local/etc/webmasters_
Let's call the script: /usr/local/bin/backup_webm
chmod 755 /usr/local/bin/backup_webm
And add a cron entry for root:
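For illustration only (the script name above is truncated, so the full name backup_webmasters.sh and the 02:30 run time here are hypothetical), a root crontab entry could look like this; cron mails any output the job produces to MAILTO, which is also where rsync -v or tar -v listings would end up:

```shell
# Hypothetical root crontab entry (edit with: crontab -e, as root).
# Runs the backup script every night at 02:30; cron mails any
# stdout/stderr from the job to the MAILTO address.
MAILTO=root
30 2 * * * /usr/local/bin/backup_webmasters.sh
```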
To restore execute:
tar xpvf /usr/home/mainuser/<someus