Bobby (United States of America) asked:
Need to create a script to back up directories

Linux. I need to set up a script that I can run via cron in the middle of every night to copy specific folders into another directory as a backup. I want all folders, files, permissions, and attributes copied. The goal is to have a 24-hour-old copy that I can restore in case a user screws up their files. Example:

/usr/home/thisuser/web_master would be copied / backed up to /usr/home/mainuser
/usr/home/thisotheruser/web_master would be copied / backed up to /usr/home/mainuser
/usr/home/andanotheruser/web_master would be copied / backed up to /usr/home/mainuser

Each night, the backup would overwrite the one from 24 hours ago.

How do I set this script up, and what file extension should the script have?
noci:

The easiest approach is to use tar to create a backup set.

Let's put the list of usernames in /usr/local/etc/webmasters_to_backup:
thisuser
thisotheruser
andanotheruser


Let's call the script /usr/local/bin/backup_webmaster:

#!/bin/bash

# Back up each listed user's web_master directory into a per-user
# tarball under /usr/home/mainuser; each run overwrites last night's copy.
for u in $( cat /usr/local/etc/webmasters_to_backup )
do
    tar czf /usr/home/mainuser/${u}.tgz -C /usr/home/${u}/web_master .
done


Make it executable:

chmod 755 /usr/local/bin/backup_webmaster

And add a cron entry for root:
0 0 * * * /usr/local/bin/backup_webmaster



To restore a single file, execute:

tar xpvf /usr/home/mainuser/<someuser>.tgz -C /usr/home/<someuser>/web_master the/file/to-restore
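
To restore the whole directory instead, the same command without a file argument should work (a sketch; GNU tar auto-detects the gzip compression on extraction):

tar xpf /usr/home/mainuser/<someuser>.tgz -C /usr/home/<someuser>/web_master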
Bobby (Asker):

Thanks. Why tar?
noci:

Tar will preserve owners and permission masks in the easiest way. If you also use ACLs, you may actually need star.

Just copying the files won't capture the deletions that would be needed as well. rsync could be another tool to use (tar is almost always installed; rsync maybe not). Also, you didn't specify separate destination directories for each user's data, so using one archive file per user circumvents that. (Mingling the data of various users seems counterproductive if files overwrite each other.)
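
For comparison, a minimal rsync-based sketch of the same job (the per-user target directories are my assumption here, to avoid the file-mingling problem just mentioned; --delete mirrors deletions from the source):

#!/bin/bash

# Mirror each listed user's web_master into its own backup directory.
# --delete removes destination files that no longer exist at the source.
for u in $( cat /usr/local/etc/webmasters_to_backup )
do
    rsync -a --delete /usr/home/${u}/web_master/ /usr/home/mainuser/${u}/
done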
Bobby (Asker):

While I was waiting, I did this in a .sh file and ran it, and it worked (except it skipped copying symlinks... can that be changed?). Would this be good enough? And I assume it will just overwrite each folder each day? That's what I want.

rsync -r /usr/www/users/userone/eimg/ /usr/home/mainuser/backups/eimg/
rsync -r /usr/www/users/mainuser/beta.me.dev.mysite.com/ /usr/home/mainuser/backups/beta.me.dev.mysite.com/


noci:

No, you need rsync -a for that (archive mode), or rsync -rlptgoD, which is the same thing. See man rsync for details on the flags:

https://linux.die.net/man/1/rsync
Bobby (Asker):

OK, can you please tell me how to write these two rsync commands so that they copy everything in each folder, including symlinks, and then overwrite the previous copy at the destination? Thanks much.
rsync -r /usr/www/users/userone/eimg/ /usr/home/mainuser/backups/eimg/
rsync -r /usr/www/users/mainuser/beta.me.dev.mysite.com/ /usr/home/mainuser/backups/beta.me.dev.mysite.com/


Bobby (Asker):

Or is it just a simple matter of changing the -r to -a?
ASKER CERTIFIED SOLUTION from noci (members-only content, not shown).
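
Judging from the follow-up below, the accepted fix amounted to switching -r to -a; a sketch of the two commands in that form (adding --delete is my assumption, following noci's earlier point that deletions need to be carried over too):

rsync -a --delete /usr/www/users/userone/eimg/ /usr/home/mainuser/backups/eimg/
rsync -a --delete /usr/www/users/mainuser/beta.me.dev.mysite.com/ /usr/home/mainuser/backups/beta.me.dev.mysite.com/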
Bobby (Asker):

The cron job executed the script last night. I had just -a, not -av, and everything looks intact. What's the difference between -a and -av? Verbose? Where does that info go?
Bobby (Asker):

If I add the v after the -a, will I be sent an email with the info? The server does send me an email if the backup fails; I tested that by renaming one of the folders the script backs up.
Based on what you've asked for, noci's suggestion is best.

Either rsync -a or -av works; -v just means verbose (read the man page), which produces a list of file changes. Both give super-fast cloning of files between directories.

If you do something like this...

rsync -a source-dir target-dir 2>&1 | your-mail-sending-voodoo-here



You'll get a list of problems; add -v to also get the full list of files changed.
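
Since the job runs from cron, another option is to let cron mail the output itself: anything the job writes to stdout/stderr is emailed to the crontab's MAILTO address. A sketch, reusing noci's script path from earlier (the address is a placeholder):

MAILTO=you@example.com
0 0 * * * /usr/local/bin/backup_webmaster

With -av, the verbose file list is what lands in that mail.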

Tip: If you're doing this to reach a consistent file state in your target dir(s) and then take real backups from there, use tar + zstd for the lowest resource usage and best compression.
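
For example (a sketch, assuming GNU tar built with zstd support; on older versions, -I zstd works in place of --zstd):

tar --zstd -cf /usr/home/mainuser/backups/eimg.tar.zst -C /usr/home/mainuser/backups/eimg .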

Tip: You can also go one step further and do what I do for my backups: rsync with the target dir on another machine. Then, after all machines (many) finish their rsync to the backup server, I create full backups (1x/week) or incremental backups (6x/week) on the backup machine. This way the tar + zstd work has no effect on the runtime performance of the production sites.
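
A sketch of that pattern (the hostname and paths are placeholders; GNU tar's --listed-incremental option handles the incremental bookkeeping):

# On each production machine: mirror the data to the backup server over ssh.
rsync -a --delete /usr/home/mainuser/backups/ backupserver:/backups/thismachine/

# On the backup server: incremental relative to the snapshot file;
# delete /backups/thismachine.snar once a week to force a new full backup.
tar --zstd --listed-incremental=/backups/thismachine.snar \
    -cf /backups/thismachine-$(date +%F).tar.zst -C /backups/thismachine .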
Bobby (Asker):

Thank you both very much.