Robocopy backup with retention options

Hello, I want to use robocopy to back up a Windows 7 C: drive folder to a server share.

I want to avoid having only a single copy of the backup, in case the source gets deleted and an empty folder is backed up over it. Therefore I need some kind of retention so I can go back in time. Can robocopy do this?

Any other ideas for this scenario? The solution will be deployed to 50 Windows 7 machines, so it needs to be scalable and remotely managed.
Asked by Pete

NVIT (End-user support) commented:
Just don't use the /mir or /purge options.
To avoid having only a single copy, you need to use a different destination each time, e.g.

robocopy /e /r:0 /w:0 /dcopy:t "C:\source" "C:\target1"


robocopy /e /r:0 /w:0 /dcopy:t "C:\source" "C:\target2"


Simon Smith commented:
As an additional precaution for getting good historical backups of those files with robocopy, enable Previous Versions (shadow copies) on the server and allocate it plenty of disk space.

When I want a similar solution, I typically preface the robocopy command with a command or two to manage the multiple folders. For example, you can extend the following 3-day script to a 7-, 10-, or 14-day script by adding the appropriate number of REN commands and updating the RD (remove directory) command.

RD /S/Q C:\target3
REN C:\target2 target3
REN C:\target1 target2
robocopy /e /r:0 /w:0 /dcopy:t "C:\source" "C:\target1"
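Extended to a full week, the same rotation might look like the sketch below (the target folder names are illustrative; adjust paths to your environment):

REM Rotate seven daily backup folders: target7 is the oldest and is discarded.
RD /S /Q C:\target7
REN C:\target6 target7
REN C:\target5 target6
REN C:\target4 target5
REN C:\target3 target4
REN C:\target2 target3
REN C:\target1 target2
REM Fresh copy into target1: /e copies subfolders including empty ones,
REM /r:0 /w:0 skip locked files immediately, /dcopy:t keeps folder timestamps.
robocopy /e /r:0 /w:0 /dcopy:t "C:\source" "C:\target1"

Run this once a day from Task Scheduler and you always have the last seven days available, with the newest copy in target1.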


I do a variation on that to collect backups of normal.dotm (the Microsoft Word template) from dozens of users' computers.
Simon Smith commented:
I assume you want to use the /MIR switch so that files legitimately deleted from the source are also deleted from the backup, but you don't want the entire backup cleared if the robocopy command somehow reads an incorrect folder that replaced the one you are interested in. I've tried doing the same thing, with very mixed results, using a conditional script to test for the existence and contents of the source folder. In the end there was always an unexpected situation that stopped the backup from working when it was needed most. The rotating folder structure was the only method I had any long-term success with. The downside is the potential to use up a lot of disk space, especially if a user drops in an iTunes playlist you weren't expecting.

The conditional command I used (not recommended by the way) would have been something close to this:

IF EXIST C:\Source\*.*  robocopy /mir /r:0 /w:0 /dcopy:t "C:\source" "C:\target1"

serialband commented:
robocopy is not the ideal solution for a Time Machine-style deduplicated backup. You end up with multiple full copies, which wastes robocopy's advantage of copying only changes, unless you rotate the backup through the multiple copies. I wouldn't rename each copy and start over; that just wastes time and disk space. Instead, you might use 7 different targets, one for each day of the week, and schedule a task for each of the 7 copies on its own day. You still have to do 7 time-consuming full copies at the beginning, but subsequent copies after the first 7 days will be much quicker.
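Setting up one weekly task per day of the week could be sketched like this (the share path and task names are assumptions, not part of the original answer; run from an elevated batch file):

REM Create one weekly scheduled task per weekday; each day copies to its own
REM target folder, so the newest backup never overwrites the older six.
FOR %%D IN (MON TUE WED THU FRI SAT SUN) DO (
    schtasks /Create /TN "Backup-%%D" /SC WEEKLY /D %%D /ST 22:00 ^
        /TR "robocopy \"C:\source\" \"\\server\backups\%%D\" /e /r:0 /w:0 /dcopy:t"
)

Because each day writes to its own folder, after the first week every run only transfers the files that changed since that folder was last written, seven days earlier.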

You should try HardLinkBackup for easier deduplication: https://www.lupinho.net/hardlinkbackup/  The free version allows a single backup target and you have to start the backup manually. The paid version lets you schedule your backups, and you can keep numerous backups on a single target. The enterprise version allows 2 simultaneous backup targets, if you want to back up to 2 separate backup media at the same time. You can run it from your server to pull the remote files from the workstations.

NTFS has long supported hard links, which you can create with the mklink command. You can make 1024 hard links per file, good for 1024 days of backups if done daily, or 512 days if done twice a day. HardLinkBackup uses those built-in hard links to deduplicate files, saving disk space and a lot of time on all subsequent backups. This is easier than installing and running the rsnapshot port from the Linux world, but it works similarly, and it is less error-prone than scripting robocopy to do the same thing.
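The hard-link idea can be demonstrated directly from a command prompt (the paths and file names here are illustrative only):

MD C:\backups\day1 C:\backups\day2
echo backup test > C:\backups\day1\report.txt
REM Create a second directory entry pointing at the same data on disk.
mklink /H C:\backups\day2\report.txt C:\backups\day1\report.txt
REM Both names reference one copy of the data; fsutil lists all the links.
fsutil hardlink list C:\backups\day1\report.txt

A deduplicating backup tool applies the same trick at scale: unchanged files in today's backup folder are hard links to yesterday's copies, so only changed files consume new space.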

If you want something even easier, you could just use Mozy or Box; however, those cost more. Since you want to use robocopy, I suspect you don't want to pay too much, which is why I suggested HardLinkBackup. It's not too expensive at 39 euros.