• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 905

How do I run Robocopy with a Shadow Copy script?

I have multiple locations connected by VPN. I want to back up all my files from one location to the other every night. I currently run Backup Exec at my main location and have tried backing up to the remote location that way, but Backup Exec failed when backing up using that method (both backups, local and remote). Backup Exec has another downside: because I'm constrained by VPN speed, I don't want to re-copy files that were already backed up. A differential backup would just grow over time and kill the link until I ran another full backup. My other option would be incremental backups, but that also means running another full backup every now and then, plus the headache of restoring the whole chain of backups.

I therefore thought Robocopy would be a great tool, but I have three problems with it: 1) open files, 2) logs, and 3) I need the logs, whether failed or successful, to be emailed in simple language.

I have read that Robocopy can be run in conjunction with shadow copies. Question: could I have only the files that failed copied via the shadow copy, or, since the shadow copy presents the files as new, would that mean all files get copied again?
1 Solution
I don't know about Robocopy with shadow copies, but I will tell you what I do and maybe it will work for you too.

I use DFS-R, available in Windows 2003 R2 and above, to replicate the files from all of my remote servers to my central file server. I then just back up the central server. I have 11 servers replicating hundreds of GB over a T1 connection. DFS-R is very bandwidth efficient: it only copies over changed bytes, which is as efficient as it gets. This only works for file servers, but it works very well.
For a simple backup/replication solution, Robocopy is definitely the way to go. Take a look at this post of mine. You can certainly use Robocopy in the scenario you mentioned.

If you add the switch /b, it runs in backup mode.

If you add the switch /log:PathToLog.log, it will create a log for you.

The issue now is that you want a daily log with a unique name. Unfortunately, Robocopy will overwrite or append to the same file name. Not good.

DFS functionality is great, but this may solve your issues:

My fix for this is a free app called namedate.exe. With namedate, you can append the current date to your Robocopy log file name.

I set this all up in a batch script.
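A minimal sketch of such a batch file. The paths and share names below are placeholders, and since namedate.exe syntax varies by version, this version builds the date stamp with the built-in %DATE% variable instead:

@echo off
rem Build a YYYY-MM-DD stamp from %DATE% (locale-dependent; adjust the
rem substring offsets to match your system's short-date format)
set STAMP=%DATE:~-4%-%DATE:~4,2%-%DATE:~7,2%
rem Backup-mode copy with a uniquely named daily log
robocopy D:\Data \\remoteserver\Backup *.* /b /e /z /r:0 /w:0 /np /log:C:\Logs\robolog-%STAMP%.log
rem A command-line mailer (e.g. the free blat utility) could then send the log (step not shown)

Schedule the batch file with Task Scheduler to run nightly, and each run will leave its own dated log behind.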
You need to use an rsync-style method for backup.

Try this:

This post describes using VSS as you mentioned: http://blogs.msdn.com/b/adioltean/archive/2005/01/20/357836.aspx. The VSS snapshot can be mounted as a drive letter or within an empty folder. It uses vshadow.exe, which, from my research, is part of the VSS SDK. As for Robocopy, it will only copy the files that have changed. Plus it has lots of options, including /IPG for fine-tuning its performance over a WAN.
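A rough sketch of the snapshot-then-copy sequence from that post. It assumes the vshadow.exe build from the VSS SDK is on the PATH (verify the -p/-script/-el/-ds switches against your version's vshadow /? output); all paths are placeholders:

@echo off
rem Create a persistent shadow copy of C: and write its ID into setvar.cmd
vshadow.exe -p -script=setvar.cmd C:
rem setvar.cmd defines SHADOW_ID_1 for us
call setvar.cmd
rem Expose the snapshot as drive X: so Robocopy can read otherwise-open files
vshadow.exe -el=%SHADOW_ID_1%,X:
robocopy X:\Data \\remoteserver\Backup *.* /e /copyall /z /r:0 /w:0 /np /log:C:\Logs\robolog.log
rem Delete the shadow copy when done
vshadow.exe -ds=%SHADOW_ID_1%

Because Robocopy still compares timestamps and sizes against the destination, copying from a fresh shadow copy does not force a full re-copy; only changed files transfer.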

I'd recommend the following command if you want to mirror the content. Note: this will delete files from the destination that have been deleted at the source.

robocopy <source> <destination> *.* /mir /copyall /z /r:0 /w:0 /np /log:<log path>

I'd recommend the following command if you don't want files removed from the destination.

robocopy <source> <destination> *.* /e /copyall /z /r:0 /w:0 /np /log:<log path>
