Solved

BackupExec - Should I back up DFS replicated folders, or only the original folders?

Posted on 2008-06-11
Last Modified: 2013-12-01
We synchronize two file servers across our MPLS using DFS.  I noticed today in BackupExec that along with the original "DATA\sub-folders" on the primary server, we are also backing up the same sub-folders and files as they appear under another tree on that same server that begins with
"Shadow Copy Components\User Data\Distributed File System Replication\DfsrReplicatedFolders\sub-folders".

Do I need to back up both folder structures?  Instinct tells me that we should back up one or the other, but I am unclear on the ramifications of ignoring one over the other.

Should I continue to back up both sets of folders?  And if not, which should be backed up?
Question by:cbns
6 Comments
 

Accepted Solution

by:John Gates (earned 500 total points)
ID: 21792702
What time is your backup running?  Is it possible files are getting written to any of the DFS partners at that time?  If not, there is no reason to back up anything but the main share, as it is all replicated anyway.
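
For example, a quick sweep like the one below will show whether anything is actually being written during the window.  This is only a sketch - it assumes Python is installed on the file server and that the share root is D:\DATA; the path and window times are placeholders, not values from this thread:

# check_window_writes.py - rough sketch: list files under the share whose last
# modified time-of-day falls inside the nightly backup window.
# D:\DATA and the 23:00-02:00 window are assumptions; adjust for your environment.
import os
import datetime

SHARE_ROOT = r"D:\DATA"               # assumed share root
WINDOW_START = datetime.time(23, 0)   # assumed start of backup window
WINDOW_END = datetime.time(2, 0)      # assumed end of window (wraps past midnight)

def in_window(t):
    # the window crosses midnight, so "inside" means after the start OR before the end
    return t >= WINDOW_START or t <= WINDOW_END

for dirpath, dirnames, filenames in os.walk(SHARE_ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            mtime = datetime.datetime.fromtimestamp(os.path.getmtime(path))
        except OSError:
            continue  # skip files we cannot stat
        if in_window(mtime.time()):
            print(mtime.isoformat(), path)

If that comes back empty night after night, backing up just the main share during that window should be safe.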


-D-
 

Assisted Solution

by:John Gates
ID: 21792712
Another approach, which I use in practice, is to have Volume Shadow Copy running on the volumes hosting the DFS shares; that way I have point-in-time copies of the different shares.  That VSS information is then backed up.
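
If you want to confirm that shadow copies actually exist for the volume hosting the DFS share before the job kicks off, something like this sketch will list them.  It assumes the data lives on D: and that Python is available; vssadmin is the built-in Windows tool and needs an elevated prompt:

# list_shadows.py - sketch: ask VSS which shadow copies exist for the volume
# hosting the DFS share.  "D:\" is an assumed placeholder volume; run elevated.
import subprocess

VOLUME = "D:\\"   # assumed volume hosting the replicated data

result = subprocess.run(
    ["vssadmin", "list", "shadows", "/for=" + VOLUME],
    capture_output=True, text=True
)
print(result.stdout)
if result.returncode != 0:
    print("vssadmin reported an error:", result.stderr)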

-D-
 

Author Comment

by:cbns
ID: 21794288
Thank you so much for getting back to me.  I think you've almost got me on track, but I need just a bit more clarification.  I realize I'm being a bit dense here; my apologies.  My confusion stems, I think, from the fact that I thought only metadata was stored in the primary server's DFS system, so I was surprised to see actual files saved under the "shadow copy components\..\..\..\DfsrReplicatedFolders" folder of that same server.

In my case, users would only rarely be writing files during my backup window, so I can follow your suggestion of backing up only the main share.  Please bear with me, but when you reference the main share, do you mean the actual folders that existed on my primary file server prior to setting up DFS, or do you mean the folders that appear under the Shadow Copy Components folder on that same server?  And if you mean the "original" folders, do I take from this that there are no other control files under "shadow copy components" that I must back up?

Also, should I take from your second post that even though Backup Exec is making use of Shadow Copy, "Volume Shadow Copy" must be implemented as a separate service?  I have been thinking of implementing it but have not brought myself up to speed on using it yet.

Thanks again for your response and your patience. Mike

Author Closing Comment

by:cbns
ID: 31466225
Thanks again.  You've motivated me to implement VSS as well.
 

Author Comment

by:cbns
ID: 21815310
Please note that my intent was to give all points originally offered.  If I left any out of the total I just distributed, please award the additional points to dimante.
 

Expert Comment

by:Wonko_the_Sane
ID: 33468092
I realize this is really old, but I think it needs clarification since this is an important topic.

If you have folders replicated using DFSR, then you MUST include the folders under Shadow Copy Components or the replicated data will not get backed up by Backup Exec!  This is especially dangerous because the job will complete without errors if you do not include them, yet the data will not be in your backup.  The only indicator would be that the amount of data backed up is too low, and if you don't check that manually it is very easy to miss.

Even if the data is replicated somewhere else, you may want it in your local backup for faster restores, and in any case you will have to include it in at least one backup.
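
One way to catch that silently missing data, beyond eyeballing the job size, is to total what is on disk under the replicated folder and compare it to the byte count the Backup Exec job summary reports.  A rough sketch, assuming Python on the server; the path and the reported figure below are placeholders you would fill in from your own job log:

# verify_backup_size.py - sketch: sum the bytes under a DFSR replicated folder
# so it can be sanity-checked against the byte count Backup Exec reports.
# Both REPLICATED_ROOT and REPORTED_BYTES are assumed placeholder values.
import os

REPLICATED_ROOT = r"D:\DATA"        # assumed replicated folder root
REPORTED_BYTES = 52_000_000_000     # assumed figure taken from the job summary

total = 0
for dirpath, dirnames, filenames in os.walk(REPLICATED_ROOT):
    for name in filenames:
        try:
            total += os.path.getsize(os.path.join(dirpath, name))
        except OSError:
            pass  # skip files that vanish or deny access mid-scan

print("On disk : {:,} bytes".format(total))
print("Reported: {:,} bytes".format(REPORTED_BYTES))
if total > REPORTED_BYTES * 1.1:
    print("WARNING: the backup is reporting far less data than is on disk.")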

