Solved

Backup DFSR with Backup Exec 2012 on Windows 2008 R2 Cluster

Posted on 2014-02-13
6
2,475 Views
Last Modified: 2014-03-17
Hi,
I'm looking for the best way to back up our DFS-replicated folders with Backup Exec 2012.

Our file and DFS services run on a Windows Server 2008 R2 failover cluster with two nodes.

From the Symantec forums I found out that I need to back up both the folders and the Shadow Copy Components; otherwise the restore will not work properly (http://www.symantec.com/business/support/index?page=content&id=TECH85354).

I noticed that when I run a backup of the file cluster, the DFSR Shadow Copy Components aren't there. I find them only on the physical nodes, which means I would have to set up two backup jobs to cover these folders.

In this scenario, do both nodes' Shadow Copy Components need to be backed up, or just the active node's?

In any case, is a properly automated backup even possible here, or is there a better way of doing it?
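A quick way to see which node currently exposes the DFSR data to backup software is to check where the DFS Replication VSS writer is registered. A sketch for an elevated command prompt, to be run on each node:

```shell
rem Run in an elevated command prompt on each cluster node.
rem DFSR data appears under Shadow Copy Components only on nodes where
rem the DFS Replication VSS writer is registered and in a stable state.
vssadmin list writers | findstr /i /c:"DFS Replication"
```

If the writer shows up on only the node that currently owns the file server resource, a backup job targeting the other node's Shadow Copy Components would find nothing there, which would point toward backing up the active node (or both, to survive failover).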
0
Comment
Question by:Alumicor
6 Comments
 
LVL 20

Expert Comment

by:compdigit44
ID: 39859905
Are you using a domain or stand alone DFS setup?
0
 

Author Comment

by:Alumicor
ID: 39859917
Stand-alone, but in a domain environment.
0
 
LVL 27

Accepted Solution

by:
Steve earned 400 total points
ID: 39923084
DFS is difficult to back up.
The official way is to back up each DFS server together with its System Volume Information folder.
That means a restore has to bring back all DFS servers at the same time in order to keep the replica set consistent.

In practice that's a pain and may not be doable.

In the past I have generally backed up one or more of the DFS servers. In the event of a problem: delete the replication group, restore the files, and set replication up again.
This takes time and can cause heavy replication traffic, but it is workable.
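The delete/restore/recreate cycle above can be scripted on Server 2008 R2 with dfsradmin.exe (part of the DFS Management tools). This is only a sketch: the group, folder, server, and path names are made-up examples, and the exact flags should be checked against `dfsradmin <subcommand> /?` before use:

```shell
rem 1. Tear down the replication group before restoring
rem    (example names throughout -- adjust to your environment):
dfsradmin rg delete /rgname:"Data-RG"

rem 2. Restore the folder contents from Backup Exec onto one server.

rem 3. Recreate the group, members, folder, and connections:
dfsradmin rg new /rgname:"Data-RG"
dfsradmin mem new /rgname:"Data-RG" /memname:FS-NODE1
dfsradmin mem new /rgname:"Data-RG" /memname:FS-NODE2
dfsradmin rf new /rgname:"Data-RG" /rfname:"Data"
dfsradmin conn new /rgname:"Data-RG" /sendmem:FS-NODE1 /recvmem:FS-NODE2
dfsradmin conn new /rgname:"Data-RG" /sendmem:FS-NODE2 /recvmem:FS-NODE1

rem 4. Mark the restored server as primary so its content wins initial sync:
dfsradmin membership set /rgname:"Data-RG" /rfname:"Data" /memname:FS-NODE1 /localpath:D:\Data /isprimary:true /membershipenabled:true
dfsradmin membership set /rgname:"Data-RG" /rfname:"Data" /memname:FS-NODE2 /localpath:D:\Data /membershipenabled:true
```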
0

 

Assisted Solution

by:Alumicor
Alumicor earned 0 total points
ID: 39923661
I've already found a workaround: backing up at the file-share level. A former colleague suggested it as the easiest approach, since it avoids all the DFS pain. It has been working well so far.
0
 

Author Comment

by:Alumicor
ID: 39923664
Just as you do, I back up only one of the servers. We have a 100 Mbps pipe between the two locations, so replication is quite fast.
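After a restore, the remaining replication backlog over that link can be watched with dfsrdiag (a sketch; the group, folder, and server names are examples, not from this setup):

```shell
rem Show how many files the sending member still has to replicate
rem to the receiving member for the given replicated folder:
dfsrdiag backlog /rgname:"Data-RG" /rfname:"Data" /smem:FS-NODE1 /rmem:FS-NODE2
```

A shrinking backlog count confirms that replication is catching up over the link.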
0
 

Author Closing Comment

by:Alumicor
ID: 39933726
In practice it was the easiest to implement.
0

