Solved

Server 2008R2 DFS Replication of Multiple Servers

Posted on 2013-01-21
465 Views
Last Modified: 2013-09-17
Hello Experts,

I need help restructuring (or recreating) our current DFS structure.  

Current scenario: I have two Corporate Engineering groups: one located at Main-HQ and another at ENGR-HQ.  I have 4 other remote locations, each with its own engineering group.  Each corporate group needs "local" access to everyone's files.  Each of the other groups only needs "local" access to its own files.  Right now, Main-HQ has everyone's files and they are being replicated to ENGR-HQ via DFS-R.  Everyone accesses the files via a single namespace (\\domain\engineering\), but everyone outside of the two HQ offices is accessing the files remotely.

What I need to do is split the data up by location (5 sites) and use targets to each location's server share so that each location is working "locally" while still accessing the share through the same UNC path, and have each location replicate its share to the corporate office so the corporate guys have "local" access to all the files.  Then I want to replicate everyone's files to the same location they are currently being sent to.
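
For reference, here is a minimal sketch of the per-site namespace layout described above, using the DFSN PowerShell module (available from a Server 2012 or newer management host; on the 2008 R2 servers themselves the same steps are done in the DFS Management console or with dfsutil). The server and share names are hypothetical placeholders, not the real environment:

# Hypothetical sketch: one DFS folder per site under the existing namespace,
# each pointing at that site's local file server share.
Import-Module DFSN

$sites = @{
    'SiteA' = '\\SITEA-FS01\Engineering'   # placeholder server/share names
    'SiteB' = '\\SITEB-FS01\Engineering'
}

foreach ($site in $sites.Keys) {
    # Create \\domain\engineering\<site> and point it at the site's local share
    New-DfsnFolder -Path "\\domain\engineering\$site" -TargetPath $sites[$site]

    # Optionally add the corporate replica as a second folder target so the
    # HQ copy can also be reached through the same UNC path (site costing
    # controls which target clients prefer).
    New-DfsnFolderTarget -Path "\\domain\engineering\$site" `
                         -TargetPath "\\MAINHQ-FS01\Engineering\$site"
}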


Here are my questions and concerns:

1. The current namespace is in use and I'm not sure how it's set up, so I want to avoid interrupting service to this namespace. Should I create a new namespace? (A read-only way to inspect the existing setup is sketched after these questions.)

2. All the current data is on the existing namespace server.  Once the data has been moved back to the appropriate servers, should the replicated data go on a different server?  Basically, should the namespace server be just a namespace server?

3. Once all the replication links are established, should the new "all-inclusive" folder be replicated to another site, or should I set up two replication links: one to each server that needs all the data?
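
Regarding question 1, here is a quick, read-only sketch for documenting how the existing namespace is set up before touching it. It assumes the DFSN PowerShell module on a Server 2012 or newer management host; dfsutil on the 2008 R2 boxes exposes the same information:

# Read-only inspection of the existing namespace -- nothing below changes
# the configuration, so it is safe to run against the live root.
Import-Module DFSN

# Root type (domain-based v1/v2) and state
Get-DfsnRoot -Path '\\domain\engineering'

# Every folder (link) under the root
Get-DfsnFolder -Path '\\domain\engineering\*'

# Folder targets for a specific folder, to see which server holds the data today
Get-DfsnFolderTarget -Path '\\domain\engineering\drafting'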


I hope this is clear.  If not, I can try to explain in more detail.
Question by:Justin Perry
3 Comments
 

Author Comment

by:Justin Perry
ID: 38803585
Let me explain it a little differently:

The goal of this project was to get the engineering folder structure standardized across all locations.  

Someone brought up using DFS to have them all accessing the same data. That led to a single namespace of \\domain\engineering\.  Under the root were the different components of the engineering department: Mechanical, Drafting, Thermal, etc., then separated by year, then by job number.

The tree looks like this:
\\domain\engineering\drafting\2012\123456\filename.dwg
\\domain\engineering\mechanical\2013\654321\filename.txt

Because of our small WAN pipes to a few locations, the decision was made to pull everyone's data into the DFSRoot folder (Engineering) on the DFS namespace server and replicate that entire folder back to everyone.  The first replication was set up between the two corporate engineering locations.  That is working fine.  Then I got handed the project after the co-worker who set it up left.

I tried to set up the full folder replication to the only location that was not already using the new DFS structure, and it was a nightmare for bandwidth and replication (initially replicating 125 GB of data, then trying to keep up with a 2-3% daily change rate).

NEW THOUGHTS AND PLAN:

Each location only needs to have a local copy of its own data.

The 2 corporate engineering groups need to have a local copy of everyone's data.

The non-corporate engineers do not necessarily need to have access to anyone else's data, as the need for them to read or write to another location's files is rare.

I think the best approach is to have the non-corporate engineers use their local server and replicate those folders down to the main corporate engineering location.  The main corporate guys can use the DFS UNC (or a drive mapping) to see everyone else's files, separated at the top level by location-named folders.  Since each folder is being replicated to a server local to them, they can access everyone's files "locally" and the changes get replicated back to the intended location, not to everyone.

The other corporate location can have one of the following replication setups (a rough sketch of option 1 follows below):
1. Replicate the consolidated folder from the main corporate site
2. Have each non-corporate server replicate to both the main corporate site and the other corporate site.
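
Here is a rough sketch of what option 1 could look like with the DFSR PowerShell module (Server 2012 R2 or newer management tools; on 2008 R2 itself the equivalent is dfsradmin.exe or the DFS Management console). The server names, group names, and paths are placeholders, not the real environment:

# Hypothetical hub-and-spoke sketch: one replication group per remote site,
# replicating that site's folder to the main corporate server only.
Import-Module DFSR

$hub   = 'MAINHQ-FS01'                    # placeholder hub server
$sites = @('SITEA-FS01', 'SITEB-FS01')    # placeholder spoke servers

foreach ($spoke in $sites) {
    $rg = "Engineering-$spoke"

    New-DfsReplicationGroup -GroupName $rg
    New-DfsReplicatedFolder -GroupName $rg -FolderName 'Engineering'
    Add-DfsrMember          -GroupName $rg -ComputerName $hub, $spoke

    # Bidirectional connection between the spoke and the hub only
    Add-DfsrConnection -GroupName $rg -SourceComputerName $spoke `
                       -DestinationComputerName $hub

    # The spoke holds the authoritative copy for the initial sync
    Set-DfsrMembership -GroupName $rg -FolderName 'Engineering' `
        -ComputerName $spoke -ContentPath 'D:\Engineering' `
        -PrimaryMember $true -Force
    Set-DfsrMembership -GroupName $rg -FolderName 'Engineering' `
        -ComputerName $hub -ContentPath "F:\Engineering\$spoke" -Force
}

Option 1 would then add one more group that replicates the consolidated hub folder between the two corporate servers; option 2 would instead add the second corporate server to Add-DfsrMember and a second Add-DfsrConnection inside the loop.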

Hope this helps.
LVL 77

Expert Comment

by:arnold
ID: 38805062
Basically, I think you are looking at achieving:
\\domain\engineering\siteA
\\domain\engineering\siteB
such that each of these folders will replicate up to the Main and Engineering HQs, while each site will only access its own.

The server at siteB will have the \\domain\engineering\siteB share data, while everything else will have to be accessed remotely back at Main/ENGR.

Since you have the main DFS as a single structure, adding folders to the namespace might be a way to achieve what you are looking for.

siteA's server1 d:\sitea will be set up with replication to the Main and ENGR HQs' f:\sitea.
You could limit the bandwidth you dedicate to the replication, as well as the schedule, depending on the timeframe in which you want the data to be available at the other two locations.
You would then control the access based on the site.

You would repeat the same for siteB.

The replication can be from each site to only one hub, while the replication between ENGR and Main will be full.
This will depend on what bandwidth you have and how you want to use it,
i.e. siteA to Main,
siteB to ENGR,
Main <=> ENGR for the siteA and siteB replication connections.
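
On the bandwidth/schedule point, here is a sketch of throttling a group's replication window with the DFSR PowerShell module (2012 R2 or newer tools; the same settings live under the replication group's Schedule in the 2008 R2 DFS Management console). The group name is the hypothetical one from the earlier sketch, and the digit-to-bandwidth mapping should be verified against the cmdlet help before relying on it:

# Build a 96-character schedule string (one hex digit per 15-minute slot):
# assumed mapping is 'f' = full bandwidth, lower digits = throttled levels.
# Here: full bandwidth outside business hours, a throttled level ('4')
# during 08:00-18:00; applied to weekdays only below.
$slots = for ($i = 0; $i -lt 96; $i++) {
    $hour = [math]::Floor($i / 4)
    if ($hour -ge 8 -and $hour -lt 18) { '4' } else { 'f' }
}
$bandwidth = -join $slots

# Apply the throttled window Monday-Friday; other days keep their existing schedule
Set-DfsrGroupSchedule -GroupName 'Engineering-SITEA-FS01' `
    -Day Monday,Tuesday,Wednesday,Thursday,Friday `
    -BandwidthDetail $bandwidth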
0
 
LVL 37

Accepted Solution

by:ArneLovius (earned 500 total points)
ID: 38807787
I would add locations to the root of the folder structure and then use multiple replication groups.

\engineering\corp\folders
            \siteA\folders
            \siteB\folders



You could also use:

\engineering\drafting\folders
            \mechanical\folders
            \siteA\folders
            \siteB\folders


The key is multiple replication groups; you would need at least three: one for each site and one for the two other locations.

You could continue with the current structure and just have multiple folders in the replication groups, but you run the risk of name collisions if, for example, site A and site B both added a folder with the same name. If you split out the root folders for each of the remote sites, this cannot happen...
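
To make that concrete, here is a minimal sketch of the separate replication groups using the DFSR PowerShell module (group names are hypothetical; on 2008 R2 itself the dfsradmin.exe command-line tool or the DFS Management console does the same job):

# Hypothetical "at least three groups" layout: one group per remote site
# (spoke <-> Main HQ), plus one group for the corp folders between the two
# HQ servers.
Import-Module DFSR

New-DfsReplicationGroup -GroupName 'Engineering-SiteA'   # SiteA   <-> Main HQ
New-DfsReplicationGroup -GroupName 'Engineering-SiteB'   # SiteB   <-> Main HQ
New-DfsReplicationGroup -GroupName 'Engineering-Corp'    # Main HQ <-> ENGR HQ

# Each group then gets its own members, connections and local paths
# (Add-DfsrMember / Add-DfsrConnection / Set-DfsrMembership), so a folder
# added under \engineering\siteA can never collide with one added under
# \engineering\siteB.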
