1shop

asked on

2003, IIS and DFS

Anyone familiar with DFS, I'd like your input on using it for this purpose.

We are looking for a way to synchronize a folder across multiple web servers. The content is images served by IIS.

Proposed config: Replace the current local folder of images (approx. 100,000 files) with a DFS share of these images. Needs quick replication and redundancy.

Current config: 8 web servers with approx. 100 sites each (identical), pulling images from local folders that are synchronized via robocopy.exe. The sync boils down to a robocopy mirror along these lines, run against each server (paths and switches here are illustrative, not our exact script):
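
:: Mirror the master image folder to one web server; /MIR also deletes
:: files on the target that no longer exist on the source.
robocopy D:\Images \\WEB02\Images$ /MIR /R:2 /W:5 /FFT /NP /LOG+:D:\Logs\imgsync.log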

Note - the image files change regularly.

Ideal - access the images from the DFS folder locally on each web server. However, I've read that accessing content directly rather than via the DFS tree can be problematic.

Thanks in advance for input and suggestions.
tigermatt


If all you are looking for is replication, then there is no need to put the shares into a namespace. DFS Replication works at the file system level, so it is simply a case of setting up the replication feature in DFS: you specify the servers to replicate between and the path on each server's hard drive which should be replicated.

In this case, you can safely access and write to the share locally, and this would be replicated between the servers. It's only when you start using namespaces that accessing shares directly on particular servers becomes a problem.
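
As a rough sketch of that setup using the DfsrAdmin tool which ships with the 2003 R2 DFS Management tools (the group, folder, server and path names below are placeholders, and the parameter names are from memory - verify each one with dfsradmin <object> <action> /? before relying on it):

:: Create a replication group and a replicated folder (placeholder names).
dfsradmin rg new /rgname:WebImages
dfsradmin rf new /rgname:WebImages /rfname:Images

:: Add two of the web servers as members of the group.
dfsradmin mem new /rgname:WebImages /memname:WEB01
dfsradmin mem new /rgname:WebImages /memname:WEB02

:: Connect them in both directions for two-way replication.
dfsradmin conn new /rgname:WebImages /sendmem:WEB01 /recvmem:WEB02
dfsradmin conn new /rgname:WebImages /sendmem:WEB02 /recvmem:WEB01

:: Point each membership at the local folder; WEB01 seeds the initial sync.
dfsradmin membership set /rgname:WebImages /rfname:Images /memname:WEB01 /localpath:D:\Images /membershipenabled:true /isprimary:true
dfsradmin membership set /rgname:WebImages /rfname:Images /memname:WEB02 /localpath:D:\Images /membershipenabled:true

The same setup can be done through the New Replication Group wizard in the DFS Management console, which is probably easier the first time through.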

-tigermatt
So let's talk a little about the sense of this functionality:

DFS is meant to provide a common source for clients: you publish one share that points to a set of folders which reside in different locations or on different servers. These folders can be replicated to other servers or directories to increase redundancy, or simply to provide faster access to the resources.

If you want to use DFS to replicate content to different web servers, it may of course work, but since you have 8 web servers, you may produce a lot of traffic. You could set up a test system with about two servers just to see what traffic you generate and how your servers handle it.

Reading the files locally should not be a problem for the web servers; I think writing is more of an issue if you access the files directly instead of going through the share.

I'm not quite sure why you are not referencing the same file structure from all of the web sites instead of copying the files around.
As long as you don't modify the same file from multiple servers at the same time, you can try DFS. The only catch is that when creating the DFS Replication group, you should double-check the staging folder size and make sure it is identical on all members, with enough space for the initial replication. You may want to upgrade your Win2k3 servers to R2 if you haven't already, as the replication works much better there. But test it out first: with around 100,000 files (I'm not sure of the total size), and depending on how frequently changes are made, you could create a lot of traffic during normal business hours. I suggest you schedule the replication interval as needed.
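
For instance, the staging quota can be raised per member from the command line (same caveat as any sketch here - the names are placeholders, and the /stagingsize parameter, which I take to be in MB, should be verified against dfsradmin membership set /?):

:: Raise the staging quota on a member to cover the initial replication.
dfsradmin membership set /rgname:WebImages /rfname:Images /memname:WEB01 /stagingsize:8192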

Honestly, if you are using robocopy to do the synchronization, you may want to take a look at Secure Copy (http://www.scriptlogic.com/products/securecopy/Comparesecurecopy.asp), which gives you somewhat friendlier features. But again, for that number of servers and that data size, you may eventually need a more robust product that gives you better control of WAN bandwidth and more reliable replication, such as XOsoft from CA: http://www.ca.com/us/products/product.aspx?ID=5879
1shop

ASKER

The current web servers are configured to be identical and sit behind a load balancer... the challenge is that each web server must have exactly the same images available. The image files are subject to change as our clients upload images etc. The changes are really minimal, perhaps a couple of hundred files a day. As we grow, synchronizing the "images" directory is taking significant time and resources.

Due to access speed and some challenges using UNC path references in portions of our application, ideally these files would remain local on the servers and take advantage of the multi-master sync used in DFS. (Currently we are using a script and robocopy.exe.)

Directory size is 5.72 GB, 199,336 files, 66,738 folders.

Ideal - sync the files to all web servers and have IIS continue to access them locally.
Less ideal - move to a full DFS file share access method and access via UNC path.

To follow up on what tigermatt said: I would nevertheless recommend checking this in a lab first.
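
While testing in the lab, one way to gauge whether replication keeps up with your change rate is to check the backlog between two members (again a sketch with placeholder names, assuming the dfsrdiag syntax from the 2003 R2 tools):

:: Count files still queued for replication from WEB01 to WEB02.
dfsrdiag backlog /rgname:WebImages /rfname:Images /smem:WEB01 /rmem:WEB02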
ASKER CERTIFIED SOLUTION
tigermatt

This solution is only available to Experts Exchange members.
1shop

ASKER

Thanks very much - I'll VM this up and try it out.