Solved

HA File Server on Server 2008

Posted on 2013-06-11
250 Views
Last Modified: 2013-06-23
Looking at ways to implement an HA solution on Server 2008.

I had a look at DFS Replication, but the problem is that it only allows a single namespace server, under which you select shared folders with multiple target locations.

Failure of the namespace server would mean the folders could no longer be accessed through the namespace. Additionally, from what I've read, there is no mechanism to lock a file when it is opened by one user, so there would be issues with data being opened/saved incorrectly.

Alternatively, I can set up a failover cluster as in the following guides:
http://blogs.technet.com/b/filecab/archive/2009/06/29/deploying-dfs-replication-on-a-windows-failover-cluster-part-i.aspx
http://blogs.technet.com/b/filecab/archive/2009/06/29/deploying-dfs-replication-on-a-windows-failover-cluster-part-ii.aspx
http://blogs.technet.com/b/filecab/archive/2009/06/29/deploying-dfs-replication-on-a-windows-failover-cluster-part-iii.aspx

However, this arrangement does require shared cluster storage, and I'm not looking to invest in an expensive SAN solution. I'm thinking of possibly using some other means of storage virtualisation.

Basically, I'd like to set up an HA file server with just two servers, in a cost-effective way. If there are any suggestions on achieving this, please let me know. Thank you.
Question by:dave558
6 Comments
 

Author Comment

by:dave558
ID: 39239983
Regarding the DFS namespace and replication, it does seem that locks are placed on files. Is anyone able to clarify how this is being achieved?

I noticed that Word documents opened in Word are locked so they can't be edited by another user in Word. However, it appears that this doesn't prevent Notepad from accessing and modifying the content.

So it seems certain applications like Word do prevent editing/renaming/deleting of locked files, whereas others like Notepad don't care whether a file is locked, don't place a lock themselves, and can modify/delete the file at any time.
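
From a bit of reading, I think the difference comes down to the sharing mode an application passes when it opens the file. I put together a rough test along these lines (the path is just a placeholder for a document on the share, and I may well have some details wrong): the first handle is opened Word-style with write sharing denied, and a second attempt to open the same file for writing then fails with a sharing violation.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* First handle: open the document the way a Word-style app would,
       keeping it open and sharing it for READ only (no write sharing). */
    HANDLE hFirst = CreateFileA("C:\\data\\example.doc",      /* placeholder path */
                                GENERIC_READ | GENERIC_WRITE,
                                FILE_SHARE_READ,              /* others may read, not write */
                                NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (hFirst == INVALID_HANDLE_VALUE) {
        printf("first open failed: %lu\n", GetLastError());
        return 1;
    }

    /* Second handle: try to open the same file for writing while the first
       handle is still open.  This fails with ERROR_SHARING_VIOLATION (32),
       which is the "file is locked" behaviour seen with Word. */
    HANDLE hSecond = CreateFileA("C:\\data\\example.doc",
                                 GENERIC_WRITE,
                                 FILE_SHARE_READ | FILE_SHARE_WRITE,
                                 NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (hSecond == INVALID_HANDLE_VALUE)
        printf("second open denied: error %lu (sharing violation)\n", GetLastError());
    else
        CloseHandle(hSecond);

    CloseHandle(hFirst);
    return 0;
}

A Notepad-style editor, by contrast, reads the whole file into memory and closes its handle straight away, and only briefly opens it again when you save, so most of the time it isn't holding any handle (or lock) on the file at all.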

I've just been reading about byte-range locks for file locking, but I'd like to know more about this mechanism. Are the locks managed by the OS, with some sort of database tracking which files are open and which portions of them are locked?
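
My understanding so far (happy to be corrected) is that byte-range locks aren't kept in a separate on-disk database; the OS/SMB server tracks them per open handle, and they disappear when the range is unlocked or the handle is closed. A small test along these lines seems to request one through the Win32 API (again, the path is just a placeholder):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE h = CreateFileA("C:\\data\\example.dat",           /* placeholder path */
                           GENERIC_READ | GENERIC_WRITE,
                           FILE_SHARE_READ | FILE_SHARE_WRITE,
                           NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        printf("open failed: %lu\n", GetLastError());
        return 1;
    }

    /* Lock bytes 0-4095 exclusively.  Other handles (local or over SMB)
       that try to read/write this range get ERROR_LOCK_VIOLATION until
       the range is unlocked or this handle is closed. */
    OVERLAPPED ov = {0};                          /* starting offset 0 */
    if (!LockFileEx(h, LOCKFILE_EXCLUSIVE_LOCK | LOCKFILE_FAIL_IMMEDIATELY,
                    0, 4096, 0, &ov)) {
        printf("LockFileEx failed: %lu\n", GetLastError());
    } else {
        printf("bytes 0-4095 locked; press Enter to release\n");
        getchar();
        UnlockFileEx(h, 0, 4096, 0, &ov);
    }

    CloseHandle(h);
    return 0;
}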
 

Author Comment

by:dave558
ID: 39240002
Actually, there does appear to be an option to add multiple namespace servers.

However, after disconnecting one of the file servers, it doesn't seem to use the second namespace server for file access; I can't access the network drive.
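
To see what the client actually thinks is available, I've been checking (and clearing) its cached DFS referrals with dfsutil, assuming the DFS management tools are installed on the workstation:

rem Show the client's cached referrals and which targets are marked active
dfsutil /pktinfo

rem Clear the referral cache so the client asks a namespace server again
dfsutil /pktflush

(On newer builds of dfsutil the equivalents are "dfsutil cache referral" and "dfsutil cache referral flush".)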
 
LVL 76

Expert Comment

by:arnold
ID: 39241268
You cannot use links; use targets where the shares exist.
Then double-check your DFS namespace distribution.
What do you mean by disconnecting? Unplugging it from the network, or removing it as a target?

The disconnect from the network recovers faster than the propagation of the removal of a target from the list of possible targets.

Not sure what locks you mean. Client applications that access a file and lock it will prevent replication of the file until the client application exits.
e.g. Office products create a ~filename.doc for each file they open.
Open files are not replicated. When you save/close the file, it will be replicated.

 
LVL 56

Accepted Solution

by:
Cliff Galiher earned 500 total points
ID: 39241753
DFS-R really isn't meant to be used for HA, and yes, file locks will be an issue. It isn't that the files can't be locked; it's that only the referenced file is locked, not all of the replicas. Here is a scenario where that can be a problem:

HA, by definition, means you probably don't want to have to manually fail over or fail back; otherwise it wouldn't technically be "highly" available, it'd be more of a DR setup. Since, with HA, you'd want failover to be automatic, there are plenty of network conditions that could cause one workstation (we'll call it workstation A) to get a referral to Server A from the namespace server, and workstation B to later get a referral to Server B.

Now, when workstation A *opens* a file, it'd place a lock on the file. But that lock would *only* be on the file on Server A. DFS-R would *not* lock the copy on Server B. So Workstation B could open the copy that exists on Server B without ever "seeing" the lock. And then you are in a situation where a version conflict would exist.

That is where DFS-R can really bite you, and it isn't even that rare. It isn't an edge case. When sysadmins try to use DFS-R for HA, it almost always happens within a week or two, and the user experience creates a bad perception, even if the actual technical issue may seem minor.

As far as HA goes, you do need to use shared storage. There is no truly "cheap" way to do this, but for just a two-server setup, shared SAS would be quite a bit less expensive than a SAN. It can get you into a truly HA setup *relatively* inexpensively. If shared SAS still breaks the budget, I'd honestly have to argue that the company probably doesn't really need HA. The point of HA is that *any* downtime costs enough money that the cost of shared storage is less than the revenue that'd be lost, and is therefore justifiable. It's all a matter of calculating the ROI.

-Cliff
 

Author Comment

by:dave558
ID: 39243084
In terms of ROI, it's not worthwhile to invest in SAN/shared SAS.

Thought I'd see if there are ways of setting up something similar, possibly virtualising normal SATA hard drives as shared storage. Performance isn't a criterion here, as there aren't many people accessing the file server at the same time.

Thanks for your comments, greatly appreciated.
 

Author Closing Comment

by:dave558
ID: 39270128
Thanks, very insightful.
