Solved

HA File Server on Server 2008

Posted on 2013-06-11
254 Views
Last Modified: 2013-06-23
Looking at ways to implement an HA solution on Server 2008.

I had a look at DFS Replication, but the problem is that it only allows a single namespace server, with shared folders having multiple target locations.

Failure of the namespace server would result in not being able to access folders using the namespace. Additionally, from what I've read, there is no mechanism to lock files when they are opened by one user, so there will be issues with data being incorrectly opened/saved.

Alternatively, I can set up a failover cluster as per the following guide:
http://blogs.technet.com/b/filecab/archive/2009/06/29/deploying-dfs-replication-on-a-windows-failover-cluster-part-i.aspx
http://blogs.technet.com/b/filecab/archive/2009/06/29/deploying-dfs-replication-on-a-windows-failover-cluster-part-ii.aspx
http://blogs.technet.com/b/filecab/archive/2009/06/29/deploying-dfs-replication-on-a-windows-failover-cluster-part-iii.aspx

However, this arrangement does require shared clustered storage, and I'm not looking to invest in an expensive SAN solution. I'm thinking of possibly using some other means of storage virtualisation.

Basically, I'd like to set up an HA file server with just two servers, in a cost-effective way. If there are any suggestions on achieving this, please let me know. Thank you.
Question by:dave558
6 Comments
 

Author Comment

by:dave558
ID: 39239983
Regarding the DFS namespace and replication, it does seem to place locks on files. Can anyone clarify how this is achieved?

I noticed the Word application is able to lock Word documents against editing by another user. However, that doesn't prevent Notepad from accessing and modifying the content.

So it seems certain applications like Word do prevent editing/renaming/deleting of locked files, whereas others like Notepad neither respect a lock nor place one themselves, and can modify/delete those files at any time.

I've just been reading about byte-range locks for file locking, but would like to know more about this mechanism. Are the locks managed by the OS, with some sort of database tracking information about open files and the portions of those files that are locked?
 

Author Comment

by:dave558
ID: 39240002
Actually, there does appear to be an option to add multiple namespace servers.

However, after disconnecting one of the file servers, it doesn't seem to use the second namespace server for file access; I can't access the network drive.
 
LVL 77

Expert Comment

by:arnold
ID: 39241268
You cannot use links; use targets where the shares exist.
Then double-check your DFS namespace distribution.
What do you mean by disconnecting? Unplugging from the network, or removing as a target?

The disconnect from the network recovers faster than the propagation of the removal of a target from the referral list.

Not sure what locks you mean. Client applications that access a file and lock it will prevent replication of the file until the client application exits.
e.g. Office products create a ~filename.doc for each file they open.
Open files are not replicated. When you save/close the file, it will be replicated.

 
LVL 57

Accepted Solution

by:
Cliff Galiher earned 500 total points
ID: 39241753
DFS-R really isn't meant to be used for HA, and yes, file locks will be an issue. It isn't that the files can't be locked; it's that only the referenced file is locked, not all replicas. Here is a scenario where that can be a problem:

HA, by definition, means you probably don't want to have to manually fail over or fail back. Otherwise it wouldn't technically be "highly" available; it'd be more of a DR setup. Since, with HA, you'd want failover to be automatic, there are plenty of network conditions that could cause one workstation (we'll call it workstation A) to get a referral to Server A from the namespace server, and workstation B to later get a referral to Server B.

Now, when workstation A *opens* a file, it'd place a lock on the file. But that lock would *only* be on the copy on Server A. DFS-R would *not* lock the copy on Server B. So workstation B could open the copy that exists on Server B without ever "seeing" the lock. And then you are in a situation where a version conflict exists.
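The scenario above can be modelled in a few lines; the server and file names are made up. The essential point is that each replica keeps a purely local lock table, and nothing propagates locks between members:

```python
class Replica:
    """Toy file-server replica with a purely local lock table, as in DFS-R."""

    def __init__(self, name):
        self.name = name
        self.locks = set()  # locked file paths, known only to this replica

    def open_exclusive(self, path):
        if path in self.locks:
            raise IOError(f"{path} is locked on {self.name}")
        self.locks.add(path)  # the lock is NOT replicated to other members

server_a = Replica("ServerA")
server_b = Replica("ServerB")

# Workstation A is referred to ServerA and opens the file, locking it there.
server_a.open_exclusive(r"\share\budget.xlsx")

# Workstation B is referred to ServerB, where no lock exists, so it opens the
# same logical file. The two sets of edits will later collide as a version conflict.
server_b.open_exclusive(r"\share\budget.xlsx")  # succeeds: the lock was local to ServerA
```

A retry against the *same* replica would be refused, which is exactly why the problem only shows up once two workstations land on different referrals.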

That is where DFS-R can really bite you, and it isn't even that rare. It isn't an edge case. When sysadmins try to use DFS-R for HA, it almost always happens within a week or two, and the user experience creates a bad perception, even if the actual technical issue may seem minor.

As far as HA goes, you do need to use shared storage. There is no truly "cheap" way to do this, but for just a two-server setup, shared SAS would be quite a bit less expensive than a SAN. It can get you into a truly HA setup *relatively* inexpensively. If shared SAS still breaks the budget, I'd honestly have to argue that the company probably doesn't really need HA. The point of HA is that *any* downtime costs enough money that the cost of shared storage is less than the revenue that'd be lost, and is therefore justifiable. All a matter of calculating the ROI.
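That ROI calculation is simple back-of-the-envelope arithmetic; all figures below are hypothetical placeholders to be replaced with your own outage history and hardware quotes:

```python
# Hypothetical inputs; substitute your own numbers.
downtime_cost_per_hour = 2_000   # revenue/productivity lost per hour the share is down
expected_downtime_hours = 12     # unplanned outage hours per year that clustering would avoid
shared_sas_cost = 8_000          # extra hardware cost of a two-node shared-SAS cluster

avoided_loss_per_year = downtime_cost_per_hour * expected_downtime_hours

# HA pays for itself when the avoided loss exceeds the hardware spend.
print(avoided_loss_per_year, shared_sas_cost, avoided_loss_per_year > shared_sas_cost)
```

If the inequality comes out false with honest numbers, that's Cliff's point: the business case for true HA isn't there.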

-Cliff
 

Author Comment

by:dave558
ID: 39243084
In terms of ROI, it's not worthwhile for us to invest in a SAN or shared SAS.

Thought I'd see if there are ways of setting up something similar, possibly virtualising ordinary SATA hard drives as shared storage. Performance isn't a criterion here, as there aren't too many people accessing the file server at the same time.

Thanks for your comments, greatly appreciated.
 

Author Closing Comment

by:dave558
ID: 39270128
Thanks, very insightful.
