Solved

DFS not working for large files

Posted on 2010-01-11
Medium Priority
762 Views
Last Modified: 2012-05-08
Hello, I have DFS installed and in use on 3 servers in my environment.  It was working perfectly for a long time, except it won't replicate my large 6.2 GB backup files.  The files in question are .bak, and each is 6.2 GB.  I ran the DFS health check and saw that it replicated its test file, and it shows that the connection is established and working.  I increased the staging quota from 4 GB to 20 GB, thinking that it needed to be at least the size of what's being transferred, but I now know what the staging quota really means.

BTW, all 3 servers are running Server 2008.
Question by:inferno521
5 Comments
 
LVL 26

Accepted Solution

by:
lnkevin earned 2000 total points
ID: 26292247
The default size of each staging folder is 4,096 MB. This is not a hard limit, however. It is only a quota that is used to govern cleanup and excessive usage based on high and low watermarks (90 percent and 60 percent of staging folder size, respectively). For example, when the staging folder reaches 90 percent of the configured quota the oldest staged files are purged until the staging folder reaches 60 percent of the configured quota. It is possible to replicate a file that is larger than the configured quota of a staging folder.

If a staging folder quota is configured to be too small, DFS Replication might consume additional CPU and disk resources to regenerate the staged files. Replication might also slow down because the lack of staging space can effectively limit the number of concurrent transfers with partners.
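If you do want to enlarge the staging quota, it can be set from the command line on Server 2008 with the DfsrAdmin tool rather than the GUI. A minimal sketch, assuming hypothetical names for the replication group, replicated folder, and member server (substitute your own):

```shell
rem Sketch: raise the staging quota to 20 GB (value is in MB) for one member.
rem RgName/RfName/MemName below are placeholders, not names from this thread.
dfsradmin Membership Set /RgName:BackupRG /RfName:Backups ^
    /MemName:SERVER01 /StagingSize:20480
```

Remember that, as described above, the quota only governs cleanup watermarks; replication of a file larger than the quota is still possible, just less efficient.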

However, according to Microsoft, DFS File Replication Service excludes .bak and .tmp files by default, so it may not replicate them:
http://support.microsoft.com/?kbid=229928

You can work around this by changing the extension from .bak to something like .test, then renaming it back after the file is replicated.
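The rename trick can be done in bulk from a command prompt, since cmd's ren accepts wildcards. A minimal sketch, assuming a hypothetical backup folder path:

```shell
rem Sketch: D:\Backups is a placeholder path, not one from this thread.
rem Rename all .bak files so DFS-R will pick them up...
ren "D:\Backups\*.bak" "*.test"
rem ...then, after replication completes on the other side, reverse it:
ren "D:\Backups\*.test" "*.bak"
```

Note that the rename has to be reversed on the destination as well before the files are usable as backups again.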

K

 
LVL 1

Author Closing Comment

by:inferno521
ID: 31675804
I should have thought of that (.bak and .tmp being excluded).  It's like that for most backup programs, so they don't back up backups.  I guess I dismissed it because there doesn't seem to be any user-defined exemption list.
 
LVL 26

Expert Comment

by:lnkevin
ID: 26304142
Another workaround is to use a robocopy or xcopy batch file to schedule the .bak file transfers.

K
 
LVL 1

Author Comment

by:inferno521
ID: 26305856
Thanks for the advice lnkevin, but I'm trying to automate this as much as possible.  If I were to write a batch file, wouldn't I have to specify the filename explicitly (or use wildcards)?

The .bak files are generated through SQL and follow a naming convention (date).  I wanted to use DFS because of the ability to choose how much bandwidth to allocate to this.  The files are transferred from our co-lo to our home office, and our site-to-site connection speed is about 500 KB, so the ability to throttle down the transfer speed during business hours and ramp it up afterward is key for me.  Also, DFS is easier for my manager to understand, when it works, because of the GUI.
 
LVL 26

Expert Comment

by:lnkevin
ID: 26307274
You can either use a wildcard or copy the entire folder with the "new files only" option.  You can use the copy switch to replace files, or backup mode (recommended).  Open a command line and type:
robocopy /?

It will give you a list of the options you can use.
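For the bandwidth-throttling concern raised above, robocopy's /IPG (inter-packet gap) switch can slow the transfer during business hours. A minimal sketch of a schedulable batch file, assuming hypothetical source/destination UNC paths and a placeholder log location:

```shell
rem Sketch: the share names and log path below are placeholders.
rem *.bak wildcard matches the date-named SQL backups.
rem /Z   = restartable mode (survives a dropped WAN link)
rem /IPG = inter-packet gap in ms, crude bandwidth throttle for slow links
rem /R /W = retry count and wait between retries
robocopy \\COLOSRV\Backups \\OFFICESRV\Backups *.bak ^
    /Z /IPG:50 /R:3 /W:30 /LOG+:C:\Logs\bakcopy.log
```

Two Task Scheduler jobs could run this with different /IPG values, a large gap during business hours and a small one overnight, to mimic DFS-R's bandwidth scheduling.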

K
