Solved

DFS-R Backlog file list

Posted on 2013-12-17
Medium Priority
2,915 Views
Last Modified: 2013-12-23
I have a problem with DFS Replication. I can see that our users have dumped a whole heap of files (approx. 12,000) into a couple of replicated folders and the system is backlogging these files. The backlog is not growing very quickly, which suggests that DFS is still working to a degree. However, when I run a DFS Backlog report it only gives me the first 100 files. Ideally I want to get a list of *all* the backlogged files, as I want to move them to a non-DFS replicated folder and then re-introduce them to replication slowly.

Anybody know how to get a complete list of backlogged files?
Question by:BluecubeTechnology
5 Comments
 
LVL 8

Assisted Solution

by:MarkieS
MarkieS earned 2000 total points
ID: 39725953
Hi

Take a look at DFSRMon.exe.  
http://certcollection.org/forum/topic/106034-dfsrmon-v110/

It's kinda clunky and slow on the GUI, but it is gathering a lot of info in the background, so don't expect it to be fast.

Also use TRACE32.EXE to monitor the live log files in %WINDIR%\debug\DFSRxxxxx.log.gz and DFSRxxxxx.log

DFSRxxxxx.log is the current file in use, and TRACE32.EXE will show you live replication activity as it happens. For full instructions on how to view it, see http://support.microsoft.com/kb/958893
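If you also want to search the archived DFSRxxxxx.log.gz files rather than just the live log, they can be expanded in bulk first. A minimal PowerShell sketch using .NET's GZipStream; the destination folder is an example only:

# Expand the archived DFSR debug logs so they can be searched alongside the live log.
# Source is the standard debug folder; the destination path is just an example.
$src = Join-Path $env:WINDIR 'debug'
$dst = 'C:\Temp\DFSRLogs'
New-Item -ItemType Directory -Path $dst -Force | Out-Null
$buffer = New-Object 'byte[]' 65536

Get-ChildItem -Path $src -Filter 'Dfsr*.log.gz' | ForEach-Object {
    $in  = [System.IO.File]::OpenRead($_.FullName)
    $gz  = New-Object System.IO.Compression.GZipStream($in, [System.IO.Compression.CompressionMode]::Decompress)
    $out = [System.IO.File]::Create((Join-Path $dst $_.BaseName))   # BaseName drops the .gz
    while (($read = $gz.Read($buffer, 0, $buffer.Length)) -gt 0) { $out.Write($buffer, 0, $read) }
    $out.Close(); $gz.Close(); $in.Close()
}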

You may find that staging areas are your problem.   A temporary increase in staging area size to get you through your "time of adversity" may result in faster replication.  You can reduce the staging area size afterwards.
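For reference, on Windows Server 2012 and later the staging quota can also be raised from PowerShell with the DFSR module; on 2008 R2 the same setting is on the membership's Staging tab in the DFS Management console. A sketch only, with placeholder group/folder/server names:

# Raise the staging quota for one member of a replicated folder.
# Requires the DFSR PowerShell module (2012+); all names below are placeholders.
Import-Module DFSR
Set-DfsrMembership -GroupName 'ExampleRG' -FolderName 'ExampleFolder' `
                   -ComputerName 'FILESERVER01' -StagingPathQuotaInMB (10 * 1024)   # 10 GB as an example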

I would have reservations about removing the files from replication now only to add them back in slowly. The act of removing the files has to be replicated itself, which effectively triples the amount of replication going on.
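If you do still want more of the backlog file list than the report's 100-file cap, the DFSR WMI provider (root\MicrosoftDFS) can be queried directly. This is a rough sketch only: the class and method names below are what the provider exposes on 2008/2008 R2-era servers, all server/group/folder names are placeholders, and depending on OS version the method may still only return the head of the queue, so verify the output fields on your own servers:

# Rough sketch: outbound backlog from the sending member towards the receiving member
# for one replicated folder. All names below are placeholders.
$ns        = 'root\MicrosoftDFS'
$rfName    = 'ExampleFolder'
$sending   = 'SENDINGSERVER'
$receiving = 'RECEIVINGSERVER'

# Version vector of the receiving member for this folder
$rx = Get-WmiObject -ComputerName $receiving -Namespace $ns -Class DfsrReplicatedFolderInfo |
      Where-Object { $_.ReplicatedFolderName -eq $rfName }
$vv = $rx.GetVersionVector().VersionVector

# Ask the sending member what it still has queued relative to that vector
$tx = Get-WmiObject -ComputerName $sending -Namespace $ns -Class DfsrReplicatedFolderInfo |
      Where-Object { $_.ReplicatedFolderName -eq $rfName }
'Backlog count: ' + $tx.GetOutboundBacklogFileCount($vv).BacklogFileCount
$tx.GetOutboundBacklogFileIdRecords($vv).BacklogIdRecords | Select-Object FileName, FullPathName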

kind regards
Mark S.
 

Author Comment

by:BluecubeTechnology
ID: 39726346
Thanks for the reply; TRACE32.exe has been helpful in getting at the logs.

I've increased the staging area size to 10GB for the affected folders but this doesn't appear to have got things moving. I've also checked the bandwidth between the office and datacentre and I'm not getting anything like the amount of traffic that would suggest the line is saturated (it's a 40MB EFM circuit).
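(As an aside, one way to put a number on whether the backlog is actually draining is to poll the count with dfsrdiag; the names below are placeholders, and the count line at the top of the output reports the full backlog even though the file listing underneath is capped at 100.)

# Placeholders - substitute your own replication group, folder and member names.
dfsrdiag backlog /rgname:ExampleRG /rfname:ExampleFolder /sendingmember:SENDINGSERVER /receivingmember:RECEIVINGSERVER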

This is an extract from the live log:

+	present                         1
+	nameConflict                    0
+	attributes                      0x20
+	ghostedHeader                   0
+	data                            0
+	gvsn                            {2909EAD2-DA4D-4F28-9CB2-7028EB48070D}-v2014664
+	uid                             {2909EAD2-DA4D-4F28-9CB2-7028EB48070D}-v2014664
+	parent                          {2909EAD2-DA4D-4F28-9CB2-7028EB48070D}-v2003432
+	fence                           Default (3)
+	clockDecrementedInDirtyShutdown 0
+	clock                           20131121 16:02:37.218 GMT (0x1cee6d318ff22d9)
+	createTime                      20131121 11:06:57.230 GMT
+	csId                            {4D219D40-1196-4B26-9CD6-41DFC0B14D54}
+	hash                            00000000-00000000-00000000-00000000
+	similarity                      00000000-00000000-00000000-00000000
+	name                            3109001 - <File_Name>.xlsx
+	 Error:
+	[Error:9027(0x2343) RpcFinalizeContext downstreamtransport.cpp:1117 25016 C A failure was reported by the remote partner]
+	[Error:9027(0x2343) DownstreamTransport::RdcGet downstreamtransport.cpp:5269 25016 C A failure was reported by the remote partner]
+	[Error:9024(0x2340) DownstreamTransport::RdcGet downstreamtransport.cpp:5269 25016 C The file meta data is not synchronized with the file system]
20131218 11:28:39.602 25016 INCO  6582 InConnection::LogTransferActivity Failed to receive RAWGET uid:{2909EAD2-DA4D-4F28-9CB2-7028EB48070D}-v2014664 gvsn:{2909EAD2-DA4D-4F28-9CB2-7028EB48070D}-v2014664 fileName:3109001 - Coca-cola - Supreme Control.xlsx connId:{B8966922-E91B-4B56-807F-CC46EA43CA52} csId:{4D219D40-1196-4B26-9CD6-41DFC0B14D54} stagedSize:0 Error:
+	[Error:9027(0x2343) DownstreamTransport::RdcGet downstreamtransport.cpp:5346 25016 C A failure was reported by the remote partner]
+	[Error:9027(0x2343) RpcFinalizeContext downstreamtransport.cpp:1117 25016 C A failure was reported by the remote partner]
+	[Error:9027(0x2343) DownstreamTransport::RdcGet downstreamtransport.cpp:5269 25016 C A failure was reported by the remote partner]
+	[Error:9024(0x2340) DownstreamTransport::RdcGet downstreamtransport.cpp:5269 25016 C The file meta data is not synchronized with the file system]
20131218 11:28:39.602 25016 INCO  2831 InConnection::ProcessErrorStatus (Ignored) Remote error connId:{B8966922-E91B-4B56-807F-CC46EA43CA52} state:CONNECTED Error:
+	[Error:9027(0x2343) DownstreamTransport::RdcGet downstreamtransport.cpp:5346 25016 C A failure was reported by the remote partner]
+	[Error:9027(0x2343) RpcFinalizeContext downstreamtransport.cpp:1117 25016 C A failure was reported by the remote partner]
+	[Error:9027(0x2343) DownstreamTransport::RdcGet downstreamtransport.cpp:5269 25016 C A failure was reported by the remote partner]
+	[Error:9024(0x2340) DownstreamTransport::RdcGet downstreamtransport.cpp:5269 25016 C The file meta data is not synchronized with the file system]




Unfortunately I'm not sure how to interpret these errors. Some searching has suggested that this is due to temp files not being replicated, but the file attributes for this file (0x20, which is just the archive attribute; temporary would be 0x100) and others I have looked at do not indicate the temporary attribute is set, so I think that line of investigation is a dead end.
 

Accepted Solution

by:BluecubeTechnology
BluecubeTechnology earned 0 total points
ID: 39726725
I have a solution:

The problem was with the TCP Chimney Offload feature. Once this was disabled (as per the steps below) the cork was removed and DFS-R began to flow once again. Thank you MarkieS; your suggestion of using TRACE32.exe surfaced the errors that led me to the solution. The manual steps are below, with a scripted equivalent sketched after them.

1. Click Start, click Run, type Regedit, and then click OK. 
 
2. Locate the following registry subkey: 
 
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters 
 
*If any of the values below are not present, create them as DWORD values.
 
3. Double-click the EnableTCPChimney registry entry. 
 
4. In the Edit DWORD Value dialog box, type 0 in the Value data box, and then click OK. 
 
5. Double-click the EnableRSS registry entry. 
 
6. In the Edit DWORD Value dialog box, type 0 in the Value data box, and then click OK. 
 
7. Double-click the EnableTCPA registry entry. 
 
8. In the Edit DWORD Value dialog box, type 0 in the Value data box, and then click OK. 
 
9. Restart the server.
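
A scripted equivalent of the steps above, in case it needs rolling out to more than one server (a sketch only; the values mirror the manual steps and a restart is still required):

# Disable TCP Chimney Offload, RSS and TCPA via the same registry values
# described in the steps above. Run elevated, then restart the server.
$key = 'HKLM:\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters'
foreach ($name in 'EnableTCPChimney', 'EnableRSS', 'EnableTCPA') {
    New-ItemProperty -Path $key -Name $name -PropertyType DWord -Value 0 -Force | Out-Null
}

On Windows Server 2008 and later, the chimney and RSS settings can also be toggled with "netsh int tcp set global chimney=disabled" and "netsh int tcp set global rss=disabled".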


 
LVL 8

Expert Comment

by:MarkieS
ID: 39726749
Good to hear you're running again
 

Author Closing Comment

by:BluecubeTechnology
ID: 39735677
Because connections are buffered and processed on the TOE (TCP/IP Offload Engine) chip, resource limitations occur more often than they would if the traffic were processed using the ample CPU and memory resources available to the operating system. This limitation of resources on the TOE chip can cause communication issues.
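
For anyone checking their own servers before and after making this change, the current global TCP settings (including the Chimney Offload and Receive-Side Scaling states) can be viewed with:

# Shows the global TCP settings so the change can be confirmed after the reboot.
netsh int tcp show global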
