  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 282

What is a good way to retrieve files from large archives overseas and cache them to be used locally?

Hello,

I would like some recommendations on a solution I am trying to implement for a company that has two offices: one in Holland and one in the US. The US office is new, and it still needs to access a network share in Holland that holds a large archive with many individual files. The problem is that opening these files (the users work with a program that opens architectural drawings) and saving changes is slow because of the distance. To solve this I thought about setting up a DFS share with some shared virtual folders that would sync overnight between the US and Holland, but that idea lost steam after I realized the archives are big and the users can't limit themselves to a smaller set of data. So what I am looking for is whether there are any solutions out there that do this sort of file caching - a "fetch as you use" type of deal.
Samir Saber
Asked:
1 Solution
 
Mitchell Milligan (Information Technology Network Administrator) commented:
You may not want to completely abandon the DFS idea just yet. Consider seeding the replica first, which will cut the initial synchronization time and bandwidth usage. You could copy the data to an external hard drive and ship it to the other location; then, when you set up DFS Replication, the destination is already pre-seeded.

DFS Replication supports copying files to a replication group member before the initial replication. This "prestaging" can dramatically reduce the amount of data replicated during the initial replication.

The initial replication does not need to replicate contents when files differ only by real attributes or time stamps. A real attribute is an attribute that can be set by the Win32 function SetFileAttributes. For more information, see SetFileAttributes Function in the MSDN library (http://go.microsoft.com/fwlink/?LinkId=182269). If two files differ by other attributes, such as compression, then the contents of the file are replicated.

To prestage a replication group member, copy the files to the appropriate folder on the destination server(s), create the replication group, and then choose a primary member. Choose the member that has the most up-to-date files that you want to replicate because the primary member's content is considered "authoritative." This means that during initial replication, the primary member's files will always overwrite other versions of the files on other members of the replication group.
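The TechNet excerpt above says prestaging can "dramatically reduce" the initial replication, but doesn't put numbers on it, so here is a rough back-of-the-envelope sketch in Python. The 95% seed ratio is a hypothetical assumption for illustration; the 3.8 TB figure is the archive size reported later in this thread.

```python
# Rough illustration of prestaging savings during DFSR initial replication.
# The seed ratio below is a made-up example, not a measurement.

TB = 10**12  # terabyte in bytes (decimal)

def initial_replication_bytes(total_bytes, seeded_fraction):
    """Data DFSR still has to push over the WAN after prestaging:
    only the unseeded remainder (ignoring per-file hash-check overhead)."""
    return total_bytes * (1.0 - seeded_fraction)

archive = 3.8 * TB  # size reported later in this thread

no_seed = initial_replication_bytes(archive, 0.0)
seeded = initial_replication_bytes(archive, 0.95)  # 95% copied via shipped drive

print(f"No prestage : {no_seed / TB:.2f} TB over the WAN")
print(f"95% prestage: {seeded / TB:.2f} TB over the WAN")
```

Even a partial seed shrinks the WAN transfer roughly proportionally, since DFSR only ships the unseeded remainder (plus the hash-comparison overhead, which this sketch ignores).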

For information about pre-seeding and cloning the DFSR database, see DFS Replication Initial Sync in Windows Server 2012 R2: Attack of the Clones.

For more information about the initial replication, see Create a Replication Group.

Source: http://technet.microsoft.com/en-us/library/cc773238%28v=ws.10%29.aspx#BKMK_079
 
Samir Saber (Author) commented:
Thanks for the great suggestion, Mitchell, and you're right - I haven't crossed the DFS idea out yet. It will come down to which option is best financially, because I'm not sure yet how big the archive is.
 
Mitchell Milligan (Information Technology Network Administrator) commented:
If you can find out, you may be able to get an external hard drive, copy the data to it, then send it in the mail to your other site.
 
Samir Saber (Author) commented:
Will do. I believe the archive is at least 100 TB.
 
Mitchell Milligan (Information Technology Network Administrator) commented:
Ok, so we are talking between 10 and 100 external hard drives?
 
Mitchell Milligan (Information Technology Network Administrator) commented:
That's a lot more than I had thought you would say. Wow, that's a lot of data.
 
Samir Saber (Author) commented:
Just checked with their IT guys and they said it's 3.8 TB instead (I was also surprised it was that big... that's what happens when regular users tell me what they think the size is). I will most likely go the DFS route.
 
Mitchell Milligan (Information Technology Network Administrator) commented:
Sounds great. I would still look into seeding it if possible before starting DFS replication; that should save significant time and bandwidth. I'm not sure how long it would take or how much it would cost to ship an external hard drive between your sites, but I can only imagine how long it would take to synchronize 3.8 TB over the WAN. Good luck, and if this has answered your question, please be sure to close it.
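To put "how long it would take to synchronize 3.8 TB" in concrete terms, here is a small sketch. The link speeds and the 80% usable-throughput factor are assumptions for illustration, not figures from this thread.

```python
# Estimate WAN transfer time for the 3.8 TB archive at assumed link speeds.
# Link speeds and the 80% efficiency factor are illustrative assumptions.

TB = 10**12  # terabyte in bytes (decimal)

def transfer_days(size_bytes, link_mbps, efficiency=0.8):
    """Days to push size_bytes over a link, assuming only 80% of the
    nominal rate is usable (protocol overhead, competing traffic)."""
    bits = size_bytes * 8
    seconds = bits / (link_mbps * 10**6 * efficiency)
    return seconds / 86400

archive = 3.8 * TB
for mbps in (20, 100, 500):
    print(f"{mbps:>4} Mbit/s link: ~{transfer_days(archive, mbps):.1f} days")
```

At typical office WAN speeds the initial sync runs into days or weeks of saturated bandwidth, which is why shipping a pre-seeded drive and letting DFSR reconcile the remainder is often the cheaper path.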
 
Samir Saber (Author) commented:
Thanks again Mitchell.
 
Mitchell Milligan (Information Technology Network Administrator) commented:
No problem, good luck!
