  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 718

Upload to multiple locations concurrently?

I need a way for clients to upload large files (up to 30 GB) to a single address via FTP and have that upload delivered to three separate locations simultaneously.

Currently, we wait the two or three hours for the upload to complete and then resend the file to the other two locations, which takes an additional two or three hours.

We're not insistent on FTP, but we have many clients, so the client-facing end needs to be either a standard technology or a web interface.

It sounds like a DFS thing, but can DFS start replicating before a file has been fully received?
Asked by frozenJim
2 Solutions
 
Lee W, MVPTechnology and Business Process AdvisorCommented:
If you're using FTP, you could just run three separate FTP uploads at the same time. Each would take up to three times as long, but no destination would get the file "first" (maybe a few minutes' difference).
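A minimal sketch of that approach with Python's standard ftplib, one thread per destination. The hostnames and credentials would be placeholders, and the `worker` parameter exists only so the orchestration can be exercised without live servers:

```python
import threading
from ftplib import FTP

def upload_one(host, user, password, local_path, remote_name):
    """Upload one local file to a single FTP server (placeholder credentials)."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_path, "rb") as f:
            # 1 MiB blocks; ftplib streams the file rather than loading it into RAM
            ftp.storbinary(f"STOR {remote_name}", f, blocksize=1024 * 1024)

def upload_to_all(servers, local_path, remote_name, worker=upload_one):
    """Start one upload per (host, user, password) tuple and wait for all to finish."""
    threads = [
        threading.Thread(target=worker, args=(host, user, pw, local_path, remote_name))
        for host, user, pw in servers
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

With a saturated uplink this takes roughly as long as three sequential sends, but all three sites finish within minutes of each other instead of hours apart.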
 
AlexPaceCommented:
I agree with leew - it depends on where the throughput bottleneck is.

Currently you send 15 GB per hour (2 hours to send 30 GB).

If this is the maximum your computer, network, and internet connection can manage, then running concurrent uploads won't help: you'll just have three jobs running concurrently for six hours instead of three sequential two-hour jobs, for the same total of six hours.

On the other hand, if your computer, network, and internet connection can handle much more than 15 GB per hour, then you absolutely want three instances of your FTP client running so you can transfer to all three remote sites concurrently.
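That trade-off can be put into a small model. The rates below are illustrative numbers, and the sketch assumes each stream is limited by the slower of its share of the uplink and what the remote site can absorb:

```python
def delivery_hours(file_gb, uplink_gb_h, per_site_gb_h, sites, concurrent):
    """Wall-clock hours to deliver one file to every site.

    Each stream runs at the slower of (its share of the uplink) and
    (the rate the remote site can absorb).
    """
    if concurrent:
        stream_rate = min(uplink_gb_h / sites, per_site_gb_h)
        return file_gb / stream_rate
    stream_rate = min(uplink_gb_h, per_site_gb_h)
    return sites * (file_gb / stream_rate)

# Uplink saturated at 15 GB/h: concurrency gains nothing (6 hours either way).
# Fast uplink (100 GB/h) with sites capped at 15 GB/h: concurrent delivery
# finishes in 2 hours instead of 6 sequential hours.
```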
 
QlemoDeveloperCommented:
This question has been classified as abandoned and is being closed as part of the Cleanup Program. See my comment at the end of the question for more details.
 
frozenJimAuthor Commented:
I think that I should explain the problem more clearly:

A variety of uploaders spread throughout the world - over whom I have no control - need to upload very large files to my FTP server. Each will send one file, once, using their own FTP client software.

I need these files to arrive in three locations throughout the world simultaneously: India, England, and Canada.

What I envision is an FTP server smart enough to "pass through" the bits as they arrive at the main server. But I am unaware of where to find this technology.
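The core of that pass-through idea - a write-through tee that relays each block downstream the moment it arrives - can be sketched in a few lines. Everything here is illustrative: the hosts and credentials are placeholders, and a real deployment would have to hook the tee into the upload path of an actual FTP server:

```python
import io
from ftplib import FTP

class FanOutWriter:
    """Write-through tee: each block written here is immediately written to
    every downstream sink, so replication starts before the upload finishes."""

    def __init__(self, sinks):
        self.sinks = sinks

    def write(self, block):
        for sink in self.sinks:
            sink.write(block)
        return len(block)

def ftp_sink(host, user, password, remote_name):
    """Hypothetical downstream: open a raw STOR data connection so relayed
    blocks stream straight to the remote site. (Close it and call voidresp()
    on the control connection once the upload completes.)"""
    ftp = FTP(host)
    ftp.login(user, password)
    conn = ftp.transfercmd(f"STOR {remote_name}")
    return conn.makefile("wb")  # file-like view of the data socket

# In-memory demonstration of the tee behaviour:
sinks = [io.BytesIO() for _ in range(3)]
tee = FanOutWriter(sinks)
tee.write(b"first block ")
tee.write(b"second block")
```

In production the three sinks would be `ftp_sink(...)` connections to India, England, and Canada, fed block by block as the inbound upload arrives.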

If anyone has a solution, I would be grateful.
 
Lee W, MVPTechnology and Business Process AdvisorCommented:
You're looking for a broadcast. FTP is a protocol - a set of rules defining how something (in this case, file transfer) gets done - and there is no facility I've ever heard of that allows FTP to do this.

You might want to look at services that don't use FTP - DropBox or similar file sharing sites.
 
frozenJimAuthor Commented:
Yeah, thanks Leew, that is what I was thinking.

Due to security considerations, I cannot use a PUBLIC CLOUD service.  I need to own my own DropBox (or something similar).

Rumour has it that Watchguard may be building this device now... you receive the file in one location and AS IT ARRIVES it is replicated to other sites on a block basis.  THIS is what I need.

Anyone else know of this kind of solution?
