

what is the best way to upload large amounts of data?

Posted on 2009-04-04
Medium Priority
Last Modified: 2012-05-06
Hi, I currently back up my customers' data to a local server using Ahsay backup software. Although I am sure the server I use is good for the purpose, I am considering moving to a dedicated server so that my bandwidth is not used up and the customers' data is safer.

My only dilemma is that some of these accounts might be hundreds of GB, and I have asked many of the dedicated server companies whether I can ship the data to them on a USB HDD, for example, but their rules state that I cannot. I am considering going with Be unlimited broadband because they offer an upload speed of 2.5 Mbit. Are there any other broadband providers with a higher upload speed for this purpose, or is that ample to, say, FTP the data to the dedicated server?

I am with Virgin, so that doesn't really help in the department of dynamic IP, though I have got round that with dyndns.org. I definitely want to go down the route of a dedicated server and have looked at ovh.co.uk. Has anybody had any dealings with this company? They seem to be pretty reasonable. Or is there another way of sending large amounts of data to a dedicated server without purchasing, say, a leased line?

Thanks in advance.
Question by:wireless24

Expert Comment

ID: 24067996
For a Linux solution you can try netcat.

To transfer a large file, run the following commands:

On the receiving end:
# netcat -vvn -l -p 7000 -q 2 | tar xzf -

And on the sending end:
# tar czpf - <filename> | netcat -vvn <ip_receiver> 7000

The files are packed into a gzip-compressed tar stream on the sender, transported over the network, and extracted on the receiver. Because the stream is compressed (the z flag; plain tar does not compress by itself), the upload should take less time than sending the raw files.

Author Comment

ID: 24068093
Hi, I have not really used Linux, and all of this is Windows based.

Is there a Windows-based solution? I really have no idea what any of that means, Espeto.

Thanks again.

LVL 23

Accepted Solution

Mysidia earned 2000 total points
ID: 24073703
One possibility would be to switch to Avamar or some other solution that provides source-side deduplication to reduce the size of your backups <G>

Otherwise, find a dedicated server provider that will allow it: one that will give you physical access to your server (some will, although there may be a cost for access), or that will let you colocate your own server.

By using a colocation provider or leasing rack space with power and network, you own the server and can make arrangements to gain physical access to it, take it down, or reinstall it when you need to.
Most importantly, you can custom-build it and load it up with a hard drive in a hot-swap or internal bay containing the data, so you pre-seed the synchronization process.

The downside is, if anything physically goes wrong with your server, it's your problem, and getting remote hands can be expensive (so make sure you have out-of-band management to remotely power on/off, re-install the OS, monitor RAID health, etc.).

There are some upload tools that may help you move files.
Robocopy    http://technet.microsoft.com/en-us/magazine/2006.11.utilityspotlight.aspx
Deltacopy   http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp
ViceVersa http://www.tgrmn.com/

But if you're using the Ahsay software, you don't necessarily need those tools to set up a remote replica. The Ahsay software is advertised as capable of replicating to another server, and that is most likely the best way to sync files from your local machine to the remote server.

A 2 Mbit connection might not actually work if you keep your backups going to the local machine and try to keep re-synchronizing.

100 GB of data * 1024 MB/GB * 8 bits/byte = 819,200 megabits of data.

At an upload rate of 2 megabits per second, that would be 409,600 seconds, or a total of about 113.77 hours for 100 GB of data.
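The arithmetic above can be sketched as a quick shell calculation (same figures as the discussion; integer division, so the hours round down):

```shell
# Transfer-time estimate for 100 GB at 2 Mbit/s upstream
GB=100
MEGABITS=$((GB * 1024 * 8))    # 819200 megabits
SECS=$((MEGABITS / 2))         # 409600 seconds at 2 Mbit/s
HOURS=$((SECS / 3600))         # ~113 hours (113.77 before rounding)
echo "$MEGABITS megabits -> about $HOURS hours"
```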


For a backup plan, this is not very good. Typically people run nightly backups, and they don't want to lose more than 24 hours of data under any circumstances.

So if you generate 100 new gigabytes of data per day, you really should have about 10 megabits of upstream bandwidth.
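As a sanity check on that 10 megabit figure: a day is 86,400 seconds, so the required rate works out like this (integer arithmetic, so the result rounds down from about 9.48 Mbit/s):

```shell
# Upstream needed to push 100 GB of new data within 24 hours
DAILY_MEGABITS=$((100 * 1024 * 8))   # 819200 megabits per day
NEEDED=$((DAILY_MEGABITS / 86400))   # ~9 Mbit/s before headroom
echo "need at least $NEEDED Mbit/s sustained, so ~10 Mbit/s with headroom"
```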

However, it would be prudent to run the initial synchronization first and then determine, _after_ that initial burst, how much additional data you need to transfer.

Depending on your synchronization tool, you may have compression available that can help a great deal.

If you already have additional server equipment, you might want to test this on your LAN.

Sync up a second server with all the backup data, and use tools to carefully monitor the total traffic being uploaded to it; after a few nights you will have the approximate average usage per day.

Having this number will also help tell you if it's feasible to get a dedicated or collocated server and  use file sync tools to keep it in sync with yours.

Or whether you need to point your backup clients at the remote server instead, and use your local DOWNSTREAM bandwidth to pull data from the remote server to your local one, rather than your UPSTREAM bandwidth to push.

(I.e. the remote copy becomes the 'primary'.)


Author Closing Comment

ID: 31566606
Thanks for your help, worked great!!
