What is the best way to upload large amounts of data?

Hi, I currently back up my customers' data to a local server using Ahsay backup software. Although I am sure the server I use is good for the purpose, I am considering a dedicated server so my own bandwidth is not used up and so that customers' data is safer.

My only dilemma is that some of these accounts might be hundreds of GBs, and the dedicated server companies I have asked all have rules stating that I cannot ship the data to them on a USB hard drive, for example. I am considering going with BE unlimited broadband because they have an upload speed of 2.5 Mb. Are there any other broadband providers with a higher upload speed for this purpose, or is that ample to, say, FTP the data to the dedicated server? I am with Virgin, so that doesn't really help in the department of dynamic IPs, though I have got round that with dyndns.org.

I definitely want to go down the route of a dedicated server and have looked at ovh.co.uk. Has anybody had any dealings with this company? They seem to be pretty reasonable. Or is there another way of sending large amounts of data to a dedicated server without purchasing, say, a leased line?

Thanks in advance.
Mysidia commented:
One possibility would be to switch to Avamar or some other solution that provides source-side deduplication, to reduce the size of your backups <G>

Otherwise, find a dedicated server provider that will allow it, that will allow you physical access to your server (some will, although there may be a cost for access), or that will let you colocate your own server.

By using a colocation provider or leasing rack space with power and network, you own the server and can make arrangements to gain physical access to it, take it down, or reinstall it when you need to.
Most importantly, you can custom-build it and load it up with the data on a hard drive in a hot-swap or internal bay, so you pre-seed the synchronization process.

The downside is that if anything physically goes wrong with your server, it's your problem, and getting remote hands can be expensive (so make sure you get out-of-band management to remotely power the machine on and off, reinstall the OS, monitor RAID health, etc.).

There are some upload tools that may help you move files.
Robocopy    http://technet.microsoft.com/en-us/magazine/2006.11.utilityspotlight.aspx
Deltacopy   http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp
ViceVersa http://www.tgrmn.com/

But if you're using the Ahsay software, you don't necessarily need any of these tools to set up a remote replica. The Ahsay software is advertised as being capable of replicating to another server, and that is most likely the best way to sync files from your local machine to the remote one.

A 2 Mbit connection might not actually keep up if you keep sending backups to the local machine and try to keep re-synchronizing.

100 GB of data × 1024 MB/GB × 8 bits/byte = 819,200 megabits of data.

At an upload rate of 2 megabits per second, that is 409,600 seconds, or a total of about 113.8 hours, for 100 GB of data.
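As a quick sanity check, a few lines of Python reproduce that estimate (the 100 GB and 2 Mb/s figures are the ones from the calculation above):

```python
# Estimate how long a full 100 GB upload takes on a 2 Mb/s upstream link.
size_gb = 100
megabits = size_gb * 1024 * 8        # 100 GB = 819,200 megabits
upload_mbps = 2                      # upstream rate in megabits per second

seconds = megabits / upload_mbps     # 409,600 seconds
hours = seconds / 3600
print(f"{hours:.1f} hours")          # roughly 113.8 hours
```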


For a backup plan, this is not very good. People typically run nightly backups, and they don't want to lose more than 24 hours of data under any circumstances.

So if you generate 100 new gigabytes of data per day, you really need roughly 10 megabits of upstream bandwidth.
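The 10-megabit figure falls out of the same arithmetic: divide the day's data volume by the 24-hour window it has to fit into. A minimal sketch, assuming the 100 GB/day figure above:

```python
# Minimum sustained upstream rate to move 100 GB within a 24-hour window.
daily_gb = 100
window_seconds = 24 * 3600           # one nightly backup window

required_mbps = daily_gb * 1024 * 8 / window_seconds
print(f"{required_mbps:.1f} Mb/s")   # about 9.5 Mb/s, so ~10 Mb in practice
```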

However, it would be prudent to run the initial synchronization first and then determine, _after_ that initial burst, how much additional data you actually need to transfer each day.

Depending on your synchronization tool, you may have compression available, which can help a great deal.

If you already have additional server equipment, you might want to test this on your LAN.

Sync up a second server with all the backup data, and use tools to carefully monitor the total traffic being uploaded to it. After a few nights you will have the approximate average usage per day.

Having this number will also tell you whether it's feasible to get a dedicated or colocated server and use file sync tools to keep it in sync with yours.
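The nightly-monitoring idea boils down to simple arithmetic. A sketch with made-up counter readings (the byte counts here are hypothetical, purely to show the calculation):

```python
# Hypothetical per-night upload totals, e.g. read from the test
# server's network interface byte counters after each backup run.
nightly_bytes = [4.2e9, 3.6e9, 5.1e9, 4.5e9]   # four nights of samples

avg_gb_per_night = sum(nightly_bytes) / len(nightly_bytes) / 1e9

# Hours needed to push that average nightly delta over a 2 Mb/s uplink:
hours_at_2mbps = avg_gb_per_night * 8e9 / 2e6 / 3600

print(f"avg {avg_gb_per_night:.2f} GB/night, "
      f"{hours_at_2mbps:.1f} h at 2 Mb/s")
```

If the resulting transfer time fits comfortably inside your nightly window, the sync-to-remote approach is viable at that bandwidth.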

Or whether you need to point your backup clients at the remote server and use your local DOWNSTREAM bandwidth to pull data from the remote server to your local one, instead of your UPSTREAM bandwidth to push.

(I.e., the remote copy becomes the 'primary'.)

For a Linux solution you can try netcat.

To transfer a large file, run the following commands.

On the receiving end:
# netcat -vvn -l -p 7000 -q 2 | tar x

And on the sending end:
# tar cpf - <filename> | netcat <ip_receiver> 7000 -vvn

The file is packed into a tar stream, sent over the network, and extracted on the receiver. If you also compress the stream (e.g. tar czpf on the sender and tar xz on the receiver), the upload time can be reduced.
wireless24 (Author) commented:
Hi, I have not really used Linux, and all of this is Windows-based.

Is there a Windows-based solution? I really have no idea what any of that means, Espeto.

Thanks again.

wireless24 (Author) commented:
Thanks for your help, worked great!!
Question has a verified solution.
