
What is the best way to upload large amounts of data?

Posted on 2009-04-04
Last Modified: 2012-05-06
Hi, I currently back up my customers' data to a local server using Ahsay backup software. Although I am sure the server I use is fit for purpose, I am considering moving to a dedicated server so that my own bandwidth is not used up and the customers' data is safer.

My only dilemma is that some of these accounts might be hundreds of GB, and when I have asked dedicated server companies whether I can ship the data to them on a USB hard drive, for example, their rules state that I cannot.

I am considering switching to BE unlimited broadband because they offer an upload speed of 2.5 Mbit/s. Are there any other broadband providers with a higher upload speed for this purpose, or is that ample to, say, FTP the data to the dedicated server? I am with Virgin, which doesn't help in the dynamic IP department, though I have got round that with dyndns.org.

I definitely want to go down the dedicated server route and have looked at ovh.co.uk. Has anybody had any dealings with this company? They seem pretty reasonable. Or is there another way to send large amounts of data to a dedicated server without purchasing, say, a leased line?

Thanks in advance.
Question by:wireless24

Expert Comment

ID: 24067996
For a Linux solution you can try netcat.

To transfer a large file, run the following commands.

On the receiving end:
# netcat -vvn -l -p 7000 -q 2 | tar xz

On the sending end:
# tar czpf - <filename> | netcat <ip_receiver> 7000 -vvn

The file is archived, transported, and extracted on the receiver in one pipeline. The `z` flag gzip-compresses the stream, so the upload time should be smaller.
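The author's follow-up below asks for a Windows option; the same compress-and-stream pattern is portable. Here is a minimal Python sketch of that idea (the function names and port are my own, not from any tool mentioned in this thread), assuming Python 3.8+:

```python
import gzip
import socket


def receive_file(port, out_path):
    """Listen for one connection, read a gzip-compressed stream,
    and write the decompressed bytes to out_path."""
    with socket.socket() as srv:
        srv.bind(("", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn, conn.makefile("rb") as raw, gzip.open(raw, "rb") as gz, \
                open(out_path, "wb") as out:
            while chunk := gz.read(64 * 1024):
                out.write(chunk)


def send_file(host, port, in_path):
    """Connect to the receiver and stream in_path, gzip-compressed."""
    with socket.create_connection((host, port)) as conn, \
            conn.makefile("wb") as raw, gzip.open(raw, "wb") as gz, \
            open(in_path, "rb") as src:
        while chunk := src.read(64 * 1024):
            gz.write(chunk)
```

Like the netcat pipeline, this only compresses in transit; highly compressible data (logs, databases) benefits far more than already-compressed backup archives.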

Author Comment

ID: 24068093
Hi, I have not really used Linux, and all of this is Windows based.

Is there a Windows-based solution? I really have no idea what any of that means, Espeto.

Thanks again.

LVL 23

Accepted Solution

Mysidia earned 500 total points
ID: 24073703
One possibility would be to switch to Avamar or some other solution that provides source-side deduplication to reduce the size of your backups <G>

Find a dedicated server provider that will allow it, that will allow you physical access to your server (some will, although there may be a cost for access), or that will allow you to colocate your own server.

By using a colocation provider, or leasing rack space with power and network, you own the server and can make arrangements to gain physical access to it, take it down, or reinstall it when you need to.
Most importantly, you can custom-build it and load it up with a hard drive in a hot-swap or internal bay containing the data, so you pre-seed the synchronization process.

The downside is that if anything physically goes wrong with your server, it's your problem, and remote hands can be expensive (so make sure you can get out-of-band management to remotely power on/off, reinstall the OS, monitor RAID health, etc.)

There are some upload tools that may help you move files.
Robocopy    http://technet.microsoft.com/en-us/magazine/2006.11.utilityspotlight.aspx
Deltacopy   http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp
ViceVersa http://www.tgrmn.com/

But if you're using the Ahsay software, you don't necessarily need those tools to set up a remote replica. The Ahsay software is advertised as capable of replicating to another server, and that is most likely the best way to sync files from your local machine to the remote server.

A 2 Mbit connection might not actually work if you keep your backups going to the local machine and try to keep re-synchronizing.

100 GB of data × 1024 MB/GB × 8 bits/byte = 819,200 megabits of data.

At an upload rate of 2 megabits per second, that is 409,600 seconds, or a total of about 113.77 hours, for 100 GB of data.


For a backup plan, this is not very good. Typically people run nightly backups, and they don't want to lose more than 24 hours of data under any circumstances.

So if you generate 100 new gigabytes of data per day, you really need about 10 megabits of upstream bandwidth.
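The arithmetic above generalizes to a small helper; a sketch (using the same 1 GB = 1024 MB convention as the figures above, helper name is my own):

```python
def transfer_hours(gigabytes: float, upload_mbps: float) -> float:
    """Hours needed to push `gigabytes` of data over an uplink of
    `upload_mbps` megabits per second (1 GB = 1024 MB, 8 bits/byte)."""
    megabits = gigabytes * 1024 * 8
    seconds = megabits / upload_mbps
    return seconds / 3600


# 100 GB over a 2 Mbit/s uplink: ~113.78 hours, matching the sums above
print(round(transfer_hours(100, 2), 2))
```

At 10 Mbit/s the same 100 GB takes about 22.76 hours, which is why that is roughly the floor for a nightly 100 GB delta.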

However, it would be prudent to run the initial synchronization first and then determine, _after_ that initial burst, how much additional data you need to transfer.

Depending on your synchronization tool, you may have compression available, which can help a great deal.

If you already have additional server equipment, you might want to test this on your LAN.

Sync up a second server with all the backup data, and use tools to carefully monitor the total traffic being uploaded to it; after a few nights you will have the approximate average usage per day.

Having this number will also help you decide whether it's feasible to get a dedicated or colocated server and use file sync tools to keep it in sync with yours.

Or whether you need to point your backup clients at the remote server instead, and use your local DOWNSTREAM bandwidth to pull data from the remote server to your local one, rather than your UPSTREAM bandwidth to push.

(I.e. the remote copy becomes the 'primary'.)
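Once you have logged a few nights of LAN uploads as suggested above, a quick calculation turns the average nightly delta into the upstream bandwidth you would need. A sketch with hypothetical sample figures (the measurements, helper name, and 10-hour backup window are all assumptions for illustration):

```python
def required_mbps(nightly_gb: float, window_hours: float = 10.0) -> float:
    """Upstream bandwidth (Mbit/s) needed to push one night's delta
    within the given backup window (1 GB = 1024 MB, 8 bits/byte)."""
    return nightly_gb * 1024 * 8 / (window_hours * 3600)


# Hypothetical measured uploads per night, in GB, from the LAN test
nightly = [4.2, 3.8, 5.1, 4.5]
avg = sum(nightly) / len(nightly)
print(f"avg {avg:.2f} GB/night -> {required_mbps(avg):.2f} Mbit/s needed")
```

If the required figure comes out well above your uplink's real-world upload speed, syncing to the remote server nightly is not feasible and the pull-from-remote arrangement above becomes more attractive.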


Author Closing Comment

ID: 31566606
Thanks for your help, worked great!!
