Solved

What is the best way to upload large amounts of data?

Posted on 2009-04-04
845 Views
Last Modified: 2012-05-06
Hi, I currently back up my customers' data to a local server using Ahsay backup software. Although I am sure the server I use is good for the purpose, I am considering moving to a dedicated server so my own bandwidth is not used up and so that customers' data is safer, etc. My only dilemma is that some of these accounts might be hundreds of GBs, and I have asked many of the dedicated server companies if I can ship the data to them on a USB hard drive, for example, but they have rules that state I cannot do this.

I am considering going with BE unlimited broadband because they have an upload speed of 2.5 Mbit. Are there any other broadband providers that have a higher upload speed for this purpose, or is that ample to, say, FTP the data to the dedicated server? I am with Virgin, so that doesn't really help in the department of dynamic IPs, though I have got round that with dyndns.org.

I definitely want to go down the route of a dedicated server and have looked at ovh.co.uk. Has anybody had any dealings with this company? They seem to be pretty reasonable. Or is there another way of sending large amounts of data to a dedicated server without purchasing, say, a leased line?

Thanks in advance.
Chris
Question by:wireless24
4 Comments
 
LVL 1

Expert Comment

by:Espeto
ID: 24067996
For a Linux solution you can try netcat.

To transfer a large file, run the following commands.

On the receiving end:
# netcat -vvn -l -p 7000 -q 2 | tar x

And on the sending end:
# tar cpf - <filename> | netcat <ip_receiver> 7000 -vvn

The file is packed with tar on the sender, streamed straight over the network, and unpacked on the receiver, so no intermediate copy is written to disk. Note that plain tar does not compress, so on its own this will not shrink the upload.
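If you do want compression on the wire, here is a minimal variant of the same pipeline (this assumes GNU tar and a netcat build that supports -q, such as netcat-traditional; the port number and the placeholders are just examples):

On the receiving end:
# netcat -vvn -l -p 7000 -q 2 | tar xzf -

And on the sending end:
# tar czpf - <filename> | netcat -vvn <ip_receiver> 7000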
 

Author Comment

by:wireless24
ID: 24068093
Hi, I have not really used Linux, and all of this is Windows-based.

Is there a Windows-based solution? I really have no idea what any of that means, Espeto.

Thanks again.

Chris
 
LVL 23

Accepted Solution

by: Mysidia (earned 500 total points)
ID: 24073703
One possibility would be to switch to Avamar or some other solution that provides source-side deduplication, to reduce the size of your backups <G>

Otherwise, find a dedicated server provider that will let you seed the data yourself: one that will give you physical access to your server (some will, although there may be a cost for access), or one that will allow you to colocate your own server.

By using a colocation provider or leasing rack space with power and network, you own the server and can make arrangements to gain physical access to it, take it down, or reinstall it when you need to.
Most importantly, you can custom-build it and load it up with a hard drive, in a hot-swap bay or internal bay, containing the data, so you pre-seed the synchronization process.

The downside is that if anything physically goes wrong with your server, it's your problem, and getting remote hands could be expensive (so make sure you have out-of-band management, so you can remotely power the server on and off, reinstall the OS, monitor RAID health, etc.).

There are some upload tools that may help you move files (a sample Robocopy command is sketched after this list):
Robocopy    http://technet.microsoft.com/en-us/magazine/2006.11.utilityspotlight.aspx
Deltacopy   http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp
ViceVersa http://www.tgrmn.com/
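As an illustration only (the paths, share name, and log location here are hypothetical; it assumes the remote server exposes a Windows share you can reach), a mirror-style Robocopy run might look something like:

robocopy D:\AhsayBackups \\remote-server\backups /MIR /Z /R:3 /W:10 /LOG:C:\robocopy-sync.log

/MIR mirrors the source tree (including deletions), /Z makes copies restartable after a dropped connection, and /R and /W limit retries and wait time so one failed file doesn't stall the whole run.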

But if you're using the Ahsay software, you don't necessarily need them to set up a remote replica. The Ahsay software is advertised as capable of replicating to another server, and this is most likely the best way to sync files from your local machine to the remote server.


A 2 Mbit connection might not actually be enough if you keep your backups going to the local machine and try to keep re-synchronizing.

100 GB of data × 1024 MB/GB × 8 bits per byte = 819,200 megabits of data.

At an upload rate of 2 megabits per second, that works out to 409,600 seconds, or a total of about 113.77 hours, for 100 GB of data (uncompressed).

For a backup plan, this is not very good. Typically people run nightly backups, and they do not want to lose more than 24 hours of data under any circumstances.

So if you generate 100 new gigabytes of data per day,  you really should have 10 megabits  of upstream bandwidth.
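If you want to re-run that arithmetic with your own numbers, here is a quick back-of-the-envelope sketch (it assumes a sustained link rate and ignores protocol overhead and compression; the variable names are just illustrative):

# rough upload-time estimate, POSIX shell
SIZE_GB=100
UPLOAD_MBIT=2
SECONDS_NEEDED=$(( SIZE_GB * 1024 * 8 / UPLOAD_MBIT ))   # 409600 seconds
echo "approx $(( SECONDS_NEEDED / 3600 )) hours"          # approx 113 hours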

However, it would be prudent to run the initial synchronization first and then determine _after_ that initial burst, how much additional data you are needing to transfer.


Depending on your synchronization tool, you may have compression available that can help a great deal.


If you already have additional server equipment, you might want to test this on your LAN.

Sync up a second server with all the backup data, and use tools to carefully monitor the total traffic being uploaded to it; after a few nights you will have the approximate average usage per day.

Having this number will also help tell you whether it's feasible to get a dedicated or colocated server and use file sync tools to keep it in sync with yours.
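One way to log that number on a Windows box is to sample the network interface counters overnight; as a rough sketch (this assumes the built-in typeperf tool; the interval, sample count, and output path are just examples):

typeperf "\Network Interface(*)\Bytes Sent/sec" -si 60 -sc 1440 -o C:\night-traffic.csv

That takes one sample per minute for 24 hours; summing the Bytes Sent/sec column and multiplying by 60 gives a rough total of bytes uploaded per day.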


Alternatively, you could point your backup clients at the remote server and use your local DOWNSTREAM bandwidth to pull data from the remote server to your local one, instead of your UPSTREAM bandwidth to push.

(I.e., the remote copy becomes the 'primary'.)

 

Author Closing Comment

by:wireless24
ID: 31566606
Thanks for your help, worked great!!