
Solved

what is the best way to upload large amounts of data?

Posted on 2009-04-04
Medium Priority
859 Views
Last Modified: 2012-05-06
Hi, I currently back up my customers' data to a local server using Ahsay backup software. Although I am sure the server I use is good for the purpose, I am considering moving to a dedicated server so that my own bandwidth is not used up and so that customers' data is safer.

My only dilemma is that some of these accounts might be hundreds of GB, and I have asked many of the dedicated server companies whether I can ship the data to them on a USB hard drive, for example, but their rules state that I cannot do this.

I am considering going with Be unlimited broadband because they offer an upload speed of 2.5 Mbit. Are there any other broadband providers with a higher upload speed for this purpose, or is this ample to, say, FTP the data to the dedicated server? I am with Virgin, so that doesn't really help in the department of dynamic IPs, though I have got round that with dyndns.org.

I definitely want to go down the route of a dedicated server and have looked at ovh.co.uk. Has anybody had any dealings with this company? They seem to be pretty reasonable. Or is there another way of sending large amounts of data to a dedicated server without purchasing, say, a leased line?

Thanks in advance.
Chris
Question by:wireless24
4 Comments
 
LVL 1

Expert Comment

by:Espeto
ID: 24067996
For a Linux solution you can use netcat.

To transfer a large file, run the following commands.

On the receiving end:
# netcat -vvn -l -p 7000 -q 2 | tar xz

And on the sending end:
# tar czpf - <filename> | netcat <ip_receiver> 7000 -vvn

The file is archived with tar, compressed with gzip (the z flag), streamed over the network, and extracted on the receiver. Because the stream is compressed, the upload time should be smaller.
 

Author Comment

by:wireless24
ID: 24068093
Hi, I have not really used Linux, and all of this is Windows based.

Is there a Windows-based solution? I really have no idea what any of that means, Espeto.

Thanks again.

Chris
 
LVL 23

Accepted Solution

by:
Mysidia earned 2000 total points
ID: 24073703
One possibility would be to switch to Avamar or some other solution that provides source-side dedup to reduce the size of your backups <G>

Find a dedicated server provider that will allow it: one that will give you physical access to your server (some will, although there may be a cost for access), or that will let you colocate your own server.

By using a colocation provider or leasing rack space with power and network, you own the server and can make arrangements to gain physical access to it, take it down, or reinstall it when you need to.
Most importantly, you can custom-build it and load it up with a hard drive in a hot-swap or internal bay containing the data, so you pre-seed the synchronization process.

The downside is that if anything physically goes wrong with your server, it's your problem, and getting remote hands can be expensive (so make sure you get out-of-band management, to remotely power on/off, reinstall the OS, monitor RAID health, etc.).

There are some upload tools that may help you move files.
Robocopy    http://technet.microsoft.com/en-us/magazine/2006.11.utilityspotlight.aspx
Deltacopy   http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp
ViceVersa http://www.tgrmn.com/

But if you're using the Ahsay software, you don't necessarily need these tools to set up a remote replica. The Ahsay software is advertised as capable of replicating to another server, and this is most likely the best way to sync files from your local machine to the remote server.


A 2 Mbit connection might not actually work if you keep your backups going to the local machine and try to keep re-synchronizing.

100 GB of data × 1024 MB/GB × 8 bits/byte = 819,200 megabits of data.

At an upload rate of 2 megabits per second, that is 409,600 seconds, or a total of about 113.8 hours for 100 GB of data (uncompressed).

For a backup plan, this is not very good. Typically people run nightly backups, and they do not want, under any circumstances, to lose more than 24 hours of data.

So if you generate 100 new gigabytes of data per day, you really need roughly 10 megabits of upstream bandwidth.
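The arithmetic above can be checked with a short calculation (a sketch; the 100 GB data set and 2 Mbit/s upload figures are taken from this thread, and the nightly window is assumed to be 24 hours):

```python
# Back-of-the-envelope transfer-time check for the figures above.
gigabytes = 100                      # data set size in GB
megabits = gigabytes * 1024 * 8      # 1 GB = 1024 MB, 1 byte = 8 bits
upload_mbps = 2                      # upload rate in megabits per second

seconds = megabits / upload_mbps
hours = seconds / 3600
print(f"{megabits} megabits at {upload_mbps} Mbit/s: {seconds:.0f} s = {hours:.1f} h")

# Rate needed to move the whole set within one 24-hour backup window:
needed_mbps = megabits / 86400
print(f"~{needed_mbps:.1f} Mbit/s needed for a nightly window")
```

This reproduces the numbers above: 819,200 megabits, roughly 113.8 hours at 2 Mbit/s, and just under 10 Mbit/s to fit a daily window.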

However, it would be prudent to run the initial synchronization first and then determine, _after_ that initial burst, how much additional data you actually need to transfer.


Depending on your synchronization tool, you may have compression available, which can help a great deal.
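As a toy illustration of how much compression can matter (using Python's standard zlib module; the sample data is made up, and real-world ratios depend entirely on the data — already-compressed files such as JPEGs barely shrink at all):

```python
import zlib

# Highly repetitive data (logs, database dumps) compresses very well.
repetitive = b"backup record 0001\n" * 10000
compressed = zlib.compress(repetitive, 6)

ratio = len(repetitive) / len(compressed)
print(f"{len(repetitive)} bytes -> {len(compressed)} bytes (~{ratio:.0f}x smaller)")
```

If your data compresses even 2–3x, the transfer-time estimates above shrink proportionally.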


If you already have additional server equipment, you might want to test this on your LAN.

Sync up a second server with all the backup data, and use tools to carefully monitor the total traffic being uploaded to it; after a few nights you will have the approximate average usage per day.

Having this number will also help tell you whether it's feasible to get a dedicated or colocated server and use file-sync tools to keep it in sync with yours.

Alternatively, you could point your backup clients at the remote server and use your local DOWNSTREAM bandwidth to pull data from the remote server to your local one, instead of your UPSTREAM bandwidth to push.

(I.e., the remote copy becomes the 'primary'.)

 

Author Closing Comment

by:wireless24
ID: 31566606
Thanks for your help, worked great!!

Question has a verified solution.

