Solved

Backing up remote server 4MB/min max...  Backup Exec 10.1d

Posted on 2006-11-15
Medium Priority
287 Views
Last Modified: 2010-04-03
I have a remote location connected via a 4MB/min connection and need a backup solution. I currently use Backup Exec 10.1d and run backups once a week. They typically take about 4-5 days to complete, as I am backing up about 23 GB.
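A quick sanity check on those numbers (assuming "4mb/min" means 4 megabytes per minute) shows the 4-5 day figure is about what you'd expect:

```python
# Back-of-the-envelope transfer time for the weekly full backup
data_gb = 23
rate_mb_per_min = 4                 # assumed: megabytes per minute

total_mb = data_gb * 1024           # 23 GB expressed in MB
minutes = total_mb / rate_mb_per_min
days = minutes / (60 * 24)
print(f"{minutes:.0f} minutes = {days:.1f} days")   # → 5888 minutes = 4.1 days
```

So the link itself, not the software, sets the floor on the backup window.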

This is a Dell 2850 with a RAID 0/5 setup and plenty of disk space for local backups. The Backup Exec client is installed on the remote server. I created backup-to-disk folders locally and submitted the jobs, but the data seems to be transmitted across the WAN to the media server and back, cutting my throughput down to 2MB/min. Any suggestions to make this work?

Another idea I had was to set up the remote server as a secondary Backup Exec server. I could then back up locally (at full speed) and transfer to tape once a month or so.

Lastly, we did purchase a Replication Manager add-on. The intent was to replicate the data across the WAN and then back up from the replicated server. Has anyone tried this approach?

Thanks in advance...
Question by:furrbish
2 Comments
 
LVL 56

Accepted Solution

by: Handy Holder (earned 500 total points)
ID: 17950044
Low-bandwidth WANs are ideal for the synthetic backup you get with the Advanced Disk-based Backup Option. If the backup server has both disk and tape, and the remote client sits across the WAN link, you can take a full backup once every few months and incremental or differential backups daily; a synthetic full backup can then be assembled from an old full backup plus the incrementals.

If you look at the options under incremental backup, you'll see there's one that collects extra data for synthetic backup (it records which files have been deleted). When it creates a synthetic backup, it takes the old disk-based full backup and copies it to tape, but it checks the incrementals or differentials to see which files were deleted and skips those, and it copies only the most recent version of any changed file.
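The merge logic described above can be sketched roughly like this (a simplified illustration, not Backup Exec's actual implementation; the file-set and deletion-record shapes are stand-ins):

```python
# Sketch of building a synthetic full backup: start from the old full,
# replay incrementals in order, applying changed files and recorded deletions.
def synthetic_full(old_full, incrementals):
    """old_full: {filename: version}.
    incrementals: ordered list of (changed: {filename: version},
                                   deleted: set of filenames)."""
    result = dict(old_full)
    for changed, deleted in incrementals:
        result.update(changed)      # newest version of each changed file wins
        for name in deleted:        # files deleted since the last full
            result.pop(name, None)  # are not copied to tape
    return result

full = {"a.txt": 1, "b.txt": 1}
incs = [({"a.txt": 2}, set()),      # week 1: a.txt changed
        ({"c.txt": 1}, {"b.txt"})]  # week 2: c.txt added, b.txt deleted
print(synthetic_full(full, incs))   # → {'a.txt': 2, 'c.txt': 1}
```

The point is that the expensive full-backup data never crosses the WAN again; only the small daily deltas do.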

Getting only half the bandwidth is not surprising if you have a long-distance, high-latency link. (I don't know whether you mean megabits or megabytes per minute, because you've used lower case, and I'm sure you're not talking about millibits.) Backup Exec uses TCP rather than UDP, so periodic acknowledgements are required; you can sometimes get more throughput by increasing the TCP window size.
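The window-size point follows from TCP allowing at most one window of unacknowledged data in flight per round trip, so throughput is capped at window/RTT. A rough illustration (the 200 ms RTT and window values are assumed figures, not measurements from your link):

```python
# TCP throughput ceiling: at most one window in flight per round trip,
# so max throughput = window_size / RTT (ignoring loss and slow start).
def max_throughput_mb_per_min(window_bytes, rtt_seconds):
    bytes_per_sec = window_bytes / rtt_seconds
    return bytes_per_sec * 60 / 1_000_000

# Assume a 200 ms WAN round trip
print(round(max_throughput_mb_per_min(17_520, 0.2), 1))   # small window → 5.3
print(round(max_throughput_mb_per_min(65_535, 0.2), 1))   # ~64 KB window → 19.7
```

With numbers in that range, a small default window alone can hold you to single-digit MB/min regardless of the pipe's raw capacity.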

Replication is an ideal solution. If you've got something like DoubleTake, you can clone the current OS to the remote server and use DoubleTake or similar to replicate all the files, updating them every time a file is altered. Because these products work at the raw disk level, only changed disk blocks are replicated: if you edit a 1GB file and add 100MB of new information, there's only 100MB of data to replicate, whereas an incremental backup would make a new copy of the entire 1GB file.
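The block-level idea can be shown in miniature: compare the two disk images block by block and ship only the blocks whose bytes differ (a toy sketch, not how DoubleTake actually works internally):

```python
# Toy block-level delta: only blocks whose bytes differ need replicating.
def changed_blocks(old: bytes, new: bytes, block: int = 4):
    """Return {block_index: new_block_bytes} for blocks that differ."""
    delta = {}
    for i in range(0, len(new), block):
        if old[i:i+block] != new[i:i+block]:
            delta[i // block] = new[i:i+block]
    return delta

old = b"AAAABBBBCCCC"
new = b"AAAAXXXXCCCC"              # only the middle block changed
print(changed_blocks(old, new))    # → {1: b'XXXX'}
```

A file-level incremental would resend all 12 bytes here; the block-level delta sends 4.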
 
LVL 88

Assisted Solution

by: rindi (earned 500 total points)
ID: 17950296
I would use a tape drive at the remote location and back up to that. Remote backups of the size you are doing simply take too long without a faster connection. You can always control the backup using Terminal Services.

Another option would be to use rsync, which only copies the differences relative to what you already have at your location (though the initial backup will still take time, and it won't be much faster if the data you are backing up changes too heavily). Although rsync is a *nix product, you should also be able to use it on Windows; you'll find info about that via the link.

http://samba.anu.edu.au/rsync/
