
SBS Backup - how fast?

On a SBS 2003 server I use its built-in backup utility to back up everything, including Exchange.

Last time a 400 GB backup took 12 hours, plus more for verification - total was 18 hours.
Machine is a 3.0 GHz, 4 GB RAM. Backup device is an external hard drive USB 2.0 connected directly to the server.

Is this a normal backup time? If not, what am I doing wrong? I used the standard SBS installation and backup settings, or at least I think I did...

Technology and Business Process Advisor
Most Valuable Expert 2013
Commented:
Backups generally aren't limited by CPU or RAM - the key factors are the performance of the drive, the type of data being backed up, the fragmentation of the disk, and other items.

If you stop and look at how this backup ran, you averaged 33.3 GB per hour, or roughly 570 MB per minute - which, by most standards, is pretty good.
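For the curious, the arithmetic can be sanity-checked quickly (figures taken from the question: 400 GB in 12 hours, verification excluded):

```python
# Sanity check of the throughput quoted above, using the question's figures.
size_gb = 400   # total backup size
hours = 12      # backup time, excluding verification

gb_per_hour = size_gb / hours              # about 33.3 GB/h
mb_per_minute = gb_per_hour * 1024 / 60    # about 569 MB/min, i.e. ~9.5 MB/s

print(round(gb_per_hour, 1), round(mb_per_minute))
```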

Personally, I don't like the SBS backup since it ALWAYS does FULL backups.  In my opinion, it should be doing periodic fulls and daily differentials.  (You can't easily/realistically back up 400 GB every night, yet that's what the backup would have you do).

You can use third party backup utilities to do differential backups OR you can use NT Backup outside of the wizards to do them.  I have a script I just modified that, in theory, should allow you to create scheduled, automated backups on a regular basis.
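As a rough illustration of running NTBackup outside the wizards, here is a sketch that assembles an ntbackup.exe command line for a differential job you could then schedule with Task Scheduler. The paths, job name, and selection file are hypothetical placeholders; /m, /j, /f, and /v are standard ntbackup switches.

```python
# Sketch: build an ntbackup.exe command line for a differential job run
# outside the SBS wizards. Paths, job name, and the .bks selection file
# below are hypothetical; the switches themselves are standard ntbackup.
def ntbackup_cmd(mode, selection_bks, job_name, target_bkf, verify=True):
    """Return the command line for a scheduled NTBackup job."""
    return (
        f'ntbackup backup "@{selection_bks}" '
        f'/m {mode} /j "{job_name}" /f "{target_bkf}" '
        f'/v:{"yes" if verify else "no"}'
    )

cmd = ntbackup_cmd("differential", "C:\\Backup\\data.bks",
                   "Nightly differential", "E:\\Backups\\data-diff.bkf")
print(cmd)
# Schedule the printed command with Task Scheduler (e.g. schtasks /create).
```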

Author

Commented:
thank you!
> I have a script I just modified that, in theory, should allow you to create scheduled, automated backups on a regular basis.

Could you share the script?
Lee W, MVP, Technology and Business Process Advisor
Most Valuable Expert 2013

Commented:
It is shared on my web site (link available in my profile) - but more directly, www.lwcomputing.com/mysoftware.asp
O. Pierru, System admin
Commented:
Hi,

Or you could still use the SBS backup tool every day, but exclude what is taking a lot of space on the *data* drives if that's possible (exclude only regular data like shared folders etc., but not databases or system-related data).
Then you can create an NTBackup script yourself to run full/differential backups covering only the excluded data, every night after the SBS backup is done. Not really elegant, but it works.
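To sketch that idea: NTBackup reads its selections from a .bks file (one path per line, saved as Unicode/UTF-16), so a small script could generate the selection file for the excluded-data job. The share paths below are hypothetical examples.

```python
# Sketch: generate an NTBackup selection (.bks) file covering only the data
# folders excluded from the nightly SBS backup. The share paths below are
# hypothetical; NTBackup expects the .bks file saved as Unicode (UTF-16).
excluded_shares = [
    "D:\\Shares\\Pictures\\",
    "D:\\Shares\\Presentations\\",
]

# Python's "utf-16" codec writes the byte-order mark NTBackup expects.
with open("data-only.bks", "w", encoding="utf-16") as f:
    for path in excluded_shares:
        f.write(path + "\n")
```

The resulting file would then be passed to ntbackup as `"@data-only.bks"`.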

What is taking so much space by the way?
Knowing that would help in determining a backup strategy.

And what kind of medium are you using to save the backups? (tapes? HDD?)

Author

Commented:
leew: thank you for the script.

Oliver: Most of the space is taken by pictures and presentations. The backup device is an external hard drive (1 TB) - or rather two of those - USB 2.0 connected directly to the server.

Now that I've learned more things (thanks), I'm thinking of doing this:
1. Run a full SBS backup every Friday night, then take this backup drive (A) to a safe place. Connect another backup drive (B) and delete all files on it - B would stay put all week.
2. Every other night, on B, run SBS backup to save the system, Exchange and the databases (all this max 30-40 GB) by excluding the rest. This backup would replace the existing file.
3. Also every other night, on B, after (2) completes, run NTBackup in incremental mode (takes less time than the differential mode) for data excluded at (2).
4. Next Friday night, still on B, full SBS backup. Take B to a safe place, replace with A, delete all files, and it's ready for (2) and (3) every other night.
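The rotation above can be made explicit with a small sketch (day handling and job names are illustrative only, not actual commands):

```python
import datetime

# Sketch of the weekly plan: Friday runs the full backup and triggers the
# drive swap; every other night runs the small SBS backup followed by an
# incremental NTBackup job for the excluded data. Job names are illustrative.
def tonight_jobs(day):
    if day.weekday() == 4:  # Friday
        return ["full SBS backup of everything",
                "swap backup drives (A <-> B) and clear the incoming drive"]
    return ["SBS backup: system + Exchange + databases (max 30-40 GB)",
            "NTBackup incremental of the excluded data"]

print(tonight_jobs(datetime.date(2009, 7, 10)))  # a Friday
print(tonight_jobs(datetime.date(2009, 7, 13)))  # a Monday
```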

Am I right? Did I miss anything? I'd appreciate any comments.
Lee W, MVP, Technology and Business Process Advisor
Most Valuable Expert 2013

Commented:
How often you back up depends entirely on the value of your data.  It's a little dated, but you might want to read my article on backup in general.  http://www.lwcomputing.com/tips/static/backup.asp
O. Pierru, System admin

Commented:
Good article leew.

--

Your strategy would be a good start.

I was suggesting differential backups because, as you may already know, if you run a full once a week, the differential jobs in between should be fast enough, and in case of disaster you only restore the full plus the latest differential instead of every incremental since the last full backup (and if one incremental .bkf is corrupt, you'll lose data).

About step 3: you need a full backup, say on Friday, and incrementals (or differentials) for the rest of the week. The next Friday a full job is run, and so on. Or you could do two full backups per month...
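To make the restore-chain trade-off concrete, here is a small sketch (the .bkf file names are hypothetical), assuming a full on Friday and one nightly job Saturday through Wednesday, with a restore needed on Thursday:

```python
# Compare how many .bkf files a restore needs after a Friday full backup
# and nightly jobs Saturday-Wednesday. File names are hypothetical.
nightly = ["sat", "sun", "mon", "tue", "wed"]  # nights since the last full

# Incrementals: restore the full, then EVERY incremental since it.
incremental_chain = ["full-friday.bkf"] + [f"inc-{d}.bkf" for d in nightly]

# Differentials: restore the full, then ONLY the latest differential.
differential_chain = ["full-friday.bkf", f"diff-{nightly[-1]}.bkf"]

print(len(incremental_chain), len(differential_chain))
```

This is why a corrupt file hurts more with incrementals: any broken link in the chain loses everything after it, while a differential chain is only ever two files long.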