Recommended size of Backup (.BKF) files

Hi,

I want to back up my client's server to a USB hard drive.  They're currently using about 100 GB total, and that's sure to grow.  Since SBS Backup and NTBackup use no compression, a full backup would create one huge file.  Are single .bkf files of that size (100+ GB) stable?  I know Backup Exec lets you break the backup into smaller chunks (1 GB, or a size of your choice).  Is that because it's safer?

I like the single file because if you need to restore from an old backup (and to a different machine), you only need to catalog the one file rather than dozens (if they were 2 GB chunks).

What are your expert opinions and experiences on this?  With data volumes growing, I'm sure this will become an even more relevant issue.

Thanks,
Charles
CHRube Asked:

rindi Commented:
It is easier to back up smaller files to other media, like CD-Rs. Also, if one of the .bkf files gets corrupted, you should still be able to extract most of the data from the rest. Another reason for creating smaller chunks is that some destination filesystems don't support large file sizes (FAT32 supports a maximum file size of 4 GB). I've also seen problems when trying to copy a large file from DVD or CD media.
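If you do end up with one big .bkf and need to move it onto a FAT32 drive, you can split it into sub-4 GB pieces yourself. A rough sketch in Python (the paths are placeholders; nothing about this is specific to NTBackup):

```python
# split_bkf.py -- split a big .bkf into FAT32-friendly pieces.
import os

CHUNK_SIZE = 2 * 1024 ** 3    # 2 GB per piece, safely under FAT32's 4 GB cap
BLOCK = 64 * 1024 * 1024      # stream in 64 MB blocks instead of holding 2 GB in RAM

def split_file(src, dst_dir):
    os.makedirs(dst_dir, exist_ok=True)
    part = 0
    with open(src, "rb") as f:
        while True:
            block = f.read(min(BLOCK, CHUNK_SIZE))
            if not block:
                break  # end of the source file
            out_path = os.path.join(dst_dir, "%s.%03d" % (os.path.basename(src), part))
            with open(out_path, "wb") as out:
                written = 0
                while block:
                    out.write(block)
                    written += len(block)
                    if written >= CHUNK_SIZE:
                        break  # this piece is full; start the next one
                    block = f.read(min(BLOCK, CHUNK_SIZE - written))
            part += 1

if __name__ == "__main__":
    split_file(r"D:\Backups\full.bkf", r"E:\bkf_parts")   # hypothetical paths
```

To reassemble on a Windows box, concatenate the pieces in order, e.g. copy /b full.bkf.000+full.bkf.001 full.bkf.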

gurutc Commented:
I'm with rindi.  Smaller chunks are always better.  On the 2 TB enterprise backups that I do, I always chunk up the job.  Good backup utilities will let you catalog and restore data from the other chunks even if one is corrupt.  And you never know when you'll be trying to restore data on a machine without NTFS support.  If you keep the chunk sizes below 4 GB and put them on a FAT32 USB drive, you'll be able to read or write them from almost any OS.  If you go bigger, requiring NTFS, then you're hosed if you want to safely access the data, even just to copy it, while running Linux, OS X, or DOS/Win98 (yes, I still need and use it every day).
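One habit that makes the corrupt-chunk scenario less painful: hash every chunk right after the backup finishes, so at restore time you know immediately which piece went bad instead of finding out halfway through. A rough, vendor-neutral sketch (the manifest filename is just my own convention):

```python
# manifest.py -- record and verify a checksum for every backup chunk.
import hashlib
import os

MANIFEST = "manifest.sha1"

def sha1_of(path, block=1024 * 1024):
    h = hashlib.sha1()
    with open(path, "rb") as f:
        while True:
            data = f.read(block)
            if not data:
                break
            h.update(data)
    return h.hexdigest()

def write_manifest(chunk_dir):
    # Run this right after the backup job completes.
    with open(os.path.join(chunk_dir, MANIFEST), "w") as out:
        for name in sorted(os.listdir(chunk_dir)):
            if name != MANIFEST:
                out.write("%s  %s\n" % (sha1_of(os.path.join(chunk_dir, name)), name))

def check_manifest(chunk_dir):
    # Run this before attempting a restore; flags any chunk that changed.
    with open(os.path.join(chunk_dir, MANIFEST)) as f:
        for line in f:
            digest, name = line.split(None, 1)
            name = name.strip()
            status = "OK" if sha1_of(os.path.join(chunk_dir, name)) == digest else "CORRUPT"
            print(status, name)
```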

No points for me, do what rindi says.  He's pretty clever.  See his name to the left over there in the margin?

<<<---

- Travis
CHRube (Author) Commented:
Thanks guys.

Okay, so far the consensus is that smaller is definitely better.  For small clients I was considering SBS Backup, which doesn't really give you much choice and creates just one backup file (which, as of last check, was around 35 GB).  I'll treat the ability to create smaller slices as an important feature to look for in a backup app.  I've become somewhat hesitant about FAT32, but that's an interesting point about flexibility and access.  I'm going to wait a bit longer to see if anyone else has other real-life experience with this issue, particularly regarding SBS.

It's a bit annoying that backup software costs as much as, or more than, the whole Small Business Server suite.

Charles
gurutc Commented:
SBS Backup is fine.  Use it.  But configure separate backup jobs for the parts of your data and OS set: do one backup of the OS and system state, one of applications, one of data, and so on.  Then repeat to a second, separate backup device.  A big internal spare drive and an external USB drive could be the two backup destinations.
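If you end up scripting it rather than clicking through the wizard, the shape of it is something like this (a sketch only: the paths and job names are invented, and while /J, /F, /M, and the systemstate keyword are standard ntbackup.exe switches, double-check them with ntbackup /? on your box):

```python
# jobs.py -- run separate NTBackup jobs, repeated to two destinations.
import subprocess

DESTS = [r"E:\Backups", r"F:\Backups"]        # e.g. internal spare + external USB

JOBS = [
    ("SystemState", "systemstate",        "systemstate.bkf"),
    ("Apps",        r"C:\Program Files",  "apps.bkf"),
    ("Data",        r"D:\Data",           "data.bkf"),
]

for dest in DESTS:
    for job_name, source, bkf in JOBS:
        subprocess.run([
            "ntbackup", "backup", source,
            "/J", job_name,                    # job name, shows up in the logs
            "/F", "%s\\%s" % (dest, bkf),      # write to a .bkf file, not tape
            "/M", "normal",                    # full backup; clears archive bits
            "/V:yes",                          # verify after backup
        ], check=True)
```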

Also use a separate imaging package, like rindi's favorite, Acronis True Image, or my standby, Ghost, running from an Ultimate Boot CD (www.ubcd4win.com), to image the server.  That way you can get it back up quickly and then restore your data if it really blows up (which, thankfully, all computers eventually do, which feeds my family).

Luck,
Travis
CHRube (Author) Commented:
It doesn't look like there will be many more comments, so I'll award/split the points shortly.  But I did want to comment on your last point about SBS Backup: it's very limited.  The backup built into the SBS interface is either on or off, and it makes one 'large' file per backup.  While you can exclude folders, you can't break it up into multiple backup jobs.  Note that it is basically just a front end to NTBackup, which you can use directly instead of 'SBS Backup' to create multiple different jobs (including differentials and incrementals).  That's basically what I've done for one client, because the full backup SBS makes every night fills up the external hard drive way too quickly.
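In case it helps anyone later, the rough shape of that setup is one full per week and differentials the other nights, each run kicked off by Task Scheduler. A sketch with made-up paths (/M normal is a full that clears archive bits; /M differential grabs everything changed since the last full):

```python
# nightly.py -- weekly full plus nightly differentials via NTBackup.
import datetime
import subprocess

FULL_DAY = 4   # Friday (Monday == 0)

today = datetime.date.today()
backup_type = "normal" if today.weekday() == FULL_DAY else "differential"

subprocess.run([
    "ntbackup", "backup", r"D:\Data",                       # hypothetical source
    "/J", "Nightly %s" % backup_type,
    "/F", r"E:\Backups\data-%s-%s.bkf" % (backup_type, today.strftime("%Y%m%d")),
    "/M", backup_type,
], check=True)
```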

Thanks for your insights,
Charles
gurutc Commented:
I agree with you on the SBS backup; my brain was locking up.  Use NTBackup natively rather than the SBS front end.  You seem pretty slick and knowledgeable, so you should be able to implement a good backup scheme.

Travis
gurutc Commented:
cool with me - gurutc