Solved

Backup software recommendation and strategy

Posted on 2000-03-08
15
216 Views
Last Modified: 2013-12-15
I am planning on deploying some production Samba servers and would appreciate some guidance on tape backup software. It would primarily be used to back up the SMB shares. We are interested in open-file backups the way St. Bernard does them. These are all low-end department servers, so price is important. We have used HP 4mm DAT drives that work OK, but the externals are a little expensive and the internals clog often.
So I would like guidance on software and hardware. We seldom have more than 1 to 4 GB of data.
Question by:davidpm
15 Comments
 
LVL 1

Expert Comment

by:hbrady
I have been using BRU for some time now and it is still the best I can find. I have had very good luck with my HP 4/8GB DAT DDS2 drive (internal SCSI). I would install BRU (http://www.estinc.com/) and one tape drive on one machine, then mount all of the SMB shares via smbmount on that one server. You can then back up all of the SMB shares on one tape. You could also use NFS to back up those shares across the network. Good luck.
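
A rough sketch of that approach (the server names, mount points, credentials, and tape device here are just examples, and this assumes the Samba 2.x smbmount syntax; BRU would simply be pointed at the same mount points, tar is shown for illustration):

    # Mount each department's SMB share on the backup host
    mkdir -p /backup/dept1 /backup/dept2
    smbmount //server1/dept1 /backup/dept1 -o username=backup,password=secret
    smbmount //server2/dept2 /backup/dept2 -o username=backup,password=secret

    # Write everything to a single SCSI tape
    tar cvf /dev/st0 /backup/dept1 /backup/dept2

    # Unmount the shares when the backup finishes
    smbumount /backup/dept1
    smbumount /backup/dept2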

 

Author Comment

by:davidpm
I really appreciate the recommendation, but what about open file management? Does BRU handle that?
 
LVL 1

Expert Comment

by:hbrady
David:

I don't have the "Official Answer"; however, I back up databases that are in use every night, as well as a whole pile of documents, e-mail, etc., that I leave open. I run it on 3 servers and it's great. I have never had any errors based on files being "open". I actually have a spare copy of the commercial version that I was about to post on eBay. Let me know if you want to give it a shot and I'll give it to you for about 1/3 of the retail cost.

You can check the FAQs at:
http://www.estinc.com/FAQ-2.html

Feel free to reject again if you want more feedback; I just get tired of losing points when someone says the same thing as my "Comment" in an answer :-)


 

Author Comment

by:davidpm
I don't begrudge you the points, h, and appreciate your input. What I was hoping for, however, was a little more history. IOW: "I used tar, it did this well but needed this feature; I tried xx, it worked like blah-blah; I'm currently using y because of blah but am looking at z, which looks interesting."
BRU seems to be the obvious answer, the most marketed, etc., so I am just making sure I have covered all the bases.
Send me the poop by private mail on your for-sale copy. Always looking for a bargain.
 
LVL 1

Expert Comment

by:hbrady
Davidpm:

Send me an e-mail at hbrady77@earthlink.net so I will have your e-mail address. Once I have your address, I will send you the scoop on the BRU I have.
 
LVL 2

Expert Comment

by:mapc
Try
http://www.amanda.org/

This should answer your question.
 
LVL 40

Expert Comment

by:jlevie
Amanda has the "open file" problem because it uses the native dump facilities, which have the "open file" problem as far as I know. There was, at one point, a modified version of BSD dump around that could deal with the "open file" problem by detecting that a file changed in size during the dump. I've not been able to find a document that explicitly states that Linux dump intelligently handles this case. Unfortunately, the time that you find out that your backup system isn't doing the right thing is when disaster has struck and you really, really need those backups to be good.

Enterprise backup systems, like Legato Networker, have special facilities to handle things like databases that are always open, and Networker will tell you if an ordinary file changed size during backup. Networker does expect you to keep track of save sets that are suspect, but at least you do have the information available to make an informed decision.
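
A crude version of that kind of check can be scripted around whatever engine is in use. This is only a sketch (the paths, the tape device, and the use of tar are examples, not anything Networker-specific):

    #!/bin/sh
    # Mark the start of the backup window
    STAMP=/tmp/backup-stamp.$$
    touch $STAMP

    # Run the backup (tar shown here; substitute dump, BRU, etc.)
    tar cf /dev/st0 /export/shares
    STATUS=$?

    # Anything modified after the stamp changed while the backup ran,
    # so its copy on tape is suspect
    find /export/shares -type f -newer $STAMP -print > /tmp/suspect-files
    if [ -s /tmp/suspect-files ]; then
        echo "Files modified during the backup (tape copies are suspect):"
        cat /tmp/suspect-files
    fi

    rm -f $STAMP
    exit $STATUS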

 

Author Comment

by:davidpm
jlevie: do you use Legato Networker for your production machines exclusively? Are other products or procedures involved?
 
LVL 40

Expert Comment

by:jlevie
I've got four different networks that all use Networker as the backup engine, and most of the networks have a mix of Unix and Windows clients. The good things about Networker are: it's designed to be a multi-client enterprise backup system and has the performance and features to do so; it can support a wide range of clients and servers; it supports just about every type of backup media, from single tape drives all the way through terabyte tape libraries; and it's extremely reliable and has an excellent support system. For commercial databases and other special applications there are special modules available that can intelligently back up those items.

What Networker doesn't do well is to restore an OS without having a minimal OS available. If you think about it, this isn't overly surprising as you need quite a bit of functionality to run something like Networker. On my Unix boxes I solve this by taking a native filesystem dump of just the OS when appropriate, like after install, or when some major upgrade has been done. I can restore the OS from that dump and then use Networker to recover the dynamic data (like password files, automount data, etc) and bring the system up to date.
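
For what it's worth, that "native filesystem dump of just the OS" step is the ordinary dump/restore pair. A minimal sketch for a Linux box (the device names are examples; on Solaris the equivalent would be ufsdump and ufsrestore):

    # After install or a major upgrade: level 0 dump of the root filesystem
    dump -0uf /dev/nst0 /

    # Disaster recovery: boot a minimal system from install/rescue media,
    # rebuild and mount the root filesystem, then pull the dump back
    mke2fs /dev/sda1
    mount /dev/sda1 /mnt
    cd /mnt
    restore -rf /dev/nst0
    # ...then use Networker to bring the dynamic data back up to date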

Networker can be a bit intimidating on first glance. There are a lot of things it can do and a lot of ways to set it up. But when it's been set up properly for what you want it to do it pretty much runs itself (especially if you have tape libraries). We don't find that we need anything other than Networker and the native system tools to do everything we need done.
 

Author Comment

by:davidpm
I'm still so used to NT with the magic winnt folder.

Does this mean that you do not back up the boot partition? You just reload the boot partition from the original media? I would think there would have been patches, etc., you'd want to save.

 
LVL 40

Accepted Solution

by:jlevie (earned 50 total points)
Well, yes. If you had a /boot partition on a Linux system, you'd want to back it up also. My example doesn't show one because Solaris doesn't use one.

Since we are on the subject of backups, and thus disaster recovery, I'm going to diverge a bit from that subject into some implied areas. Obviously a good backup system is a critical part of disaster recovery. What's not so obvious is planning your system layouts to minimize the time required for recovery. System layout and partitioning schemes tend to be a bit of a religious issue. I know what I like, and I think I've got pretty good reasons to support those choices.

Whenever possible (and always for servers) I like to have two disk drives: one for the OS and another for data. On the OS disk I make partitions for /, swap, /var, and /opt. Depending on what else the server does, I may also make a /var/spool partition. I don't bother with separate / and /usr partitions. That used to be a good idea before the days of shared libraries, as you'd have just enough in / to boot single-user and repair a system. Now just about everything is dynamically linked and you have to have /usr/lib available to do most anything.
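
As a purely illustrative version of that layout (the device names and mount points are just examples), the /etc/fstab on a two-disk Linux server might read:

    # OS disk (first SCSI drive)
    /dev/sda1   /        ext2   defaults   1 1
    /dev/sda2   swap     swap   defaults   0 0
    /dev/sda3   /var     ext2   defaults   1 2
    /dev/sda4   /opt     ext2   defaults   1 2
    # Data disk (second SCSI drive), kept separate from the OS
    /dev/sdb1   /export  ext2   defaults   1 2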

The uses of / and /var should be obvious. What's not so obvious is that a lot of the things you add to Linux to provide services are going to be scribbling on the OS disk. Samba, Apache, and others scatter their config files and data areas across the system. This makes recovery of a failed system more time consuming, as more pieces have to be updated from backups after the base system is recovered. But most of those things don't have to be mixed in with the rest of the system. You have to build your own, but you can then target them at the /opt filesystem.

I use /opt for things that I add to the system that aren't necessarily OS-version dependent, like StarOffice, Acrobat, Netscape, Apache, Samba, etc. I realize that some of these exist as RPMs that can be added to the system, but they are available separately, and some are frequently needed on other workstations, making them likely candidates for exporting via NFS. The point is that a system upgrade or OS re-installation doesn't necessarily require that these types of packages be re-installed each time.

In fact, the binaries for each of the shared packages are located within /opt/export on my servers (which is an NFS export, and which is frequently a separate disk if there are lots of shared apps) and I make symbolic links to their normal place in /opt (e.g. /opt/Acrobat is a link to /opt/export/Acrobat). This lets me mount the shared data on a workstation on /opt/export and make the same links. At first glance this might seem odd, but with a uniform environment the users' shell init files can look the same, or nearly so. Also, I still have the freedom to install a local package on a workstation in its /opt. Oh yeah, rebuilding a workstation that uses shared apps is faster than rebuilding one that has everything local.
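
A small sketch of that arrangement (the hostnames, export options, and package directories are invented for illustration):

    # On the server: shared apps live under /opt/export, with links in /opt
    mkdir -p /opt/export/Acrobat /opt/export/StarOffice
    ln -s /opt/export/Acrobat    /opt/Acrobat
    ln -s /opt/export/StarOffice /opt/StarOffice

    # /etc/exports entry so workstations can mount the shared tree read-only:
    #   /opt/export   *.mydomain.com(ro)

    # On a workstation: mount the export and make the same links, so the
    # paths (and the users' shell init files) look identical everywhere
    mount server:/opt/export /opt/export
    ln -s /opt/export/Acrobat    /opt/Acrobat
    ln -s /opt/export/StarOffice /opt/StarOffice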

So now I have all of the OS stuff on one drive (and possibly the shared apps on another) and the user or other dynamic data is on a different drive. Obviously it's going to be easier to recover from a single drive failure and it's going to be easier to upgrade the system, because I've separated the actual OS as much as possible from everything else.

The point of all of this is to think through the way you set up servers, with particular attention to how you'd put one back together if you lose the OS. The more you can separate server-specific and dynamic data from what has to come off the OS CD onto the root filesystems, the easier it will be.
 
LVL 2

Expert Comment

by:linuxwrangler
A big second to jlevie's basic comments, and I'll add mine, too.

First, if price is a serious consideration, then I doubt you'll get the budget for Legato no matter how fine it is.

Second, no matter what you pick, be sure to test a real backup and a real restore (not just clicking a "verify files" button). In one famous case a few years ago a bug was discovered in which the software backed everything up, verified files, and told the operator that everything was OK. About the only thing it didn't do was write any data to the tape, a fact that was not discovered until the tape was needed. Retest from time to time and after any OS or backup software update.

Third, consider tape portability. Some software can't restore files backed up from a Unix box onto an NT box, etc., even though each of the boxes had agents appropriate for that OS and everything was backed up to a central server. (In my last job the network had Linux, Unix, WinNT, Win2k, NetWare 4.0, NetWare 4.1, NetWare 5.0, Mac, Win95, Win98, and DOS machines housing a total of about 1 terabyte, so backups were loads of fun.)

Finally, consider (hack, cough) NT Backup. If you have an NT machine on the net, it will back up Samba shares just fine (though you are on your own for disaster recovery of the Linux box in this case).
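
To make that second point concrete, here is one way to run a real end-to-end restore drill. This is only a sketch: the paths and tape device are examples, and it assumes GNU tar, which strips the leading "/" on create so the restored tree lands under the scratch directory.

    #!/bin/sh
    # Back up a known tree, read it back into a scratch area, and compare
    SRC=/export/shares/testdata
    SCRATCH=/tmp/restore-test.$$

    tar cvf /dev/st0 $SRC
    mt -f /dev/st0 rewind

    mkdir -p $SCRATCH
    cd $SCRATCH
    tar xvf /dev/st0

    # Byte-for-byte comparison of the original and the restored copy
    diff -r $SRC $SCRATCH$SRC && echo "Restore test passed"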
 

Author Comment

by:davidpm
Comment accepted as answer
 

Author Comment

by:davidpm
I found this old question lying around that everyone had forgotten about.
Got a new one under Linux networking. Check it out. I'm sure you have already done exactly what I'm thinking of.