• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 693
  • Last Modified:

Pervasive DB backup plan

Does anyone have best practices for ensuring a Pervasive database can be restored at any point in time, e.g. after a fire, tornado, etc.?  How, and to which medium, should I back up the logs, DB, etc.?  I am trying to create a disaster recovery plan.
1 Solution
Bill Bach, President, commented:
Backing up a Pervasive database is easy, and there are multiple solutions:

1) Get all users out and back up the files.  This is the easiest and quickest solution, though it is usually practical only where a downtime window is available.  Any backup package or file-copy solution will work just fine.  Consider backing up to another disk drive FIRST, and then to tape for archival purposes.  Downside: downtime is required, and it can take a while for large databases.
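A minimal sketch of this cold-backup option, assuming the users are already out and the engine is stopped (the service name and stop command vary by platform and PSQL version, so that step is only noted in a comment).  The `cold_backup` helper name, the directory layout, and the `.mkd` file extension are illustrative assumptions, not part of the original answer:

```shell
# Hypothetical helper: cold_backup <data-dir> <backup-dir>
# Assumes all users are out and the Pervasive engine is stopped
# (the stop command is platform/version specific and not shown).
cold_backup() {
  db_dir="$1"; backup_dir="$2"
  stamp=$(date +%Y%m%d-%H%M%S)
  mkdir -p "$backup_dir/$stamp"
  # Copy the data files to another disk FIRST...
  cp -p "$db_dir"/*.mkd "$backup_dir/$stamp"/ || return 1
  # ...then archive that disk copy for tape (or any slower medium).
  tar -cf "$backup_dir/$stamp.tar" -C "$backup_dir" "$stamp" || return 1
  echo "$backup_dir/$stamp.tar"   # print the archive path for the caller
}
```

Called as `cold_backup /data/pervasive /backup/pervasive`, it stages a timestamped disk copy and emits the archive path, so a follow-on job can send that archive to tape.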

2) Use Continuous Operations mode.  This is the same file-copy solution, but you first prepare the database using BUTIL or Backup Agent so that the files can be copied safely while they are in use.  You can even do this several times per day -- some people have run it as frequently as every 2 hours, especially for smaller databases.  More information on how to do this is available at www.goldstarsoftware.com/press.asp in the white paper on Pervasive database backups.  Downside: performance impact while in ContOps mode; limited number of copies per day; best for smaller databases.
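The Continuous Operations sequence can be sketched as a small wrapper around BUTIL.  The -STARTBU and -ENDBU options are from the Pervasive/Actian BUTIL documentation, but the `contops_backup` function, the file-list argument, and the directory names are assumptions for illustration; on a real server BUTIL must be on the PATH (the BUTIL variable exists only so the sketch can be exercised without an engine):

```shell
# Override BUTIL to exercise the sketch without a Pervasive engine.
BUTIL="${BUTIL:-butil}"

# Hypothetical helper: contops_backup <@list-file> <data-dir> <dest-dir>
contops_backup() {
  list="$1"; db_dir="$2"; dest="$3"
  "$BUTIL" -STARTBU "$list" || return 1   # begin Continuous Operations
  mkdir -p "$dest"
  cp -p "$db_dir"/*.mkd "$dest"/          # copy the files while they stay open
  status=$?
  "$BUTIL" -ENDBU "$list"                 # end ContOps; deltas roll back in
  return $status
}
```

The key point the sketch shows is the bracketing: the copy happens strictly between -STARTBU and -ENDBU, which is what makes the copied files a consistent set.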

3) Use DataExchange.  Pervasive's replication solution works well or poorly depending on your database engine version, server configuration, frequency of data file changes, expertise, and the time you have to devote to it.  While a cheap solution that is fairly easy to install, it can be difficult to keep running in a dynamic environment.  If your app is stable, though, with only infrequent data structure changes, then DX can provide one-way replication from a primary server to a backup server that works VERY well, and at the database level, so it provides good protection from viruses, bad "DEL *.*" commands, and the like.  Works very well with large databases.  Downside: learning curve to set up; hard to maintain if the data file structures change frequently; only available and functional with SOME PSQL versions.

4) Use volume-based replication.  Several solutions exist, including DoubleTake (my own personal favorite) and CA XOSoft.  The trick is that the replication has to be done in "disk write order," so that disk writes take place on the secondary server in the same order as on the primary server.  This is a good solution for also backing up flat files (EXE's, config files, txt files, etc.) along with the database, and it can work to an off-site server as well.  A bit more expensive, and it can be tricky to set up to ensure "pure" data stability, but it is VERY easy to maintain on a long-term basis (unlike DX) and can be used for very large data sets as well.  Downside: DEL *.* commands or file corruption are replicated almost instantly, so this is not a stand-alone solution; can be more expensive.

5) Use SAN snapshot solutions.  The NetApp series of SANs has also been vetted to allow for proper snapshotting of the database.  You can easily put the database into ContOps mode, take a snapshot, and allow the SAN to replicate the snapshot copy to a secondary SAN.  I have some companies doing this every 15 minutes, and wanting to go even more frequently, which allows for great recovery potential, even from an off-site server.  Downside: cost -- figure $300K as a starting budget, not including training and configuration.
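The snapshot sequence can be sketched the same way as option 2: quiesce with ContOps, snapshot, resume.  Here `san_snapshot` is a placeholder for your SAN vendor's CLI (e.g. a NetApp snapshot-create command) and `snapshot_backup` is a hypothetical wrapper name; neither is from the original answer, and both BUTIL and the SAN CLI are assumed to be on the PATH on the real server:

```shell
# Hypothetical helper: snapshot_backup <@list-file> <volume>
# BUTIL and SNAP_CMD can be overridden to exercise the sketch offline;
# "san_snapshot" stands in for the SAN vendor's snapshot CLI.
snapshot_backup() {
  list="$1"; volume="$2"
  "${BUTIL:-butil}" -STARTBU "$list" || return 1  # make files snapshot-safe
  "${SNAP_CMD:-san_snapshot}" "$volume"           # near-instant SAN snapshot
  rc=$?
  "${BUTIL:-butil}" -ENDBU "$list"                # resume normal operation
  return $rc
}
```

Because the snapshot itself is near-instant, the ContOps window stays very short, which is what makes a 15-minute (or tighter) schedule practical.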