• Status: Solved
  • Priority: Medium
  • Security: Public

Script to do my own integrity verification on a backup of a FileMaker database

Running a backup from the server with "verify backup integrity" has slowed us down too much, so I need to verify the integrity of a backup db in a different way. Is there anything I should do besides opening it up and going to a bunch of layouts?

We have a rather large database (3GB).

I do a backup in the middle of the day so that if the database gets damaged we haven't lost an entire day. However, the backup has been slowing people down for progressively longer periods (the db is locked up for all intents and purposes; it started at 10 minutes, then 30 minutes, then an hour).

Last night I added an eSATA card (I back up to an external drive) to see if it helped. My tests didn't show a difference. I then tried unchecking the "Verify backup integrity" checkbox and saw a dramatic difference. The backup now takes 3 minutes.

Here's my deal: I still want to know that it's a good backup, so I'm thinking of running a script that opens the database and does a few things. Anything you think I should check?
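For what it's worth, here is the rough shape of what I'm picturing, run from outside FileMaker (just a sketch in Python; the path, size floor, and age threshold are made up for illustration). It can't prove internal consistency the way the server's verify does, but it would catch a missing, stale, truncated, or unreadable backup file:

    import os, sys, time

    BACKUP = "/Volumes/BackupDrive/mydb.fmp12"  # placeholder path to the backup
    MAX_AGE_SECONDS = 60 * 60                   # backup must be under an hour old
    MIN_SIZE = int(2.5 * 1024**3)               # rough size floor for a ~3GB file

    def check(path):
        if not os.path.exists(path):
            return "backup file is missing"
        st = os.stat(path)
        if time.time() - st.st_mtime > MAX_AGE_SECONDS:
            return "backup file is stale"
        if st.st_size < MIN_SIZE:
            return "backup file is suspiciously small"
        # Read the whole file once to catch I/O errors on the backup media.
        with open(path, "rb") as f:
            while f.read(1024 * 1024):
                pass
        return None

    problem = check(BACKUP)
    if problem:
        print("BACKUP CHECK FAILED:", problem)
        sys.exit(1)
    print("backup passed basic checks: exists, fresh, plausible size, readable")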

Thanks!
 
North2Alaska commented:
I've been doing backups for many years.  I find the best method is the 3-2-1 method: 3 copies, 2 different media (i.e. don't just use DVDs or a single hard drive), 1 off site.

If it only takes three minutes to do the backup, back it up 3 times.  One has to be good...  :-)

Oh, the other rule of thumb is: if it's not automatic, it's not a backup...
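To keep it automatic, something along these lines could run right after each backup finishes — a minimal sketch (the paths and retention count are made up) that copies the fresh backup to two destinations and prunes the oldest copies:

    import os, shutil, time

    SRC = "/Volumes/BackupDrive/mydb.fmp12"        # the finished backup (placeholder)
    DESTS = ["/Volumes/SecondDrive/fm_backups",    # a second device
             "/Users/admin/offsite_staging"]       # folder an offsite sync picks up
    KEEP = 3                                       # copies to retain per destination

    stamp = time.strftime("%Y%m%d-%H%M%S")
    for dest in DESTS:
        os.makedirs(dest, exist_ok=True)
        shutil.copy2(SRC, os.path.join(dest, "mydb-%s.fmp12" % stamp))
        # Timestamped names sort chronologically; drop all but the newest KEEP.
        copies = sorted(f for f in os.listdir(dest) if f.startswith("mydb-"))
        for old in copies[:-KEEP]:
            os.remove(os.path.join(dest, old))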
 
challengeday (Author) commented:
Wow, that's a lot of backing up!

I do 5 copies (one at noon, one at 11pm, and one at midnight to a daily folder); those 3 go to an external disk, then I copy one of those to my local PC and use that one to upload offsite.

I can't see doing DVDs since that's not very automatic. What 2 different media do you use?

So you don't do "verify backup integrity"?
 
North2Alaska commented:
First, I think I need to clarify.  I don't use FM Server, so take that into consideration.

The two media I'm using are a local hard drive and cloud storage (I use CrashPlan). Maybe "media", as in a hard drive vs. a DVD, isn't really valid any more; we all have too much data to put on a DVD, and hard drives are really cheap. So maybe I should say two different devices. For example, I have a set of external hard drives I back up to, as well as a Time Capsule. I also back up across the net to a computer in my Dad's home in Alaska. (CrashPlan is great for that, and free.)
 
Will Loving (President) commented:
If your main FileMaker db file is 3GB, I would guess that you are storing large amounts of data in container fields. I would suggest doing one of the following:

1) Use "Store File as Reference" and keep the stored data outside the database.
2) Create a separate FileMaker file containing related records with just the container fields, and move your container data there, so that the file holding the non-container data is much smaller and more quickly and easily backed up.
3) Consider something like SuperContainer, which I've not used but which lets you store references to locations other than a local file directory.

With separate files, you can back them up at different and more convenient times.
 
challengeday (Author) commented:
Thanks North2Alaska, good clarification.

Thanks willmcn, I do use "Store File as Reference" (I changed that early on for exactly this reason). Still, the file size steadily creeps up.

I have never figured out how to tell what exactly is taking up the space. It'd be nice to have a tool that showed the composition of the file so as to try to target specific areas.
 
Will Loving (President) commented:
A simple way to figure out where the bulk is being stored is this: make a local copy of your database, open it on your desktop, start deleting tables, then close the file and check the size. Once you find the culprit table or tables, you can go further by deleting fields if you wish. The other thing to do is to make a clone of the file (which keeps the structure but no records) and check its size — if the clone is still large, the bulk is in the file itself, such as massive graphics stored on layouts.
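If you want a running record as you whittle the copy down, a throwaway helper (the paths are placeholders) can log the file's size after each round of deletions:

    import os, time

    COPY = os.path.expanduser("~/Desktop/mydb_test_copy.fmp12")  # local test copy
    LOG = os.path.expanduser("~/Desktop/size_log.txt")

    size_mb = os.path.getsize(COPY) / 1024.0**2
    note = input("What did you just delete? ")
    with open(LOG, "a") as log:
        log.write("%s  %10.1f MB  after deleting: %s\n"
                  % (time.strftime("%H:%M"), size_mb, note))
    print("logged %.1f MB" % size_mb)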
 
challengeday (Author) commented:
Great idea! Although with 70 tables it's going to take me a while :)

The only downside of the large file for us so far is the long verify. Other than that, do you know of any reason to try to keep the file size down?
 
Will Loving (President) commented:
You can probably delete tables in batches, but I would start with any that have container fields.

From an integrity standpoint, there should in theory be no issue with using large files. However, in my experience, large size almost always correlates with increased chances of corruption, especially if you are editing the live file (making structural changes to fields, tables, and layouts) across a network or WAN. I've seen many large files get corrupted because users were logging into them across the internet and making structural edits, and the file was damaged while the changes were being saved. The larger the file and the slower or flakier the connection, the more likely this is to happen.
 
jvaldes commented:
Rather than have the database check its own integrity, unload the database in a way that lets you reconstruct it:

• Save a clone (only when there is a version change).
• Export all the table data through a script.
• Rebuild the database by importing all the tables back into the clone, also done through a script.

If the database is complex, this can take less time than the built-in verify. It will require that your system be taken offline for maintenance.

The great news is that you are guaranteed integrity, and you can always recompose the data.
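To make that guarantee checkable, you could fingerprint the exports — a rough sketch (the export directory and file extensions are assumptions, and it's Python rather than a FileMaker script) that writes a checksum manifest:

    import hashlib, json, os

    EXPORT_DIR = "/Volumes/BackupDrive/table_exports"  # placeholder
    MANIFEST = os.path.join(EXPORT_DIR, "manifest.json")

    def sha256(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1024 * 1024), b""):
                h.update(chunk)
        return h.hexdigest()

    manifest = {name: sha256(os.path.join(EXPORT_DIR, name))
                for name in sorted(os.listdir(EXPORT_DIR))
                if name.endswith((".csv", ".tab", ".mer"))}

    with open(MANIFEST, "w") as f:
        json.dump(manifest, f, indent=2)
    print("hashed %d export files" % len(manifest))

Exporting the tables again from the rebuilt file and diffing the two manifests would confirm the data came through intact.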