Solved

Script to do my own backup-integrity verification on a FileMaker database

Posted on 2011-03-16
9
571 Views
Last Modified: 2012-05-11
Running a backup from the server with "verify backup integrity" has slowed us down too much, so I need to verify the integrity of a backup db in a different way. Is there anything I should do besides opening it up and going to a bunch of layouts?

We have a rather large database (3GB).

I do a backup in the middle of the day so that if the database gets damaged we haven't lost an entire day. However, the backup has been slowing people down for progressively longer periods (the db is locked up, for all intents and purposes... it started at 10 min, then 30 min, then 1 hr).

Last night I added an eSATA card (I back up to an external drive) to see if it helped. My tests didn't show a difference. I then tried unchecking the "Verify backup integrity" checkbox and saw a dramatic difference. The backup now takes 3 minutes.

Here's my deal: I still want to know that it's a good backup, so I'm thinking of running a script that opens the database and does a few things. Anything you think I should check?

Thanks!
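For what it's worth, the kind of automated sanity check I have in mind could be sketched in Python (the paths, tolerance, and function name here are all made up for illustration, and this does NOT replace FileMaker's own consistency check — it only catches missing or truncated copies):

```python
# Minimal file-level sanity checks on a backup copy. This is a hedged
# sketch: it flags missing, empty, or suspiciously small files, nothing more.
import hashlib
from pathlib import Path

def check_backup(backup_path, live_size, tolerance=0.10):
    """Return (ok, message) for basic file-level checks on the backup."""
    p = Path(backup_path)
    if not p.exists():
        return False, "backup file missing"
    size = p.stat().st_size
    if size == 0:
        return False, "backup file is empty"
    if size < live_size * (1 - tolerance):
        return False, f"backup is {size} bytes; expected roughly {live_size}"
    # Record a checksum so you can detect later bit-rot on the copy.
    h = hashlib.sha256()
    with p.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MB chunks
            h.update(chunk)
    return True, f"{size} bytes, sha256 {h.hexdigest()[:12]}"
```

A script like this could run right after the scheduled backup and email on failure.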
0
Comment
Question by:challengeday
  • 3
  • 3
  • 2
  • +1
9 Comments
 
LVL 12

Accepted Solution

by:
North2Alaska earned 250 total points
ID: 35151331
I've been doing backups for many years.  I find the best method is the 3-2-1 method: 3 copies, 2 different media (i.e. don't just use DVDs or a single hard drive), 1 off site.

If it only takes three minutes to do the backup, back it up 3 times.  One has to be good...  :-)

Oh, the other rule of thumb is: if it's not automatic, it's not a backup...
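A rough sketch of the "back it up 3 times" idea, in Python rather than a FileMaker script (the destination paths would be placeholders you'd fill in; nothing here is specific to FileMaker):

```python
# Hedged sketch: copy one backup file into several destination directories,
# so a single bad device doesn't cost you the backup.
import shutil
from pathlib import Path

def replicate(backup, destinations):
    """Copy `backup` into each destination directory; return the copies made."""
    copies = []
    for dest in destinations:
        d = Path(dest)
        d.mkdir(parents=True, exist_ok=True)  # create the target dir if needed
        copies.append(shutil.copy2(backup, d))  # copy2 preserves timestamps
    return copies
```

Pointing the destinations at two different devices (plus one off-site sync) gets you the 3-2-1 coverage automatically.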
0
 
LVL 4

Author Comment

by:challengeday
ID: 35151494
Wow, that's a lot of backing up!

I make 5 copies (1 at noon, 1 at 11 PM, 1 at midnight to a daily folder); those 3 go to an external disk, then I copy one of those to my local PC and use that to upload offsite.

I can't see doing DVDs since that's not very automatic. What 2 different media do you use?

So you don't do "verify backup integrity"?
0
 
LVL 12

Expert Comment

by:North2Alaska
ID: 35151594
First, I think I need to clarify.  I don't use FM Server, so take that into consideration.

The two media I'm using are a local hard drive and cloud storage (I use CrashPlan). Maybe media, as in a hard drive vs. a DVD, isn't really valid any more. We all have too much data to put on a DVD, and hard drives are really cheap. So maybe I should say two different devices. For example, I have a set of external hard drives I back up to, as well as a Time Capsule. I also back up across the net to a computer in my Dad's home in Alaska. (CrashPlan is great for that, and free.)
0
 
LVL 24

Expert Comment

by:Will Loving
ID: 35151613
If your main FileMaker db file is 3GB, I would guess that you are storing large amounts of data in container fields. I would suggest doing one of the following: 1) use "Store File as Reference" and keep the stored data outside the database; 2) create a separate FileMaker file containing related records with just the container fields, and move your container data there, so that the file holding the non-container data is much smaller and more quickly and easily backed up; 3) consider something like SuperContainer, which I've not used but which allows you to Store As Reference to locations other than a local file directory.

With separate files, you can back them up at different and more convenient times.
0

 
LVL 4

Author Comment

by:challengeday
ID: 35151707
Thanks North2Alaska, good clarification.

Thanks willmcn, I do use "Store File as Reference" (I changed that early on for exactly this reason). Still, the file size steadily creeps up.

I have never figured out how to tell what exactly is taking up the space. It'd be nice to have a tool that showed the composition of the file so as to try to target specific areas.
0
 
LVL 24

Assisted Solution

by:Will Loving
Will Loving earned 250 total points
ID: 35151903
A simple way to figure out where the bulk is being stored: make a local copy of your database, open it on your desktop, start deleting tables, then close the file and check the size. Once you find the culprit table or tables, you can go further by deleting fields if you wish. The other thing to do is make a clone of the file and check whether it's something in the file itself, such as massive graphics stored on layouts.
0
 
LVL 4

Author Comment

by:challengeday
ID: 35152025
Great idea! Although with 70 tables it's going to take me a while :)

The only downside of the large file for us so far is the long verify. Other than that do you know of any reason to try to keep file size down?
0
 
LVL 24

Expert Comment

by:Will Loving
ID: 35152198
You can probably delete tables in batches, but I would start with any that have container fields.

From an integrity standpoint, there should in theory be no issue with using large files. However, in my experience, large size is almost always equated with increased chances of corruption, especially if you are editing the live file (making structural changes to fields, tables and layouts) across a network or WAN. I've seen many large files get corrupted because users were logging into them across the internet, making structural edits and then having the file get corrupted during the saving changes process. The larger the file and the slower or flakier the connection, the more likely this is to happen.
0
 
LVL 9

Expert Comment

by:jvaldes
ID: 35156706
Rather than have the database check its integrity, unload the database in a way that lets you reconstruct it.

Save a clone (only when there is a version change).
Export all the table data through a script.

Rebuild the database by reading all the table data back into the clone, also done through a script.

If the database is complex, this will take less time than a full verify. It will require that your system be taken offline for maintenance.

The great news is you are guaranteed integrity, and you can always recompose the data.
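As a hedged sketch of a post-export cross-check (in Python, not FileMaker scripting; the table names, file layout, and counts below are illustrative only), you could compare the row counts of the exported files against what you expect before trusting the export:

```python
# Sketch: after scripting "Export Records" for each table to a CSV file,
# compare each file's row count to the expected count for that table.
import csv
from pathlib import Path

def verify_exports(export_dir, expected_counts):
    """Return a dict of table -> (expected, actual) for any mismatches.

    A missing export file is reported with actual = None.
    """
    problems = {}
    for table, expected in expected_counts.items():
        path = Path(export_dir) / f"{table}.csv"
        if not path.exists():
            problems[table] = (expected, None)
            continue
        with path.open(newline="") as f:
            actual = sum(1 for _ in csv.reader(f))  # count data rows
        if actual != expected:
            problems[table] = (expected, actual)
    return problems
```

An empty result dict means every table exported the number of rows you expected.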
0

