Solved

Script to do my own integrity verification on a FileMaker database backup

Posted on 2011-03-16
580 Views
Last Modified: 2012-05-11
Running a backup from the server with "verify backup integrity" has slowed us down too much, so I need to verify the integrity of a backup db in a different way. Is there anything I should do besides opening it up and going to a bunch of layouts?

We have a rather large database (3GB).

I do a backup in the middle of the day so that if the database gets damaged we haven't lost an entire day. However, the backup has been slowing people down for progressively longer periods (the db is locked up for all intents and purposes; it started at 10 min, then 30 min, then 1 hr).

Last night I added an eSATA card (I back up to an external drive) to see if it helped. My tests didn't show a difference. I then tried removing the "Verify backup Integrity" checkbox and saw a dramatic difference. The backup now takes 3 minutes.

Here's my deal: I still want to know that it's a good backup, so I'm thinking of running a script that opens the database and does a few things. Anything you think I should check?
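The external sanity checks I can script outside FileMaker would look something like the sketch below (the path, size hint, and thresholds are just placeholders for our setup, and it doesn't replace actually opening the file in FileMaker): it checks that the backup copy exists, is recent, is roughly the expected size, and can be read end to end.

# Rough sanity check on a FileMaker backup copy. A sketch only; it is not a
# substitute for FileMaker's own "verify backup integrity". Paths and
# thresholds below are placeholders.
import hashlib
import os
import sys
import time

BACKUP_PATH = "/Volumes/BackupDrive/FMBackups/MainDB.fp7"   # hypothetical path
LIVE_SIZE_HINT = 3 * 1024**3        # roughly 3 GB, about what the live file runs
MAX_AGE_HOURS = 14                  # the noon backup should never be older than this
SIZE_TOLERANCE = 0.25               # flag if the backup differs from the hint by >25%

def check_backup(path):
    if not os.path.exists(path):
        return "FAIL: backup file not found"

    st = os.stat(path)
    age_hours = (time.time() - st.st_mtime) / 3600
    if age_hours > MAX_AGE_HOURS:
        return f"FAIL: backup is {age_hours:.1f} hours old"

    if abs(st.st_size - LIVE_SIZE_HINT) > LIVE_SIZE_HINT * SIZE_TOLERANCE:
        return f"WARN: unexpected size {st.st_size / 1024**3:.2f} GB"

    # Hash the whole file; this also proves the copy is readable end to end.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return f"OK: {st.st_size / 1024**3:.2f} GB, sha256 {h.hexdigest()[:16]}..."

if __name__ == "__main__":
    result = check_backup(BACKUP_PATH)
    print(result)
    sys.exit(0 if result.startswith("OK") else 1)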

Thanks!
Question by:challengeday
9 Comments
 
LVL 12

Accepted Solution

by:
North2Alaska earned 250 total points
ID: 35151331
I've been doing backups for many years.  I find the best method is the 3-2-1 method: 3 copies, 2 different media (i.e. don't just use DVDs or a single hard drive), 1 off site.

If it only takes three minutes to do the backup, back it up 3 times.  One has to be good...  :-)

Oh, the other rule of thumb is: if it's not automatic, it's not a backup...
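If you want to script the "make several copies" part, a bare-bones sketch along these lines (the paths are invented and Python is just for illustration) copies the freshest backup to two more destinations and keeps only the last few copies on each:

# Copy the newest backup to two extra destinations and prune old copies.
# Paths are made up; adapt to your own drives and off-site staging folder.
import datetime
import glob
import os
import shutil

SOURCE_DIR = "/Volumes/BackupDrive/FMBackups"          # where the server writes backups
DESTINATIONS = ["/Volumes/SecondDrive/FMBackups",      # second local device
                "/Volumes/OffsiteStage/FMBackups"]     # staging folder for off-site upload
KEEP = 3                                               # copies to retain per destination

def latest_backup(src_dir):
    files = glob.glob(os.path.join(src_dir, "*.fp7"))
    return max(files, key=os.path.getmtime) if files else None

def rotate(dest_dir, keep):
    copies = sorted(glob.glob(os.path.join(dest_dir, "*.fp7")),
                    key=os.path.getmtime, reverse=True)
    for old in copies[keep:]:
        os.remove(old)

src = latest_backup(SOURCE_DIR)
if src:
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M")
    name, ext = os.path.splitext(os.path.basename(src))
    for dest in DESTINATIONS:
        os.makedirs(dest, exist_ok=True)
        shutil.copy2(src, os.path.join(dest, f"{name}_{stamp}{ext}"))  # copy2 keeps the timestamp
        rotate(dest, KEEP)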
 
LVL 4

Author Comment

by:challengeday
ID: 35151494
Wow, that's a lot of backing up!

I do 5 copies (one at noon, one at 11pm, and one at midnight to a daily folder); those 3 go to an external disk, then I copy one of those to my local PC and use that to upload off-site.

I can't see doing DVDs since that's not very automatic. What 2 different media do you use?

So you don't do "verify backup integrity"?
 
LVL 12

Expert Comment

by:North2Alaska
ID: 35151594
First, I think I need to clarify.  I don't use FM Server, so take that into consideration.

The two media I'm using are a local hard drive and cloud storage (I use CrashPlan). Maybe media, as in a hard drive vs. a DVD, isn't really valid any more. We all have too much data to put on a DVD, and hard drives are really cheap. So maybe I should say two different devices. For example, I have a set of external hard drives I back up to, as well as a Time Capsule. I also back up across the net to a computer in my Dad's home in Alaska. (CrashPlan is great for that, and free.)
 
LVL 25

Expert Comment

by:Will Loving
ID: 35151613
If your main FileMaker db file is 3GB, I would guess that you are storing large amounts of data in container fields. I would suggest/recommend doing one of the following: 1) use "Store File as Reference" and keep the stored data outside the database; 2) create a separate FileMaker file that contains related records with just the Container fields and move your Container data there, so that the file containing non-container data is much smaller and more quickly and easily backed up; 3) consider something like SuperContainer, which I've not used but which allows you to use Store As Reference to other locations besides a local file directory.

With separate files you can back them up at different and more convenient times.
 
LVL 4

Author Comment

by:challengeday
ID: 35151707
Thanks North2Alaska, good clarification.

Thanks willmcn, I do use "Store File as Reference" (I changed that early on for exactly this reason). Still, the file size steadily creeps up.

I have never figured out how to tell what exactly is taking up the space. It'd be nice to have a tool that showed the composition of the file so as to try to target specific areas.
 
LVL 25

Assisted Solution

by:Will Loving
Will Loving earned 250 total points
ID: 35151903
A simple way to figure out where the bulk is being stored is this: make a local copy of your database, open it on your desktop, then start deleting tables; close the file and check the size. Once you find the culprit table or tables, you can go further by deleting fields if you wish. The other thing to do is to make a clone of the file and check whether the bulk is in the structure of the file itself, such as massive graphics stored on layouts.
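If it helps to keep track while you work through the tables, a tiny logging helper along these lines (the copy's file name and the log path are placeholders) records the size of the test copy after each deletion pass, so the culprit stands out:

# Append the current size of the test copy to a CSV log with a label
# describing what was just deleted. File names are placeholders.
import csv
import os
import sys

COPY_PATH = "TestCopy.fp7"        # hypothetical: the local copy you are trimming
LOG_PATH = "size_log.csv"

label = sys.argv[1] if len(sys.argv) > 1 else "baseline"
size_mb = os.path.getsize(COPY_PATH) / (1024 * 1024)

new_file = not os.path.exists(LOG_PATH)
with open(LOG_PATH, "a", newline="") as f:
    writer = csv.writer(f)
    if new_file:
        writer.writerow(["deleted", "size_mb"])
    writer.writerow([label, f"{size_mb:.1f}"])

print(f"{label}: {size_mb:.1f} MB")

Run it with a label after each pass, e.g. python size_log.py "deleted Invoices table", and the log will show which deletion shrank the file the most.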
 
LVL 4

Author Comment

by:challengeday
ID: 35152025
Great idea! Although with 70 tables it's going to take me a while :)

The only downside of the large file for us so far is the long verify. Other than that do you know of any reason to try to keep file size down?
 
LVL 25

Expert Comment

by:Will Loving
ID: 35152198
You can probably delete tables in batches, but I would start with any that have container fields.

From an integrity standpoint, there should in theory be no issue with using large files. However, in my experience, large size almost always goes along with increased chances of corruption, especially if you are editing the live file (making structural changes to fields, tables and layouts) across a network or WAN. I've seen many large files get corrupted because users were logging into them across the internet, making structural edits, and then having the file damaged while the changes were being saved. The larger the file and the slower or flakier the connection, the more likely this is to happen.
 
LVL 9

Expert Comment

by:jvaldes
ID: 35156706
Rather than have the database check its integrity, unload the database in a way you can reconstruct it.

Save a clone (Only when there is a version change)
Export all the table data through a script

Rebuild the database by importing all the table data into the clone, also done through a script.

If the database is complex this will take less time than the built-in verify, but it will require that your system be taken offline for maintenance.

The great news is you are guaranteed integrity and you can always recompose the data.
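As a belt-and-braces step before relying on those exports for a rebuild, a small script can confirm that every table actually exported and is non-empty, and record the row counts in a manifest. A sketch (the export folder, file names and table list below are made up):

# Verify that each expected table export exists and is non-empty, and write
# a manifest of row counts and sizes. Assumes CSV exports, one per table.
import csv
import os
import sys

EXPORT_DIR = "/Volumes/BackupDrive/Exports"          # hypothetical export folder
TABLES = ["Customers", "Invoices", "LineItems"]      # list all your tables here

def count_rows(path):
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        return sum(1 for _ in csv.reader(f))

ok = True
with open(os.path.join(EXPORT_DIR, "manifest.csv"), "w", newline="") as m:
    writer = csv.writer(m)
    writer.writerow(["table", "rows", "bytes"])
    for table in TABLES:
        path = os.path.join(EXPORT_DIR, f"{table}.csv")
        if not os.path.exists(path) or os.path.getsize(path) == 0:
            print(f"MISSING OR EMPTY: {table}")
            ok = False
            continue
        writer.writerow([table, count_rows(path), os.path.getsize(path)])

sys.exit(0 if ok else 1)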

