MSAccess: Unreplicate a collection of tables?

asked by markp99

I have a database that uses replication.  Periodically I need to unreplicate the database due to recurring corruption issues (that's a separate discussion - the unreplicated DB never gets corrupted).

I have automated the unreplication process somewhat.  In an unreplicated DB (with the necessary references & modules included):

1.  LINK to all the tables in the replicated DB (~40 tables)
2.  Run a series of MakeTable queries via a collection of SQL statements to replace the contents of the tables with current data (I exclude the replication fields)
3.  Remove all the LINKED tables
4.  Delete all forms and queries from unreplicated DB
5.  Import all forms and queries from the replicated DB into the unreplicated DB
6.  Replicate the unreplicated DB
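For step 2, each MakeTable query is a `SELECT ... INTO` with an explicit field list, which is what leaves the replication fields behind. A minimal DAO sketch, with placeholder table and field names and an assumed `lnk_` prefix on the linked tables:

```vba
Public Sub MakeTableExample()
    ' Placeholder names - one SELECT INTO per table; the explicit field
    ' list is what excludes the s_* replication system fields.
    Dim db As DAO.Database
    Set db = CurrentDb
    db.Execute "SELECT [CustID], [CustName], [Phone] " & _
               "INTO [tblCustomers] " & _
               "FROM [lnk_tblCustomers];", dbFailOnError
End Sub
```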

This process works well enough, but there is a shortcoming:

1. If tables have been added/removed, or their structures changed, my existing MakeTable SQL statements will not produce the expected results.  I need to keep an inventory of all adds/deletes/updates to the tables and their structures.  I am in maintenance/development mode, so changes can be numerous between the unexpected/unplanned "unreplications".


Can anyone suggest alternative approaches to unreplication?  Is there a fully automated approach that would anticipate table structure changes?

oleggold:

I see your problem, and it seems to be a common one with a large number of Access DBs. We use the Databuilder product to generate and replicate multiple Access DBs. In the same fashion as you describe, the tool uses a master DB to model the replications on, and runs SQL to create and delete the tables. I think that as long as your master DB contains the SQL to create any new tables, there's no need to keep additional records, as that can get really cumbersome and hard to maintain.
Hope this helps.
Scott McDaniel (EE MVE):
Are you trying to synchronize copies of a database (i.e. the structure, not the data)? If so, there are some 3rd-party products which can do this. Otherwise, you'll need to either (a) manually keep up with the changes you need to make, and then change your MakeTable queries to reflect those changes, or (b) write code to loop through your tables in the "master" db and build the necessary tables in your "replica" databases ...
markp99:
I had thought to "loop thru" the tables as you suggest.

I can see the collection of tables in db.TableDefs.  Is it possible to also see the fields in each table??  If yes, I suppose I could create the MakeTable queries dynamically.
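Each DAO TableDef does expose a Fields collection, so the MakeTable SQL can be generated dynamically. A sketch under assumptions: linked tables carry a hypothetical `lnk_` prefix that the local copies drop, and Jet's replication system fields all start with `s_` (plus `Gen_*` conflict fields):

```vba
Public Sub UnreplicateTables()
    ' Build one SELECT INTO per linked table, reading field names from
    ' the table's Fields collection and skipping the Jet replication
    ' system fields (s_GUID, s_Lineage, s_Generation, s_ColLineage,
    ' and any Gen_* fields).
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Dim fld As DAO.Field
    Dim linkedNames As New Collection
    Dim nm As Variant
    Dim fieldList As String

    Set db = CurrentDb

    ' Collect the linked-table names first, so the SELECT INTO below
    ' does not modify the TableDefs collection while we iterate it.
    For Each tdf In db.TableDefs
        If Len(tdf.Connect) > 0 And Left$(tdf.Name, 4) <> "MSys" Then
            linkedNames.Add tdf.Name
        End If
    Next tdf

    For Each nm In linkedNames
        Set tdf = db.TableDefs(nm)
        fieldList = ""
        For Each fld In tdf.Fields
            If Left$(fld.Name, 2) <> "s_" And Left$(fld.Name, 4) <> "Gen_" Then
                If Len(fieldList) > 0 Then fieldList = fieldList & ", "
                fieldList = fieldList & "[" & fld.Name & "]"
            End If
        Next fld
        ' Assumes the "lnk_" naming convention - adjust to your own.
        db.Execute "SELECT " & fieldList & " INTO [" & Mid$(nm, 5) & _
                   "] FROM [" & nm & "];", dbFailOnError
    Next nm
End Sub
```

Because the field list is read from the live schema on every run, tables added, dropped, or restructured between unreplications are picked up automatically.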
From the Trigeminal Software site (Michael Kaplan, Trigeminal Software, Inc.):

TSI Access 2000 Un-Replicator
Have you ever wanted to unreplicate a database that has been replicated? Seems like a one-way trip, doesn't it? Well, if you are using an Access 97 database, Microsoft has a tool that does a fair job, which you can get at Q153526 (ACC: How to Make a Replicated Database a Regular Database). But if you are using Jet 4.0, you can try build 5018 of our TSI Access 2000 Un-Replicator, a COM add-in written in VB6 that you can stick on your machine, register, and then see right off your Tools | Replication menu.

Works fine in Access 2k3
It will remove the system fields and leave you an unreplicated database.

Another way to limit the number of times you have to "Unreplicate" your set is to keep a spare replica in the set.  
I have my design master in a directory,  Replicas in Antigua and NYC, and an additional copy on my laptop which I use for testing.  
If my design master becomes corrupt (more often it is the replica in Antigua), I simply delete the corrupt replica and try to synchronize with it; when that fails, I recreate the replica.
You can also copy an existing replica over the corrupt one...  Access will accept the replacement as the original as long as it is in the original's "Path/Address"
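The copy-over-the-corrupt-replica step above can even be scripted. The paths here are hypothetical; per the note above, the key point is that the replacement must land at the corrupt member's original path:

```vba
Public Sub SwapInGoodReplica()
    ' Hypothetical paths - Access accepts the swapped-in file as the
    ' same replica-set member because it sits at the corrupt member's
    ' original path.
    Const GOOD_REPLICA As String = "\\nyc01\data\Sales_NYC.mdb"
    Const BAD_REPLICA As String = "\\antigua01\data\Sales_Antigua.mdb"

    Kill BAD_REPLICA                    ' remove the corrupt file
    FileCopy GOOD_REPLICA, BAD_REPLICA  ' drop a good copy in its place
End Sub
```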

markp99:
For me, it seems the Design Master gets corrupted as I work thru new functionality that runs into errors or simply hangs.

Yes. I really SHOULD work on new/risky functionality AWAY from my design master, but integrating new external results back into my design master seems a bit of a chore (less than unreplicating??).  Though, I NEVER see corruption issues in my experiments in unreplicated databases.  It seems as if replication adds a level of intolerance for certain types of errors.
The worst thing that happens is when the design master becomes corrupt and loses the Replicated property (while retaining all the baggage)

Instead of unreplicating the set...  try creating a new replica (to replace the design master) from an existing replica, then using the menu option "Recover Design Master" on the new replica.
(Again, this is done by first renaming the existing DM and trying to synchronize with it, which will drop it from the replica list, then creating the new replica in its place.)

Alternately, you can simply copy an existing replica OVER your design master and again Recover Design Master

If you have objects in your design master that are not replicated, simply import them into the new design master
markp99:
I guess I had presumed the original DM holds the keys to the replica set.  I have "recovered design master" from a replica (or a copy of the design master), but then created new/replacement replicas from there, vs. just re-synchronizing.  File size increased with each increment.  Replica upon replica, baggage upon baggage, was the thought... and probably the source of the frequent corruptions.

Un-replication followed by fresh replication seemed cleanest to me, and it produces noticeably more compact file sizes, improving load times across our networks.

I'll play more with the approach you mention.