WyattRidesAgain asked:
Memo Fields- Unreliable? Too Big?

I'm currently using Access 2000 and might move to 2002 in the future.  In a memo field in one of my Access tables I'm storing large strings which I manipulate using VBA code.  A given string's length can be anywhere from 1,000 to maybe 500,000 characters.  I've seen passing comments in various sources that memo fields can sometimes cause problems and/or crashes, but no explanations were given of what those problems might be.

1) Are memo fields in fact problematic?  If so, what specifically are the problems and how does one avoid or cope with them?

2) Related to that, I believe an Access mdb file has a 2GB size limit.  I'm assuming my memo fields consume their share of that limit, which over time will grow to be substantial.  If my assumption is correct, should I consider moving the table with the memo fields to a separate back-end mdb file; or plan to move to SQL Server someday; or use external text files with one file per string (a pain?); or ...?

WRA
ASKER CERTIFIED SOLUTION
jadedata (United States of America)
[solution text available to Experts Exchange members only]

SOLUTION
[solution text available to Experts Exchange members only]
WyattRidesAgain (Asker):
I fear wading into these waters but here I go...

To explain what I'm doing in some detail, this memo question is about how I'm handling fast reconstruction of engineering bills of material (BOM's) and their inverse, Where-Used's (WU's).  I have three tables applicable to this discussion:
1)  Items:   contains info on each part and assembly.  
2)  Binary:  (forget why I named it that) has three fields: Parent_Item_ID, Child_Item_ID, Qty.  First two fields are foreign keys from the Items table.  This is a self-join table containing all Parent-Child relationships necessary to define all BOM's in our database.  Theoretically, any two items can be in a Parent-Child-Qty relationship, though I have business rules in place to avoid unnatural pairing (sounds like a genetics experiment).
3)  Compiled:   has two fields:  a foreign key for an item having a BOM under it, and a memo field containing the "compiled" BOM.

I'll explain those last two words.  The first two tables sufficiently contain all the info to reconstruct any desired BOM.  This is done by recursively traversing the Binary table to find all Parent-Child relationships.  My fear is that with large BOM's (in our case, "large" means hundreds or thousands of items) and with an ever-growing Binary table this would be time-consuming and present a substantial load on the network, considering we would be accessing the BOM's often.  This would get worse when Where-Used info is retrieved, as those data trees can get VERY large; think of how many times a common bolt would show up in higher level assemblies.  Thus my idea was to "compile" a given BOM into a single string and put it into a memo field.  The structure of the string follows the demands of a TreeView control, and thus the string has sets of delimited data for each node (or Item): Parent_Item_ID, parent node key, Child_Item_ID, child node key, and node text.  Whenever the BOM is needed, only a single transfer of a string is needed across the network to move it to the user's PC, where it can be parsed back into its original tree structure and displayed in a TreeView control.
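For anyone following along, the compile step described above can be sketched roughly like this in DAO.  This is only an illustration of the recursive walk, not WRA's actual code; the delimiter characters, the node-key scheme, and the "Qty" node text are all assumptions:

```vba
' Recursively walk the Binary table and build one delimited string
' holding every node of the BOM under lParentID.
Public Function CompileBOM(db As DAO.Database, lParentID As Long, _
                           sParentKey As String) As String
    Dim rs As DAO.Recordset
    Dim sChildKey As String
    Dim s As String
    ' [Binary] is bracketed because BINARY is a Jet SQL keyword
    Set rs = db.OpenRecordset( _
        "SELECT Child_Item_ID, Qty FROM [Binary] " & _
        "WHERE Parent_Item_ID = " & lParentID, dbOpenSnapshot)
    Do Until rs.EOF
        sChildKey = sParentKey & "." & rs!Child_Item_ID
        ' One delimited node: parent id, parent key, child id, child key, text
        s = s & lParentID & ";" & sParentKey & ";" & _
                rs!Child_Item_ID & ";" & sChildKey & ";" & _
                "Qty " & rs!Qty & "|"
        ' Recurse to pick up any sub-assembly under this child
        s = s & CompileBOM(db, rs!Child_Item_ID, sChildKey)
        rs.MoveNext
    Loop
    rs.Close
    CompileBOM = s
End Function
```

The node-key path ("1.17.42" style) is what lets the client parse the flat string back into a tree: each node's key identifies its parent's key as a prefix, which maps directly onto TreeView's Add method.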

I've got it working, and it does appear to load very fast into the TreeView control.  And I've found that memo fields can in fact hold MBs of data, so I should be able to build quite large BOM's.

I know this violates normalization rules, but I'm really afraid of the system coming to a standstill over time with all the recursive BOM processing a "pure" approach might involve.  I am open to comments on this point however!

SOOO, with this more complete background, do you think what I'm doing is inspired, insane, or ????  I'm particularly concerned about the memo field's reliability.

WRA


I can't speak for 1William, but I think that's really a slick way to handle that.  And the Compiled table seems to be exactly what 1W recommended (1 id, 1 memo).

I would opine that as long as the memo fields get edited using a "full replacement" method you should be extremely well off.  I would start planning now to do regular maintenance compact/repairs, but that is easily enough handled.
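By "full replacement" I mean assigning the whole new string to the memo field in a single Edit/Update, rather than appending pieces with AppendChunk.  A minimal sketch (the Compiled table's field names here, Item_ID and BOM, are my guesses at yours):

```vba
' Replace the compiled BOM string for one item in a single Update.
Public Sub SaveCompiledBOM(db As DAO.Database, lItemID As Long, _
                           sBOM As String)
    Dim rs As DAO.Recordset
    Set rs = db.OpenRecordset( _
        "SELECT * FROM Compiled WHERE Item_ID = " & lItemID, dbOpenDynaset)
    If rs.EOF Then
        rs.AddNew          ' no compiled BOM yet for this item
        rs!Item_ID = lItemID
    Else
        rs.Edit            ' overwrite the existing one
    End If
    rs!BOM = sBOM          ' single assignment replaces the entire memo
    rs.Update
    rs.Close
End Sub
```

Writing the whole string at once keeps Jet from leaving partially-rewritten memo pages behind, which is where a lot of memo corruption stories come from; the regular compact/repair then reclaims the space the replaced memos leave behind.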

Very nice WRA!! and a thoroughly understandable presentation of your case. Thanx!

regards
Jack

Jack,

Thanks for the kudos.  And, as usual, the fast reply!

I'll wait a bit to see if 1William and/or others have comments.

WRA
Yup, I concur except.....  If each record's memo field is that big (1 MB), and there are thousands of them, you are very quickly going to exceed the safe limit (~1.5 GB).  I have a feeling Access may not be the right container for the job.  For now though, the 'full replacement' method is by far the safest way to go.  Make sure you compact and back up daily.
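One cheap safeguard along those lines: check the back-end file size at startup and warn before it drifts toward the limit.  A sketch (the path is illustrative; FileLen returns a Long, which is fine below 2 GB):

```vba
' True when the back-end mdb has grown past the ~1.5 GB safety threshold,
' i.e. it's time to compact, archive, or rethink the container.
Public Function MdbNearLimit(sPath As String) As Boolean
    Const SAFE_BYTES As Double = 1610612736   ' 1.5 GB
    MdbNearLimit = (FileLen(sPath) > SAFE_BYTES)
End Function
```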

Thanks again, Jack and 1William.  Jack, the function call you sent will come in handy.  Points are coming.  Hope this seems fair:
 - 120 to jadedata for the code and generally helpful comments
 - 80 to 1William for the solid assist

Wyatt


I need 'em to catch up with 1William!
Thanx WRA