Solved

High Concurrent User Count - Best Practices?

Posted on 2008-10-15
Last Modified: 2013-11-29
I have a database that maintains about 3500 - 5000 active records, with basic maintenance (decompile/compact/repair & archive) run daily.  It has 30-40 concurrent users.  All users have to update records, though none delete or add records.  The users do not ever overlap, so they do not access a record that is in use by someone else, but we're at the point where the BE gets corrupted a few times a day now.

There is a main data table in the back end on the server, with front end MDE's distributed to the agents.  There is also an import database that runs VBA code to update the data table hourly with new records from an IBM-3278 terminal interface via screen scraping.

What I'm looking for is a list of best practices (other than not exceeding 8 users), tips or tricks that will minimize corruption.
Question by:sweekes283

11 Comments
 
LVL 84
First: "high concurrency" and "Access" are two terms you don't typically hear together. Don't get me wrong - I'm a big advocate of Access and one of the first to defend it, but given your specifications and working conditions, it may be time to consider upgrading.

What type of connections are you using? Are these typical LAN connections, or do you have any remote users (i.e. WAN, VPN etc)?

Regardless: I'd say that with 30-40 concurrent users you're reaching the upper limits of Access and should consider moving to a more robust data store. SQL Server, MySQL, etc. would be in order, and in most cases you can simply link the tables and be done with it.

Check this link:
http://support.microsoft.com/kb/300216/
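For what it's worth, linking a server table from the front end is a one-off DAO call per table. A rough sketch - the DSN name ("MyServerDSN") and the "dbo." prefix are placeholders for whatever your server setup would actually use:

```vba
' Rough sketch: create an ODBC-linked table in the front end that points
' at a server table.  The DSN and schema prefix are placeholders.
Public Sub LinkServerTable(ByVal strTable As String)
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef

    Set db = CurrentDb
    Set tdf = db.CreateTableDef(strTable)
    tdf.Connect = "ODBC;DSN=MyServerDSN;Trusted_Connection=Yes;"
    tdf.SourceTableName = "dbo." & strTable
    db.TableDefs.Append tdf    ' the link now behaves like a local table
End Sub
```

Existing queries and forms then work against the linked table name unchanged, which is why "link the tables and be done with it" is usually most of the migration.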
 
LVL 6

Author Comment

by:sweekes283
I'm well aware that we're beyond the sane limit of concurrent users in Access, which is why I'm only after tips for minimizing the impact.  A SQL server is not an option in this situation, else I would have already moved it to our server.
 
LVL 6

Author Comment

by:sweekes283
Oh - and to your question, all the users are on the LAN and connect through an MDE on their PC.
 
LVL 84
Okay ...

Daily corruption would be a major source of concern to me. Decompiling should never be necessary for a data-only application (i.e. your back end). You might, however, try occasionally importing the data into a new, blank database.
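A rough sketch of that rebuild, assuming UNC paths that are purely illustrative - note that SELECT INTO carries structure and data only, so indexes and relationships must be recreated in the new file:

```vba
' Rough sketch: copy the back end's tables into a fresh, blank MDB.
' The UNC paths are illustrative placeholders.
Public Sub RebuildBackEnd()
    Const strOld As String = "\\server\share\BE.mdb"
    Const strNew As String = "\\server\share\BE_new.mdb"
    Dim dbOld As DAO.Database
    Dim tdf As DAO.TableDef

    ' Create the empty target, then close it so Jet can write into it below.
    DBEngine.CreateDatabase(strNew, dbLangGeneral).Close

    Set dbOld = DBEngine.OpenDatabase(strOld)
    For Each tdf In dbOld.TableDefs
        If Left$(tdf.Name, 4) <> "MSys" Then    ' skip Jet system tables
            dbOld.Execute "SELECT * INTO [" & tdf.Name & "] IN '" & _
                strNew & "' FROM [" & tdf.Name & "]"
        End If
    Next tdf
    dbOld.Close
End Sub
```

After verifying the new file (and re-adding indexes/relationships), you'd swap it in for the old back end during a quiet period.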

Good luck ... here's a link to a couple of highly regarded data recovery services, you'll need them sooner than you think if you're corrupting daily:

www.everythingaccess.com
www.vb123.com

And archiving a corrupt database is as bad as (if not worse than) not archiving at all. Have you tested some of your recent backups to ensure they're valid?
 
LVL 44

Expert Comment

by:Arthur_Wood
"A SQL server is not an option in this situation"  why not?  SQL Server Express Edition is FREE - you can download it directly from Microsoft, at absolutely no cost, whatsoever.
 
check out:
http://www.microsoft.com/express/sql/register/
 
SQL Server Express is the full SQL Server engine, but limited to a maximum database size of 4 GB.
 
AW
 
LVL 6

Author Comment

by:sweekes283
The archive process moves records from the main database to a secondary Access database. Both are backed up daily, with weekly tapes retained indefinitely. Ultimately this is just a workflow tool - the actual data is retained in the mainframe we're scraping from. Even if we lost all the data, we could simply wipe it and pull it from the mainframe again.

And again - A SQL server is not an option in this situation, else I would have already moved it to our server.
 
LVL 44

Accepted Solution

by:
Leigh Purvis earned 500 total points
I have to admit - there's something strangely appealing about your situation, in an almost masochistic way: the absolute impossibility of moving to a server RDBMS, combined with the requirement to support 30-40 users reliably with a Jet BE.
It would be fun, assuming you were given the leeway to get it wrong a few times (if folks are going to lay down conditions, then they need to accept the consequences while the constrained perform their tests :-)
Fundamentally it should be doable, but it's almost impossible to eliminate the chance of corruption.  All you can do is minimise the likelihood of it.
Of everything you described, the one thing that stood out to me was:
"an import database that runs VBA code to update the data table hourly with new records"
Does it only append records, or does it perform updates too, just as the users are doing?
30-40 concurrent users plus a mass update of rows really is pushing things.

Obviously - how sound is your structure?  To support more users, you need that much better and more efficient a data structure.
Small tables, large numbers of records (3500-5000 ain't a whole lot, really).
Good and appropriate index use?  (Lots of indexes will harm your update time and therefore concurrency.  Reading shouldn't be corrupting your database.)
Are you implementing row-level locking?  Table could yield better results, ironically.
Have you considered padding your records to fill data pages to simulate row-level locking?
Going unbound could help with that (Scott will be able to help you all you need in discussing that - if memory serves, he's a big fan of unbound Access apps).
Without seeing the application it's hard to offer definitive advice.  As I say, with it in hand on the network in use, it could be interesting to see.  Remotely, it's hard to offer any nugget of inspiration, if that's what you're after.
Consider this old similar situation of Steve's a while back, going from here http://www.experts-exchange.com/Microsoft/Development/MS_Access/Q_21755013.html through http://www.experts-exchange.com/Microsoft/Development/MS_Access/Q_22393835.html to http://www.experts-exchange.com/Microsoft/Development/MS_Access/Q_22433474.html
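On the index point - a quick DAO sketch to inventory what you currently have (nothing here is specific to your db; run it from the BE and read the Immediate window):

```vba
' Rough DAO sketch: print every index on every non-system table, to help
' spot duplicated or unnecessary indexes that slow updates.
Public Sub ListIndexes()
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Dim idx As DAO.Index
    Dim fld As DAO.Field

    Set db = CurrentDb
    For Each tdf In db.TableDefs
        If Left$(tdf.Name, 4) <> "MSys" Then    ' skip Jet system tables
            For Each idx In tdf.Indexes
                For Each fld In idx.Fields
                    Debug.Print tdf.Name, idx.Name, fld.Name, _
                        "Unique=" & idx.Unique, "Primary=" & idx.Primary
                Next fld
            Next idx
        End If
    Next tdf
End Sub
```

Watch in particular for the hidden duplicates Jet creates when a field name matches the auto-index option (ID, key, code, num) on top of an explicit index on the same field.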
 
LVL 44

Expert Comment

by:Leigh Purvis
lol - for some reason I see I typed "Table", but of course I meant "Page":
"Are you implementing row-level locking?  Page could yield better results, ironically."
I suppose if you wanted to lock tables instead, you'd get far less corruption ;-)
(Of course - with some less-than-happy users having to wait :-D)
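If you do want to experiment with the locking mode from code, the relevant Access option can be toggled like so - a sketch only, and note the setting affects just databases opened after it changes, on that client:

```vba
' Rough sketch: toggle Jet 4.0 record-level locking for this Access client.
' Every front end must agree, and the change only applies to databases
' opened after the option is set.
Public Sub UseRowLocking(ByVal fOn As Boolean)
    Application.SetOption "Use Row Level Locking", fOn
End Sub
```

The equivalent checkbox lives under Tools > Options > Advanced ("Open databases using record-level locking").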
 
LVL 6

Author Comment

by:sweekes283
I wouldn't consider it 'fun'...  It started out as a hacked-together 3-day pilot and has exploded over the last 3 months.  Thankfully, though, they accept that errors are going to occur; I'm just trying to minimize them.

The import adds records and updates a datetime in existing records.  The users close records (boolean, datetime and string fields) or enter follow-up info (datetime and string).  The import doesn't touch anything the users touch.

Would splitting the two tables be beneficial or just slow things down further?

I hadn't thought to look into indexes, and I have no clue what locking is set to (presumably the Access default).

As for "Have you considered padding your records to fill data pages to simulate row level locking?"  I'm afraid I have no idea what that means.

I'll dive into the indexes and go through those links, and I'll toss up a new question for the unbound if I decide to pursue that.  Thanks!
 
LVL 6

Author Closing Comment

by:sweekes283
I have a quick follow-up question posted, if you make your way back to the thread - but thanks for the help!
 
LVL 44

Expert Comment

by:Leigh Purvis
Well - one man's "fun" is another man's "hell" :-)
Locking is probably Page anyway.
You could try to force Row (but it would potentially not help, and it hasn't exactly been the great fixer of issues it was likely intended to be).
Look for indexes - unnecessary numbers of them in particular.  (Any relationships in your database?  Duplicate indexes should be ignored anyway, really - but the decision-making process by Jet could still slow things down.)
Splitting the tables - do you mean into separate MDBs?  I don't see too much value in that.  The concurrency problems shouldn't be at the file level but at the Page level.  (I imagine most users are accessing more than one table simultaneously anyway?)
That the mass import doesn't hit the same rows as the users doesn't mean that the db isn't attempting to use the same Pages.
New records get written to the end of the file (the last and new pages), and if your users' records were also on that page - or just on index pages - then there would be conflict right there.
As for padding - it's a pretty simple concept that requires a bit of thought and code to implement.
If your records are bigger than half of a data page, then Jet can only fit one record per page - effectively page-level locking per record.  It's just a question of padding out redundant fields in your table with data to accomplish this, based on the amount of data entered in the real data fields of your table.  (You're aiming at a 4K page - so over 2K is required....)
As I recall, Jim Dettman has offered examples in the past of code to do the calculation for you.
To be fair, it's a fairly extreme method.  But - well, extreme situations.
See how you go with unbound if you decide to go that way.  FWIW, there's a pretty simple unbound demo on the examples page linked to in my profile.
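A sketch of the padding idea, to make the mechanics concrete.  The "Padding" field name is hypothetical (a spare text/memo field reserved for this purpose), and the byte count is approximate - Jet 4.0 stores text as Unicode and may compress it, so you'd tune the numbers against your actual record layout:

```vba
' Rough sketch of record padding: top up a spare field so each record
' exceeds half a 4K Jet data page, forcing one record per page and thus
' simulating row-level locking.  Field name and sizing are illustrative.
Public Sub PadRecord(rst As DAO.Recordset, ByVal lngRealBytes As Long)
    Const PAGE_SIZE As Long = 4096
    Dim lngPad As Long

    ' Aim for just over half a page, less what the real fields already use.
    lngPad = (PAGE_SIZE \ 2) + 1 - lngRealBytes
    If lngPad > 0 Then
        rst.Edit
        rst!Padding = String$(lngPad, " ")   ' hypothetical spare field
        rst.Update
    End If
End Sub
```

You'd call this from whatever code writes the real fields, passing your estimate of the bytes those fields consume for that record.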