I have logic that loops through a local table of 300K records. What the routine does is simple, but it keeps stopping with a 'File sharing locks exceeded' error at around record 10,000.
I did some research on this issue on EE and added the statement
DAO.DBEngine.SetOption dbMaxLocksPerFile, 200000
at the top of the processing routine. There was no change; the error still occurred around record 10,000.
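One thing worth checking: DAO.DBEngine.SetOption raises the lock limit only for the DAO engine instance in your session. If the loop itself runs through ADO, the Jet OLE DB provider never sees that DAO setting; the ADO-side equivalent is the provider-specific "Jet OLEDB:Max Locks Per File" connection property. A minimal sketch, assuming the loop uses the current project's connection (the connection variable name is mine, not from your code):

```vba
' Sketch only: raise the lock limit on the ADO/Jet OLE DB side,
' on the same connection the recordset will use, before the loop starts.
Dim cn As ADODB.Connection
Set cn = CurrentProject.Connection          ' or whatever connection your open uses
cn.Properties("Jet OLEDB:Max Locks Per File") = 200000
```

If you open your recordsets against a separate Connection object, set the property on that object instead; the setting lasts only for the life of the connection, so nothing has to change on the 40 user machines.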
I also saw that editing the registry is another way of handling this issue, but that wouldn't work here because I would have to do it on every user's machine, and there are close to 40 users.
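For reference, the registry route sets the same limit machine-wide. A .reg fragment of the kind usually suggested, assuming the Jet 4.0 engine and a 32-bit registry layout (the path differs for ACE or under Wow6432Node on 64-bit Windows), which illustrates why a per-machine rollout is unattractive:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Jet 4.0]
"MaxLocksPerFile"=dword:00030d40
```

(0x30D40 is 200000 in hex.) The per-session approaches above and below avoid touching this at all.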
When I first wrote the routine it processed all 300K records with no issues; this problem appeared later.
Since this is a local table there is no need for any locks on it; no one but the current user will ever access it.
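Even on a single-user local table, though, Jet still takes a lock per modified row and holds it until the enclosing transaction commits, so a long row-by-row loop accumulates locks until it hits the limit. Committing in batches releases them as you go. A hedged sketch, assuming an ADO loop over the local table (table, field, and batch size are placeholders, not from the original routine):

```vba
' Sketch: commit every N rows so accumulated locks are released.
' tblWork / SomeField are placeholder names, not from the original post.
Dim cn As ADODB.Connection
Dim rs As ADODB.Recordset
Dim n As Long

Set cn = CurrentProject.Connection
Set rs = New ADODB.Recordset
rs.Open "tblWork", cn, adOpenKeyset, adLockOptimistic, adCmdTableDirect

cn.BeginTrans
Do Until rs.EOF
    rs!SomeField = UCase$(rs!SomeField & "")   ' placeholder per-row update
    rs.Update
    n = n + 1
    If n Mod 5000 = 0 Then      ' every 5,000 rows...
        cn.CommitTrans          ' ...commit, which frees the row locks
        cn.BeginTrans
    End If
    rs.MoveNext
Loop
cn.CommitTrans
rs.Close
```

As a side benefit, explicit batched transactions usually make this kind of loop noticeably faster as well.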
I use pretty much the same open statement in all of my ADO code. Is there something I can do to make the open more suitable for this process? I don't know the effect of the various options. Here is the open I use.
One possibly key difference is that in this case the loop involves only one table: I am passing through the table and updating each record as I go.
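Since it is a single-table pass, it may also be worth asking whether the per-record logic can be expressed as one set-based UPDATE instead of a recordset loop. A sketch under that assumption (table, column, and expression are hypothetical; your actual logic may not reduce to SQL):

```vba
' Sketch: set-based alternative to the row-by-row loop.
' tblWork / Amount are placeholder names.
Dim cn As ADODB.Connection
Set cn = CurrentProject.Connection
cn.Execute "UPDATE tblWork SET Amount = Amount * 1.1", , adExecuteNoRecords
```

One caveat: a single very large action query can itself hit the same lock limit, so this works best combined with raising dbMaxLocksPerFile (or its Jet OLEDB connection-property equivalent) for the session first.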