Bob Collison (Canada)

asked on

Access 2010 - Pause Until Part Of Event Completes

Hi Experts,

I know I asked a similar question before but I can't find it in EE.

I have a Form Event that extracts data to a table and then runs a report using the extracted data from the table.  On slower PCs the report runs before the data is created and so it thinks there is no data.  I don't want to 'pause' the processing for a specific period of time since I can't guarantee the PC's speed.

I think the earlier answer suggested doing a loop and 'checking' something that would indicate whether the data extraction had completed.
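For reference, that kind of loop usually looks something like this sketch: poll a status flag and yield with DoEvents until the extraction routine signals completion (the flag name, report name, and timeout here are hypothetical, not from my actual code).

```vba
' Sketch only - mblnExtractDone is a hypothetical module-level flag
' that the extraction routine would set to True when it finishes.
Private mblnExtractDone As Boolean

Private Sub WaitForExtractThenReport()
    Dim dtTimeout As Date
    dtTimeout = DateAdd("s", 60, Now)   ' give up after 60 seconds

    Do Until mblnExtractDone
        DoEvents                        ' yield so pending work can finish
        If Now > dtTimeout Then
            MsgBox "Data extraction did not complete in time."
            Exit Sub
        End If
    Loop

    DoCmd.OpenReport "rptExtracted", acViewPreview
End Sub
```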

I'm open to any suggestions.

Thanks,
Bob C.
SOLUTION
crystal (strive4peace) - Microsoft MVP, Access
[This solution is only available to members of Experts Exchange.]
Bob Collison (ASKER)

Hi Crystal,

The simplest way to look at this is that I don't want to move from one section of Form Event code to the next until the first section has completed.

Example
Part 1 Extracts data to a table which may take a while since it may involve a number of lookups and / or a large number of records.

Part 2 Runs an Access Report that has an Access Query as its data source.  The Access Query needs to read ALL data extracted to the table in Part 1, or it won't have all the data to, say, sort with.

I don't want Part 2 to start before Part 1 completes.

Thanks,
Bob C.
Last time I checked, code execution was linear.  Are you shelling out to other applications?  Please post the code and indicate where Access skips ahead.
Hi Pat,

It is linear.  I am assuming (incorrectly?) that the time lag is the issue.

The primary reason for this is that it works perfectly on a powerful PC (32 GB RAM) but intermittently on much less powerful PCs.  To me this indicates that the issue isn't with the code itself.  Therefore the fundamental question is why this works on a high-powered PC and not on one with much less power.

I am attaching the code.  A couple of words of explanation.

The Parameter Table contains a single record that contains the Parameter Values that are used by the Query.  The creation of the Parameter Record is a Common Function that I use in many instances without any issues.

At one time I used to delete the Parameter Record immediately after the Report ran.  It was suggested that I take this out and only delete it before creating the Parameter Record and during general clean-up, i.e. when a user logs off the system.
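The parameter-record pattern described above might be sketched roughly like this (the table, field, and routine names here are hypothetical, not my actual Common Function):

```vba
' Sketch of the parameter-record pattern (names hypothetical):
' clear any stale record, then write the single record the
' report query will read its criteria from.
Public Sub AddParameterRecord(lngMemberType As Long)
    Dim db As DAO.Database
    Set db = CurrentDb

    ' Delete any leftover parameter record before creating a new one
    db.Execute "DELETE FROM tblParameters", dbFailOnError

    ' Write the single parameter record used by the report query
    db.Execute "INSERT INTO tblParameters (MemberType) " & _
               "VALUES (" & lngMemberType & ")", dbFailOnError
End Sub
```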

The Report Query uses 13 Tables to extract all of the data to support the report.

Providing the code for the actual application issue would be quite a task.

Hopefully this will clarify the issue and the coding sufficiently.

Thanks,
Bob C.
R-10-210---Member-Info-Report---Maxi.txt
Are you saying that - Call AddParameterDataTableRecord01 - doesn't complete prior to - DoCmd.OpenReport stDocName_R10210, acPreview?  You didn't post that code.

Did you try doing as Crystal suggested and set a variable in the called procedure that you check prior to opening the report?
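As I understand Crystal's suggestion, it amounts to something like this sketch: have the called procedure signal success, and only open the report once that signal confirms the extract finished.  Only stDocName_R10210 is from the posted code; the rest of the names are hypothetical.

```vba
' Sketch: return a completion flag from the extract routine
' and check it before opening the report (names hypothetical
' apart from stDocName_R10210).
Public Function AddParameterDataTableRecord01() As Boolean
    On Error GoTo ErrHandler
    ' ... existing parameter-record / extract code ...
    AddParameterDataTableRecord01 = True    ' signal success
    Exit Function
ErrHandler:
    AddParameterDataTableRecord01 = False
End Function

Private Sub RunReport()
    If AddParameterDataTableRecord01() Then
        DoCmd.OpenReport stDocName_R10210, acPreview
    Else
        MsgBox "Extract did not complete; report not run."
    End If
End Sub
```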
Hi Pat,

Sorry, I didn't post the Parameter Code.  Since I use it extensively and it has never been a problem, I left it out to avoid cluttering up the issue.  I have attached it now.

I haven't added the Check Processing Status Variable Crystal suggested to the code yet but I will try it today and let you know the results.

Thanks,
Bob C.
AddParameterDataTableRecord01.txt
Hi Experts,

I have put the Check Processing Status Variable in the Code and the Parameter Record does always get created prior to running the Report.

However, today on the same PC as I was encountering the original issue (I am the only one using it and just for the testing) I can't get it to fail as it was originally.  I have tried a dozen times and it works successfully every time.

I did mention that the Report Query accesses 13 Tables.  Is this potentially an issue with performance and if so will a Report try to display before all of the data in the query has been extracted?

One possible solution I have thought of would be to extract the data from the 13 tables into two tables and then change the query to read only the two tables.  Would this be a better approach?

Thanks,
Bob C.
Are you saying that the original version isn't failing or that the version that checks the complete flag isn't failing?  Isn't "isn't failing" good?

Is the query updating the underlying data but the report is still showing the old data?  We still haven't seen all the code you are running (and what we have seen is very difficult to read being all upper case and not indented or spaced out well).

13 tables or 2 shouldn't matter since you have to get the 13 down to 2 before you can use them.

The report runs its own query.  You don't need to run it first.  In fact, the easy way to determine whether the report has data is to use the report's NoData event to either put something in a box and print one page, or display an error and not open the report at all.
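The NoData approach looks like this in the report's own code module:

```vba
' In the report's code module: cancel the open when the
' record source returns no rows.
Private Sub Report_NoData(Cancel As Integer)
    MsgBox "No data was found for this report.", vbInformation
    Cancel = True   ' stops the report from opening at all
End Sub
```

Note that when the event sets Cancel = True, the calling DoCmd.OpenReport raises run-time error 2501, so the calling code should trap and ignore that error.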
Hi Pat,

The original code (accde) that was failing is now not failing, and I haven't changed the code or the data!  The report is showing the correct data, with no 'NoData' indication, which was the original problem.

As mentioned, it is very difficult to provide a complete copy of the code for this report along with the tables, data, etc.  That is why I haven't provided it.  Sorry about the format of the code, but obviously it's not something I can change quickly.

I have the NoData Event coded in this and all my reports, and it was what alerted me to the fact that there was a problem.  I display a 'No Data' error message in a message field on the form from which the report is run.

Currently the query is the Record Source for the report. The Parameter table is simply another table populated as required to pass criteria to the query.  My question was around whether having 13 tables linking together in a query was significantly less efficient than loading two tables with the required data using VBA Code and then having a simple query as the Record Source for the report.

At this point I don't know where to go with this as it is working so I can't really do anything else to try to get to the bottom of it.

By the way, the 'new' pane displaying 'Suggested Solutions' is terribly irritating.  You might pass that on to EE Development.

Thanks,
Bob C.
ASKER CERTIFIED SOLUTION
[This solution is only available to members of Experts Exchange.]
Hi Pat,

Thanks for your comments / suggestions.

I assumed that creating Temp Tables causes bloat.  The tables I am referring to would be permanent BE tables that I load with data (as you suggested, containing often-used data from multiple tables), use, and then empty as you suggested.

All of my BE Tables are on the same PC as the FE Application.  I don't use SQL Server.

I also appreciate your suggestion to stick with Queries regarding performance.

At this point since I can't duplicate the problem any more I'm closing the case.

Thanks,
Bob C.
Thanks for your assistance.  Although your comments / suggestions didn't solve the issue, they provided additional insight into best practices.
Temp tables fall into two categories.
1. Make table queries that make a new table each time.
2. Permanent tables that you run delete and append queries against.

Both cause the same bloat issue.  Access cannot recover free space unless you compact the database, so if you add a million rows, delete them, and add a million more, the table occupies space for 2 million rows until you compact.  The append does not overwrite the deleted original contents.

The second option is better than the first because you can define indexes, which help with other processes, but the best solution is the one I described earlier, where the "temp" table is in a separate BE and you use a fresh copy of that BE each time you start the process.  The template database contains whatever tables you need along with any indexes you want.  Make sure to empty the tables and compact the database after you build it; that way it always starts as small as it can be.
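A minimal sketch of that fresh-temp-BE approach (the paths and table name here are hypothetical): overwrite the working temp back-end with a compacted template copy, then relink.

```vba
' Sketch: start each run from a fresh copy of a compacted
' template back-end (paths and table name hypothetical).
Public Sub RefreshTempBE()
    Const TEMPLATE As String = "C:\App\TempBE_Template.accdb"
    Const WORKING As String = "C:\App\TempBE.accdb"

    ' Overwrite the working temp BE with the clean template copy
    If Dir(WORKING) <> "" Then Kill WORKING
    FileCopy TEMPLATE, WORKING

    ' Point the linked "temp" table at the fresh copy
    Dim tdf As DAO.TableDef
    Set tdf = CurrentDb.TableDefs("tblTempExtract")
    tdf.Connect = ";DATABASE=" & WORKING
    tdf.RefreshLink
End Sub
```

Because the template was compacted while empty, every run starts with zero bloat, and deleting the working copy discards whatever the previous run left behind.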

Sorry I couldn't solve the actual problem.
Hi Pat,

Thanks for the additional information.

Thanks again,
Bob C.