pcalabria (United States of America) asked:

Need help with : [Microsoft][ODBC SQL Server Driver]Timeout expired

Hello.

I have an MS Access 2K application with a split FE/BE configuration.
The workstation is XP Pro, and MS SQL Server 2008 R2 is installed on a computer reached via the LAN.
Other databases are Access 2K on a Windows 2K server.
This is a Windows 2003 Server environment.

I'm running a very long routine on 10,000+ records. The routine loops through each record and creates HTML code that can be uploaded to a website.

The routine fails after looping through approximately 10,000 records, with the following error:

[Microsoft][ODBC SQL Server Driver]Timeout expired

Can anyone help?
Dale Fye:

Well, generally that result is encountered when you attempt to query the SQL Server and the query exceeds the timeout interval identified in the specific query.  Does the code loop execute some query inside the loop?  If so, are you running a pass-thru query or just querying the linked table?  Either way, the first place I would start is to check the timeout value in the properties dialog when you are editing the query.

If you are creating a query string on the fly in the loop, I would recommend creating a saved query and simply updating the SQL of that query after you build the SQL string.  This will allow you to set that timeout value, which you cannot do if you are simply using code similar to:

Set rs = CurrentDb.OpenRecordset("SELECT * FROM yourSQLTable WHERE ...")
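A hedged sketch of that saved-query variant (the query name qrySelParts is hypothetical; ODBCTimeout is the DAO property that controls the per-query ODBC timeout, in seconds):

```vba
'Update a saved query's SQL on the fly, then set its ODBC timeout before opening it
Dim qdf As DAO.QueryDef
Dim rs As DAO.Recordset

Set qdf = CurrentDb.QueryDefs("qrySelParts") 'hypothetical saved query name
qdf.SQL = strSQLtext                         'the SQL string built in code
qdf.ODBCTimeout = 300                        'seconds; 0 means wait indefinitely
Set rs = qdf.OpenRecordset(dbOpenDynaset, dbSeeChanges)
```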

Dale
I don't like my server to get flatlined because it makes users unhappy, so when my app starts, it runs a pass-through query with this SQL in it:

exec spTimeout15

(Now, it wasn't until later I discovered you shouldn't name your own stored procedures to start with 'sp', lest they conflict with the system's.)

In said stored procedure is
SET LOCK_TIMEOUT 15000
SELECT @@LOCK_TIMEOUT


That sets the maximum time a query is allowed to take to 15000 milliseconds, which is reasonable almost all the time.

BUT, you have those times when a bitch of a query needs to run and complete...
So I also have a stored procedure with

SET LOCK_TIMEOUT 120000
SELECT @@LOCK_TIMEOUT

I run that one when I KNOW I need to run something heavy.
Sounds like you need something similar.

Or you need to create a better query!
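In VBA, running such a pass-through at startup can be sketched like this (the saved pass-through query name qptTimeout15 is an assumption; its SQL would be `exec spTimeout15` with ReturnsRecords set to Yes):

```vba
'Run the timeout-setting pass-through once at application start
Dim qdf As DAO.QueryDef
Dim rs As DAO.Recordset

Set qdf = CurrentDb.QueryDefs("qptTimeout15") 'hypothetical saved pass-through query
Set rs = qdf.OpenRecordset()                  'executes exec spTimeout15 on the server
Debug.Print "LOCK_TIMEOUT is now " & rs.Fields(0).Value
rs.Close
Set rs = Nothing
```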
pcalabria (Asker):

Dale,

My code looks like this:

Set oRS = oDB.OpenRecordset(strSQLtext, dbOpenDynaset, dbSeeChanges)

With oRS
    'oRS is all parts in SelectPartsTable with the ComponentMaster info joined
    .MoveFirst

    Do While Not .EOF
        DoEvents

        '<<about 200 lines of code that calculate values, prices, and create HTML>>

        .MoveNext
    Loop
End With
Oops... forgot this:

 strSQLtext = "SELECT SelectedParts.*, ComponentMaster.Notes,ComponentMaster.TasPath1,  PackageMaster.PackageCode,  ManufacturerMaster.IPSEname, ManufacturerMaster.ShortName, ManufacturerMaster.FullName, ManufacturerMaster.Abbreviation, PackageMaster.Package, PackageMaster.PackageImage, PackageMaster.Leads, PackageMaster.PackageMaterial, "
        strSQLtext = strSQLtext & "PackageMaster.PackageName , PackageMaster.MountingType, PackageMaster.sellingQuantityIncrement, PackageMaster.EnforceSellingQuantity, ComponentMaster.OrderNumber, ComponentMaster.DescriptionPrimary, ComponentMaster.title80, ComponentMaster.DescriptionRoot,  ComponentMaster.Description, ComponentMaster.FamilyName, ComponentMaster.GenericName,  ComponentMaster.FamilyName, ComponentMaster.PartClass, ComponentMaster.PartFamily, ComponentMaster.PartFamily2, ComponentMaster.FamilyDescription,ComponentMaster.Pb, ComponentMaster.DocName, ComponentMaster.DocName2, ComponentMaster.MSL, ComponentMaster.EURoHs, ComponentMaster.DataSheet, ComponentMaster.DateObsolete , ComponentMaster.ECCN, ComponentMaster.ScheduleB, "
        strSQLtext = strSQLtext & "ComponentMaster.PackageLength , ComponentMaster.PackageWidth, ComponentMaster.PackageHeight, ComponentMaster.SeatedPlaneHeight, ComponentMaster.PinPitch, ComponentMaster.PackageDiameter, ComponentMaster.MaximumReflow, ComponentMaster.OrderNoVerified, ComponentMaster.SEdate, ComponentMaster.SEautoscan, ComponentMaster.DistiVersion, ComponentMaster.OriginalCost, ComponentMaster.RecPrice, ComponentMaster.AskPrice, ComponentMaster.Value, ComponentMaster.tolerance, ComponentMaster.FullReelQty, ComponentMaster.TapeWidth, ComponentMaster.caseSize, ComponentMaster.TCR, ComponentMaster.Power, ComponentMaster.Voltage, ComponentMaster.OperatingRange, ComponentMaster.ReelSize, ComponentMaster.WebImage, ComponentMaster.SeriesName, ComponentMaster.Title80, "
        strSQLtext = strSQLtext & "ComponentMaster.Title255 , ComponentMaster.SeriesID, ComponentMaster.PartName, ComponentMaster.ClassFamily, ComponentMaster.AlternateOrderNumber, ComponentMaster.AlternateOrderNumber2, ComponentMaster.AlternateOrderNumber3 , "
        strSQLtext = strSQLtext & "ComponentMaster.AlternateOrderNumber4 , ComponentMaster.AlternateOrderNumber5, ComponentMaster.TapePitch "
        strSQLtext = strSQLtext & "FROM ((SelectedParts INNER JOIN ComponentMaster ON (SelectedParts.SearchNumber = ComponentMaster.SearchNumber) AND (SelectedParts.LineID = ComponentMaster.LineID)) LEFT JOIN ManufacturerMaster ON ComponentMaster.LineID = ManufacturerMaster.LineID) LEFT JOIN PackageMaster ON (ComponentMaster.PartClass = PackageMaster.PartClass) AND (ComponentMaster.PackageCode = PackageMaster.PackageCode); "
1) I don't see any VBA variable, controls or anything else in there, so

Why don't you save this query so the engine can optimize it?
And then access it in code through QueryDefs("TheQueryName")
Are you pulling too many fields for that?

2) With a Select * in there, Access is likely to request the whole shebang from the server
3) Are there indexes on all the fields on both sides of the joins (SearchNumber, LineID, PartClass, PackageCode)?
--a missing index can be a performance killer--
And 4)
If this really is heavy duty, why not do it as a View or Stored Procedure of the SQL Server?
Do you have the permission to create views or stored procedures?
Because this is fairly straightforward SQL and wouldn't be a bear to do up.
Hello Nick, thanks for helping once again.  Hello Dale, too, same to you.

The ComponentMaster, ManufacturerMaster, and PackageMaster tables are still MS Access tables, which run on a Windows 2K server connected to our LAN.  This is basically all the info we have about parts that we sell.

The SelectedParts table is on the MS SQL Server 2008 R2, which is also connected to our LAN, but it runs from a workstation.  We cycle through the records, one at a time, and create lots of intelligent HTML which is saved in the [*Description] and [*Title] fields.  It calculates a few more things, just numbers, so not much horsepower is needed for that stuff.  The [*Description] field does cause the table to get very large... many gigabytes.


1) Nick, I'm not sure what you meant about the QueryDefs and optimizing...
2) I can get rid of the SelectedParts.* line.  Was wondering if that was worth the effort.

I can also break the routine into two or three passes, but it seems silly that I would have to do that.

3) Indexes, yes!

4) Not sure if I can do this because of the access tables...can I?
Nick, should I be trying: SET LOCK_TIMEOUT 120000

The only way I can tell if it works will be to run a procedure and wait several hours to see if it fails.
SET LOCK_TIMEOUT 120000 is 2 minutes.
If you really need several hours then something else is severely wrong: the networking, the server, or the query.
Even old OSes and hardware shouldn't take hours to do things short of billion-row ETLs.

The first thing, if it were me, that I would discover is if the query is the bottleneck or the code
if you do this to your code

Set oRS = oDB.OpenRecordset(strSQLtext, dbOpenDynaset, dbSeeChanges)

With oRS
    'oRS is all parts in SelectPartsTable with the ComponentMaster info joined
    .MoveFirst

    Do While Not .EOF
        'I'm gonna comment it all out and see how long it takes to walk down the recordset
        'DoEvents

        'all 200 lines commented out
        '<<about 200 lines of code that calculate values, prices, and create HTML>>
        'not processing anything, just seeing how the query performs

        .MoveNext
    Loop
End With

How long does it take?
If the answer is hours, you have a query and server problem
if the answer is seconds or minutes, well then you have a code problem.

One thing at a time!
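One way to put a number on that walk (a sketch using VBA's Timer; oDB and strSQLtext as in the code above):

```vba
'Time a bare walk down the recordset, with no per-row work
Dim t As Single
Dim n As Long

t = Timer 'seconds since midnight
Set oRS = oDB.OpenRecordset(strSQLtext, dbOpenDynaset, dbSeeChanges)
Do While Not oRS.EOF
    n = n + 1     'count rows only
    oRS.MoveNext
Loop
Debug.Print "Walked " & n & " rows in " & Format(Timer - t, "0.0") & " seconds"
```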
One option could be to import SelectedParts into an empty local Access database, link this, and modify the query to use it. It will probably run much faster as well.

/gustav
Gustav has a good point.  When you are doing queries that include tables from both Access and SQL Server, the query can run slow.  If you could either move SelectedParts to Access (as a temp table perhaps) or physically move the other 3 tables to SQL Server, the query would probably run much quicker.
I am so sorry, I did not mention something that will clear up a few things.

The process I am running consists of 35 different routines.  Once initiated, the routines run in order, one by one, until all 35 routines have run.

The objective is to create a table that can be exported to eBay in the format that eBay requires to programmatically list the inventory for sale.  

Some of the routines are simple and only take seconds to run; others are complex and take much longer.  For example, my pricing routine (routine #22) queries 50K sales records, 200K quotes, and 5 million competitor prices to establish an asking price.  This routine runs fine but may take 30 minutes to complete when I'm processing 100,000 rows.

Routine #33 is the problem; it can take my code a couple of hours just to begin this routine.  This routine creates an HTML description.  It's simple code: it pulls stuff from the database and includes labels as well as HTML code to make them easy on the eyes.  The thing that makes this routine unique is that every 250 records or so expand the file size by 30MB, so 100,000 records, if stored in an Access table, would cause the table to grow by 12GB.

The reason I moved SelectedParts to the SQL server was to get away from the 2GB limitation of MS Access, so this table must be on the server.

Of course, I can rewrite the routine that fails so it could be run as a stand alone procedure to see what happens, but it would be so much simpler to simply solve the timeout expired problem since the existing code otherwise performs very well. (Functionally at least!)
Anyone who can help, please!  :-)

I'd like to try Nick's suggestion to create a stored procedure, but I'm an Access guy; I don't remember ever writing a stored procedure before.

I used the following T-SQL code:
CREATE PROCEDURE uspFixedTimeoutProblem
AS
SET LOCK_TIMEOUT 120000
SELECT @@LOCK_TIMEOUT
GO

I think this is what Nick suggested.

Now I think I need to create a pass-through query to execute this as part of my application?

When I create a pass-through query with the following SQL:
exec uspFixedTimeoutProblem

Access jumps to the Select Data Source screen.  When I enter the FileDSN that I use, I enter my password, and then a query screen shows up with EXPR1000 and one row, 120000.  I don't understand this...

I need to add a line of code to my routine that will run the stored procedure.  Can anyone help?
I think you've got it.  Your procedure works, runs and sets the max query timeout to 2 minutes.   EXPR1000 is the given default name for the column.  120000 is the new value set.  This is global to the server and will remain that way until you issue a new Set Lock_Timeout command.

Now, even though you've got the timeout tweaking query in place, I don't think that it will be your solution.
DoEvents is a nice idea, right up until you really understand it.
It basically makes Access wait -- but Access consumes 100% of the processor core time that its thread is on while it does so.  On a single-core machine, this essentially makes the machine unresponsive.  On a quad-core machine, 25% of the processor time is consumed, and so on.
You want to be very careful about using DoEvents.  It has its place when you shell out to some external process that needs to complete before your VBA carries on, but you do need to carefully check and construct the code that uses it so that an error doesn't lead to an infinite loop.
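A common pattern for that shell-out case (a sketch, not from the thread: Sleep is the standard kernel32 API, and SomeExternalProcessDone is a hypothetical completion check) is to yield and then sleep briefly between polls, so the loop doesn't peg a core:

```vba
'In a standard module: declare the Windows Sleep API (PtrSafe form for 64-bit Office)
#If VBA7 Then
    Private Declare PtrSafe Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)
#Else
    Private Declare Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)
#End If

Public Sub WaitForExternalProcess()
    Do While Not SomeExternalProcessDone() 'hypothetical check for the shelled-out process
        DoEvents   'let Access repaint and process messages
        Sleep 50   'then give the CPU back for 50 ms
    Loop
End Sub
```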

Anyway.

My code looks like this:

Set oRS = oDB.OpenRecordset(strSQLtext, dbOpenDynaset, dbSeeChanges)

With oRS
    'oRS is all parts in SelectPartsTable with the ComponentMaster info joined
    .MoveFirst
    Do While Not .EOF
        DoEvents

        '<<about 200 lines of code that calculate values, prices, and create HTML>>

        .MoveNext
    Loop
End With


strSQLText creates a big, huge, ugly recordset that times out.
What I would suggest is
Dim OuterLoopSQL As String   'SQL that will get only enough PK fields to identify each record of the set you want
OuterLoopSQL = "You've got to Build this"
Dim InnerSQLSelect As String 'the SELECT and FROM clauses of the big ugly query
InnerSQLSelect = "And Build this"

Dim outerRS As DAO.Recordset
Set outerRS = oDB.OpenRecordset(OuterLoopSQL, dbOpenDynaset, dbSeeChanges)

With outerRS
    .MoveFirst
    Do While Not .EOF
        DoEvents
        strSQLtext = InnerSQLSelect & " an appropriate WHERE clause constructed with values from outerRS"
        Set oRS = oDB.OpenRecordset(strSQLtext, dbOpenDynaset, dbSeeChanges)
        'oRS is one record of all parts in SelectPartsTable with the ComponentMaster info joined

        '<<about 200 lines of code that calculate values, prices, and create HTML for this one record>>

        oRS.Close
        Set oRS = Nothing
        .MoveNext 'now we'll build the next SQL statement and do it
    Loop
End With


So your outer recordset used in the loop would be a lot lighter, and inside the loop, you would create the big, ugly recordset for just a single record at a time.

This would have a significantly smaller chance of timing out because each recordset would be much lighter than your present one.
Ugh... I've been working with this all weekend and still do not have a solution.  I just finished adding the "exec uspFixedTimeoutProblem" line after each line that opened a new recordset... I guess I didn't have to do this?

I've modified my code so that the routine that creates the html does nothing but create the code.
I now call it as a subroutine from the main routine.

The query is nothing more than a select query to extract the data we need from three tables to build a webpage.
It's not the query eating up that time; it's the code that is executing in the loop.

This code is doing nothing more than creating HTML.
As near as I can tell, the code is processing about 20 records per second.  With 250,000 records in our inventory, cycling through every record once (our objective) at 20 records/second would take about 3.5 hours.


Up until now, I've only been processing groups of 10K to 50K records.  I did try this today with all 250,000 records, and a routine that runs before the troublesome routine failed with a similar error (error #3035, System resources exceeded).

Is there a way to turn the timeout process off, or set it to six hours or so?

Thanks
I guess I didn't have to do this?
Nope.  It's a one-off.
Is there a way to turn the timeout process off, or set it for six hours or so?
1000 milliseconds per second
60 seconds per minute
60 minutes per hour
6 hours

Alter PROCEDURE uspFixedTimeoutProblem
 AS
 SET LOCK_TIMEOUT 21600000
 SELECT @@LOCK_TIMEOUT
 GO

is the code that is executing in the loop.

Post it if you'd like.  It can't hurt.
200 lines is more than a guy'd like to wrap his brain around, but you're in the weeds.
And getting Access VBA to build HTML is something I do, too.
Hopefully you've got it reasonably commented so it makes good sense.

Set BatFile = fs.CreateTextFile(BuiltPath & "\homeview.htm", True)
'write the html
BatFile.WriteLine ("<html>")
BatFile.WriteLine ("<head>")
BatFile.WriteLine ("    <title>In The Weeds</title>")
BatFile.WriteLine ("    <style type='text/css'>")
BatFile.WriteLine ("        .style9")
BatFile.WriteLine ("        {")
BatFile.WriteLine ("            font-family: Tahoma;")
BatFile.WriteLine ("            font-size: small;")
BatFile.WriteLine ("            font-weight: normal;")
BatFile.WriteLine ("            color: #1E279C;")
BatFile.WriteLine ("        }")
BatFile.WriteLine ("        Table")
BatFile.WriteLine ("        {")
BatFile.WriteLine ("        border-collapse:collapse;")
BatFile.WriteLine ("        }")
BatFile.WriteLine ("    </style>")
BatFile.WriteLine ("</head>")
BatFile.WriteLine ("<Body>")
BatFile.WriteLine ("<p align = 'left'>")

BatFile.WriteLine ("pcalabria is in the weeds")

BatFile.WriteLine ("</body>")
BatFile.WriteLine ("</html>")
BatFile.WriteLine vbCrLf

BatFile.Close
Set BatFile = Nothing



More fun things include knocking stuff into Excel, and then walking rows and columns to create <tr> and <td>
Private Function CreateSummaryHTML(oSheet As Excel.Worksheet, NumCols, NumRows)
'ok I want to write the summary.htm out because Excel is unreliable to stream it directly to the Outlook page
'or to the web page
'create an html page in the correct location
Dim fs As Object
Dim BatFile As TextStream
Dim x As Integer
Dim y As Integer
Dim LineToWrite As String

Set fs = CreateObject("Scripting.FileSystemObject")
Set BatFile = fs.CreateTextFile("c:\tempPDF\Summary.htm", True)

'write the html for the page
'create the table
BatFile.WriteLine ("<table border = 1>")
For x = 1 To NumRows + 1
    'start the row
    
    BatFile.WriteLine ("<tr>")
    'If x = 1 Then LineToWrite = LineToWrite & "<strong>"
    For y = 1 To NumCols
        LineToWrite = ""
        'start the detail
        LineToWrite = LineToWrite & "<td>"
        If x = 1 Then LineToWrite = LineToWrite & "<strong>"
        'we are writing oSheets.Range(letter & number).value
        If IsDate(oSheet.Range(Chr(y + 64) & x).Value) = True Then
            LineToWrite = LineToWrite & Format(oSheet.Range(Chr(y + 64) & x).Value, "dd-mmm-yyyy")
        Else
            LineToWrite = LineToWrite & oSheet.Range(Chr(y + 64) & x).Value
        End If
        If x = 1 Then LineToWrite = LineToWrite & "</strong>"
        'end the detail
        LineToWrite = LineToWrite & "</td>"
        BatFile.WriteLine (LineToWrite)
    Next y
    'end the row
    BatFile.WriteLine ("</tr>")
Next x


BatFile.WriteLine ("</table>")
BatFile.WriteLine vbCrLf
BatFile.WriteLine vbCrLf

BatFile.Close
Set BatFile = Nothing

End Function



You've got something unwieldy to conquer.
My last suggestion was to modularize the SQL in the loops.
Next would be to modularize the html creation to one textfile per record in the recordset.
To open them one-by-one and concatenate them together is easily codable.
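A sketch of that concatenation step (the file paths and folder names here are hypothetical), using the same FileSystemObject approach as above:

```vba
'Concatenate one-HTML-file-per-record into a single output file
Dim fs As Object, outFile As Object, oneFile As Object, f As Object
Set fs = CreateObject("Scripting.FileSystemObject")
Set outFile = fs.CreateTextFile("c:\tempPDF\AllRecords.htm", True) 'hypothetical output path

For Each f In fs.GetFolder("c:\tempPDF\PerRecord").Files 'hypothetical folder of per-record files
    Set oneFile = fs.OpenTextFile(f.Path, 1)  '1 = ForReading
    outFile.Write oneFile.ReadAll
    oneFile.Close
Next f

outFile.Close
Set outFile = Nothing
```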

Here, though, is another thing re. your 3035 error
http://answers.microsoft.com/en-us/office/forum/office_2010-customize/how-do-i-prevent-run-time-error-3035-system/49b66089-4486-4a83-94e2-13eced66d686

This, too, seems to be a one-off thing to run
DAO.DBEngine.SetOption dbMaxLocksPerFile, 1000000

Nick67
The thing that makes this routine unique is that every 250 records or so, expands the filesize by 30MB, so 100,000 records, if stored in an Access table, would cause the table to grow by 12GB.

Why?
You aren't storing images in the tables, or files, are you?
Nick67,

Your procedure works, runs and sets the max query timeout to 2 minutes.   EXPR1000 is the given default name for the column.  120000 is the new value set.  This is global to the server and will remain that way until you issue a new Set Lock_Timeout command.
Just for the record, and assuming you are talking about SQL Server, SET LOCK_TIMEOUT has nothing to do with a query timeout.  It is certainly not global to the server and will only remain in effect until the session completes.

Incidentally the default is -1 which means it will wait forever.
Nick,
No, I'm not storing any images in the table, just links to images.
I'm surprised at that file size too; I guess it's just a lot of characters.
I am, however, storing the code in the table, not in a file, as I'm sure you understand.

I have made the code changes as per your posting, and am testing now.  Just logged in to report back and saw Anthony's note...

Anthony,
Are you suggesting the Set Lock Timeout is not causing the problem?  If so, what do you think is?

It does look like the process times out.  It does not encounter an error; it just stops running with the Timeout expired message (more than an hour after the routine starts).
ASKER CERTIFIED SOLUTION from Nick67 (solution text available to members only)
Nick,

I did experience the timeout error again, so I look forward to trying this new line of code:
oDB.QueryTimeout = qq

I added the line after about 15 of the 35 or more queries that I use in this routine.
I'll set qq in the initialization setup, so I can try different delays.

It's going to tie up the machine for several hours before I get any results.  With this said, are you still feeling good that this is the line to day?  My concern is that I'm not actually running queries, am I?


ie..

Set oRS = oDB.OpenRecordset(strSQLtext)
Do While Not oRS.EOF
    '<<Does this count as a query?>>
Loop
are you still feeling good that this is the line to day?
All I can do is give you suggestions,
You haven't posted anything that will permit anything else.
I don't think you are addressing the root cause, but again, I can only guess.
You have something causing a query time out.
Generally, those will occur when something seizes a lock on a table or row and can't/won't release it.
Then the next item that wants to access it won't be permitted, and a timeout will occur.

You have a mix of technologies on the go.
There has been a suggestion that you should move all the data to SQL Server.
I'd concur with that.
I've suggested making your logic more atomistic, breaking your looping into two components.
An outer loop that pulls only enough fields to permit the inner workings to create their own, smaller SQL statements.

Other things to look at are read-only vs dynaset.
Anytime you ask for a dynaset, you are asking for a read-write dataset.
There's nothing wrong with that, but if two separate bits of your code try to alter the same row of the same table, things may collide and deadlock.
Breaking apart a monolithic procedure into numerous smaller functions can make the result more readable, but also more fault-tolerant as objects get closed out as their function ends.

So am I feeling good?  I am afraid at my end, it's a coin toss

<<Does this count as a query?>>
Yes.
We call our saved query definitions 'queries', but in reality a query is any SQL Statement that you execute, saved as a querydef, or just raw text executed for a recordset.

Dim qdf As QueryDef
Dim rs As Recordset
Dim SQL As String
SQL = "SELECT SomethingNice FROM TheGoodTable;"
Set qdf = CurrentDb.QueryDefs("SomeSavedQuery")
qdf.SQL = SQL

It's pretty much immaterial whether you then do this
Set rs = qdf.OpenRecordset(dbOpenDynaset, dbSeeChanges)
or
Set rs = db.OpenRecordset(SQL, dbOpenDynaset, dbSeeChanges)

they're both queries.
Great.  That sounds good.  I will try it again tonight.
BTW... <are you still feeling good that this is the line to day?  >
should have been <are you still feeling good that this is the line to add? >

We can thank auto-correct for that sloppy correction.

BTW, if you would like to see the end product of this process:

http://www.ebay.com/itm/15-PCS-LINEAR-IC-LF444CN-OP-AMP-QUAD-GP-18V-14-PIN-MDIP-RAIL-NATIONAL-444-/281647691811

I'm not sure if you can see it without logging into eBay.  A good portion of the code is redundant from listing to listing; ultimately, to upload data we need one row per item.
Definitely, bumping up the query timeout limit won't hurt a machine that isn't doing end-user work.
Bumping the query timeout to 2 minutes would be agony for an end user waiting for something to error out.
But it may be what you need -- because you WANT to wait long enough for the query to try to complete.

Of course, if the cause of the timeout doesn't resolve itself within the course of the period you select (say because the issue is a lock of some sort that doesn't free itself) it won't help, either.

We are treating symptoms and not causes.

I went to the page and View'ed Source and saved it.
428 KB, so ~0.5 MB
So your size statements make sense.
How much is boilerplate that you could do a find and replace on?
SOLUTION (solution text available to members only)
Nick67,

@Anthony Perkins has forgotten more about SQL Server than I will ever know
Thank you.  That was very kind of you, but you really should not underestimate your SQL Server knowledge that much.  I am sure you know more than you say you do.

I don't want my app to wait for eternity for a lock to clear -- for the end user, that appears as an application hang.
I agree with you 100%.  That is a lousy situation.  Even the default CommandTimeout of 30 seconds is too long, don't you think?
@acperkins
mea culpa on the Lock_Timeout; how to lengthen the timeout for ADO or DAO is in ID: 40698632

My bad.
I have a few heavy, ugly queries that lengthening it helps with...
But only because I knock it down to 15 seconds in the first place.
I am sure you know more than you say you do.
I've made a SQL Server Express 2005 and now 2008 R2 instance run the backend of the business very well for about a decade now.
But only because they were very good products to start with, and I haven't been an idiot with them. :)
So why don't we address the real problem as I see it.

Your query looks like this (using aliases for clarity):
SELECT  s.*,
        c.Notes,
        c.TasPath1,
        p.PackageCode,
        m.IPSEname,
        m.ShortName,
        m.FullName,
        m.Abbreviation,
        p.Package,
        p.PackageImage,
        p.Leads,
        p.PackageMaterial,
        p.PackageName,
        p.MountingType,
        p.sellingQuantityIncrement,
        p.EnforceSellingQuantity,
        c.OrderNumber,
        c.DescriptionPrimary,
        c.title80,
        c.DescriptionRoot,
        c.Description,
        c.FamilyName,
        c.GenericName,
        c.FamilyName,
        c.PartClass,
        c.PartFamily,
        c.PartFamily2,
        c.FamilyDescription,
        c.Pb,
        c.DocName,
        c.DocName2,
        c.MSL,
        c.EURoHs,
        c.DataSheet,
        c.DateObsolete,
        c.ECCN,
        c.ScheduleB,
        c.PackageLength,
        c.PackageWidth,
        c.PackageHeight,
        c.SeatedPlaneHeight,
        c.PinPitch,
        c.PackageDiameter,
        c.MaximumReflow,
        c.OrderNoVerified,
        c.SEdate,
        c.SEautoscan,
        c.DistiVersion,
        c.OriginalCost,
        c.RecPrice,
        c.AskPrice,
        c.Value,
        c.tolerance,
        c.FullReelQty,
        c.TapeWidth,
        c.caseSize,
        c.TCR,
        c.Power,
        c.Voltage,
        c.OperatingRange,
        c.ReelSize,
        c.WebImage,
        c.SeriesName,
        c.Title80,
        c.Title255,
        c.SeriesID,
        c.PartName,
        c.ClassFamily,
        c.AlternateOrderNumber,
        c.AlternateOrderNumber2,
        c.AlternateOrderNumber3,
        c.AlternateOrderNumber4,
        c.AlternateOrderNumber5,
        c.TapePitch
FROM    SelectedParts s
        INNER JOIN ComponentMaster c ON s.SearchNumber = c.SearchNumber AND s.LineID = c.LineID
        LEFT JOIN ManufacturerMaster m ON c.LineID = m.LineID
        LEFT JOIN PackageMaster p ON c.PartClass = p.PartClass AND c.PackageCode = p.PackageCode



Step 1 should be to run that in SSMS and look at the Actual Execution Plan.  If you are not comfortable doing that, or you don't want to post it here, then we will have to address it the old-fashioned way, a sort of Q&A if you will.

1. How long did it take to run in SSMS?
2. How many rows did it return?
3. Why are you still using *?  Do you really need all the columns in SelectedParts?  This is very typical of sloppy MS Access code, where you could get away with it.  It is never a good idea in SQL Server (or Sybase or Oracle or PostgreSQL or even MySQL).
4. Can you post the schema of all the tables involved (SelectedParts, ComponentMaster, ManufacturerMaster and PackageMaster), including all the indexes defined?
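If posting the full schema is awkward, the index question can at least be answered with a catalog query like this (a sketch; run it in SSMS against the database holding SelectedParts):

```sql
-- List every index and its key columns for the SQL Server table
SELECT  i.name          AS index_name,
        c.name          AS column_name,
        ic.key_ordinal,
        i.is_unique
FROM    sys.indexes i
        JOIN sys.index_columns ic
            ON ic.object_id = i.object_id AND ic.index_id = i.index_id
        JOIN sys.columns c
            ON c.object_id = ic.object_id AND c.column_id = ic.column_id
WHERE   i.object_id = OBJECT_ID('dbo.SelectedParts')
ORDER BY i.name, ic.key_ordinal;
```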
@acperkins
Part of the evil is that it's coming from three separate datasources.
Only one of them is SQL Server -- and that one only because @pcalabria hit the 2 GB file size limit for Access.
The other two remain in Access.

The suggestion has been put on the table that SQL Server Migration Assistant for MS Access should be used to move ALL the data to SQL Server.  It's easy to say ... but ensuring nothing breaks and pushing out new frontends is ... a bit time-consuming.

What's the upper limit for columns returned in SQL Server?  Part of why @pcalabria has things structured the way he does (Select *) is that he's got more than 255 'needed' fields.

Of course, I can rewrite the routine that fails so it could be run as a stand alone procedure to see what happens, but it would be so much simpler to simply solve the timeout expired problem since the existing code otherwise performs very well. (Functionally at least!)

I think, and I think you'd agree, that he really does need to refactor this.
It's always sad when you hit the day that things will no longer scale to what you need :)
On the other hand, it means you built something successful.

Nick67
Part of the evil is that it's coming from three separate datasources.
Then I am afraid I cannot help much.  My knowledge of MS Access is even less than what I have forgotten from SQL Server.

What's the upper limit for columns returned in SQL Server?
You can return as many as you want.  That is not the problem. The problem is that there is a penalty for each extra byte that needs to travel over the network.

he does (Select *) is that he's got more than 255 'needed' fields.
That is unfortunate.  But if they have tables with that many columns they have bigger problems than a pesky timeout that can be handled any number of ways.

I think, and I think you'd agree, that he really does need to refactor this.
Absolutely.  As it stands now, the very best you can do is return all the rows needed from SQL Server using a firehose (forward only, read only) cursor and then deal with them in MS Access.
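In DAO terms, a firehose-style read could be sketched like this (a sketch against the thread's oDB and strSQLtext; dbOpenForwardOnly with dbReadOnly is the closest DAO equivalent):

```vba
'Forward-only, read-only recordset: the lowest-overhead way to walk the rows once
Dim oRS As DAO.Recordset
Set oRS = oDB.OpenRecordset(strSQLtext, dbOpenForwardOnly, dbReadOnly)

Do While Not oRS.EOF
    '...build the HTML for this row...
    oRS.MoveNext  'no MovePrevious or MoveFirst available with forward-only
Loop

oRS.Close
Set oRS = Nothing
```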
Good morning.

I would again like to thank both of you for your ongoing help.
Here is some background, answers and comments:

The ComponentMaster, ManufacturerMaster, and PackageMaster databases are MS Access.  They are populated and read by an application with more than a thousand forms, queries, reports, and procedures that I first developed in 2000 and have continually updated.  It's used each day by my staff for everything they need to do.  Moving these tables to SQL Server is a huge project, as the SQL statements developed for Access are not interchangeable with the statements required by SQL Server.  As a result, I need to selectively migrate tables to SQL Server when size becomes a problem.

The SelectedParts table started as an MS Access table but was migrated to SQL Server (as Nick explained) because of the size restriction in MS Access.  This table is used by a brand-new routine developed to list our inventory in e-commerce channels.  It collects and creates the fields each channel needs and organizes them into a table, with one row per inventory item.

We don't expect to run this new routine very often.  Once it generates a listing for us, we can leave the listing in place until the item sells, so performance is not critical.  The routine can be run after hours, so it doesn't matter whether it takes three hours or six, as long as when we come back to work the "Process Completed" message is on the screen, and not an error message.

<Select *>  I can get rid of Select * fairly easily, if you believe it will help.  

<firehose> I can use a forward-only cursor if it makes a difference.

<Actual Execution Plan>I've read a bit about this, but never created one or looked into this.  I'm willing to try but will have resistance to publishing any info that can help our competitors mimic what I've done in a public forum.
Moving these tables to SQL Server is a huge project as the SQL statements developed for Access are not interchangeable with the statements required by SQL Server.
The suggestion has been put on the table that SQL Server Migration Assistant for MS Access should be used to move ALL the data to SQL Server.

If you are not using this tool, you should be.  Unless you have done unhappy things like attachments, BLOBS and hyperlink fields -- and I don't think you have -- this tool pretty much seamlessly moves Access data to SQL Server.
The SQL syntax for using a linked table is exactly the same as using the local table was.
The creation of views and stored procedures does require some syntax changes, but not many, really, and for the most part they are straightforward.  Single quotes in place of double quotes, percent signs in place of asterisks, and passing in dates as properly formatted string literals instead of between hashes are most of it.  Our friend Nz() is replaced by IsNull(), and IIf() by CASE, but generally I've found that I can create a query in Access, copy the SQL, paste it into SSMS, and sort out its issues in relatively short order for views and simple stored procedures.

The biggest adjustment is that in SQL Server you want to use set logic, not looping logic -- and you have the tools to do so, but your looping tools are pretty much gone (cursors are generally an evil crutch, to be used only as a last resort).
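As an illustration of set logic versus looping logic (the table and column names here are made up for the example), one UPDATE does what a VBA loop would do row by row:

```sql
-- Hypothetical example: set every missing LotSize in one statement,
-- instead of looping through a recordset and updating one row at a time.
UPDATE dbo.SelectedParts
SET    LotSize = CASE WHEN Quantity >= 100 THEN 100 ELSE Quantity END
WHERE  LotSize IS NULL;
```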

12 GB of data when it's done, yes?
That isn't happening in Access.
I take it that is going into SQL Server, so you will need something read-write on that end.

The tool creates new tables prefaced with dbo_
I renamed those all to the old table names and then NOTHING needed to change in code or in queries.
<Actual Execution Plan>I've read a bit about this, but never created one or looked into this.
Unfortunately that is not a valid option, as there is only one table in SQL Server.
Nick, you make it sound so easy! :-)

I would love to find a reasonable way to migrate to SQL Server and have used the SQL Server Migration Assistant for MS Access to move six of my estimated 1000 tables.  Most of the time it has worked very well.  Not all the time.  Numerous problems have been encountered.  In fact, it's one of my nightmares!

Imagine you have more than 1000 routines, some of which you haven't looked at in 10+ years.  Some of the routines use up to 30 different sql statements, and of course some use none.

You mentioned a few problems I haven't encountered yet... so it's worse than I thought! LoL
Seriously, replacing * with % is annoying, but forgetting to remove the # around dates simply causes the query to return no results, without any error message that I can address!  Somewhere, some query just doesn't do whatever it was supposed to do, and I may not even remember what that was, or whether I still need it, but it probably took me a day to get it working in the first place, and that's only one routine. Ouch!

With this said, there are a few more tables I will need to migrate, so I'll be here looking for you and the other experts to help throw some water on the next fire.

BTW... I think the last change fixed the current problem, the last change being objDB.QueryTimeout = qq
The code's been running 5 hours without an error message... so far!
I'll report back when the process finishes!
Seriously, replacing * with % is annoying, but forgetting to remove the # around dates simply causes the query to return no results, without any error message that I can address!  Somewhere, some query just doesn't do whatever it was supposed to do, and I may not even remember what that was, or whether I still need it, but it probably took me a day to get it working in the first place, and that's only one routine. Ouch!
And you are focusing on the easy catches; what about the fact that T-SQL is not the same as the SQL dialect used in MS Access?  So unless you are prepared to re-design and re-write your app, you had best stay with MS Access.
Access on the front-end, certainly.
But on the back-end SQL Server.
UNLESS you are moving data operation into SQL Server via views and stored procedures, very little on a working Access app needs to change.
You run SSMA, and it keeps your old local tables as a safeguard, and pushed schema and data to SQL Server.
The one real complaint I had was that the new tables were all prefaced 'dbo_'
Shear that off and it was pretty much good-to-go off the nose.

Back in the day, there was grief with bit/Boolean fields, but that's been fixed.
In code, so long as you referred to TRUE and not -1, throwing dbSeeChanges into each
Set rs = db.OpenRecordset
statement was the major PITA.

I've flanged three subsidiary backends into the main application's SQL Server database when keeping them separate really no longer served any purpose.  There wasn't much grief to it.
@pcalabria's got me by about an order of magnitude.
I've got about 200 tables, 500 queries and 110K lines of code.

BUT

I DO NOT have much SQL in VBA code.
QueryDefs are my friend
Hand-editing SQL in the VBA editor is NOT any fun, and I have few instances of more than simple, perhaps 10-field SELECTs hand-coded.

Most heavy lifting is done with recordsets, not SQL executes in VBA code
very little on a working Access app needs to change.
Unless of course you are using SQL, in which case all bets are off and you have no recourse but to re-design and re-write.
Thanks again for your help.

Nick, I have to agree completely with Anthony Perkins on this one:

<And you are focusing on the easy catches; what about the fact that T-SQL is not the same as the SQL dialect used in MS Access?  So unless you are prepared to re-design and re-write your app, you had best stay with MS Access>

Although if I'd had ten dollars for each time an expert here has suggested I convert to MS SQL, I wouldn't need to write this software... ok, only kidding, but there is so much SQL, in routines that I haven't touched for so many years, that my strategy is to move tables to SQL Server only when necessary, usually due to size limitations.

As my MS Access backend files grow to about 1GB, I split them.  That seems to work pretty well and is pretty easy, as long as it's not one specific table that is growing... which is a problem I will be facing very soon....

Two more things...
Anthony was correct: Set Lock Timeout did not fix the problem.  Nick, your suggestion, oDB.QueryTimeOut = qq, solved my problem completely!  Thank you both very much.  Your help has been incredible!

DOEVENTS COMMENT
@Nick
One more note: DoEvents is one of my best friends, and this is why.  I use it when things are going to take a while and I want the user to see screen updates to know what's going on.  I'll display a progress indicator, i.e. Processing record 12,000 of 585,000, and then use DoEvents to keep the user from rebooting because the screen turned white.  Not sure about the downsides you mentioned in my apps, but the upside, nice-looking updated screens, is worth a slight performance penalty.  BTW, I only display updates every 1000 records or so, as it seems the display card slows things down.
Great help guys.  SetQueryTimeout worked
oDB.QueryTimeOut=qq solved my problem completely!  
Not to rain on your parade, but lengthening the timeout is not a solution, it is a workaround.  See my previous comment.
The ODBC driver is timing out, so you either have to optimize your query (as it stands, it could do with some tuning) or, if you are unable to do that, as a last resort you have no choice but to lengthen the timeout.
Here is why: unless you have set the timeout to infinite (not recommended), you may be revisiting this to increase the timeout again.  I believe in American parlance that is called "punting the ball".  If you have to do this where there is a user interface, then this is a recipe for disaster.
I agree Anthony, and thank you for your thoughts.

Just to be clear, however, the only thing this routine is doing is cycling through the records, start to finish, forward only, in order to create HTML code which is stuffed into a single field in the SQL table.

I'm wondering if there may be any benefit in trying a different cursor, but my thought is that it just takes a long time to cycle through all the records.  I don't believe it is hitting any record locks because I was the only person logged into the system.

I used the variable qq so that I can set qq via the user interface, along with a default value and an option to increase the value if the routine fails in the future.
DoEvents vs Sleep
Sleep can be a much more efficient way of keeping the UI responsive depending upon why you need the VBA to pause.

<Unless of course you are using SQL, in which case all bets are off and you have no recourse but to re-design and re-write.>
You can use Access-flavor SQL within Access against ODBC-linked backend tables without any alteration beyond dbSeeChanges and ensuring you use TRUE when you mean true, not -1.

If you want to use Views, stored procedures and functions via passthroughs, well then you need to know T-SQL -- but you have to build those things in SQL Server Management Studio anyway, so the Books On Line and Google are there to help you knock it into shape.  Until you REALLY move beyond what Access is capable of, the syntax changes aren't major.

Although if I'd had ten dollars for each time an expert here has suggested I convert to MS SQL, I wouldn't need to write this software... ok, only kidding, but there is so much SQL, in routines that I haven't touched for so many years, that my strategy is to move tables to SQL Server only when necessary, usually due to size limitations.

I would respectfully go the other direction.  I'd move the small tables first.  You can do object dependency to see what queries and objects use them.  You can do CTRL-F to find where they are used in code.  The smaller and less critical they are, the easier that will be.  Cut your teeth on those.  Comment and document the code as you go. Work your way toward the bigger tables.  You will have been perusing the schema and code base as you work towards the more difficult cases and that will be useful.  Eventually, they'll all be in SQL Server and you'll be done.  Your present approach is crisis management.  That's not any fun.  And if I recall correctly, you've had your own personal crises to deal with.  Thinking about long-term maintainability should be moving to the forefront--and avoiding SQL Server except as a crutch is not going to make that easier.
Sleep?
Nick, I feel like you have forgotten more Access VBA than I've ever known!  I never heard of sleep!
When I want Access to pause I've been using a routine I wrote called Delay, where I pass the number of seconds and loop through DoEvents.  Perhaps I'll try "sleeping"... (It would be nice to get some sleep with all of this crisis management I've been doing!)

With this said, I use DoEvents not when I actually want a delay, but when my video card is too slow to display something like me.txtAnswer = "12345".  With an XP machine and Dell 3000 MB video, if I run a query immediately after that statement, Access never has a chance to display the answer before the query starts running, so the answer, and the user, must wait.  A DoEvents placed before the query gives Access, or at least my video card, a chance to display the result before starting the query.

I've never heard of "object dependency" either.  Time to start researching....

BTW... Access queries on SQL tables also need the # removed and the * changed to %... I'm not sure if there are other changes like those... still learning..
Sleep is WinAPI.
You can call WinAPI from VBA, but the syntax is C/C++, and therefore not for the faint-of-heart.
Highly useful though.
Like being able to access the computername and loginname of the present Access user & machine.
Start here
http://access.mvps.org/access/index.html

Sleep is pretty straightforward
In a standard code module (not a form's or report's module!):
Public Declare Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)

DoEvents used for a pause is usually constructed as a loop with some wait = Timer type check inside to kick out.
That's flat-out useless busywork for the processor and thread, wasting CPU cycles.
Sleep tells the OS not to bother returning the VBA thread to the top of the queue until x milliseconds have elapsed.  If you need the VBA to pause while some exterior process completes (file copy, PDF creation, etc.), Sleep is more efficient.
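A sketch of how the two can be combined: Sleep in short slices, with a DoEvents between slices so the UI still repaints (the 100 ms slice size is an arbitrary choice):

```vba
' In a standard module.  Pauses without spinning the CPU.
Public Declare Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)

Public Sub WaitSeconds(ByVal Seconds As Long)
    Dim i As Long
    For i = 1 To Seconds * 10
        Sleep 100    ' hand the thread back to the OS for 100 ms
        DoEvents     ' let Access process pending paint/keyboard messages
    Next i
End Sub
```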

You have to have Name Autocorrect turned on for Object Dependency.  Most devs will turn that off in production.  
It's a global option to the mdb/accdb.
Once it is on, Access can trace what objects another depends upon, and which objects depend upon it.
Another VERY GOOD reason to have your SQL in saved Queries and not as text in your VBA code
User generated image
<BTW... access queries on SQL tables also need the the # removed and the * to % change... I'm not sure if there are other changes like those... still learning..>
As you can see in the image, almost everything is linked ODBC SQL Server tables (Access 2003), and no, they don't:
SELECT * from tblinsDetails where [Date] = #2-Jan-2015# works just fine in both queries and code.

Now, for a passthrough -- which bypasses Access entirely -- yes, then I'd need SELECT * from tblinsDetails where [Date] = '2-Jan-2015'
Just to be clear, however, the only think this routine is doing is cycling through the records, start to finish, forward only, in order to create html code which is stuffed into a single field in the SQL table.
In the SQL Server world, "cycling through the records" is only done as a last resort and is commonly referred to as RBAR (Row By Agonizing Row). Instead, you have to start thinking in terms of sets.

One thing you may want to try, since you are "cycling through the records", is to retrieve all the rows into an array using the ADO GetRows method.  Yes, I do realize you are still using DAO, but perhaps in this case it may be worth the switch for this one routine.  The concept is quite simple: you open the ADO Recordset object, use the GetRows method to load an array in memory, close the Recordset, and then navigate the array, like they taught us in the 80's. This should prove 100 times faster than going row by row.
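A rough sketch of the GetRows approach (the connection string, table, and field names are placeholders, not the asker's real configuration):

```vba
' Late-bound ADO, so no ADO reference is required.
Dim cn As Object, rs As Object
Dim vData As Variant
Dim r As Long

Set cn = CreateObject("ADODB.Connection")
cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
        "Initial Catalog=MyDb;Integrated Security=SSPI;"

Set rs = cn.Execute("SELECT PartNumber, Manufacturer FROM SelectedParts")
If Not rs.EOF Then vData = rs.GetRows()   ' returns an array: vData(field, row)
rs.Close
cn.Close

If Not IsEmpty(vData) Then
    For r = 0 To UBound(vData, 2)         ' the second dimension walks the rows
        ' vData(0, r) is PartNumber, vData(1, r) is Manufacturer
    Next r
End If
```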

You can use Access-flavor SQL within Access using ODBC-linked backend tables without any alteration
As Nick67 has pointed out, it depends where the query is executed.  If it is in MS Access, then it will use the MS Access SQL dialect, and that means it will first have to retrieve all the rows for tables coming from SQL Server, which is precisely the problem you have now.
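To make the distinction concrete, here is a minimal sketch of a pass-through query built in VBA, which runs entirely on the server; the DSN, SQL text, and names are placeholders:

```vba
' Sketch: a temporary pass-through QueryDef.  Only the rows the server
' returns cross the wire; Access does not drag whole tables over.
Dim db As DAO.Database
Dim qd As DAO.QueryDef
Dim rs As DAO.Recordset

Set db = CurrentDb
Set qd = db.CreateQueryDef("")              ' "" = temporary, unsaved QueryDef
qd.Connect = "ODBC;FILEDSN=MyFileDsn.dsn"   ' your existing DSN (placeholder)
qd.SQL = "SELECT PartNumber FROM dbo.SelectedParts WHERE Quantity > 0"
qd.ReturnsRecords = True
qd.ODBCTimeout = 300                        ' seconds; 0 means no timeout

Set rs = qd.OpenRecordset(dbOpenSnapshot)
' ...consume rs here...
rs.Close
Set rs = Nothing
Set qd = Nothing
```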
For a complete reference to the Windows API for VB there is nothing that comes close to Dan Appleman's book "Visual Basic Programmer's Guide to the Win32 API".

For an online reference to more API calls than you will ever need I recommend VBnet Visual Basic Developers Resource Centre

But if all you want to do is make a sleep call, why not just use VBScript for that?
@Anthony
Great explanation- RBAR (Row By Agonizing Row)
I'm excited about trying this approach although my knowledge of ADO is minimal.
Do you have any code you can share to open a recordset using GetRows and then save it back?  I still remember the 80's, so I'll have no trouble changing the data and manipulating the array.  Of course, I'm not sure a 32-bit computer will have enough memory to work in this case; if not, I have lots of RBAR code that will work.

@Nick
A couple of things...

<<SELECT * from tblinsDetails where [Date] = #2-Jan-2015# works just fine in both queries and code.>>

Was this an improvement in Access 2003 over Access 2000?  On my development system (XP Pro workstations, MS SQL Server 2008 R2 connected through the LAN, Access 2000, tables linked using a FileDSN), using the # does not work.  It does not cause an error; it simply does not return records.   The asterisk * does not work either when in code.

Leaving Name Autocorrect on has been a major performance destroyer for me.  It's increased one-hour processes to many, many hours.  Are you suggesting that I may be able to turn it on to use dependencies, and then turn it off again when I'm ready to roll out a new version?
http://www.techrepublic.com/article/10-tips-for-using-wildcard-characters-in-microsoft-access-criteria-expressions/

I've done 99% of my developing in A2003, but there was no change that I know of.  Now, in the same Options location as you turn on/off AutoCorrect, you have an opportunity to force ANSI 92 syntax
User generated image
Maybe that's been done?  I have no explanation as to why you have syntax issues.

Leaving Name Autocorrect has been a major performance destroyer for me
Not just you!  And that's why most devs leave it turned off except for when they need it
Are you suggesting that I may be able to turn it on to use dependencies, and then turn it off again when I'm ready to roll out a new version?
Turn it off again as soon as you are done looking at Object Dependencies.
Yep, that's it Nick.
The ANSI 92 section is not part of Access 2K, at least Access 2000 SR1.

By the way, I ran the routine last night, and after 4+ hours received the message:

Processing Record 44500 of 55385
creating text for D01813H103MLD
system resource exceeded

Perhaps the QueryTimeout = 12000 did not solve the problem.  Changed it to 24000 and am running again.
Also spent four hours today getting rid of SelectedParts.*
Running with specific field names.

Was going to try the forward-only cursor, but I didn't want to lose the RecordCount for my status message.
Yep, that's it Nick.
 The ANSI 92 section is not part of Access 2K, at least Access 2000 SR1


No, then that's not it.
The ANSI 92 choice FORCES Access to use ANSI 92 compliant syntax, which is not the default
BUT...
http://bytes.com/topic/access/answers/905809-wildcard-vba-search-using-where
Very likely with Access 2000 SR1 you are not on Jet 4.0
And that'd be the issue.

Some day soon my friend, you have to get the people up the foodchain to cough up for new Office versions.  Sooner or later you're going to encounter a viral disaster.  And people who can help you with versions that old are getting thin on the ground -- as are Google searches that are applicable to it.
Office 2013 (and I think 2010 and 2007) has options to TURN OFF the Outlook security messages that keep you trapped 15 years in the past.
Nick, the link indicates Jet 4 is used with Access 2000...

More important, however, my routine just errored out again.

I skipped the stuff ahead of the problem routine and kept track of time.  It was running for 52 minutes when the error occurred, with the query timeout set to 24000...

So I'm back at square one.. still can't run the process:
The Timeout Expired error is gone.  Now system resources are a problem.

Error occurred after 52 minutes
Processing 45,300 of 55,385
error creating : WP900221L1T REV9 408560085
System Resource exceeded
QueryTimeout =24000

Any ideas?
Memory?
In your references, which comes first
Microsoft DAO x.x
or
Microsoft ActiveX Data Objects x.x

System Resource exceeded.
That's unhappy.
Conceivably, looking in Task Manager, is the MSAccess.exe process growing over 2 GB large?
It cannot.  It may not.  It will not.  That's a hard limit.
Look carefully at the code.
You are opening a lot of stuff doing this routine.
Are you religiously closing everything and setting it to Nothing as well?

You have a lot you cannot show us.  That problem won't yield to inspired guesswork, I am afraid.
Do you have any code you can share to open a recordset using getrows, and then save it back?
GetRows is a one-way street.  In other words, you retrieve the information read-only.  In order to save it back you would have to execute an Update/Insert query for each row.

By the way, I ran the routine last night, and after 4+ hours received the message:
And that is precisely what I feared when I said this was a workaround and not a solution.   I just thought it would have bought you some more time.

At this point, I see no option but to re-design and re-write this portion of the code.  I suggest you do it in small batches of, say, 100 rows at a time.
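One possible shape for that batching, keyed on an identity column (the ID column and the batch size are assumptions about the schema, not the asker's actual design):

```vba
' Sketch: process the table in key-ordered slices so no single
' recordset stays open for hours.
Const BATCH_SIZE As Long = 100
Dim db As DAO.Database
Dim rs As DAO.Recordset
Dim lastID As Long
Dim more As Boolean

Set db = CurrentDb
lastID = 0
more = True
Do While more
    ' Fetch the next slice of rows past the last key processed.
    Set rs = db.OpenRecordset( _
        "SELECT TOP " & BATCH_SIZE & " ID, PartNumber " & _
        "FROM SelectedParts WHERE ID > " & lastID & " ORDER BY ID", _
        dbOpenDynaset, dbSeeChanges)
    more = Not rs.EOF
    Do While Not rs.EOF
        lastID = rs!ID
        ' ...build the HTML and .Edit/.Update here...
        rs.MoveNext
    Loop
    rs.Close                 ' release the cursor between batches
Loop
Set rs = Nothing
Set db = Nothing
```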
@Nick
Spent hours last night reviewing the code but I did not find anything.  The DAO reference comes before ActiveX Data Objects in my references.  Also, my routine has been simplified (with a GoTo) to skip all but this one routine.  I create one recordset, MoveLast then MoveFirst so RecordCount will work, and then move Row by Agonizing Row through each record.

The code does three things: it analyzes pricing info in the database for each row and sets Quantity, [*StartPrice], and LotSize for each record; it concatenates up to 13 image fields to create a single field with image URLs separated by the pipe symbol; and it creates HTML code that it saves in the [*Description] field.  Each pass through the loop everything is reset, so I really don't understand the resource problem; I would think that each .Update would release resources???

In desperation, I guess, I could always close the recordset partway through and then reopen it, but that seems like crazy program design.  Like perhaps groups of 20K records?  I can also separate out the code that creates the HTML and then run a second pass through the code to set prices and create the URL string... but I can't believe that uses many resources....  I suspect the portion of the code which is the problem is the portion that creates the HTML.

The routine starts by deleting everything in the SelectedParts table, and there is very little of anything else in the database.  When the routine fails, the database that includes SelectedParts is 4.2GB.  By the way, this is an XP Pro workstation with 3.5GB of RAM.


@AnthonyPerkins
Just wrote Nick a response and then read yours....   Please check the boldface response and let me know your thoughts as to best strategy.

Thanks again!
How do you delete everything in the selected parts table?  I trust you are not doing the equivalent of DELETE SelectedParts, but rather TRUNCATE TABLE SelectedParts.  That alone should make a huge difference.
Yes, using a stored procedure to truncate, and yes, it made a huge difference.

The point I was making is that the workstation has 3.5 GB of RAM, and the routine that causes a resource error fails while creating a table that's about the same size as the machine's memory.

Remember, it's no longer failing because of a timeout error.  Each time through the loop I update and then reset all variables.  Perhaps this does not recover the memory?
You could create a stupidly large page file
User generated image
instead of a system-managed one.

I wouldn't store the html in fields.
I'd store it in .txt or .htm files and store the path in the fields, if it were me.
@Nick,
Sorry for the delay in responding. Other responsibilities... :-(

Storing in an external file will not work in this application. The code must generate a csv file that can be imported into Ebay's file exchange system.  To import, I go to the Ebay file exchange page, click an "Upload File" button, then point at my csv file.  Each row contains one Ebay Auction item.  The description column is basically the html for the Ebay ad.

As far as the page file suggestion, are you suggesting that each time through the loop the system may be consuming memory that it does not release, and the operation continues until the system runs out of memory?

Here is a simplified version of my code.  Is it possible that the memory used by strT is not being released each time through the loop????

Public Sub MakeEbayFileExchangeFile()

Dim strH As String, strF As String, strT As String

Call MakeHeader(strH)
Call MakeFooter(strF)

' oRS is the recordset opened elsewhere in the real routine
With oRS
    Do While Not .EOF
        strT = "Product Description<br>"
        strT = strT & "Part Number: " & !PartNumber & "<br>"
        strT = strT & "Part Manufacturer: " & !Manufacturer & "<br>"
        strT = strT & "Part Type: " & !PartType & "<br>"
        strT = strT & "RohS: " & !RohS & "<br>"
        strT = strT & "Class: " & !Class & "<br>"
        strT = strT & "Family: " & !Family & "<br>"
        '<and so forth and so forth>

        .Edit
        !Description = strH & strT & strF
        .Update

        .MoveNext
    Loop
End With

End Sub
The code must generate a csv file that can be imported into Ebay's file exchange system.  To import, I go to the Ebay file exchange page, click an "Upload File" button, then point at my csv file.

Building files is building files, my son.
It's Access VBA
FileSystemObject and TextStream are your friends.
Write whatever .csv file you want

Below is a routine that will create a csv with a start and end time, and the numbers 1 through 100

Public BatFile As Object   ' late-bound Scripting.TextStream, so no reference is needed

Private Sub BuildBatFile()
    Dim fs As Object
    Dim x As Integer
    Set fs = CreateObject("Scripting.FileSystemObject")
    ' The folder (c:\temp\<year>) must already exist
    Set BatFile = fs.CreateTextFile("c:\temp\" & Year(Now) & "\" & Forms!SomeForm.SomeField & ".csv", True)
    BatFile.WriteLine ("Start: " & Format(Time, "hh:mm:ss AMPM") & ",")
    For x = 1 To 100
        BatFile.WriteLine (x & ",")
    Next x
    BatFile.WriteLine ("Stop: " & Format(Time, "hh:mm:ss AMPM") & ",")
    BatFile.Close
    Set BatFile = Nothing
    Set fs = Nothing
End Sub


Here is a simplified version of my code:  Is it possible that the memory used by strT is not being released each time through the loop
Who knows?
But if the running apps cause Windows to exceed the size of the pagefile, you'll get "system resource exceeded".  So it is worth a shot to check what the pagefile settings are, and perhaps adjust them.
The full folder structure has to exist, but here's a sample
build-csv.mdb
'create an html page in the correct location (c:\temp must already exist)
Dim fs As Object
Dim BatFile As Object
Set fs = CreateObject("Scripting.FileSystemObject")
Set BatFile = fs.CreateTextFile("c:\temp\index.htm", True)
BatFile.WriteLine ("<html>")
BatFile.WriteLine ("<head>")
BatFile.WriteLine ("    <title>page for pcalabria</title>")
BatFile.WriteLine ("</head>")
BatFile.WriteLine ("<body>")

With oRS
    Do While Not .EOF
        BatFile.WriteLine ("Product Description<br>")
        BatFile.WriteLine ("Part Number: " & !PartNumber & "<br>")
        BatFile.WriteLine ("Part Manufacturer: " & !Manufacturer & "<br>")
        BatFile.WriteLine ("Part Type: " & !PartType & "<br>")
        BatFile.WriteLine ("RohS: " & !RohS & "<br>")
        BatFile.WriteLine ("Class: " & !Class & "<br>")
        BatFile.WriteLine ("Family: " & !Family & "<br>")
        BatFile.WriteLine ("<and so forth and so forth>")
        .MoveNext
    Loop
End With

BatFile.WriteLine ("</body>")
BatFile.WriteLine ("</html>")
BatFile.Close
Set BatFile = Nothing
Set fs = Nothing

It can all get done
Thanks Nick,
although I may have oversimplified this project.

My code first creates a table called SelectedParts which includes about 30 different fields.  These fields are all necessary to use File Exchange.  They include things like selling price, best offer accept price, shipping charges, return policies, best offer accept messages, quantity, title, yada, yada, yada.

In addition one of the 30 or so fields is called Description.. (Actually [*Description]), and this is the field causing the problem.

Creating the table allows me to run other routines that let me select which parts I want to list, by setting a Boolean field to True.  I can also review the table, part by part, and make manual changes to each listing.

After selecting which parts I want to list, I can create a new listing by clicking a button that runs a routine called MakeEbayCSVFile.

So, getting this code to create the description field really is what I need to do!
I understand that the complexity is off-the-hook.
But I don't see any reason whatsoever that the contents of [*Description], which you presently jam into a table -- and which seem to be at the root of some of your grief -- can't be made into text files and stored on the filesystem, to be retrieved and dealt with in later routines.

Functionally, in the end, there isn't any difference between reading a memo or nvarchar(max) field into a string in code and doing something with it, and this:

Dim fs As Object
Set fs = CreateObject("Scripting.FileSystemObject")
Dim f As Object
Dim theContents As String
' 1 = ForReading (literal value, since fs is late-bound and the
' Scripting constants are not defined without a reference)
Set f = fs.OpenTextFile("c:\temp\myfile.txt", 1)
Do While Not f.AtEndOfStream
    theContents = theContents & f.ReadLine & vbCrLf
Loop
f.Close
Set f = Nothing

Except that this isn't going to require jamming 12GB of text data into the database.
It's just going to open and read text files instead.
Which may, or may not, be a better approach than blowing wicked amounts of text into an nvarchar(max) field and pulling it back.

In the end, it is the .csv files that you want, correct?
Then why not architect it to create them without the intermediate step of punching a whole pile of text into the database?
Nick,
Okay, I'm trying to give this concept fair consideration.
If I understand correctly, you are suggesting that my routine which creates the html for [*Description] be modified to create a text file for each item instead.  In this case, I would use our ten-digit house number as the file name and add a .txt extension.

Here are a couple of questions before I begin coding.

Q1-We currently have more than 250,000 line items in stock, and this number grows every day.  Would you expect any problems if I try to create, say, 500,000 or more text files and store them in a single NTFS directory?

Q2-Does it make sense that the reason the routine runs out of memory is that it is storing the html in a table?
I currently build the HTML code in a variable (strT), as the actual creation of the code is much more complex than the example I provided (for example, I only store a Y, N, or U in the !Rohs field and use this value to write a context-sensitive sentence).  To create the text file, all I really have to do is replace one line of code, ![Description]=strT, with a call to a subroutine which creates the text file.  (Perhaps I should first rem out this line and run the routine to ensure that it does not cause the same resource problem.)

Q3-Does it make more sense to move the existing routine that creates the SelectedParts table, without changes, to Access 2013 and run it from the same machine MS SQL Server is running on?  This 64-bit machine uses Windows 7 Professional, has multiple cores, and 10GB of memory.